Apple just released their new iPhone 5s, where the biggest news is really the 64-bit processor core inside the new A7 SoC. Sixty-four bits in a phone is a first, and it immediately raises the old question of just what 64 bits gives you. We saw this when AMD launched the Opteron and 64-bit x86 PC computing back in the early 2000s, and in a less public market the same question was asked as 64-bit MIPS took huge chunks out of the networking processor market in the mid-2000s. It was never questioned in servers, however.
Continue reading “The First 64-bit Phone”
Note: This post was prompted by listening to an interesting science podcast while thinking about the theories of startups, and the connection might seem a bit odd. Still, I think there is something to be learnt here. End note.
I recently listened to the episode on Bliss, by the Radiolab podcast. As always, Radiolab manages to take a theme and connect all kinds of things to it. In this case, bliss as in happiness turned into Bliss, the man, and his invention of Blissymbolics. Blissymbolics was an attempt to create a rational language based on symbols that would not allow the manipulation of human opinion or feeling like regular languages do. It was an attempt to create an antidote to the manipulations of dictators, tricksters, and populists (Bliss himself had been briefly interned in a pre-war German concentration camp, so he definitely knew what words could do). He designed a symbolic writing scheme that was intended to only communicate ideas clearly and unambiguously, with no room for demagoguery and oratory. In the end, nobody wanted to use the language for its original purpose.
Continue reading “Bliss: Failing to Pivot for Ideology”
After some discussions at the S4D conference last week, I have some additional updates to the history and technologies of reverse execution. I have found one new commercial product at a much earlier point in time, and an interesting note on memory consistency.
Continue reading “Reverse Execution History Updates”
I recently read the classic book The Soul of a New Machine by Tracy Kidder. Even though it describes a project to build a machine that was launched more than 30 years ago, the story is still fresh and familiar. Corporate intrigue, managing difficult people, clever engineering, high pressure: all familiar ingredients in computing today, just as they were back then. With my interest in computer history and simulation, I was delighted to actually find a simulator in the story too! It was a cycle-accurate simulator of the design, programmed in 1979.
Continue reading ““Eagle” Cycle-Accurate Simulator Anno 1979″
In this final part of my series on the history of reverse debugging I will look at the products that launched around the mid-2000s and that finally made reverse debugging available in commercially packaged products and not just research prototypes. Part one of this series provided a background on the technology and part two discussed various research papers on the topic going back to the early 1970s. The first commercial product featuring reverse debugging was launched in 2003, and since then there has been a steady trickle of new products up until today.
Originally published in January 2012. Post updated 2012-09-28 with a revised timeline for Lauterbach CTS. Post updated 2016-04-05 to include Mozilla RR. Post updated 2016-12-26 to add Simulics. Post updated 2017-10-08 to add Microsoft WinDbg. Post updated 2018-07-28 to add Borland Turbo Debugger.
Continue reading “Reverse History Part Three – Products”
This is the second post in my series on the history of reverse execution, covering various early research papers. It is clear that reverse debugging has been considered a good idea for a very long time. Sadly, though, not a practical one (at the time). The idea is too obvious to be considered new. Here are some of the papers that I have found, going back to before reverse debugging got started for real in actual products (around 2003), as well as later research papers that did not make it into products. It is worth noting that in recent times, reverse debugging ideas have increasingly found expression in products and useful software rather than just papers.
Continue reading “Reverse History Part Two – Research”
For some reason, when I think of reverse execution and debugging, the soundtrack that goes through my head is a UK novelty hit from 1987, “Star Trekkin’” by The Firm. It contains the memorable line “we’re only going forward ’cause we can’t find reverse“. To me, that sums up the history of reverse debugging nicely. The only reason we are not all using it every day is that practical reverse debugging has not been available until quite recently. However, in the past ten years, I think we can say that software development has indeed found reverse. It took a while to get there, however. This is the first of a series of blog posts that will try to cover some of the history of reverse debugging. The text turned out to be so long that I had to break it up to make each post usefully short. Part two is about research, and part three about products.
Continue reading “Reverse History Part One”
On the very binary date of 11-11-11, my alma mater, the computer science (DV, for datavetenskap) education at Uppsala University, celebrated its thirtieth anniversary. It was a great classic student party in the evening with a nice mix of old alumni and fresh-faced students. Lots of singing and some nice skits on stage. Great fun, and my voice has still not recovered. It also got me thinking about what it is that we really do as computer scientists.
Continue reading “DV* 30 Years”
Just for fun, I tried to surf the web of today using a Netscape 4 browser from 2001.
The result: not exactly useful. Netscape 4 was bad back then, and it does not work at all with the current style of web coding.
From what little I had heard and read, the IBM AS/400 (later known as iSeries, and now known as simply IBM i) sounded like a fascinating system. I knew that it had a rich OS stack that contained most of the services a program needs, and a JVM-style byte code format for applications that let it change from custom processors to Power Architecture without impacting users at all. It was supposedly business-critical and IBM-quality rock solid. But that was about it.
So when Software Engineering Radio episode 177 interviewed the i chief architect Steve Will, I was hooked. It turned out that IBM i was cooler than I imagined. Here are my notes on why I think that IBM i is one of the most interesting systems out there in real use.
I just read an interview with Steve Furber, the original ARM designer, in the May 2011 issue of the Communications of the ACM. It is a good read about the early days of the home computing revolution in the UK. He not only designed the ARM processor, but also the BBC Micro and some other early machines.
Continue reading “Steve Furber: Emulated BBC Micro on Archimedes on PC”
There is a new post at my Wind River blog, about some computing history. Wind River turns thirty this year, Simics twenty, and simulation for debug (and probably debug in general) turns sixty. Computing has come a long way.
I recently read the “Cubase64 White Paper” by Pex Tufvesson. It is a fantastic piece of retro computing, where he makes a Commodore 64 do real-time audio effects on a sampled piece of music. There is a YouTube video showing the demo in action. Considering how hard we worked in the early 1980s to make a computer make any kind of useful noise at all, this is an amazing feat. It is also a feat that I think would have been impossible at the time.
Continue reading “Cubase64 – Impressive Impossible Retro”
In the June 2010 issue of Communications of the ACM, as well as the April 2010 edition of the ACM Queue magazine, George Phillips discusses the development of a simulator for the graphics system of the 1977 Tandy-RadioShack TRS-80 home computer. It is a very interesting read for anyone interested in simulation, as well as a good example of just why this kind of old hardware is much harder to simulate than more recent machines.
Continue reading “Simple Machine, Hard to Simulate”
The EDSAC was an early computer in the mathematics laboratory at Cambridge in the UK. I have just read an old article on the machine and how it was programmed, from a 1998 issue of the IEEE Annals of the History of Computing.
There are many fascinating aspects to the machine and its utter simplicity, but one that struck me as I read the paper was that so many of the fundamental ideas of programming and practical computing were invented then and there. Indeed, the EDSAC was designed as a machine to experiment with programming, rather than as a machine for maximal computing performance.
Continue reading “EDSAC – First Bootloader and Assembler”
I have just found what almost has to be the first cycle-accurate computer simulator in history. According to the article “Stretch-ing is Great Exercise — It Gets You in Shape to Win” by Frederick Brooks (the man behind the Mythical Man-Month) in the January-March 2010 issue of IEEE Annals of the History of Computing, IBM created a simulator of the pipeline for the IBM 7030 “Stretch” computer developed from 1956 to 1961.
Continue reading “Pipeline Performance Simulator Anno 1960”
Wow. The eruption of Eyjafjallajökull in Iceland and the resulting ash cloud has had an effect that I would never ever have expected. A near-total closing down of the European airspace is such a drastic thing that nobody seems to have expected it. It has certainly not been included in the list of worst-case scenarios to plan for in company and government contingency plans. Where does this leave us? In a very interesting situation indeed. Worst-case, we will have to do without air travel for months.
Continue reading “Eyjafjallajökull is Showing us Something”
I am a regular listener to the Matt’s Today in History podcast. When Matt asked for contributions for this spring (in order to meet a goal of 500 podcasts before summer) I gave some thought to what I could contribute. Looking over some books, I found one suitable spring date: the launch of the IBM System/360 back in 1964. The resulting podcast is now live at Matt’s Today in History.
Please be kind about any mistakes… I am trying to paint a broad picture for a computer-history-ignorant audience here.
During the Christmas holidays, I got the chance to compare my oldest child’s brand new Lego set with some from the mid-1980s. It is quite striking how much larger the things in the sets have become, and how much more affordable (in relative terms) Lego has become since then.
Continue reading “Off-Topic: Old and New Lego”
Unless you have been living under a rock, the media deluge has made it clear that it was twenty years ago on November ninth that the Berlin Wall fell. Wow. Without a doubt the most momentous and important event that I have lived through. Not at all on the topic of this blog, but important enough to write some personal recollections about.
Continue reading “It was Twenty Years Ago Today”
IEEE Spectrum has an article in its May 2009 issue called “25 Microchips that shook the world“. Not long or deep, but an interesting mix of chips from the 1970s, 80s, 90s, and the 2000s. Recommended as light reading.
Yes, when does hardware acceleration make sense in networking? Hardware acceleration in the common sense of “TCP offload”. This question was answered by a very nicely reasoned “no” in an article by Mike O’Dell in ACM Queue called “Network Front-End Processors, Yet Again“.
Continue reading “When does Hardware Acceleration make Sense in Networking?”
I just rediscovered my first computer, a Sinclair ZX Spectrum, which I bought back in 1983 or 1984 (I have no trace of the exact date, unfortunately). It was a perfect machine to learn programming on, in my opinion, consisting of little more than a Z80 processor with memory, a bit-mapped display (with a famously odd-ball addressing scheme and color handling), and ultra-simple sound output and input. Most of my friends in the end bought Commodore 64 machines, which had more powerful graphics and sound hardware, but a processor that was much less fun to program.
The Spectrum came with a built-in BASIC interpreter that took up the bottom 16kB of the 64kB addressing space. The BASIC was actually fairly powerful and easy to use, and included a very fun programming textbook. I just reread that textbook, and it is quite strikingly well-written and manages to cover both basic computer-science-style programming and deep close-to-the-machine and real-time programming in a compact 150 pages. There is no credit to a particular author in the book I have (Swedish translation by a group of people at Ord & Form here in Uppsala), but an online scan credits Steven Vickers.
Continue reading “Book review: ZX Spectrum BASIC”
I just found a fairly interesting podcast that offers a nice example of how to do marketing for paper-based magazines using ephemeral digital technology. Ancient Warfare magazine has a podcast that accompanies each issue, where a group of history buffs discusses the theme of the current issue of the magazine.
Continue reading “Marketing a Paper Magazine with a Podcast”
I have an old Apple LaserWriter 12/640 PS network printer at home that I bought back in 1997. In those days, I had a PowerBook G3 at 266 MHz, Windows NT was new, and my work computer was one of Sweden’s first 300 MHz Pentium II machines… since then, my home machines have moved from MacOS 8 to Windows NT 4 to Windows 2000 to Windows XP and now Windows Vista 32- and 64-bit. But the trusty LaserWriter remains, keeps printing, and is still on its first toner cartridge!
However, moving to Vista has made the printing bit harder.
Continue reading “Off-Topic: Vista, Laserwriter 12/640 PS, and FoxIt”
I just read a fairly interesting book about the British Spitfire fighter plane of World War 2. The war bits were fairly boring, actually, but the development story was all the more interesting. I find it fascinating to read about how aviation engineers in the 1930s experimented and guessed their way from the slow, unwieldy biplanes of World War 1 and the 1920s to the sleek, very fast aircraft of 1940 and beyond. It is a story that also has something to tell us about contemporary software development and optimization.
Continue reading “The Details of Speed”
Strongly recommended thread at Stack Overflow: http://stackoverflow.com/questions/102714/what-was-your-first-home-computer asks about your first home computer. Some good product shots, and also some really funny replies.
As might be evident from this blog, I do have a certain interest in history, and the history of computing in particular. One aspect where computing and history collide in a not-so-nice way today is in the archiving of digital data for the long term. I just read an article at Forskning och Framsteg discussing the issues that the use of digital computer systems and digital non-physical documents creates for the long-term archival of our intellectual world of today. Basically, digital archives tend to rot in a variety of ways. I think virtual platform technology could play a role in preserving our digital heritage for the future.
Continue reading “Virtual Platforms for Late Hardware and the Winds of History”
Being a bit of a computer history buff, I am often struck by how most key concepts and ideas in computer science and computer architecture were invented in some form or another before 1970. And commonly by IBM. This goes for caches, virtual memory, pipelining, out-of-order execution, virtual machines, operating systems, multitasking, byte-code machines, etc. Even so, I have found a quite extraordinary example of this that actually surprised me in its range of modern techniques employed. This is a follow-up to a previous post, after having actually digested the paper I talked about earlier.
Continue reading “The 1970 rule strikes again: Virtual Platform Principles in 1967”
By means of a trip down virtualization history, I found a real gem in a 1969 paper called A program simulator by partial interpretation, by Kazuhiro Fuchi, Hozumi Tanaka, Yuriko Manago, and Toshitsugu Yuba of the Japanese Government Electrotechnical Laboratory. It was published at the second Symposium on Operating Systems Principles (SOSP) in 1969. It describes a system where regular target instructions are directly interpreted, and any privileged instructions are trapped and simulated. Very similar to how VMware does it for x86, or any other modern virtualization solution.
Continue reading “Virtual Platform by Virtualization Extensions — 1969”
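The partial-interpretation idea described in the 1969 paper can be sketched in a few lines: ordinary instructions run straight through, while privileged ones trap to the simulator, which emulates their effect on the virtual machine. This is a hypothetical toy model for illustration only; the instruction set, the privileged set, and the machine state here are all invented, not taken from the paper.

```python
class TrapToSimulator(Exception):
    """Would be raised by real hardware on a privileged instruction."""

# Invented example: which opcodes count as privileged in our toy machine.
PRIVILEGED = {"IO_WRITE", "HALT"}

def simulate_privileged(op, arg, state):
    """The simulator emulates the effect the privileged instruction
    would have had on the (virtual) machine."""
    if op == "IO_WRITE":
        state.setdefault("output", []).append(state["acc"])
    elif op == "HALT":
        state["halted"] = True

def run_guest(program, state):
    """Execute guest instructions; trap-and-emulate the privileged ones."""
    pc = 0
    while pc < len(program) and not state.get("halted"):
        op, arg = program[pc]
        if op in PRIVILEGED:
            simulate_privileged(op, arg, state)   # trap to the simulator
        elif op == "LOAD":                        # ordinary instructions
            state["acc"] = arg                    # execute directly
        elif op == "ADD":
            state["acc"] += arg
        pc += 1
    return state

state = run_guest(
    [("LOAD", 40), ("ADD", 2), ("IO_WRITE", None), ("HALT", None)],
    {"acc": 0},
)
print(state["output"])  # [42]
```

The key property, then as now, is that the common case (unprivileged instructions) pays no simulation overhead beyond dispatch, while the rare privileged case is fully controlled by the simulator.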