Carbon Design Systems has been quite busy lately with a flurry of blog posts about various aspects of virtual prototype technology. Mostly good stuff, and I tend to agree with their push for mixing fast timing-simplified models with RTL-derived cycle-accurate models. There are exceptions to this, in particular exploratory architecture and design work where AT-style models are needed. Recently, they posted about their new Swap ‘n’ Play technology, which is an old, proven idea that has now been reimplemented using ARM fast simulators and Carbon-generated ARM processor models.
Swap ‘n’ play is the technique of taking the machine state out of a fast virtual platform and transferring it to a detailed virtual platform. This lets a user boot a machine and position a workload using the fast platform, and then obtain detailed performance results (or other information from a detailed run, such as chasing low-level timing bugs) using the detailed model. This is an old and proven idea, and I think it was in fact the main reason for the first implementation of “virtual platform checkpointing” that I know of, in SimOS in the early 1990s (see Stanford technical report CSL-TR-94-631). It is also something I have seen implemented in Simics for at least a decade, and used by computer architecture researchers.
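To make the idea concrete, here is a minimal sketch of such an abstraction-level swap. All class and field names are hypothetical illustrations of the general technique, not any vendor’s actual API: the key point is that only the architectural state (PC, registers, memory) is captured and transferred, while the detailed model’s micro-architectural state starts cold.

```python
# Sketch of "swap 'n' play": serialize the architectural state of a fast
# functional model and restore it into a detailed (cycle-accurate) model.
# All names here are illustrative, not any vendor's actual API.

from dataclasses import dataclass, field

@dataclass
class MachineState:
    """Architectural state that both abstraction levels agree on."""
    pc: int = 0
    registers: dict = field(default_factory=dict)
    memory: dict = field(default_factory=dict)  # sparse address -> byte value

class FastModel:
    """Timing-simplified model: runs fast, tracks only architectural state."""
    def __init__(self):
        self.state = MachineState()

    def run(self, n_instructions):
        # Stand-in for fast functional execution: just advance the PC.
        self.state.pc += 4 * n_instructions

    def checkpoint(self):
        # A checkpoint captures exactly the architectural state.
        return MachineState(self.state.pc,
                            dict(self.state.registers),
                            dict(self.state.memory))

class DetailedModel:
    """Cycle-accurate model: slow, but tracks timing (cycles, stalls, ...)."""
    def __init__(self):
        self.state = MachineState()
        self.cycles = 0

    def restore(self, chkpt):
        # Only architectural state transfers; micro-architectural state
        # (pipelines, caches) starts cold in the detailed model.
        self.state = chkpt
        self.cycles = 0

# Boot and position the workload on the fast model...
fast = FastModel()
fast.run(1_000_000)

# ...then swap to the detailed model at the point of interest.
detailed = DetailedModel()
detailed.restore(fast.checkpoint())
```

Note the cold-start issue hidden in `restore`: since caches and pipelines are not part of the transferred state, a real detailed run needs some warm-up before its measurements are trustworthy.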
In an ARM IQ article about Carbon’s technology, Bill Neifert introduces the technology in a way that does not acknowledge the long history of the idea:
Carbon Design Systems has recently introduced a best of both worlds approach to virtual platforms. Relying upon a new technology which Carbon calls “Swap ‘n Play”, Carbon’s newest generation of virtual platforms now enable a user to run a platform at 100s of MIPS using ARM Fast Models and then swap over to a 100% accurate, Carbon model implementation of that same platform at any software breakpoint.
It is certainly new to Carbon and the ARM Fast Models, but phrasing it this way makes it sound as if the idea itself is brand new.
Another proven use of abstraction-level swapping is also described in the article: sampled simulation for computer architecture evaluations:
Instead of actually swapping however, the system continues running in the fast functional mode. The checkpoint manager is used to create additional Swap ‘n Play checkpoints at set time increments, say every 0.1 seconds of real-time. At each time increment, the checkpoint is created and execution continues in the Fast Model system. … Each one of these checkpoints can be executed as a separate, 100% accurate simulation and distributed to a farm of computers.
This somewhat simplifies the problem, as computer architecture experience shows that sampling at regular intervals tends to skew results. Tools like SimPoint have long been used to do this sampling in a more sophisticated and effective way. However, as a simplified explanation introducing the idea to a new audience, this makes sense.
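The periodic scheme described in the quote can be sketched in a few lines. This is a toy illustration under my own naming, not any actual tool’s API: the fast model keeps running, a checkpoint is taken at each fixed simulated-time increment, and each checkpoint then seeds an independent detailed simulation that can be farmed out to a separate machine.

```python
# Hedged sketch of periodic checkpoint sampling: advance the fast model in
# fixed time increments, checkpointing after each step. Each checkpoint can
# seed one independent detailed simulation job. Names are illustrative only.

def periodic_checkpoints(run_fast, take_checkpoint, n_samples, interval):
    """Run the fast model n_samples times, checkpointing after each step."""
    checkpoints = []
    for _ in range(n_samples):
        run_fast(interval)                    # fast functional execution
        checkpoints.append(take_checkpoint()) # snapshot for a detailed job
    return checkpoints

# Toy stand-ins for the fast model and its checkpoint mechanism:
clock = {"t": 0.0}
def run_fast(dt): clock["t"] += dt
def take_checkpoint(): return {"time": clock["t"]}

# 1.0 s of simulated execution, one checkpoint every 0.1 s -> 10 detailed jobs
jobs = periodic_checkpoints(run_fast, take_checkpoint, n_samples=10, interval=0.1)
```

The fixed `interval` is exactly where the skew risk lives: if the interval happens to alias with periodic program behavior, the samples over-represent some phases, which is what SimPoint-style phase selection avoids.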
While it is certainly interesting to see Carbon implement classic checkpointing and abstraction-level transfers, it seems that they have not read, or do not acknowledge, the many available publications on this topic from the computer architecture field. EDA is in many ways a world of its own, with surprisingly few bridges to the research done in computer architecture, for example. That technologies are crossing over is good, but it is also good to be aware of the history and legacy of a technology before billing it as “new”. It is useful and great to have, certainly, but not original or new.
Update: I did receive some feedback from Carbon about this topic, and they were aware of the prior work. It was just hard to fit into the format of the IQ article, which makes some sense given my own experience with popular editorials. Still, I think pointing out that this is an old, proven technology with almost two decades of active use behind it would have made the presentation in the IQ article and their blog a bit deeper.