When I recently turned 50, a friend of mine gave me a book that is about as old as me: Timesharing System Design Concepts, by Richard W. Watson. The copyright date is 1970, but the publishing date found online is 1971. The book covers the hardware support and software techniques needed to provide multiple users with simultaneous access to a computer system, typically via remote teletype terminals. It is a wonderful book, reflecting the state of computer technology around 1970. It shows both how many of the basic concepts we still use today were already in place back then, and at the same time just how far computing has evolved since. Continue reading “Timesharing System Design Concepts (1970)”
Last week was spent at the Design Automation Conference (DAC) in Las Vegas. I had a presentation and poster in the Designer/IP track about Clouds, Containers, and Virtual Platforms, and worked in the Intel Simulation Solutions booth on the show floor. The DAC was good as always: I met many old friends in the industry and checked out the latest trends in EDA (hint: the same trends as everywhere else). One particularly nice surprise was a book (the printed type, not the Vegas “book” that means something else entirely). Continue reading “DAC 2019 – Cloud, a Book, an Award, and More”
I was at the DAC 2016 conference and exhibition in Austin, Texas, a few weeks ago. On the show floor, going by the S2C booth, I was roped in and got a paper copy of the book Prototypical. The copy was even signed by the authors Daniel Nenni and Don Dingee! Nice touch! The book is more than just marketing material – it provides a good overview of the origins and history of FPGA prototyping, and it was enjoyable to gain more insight into this fairly important part of the EDA tools ecosystem.
I wrote a blog post about continuous integration and Simics that has been published both on the Wind River corporate blog and on the Elsevier Computer Science Connect blog. The two texts are essentially the same; I had the good fortune to get the post up in multiple places. The reason it is at Elsevier is to help promote our soon-to-be-released book about virtual platforms and simulation (and a little bit about Simics), and hopefully we will reach a larger audience with both messages: CI with Simics is a great idea, and the book is a great book to buy.
For the past six months I have not been doing much blogging at all, neither here nor on the Wind River blog. The reason is that I have been directing my writing energy into a textbook about Simics, together with Daniel Aarno at Intel. Last year, Daniel and I worked on an Intel Technology Journal issue on Simics. The ITJ issue was a first step on the way to the book, collecting several articles about Simics usage at Intel and elsewhere. The book itself will be a much more detailed description of Simics: how it works, and why it works the way it does.
Paranormality – Why we believe the impossible, written by Professor Richard Wiseman, manages to combine four stories into a single book. Wiseman is a well-known name in skeptic circles, and this book does not disappoint in the debunking department. But it also uses the investigation of paranormal phenomena as a way to explain how our brains work. And then some. It all makes for a very satisfying read.
Trust Me, I’m Lying – Confessions of a Media Manipulator by Ryan Holiday is a brilliant book about the online media landscape, and how it is driving public discourse in a very bad direction. Ryan has an interesting background, having worked in marketing and been part of the problem he describes. In his work, he has exploited the weaknesses of the new media landscape to get stories about his clients into blogs, the press, and often national television, bringing them attention and ultimately business. In this book, he describes what he did, how he did it, and why we as a society have a big problem. It has changed the way I read online media, and made me a lot more critical of things I previously did not take notice of.
The September 2013 issue of the Intel Technology Journal (which actually arrived in December) is all about Simics. Daniel Aarno of Intel and I served as the content architects for the issue, which meant that we managed the contributed articles from various sources and wrote an introductory article about Simics and its usage in general. It has taken a while to get this journal issue out, and now that it is done it feels just great! I am very happy about the quality of all ten contributed articles, and reading the final versions of them actually taught me some new things you can do with Simics! I already wrote about the issue in a Wind River blog post, so in this, my personal blog, I want to be a little bit more, well, personal.
Debugging – The 9 Indispensable Rules for Finding Even the Most Elusive Software and Hardware Problems by David Agans was published in 2002, based on several decades of practical experience debugging embedded systems. Compared to the other debugging book I read this summer, Debugging is much more a book for the active professional currently working on embedded products: a guidebook for the practitioner rather than a textbook for students who need to learn the basics.
This blog post is a review of the book “If I Only Changed the Software, Why Is the Phone on Fire?” (see more information on Amazon, for example) by Lisa Simone. The book was released in 2007 under the Elsevier Newnes imprint. It is a book about debugging embedded systems, written in a murder-mystery style with a back story about the dynamics of an embedded development team. It sounds strange, but it works well.
Ever since I started using the Amazon Kindle late last year (as an app for my Android devices), I have found myself suddenly reading fiction again after a decade mostly spent on factual books. Recently, I have read through quite a few books in the category of self-published military science fiction. All good reads, but the books have started to blur together when I look back at them. There are some interesting common themes and plots that make them hard to keep apart, especially among those written in recent years.
I just finished reading Society without God, by American sociologist Phil Zuckerman. The book came out back in 2008, but I heard about it recently on a skeptic podcast and I felt I just had to buy it. Phil Zuckerman spent a year in Denmark in 2006, and also visited Sweden during that time to perform interviews with a wide sample of what seems to me to be typical Swedes and Danes, trying to understand their attitude towards god and religion. His conclusion is that the Nordic countries today are a special little area of deep secularism in a world that is mostly religious and apparently growing more religious recently. Even in fairly secularized Western Europe, the Nordic countries stand out (or at least Denmark and Sweden do, in his research). So what? For a Swede like myself this is pretty obvious… but when you combine this with the fact that the standard of living and overall feeling of security and quality of life in Denmark and Sweden is very high, Zuckerman finds a strong counter to a certain argument brought forth by Christian conservatives in the US…
Peter Nowak argues that most of the major technological advances of the last sixty years have stemmed from the trio of billion-dollar industries that cater to our basest impulses. From Saran Wrap to aerosols, digital cameras to cold medicine and GM foods to Google, many of the gadgets and conveniences we enjoy today can be traced back to either the porn, military or fast food industry.
This certainly sounded interesting. And the book was a good read. However, it was not a great read.
I am just finishing reading the chapters of the Processor and System-on-Chip Simulation book (to which I contributed a chapter), and just read through the chapter about the Tensilica instruction-set simulator (ISS) solutions, written by Grant Martin, Nenad Nedeljkovic, and David Heine. They have a slightly different architecture from most other ISS solutions, since they have an inherently variable target in the configurable and extensible Tensilica cores. However, the more interesting part of the chapter was the discussion of system modeling beyond the core: in particular, how they deal with interrupts to the core in the context of a temporally decoupled simulation.
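To make the interrupt issue concrete: in a temporally decoupled simulation, each core runs ahead for a time quantum before the scheduler moves on, so an interrupt raised by a device mid-quantum is typically only observed at the next quantum boundary. The sketch below is a minimal, hypothetical illustration of that effect; all names are my own invention, not taken from the Tensilica ISS or any other product.

```python
# Hypothetical sketch of temporal decoupling. Each core executes for a
# fixed time quantum; interrupts posted while a core is "ahead" in its
# quantum are deferred and only taken at the next scheduling slice.

class Core:
    def __init__(self, name):
        self.name = name
        self.time = 0            # this core's local virtual time
        self.pending_irq = None  # interrupt waiting to be delivered
        self.taken = []          # (irq, time) log of delivered interrupts

    def run(self, quantum):
        # Deliver any pending interrupt at the start of the slice,
        # then execute instructions for `quantum` time units.
        if self.pending_irq is not None:
            self.taken.append((self.pending_irq, self.time))
            self.pending_irq = None
        self.time += quantum

class Scheduler:
    def __init__(self, cores, quantum):
        self.cores = cores
        self.quantum = quantum

    def post_irq(self, core, irq):
        # A device raises an interrupt; with temporal decoupling the
        # target core may already have run past the point where the
        # interrupt "happened", so delivery waits for its next slice.
        core.pending_irq = irq

    def step(self):
        for core in self.cores:
            core.run(self.quantum)

cpu = Core("cpu0")
sched = Scheduler([cpu], quantum=1000)
sched.step()                # cpu0 runs its first quantum, t = 0..1000
sched.post_irq(cpu, irq=5)  # device fires while cpu0 is between slices
sched.step()                # IRQ 5 is only taken at the t=1000 boundary
```

The latency this introduces (up to one quantum) is the classic trade-off of temporal decoupling: longer quanta mean faster simulation but coarser interrupt timing, which is exactly the kind of design decision the chapter discusses.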
Freescale has now released the collected, updated, and restyled book version of the article series on embedded multicore that I wrote last year together with Patrik Strömblad of Enea, and Jonas Svennebring and John Logan of Freescale. The book covers the basics of multicore software and hardware, as well as operating systems issues and virtual platforms. Obviously, the virtual platform part was my contribution.
The book “Taxonomies for the Development and Verification of Digital Systems”, edited by Brian Bailey, Grant Martin, and Thomas Andersson, was published in 2005 by Springer Verlag. It is a legacy of the now-defunct VSIA, and presents an attempt to bring order to the nomenclature and taxonomies of the chip design field (its scope is defined to be broader than that, but in essence the book is mostly about SoC design).
The “Handbook of Real-Time and Embedded Systems” (ToC, Amazon, CRC Press) is now out. My university research colleague and friend Andreas Ermedahl and I have written a chapter on worst-case execution time analysis. We cover some of the theory and techniques, but we also try to discuss practical experience from actual industrial use. Static, dynamic, and hybrid techniques are all covered.
I just got my personal copy, and my first impression of the book overall is very positive. The contents seem quite practical to a large extent, not as academic as one might have feared. Do check it out if you are into the field. It is not a collection of research papers, but rather instructive chapters informed by solid research and written with applications in mind.