Back when I was a PhD student working on worst-case execution-time (WCET) analysis, one of the leading groups researching the topic was the “Saarbrücken gang” led by Professor Reinhard Wilhelm. Last year, Professor Wilhelm published a retrospective look at their WCET work in the Communications of the ACM. It is a really interesting history write-up from the perspective of the Saarbrücken group.
Execution Time is Random, How Useful
When I was working on my PhD in WCET – Worst-Case Execution Time analysis – our goal was to predict, as precisely as possible, the exact number of cycles that a processor would take to execute a certain piece of code. We and other groups designed analyses for caches, pipelines, even branch predictors, along with ways to take into account information about program flow and variable values.
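To give a flavor of what that kind of analysis boils down to, here is a minimal sketch in C of the final calculation step for a single-loop program. The numbers are made up for illustration; in a real tool, the per-block cycle bounds would come from the cache and pipeline analyses, and the loop bound from the flow analysis:

```c
/* Minimal sketch of the final calculation step in static WCET
 * analysis for a single-loop program. The per-block cycle bounds
 * and the loop bound are made-up placeholders for what the
 * hardware (cache/pipeline) and flow analyses would produce. */
#include <stdio.h>

enum { BB_ENTRY, BB_LOOP_HEAD, BB_LOOP_BODY, BB_EXIT, NUM_BB };

/* Worst-case cycles per basic block, from the hardware analysis. */
static const unsigned bb_cycles[NUM_BB] = { 12, 4, 30, 8 };

int main(void)
{
    const unsigned loop_bound = 100; /* max iterations, from flow analysis */

    /* Structural (tree-based) bound: the loop head executes once more
     * than the body, for the iteration that exits the loop. */
    unsigned wcet = bb_cycles[BB_ENTRY]
                  + loop_bound * (bb_cycles[BB_LOOP_HEAD] + bb_cycles[BB_LOOP_BODY])
                  + bb_cycles[BB_LOOP_HEAD]
                  + bb_cycles[BB_EXIT];

    printf("WCET bound: %u cycles\n", wcet); /* 12 + 100*34 + 4 + 8 = 3424 */
    return 0;
}
```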
The complexity of modern processors – even a decade ago – was such that predictability was very difficult to achieve in practice. We used to joke that a complex enough processor would be like a random number generator.
Funnily enough, it turns out that someone has been using processors just like that. Guess that proves the point, in some way.
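To illustrate the idea with a sketch of my own (assuming POSIX clock_gettime; this is not whatever those folks actually built): time a fixed workload over and over and keep the low bit of each measurement. The jitter comes from exactly the cache, pipeline, and interrupt effects that make WCET analysis so hard. A real entropy source would also debias and health-test the raw bits rather than use them as-is.

```c
/* Sketch of harvesting execution-time jitter as randomness.
 * Assumes POSIX clock_gettime(); a real entropy source would
 * debias and health-test the raw bits rather than use them as-is. */
#include <stdio.h>
#include <time.h>

static long elapsed_ns(struct timespec t0, struct timespec t1)
{
    return (t1.tv_sec - t0.tv_sec) * 1000000000L + (t1.tv_nsec - t0.tv_nsec);
}

static unsigned char jitter_byte(void)
{
    unsigned char b = 0;
    for (int bit = 0; bit < 8; bit++) {
        struct timespec t0, t1;
        volatile unsigned sink = 0;
        clock_gettime(CLOCK_MONOTONIC, &t0);
        for (unsigned i = 0; i < 10000; i++) /* fixed workload */
            sink += i * i;
        clock_gettime(CLOCK_MONOTONIC, &t1);
        /* The low bit of the elapsed time flips with cache, pipeline,
         * and interrupt noise -- the unpredictability we joked about. */
        b = (unsigned char)((b << 1) | (elapsed_ns(t0, t1) & 1));
    }
    return b;
}

int main(void)
{
    for (int i = 0; i < 16; i++)
        printf("%02x", jitter_byte());
    printf("\n");
    return 0;
}
```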
Wind River Blog: “IMA on Simics”
I have a fairly lengthy new post up at my Wind River blog. This time, I interview Tennessee Carmel-Veilleux, a Canadian MSc student who has done some very smart things with Simics. His research is in IMA, Integrated Modular Avionics, and how to make that work on multicore.
Real-time control when cores become free
A very interesting idea that has been bandied around for a while in manycore land is the notion that in the future, we will see a total inversion of today’s cost intuition for computers. Today, we are all versed in the idea that processor cores and processing time are quite precious, while memory is free. For best performance, you need to care about the cache system, but in the end, the goal is to keep those processor pipelines as busy as possible. Processors have traditionally been the most expensive part of a system, and ideas such as Integrated Modular Avionics were invented to make the best use of a resource perceived as rare and expensive…
But is that really always going to be true? Is it reasonable to think of CPU cores as being free but other resources as expensive? And what happens to program and system design then?
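As a sketch of what that inversion could look like in code (assuming Linux with the GNU pthread_setaffinity_np extension; read_sensor() and drive_actuator() are hypothetical stand-ins for real device access): give a control task a whole core and let it spin-poll, trading a “wasted” core for interrupt-free, low-jitter response.

```c
/* Sketch: dedicate a whole core to one real-time control loop and
 * spin-poll, rather than sharing a core via interrupts and scheduling.
 * Linux/GNU-specific (pthread_setaffinity_np); read_sensor() and
 * drive_actuator() are hypothetical stand-ins for device access. */
#define _GNU_SOURCE
#include <pthread.h>
#include <sched.h>
#include <stdio.h>

static int read_sensor(void) { return 42; }        /* stub for illustration */
static void drive_actuator(int out) { (void)out; } /* stub for illustration */

static void *control_loop(void *arg)
{
    long iterations = *(long *)arg;
    /* No interrupts, no context switches: sample-to-actuation latency
     * is just the loop body, every single iteration. */
    for (long i = 0; i < iterations; i++)
        drive_actuator(read_sensor());
    return NULL;
}

int main(void)
{
    long iterations = 1000000;
    pthread_t tid;
    cpu_set_t set;

    pthread_create(&tid, NULL, control_loop, &iterations);

    /* Pin the control thread to core 3, and let nothing else run there. */
    CPU_ZERO(&set);
    CPU_SET(3, &set);
    pthread_setaffinity_np(tid, sizeof(set), &set);

    pthread_join(tid, NULL);
    printf("control loop done\n");
    return 0;
}
```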
Worst-Case Execution Time Survey Article in TECS
I just got another article published! In the April 2008 issue of the ACM Transactions on Embedded Computing Systems (TECS), we have an article called “The worst-case execution-time problem – overview of methods and survey of tools”. “We” is kind of an understatement: the article has fifteen authors from three continents, and it presents an overview of the state of the field of WCET (Worst-Case Execution Time) analysis. The article was started back in 2005, submitted in 2006, accepted in January 2007, and finally appeared in 2008. It is probably my last shot in the WCET area, where I did my PhD thesis (please see my list of publications for an idea of what all of that is about).
You can find the article at the ACM portal, or at the MRTC publications database in Västerås.
Handbook of Real-Time and Embedded Systems
The “Handbook of Real-Time and Embedded Systems” (ToC, Amazon, CRC Press) is now out. My university research colleague and friend Andreas Ermedahl and I have written a chapter on worst-case execution time analysis. We talk some about the theory and the techniques, but we also try to discuss practical experience from actual industrial use. Static, dynamic, and hybrid techniques are all covered.
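To give a taste of the dynamic side, here is a minimal sketch of my own (not code from the chapter, assuming POSIX clock_gettime, with work_under_test() standing in for the real code): measurement-based timing runs the code many times and records the high-watermark, which is an observed maximum rather than a guaranteed worst case.

```c
/* Sketch of measurement-based ("dynamic") execution-time analysis:
 * run the code under test many times and keep the high-watermark.
 * This is an observed maximum, not a guaranteed worst case. */
#include <stdio.h>
#include <time.h>

static void work_under_test(void) /* stand-in for the real code */
{
    volatile unsigned sink = 0;
    for (unsigned i = 0; i < 10000; i++)
        sink += i;
}

int main(void)
{
    long high_watermark_ns = 0;

    for (int run = 0; run < 1000; run++) {
        struct timespec t0, t1;
        clock_gettime(CLOCK_MONOTONIC, &t0);
        work_under_test();
        clock_gettime(CLOCK_MONOTONIC, &t1);
        long dt = (t1.tv_sec - t0.tv_sec) * 1000000000L
                + (t1.tv_nsec - t0.tv_nsec);
        if (dt > high_watermark_ns)
            high_watermark_ns = dt;
    }
    printf("observed maximum: %ld ns\n", high_watermark_ns);
    return 0;
}
```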
I just got my personal copy, and my first impression of the book overall is very positive. The contents seem quite practical to a large extent, not as academic as one might have feared. Do check it out if you are into the field. It is not a collection of research papers, but rather a set of instructive chapters informed by solid research and written with applications in mind.