The Design and Verification Conference Europe (DVCon Europe) took place back in late October 2020. In a normal year, we would add “in München, Germany” to the end of that sentence. But that is not how things were done in 2020. Instead, it was a virtual conference with worldwide attendance. Here are my notes on what I found most interesting at the conference (for various reasons, this text comes out with a bit of a delay).
Developing Hardware like Software
The conference keynote speakers Moshe Zalcberg from Veriest, Vicky Mitchell from ARM in Austin, Matthias Traub from Volkswagen’s new Car.Software organization, and Mike Mayberry from Intel all addressed the importance of software to hardware design. Without software, hardware is just a rather expensive transformation of sand into shiny objects. The speakers also addressed hardware verification and validation – as that is the core subject of the conference.
A recurring theme was asking how hardware could be developed in a way that is more like software. Software design usually moves faster than hardware, and despite the obvious differences in how the end product is packaged and delivered, there might be aspects of the design process where hardware can learn from software. The speakers offered a variety of perspectives on this, but to a large extent it boils down to looking at Agile methods.
How do you translate that to hardware? One way is to think of the hardware design source code (be it RTL or high-level synthesizable code) as “any other” software. It is quite possible to develop hardware iteratively and in sprints. The requirements on a block can be broken down into discrete features and implemented from a continuously evolving priority list (i.e., Agile sprints). A hardware block can be kept continuously deliverable by first building a minimum viable product and then adding features to it in a way that keeps the whole unit working at all times.
Obviously, you cannot expect to go all the way to tape-out and fabrication of a chip on each sprint… instead, the deliverable coming out of the hardware team is “something that another team can use”. For an individual hardware block, that other team could be the integrators for the next level of the design. Vicky Mitchell from ARM talked about this process in the context of the system fabric that her team developed – each sprint would result in a drop to the team that integrates the fabric with processor cores and other controllers.
Thus, one key to Agile in hardware is a clear modularization of the design with relatively stable interfaces between the blocks. This is pretty much how it works in software too – without stable interfaces at appropriate points in the design, it is very difficult to work effectively. Mike Mayberry noted something very important and sometimes underappreciated: “if you put the interfaces in the wrong places, you make your life hell.” This is particularly pertinent as we start to disaggregate designs physically as well as logically, with chiplets and advanced packaging technologies.
What Happens to Testing?
One of the keys to Agile methods is quick testing turnaround. As you iterate on the code, you need to run tests as soon as possible after each change in order to catch issues quickly. Software techniques to deal with this include extensive unit testing and multiple levels of integration testing, all of which can in principle be applied to hardware as well. That said, hardware testing has a particular challenge compared to software testing: it takes much more time to run. RTL simulation is fundamentally slow, and while FPGAs and emulators speed things up, they also carry a very significant cost, both in capital expenditure and in the time to set up each run.
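None of the talks prescribed a particular framework for this, but to make the unit-testing idea concrete, here is a minimal sketch of what a unit test for a single RTL block can look like in Python, using the open-source cocotb framework. The device under test and its port names (clk, rst, count) are hypothetical, purely for illustration.

```python
# Minimal sketch of an RTL unit test using cocotb. Assumed setup: a simple
# counter module with clk, rst, and count ports (all names hypothetical).
import cocotb
from cocotb.clock import Clock
from cocotb.triggers import RisingEdge

@cocotb.test()
async def counter_counts_up(dut):
    """After reset is released, the counter should increment every cycle."""
    cocotb.start_soon(Clock(dut.clk, 10, units="ns").start())

    # Hold reset for two cycles, then release it.
    dut.rst.value = 1
    for _ in range(2):
        await RisingEdge(dut.clk)
    dut.rst.value = 0
    await RisingEdge(dut.clk)

    # Check that the count advances by one on each clock edge.
    previous = int(dut.count.value)
    for _ in range(5):
        await RisingEdge(dut.clk)
        current = int(dut.count.value)
        assert current == previous + 1, f"expected {previous + 1}, got {current}"
        previous = current
```

Tests like this can run on every commit in a normal continuous-integration setup, which is exactly the quick-turnaround loop that Agile methods depend on.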
To solve this, several speakers brought up Machine Learning and Artificial Intelligence (ML/AI). Such techniques have been applied both in commercial EDA tools and in internal workflows at companies like ARM – using AI to shorten test times by intelligently selecting the tests to run for a particular code commit or quick feature test. Eventually, all tests do have to be run, both to catch the last few errors and to check that the quick runs are selecting the right subsets of tests, but for much of the work, subset testing is sufficient.
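The talks stayed at the concept level, but the core mechanism is easy to sketch. Below is a toy version in Python using scikit-learn: a classifier is trained on (entirely made-up) historical data about which kinds of commits broke which tests, and is then used to pick the most failure-prone subset of tests for a new commit. All test names, features, and numbers are invented for illustration.

```python
# Hedged sketch of ML-driven test selection: train a classifier on historical
# (commit features, test outcome) pairs, then run only the tests the model
# considers most likely to fail for a new commit. All data here is made up.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy historical record: one row per (commit, test) pair.
# Features: [lines changed, files changed, touched the block under test (0/1)]
X = np.array([
    [500, 12, 1], [20, 1, 0], [300, 7, 1], [5, 1, 0],
    [150, 3, 1], [40, 2, 0], [800, 20, 1], [10, 1, 0],
])
y = np.array([1, 0, 1, 0, 1, 0, 1, 0])  # 1 = the test failed for that commit

model = LogisticRegression().fit(X, y)

# For a new commit, score every test and pick the most failure-prone subset.
tests = ["fabric_smoke", "fabric_full_random", "cache_coherency", "reset_seq"]
new_commit = np.array([
    [120, 4, 1],  # fabric_smoke: commit touches the fabric block
    [120, 4, 1],  # fabric_full_random
    [120, 4, 0],  # cache_coherency: unrelated block
    [120, 4, 0],  # reset_seq
])
fail_prob = model.predict_proba(new_commit)[:, 1]
budget = 2  # how many tests the quick run has time for
selected = [t for _, t in sorted(zip(fail_prob, tests), reverse=True)[:budget]]
print("quick-run subset:", selected)
# A periodic full run still executes every test, both to catch what the model
# misses and to generate fresh training data for recalibration.
```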
AI can also be used to predict failures by looking at code as it is being checked in, to enable very quick feedback to designers in more of a smart coding environment than a classic EDA flow (some ideas in this direction were presented at DVCon Europe 2018). Mike Mayberry talked about applying AI (and computational brute force) to determine the best interface points in a design by simply exploring more possibilities than a human designer would.
The basis for ML is data – and if companies collect good data on their development and testing and on how it translates into successful features and bugs, it becomes possible to apply data analytics to the hardware development process. Not just to determine current issues, but also to predict the resources required for the next project, or to change how it is done in order to reach the final goal more efficiently!
To speed up hardware testing, companies can use execution resources from cloud computing. Basically, running RTL tests “in the cloud” allows a company to quickly scale up execution as needed. Presumably, the same approach works inside an internal enterprise infrastructure.
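As a rough sketch of the scale-out idea, consider a regression suite fanned out over a worker pool, here using nothing but the Python standard library. Locally the workers are processes; in the cloud, the same pattern maps onto a fleet of machines behind a batch or queue service. The run_one_test function is a hypothetical stand-in for launching a simulator.

```python
# Hedged sketch: fan a regression suite out over a pool of workers.
# run_one_test is a hypothetical stand-in for invoking the RTL simulator.
from concurrent.futures import ProcessPoolExecutor, as_completed

TESTS = [f"regression/test_{i:03d}" for i in range(200)]

def run_one_test(test_name: str) -> tuple[str, bool]:
    # Placeholder for the real simulator invocation, e.g. something like:
    #   result = subprocess.run(["run_sim.sh", test_name])
    #   return test_name, result.returncode == 0
    return test_name, True

if __name__ == "__main__":
    # More workers means shorter wall-clock time - elastic cloud capacity
    # lets this number grow with demand instead of being fixed by the size
    # of an on-premise compute farm.
    with ProcessPoolExecutor(max_workers=32) as pool:
        futures = [pool.submit(run_one_test, t) for t in TESTS]
        failed = []
        for future in as_completed(futures):
            name, ok = future.result()
            if not ok:
                failed.append(name)
    print(f"{len(TESTS) - len(failed)} passed, {len(failed)} failed")
```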
Mike Mayberry brought the discussion up to the system level and beyond – towards the end of his talk, he pointed out that no matter how much we test and verify in the hardware design process, our customers will “test” our designs many orders of magnitude more than is ever possible pre-silicon.
Thus, a key part of building robust systems in practice is to make systems that can keep functioning even in the face of errors. The hardware will not be perfect, and the system needs to be able to handle and work around errors that happen at runtime.
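The talk was about hardware-level mechanisms (think error-correcting codes and redundancy), but the underlying principle is the same detect-and-recover pattern familiar from software. A trivial sketch of that pattern, with an entirely hypothetical read_sensor operation and error type:

```python
# Hedged sketch of the detect-and-recover principle in software terms: retry
# on transient errors, and fall back to a degraded mode rather than failing
# the whole system. read_sensor and the error type are hypothetical.
import random

class TransientHardwareError(Exception):
    """Stand-in for a detectable runtime fault (e.g., a flagged ECC error)."""

def read_sensor() -> float:
    # Hypothetical operation that occasionally hits a transient fault.
    if random.random() < 0.3:
        raise TransientHardwareError("bit flip detected")
    return 21.5

def robust_read(retries: int = 3, fallback: float = 20.0) -> float:
    # Detect, retry, and finally degrade gracefully instead of crashing.
    for _ in range(retries):
        try:
            return read_sensor()
        except TransientHardwareError:
            continue  # transient fault: try again
    return fallback  # persistent fault: keep the system running, degraded

print(robust_read())
```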
Automotive and Software
DVCon Europe is based in Germany, and no conference in Germany is complete without representation from the automotive sector. This year, Volkswagen provided the automotive keynote speaker – specifically, from VW’s newly formed Car.Software organization. Car.Software represents a big change in how the classic automotive companies think about the software in their products. Software used to be spread out across the organization and often subcontracted to suppliers – but the future direction is to bring much more of the software in-house and to centralize its development.
Volkswagen is big enough that it makes sense for them to build their own software stack to be used across all vehicle electronics and across their various brands and types of cars. Software is finally being recognized as a key competence and a key differentiating factor (which could arguably be considered an effect of Tesla, but the trend has been going on since long before Tesla made an impression on the market). In a way, it is the same journey that the telecom industry went through decades ago – from being primarily hardware companies to being primarily software companies.
The hardware is still hugely important. Automotive companies have to optimize the computing hardware and power management to maximize the range of electric vehicles. The computing architecture of cars is going through a revolution as compute becomes centralized in a few powerful nodes instead of the traditional highly distributed architecture. Building custom chips for important differentiating functionality is becoming a core competency alongside the software.
High-Level Synthesis
Outside of the keynotes, one of the big trends seen at DVCon Europe was the use of high-level synthesis (HLS). HLS is used to make designers more productive – raising the level of abstraction means there is less code to write to generate the hardware. HLS also simplifies architectural exploration, as changes are easier to make (less code to change). It can even affect verification, by simulating at least some aspects of the hardware at the HLS level rather than at RTL.
Virtual Platforms?
Over the past few years, DVCon Europe has been one of the best conferences for virtual platform topics. This year, that was rather less pronounced. The papers have generally moved in the direction of DVCon (US), where the focus is more on hardware design and hardware validation at the RTL level. Still, Synopsys did present a tutorial on VP technology, and there were a couple of papers. For 2021, it would be nice to see more VP topics on the program.
How Well Did Virtual Work This Time?
Turning conferences and other traditionally in-person events into virtual events is a difficult process, but it seems to me that we are getting better at it. In my opinion, DVCon Europe worked better than the past summer’s Design Automation Conference – which is reasonable since the DVCon organizers had more time to prepare and learn from previous experience.
Attending DVCon gave that refreshing feeling of going to a conference and seeing new things and new perspectives on familiar topics, as well as having discussions with other people interested in software and hardware. Of course, you did not get the after-hours interactions over beers that you would expect at a physical conference, but attending the conference for a few full work days still functioned like a break from normal work. Discussions during the talks were still there.
DVCon used Zoom to present all talks, along with the CONflux platform from Conference Catalysts for the schedule, speaker bios, and access to slides and papers. All talks were presented live, with video of the speaker shown alongside the slides. The keynote sessions used the dedicated Zoom Q&A function to handle questions to the speakers, while the regular talks used the chat window or simply people talking directly with the speakers.
DVCon also used a Mozilla Hubs-based 3D world for the exhibition, poster sessions, and some networking sessions. This worked surprisingly well, and at its best moments you could walk around between clumps of people “in a room” and join conversations. Much better than trying to organize a set of parallel group chats from a conference session… even if it felt a bit much on the cute side.
One clear advantage of a virtual conference is that it is rather easy to bring in speakers from all over the world. This year, two of the keynote speakers called in from the US – which would have required much more expense and time if done in person. In a virtual conference, it does not matter where you present your video from (if you tried to do a call-in virtual presentation at an in-person conference, the result would likely be rather flat).
DVCon also did something rather interesting in selling post-conference access to the recorded presentations. This is not really doable for an in-person conference (even though there used to be a decent business in selling printed conference proceedings once upon a time). I am not sure how many people will pick up such an offer, though, as I expect those interested to have already signed up for the conference.