I am not really in the computer security business, but I find the topic very interesting. The recent wide coverage and analysis of the Flame malware has been fascinating to follow. It is incredibly scary to see a “well-resourced (probably Western) nation-state” develop this kind of spyware, following on the confirmation that Stuxnet was made in the US (and Israel).
In any case, regardless of the resources behind the creation of such malware, one wonders if it could not be contained a bit better with a different way of structuring our operating systems. In particular, Flame’s use of microphones, webcams, Bluetooth, and screenshots to spy on users should be containable. Basically, wouldn’t cell-phone-style sandboxing and capability settings make sense for a desktop OS too?
It is clear that making access to input units an unprivileged operation in OSes like Windows is a bad idea. At the very least, access to such units should be explicitly granted to each application, just like it works (to varying degrees) in mobile OSes. This should prevent an arbitrary process from just starting a camera capture with no notification to the user. It would mean that a piece of spying malware would have to either do some social engineering to trick a user into granting it the privilege, or find some way to break into a trusted program. Unfortunately, I guess a typical PC would be littered with programs that have been granted access. Skype, for example, would be a nice target for someone looking for access to cameras and mikes. To make use of Skype at all, we would have to grant it both I/O and network access rights. Thus, if you can infect Skype, you have what you need.
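To make the idea concrete, here is a minimal sketch of what such a grant check could look like. Everything here is hypothetical: the broker, the grant table, and the application IDs are my invention, not any existing OS API.

```c
/* Hypothetical sketch: a device broker that consults a per-application
 * grant table before handing out a capture device. Not a real OS API. */
#include <stddef.h>
#include <stdio.h>
#include <string.h>

struct grant {
    const char *app_id;      /* application identity */
    const char *capability;  /* e.g. "camera", "microphone" */
};

/* In a real design this table would live in a protected OS database,
 * populated only through explicit user consent dialogs. */
static const struct grant grants[] = {
    { "com.skype.client", "camera"     },
    { "com.skype.client", "microphone" },
};

static int grant_allows(const char *app_id, const char *capability)
{
    for (size_t i = 0; i < sizeof(grants) / sizeof(grants[0]); i++)
        if (strcmp(grants[i].app_id, app_id) == 0 &&
            strcmp(grants[i].capability, capability) == 0)
            return 1;
    return 0;
}

int main(void)
{
    /* An arbitrary process asking for the camera is refused; only
     * explicitly granted applications pass the check. */
    const char *requesters[] = { "com.skype.client", "com.malware.flame" };
    for (size_t i = 0; i < 2; i++)
        printf("%s -> camera %s\n", requesters[i],
               grant_allows(requesters[i], "camera") ? "GRANTED" : "DENIED");
    return 0;
}
```

Note that the sketch also illustrates the weakness mentioned above: the check is keyed on application identity, so anything that manages to run inside a granted application inherits its grants.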
To add to security, ideally, hardware should be designed with security in mind. It is a bit strange to consider that Iran seems to have used machines that included cameras and microphones inside secure facilities and other places worth spying on. Still, in a military setting, this can probably be fixed by mandate. But I do not think the threat is limited to just obvious high-profile targets. Industrial espionage and other more common threats are certainly an issue for millions of regular computer users who cannot reasonably be expected to do without cameras, audio, and similar functions on their machines.
Imagine, for example, if we made status lights mandatory on all input units (cameras, microphones, etc.) on our machines, and made sure that they were driven by the hardware units themselves, not by software that could be subverted. Even better, every time a unit was asked to start capture, what if you had to confirm this with a special hardware button? That would be easier and safer than an OS pop-up like Windows UAC. A small added cost in the hardware, certainly, but definitely worth paying to avoid disasters like schools spying on pupils via their school-issued laptops. You always trade convenience against security, but in this case I think the inconvenience could be limited by some good design. At least for me, hardware buttons for a few well-chosen functions tend to trump software dialogs (on the other hand, I have had to help family and friends who had inadvertently used the hardware Wi-Fi-off button on their machines and could not get networking going).
If you look at something like taking screenshots, you either need strong capability protection or you have to abandon the function entirely. It is clear that, in general, allowing any program to capture the screen is an insane data leak between supposedly isolated processes. It really should be totally impossible on a secure machine; the OS should not even have the ability to export the state of the screen to programs. It is not a function most people need in their daily work… except… maybe they do. I totally depend on screenshots to do my job. I use screenshots to create documentation, blogs, and articles, and to report GUI bugs. I do screen recordings to demo our products (and report bugs). I use WebEx to run presentations and demos over the Internet to avoid travel. So, it seems to be an unavoidable function for helpdesks, bug reporting, and professionals. What remains is to make sure that programs doing it have to ask for user permission.
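As a sketch of what “ask for user permission” could mean here (again, all names invented, not a real compositor API): the screen contents never leave the OS unless the user explicitly approves this particular request, on a dialog path that applications cannot spoof or auto-click.

```c
/* Hypothetical sketch: screen capture as a brokered, per-use
 * capability rather than an ambient right. Names are invented. */
#include <stdbool.h>
#include <stddef.h>
#include <stdio.h>

/* In a real design this prompt would be drawn by the OS itself,
 * out of reach of the requesting application. */
static bool user_confirms(const char *app_id)
{
    printf("OS dialog: allow '%s' to capture the screen? [y/N] ", app_id);
    int c = getchar();
    return c == 'y' || c == 'Y';
}

static int capture_screen(const char *app_id, unsigned char *buf, size_t len)
{
    if (!user_confirms(app_id))
        return -1;  /* denied: no pixels ever leave the compositor */
    /* ... the compositor would copy the frame buffer into buf here ... */
    (void)buf; (void)len;
    return 0;
}

int main(void)
{
    unsigned char frame[64];
    if (capture_screen("com.example.webex", frame, sizeof frame) == 0)
        puts("capture allowed");
    else
        puts("capture denied");
    return 0;
}
```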
Debugging works the same way. A debugger really needs to be able to pry open another process and look at its inner workings. But you really do not want that if the program doing the prying is malware. However, if you outlawed debugging completely, software development would be more than slightly inconvenienced. Most users could do without these hooks, leaving them active only on software developers’ machines. This leaves software developers a bit more at risk than other users, but there seems to be no way to avoid that.
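Linux already has a knob in this direction: the Yama security module’s ptrace_scope setting (in /proc/sys/kernel/yama/ptrace_scope) restricts which processes may attach to which. With scope 1, a process can only attach to its own descendants; 2 requires admin privileges; 3 forbids attaching entirely. A small probe shows what a denied attach looks like:

```c
/* Probe whether this process may ptrace-attach to a given pid.
 * On Linux with Yama's ptrace_scope >= 1, attaching to an
 * unrelated process fails with EPERM even for the same user. */
#include <errno.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <sys/ptrace.h>
#include <sys/types.h>
#include <sys/wait.h>

int main(int argc, char **argv)
{
    if (argc != 2) {
        fprintf(stderr, "usage: %s <pid>\n", argv[0]);
        return 1;
    }
    pid_t pid = (pid_t)atoi(argv[1]);

    if (ptrace(PTRACE_ATTACH, pid, NULL, NULL) == -1) {
        fprintf(stderr, "attach to %d denied: %s\n", (int)pid, strerror(errno));
        return 1;
    }
    waitpid(pid, NULL, 0);                  /* wait for the target to stop */
    printf("attached to %d - debugging allowed\n", (int)pid);
    ptrace(PTRACE_DETACH, pid, NULL, NULL); /* let the target run again */
    return 0;
}
```

Developers can relax the setting on their own machines while everyone else leaves it strict, which is roughly the “debugging as an opt-in capability” idea.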
It points the way towards a future split between development machines and user machines, which is how smartphone and most embedded development works today – but which also to some extent negates the great personal computer revolution. As I see it, one of the truly great features of the home computers and professional personal computers that started to appear in the late 1970s is that the machines could be used to develop software on. Consumption and creation of software happened on the same machine, making it possible to buy one machine and both use other people’s software and build your own. We have taken that for granted for a long time now… but it might be under threat from the wave of designs following Apple’s iOS decision to split consumption and creation.
So what is my conclusion from this rambling? I think we can and should build operating systems that control access to input and output units much more tightly, even on a desktop, but leave it open for users to grant privileges for global actions like debugging and screenshots to particular programs, in case they need them. A totally segregated OS built on MILS (multiple independent levels of security) principles would probably work fine as a consumption device, but would not work as a creation and software development device. Maybe an OS could be shipped in “locked down” and “open” modes, a bit like mobile handsets are sometimes provided in very open configurations to developers? That could work both to better protect the innocent masses of regular users and to provide useful power for developers.
My mom does not need to use a debugger, but I certainly do. So why not entirely remove that ability from her machine? I could live with that.
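To picture the locked-down/open split concretely, here is one more purely hypothetical sketch (invented names throughout, not any real OS interface): a machine-wide mode bit that decides which capabilities are even grantable, checked before any per-application grant dialog is shown.

```c
/* Hypothetical sketch: a machine-wide mode that gates which
 * capabilities can be granted at all. Invented names throughout. */
#include <stdbool.h>
#include <stdio.h>
#include <string.h>

enum machine_mode { MODE_LOCKED_DOWN, MODE_OPEN };

/* In locked-down mode, "global" capabilities such as debugging and
 * screen capture cannot be granted to anything, full stop. */
static bool capability_grantable(enum machine_mode mode, const char *cap)
{
    if (mode == MODE_OPEN)
        return true;
    return strcmp(cap, "debug") != 0 && strcmp(cap, "screenshot") != 0;
}

int main(void)
{
    const char *caps[] = { "camera", "screenshot", "debug" };
    for (int i = 0; i < 3; i++)
        printf("locked-down: %-10s %s\n", caps[i],
               capability_grantable(MODE_LOCKED_DOWN, caps[i])
                   ? "grantable" : "never grantable");
    return 0;
}
```

On a machine like my mom’s, the mode bit would simply never leave the locked-down setting.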
Recommended links on Flame:
- Ars Technica has a very good series of articles; start at the most recent one and dig backwards. The first one was here.
- Mikko Hypponen at F-Secure has written a series of very thoughtful blog posts, including ones about opening Pandora’s box and why AV vendors are totally out of their league against something like this. He also noted that US defense contractors seem to be piling into an emerging market for virtual weapons.
- More on the Windows Update propagation from Kaspersky.
I’m rather intrigued by Qubes OS: http://qubes-os.org/Home.html
Qubes utilises virtual machines, so that each program can be run in a container with strict rules about what data can travel from one container to another. This is all integrated, so the user does not even notice that the programs cannot talk to each other. They have also abstracted away networking to make it possible to lock down a single program entirely while leaving other programs free to use the network.
They have a rather interesting blog at http://theinvisiblethings.blogspot.se/
So basically, it is very strong sandboxing. A bit like a mobile OS, but more pervasive.
The problem is that this approach works fine for applications that do not need to talk to each other or access shared local data. But many of the applications we use rely on interactions with each other… screenshots are one obvious example, or just sharing files. An iPhone-style model, where only one app can access its own data, is horrible for things like editing a document with multiple applications, or just doing simple things like attaching a Word document to an email, pushing it to a server, or uploading it to a webpage in a browser.
Sandboxing is nice for passive consumption, but for active creation of software and text and pictures and movies, we often cannot have strong sandboxing, as we pass data around between many different executables.