The Rapidly Changing Desktop
Two years ago, I got into a conversation with another professional about the desktop. I opined that very shortly the desktop would be our cell phone, and there would be no need to put file servers at everyone's desk. The idea was partly driven by Qualcomm's announcement that morning at LinuxCon that it would be putting dual-core 1 GHz processors in its next-generation cell phones. This professional pooh-poohed the idea as completely unworkable.
Flash forward to 2012, and not only is it workable, it is viable and very realistic. Note that as I go through this, when I talk about a desktop, I mean either a physical desktop machine or a laptop. In either case, it is the standard CPU/RAM/hard disk system, connected to one or more monitors and a mouse/trackball and keyboard via docking station, cables, or wireless, and running a fat operating system - whether that is Linux, Windows, or Mac.
I argue that in a typical enterprise environment, the 80/20 rule applies when you look at application use and processing power: 80% of the people are using only 20% of the computing power in their machines. If you have any experience in large enterprises, you are snorting because it is unlikely they are even using 20%, but let's use this figure for illustration. The majority of worker bees are doing simple tasks. They are writing documents, whether in a word processor or in email; they are preparing or delivering presentations, which is really just specialized word processing; they are surfing the Web, administering systems, working or submitting tickets, or reading. None of these tasks is particularly computationally taxing.
The other 20% are doing tasks that are computationally taxing: advanced data analytics, audio/video/graphical composition, CAD/CAM, data or event modeling, even some local compilation to ensure builds will work. These people need some serious horsepower.
Now, five years ago, I would have argued that the group doing the computationally taxing work could have justified a personal desktop device. The rest could connect via a thin client to a virtual desktop located in the server room. Today, I am not sure I can support the argument that even the 20% need a dedicated machine. Are there corner cases for a personal desktop? Sure. The most obvious one to me is the dedicated system, such as a digital audio workstation (DAW) or CAD/CAM system, where specialized equipment may require additional inputs. In the rest of the cases, where it is just a matter of computation within a system, there really is no need for a desktop anymore.
What drove this home for me were a couple of things. I work for a company that develops software. We used to issue each developer two servers for their development use. As you can imagine, that is expensive, both in terms of hardware and associated environmental costs. When we started the current round of development, we had the option of either refreshing the servers or virtualizing. The solution was to move to virtual development environments, which reduced our server needs and saved us a bit in energy costs. The developers remote into these machines from their desktops, write and build their code, check it in and out of our version control system, and essentially do their work. There is no real difference between having a dedicated machine and running in a virtual environment. For the developers who have been here longer, moving to a virtual environment actually increased their ability to work, because their original development platforms had aged.
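To make that concrete, a developer's session against one of these virtual machines looks roughly like the sketch below. The hostname is made up, and I am assuming a Git-style version control system purely for illustration - substitute whatever your shop uses:

    # Remote into the virtual development machine (hostname is hypothetical)
    ssh developer@devvm01.example.com

    # Update the working copy, build, and check the change back in
    git pull
    make
    git commit -am "fix build breakage"
    git push

From the developer's chair, that is the same workflow as on a dedicated box; only the hostname changes.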
But the real eye-opener came when our IT department reclaimed the laptops issued to us and reformatted them to the corporate standard, Windows 7 x86_64. I was running Linux. I was not a happy camper, but they played the policy card, and I really did not have the time or energy to fight them. I had real work to do, and this was just a distraction. So I did what any Linux person would do. I tarred up my desktop and pushed the contents up to my file server. (What? You don't have your own rack of servers to work with? Just because I virtualized my developers does not mean I did not keep a couple of servers for my own use.) OK, so I have the luxury of having a number of test servers at my disposal. I converted one of them into a KVM host and stuck my desktop into a virtual container.
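For the curious, the moves involved were roughly the ones below. The usernames, hostnames, paths, and sizes are made up for illustration, the package names assume a Fedora/RHEL-style host, and the exact virt-install options will vary with your distribution, but it gives the flavor of the thing:

    # Archive the home directory and push it to the file server
    # (username, hostname, and paths are hypothetical)
    tar czf desktop-home.tar.gz -C /home myuser
    scp desktop-home.tar.gz fileserver.example.com:/srv/backups/

    # On the test server: install KVM and carve out a virtual desktop
    yum install qemu-kvm libvirt virt-install
    virt-install --name desktop-vm --ram 4096 --vcpus 2 \
        --disk size=40 --cdrom /srv/iso/linux-desktop.iso --graphics vnc

An afternoon of restoring the tarball into the new guest, and the "desktop" was back - it just lived in the server room now.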
No harm, no foul, and here is my laptop; please install PuTTY and a VNC viewer on it so I can use it as a dumb terminal. I was then told it would take two days to image my machine. My retort was to ask to have my iPad put on the wireless network so I could continue to work. Or at least check my email. (Yeah, yeah, this would have worked with any Android tablet too - I just happen to use iDevices, get over it.) So as I was sitting in my cube, wondering what I was going to do with the next two days, I wondered... is there an app for that? The answer is yes. A quick search of the Web turned up a suitable SSH client and the RealVNC viewer. I hooked up my Bluetooth keyboard and bang, I was operational - remoted into my desktop. That night I stopped and picked up a VGA cable so I could push the video from my iPad to my larger monitors. But this got me thinking. If I could do this with my iPad, could I do it with my iPhone? Well, I would not recommend it without a way to send your video to something bigger, but the answer was yes. And I remembered my conversation. I had moved my desktop to my phone. What had been a theory in my mind two years ago had become a practical reality.
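The server side of that setup is nothing exotic. The sketch below shows one common way to do it - a VNC server on the virtualized desktop, with the VNC traffic tunneled over SSH so it does not cross the network in the clear. The username and hostname are made up, and most mobile SSH clients can create an equivalent local forward:

    # On the virtual desktop: start a VNC server on display :1 (port 5901)
    vncserver :1 -geometry 1280x800 -depth 24

    # From the client, forward local port 5901 through SSH to the desktop
    ssh -L 5901:localhost:5901 myuser@desktop-vm.example.com

    # Then point the VNC viewer at localhost:5901 instead of the raw port

Whether the client end of that tunnel is a laptop, a tablet, or a phone makes no difference to the desktop on the other side.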
I am not going to say we will see a conversion to this new model overnight, but I think we will see it sooner rather than later. With the release of the Microsoft Surface (which, as one pundit pointed out, is really nothing more than an Ultrabook® with a removable keyboard), along with Cisco's persistent marketing of bring your own device (BYOD) and VMware Federal harping on the cloud and provisioning desktops with thin clients, the move is afoot. And there are solid economic reasons to do this. Further, with the release of Intel's Sandy Bridge chips, computing power on the server side is only going to increase, and we will see a much larger push to move desktop environments, for those who need them, into virtual containers. We will also see another rapid increase in n-tier systems to support mobile device access. To achieve this, we as administrators and architects have to consider a new raft of issues, not the least of which are incorporating IPv6 in our networks, secure network access, and platform-agnostic authentication methodologies. These require cutting-edge software and forward thinking.
In 2010, I said the desktop was dead and the mobile device would be the new meme, but that it had a long way to go. Under Moore's Law, a long way is a very short time.