Curated Computing – Hot or Not? (part 2) - Wed, Apr 13 2011
In my last post, I introduced the concept of curated computing – a computing model that gives non-technical users a way to install “apps” and lets enterprises embrace consumerisation – but that’s only part of the story. Curated computing also offers a fantastic opportunity to hit a reset button: to banish poor-quality applications and keep those that are malware-free and offer real value.
Apple wants us to run their hardware (and is doing a pretty good job of it in the next-generation tablet space). Microsoft would like us to think that a Windows tablet is great because… well, because it runs Windows – and that means it runs the same apps that we use on our PCs.
I’ve already explained that it’s the data that counts, not the device, operating system, or applications – but that’s not to say those other layers in the stack can be ignored. Windows’ biggest strength (its massive ecosystem of compatible hardware and software) is also its nemesis because many of those applications (and drivers) are pretty poor. I wrote last year (before Apple announced their plans for a Mac OS X app store, I believe) that what Windows really needs is an application store – read on and I’ll explain why.
If we consider the quality of software applications, we might say that, statistically, it follows a normal distribution. That is to say, applications on the left of the curve tend towards the software we don’t want on our systems – from malware through to poorly-coded applications. Meanwhile, on the right of the curve are the better applications, right through to the Microsoft and Adobe applications that are in broad use and generally set a high standard for quality. The peak of the curve represents the point with the most apps – basically, most applications can be described as “okay”. Ideally, we would cut out the 50% on the left side of the graph, instantly raising the quality bar for applications. And one way to do this is curated computing.
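As a toy illustration (mine, not from the original argument, and with entirely made-up numbers), the effect of lopping off the left half of that quality curve can be sketched in a few lines of Python:

```python
import random
import statistics

random.seed(42)

# Hypothetical model: app "quality" as a normally distributed score
# (mean 50, standard deviation 15) across 10,000 applications.
quality = [random.gauss(50, 15) for _ in range(10_000)]

# Curation as described above: cut out the lower 50% of the curve.
median = statistics.median(quality)
curated = [q for q in quality if q >= median]

# The average quality of what remains is noticeably higher than
# the average across the whole uncurated population.
print(f"all apps:     {statistics.mean(quality):.1f}")
print(f"curated apps: {statistics.mean(curated):.1f}")
```

The specific scores are arbitrary; the point is simply that filtering at the median mechanically raises the average of what is left, which is the “reset button” effect described above.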
The problems come when the curator either:
- Messes up and lets a “bad” app through (as highlighted in recent reports of malware in Google's Android Market).
- Does not believe that a particular class of app is good for a platform (which is why we don’t have Adobe Flash or Microsoft Silverlight on Apple iOS).
One way around this is to allow multiple curators: for example, enterprise administrators will need a method to load their own applications onto corporate systems, but that needn’t exclude the use of other application stores (although it might, if policy dictated). The trouble is that this dilutes the benefits – independent software vendors (ISVs) need to submit apps to multiple sources, and it’s still up to the individual app stores to apply sufficient rigour to ensure that “bad” apps don’t make it onto systems. There’s also the consideration that a curator’s view of application “goodness” may be subjective. But surely anything is better than the mess we have today, and if users learn that the way to load an application is to download it from one or two trusted sources, they might stop clicking on popups from websites that install all sorts of nasties on systems already bogged down with security suites and OEM-installed crapware.
I started part 1 by commenting on Steve Jobs’ proclamation that we’re in a post-PC world but I don’t agree. I’d say that, rather than entering the post-PC era we’re starting to discover PC 2.0 – a world where we don’t care so much about the class of device or the operating system it runs but we can install applications from a trusted source and access our data wherever we happen to be. Or maybe I’m just an optimist.
About the author:
Mark Wilson is a strategy consultant for a major systems integrator and has more than 16 years’ experience of successfully bidding, designing and delivering innovative, multi-million pound IT infrastructure projects worldwide. Holding certifications from Microsoft, VMware and Red Hat, Mark is also a Microsoft Most Valuable Professional (MVP) and won the Individual IT Professional (Male) award in the 2010 Computer Weekly IT Blog Awards. Follow Mark on Twitter @markwilsonit
Maybe we are at PC 2.0 – or how about “Mainframe” 3.0, or simply “Electronic Computing” 3.0?
Big Iron (1.0) showed people the joy of having a computer to crunch numbers a gazillion times faster than clerks with quill pens (equivalent to the “million monkeys” approach to writing Shakespeare).
Then the PC and Lotus 1-2-3 (2.0) came along to free people from the tyranny of the IT department in the big glass air-conditioned room, with their “locked down” green-screen applications that took an act of congress to get changed. (Thank you <sarcasm> MS for re-introducing the locked-down Ribbon gooey.)
(PC 2.5 was the introduction of true computer portability with good laptops and notebooks)
Now comes consumer computing (3.0), where the average Luddite has the freedom to download and install a gazillion apps to their smartphone – a device that has orders of magnitude more computing power and data storage than was available in the spaceship that took men to the moon.
@Ron – I think you’re spot on and I like your analogy. All we’re talking about here is evolution. Some companies (naming no names) will proclaim that to enter the brave new world you have to follow them; others will take a little longer to get with the programme. Ultimately we will all move forward, but there’s still a place for Big Iron and PC (2.0-2.5) for a while yet!