A few days ago, I linked to a nice poem about cloud computing. I think it expresses well the confusion that surrounds this term. The term usually comes into play when people talk or write about the fall of Windows; news sites and blogs were full of this kind of stuff when Google released their new web browser, Chrome. I have been using the term frequently without thinking much about its meaning. After I read this poem, I wasn’t sure anymore if I really knew what “cloud computing” actually means. Thus, I thought it would be a good idea to write down, at least once, what associations come to mind when I think about cloud computing. After I finished this post, I realized that it had become more like a defense of desktop computing.
Software as Commodity/Utility computing ^
The idea that software is just an unimportant commodity is very old. Before Microsoft entered the computing market, this was the prevailing view. At that time, there were basically only hardware companies, and software was treated as an unimportant add-on. Microsoft was able to grow fast because hardware vendors, in particular IBM, realized much too late that software is more than just a commodity. With the rise of Open Source software, these companies saw a chance to reverse the course of history.
Obviously, this expectation hasn’t come true yet, so many are now hoping that cloud computing will step in where Open Source failed. Nicholas Carr is probably the most prominent representative of this view, although he goes even a step further, claiming that IT as a whole will become an insignificant commodity. His main idea is that companies will buy IT services the way they buy electricity nowadays. That is, you just plug your “IT device” into your “IT socket,” and you pay for the IT you consume along with your next electricity bill. In his view, IT or software will be a relatively unimportant factor for any business, just as electricity is nowadays. If you want my humble opinion about this theory, I think it is just plain nonsense! You could just as well claim that money won’t be important for businesses in the future because it will come over the Internet. After all, online banking has made accessing money as easy as consuming electricity.
Software as a Service ^
SaaS is connected to the commodity idea, although it does not consider software or IT to be an unimportant factor in the future. Even though the term “SaaS” is relatively new, the concept is very old. In the mainframe ice age, it was the most common way of computing. The question now is whether improved connectivity, that is, the Internet, will reverse everything the PC has changed over the last 20 years or so. This is certainly an interesting question.
The crucial point is what role the rise of the Internet plays here. In former times, mainframes were networked as well, and companies accessed them remotely, although they didn’t use TCP/IP back then. So what exactly did the Internet change? I have been thinking about this one for a while, but I must admit I don’t have an answer. PCs revolutionized IT simply because they were cheaper than mainframes, so maybe the Internet will change everything because connectivity is so cheap now? I don’t think so.
The cost of accessing mainframe services was never an issue. Thus, anyone who predicts that cloud computing will prevail has to explain how SaaS or other forms of cloud computing are different from the cloud computing we had in the mainframe ice age. In addition, even though many people believe that SaaS and cloud computing mean basically the same thing, I think they are two distinct concepts. After all, you can run software that comes as a service on desktops as well.
Web apps ^
If you ask people about the advantages of cloud computing, terms such as Web 2.0 or collaboration usually come up. There is no doubt that Web 2.0 is a success story and that collaboration will gain more importance in the future. People who think that PC computing, or Windows in particular, is obsolete believe that web apps will replace common desktop apps. For example, Google Apps will replace Microsoft Office because they are cheaper, allow collaboration, and can be accessed from everywhere.
I never understood what those people meant by collaboration here because, in my view, Google Apps don’t support it at all. Editing a document simultaneously has nothing to do with collaboration. When I think about collaboration, systems such as SharePoint or Office Live come to mind. Collaboration means working together in a coordinated way, not simply working at the same time. Obviously, this works perfectly with desktop apps, too.
Whether web apps will be cheaper than desktop apps has yet to be proven. I see no reason why developing a web app should cost less than developing a desktop app. Also, why should hardware for data centers be cheaper than hardware for ordinary PCs? Payroll costs for desktop management are falling continuously because client management tools are becoming more and more sophisticated. Thus, I doubt that managing computers in the cloud will be significantly cheaper than managing desktop PCs. After all, these are all just computers connected over the Internet. Whether they are located in a third-party data center or on computer desks around the world doesn’t really matter, because the Internet makes the location of computational power an unimportant factor.
The “accessible from everywhere” argument is a good one, although I think its importance is often exaggerated. Most people use just one PC for their work. They simply don’t need to access all their data from every place in the world. However, I admit that online accessibility of applications and data is an interesting feature. I just doubt that, in the long run, web browsers will play an important role here. Even though I was quite impressed by iCloud, I personally count on Rich Internet Applications (RIAs) here because they allow the development of more sophisticated user interfaces.
Grid Computing/Distributed Computing ^
Cloud and grid computing are usually associated with each other. Everyone thinks of Google with its large data centers, where thousands of computers work together to perform a full-text search on billions of web pages in the blink of an eye. If one thing is for sure, it’s that grid computing, or more generally distributed computing, will gain importance in the future.
Actually, I even believe that distributed computing might replace server virtualization in the long run. Server virtualization means that you run multiple instances of one and the same OS on a single piece of hardware. Obviously, this is a waste of resources. With distributed computing, you get the same advantages as with virtualization, i.e., better scalability, but you utilize hardware resources more effectively because you no longer need a guest OS for every workload.
However, the interesting question is whether distributed computing will take place only in big data centers. Projects such as SETI@home have shown that, in many cases, it doesn’t really matter where the computational power is located. Yet I think we will always need computational power where the end user is located, because future user interfaces will be more sophisticated. Sooner or later, we will have 3D user interfaces with virtual and augmented reality, and you can’t deliver those more cheaply from a data center. At the same time, there will always be unused computational power on desktop PCs that can be utilized for other purposes. I wonder when the first company will come up with a workable idea to compete with Google by leveraging these unused resources on billions of desktops and laptops. Then the whole Internet would be just one big cloud!
Server hosting ^
Another form of cloud computing is server hosting. Since I blogged about it recently, I won’t go into it today.
Did I forget an important form of cloud computing? What is your view? Is cloud computing a threat to Windows?