Cloud Computing - Understanding Its Basic Nature - Shantanu's Blog

People, even experts, struggle to define cloud computing. It is an abstract entity, one that exists beyond the realm of our five senses.
It is, however, a man-made abstraction, so it can be analyzed in terms of its design goals.
The first goal is to digitize every aspect of computing. Software components are by definition digital; they do not have any physical existence. But that is not true for hardware components like hard disks, CPUs, RAM, and routers. The very idea of converting such a machine into a cluster of 0s and 1s is pretty disruptive.
That is where the wow factor of cloud computing lies. The technique also goes by the name virtualization, but I find that term heavy jargon and hence avoid it.
An electronic component, made of silicon and metal, is basically a signal processing / storage machine. You send some electronic signals to it and expect a different set of signals in reply. For storage, you expect your signals to be stored and retrieved later in their original state.
When you digitize, you replace the silicon and metal chunk with a piece of code that does the same job. A simplistic solution would be to just plug in a function or method or procedure or routine, depending on the programming language you are using.
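To make that idea concrete, here is a minimal sketch of a storage chip replaced by a piece of code. The class name and methods are invented for illustration; no real hypervisor works this way, but the principle - silicon replaced by a data structure plus functions - is the same.

```python
class VirtualDisk:
    """A toy block storage device emulated as pure software."""

    def __init__(self, num_blocks, block_size=512):
        self.block_size = block_size
        # The "silicon and metal" is replaced by an in-memory byte array.
        self.blocks = [bytes(block_size) for _ in range(num_blocks)]

    def write(self, block_no, data):
        # Store the incoming "signals" (bytes), padded to the block size.
        self.blocks[block_no] = data.ljust(self.block_size, b"\x00")

    def read(self, block_no):
        # Retrieve the signals in their original state.
        return self.blocks[block_no]

disk = VirtualDisk(num_blocks=8)
disk.write(0, b"hello")
assert disk.read(0).rstrip(b"\x00") == b"hello"
```

Once the device is just a class, a program can create, copy, resize, or destroy "disks" the way it manipulates any other object.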

The next wonderful thing about cloud computing is aggregation of resources. The correct jargon is "resource pooling".
Those who are familiar with Linux would know what a logical volume is. Multiple physical volumes are combined to create one large logical volume.
[Diagram: a regular Linux installation]

But you cannot pool CPUs on a Linux machine in the same fashion. Cloud computing allows you to do that. And not just CPUs - you can pool any computing resource: memory, network, disk, and so on.
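The pooling idea can be sketched in the same spirit as a Linux logical volume: several small physical pieces presented as one large address space. This is a toy model with invented names, not how any real volume manager is implemented.

```python
class LogicalVolume:
    """Presents several small physical 'disks' as one large address space."""

    def __init__(self, physical_sizes):
        # e.g. three 100-unit disks pooled into one 300-unit volume
        self.sizes = physical_sizes
        self.disks = [bytearray(size) for size in physical_sizes]

    def total_size(self):
        return sum(self.sizes)

    def _locate(self, offset):
        # Map a logical offset to (disk index, local offset).
        for i, size in enumerate(self.sizes):
            if offset < size:
                return i, offset
            offset -= size
        raise IndexError("offset beyond logical volume")

    def write(self, offset, byte):
        i, local = self._locate(offset)
        self.disks[i][local] = byte

    def read(self, offset):
        i, local = self._locate(offset)
        return self.disks[i][local]

lv = LogicalVolume([100, 100, 100])
lv.write(250, 42)            # lands on the third physical disk
assert lv.total_size() == 300
assert lv.read(250) == 42
```

The caller sees one 300-unit volume; the mapping to individual disks is hidden, which is exactly what pooling means.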
Pooling cheap consumer-grade resources to build a high-capacity computing resource is one of the important design goals of cloud computing. Very large components are usually prohibitively expensive. Take the example of a 100TB SSD from Nimbus: it has a price tag of $40,000, which works out to $400 per TB. A 1TB consumer SSD costs about $90.
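The arithmetic behind that comparison, using the figures quoted above (and deliberately ignoring controllers, redundancy, and enclosure costs):

```python
# Per-TB cost comparison using the figures quoted in the text.
enterprise_price, enterprise_tb = 40_000, 100   # 100TB Nimbus SSD
consumer_price, consumer_tb = 90, 1             # ordinary 1TB SSD

per_tb_enterprise = enterprise_price / enterprise_tb   # 400.0
per_tb_consumer = consumer_price / consumer_tb         # 90.0

# Pooling one hundred 1TB consumer drives gives the same raw capacity
# at a fraction of the cost.
pooled_cost = 100 * consumer_price                     # 9000
assert per_tb_enterprise / per_tb_consumer > 4
```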
Why would one need supercomputer-like machines in the first place? Only a handful of cloud customers do AI/ML work; the rest use the cloud for basic computing.
Well, you pool resources to create massive capacities, and then you rent out those virtual resources to multiple tenants to achieve an economy of scale. If you have, say, one thousand tenants sharing a large virtual machine, your costs, fixed as well as operational, come down substantially.
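A back-of-the-envelope sketch of that economy of scale. The dollar figures are invented purely for illustration; only the shape of the curve matters.

```python
# Hypothetical numbers, only to show how cost per tenant falls with scale.
fixed_cost = 100_000        # hardware, data centre build-out
operational_cost = 20_000   # power, staff, bandwidth

def cost_per_tenant(tenants):
    """Total cost amortized over the number of tenants."""
    return (fixed_cost + operational_cost) / tenants

assert cost_per_tenant(1) == 120_000      # a single tenant bears it all
assert cost_per_tenant(1000) == 120.0     # a thousand tenants: trivial share
```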
Automation is another important aspect of cloud computing. Computing is all about automation: most business software today performs tasks that humans could also perform, albeit much more slowly and with lower accuracy.
The automation used in cloud computing looks significantly different, yet is essentially the same. The difference lies in the fact that bots provision new instances and install software on them before releasing them directly to the end consumers.
But at a base level, automation is about processing digital data as per pre-defined instructions, without human intervention. So when a clerk creates an invoice in a transaction system, the clerk essentially creates a digital avatar of a physical invoice. Once digitized, computer programs can take over and do a variety of things with the data.
Similarly, when you digitize a piece of computer hardware, it becomes "data" for a host of programs. The concept is a little unsettling. But once we get used to seeing every digital chunk as data - be it a virtual CPU or a lambda function - the cloud automation part and its sheer enormity become clear.
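Once a machine is data, provisioning it is just another data-processing program. A toy sketch of that idea follows; the spec format and the `provision` function are invented for illustration and do not correspond to any real cloud API.

```python
# A virtual machine described purely as data - no different, in kind,
# from the clerk's digitized invoice.
vm_spec = {"cpus": 2, "ram_gb": 4, "disk_gb": 50, "image": "linux"}

def provision(spec):
    """A pretend 'bot' that validates a spec and returns a running-VM record.

    Real clouds do this with schedulers and hypervisors; this sketch only
    shows that the input to the whole process is ordinary data.
    """
    assert spec["cpus"] > 0 and spec["ram_gb"] > 0
    return {**spec, "state": "running", "id": "vm-001"}

vm = provision(vm_spec)
assert vm["state"] == "running"
```

Because the machine is data, the same bot can stamp out a thousand identical machines in a loop, which is where the enormity of cloud automation comes from.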

June 8th, 2022