The first machines date back to the Stone Age, but they have come a long way since their inception. In a sense, we can say that machines are largely a creation of mankind in the past 500 years. It all started with the invention of the printing press – Gutenberg’s machine. While scribes once wrote letters in ink on papyrus and parchment, the printing press automated the process. The impact of this invention was beyond our grasp. Who would have thought that a vacuum tube would revolutionize the way we perceive technology? Then came the greatest invention of the century: the transistor, formed on a sliver of silicon. A bipolar transistor has 3 layers of doped silicon, with a thin middle layer (the base) sandwiched between the outer two; the base controls whether current flows through the device, letting it act as a switch. There are 2 kinds of transistors – pnp and npn. p stands for positive (hole-rich) silicon and n for negative (electron-rich) silicon. Transistors, when combined, form logic gates. There are several types available (XOR, AND, NAND, etc.). They all obey Boolean algebra.
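The Boolean algebra these gates obey can be sketched in a few lines of code. This is just an illustrative sketch in Python – the function names are mine, and real gates are of course circuits, not functions:

```python
# Logic gates expressed as Boolean functions, mirroring
# what transistor circuits compute in hardware.

def AND(a, b):
    return a and b

def NAND(a, b):
    # NAND is simply AND followed by NOT
    return not (a and b)

def XOR(a, b):
    # XOR is true only when the inputs differ
    return a != b

# Print the truth table for XOR
for a in (False, True):
    for b in (False, True):
        print(int(a), int(b), int(XOR(a, b)))
```

Interestingly, NAND alone is "universal": every other gate can be built from combinations of NANDs, which is one reason it is so common in chip design.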

So, what are chips? What we just discussed are the building blocks of chips. When transistors and gates are overlaid to form complex patterns, you get memory, CPUs and so on. For example, Intel makes mostly CISC processors. CISC stands for Complex Instruction Set Computer. This simply means that the instructions are not reduced; the chip offers rich, "super" instructions. An example would be the Multiply (MUL) instruction, which can be thought of as repetitive Addition (the ADD instruction), just as Division (DIV) can be thought of as repetitive Subtraction (SUB). Another way of laying out instructions is to keep them simple. In other words, RISC, which means Reduced Instruction Set Computer. Early RISC designs left out complex instructions like hardware divide and relied on simple ones like ADD and SUB (though most modern RISC chips do include multiply). RISC chips are well suited to workloads like graphics, where the same simple operation is repeated over and over, while CISC chips pack more work into each instruction and have traditionally dominated general-purpose desktop computing. ARM is a British company that designs RISC processors, and its designs power most smartphones today. Another player is Qualcomm, whose Snapdragon processors are built around ARM cores.
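The idea that MUL and DIV can be emulated with only ADD and SUB is easy to see in software. Here is a toy sketch (for non-negative integers only; real chips use far cleverer circuits such as shift-and-add multipliers):

```python
# Emulating MUL and DIV using only repeated ADD and SUB,
# the way a minimal "reduced" instruction set could.

def mul(a, b):
    """Multiply non-negative integers by repeated addition."""
    result = 0
    for _ in range(b):
        result += a      # one ADD per iteration
    return result

def div(a, b):
    """Integer-divide non-negative integers by repeated subtraction."""
    quotient = 0
    while a >= b:
        a -= b           # one SUB per iteration
        quotient += 1
    return quotient

print(mul(6, 7))   # 42
print(div(42, 7))  # 6
```

Notice the cost: multiplying by b takes b additions. That trade-off, fewer instructions versus more of them, is exactly the CISC-versus-RISC debate in miniature.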

Applications for these chips abound. There are different verticals like Healthcare, Banking, etc. where they are used. For example, in healthcare, all patient monitoring devices have chips. You must be familiar with the digital thermometer. Well, you are right! There is a chip inside. In Banking, the ATMs (Automated Teller Machines) and Retail Banking branches all use machines that run on chips. A very big consumer of chips is the Technology industry itself. Other applications include areas like Oil, Stock Markets and Retail. Not to mention the Government, which is one of the heavy-duty users. Chips are growing smaller and cheaper day by day. There is something called Moore’s law, an observation that the number of transistors on a chip doubles roughly every eighteen months to two years. This simply means that every year or two we can expect a machine that is twice as powerful. The number of cores in a CPU keeps going up, fabrication has reached the 10-nanometer scale, and we see advancements like integrated graphics.
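The power of that doubling is easy to underestimate, so here is a quick back-of-the-envelope projection. The starting figure below is the roughly 2,300 transistors of the Intel 4004 from 1971; the 18-month doubling period is one common statement of Moore's law:

```python
# Rough projection of Moore's law: transistor counts
# doubling every ~18 months.

def transistors_after(years, start_count, doubling_months=18):
    doublings = (years * 12) / doubling_months
    return start_count * 2 ** doublings

# Starting from ~2,300 transistors (Intel 4004, 1971),
# project 30 years forward: 20 doublings, i.e. about
# a million-fold increase.
print(round(transistors_after(30, 2300)))
```

Thirty years means twenty doublings, and 2 to the power 20 is over a million – which is why a pocket phone today outclasses a room-sized computer of the 1970s.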

I have just touched upon the hardware. Now let me tell you about the software. Software is becoming both easier and more complex. From the desktop to the cloud; from basic Excel charts to sophisticated Analytics; from 640 KB to ZB (zettabytes), software is grappling with a monstrous amount of data piling up higher than the Himalayas. From 1 tier to many tiers, software is like a layered cake: the icing on top is all that you see, while the bottom layer carries the actual taste. In this case, the bottom layer is the hardware, made smart by software. In networks, software has pervaded down to the data link layer (Software Defined Networking, or SDN). Software architecture is undergoing a drastic change. From simple programs that ran on a single machine, almost every piece of software written today has a network interface. Microservices, Serverless Computing, and other advancements are marking the next wave of computing. And now the latest trend: algorithms which learn from big data. This is the rise of AI (Artificial Intelligence). The world will be a newer and better place because of this game-changer. Nobody can look through a crystal ball and say exactly what lies ahead. But some experts, notably Ray Kurzweil, predict 2029 as the year when a machine will become more intelligent than a human. God help us!
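To give a tiny taste of what "algorithms which learn from data" means, here is a toy sketch (the data and numbers are mine, purely for illustration): a program that is never told the rule y = 2x, but discovers the slope on its own from examples, using gradient descent – the same basic idea, scaled up enormously, that drives modern AI.

```python
# Learn y = w * x from example data by gradient descent.
# The program starts with a wrong guess for w and nudges it
# repeatedly in the direction that reduces its error.

data = [(1, 2), (2, 4), (3, 6)]   # samples drawn from y = 2x
w = 0.0                           # initial (wrong) guess
lr = 0.05                         # learning rate: size of each nudge

for _ in range(200):
    # Gradient of the mean squared error with respect to w
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad                # step downhill

print(round(w, 3))                # converges towards 2.0
```

Nobody wrote "multiply by 2" into the code; the rule emerged from the data. Replace three points with billions, and one weight with billions of weights, and you have the recipe behind today's learning machines.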

Stay Blessed!

Techno Spiritual Entrepreneur with over 30 years of experience in the IT industry. Author of 5 books, trainer and consultant. Seeker of the truth - inclined towards spirituality and technology. Also, love to read and write inspirational stuff.