AI Chips – 2020 and beyond
The CPU (Central Processing Unit) was until now the center of attraction, the guts of the machine. With the advent of AI, the CPU has been displaced by the GPU (Graphics Processing Unit), which is better suited for AI workloads. And now GPUs are being edged out by FPGAs (Field Programmable Gate Arrays) and tensor chips, which are specialized AI chips. And they are not the only game in town. AI computing is moving to the edge: by that, we mean replacing cloud-based AI chips with more of them on the device itself. Take, for instance, the processing of CCTV footage. CCTV cameras collect an enormous amount of data every minute, and all of it needs to be processed. We have just two options: 1. Send the data to a cloud AI chip, or 2. Process it locally. Option 1 used to be the de facto choice, but things are changing. There is so much data to be uploaded to the cloud that 5G seems to be the only hope. The other option is to process it locally.
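The cloud-vs-edge trade-off above can be sketched as a back-of-the-envelope calculation. All the numbers below (footage size, link speeds, processing times) are illustrative assumptions, not measurements:

```python
# Hypothetical sketch: when does it pay to process CCTV footage
# locally (edge) instead of uploading it to a cloud AI chip?
# All numbers are illustrative assumptions, not measurements.

def upload_time_s(data_mb: float, uplink_mbps: float) -> float:
    """Seconds needed to push the footage to the cloud."""
    return (data_mb * 8) / uplink_mbps

def prefer_edge(data_mb: float, uplink_mbps: float,
                local_proc_s: float, cloud_proc_s: float) -> bool:
    """True if local (edge) processing finishes sooner than
    uploading plus cloud-side processing."""
    return local_proc_s < upload_time_s(data_mb, uplink_mbps) + cloud_proc_s

# Assume one minute of 1080p CCTV footage is roughly 80 MB.
print(prefer_edge(data_mb=80, uplink_mbps=10,
                  local_proc_s=30, cloud_proc_s=5))    # True: edge wins on a slow link
print(prefer_edge(data_mb=80, uplink_mbps=1000,
                  local_proc_s=30, cloud_proc_s=5))    # False: a 5G-class link favors the cloud
```

The point the sketch makes is the one in the text: without a 5G-class uplink, shipping raw footage to the cloud dominates the total latency, which is exactly why edge processing becomes attractive.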
Local AI (Edge AI) is being touted as the mantra to get us out of this situation. If you look at the trends, this seems to be the ethos of the AI community: we need newer forms of chips that can work locally with all the sensor data, and many companies are working towards making this dream come true. In the latest SoCs (Systems on a Chip), about 5% of the transistors seem to be allocated to AI operations, and this share will rise in the future. For mobile devices, this is already true (take the latest MediaTek and Qualcomm processors). As more and more applications start using DNNs (Deep Neural Networks), the number of chips that incorporate AI instruction sets will go up. At present, these edge AI chips pack only a subset of AI instructions. Take mobile phones, for example: most of the AI-related work is on photos and videos, so the AI instructions reflect those use cases.
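What do those edge AI instructions actually compute? Mostly low-precision multiply-accumulate work: a DNN layer quantized to 8-bit integers needs far less silicon and memory bandwidth than 32-bit floats, which is why mobile NPUs lean on int8 arithmetic. Here is a minimal sketch of that arithmetic in plain Python (the scale and values are toy numbers, not any vendor's actual instruction set):

```python
# Sketch of int8 quantized inference arithmetic - the kind of
# multiply-accumulate loop that edge AI instructions accelerate.
# Scale and values below are toy choices, not from a real model.

def quantize(xs, scale):
    """Map float values to int8 (symmetric scheme, zero-point 0)."""
    return [max(-128, min(127, round(x / scale))) for x in xs]

def int8_dot(a_q, b_q):
    """Integer dot product; this inner multiply-accumulate loop is
    what NPU hardware runs massively in parallel."""
    return sum(a * b for a, b in zip(a_q, b_q))

scale = 0.05
weights = [0.1, -0.2, 0.3]
acts    = [0.5,  0.4, 0.2]

w_q = quantize(weights, scale)   # [2, -4, 6]
a_q = quantize(acts, scale)      # [10, 8, 4]

# Rescale the integer result back to the float domain.
approx = int8_dot(w_q, a_q) * scale * scale
exact  = sum(w * a for w, a in zip(weights, acts))
print(round(exact, 4), round(approx, 4))   # 0.03 0.03
```

A camera-photo enhancement model on a phone runs millions of such dot products per frame, which is why dedicating even a small fraction of a SoC's transistors to them pays off.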
In the future, more generic AI would be included within the chips. The AI chip market is expected to reach 100 billion USD by 2025; if we assume chips sell at a dollar apiece by then, that would mean 100 billion AI chips in the market. A general-purpose AI chip (like our good old CPU) seems to be out of the question. With so many varied use cases out there, we need to address the problems with specialized chips: say one chip bakes in a Random Forest algorithm, another a Bayes classifier, and so on. In fact, in the future you would get menus of AI chips, and you would select the one that solves your problem. The limited general-purpose SoCs found on phones today do just two or three things very well. As the edge is where the action will be, manufacturers are fighting the limits of size and core count (some chips today are built on 7 nm processes and pack 1024 cores). The order of the day seems to be a hybrid architecture (part cloud, part edge).
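The "menu of AI chips" idea can be sketched as a simple dispatch table: each class of workload maps to a chip specialized for one algorithm family, with a general-purpose SoC as the fallback. All chip names and categories here are hypothetical:

```python
# Hypothetical "menu" of specialized AI chips: pick the accelerator
# whose baked-in algorithm family matches the workload.
CHIP_MENU = {
    "tabular-classification": "chip-A (Random Forest in silicon)",
    "spam-filtering":         "chip-B (Bayes classifier in silicon)",
    "image-recognition":      "chip-C (CNN accelerator)",
}

def pick_chip(workload: str) -> str:
    """Return the specialized chip for a workload, falling back to a
    general-purpose SoC that only does a few things well."""
    return CHIP_MENU.get(workload, "general-purpose SoC (limited AI ops)")

print(pick_chip("spam-filtering"))    # chip-B (Bayes classifier in silicon)
print(pick_chip("speech-synthesis"))  # general-purpose SoC (limited AI ops)
```

The design choice mirrors the argument in the text: rather than one chip that runs everything poorly, a buyer composes a system from narrow, highly optimized parts.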
I would bet on quantum computing AI chips, maybe 5 years from now (when error correction matures). To start with, they may be clumsy: big modules, perhaps at cryogenic temperatures, but just one of them could replace, say, 10,000 AI chips in a data center. As we get closer to perfection, maybe 10-20 years from now, a quantum AI chip could be available on the edge. That might even be overkill: there would be no need to depend on the internet at all. This may take some time but cannot be ruled out. Like an SoC today, it would be almost fully functional, albeit with some limitations. Quantum computing can propel us towards areas of science (like the discovery of new molecules, or the extra-terrestrial problem) that seem implausible today with the kind of hardware and software we are using. In the next 30 years, we are headed for a revolution, and maybe there will be a singularity: a machine becoming conscious. Don't rule it out just yet.
God Bless!