In my younger years, my teens and perhaps even earlier (all of that transpired many, many years ago, and details get a bit blurry), visual and written science fiction was a decidedly aspirational periscope. All manner of tech: computers processing enormous calculations of 0s and 1s in the blink of an eye, a chatbot in a wristband or in the car, sunglasses that could take a photograph, swipe gestures on an interface. The possibilities seemed endless. We crossed over from that realm, fiction into reality, probably a couple of years ago (some of it turned out cooler than the rest). Yet the pace at which tech is being chucked our way here and now is almost inconceivable, and very real.
I recall this because Microsoft CEO Satya Nadella’s year-end address provided that spark. “The pace of AI innovation this past year has been truly remarkable. But looking back, what stands out most is the incredible speed at which this technology has diffused,” were his words. On point, and inarguable. Nadella is, of course, on the frontlines of AI and wider tech innovation. The pace is so rapid that we’ve seen the arrival, and then the perceived demise, of AI wearables within a short window. Any idea that doesn’t immediately click gets rapidly discarded.
There’s a reason for that. I’d credit underlying tech that is changing fast; everything that builds on it therefore makes rapid leaps too.
In just the past few days, Google has unveiled a quantum chip that’s something of a breakthrough, Nvidia has unveiled an affordable yet astonishingly powerful generative AI supercomputer, OpenAI’s o1 model is out of preview, and Google has raised that move with its own Gemini 2.0 update. That’s not all. OpenAI has unlocked ChatGPT on WhatsApp too: add the official contact number to the phone’s address book, locate it in WhatsApp, and begin chatting in text-only input mode (for now, there is no visual input or voice mode) with the o1-mini model.
It becomes important to assess the significant progress Google and Nvidia have made.
Nvidia’s Jetson Orin Nano Super (the name’s a mouthful, but that shouldn’t hold you back) adds to the Jetson Orin computer line, and is the sort of attempt that could take this breed of mini PC mainstream. For $249 (that’s around ₹21,180), it registers a “1.7x leap in generative AI inference performance,” claims 70% more neural processing power than its predecessor (which puts it at 67 trillion operations per second, or TOPS), and 50% more memory bandwidth at 102GB/s. The base specs include a GPU based on Nvidia’s Ampere architecture with 1,024 CUDA cores and 32 tensor cores, a 6-core Arm Cortex-A78AE v8.2 64-bit CPU, and 8GB of memory.
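Those two claims line up, give or take rounding. A quick back-of-the-envelope check, with the caveat that the predecessor’s 40 TOPS rating is my assumption for illustration, not a figure from Nvidia’s announcement:

```python
# Sanity-check the quoted uplift: a 70% increase over an assumed
# ~40 TOPS predecessor should land near the quoted 67 TOPS.
PREDECESSOR_TOPS = 40   # assumed rating of the prior Jetson Orin Nano
UPLIFT = 1.7            # the quoted 70% increase in neural processing power

estimated_tops = PREDECESSOR_TOPS * UPLIFT
print(estimated_tops)   # 68.0, in the neighbourhood of the quoted 67 TOPS
```

In other words, the “70% more” and “67 TOPS” figures are consistent with each other, once rounding is accounted for.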
But why is all this pivotal to how tech is evolving? By handling 67 trillion operations per second in what is really a tiny form factor, Nvidia’s craziness makes this computing device more than relevant for robotics, research and small businesses, even for training advanced AI models and running somewhat lightweight AI applications. Agentic AI (believe me, you’ll hear a lot about it in the coming year) will find its legs in the real world with underlying, powerful tech such as this. Hobbyists should find it quite fun to work with, at just 25 watts of power consumption. It can essentially be plugged into other computing hardware, adding AI compute power easily.
Before I explain the significance of Google’s Willow chip, it becomes imperative to explain what quantum computing actually is. Simply put, it is a type of computation that harnesses the principles of quantum mechanics, specifically superposition, entanglement and quantum interference, to perform operations on data. This contrasts with classical computers, which use bits representing either 0 or 1; quantum computers use qubits, which can represent 0, 1, or a combination of both simultaneously. That allows quantum computers to work on multiple possibilities in parallel, and to solve certain complex problems much faster than classical computers.
Microsoft Azure’s documentation describes the difference between qubits and bits this way: “A qubit, however, can represent a 0, a 1, or any proportion of 0 and 1 in superposition of both states, with a certain probability of being a 0 and a certain probability of being a 1.”
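That “certain probability” idea can be sketched in a few lines of plain Python, no quantum hardware involved. This is a toy simulation of measuring one qubit, not how a real quantum computer operates:

```python
import math
import random

# A single qubit's state is a pair of amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1. Measuring it yields 0 with probability
# |alpha|^2 and 1 with probability |beta|^2.

def measure(alpha: complex, beta: complex) -> int:
    """Simulate one measurement of a qubit in state alpha|0> + beta|1>."""
    p0 = abs(alpha) ** 2
    return 0 if random.random() < p0 else 1

# An equal superposition: (|0> + |1>) / sqrt(2).
alpha = beta = 1 / math.sqrt(2)
print(abs(alpha) ** 2, abs(beta) ** 2)  # ~0.5 each, per the Azure description

# Measure many identically prepared qubits: roughly half come out 0, half 1.
random.seed(0)
samples = [measure(alpha, beta) for _ in range(10_000)]
print(sum(samples) / len(samples))  # close to 0.5
```

The parallelism Google and others chase comes from entangling many such qubits, so the amplitudes span all combinations at once rather than one value at a time.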
Google claims its latest quantum chip performed a standard benchmark computation in under five minutes that would take one of today’s fastest supercomputers 10 septillion years; they specifically reference ORNL’s exascale supercomputer, which itself exceeds a quintillion calculations per second. Willow is primed for scientists developing new tech and research for medicine, energy and materials.
Here’s something that would likely escape wider attention: Google researchers say that the more qubits they grouped together on the 105-qubit Willow processor, the greater the reduction in errors. In other words, scaling up delivered better error correction, not worse. Competition is already at the doorstep, so get your popcorn ready. Chinese scientists have developed the Zuchongzhi 3.0, which leads us to believe the race to build the world’s most powerful quantum computer is well and truly on.
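That counter-intuitive result (more qubits, fewer errors) can be illustrated with a toy calculation. Google reported that each step up in error-correction code size roughly halved the logical error rate; the suppression factor and base rate below are my illustrative assumptions, not Willow’s published figures:

```python
# Toy model of quantum error suppression: each step up in code
# distance (3 -> 5 -> 7, i.e. more physical qubits per logical qubit)
# divides the logical error rate by a constant factor.
SUPPRESSION_FACTOR = 2.0   # assumed error reduction per distance step
BASE_ERROR_RATE = 3e-3     # assumed logical error rate at distance 3

def logical_error_rate(distance: int) -> float:
    """Illustrative logical error rate at an odd code distance >= 3."""
    steps = (distance - 3) // 2
    return BASE_ERROR_RATE / (SUPPRESSION_FACTOR ** steps)

for d in (3, 5, 7, 9):
    print(d, logical_error_rate(d))  # rate halves at each step
```

The point of the demonstration is the trend, not the numbers: as long as physical qubits are good enough, adding more of them drives logical errors down exponentially, which is what makes a useful quantum computer look reachable.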
Scientists from the University of Science and Technology of China confirm that the Zuchongzhi 3.0, also a 105-qubit quantum processor, is a significant upgrade over its predecessor. The error-correction methodology is similar to Google’s approach with Willow, which makes this, for all intents and purposes, an at-par competition. In case you are wondering about developing use cases for quantum computing: artificial intelligence training, predictive AI deployment, better encryption and simplifying typical logistical puzzles all seem par for the course. And we aren’t even in 2025 yet.
Vishal Mathur is the technology editor for HT. Tech Tonic is a weekly column that looks at the impact of personal technology on the way we live, and vice-versa. The views expressed are personal.