RSS Feed Source: MIT Technology Review

On January 8, Nvidia CEO Jensen Huang jolted the stock market by saying that practical quantum computing is still 15 to 30 years away, while also suggesting that those computers will need Nvidia GPUs to implement the necessary error correction.

However, history shows that brilliant people are not immune to making mistakes. Huang’s predictions miss the mark, both on the timeline for useful quantum computing and on the role his company’s technology will play in that future.

I’ve been closely following developments in quantum computing as an investor, and it’s clear to me that the field is rapidly converging on utility. Last year, Google’s Willow device demonstrated that there is a promising pathway to scaling up to bigger and bigger computers. It showed that errors can be reduced exponentially as the number of quantum bits, or qubits, increases. It also ran a
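The exponential suppression Willow demonstrated is usually described with the standard surface-code model, in which the logical error rate falls geometrically as the code distance grows while the qubit count grows only quadratically. The sketch below illustrates that relationship; the constants A and Λ and the qubit-count formula are textbook illustrative values, not Google’s reported figures.

```python
# Illustrative sketch of surface-code error suppression (not Willow's data).
# Textbook model: logical error rate p_L = A * Lambda**(-(d + 1) / 2), where
# d is the code distance and Lambda is the suppression factor gained each
# time the distance is increased.

def physical_qubits(d: int) -> int:
    """Approximate qubit count of a distance-d surface-code patch."""
    return 2 * d * d - 1  # d^2 data qubits + (d^2 - 1) measurement qubits

def logical_error_rate(d: int, A: float = 0.1, Lambda: float = 2.0) -> float:
    """Logical error per cycle under the textbook suppression model."""
    return A * Lambda ** (-(d + 1) / 2)

for d in (3, 5, 7, 9, 11):
    print(f"d={d:2d}  qubits={physical_qubits(d):4d}  "
          f"p_L={logical_error_rate(d):.1e}")
```

With Λ = 2, every two-step increase in code distance halves the logical error rate while the physical qubit count grows only quadratically, which is the "more qubits, exponentially fewer errors" behavior the Willow result points to.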

RSS Feed Source: MIT Technology Review

In the rapidly evolving landscape of digital innovation, staying adaptable isn’t just a strategy—it’s a survival skill. “Everybody has a plan until they get punched in the face,” says Luis Niño, digital manager for technology ventures and innovation at Chevron, quoting Mike Tyson.

Drawing on a career that spans IT, HR, and infrastructure operations across the globe, Niño offers a unique perspective on innovation and on how organizational microcultures within Chevron shape the course of digital transformation.

Centralized functions prioritize efficiency, relying on tools like AI, data analytics, and scalable system architectures. Meanwhile, business units focus on simplicity and effectiveness, deploying robotics and edge computing to meet site-specific needs and ensure safety.

“From a digital transformation standpoint, what I have learned is that you have to tie your technology to what outcomes drive results for both areas, but you have to allow yourself to be flexible,

RSS Feed Source: MIT Technology Review

The Intel 4004, the first commercial microprocessor, was released in 1971. With 2,300 transistors packed into 12 mm², it heralded a revolution in computing. A little over 50 years later, Apple’s M2 Ultra contains 134 billion transistors.

The scale of progress is difficult to comprehend, but the evolution of semiconductors, driven for decades by Moore’s Law, has paved a path from the emergence of personal computing and the internet to today’s AI revolution.
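Those two transistor counts are, by themselves, enough to check that cadence. A quick back-of-the-envelope calculation (the 2023 release year for the M2 Ultra is my assumption; the article says only "a little over 50 years later"):

```python
import math

# Rough check of the doubling cadence implied by the figures quoted above.
intel_4004 = 2_300             # transistors, 1971
m2_ultra = 134_000_000_000     # transistors, assumed 2023 release
years = 2023 - 1971

growth = m2_ultra / intel_4004      # ~58 million-fold increase
doublings = math.log2(growth)       # ~25.8 doublings
print(f"{growth:,.0f}x growth, {doublings:.1f} doublings, "
      f"one doubling every {years / doublings:.1f} years")
```

That works out to a doubling roughly every two years, the cadence at which Moore’s Law is usually quoted.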

But this pace of innovation is not guaranteed, and the next frontier of technological advances—from the future of AI to new computing paradigms—will only happen if we think differently.

Atomic challenges

The modern microchip stretches the limits of both physics and credulity. Such is its atomic precision that a few atoms can decide the function of an entire chip. This marvel of engineering is the result of over 50 years of

RSS Feed Source: MIT Technology Review

In this exclusive webcast, we delve into the transformative potential of portable microservices for deploying generative AI models. We explore how startups and large organizations are leveraging this technology to streamline generative AI deployment, enhance customer service, and drive innovation across domains including chatbots, document analysis, and video generation.

WATCH NOW

Our discussion focuses on overcoming key challenges such as deployment complexity, security, and cost management. We also discuss how microservices can help executives realize business value with generative AI while maintaining control over data and intellectual property.
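The webcast does not name a specific framework, but the "portable microservice" pattern generally means wrapping a model behind a small, containerizable HTTP API that can move unchanged between clouds and on-prem clusters. A minimal sketch, assuming FastAPI and a Hugging Face text-generation pipeline purely as illustrative choices:

```python
# Minimal sketch of a generative-AI microservice: a model behind a small
# HTTP API that can be packaged into a container and deployed anywhere.
# FastAPI, uvicorn, and the "transformers" pipeline are illustrative choices,
# not technologies named in the webcast.
from fastapi import FastAPI
from pydantic import BaseModel
from transformers import pipeline

app = FastAPI()
generator = pipeline("text-generation", model="gpt2")  # small demo model

class Prompt(BaseModel):
    text: str
    max_new_tokens: int = 64

@app.post("/generate")
def generate(prompt: Prompt) -> dict:
    """Generate a completion for a single prompt."""
    out = generator(prompt.text, max_new_tokens=prompt.max_new_tokens)
    return {"completion": out[0]["generated_text"]}

# Run locally with:  uvicorn service:app --port 8000
```

Because the service exposes only an HTTP contract, the same container image can sit behind a chatbot, a document-analysis pipeline, or a video-generation workflow without the model weights or data leaving the organization’s own infrastructure.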
