RSS Feed Source: MIT Technology Review

The Intel 4004, the first commercial microprocessor, was released in 1971. With 2,300 transistors packed into 12 mm², it heralded a revolution in computing. A little over 50 years later, Apple’s M2 Ultra contains 134 billion transistors.

The scale of progress is difficult to comprehend, but the evolution of semiconductors, driven for decades by Moore’s Law, has paved a path from the emergence of personal computing and the internet to today’s AI revolution.
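To make that scale concrete, here is a quick back-of-the-envelope calculation (an illustration, not part of the article): the jump from the 4004’s 2,300 transistors to the M2 Ultra’s 134 billion works out to roughly one doubling every two years, in line with Moore’s Law.

```python
# Back-of-the-envelope check (illustrative, not from the article): how many
# transistor-count doublings separate the Intel 4004 (1971) from Apple's
# M2 Ultra (2023), and what doubling period does that imply?
import math

transistors_4004 = 2_300        # Intel 4004, 1971
transistors_m2_ultra = 134e9    # Apple M2 Ultra, 2023
years = 2023 - 1971

growth = transistors_m2_ultra / transistors_4004
doublings = math.log2(growth)
print(f"growth factor: {growth:.2e}")                              # ~5.8e7
print(f"doublings: {doublings:.1f}")                               # ~25.8
print(f"implied doubling period: {years / doublings:.1f} years")   # ~2.0 years
```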

But this pace of innovation is not guaranteed, and the next frontier of technological advances, from the future of AI to new computing paradigms, will be reached only if we think differently.

Atomic challenges

The modern microchip stretches the limits of both physics and credulity. Such is its atomic precision that a few atoms can decide the function of an entire chip. This marvel of engineering is the result of over 50 years of…


RSS Feed Source: MIT Technology Review

In this exclusive webcast, we delve into the transformative potential of portable microservices for the deployment of generative AI models. We explore how startups and large organizations are leveraging this technology to streamline generative AI deployment, enhance customer service, and drive innovation across domains, including chatbots, document analysis, and video generation.


Our discussion focuses on overcoming key challenges such as deployment complexity, security, and cost management. We also examine how microservices can help executives realize business value with generative AI while maintaining control over data and intellectual property.
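As a purely illustrative sketch of the portable-microservice pattern the webcast describes, the snippet below wraps a stand-in generative model behind a single HTTP endpoint; the framework choice (FastAPI), route name, and payload fields are assumptions for illustration, not anything specified in the webcast.

```python
# Minimal sketch (assumptions throughout): a generative model wrapped as a
# small, self-contained HTTP service so the same container image can be
# deployed on-prem or in any cloud. The model call is a placeholder.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class Prompt(BaseModel):
    text: str
    max_tokens: int = 128

def run_model(prompt: str, max_tokens: int) -> str:
    # Placeholder for a real generative model (e.g., a locally hosted LLM).
    return f"[generated {max_tokens}-token response to: {prompt!r}]"

@app.post("/generate")
def generate(req: Prompt):
    # Keeping inference behind one narrow endpoint is what makes the
    # service portable and keeps data inside the deployment boundary.
    return {"completion": run_model(req.text, req.max_tokens)}

# Run with: uvicorn service:app --port 8000
```

Because the service exposes only a narrow HTTP contract, the same container image can run on-premises or in any cloud, which is one way to keep data and intellectual property under the organization’s control.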



RSS Feed Source: MIT Technology Review

Tech companies have been funneling billions of dollars into quantum computers for years. The hope is that they’ll be a game changer for fields as diverse as finance, drug discovery, and logistics.

Those expectations have been especially high in physics and chemistry, where the weird effects of quantum mechanics come into play. In theory, this is where quantum computers could have a huge advantage over conventional machines.

But while the field struggles with the realities of tricky quantum hardware, another challenger is making headway in some of these most promising use cases. AI is now being applied to fundamental physics, chemistry, and materials science in a way that suggests quantum computing’s purported home turf might not be so safe after all.
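To give a flavor of what simulating quantum systems with AI can mean in practice, here is a minimal sketch in the spirit of neural-network quantum states, one common approach associated with Carleo’s group; the Hamiltonian, system size, and untrained parameters below are assumptions for illustration, not the specific work the article describes.

```python
# Minimal sketch (illustrative; sizes, Hamiltonian, and parameters are assumptions):
# represent the wavefunction of a small transverse-field Ising chain with an
# RBM-style neural-network ansatz and compare its variational energy to the
# exact ground-state energy from brute-force diagonalization.
import numpy as np

rng = np.random.default_rng(0)
N = 6                 # spins; 2^N = 64 basis states, small enough to enumerate
M = 2 * N             # hidden units in the RBM-style ansatz
J, h = 1.0, 0.5       # Ising coupling and transverse field

# Random (untrained) variational parameters; in practice these are optimized.
a = 0.1 * rng.standard_normal(N)       # visible biases
b = 0.1 * rng.standard_normal(M)       # hidden biases
W = 0.1 * rng.standard_normal((M, N))  # weights

def spins(idx):
    """Map a basis-state index to a spin configuration in {-1, +1}^N."""
    return np.array([1 if (idx >> i) & 1 else -1 for i in range(N)])

def amplitude(s):
    """Unnormalized RBM-style wavefunction amplitude psi(s)."""
    return np.exp(a @ s) * np.prod(2.0 * np.cosh(b + W @ s))

dim = 2 ** N
psi = np.array([amplitude(spins(i)) for i in range(dim)])
psi /= np.linalg.norm(psi)

# Build H = -J * sum_i s_i s_{i+1} - h * sum_i sigma^x_i (open boundaries).
H = np.zeros((dim, dim))
for idx in range(dim):
    s = spins(idx)
    H[idx, idx] = -J * np.sum(s[:-1] * s[1:])   # diagonal ZZ part
    for i in range(N):                          # sigma^x flips spin i
        H[idx, idx ^ (1 << i)] -= h

e_variational = psi @ H @ psi                   # energy of the (untrained) ansatz
e_exact = np.linalg.eigvalsh(H)[0]              # exact ground-state energy
print(f"variational energy: {e_variational:.4f}, exact ground state: {e_exact:.4f}")
```

In research settings the network parameters are optimized, for example by variational Monte Carlo, rather than left random, and sampling replaces brute-force enumeration so that much larger systems become tractable.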

The scale and complexity of quantum systems that can be simulated using AI are advancing rapidly, says Giuseppe Carleo, a professor of computational…


RSS Feed Source: MIT Technology Review

It turns out that you don’t need to be a scientist to encode data in DNA. Researchers have been working on DNA-based data storage for decades, but a new template-based method inspired by our cells’ chemical processes is easy enough for even nonscientists to use. The technique could open the door to an unusual but ultra-stable way to store information.

The idea of storing data in DNA was first proposed in the 1950s by the physicist Richard Feynman. Genetic material has exceptional storage density and durability; a single gram of DNA can store a trillion gigabytes of data and retain the information for thousands of years. Decades later, a team led by George Church at Harvard University put the idea into practice, encoding a 53,400-word book.
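To illustrate the basic principle behind that density figure (a generic two-bits-per-base mapping, not the scheme Church’s team used and not the new template-based method), the sketch below encodes ordinary bytes as a nucleotide string and decodes them back.

```python
# Toy illustration (not the encoding used by Church's team or the new
# template-based method): a generic mapping of two bits per nucleotide,
# which is where DNA's theoretical storage density comes from.
BASE_FOR_BITS = {"00": "A", "01": "C", "10": "G", "11": "T"}
BITS_FOR_BASE = {base: bits for bits, base in BASE_FOR_BITS.items()}

def encode(data: bytes) -> str:
    """Encode bytes as a DNA sequence, two bits per base (4 bases per byte)."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BASE_FOR_BITS[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(sequence: str) -> bytes:
    """Recover the original bytes from a DNA sequence produced by encode()."""
    bits = "".join(BITS_FOR_BASE[base] for base in sequence)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

message = b"hello, DNA"
strand = encode(message)
print(strand)                   # "CGGACGCCCGTA..." (4 bases per input byte)
assert decode(strand) == message
```

Real systems layer error correction on top of a mapping like this and avoid sequences that are hard to synthesize or read back, such as long runs of a single base.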

This early approach relied on DNA synthesis—stringing genetic sequences together piece by piece, like beads on a…
