Specialist ‘carbon nanotube’ AI chip built by Chinese scientists is 1st of its kind and ‘1,700 times more efficient’ than Google’s

Unlike conventional TPUs, this new chip is the first to use carbon nanotubes — tiny, cylindrical structures made of carbon atoms arranged in a hexagonal pattern — in place of traditional semiconductor materials like silicon. | Credit: Getty Images/sankai

Scientists in China have built a new type of tensor processing unit (TPU) — a specialized computer chip — using carbon nanotubes instead of a traditional silicon semiconductor. They say the new chip could open the door to more energy-efficient artificial intelligence (AI).

AI models are hugely data-intensive and require massive amounts of computational power to run. This presents a significant obstacle to training and scaling up machine learning models, particularly as the demand for AI applications grows. This is why scientists are working on making new components — from processors to computing memory — that are designed to consume orders of magnitude less energy while running the necessary computations.

Google scientists created the TPU in 2015 to address this challenge. These specialized chips act as dedicated hardware accelerators for tensor operations — complex mathematical calculations used to train and run AI models. By offloading these tasks from the central processing unit (CPU) and graphics processing unit (GPU), TPUs enable AI models to be trained faster and more efficiently.
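
In practice, those tensor operations are dominated by large batches of matrix multiply-accumulate work. As a rough illustration only (the shapes below are arbitrary and chosen for the example, not taken from any real chip), here is what a single such operation looks like in Python:

```python
import numpy as np

# One dense-layer step of the kind a TPU accelerates: a matrix multiply
# followed by a bias add. The shapes are arbitrary, for illustration only.
inputs = np.random.rand(128, 256)    # a batch of 128 activation vectors
weights = np.random.rand(256, 64)    # a layer's weight matrix
bias = np.random.rand(64)

outputs = inputs @ weights + bias    # multiply-accumulate across the whole batch
print(outputs.shape)                 # (128, 64)
```

An accelerator's job is to carry out enormous numbers of these multiply-accumulate steps every second while moving as little data as possible.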

Unlike conventional TPUs, however, this new chip is the first to use carbon nanotubes — tiny, cylindrical structures made of carbon atoms arranged in a hexagonal pattern — in place of traditional semiconductor materials like silicon. This structure allows electrons (charged particles) to flow through them with minimal resistance, making carbon nanotubes excellent conductors of electricity. The scientists published their research on July 22 in the journal Nature Electronics.

Related: Razor-thin crystalline film ‘built atom-by-atom’ gets electrons moving 7 times faster than in semiconductors

According to the scientists, their TPU consumes just 295 microwatts (μW) of power (1 W is 1,000,000 μW) and can deliver 1 trillion operations per watt, a measure of energy efficiency. By comparison, Google’s Edge TPU performs 4 trillion operations per second (TOPS) while drawing 2 W of power. On that basis, the researchers say, the carbon-based TPU is nearly 1,700 times more energy-efficient.
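
The comparison only works out if the chip’s roughly 1 trillion operations are read as per-second throughput delivered at the 295 μW power draw; under that reading (an assumption about how the figures fit together, not a statement from the paper), the back-of-the-envelope arithmetic looks like this:

```python
# Rough check of the "~1,700 times" figure, assuming the carbon nanotube TPU
# delivers about 1 trillion operations per second while drawing 295 microwatts.
cnt_ops_per_second = 1e12               # assumed throughput: ~1 TOPS
cnt_power_watts = 295e-6                # reported power draw: 295 uW

edge_ops_per_second = 4e12              # Google Edge TPU: 4 TOPS
edge_power_watts = 2.0                  # at 2 W

cnt_efficiency = cnt_ops_per_second / cnt_power_watts       # ops per second per watt
edge_efficiency = edge_ops_per_second / edge_power_watts

print(f"CNT TPU:  ~{cnt_efficiency / 1e12:,.0f} TOPS/W")    # ~3,390 TOPS/W
print(f"Edge TPU:  {edge_efficiency / 1e12:,.0f} TOPS/W")   # 2 TOPS/W
print(f"Ratio:    ~{cnt_efficiency / edge_efficiency:,.0f}x")  # ~1,695x, i.e. nearly 1,700
```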

“From ChatGPT to Sora, artificial intelligence is ushering in a new revolution, but traditional silicon-based semiconductor technology is increasingly unable to meet the processing needs of massive amounts of data,” Zhiyong Zhang, co-author of the paper and professor of electronics at Beijing’s Peking University, told TechXplore. “We have found a solution in the face of this global challenge.”

The new TPU is composed of 3,000 carbon nanotube transistors and is built with a systolic array architecture — a network of processors arranged in a grid-like pattern.

RELATED STORIES

Unique transistor ‘could change the world of electronics’ thanks to nanosecond-scale switching speeds and refusal to wear out

Intel unveils largest-ever AI ‘neuromorphic computer’ that mimics the human brain

‘Crazy idea’ memory device could slash AI energy consumption by up to 2,500 times

Systolic arrays pass data through each processor in a synchronized, step-by-step sequence, similar to items moving along a conveyor belt. This enables the TPU to perform multiple calculations simultaneously by coordinating the flow of data and ensuring that each processor works on a small part of the task at the same time.

This parallel processing enables computations to be performed much more quickly, which is crucial for AI models processing large amounts of data. It also reduces how often the memory — specifically a type called static random-access memory (SRAM) — needs to read and write data, Zhang said. By minimizing these operations, the new TPU can perform calculations faster while using much less energy.
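
To make the conveyor-belt picture concrete, here is a toy software simulation of how a systolic array can compute a matrix product. It sketches the general technique only (an output-stationary dataflow with an arbitrary array size), not the chip’s actual circuitry:

```python
import numpy as np

def systolic_matmul(A, B):
    """Toy output-stationary systolic array: each cell (i, j) accumulates one
    element of A @ B as operands stream past it, one clock step at a time."""
    n, k = A.shape
    k2, m = B.shape
    assert k == k2
    acc = np.zeros((n, m))      # one accumulator per processing element (PE)
    a_reg = np.zeros((n, m))    # operand each PE currently holds from its left neighbour
    b_reg = np.zeros((n, m))    # operand each PE currently holds from above

    # One loop iteration = one clock step: operands shift one cell right/down,
    # then every PE performs a multiply-accumulate in that same step. Each value
    # is fetched from memory once and then reused as it moves through the grid,
    # which is what cuts down on SRAM reads and writes.
    for step in range(n + m + k - 2):
        a_reg = np.roll(a_reg, 1, axis=1)       # shift operands one PE to the right
        b_reg = np.roll(b_reg, 1, axis=0)       # shift operands one PE downward
        for i in range(n):                      # feed a skewed column of A in at the left edge
            t = step - i
            a_reg[i, 0] = A[i, t] if 0 <= t < k else 0.0
        for j in range(m):                      # feed a skewed row of B in at the top edge
            t = step - j
            b_reg[0, j] = B[t, j] if 0 <= t < k else 0.0
        acc += a_reg * b_reg                    # all n * m PEs fire in parallel
    return acc

A, B = np.random.rand(3, 4), np.random.rand(4, 5)
print(np.allclose(systolic_matmul(A, B), A @ B))    # True
```

In hardware, the shift and the multiply-accumulate happen in every cell simultaneously on each clock tick; the Python loop only serializes that for the sake of the simulation.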

To test their new chip, the scientists built a five-layer neural network — a collection of machine learning algorithms designed to mimic the structure of the human brain — and used it for image recognition tasks.
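
The article does not describe the network’s exact architecture, so the sketch below is only a generic five-layer, fully connected network with hypothetical layer widths and a dummy input, meant to show the kind of computation being run (the actual results follow below):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical layer widths for a small image-recognition network; the paper's
# actual architecture, sizes and training are not described in this article.
layer_sizes = [28 * 28, 256, 128, 64, 32, 10]   # flattened image -> 5 layers -> 10 classes

weights = [rng.normal(0, 0.1, (a, b)) for a, b in zip(layer_sizes[:-1], layer_sizes[1:])]
biases = [np.zeros(b) for b in layer_sizes[1:]]

def forward(x):
    """Forward pass through five dense layers, with ReLU between them."""
    for w, b in zip(weights[:-1], biases[:-1]):
        x = np.maximum(x @ w + b, 0.0)          # dense layer + ReLU
    return x @ weights[-1] + biases[-1]         # final layer: one score per class

image = rng.random(28 * 28)                     # a dummy flattened image
print(forward(image).argmax())                  # index of the predicted class
```

Every one of those matrix multiplications is exactly the kind of tensor operation the carbon nanotube TPU is built to accelerate.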

The TPU achieved an accuracy rate of 88% while maintaining power consumption of only 295 μW. In the future, similar carbon nanotube-based technology could provide a more energy-efficient alternative to silicon-based chips, the researchers said.

The scientists said they plan to keep refining the chip to improve its performance and scalability, including by exploring how the TPU could be integrated into silicon CPUs.
