A groundbreaking memory cell has been developed that combines light and magnetic fields to perform high-speed calculations while storing data. This innovation allows computations to occur directly within the memory array, significantly enhancing processing speeds and reducing energy consumption. Such advancements could revolutionize data centers for artificial intelligence systems, addressing the growing demand for efficient computing solutions.
The memory cell's ability to encode multiple values per cell, unlike traditional binary systems, opens new avenues for machine learning applications. By enabling in-memory computing, this technology could improve the performance of artificial neural networks, which loosely model how networks of neurons in the brain process information. Researchers are optimistic that scaling this technology could lead to substantial reductions in power usage for AI systems.
• New memory chip could significantly reduce power consumption in AI systems.
• In-memory computing enhances processing speeds for artificial intelligence applications.
In-memory computing: Performing computations directly within the memory array, enhancing speed and efficiency.
Artificial neural networks: Machine learning algorithms that process data similarly to the human brain, benefiting from improved memory technology.
Memory cells: Cells that utilize magnetic fields and light signals to perform high-speed computations and data storage.
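To make the in-memory computing idea concrete, here is a minimal toy sketch of how a crossbar of multi-level memory cells could compute a matrix-vector product during read-out, so the result emerges from the memory array itself rather than from data shuttled to a separate processor. All class and parameter names here are illustrative assumptions, not the actual device described above.

```python
# Toy model of in-memory computing with multi-level cells.
# Assumption: 4 levels per cell stands in for "multiple values per cell";
# the real device's encoding is not described in the article.

LEVELS = 4

def quantize(weight, levels=LEVELS):
    """Clamp a weight into one of the cell's discrete levels (0..levels-1)."""
    return max(0, min(levels - 1, round(weight)))

class CrossbarArray:
    """Memory array whose read-out accumulates products along each column."""
    def __init__(self, weights):
        # Program each cell with a quantized multi-level value.
        self.cells = [[quantize(w) for w in row] for row in weights]

    def compute(self, inputs):
        # The "computation" happens during read-out: each input drives a
        # row, and the sums down each column yield dot products, so no
        # data leaves the memory array.
        cols = len(self.cells[0])
        return [sum(self.cells[r][c] * inputs[r] for r in range(len(inputs)))
                for c in range(cols)]

array = CrossbarArray([[1, 3], [2, 0], [3, 2]])
print(array.compute([1, 1, 1]))  # column sums: [6, 5]
```

In a neural-network setting, the programmed cell values would play the role of layer weights, which is why in-memory computing is attractive for the matrix-heavy workloads of AI systems.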