UC Berkeley research paper proposes solution to memory "bottleneck" limiting AI models

Co-authors Hao Liu (left) and Pieter Abbeel | path.berkeley.edu, people.eecs.berkeley.edu

A new research paper titled "Ring Attention with Blockwise Transformers for Near-Infinite Context" proposes a way to eliminate the memory "bottleneck" that restricts how much input AI models can handle. The paper is co-authored by UC Berkeley PhD student Hao Liu, UC Berkeley professor Pieter Abbeel, and Databricks CTO Matei Zaharia.

Currently, AI models run on graphics processing units (GPUs), which must hold the model's internal outputs in memory as it processes input. Because that memory footprint grows with the length of the input, GPU memory caps how much a model can take in at once. Liu explained in an interview with Business Insider that the goal of the research was to remove this limitation, no matter how fast the GPUs are.
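
To make the scale of the bottleneck concrete, here is a rough back-of-the-envelope sketch in Python; the figures are illustrative and not from the paper. Standard attention materializes an n-by-n matrix of scores between every pair of tokens, so its memory cost grows quadratically with context length. The fp16 assumption and the per-head, per-layer framing below are simplifications of our own.

```python
# Rough illustration: naive attention builds an n x n score matrix,
# so memory grows quadratically with context length n.
# Assumes fp16 (2 bytes per entry) and counts a single attention
# head in a single layer; real models multiply this many times over.
def attention_matrix_gib(n_tokens: int, bytes_per_entry: int = 2) -> float:
    return n_tokens ** 2 * bytes_per_entry / 2 ** 30

for n in (32_000, 100_000, 1_000_000):
    print(f"{n:>9,} tokens -> {attention_matrix_gib(n):>7,.1f} GiB per head per layer")
#    32,000 tokens ->     1.9 GiB per head per layer
#   100,000 tokens ->    18.6 GiB per head per layer
# 1,000,000 tokens -> 1,862.6 GiB per head per layer
```

Even before counting model weights or the many other heads and layers, the score matrix alone outgrows a single GPU's memory long before a million tokens.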

The researchers' solution arranges the GPUs in a ring. Each GPU computes attention on its own block of the input, passes its block of intermediate data to the next GPU in the ring, and simultaneously receives a block from the previous one, so communication overlaps with computation. Because the work is distributed blockwise around the ring, no single GPU ever has to hold the whole computation, which effectively eliminates the memory issue and lets AI models accept and process far larger contexts, Liu said.
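
As a rough illustration of the idea, the following is a minimal single-process sketch in Python/NumPy, not the authors' implementation: Python lists stand in for per-GPU memory, each simulated device keeps one block of queries, and the key/value blocks rotate around the ring while every device folds each incoming block into a running ("online") softmax, so the full attention matrix is never materialized anywhere.

```python
import numpy as np

def ring_attention(q_blocks, k_blocks, v_blocks):
    """Toy ring attention: device i keeps q_blocks[i]; K/V blocks rotate
    around the ring, and each device maintains an online softmax so it
    only ever sees one block of scores at a time."""
    n_dev = len(q_blocks)
    d = q_blocks[0].shape[-1]
    # Per-device running state for the online softmax.
    out = [np.zeros_like(q) for q in q_blocks]            # weighted-value accumulators
    row_max = [np.full(q.shape[0], -np.inf) for q in q_blocks]
    row_sum = [np.zeros(q.shape[0]) for q in q_blocks]

    k_cur, v_cur = list(k_blocks), list(v_blocks)
    for _ in range(n_dev):                                # n_dev hops around the ring
        for i in range(n_dev):
            s = q_blocks[i] @ k_cur[i].T / np.sqrt(d)     # scores vs. this K block only
            new_max = np.maximum(row_max[i], s.max(axis=-1))
            scale = np.exp(row_max[i] - new_max)          # rescale earlier partial sums
            p = np.exp(s - new_max[:, None])
            row_sum[i] = row_sum[i] * scale + p.sum(axis=-1)
            out[i] = out[i] * scale[:, None] + p @ v_cur[i]
            row_max[i] = new_max
        # "Send" each K/V block to the next device in the ring.
        k_cur = k_cur[-1:] + k_cur[:-1]
        v_cur = v_cur[-1:] + v_cur[:-1]

    return [o / z[:, None] for o, z in zip(out, row_sum)]

if __name__ == "__main__":
    # Sanity check: the blockwise ring result matches ordinary full attention.
    rng = np.random.default_rng(0)
    q, k, v = (rng.normal(size=(8, 4)) for _ in range(3))
    split = lambda x: np.split(x, 4)                      # pretend we have 4 "devices"
    ring = np.concatenate(ring_attention(split(q), split(k), split(v)))
    s = q @ k.T / np.sqrt(4)
    w = np.exp(s - s.max(-1, keepdims=True))
    full = (w / w.sum(-1, keepdims=True)) @ v
    assert np.allclose(ring, full)
```

In the real system, each block hand-off travels over the device interconnect at the same time as the device computes on the block it already holds, so the communication cost hides behind the computation; the list rotation above is only a stand-in for that transfer.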

The new method, known as Ring Attention, could significantly increase the number of tokens an AI model can handle. A "context window" is the space where users feed information into an AI model, and "tokens" are the words, numbers, or other bits of information in that input. Current models like GPT-4 can handle around 32,000 tokens, and the largest context windows reach roughly 100,000 tokens, about the length of a novel. Ring Attention could let AI models handle millions of tokens, a dramatic expansion of their capabilities.

The implications of this breakthrough are vast, according to the research paper. AI models using Ring Attention could digest scientific data such as gene sequences, interpret and generate code, and learn from trial and error. They could also analyze large numbers of books and movies and generate useful information about them.

Liu expressed excitement about the method's potential applications, particularly in the hands of big tech companies, and said the research could open the door to AI advances across a range of fields.

The research paper marks a significant step toward overcoming the memory limitations that have held back AI models. With Ring Attention, models could handle larger contexts and process more information than ever before, Business Insider reported.
