November 16, 2024

Google invents more efficient AI training technology – ITdaily

Google DeepMind proposes a new way to train AI in a research paper. The technique is said to be thirteen times faster and, above all, ten times more power-efficient than current training methods.

In a research paper, Google DeepMind presents JEST, short for joint example selection training. This is a new technique for training AI models. JEST is said to be 13 times faster than current AI training techniques and 10 times more power-efficient. The latter is particularly important, as AI training has driven up the energy demand of data centers, especially those of the tech giants. Google's own CO2 emissions have not decreased; they have risen by 48 percent in five years.

Quality over quantity

JEST is supposed to help keep AI training manageable. The technique works in several steps. First, a small AI model is trained that can grade the training data by quality. The data is then grouped into batches. In the second stage, the large model is trained on the batches of high-quality data. By filtering for quality up front, the need for quantity is reduced.
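The selection step can be sketched roughly as follows. This is a minimal, hypothetical illustration of quality-based batch filtering in Python, not DeepMind's actual JEST implementation; the reference model, the loss-based quality score, and the keep_fraction parameter are assumptions made for the example.

```python
import torch
import torch.nn.functional as F

def select_high_quality_batch(candidates, reference_model, keep_fraction=0.1):
    """Keep only the candidate examples that a small reference model
    scores as highest quality.

    `candidates` is a list of (inputs, labels) tensor pairs and
    `reference_model` is a cheap pre-trained classifier; both are
    assumptions for this sketch, not taken from the JEST paper.
    """
    scores = []
    with torch.no_grad():
        for x, y in candidates:
            # Use the reference model's loss as a rough quality proxy:
            # examples it handles well are treated as cleaner data.
            # JEST's actual criterion also takes the learner model's
            # own loss into account when scoring batches.
            loss = F.cross_entropy(reference_model(x), y)
            scores.append(loss.item())

    # Keep the lowest-loss fraction of the pool as one high-quality batch.
    k = max(1, int(len(candidates) * keep_fraction))
    keep = sorted(range(len(candidates)), key=lambda i: scores[i])[:k]
    return [candidates[i] for i in keep]
```

The large model would then run its normal training loop only on the batches returned by such a filter, which is where the claimed reduction in compute comes from.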

The availability of high-quality data is critical to the JEST approach. In practice, this more complex technique is primarily suited to larger players who want to train large models. That’s where the efficiency gains come in. GPT-4o has already cost around $100 million to train. Anthropic’s CEO expects the next generation of models to cost around $1 billion.
