DeepMind used its AlphaZero AI to solve a fundamental mathematical problem in computer science, breaking a record that had stood for over 50 years, Technology Review reports.

The problem is matrix multiplication, a critical type of computation that underpins applications ranging from displaying images on a screen to simulating complex physical processes.

Despite its widespread use, the operation is still not fully understood. A matrix is simply a grid of numbers that can represent almost anything, and the basic technique for multiplying two of them is taught in high school.
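As a rough sketch (not DeepMind's code), the schoolbook technique multiplies two n-by-n matrices with three nested loops, using n³ scalar multiplications in total:

```python
def matmul_naive(A, B):
    """Schoolbook matrix multiplication; also counts scalar multiplications."""
    n = len(A)
    C = [[0] * n for _ in range(n)]
    mults = 0
    for i in range(n):
        for j in range(n):
            for k in range(n):
                C[i][j] += A[i][k] * B[k][j]
                mults += 1
    return C, mults

C, mults = matmul_naive([[1, 2], [3, 4]], [[5, 6], [7, 8]])
# C == [[19, 22], [43, 50]], mults == 8 (n**3 for n = 2)
```

For n = 4 the same loop performs 4³ = 64 multiplications, which is the 64-step count mentioned below.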

Things get far more complicated, however, when you try to find a faster method. According to the scientists, there are more possible ways to multiply two matrices than there are atoms in the universe.

“The number of possible actions is almost infinite,” said DeepMind engineer Thomas Hubert.

The researchers’ approach was to turn the task into a kind of board game called TensorGame. The board represents a multiplication problem, and each move is a step toward solving it; a winning sequence of moves therefore constitutes an algorithm.

The scientists then trained a new version of AlphaZero, called AlphaTensor, to play this game. As with chess or Go, the AI learned the best sequences of moves for multiplying matrices, receiving a reward for winning in the minimum number of steps.

“We turned it into a game — our favorite kind of framework,” Hubert said.

The researchers’ main result is a speed-up for this problem. For example, the basic schoolbook method for multiplying two four-by-four matrices takes 64 multiplication steps. The fastest previously known approach, discovered in 1969 by the German mathematician Volker Strassen, takes 49. AlphaTensor did it in 47.
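To illustrate where the 49 comes from: Strassen's scheme multiplies two 2-by-2 matrices with 7 scalar multiplications instead of 8, and applying it recursively to a 4-by-4 matrix (viewed as a 2-by-2 matrix of 2-by-2 blocks) gives 7 × 7 = 49 multiplications. A minimal sketch of the 2-by-2 case (again, not DeepMind's code):

```python
def strassen_2x2(A, B):
    """Strassen's algorithm for 2x2 matrices: 7 multiplications instead of 8."""
    (a, b), (c, d) = A
    (e, f), (g, h) = B
    m1 = (a + d) * (e + h)
    m2 = (c + d) * e
    m3 = a * (f - h)
    m4 = d * (g - e)
    m5 = (a + b) * h
    m6 = (c - a) * (e + f)
    m7 = (b - d) * (g + h)
    return [[m1 + m4 - m5 + m7, m3 + m5],
            [m2 + m4, m1 - m2 + m3 + m6]]

# Same result as the schoolbook method, with one multiplication fewer:
# strassen_2x2([[1, 2], [3, 4]], [[5, 6], [7, 8]]) == [[19, 22], [43, 50]]
```

The extra additions and subtractions are cheap compared with multiplications, which is why trimming the multiplication count, as AlphaTensor does, is what matters for large matrices.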

According to the researchers, the DeepMind system outperforms the best existing algorithms for more than 70 different matrix sizes. They were impressed by the number of different correct algorithms that AlphaTensor found for each problem.

“It’s amazing that there are at least 14,000 ways to multiply four-by-four matrices,” says Hussein Fawzi, a researcher at DeepMind.

Having searched for the fastest algorithms in theory, the team then used AlphaTensor to look for algorithms tailored to specific hardware: Nvidia V100 GPUs and Google TPUs. In tests, the resulting algorithms ran 10–20% faster than the standard methods on the same chips.

Matrix multiplication is also fundamental to machine learning itself, the researchers note: speeding it up could have a big impact on thousands of everyday computing tasks, cutting costs and saving energy.

In the future, DeepMind plans to use AlphaTensor to find other types of algorithms.

Recall that in July, the AI lab said its AlphaFold system had predicted the structures of more than 200 million proteins — almost all the proteins known to science in plants, bacteria and animals.

In May, DeepMind introduced a visual language model with 80 billion parameters.


