News
Significant advances in distributed computing have focused on coding techniques to improve both security and efficiency in matrix multiplication tasks.
Matrix multiplication is a fundamental operation in machine learning, and one of the most time-consuming, owing to the extensive use of multiply-add instructions.
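To illustrate the point (this sketch is ours, not from the article): multiplying two n x n matrices the straightforward way performs n^3 multiply-add operations, which is why the operation dominates runtime at scale.

```python
def matmul(a, b):
    """Naive matrix multiplication: O(n^3) multiply-adds for square inputs."""
    n, m, p = len(a), len(b), len(b[0])
    c = [[0.0] * p for _ in range(n)]
    for i in range(n):
        for k in range(m):
            for j in range(p):
                c[i][j] += a[i][k] * b[k][j]  # one multiply-add per step
    return c
```

Each of the n * m * p innermost iterations issues exactly one multiply-add, so halving the multiplication count of a scheme directly cuts the dominant cost.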
This could lead to more advanced LLMs, which rely heavily on matrix multiplication to function. According to DeepMind, these feats are just the tip of the iceberg for AlphaEvolve.
DeepMind breaks 50-year math record using AI; new record falls a week later
AlphaTensor discovers better algorithms for matrix math, inspiring another improvement from afar.
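For context on the record: it traces back to Strassen's 1969 algorithm, which multiplies 2x2 matrices using 7 multiplications instead of the naive 8, and which AlphaTensor-style searches improved on for certain larger block sizes. A minimal sketch of Strassen's 2x2 scheme, written out for illustration:

```python
def strassen_2x2(a, b):
    """Strassen's scheme: 7 multiplications (m1..m7) instead of 8."""
    (a11, a12), (a21, a22) = a
    (b11, b12), (b21, b22) = b
    m1 = (a11 + a22) * (b11 + b22)
    m2 = (a21 + a22) * b11
    m3 = a11 * (b12 - b22)
    m4 = a22 * (b21 - b11)
    m5 = (a11 + a12) * b22
    m6 = (a21 - a11) * (b11 + b12)
    m7 = (a12 - a22) * (b21 + b22)
    # Recombine the 7 products into the 4 output entries
    return [[m1 + m4 - m5 + m7, m3 + m5],
            [m2 + m4, m1 - m2 + m3 + m6]]
```

Applied recursively to block matrices, saving one multiplication per 2x2 step lowers the asymptotic exponent below 3, which is why discovering schemes with fewer multiplications matters.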