AI Energy Use

  • Published: 5 Jun 2024
  • As the use of artificial intelligence systems becomes more widespread in the devices we use daily, challenges of scale are being revealed. Running these networks takes a lot of energy, and as we build more of them, even more energy will be needed. We’ll explore one new approach that may allow neural networks to process information more efficiently in the U.S. National Science Foundation’s “Discovery Files”.
    Today’s AI neural networks rely on electrical components called graphics processing units (GPUs). In these systems, the entire network and all of its interactions must be loaded sequentially from external memory, which consumes both time and energy.
    NSF-supported researchers at the University of Michigan have developed the first memristor electrical components with tunable, controllable conductivity; the approach was shown to reduce AI's energy needs by a factor of 90 compared with traditional GPUs.
    The group found that variations in the base materials result in different relaxation times, enabling memristor networks to be tuned to mimic the timekeeping mechanism that features in both artificial and biological neural networks, without the time and energy costs of external memory (a toy sketch of this idea follows below).
    The state-of-the-art material is made from a mixture of earth-abundant and non-toxic ingredients. These memristors could be further developed for cost-effective mass production, offering an energy-saving approach as AI systems become a bigger part of our daily lives.
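
A toy sketch of the relaxation-time idea (not the researchers' actual model): a leaky memory element whose stored value decays toward a baseline with a tunable time constant tau. A short tau forgets quickly, a long tau holds its state longer, and mixing the two gives a network a built-in sense of time without reloading weights from external memory. The function name, parameter names, and numerical values below are illustrative assumptions.

import math

def relax(state, dt, tau, baseline=0.0):
    """Decay `state` toward `baseline` with relaxation time `tau`."""
    return baseline + (state - baseline) * math.exp(-dt / tau)

# Two hypothetical devices tuned to different relaxation times (arbitrary units).
fast, slow = 1.0, 1.0                     # both start with the same stored value
for step in range(5):
    fast = relax(fast, dt=1.0, tau=0.5)   # short tau: forgets within a few steps
    slow = relax(slow, dt=1.0, tau=10.0)  # long tau: retains most of its value
    print(f"t={step + 1}: fast={fast:.3f}  slow={slow:.3f}")

In this toy picture, elements with different relaxation times track events on different timescales directly in the hardware, which is the kind of efficiency gain the transcript attributes to the tunable memristors.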