Chatbot: study proves rapid development of AI models

Do you regularly use a chatbot such as ChatGPT? You’re not alone: these AI helpers are becoming increasingly popular, and that is unlikely to change any time soon. In fact, AI-based helpers could become even more important in the future, as a recently published study suggests. It shows that large language models are developing at an impressive rate.

Chatbot development faster than chip technology

When it comes to technical progress, computer chips have long served as the benchmark. After all, there is hardly any other technical field in which development is as rapid. That now appears to have changed: the performance of AI-based chatbots is increasing even faster. A scientific study has focused on large language models (LLMs), the underlying models that give chatbots their language abilities.

According to the researchers, computing efficiency in particular has improved significantly: roughly every eight months, the compute required for the same task halves. Computer chips simply cannot keep up with this, as a doubling of performance there only occurs every one and a half to two years, the trend described by Moore’s Law. We may now need a comparable rule of thumb for chatbots, or rather for LLMs.
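The gap between those two rates compounds quickly. A minimal sketch of the arithmetic, using the article’s illustrative figures (an 8-month halving of required compute for LLMs versus a roughly 24-month doubling for chips; the exact numbers in the preprint may differ):

```python
def improvement_factor(months: float, doubling_period_months: float) -> float:
    """Performance multiplier after `months`, if performance doubles
    every `doubling_period_months` months."""
    return 2 ** (months / doubling_period_months)

months = 24  # look two years ahead

# LLM compute efficiency: halving workload every 8 months
# is equivalent to doubling efficiency every 8 months.
llm_gain = improvement_factor(months, 8)

# Chip performance: Moore's Law-style doubling every ~24 months.
chip_gain = improvement_factor(months, 24)

print(f"After 2 years: LLM efficiency x{llm_gain:.1f}, chips x{chip_gain:.1f}")
```

Over two years this works out to roughly an eightfold efficiency gain for the models versus a twofold gain for the chips, which is the disparity the study highlights.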

The focus is on the algorithm

While computer chips can only deliver more computing power through better hardware, with chatbots and LLMs the focus is on the software. This has now been emphasized by Tamay Besiroglu of the Massachusetts Institute of Technology (MIT). In an interview with New Scientist, he says that it is the algorithms that determine the performance of chatbots.

And it is precisely these algorithms that seem to be developing at breakneck speed. According to Besiroglu, however, performance gains are also possible in other ways: LLMs can be scaled up, but that in turn demands more computing power from the system. This requires suitable chip hardware such as Nvidia’s H100 accelerator, which is currently proving a real windfall for the company.

However, that solution is not only expensive, it is also constrained by the limited supply of chips designed specifically for AI applications. In their study, the team of scientists scrutinized a total of 231 AI models. If you would like to read the entire study, it is available as a preprint on arXiv.
