NIXsolutions: Integrating AI into Search Engines Will Require a 5x Increase in Computing Power

Combining search engines with large language models could require a fivefold increase in computing power and drive up carbon emissions, according to Insider and Wired.

Insider, citing QScale co-founder Martin Bouchard, wrote that generative AI search requires at least four to five times more processing power than a standard search query. In his view, existing data center infrastructure is not sufficient to meet demand for the technology.


According to UBS, ChatGPT now has about 13 million daily users, while Microsoft Bing handles half a billion searches per day and Google handles 8.5 billion.

Wired cited Alan Woodward, professor of cybersecurity at the University of Surrey (UK), who said that indexing and searching Internet content already consumes huge resources, and that AI integration demands capacity of an entirely different order.

Although neither OpenAI nor Google has disclosed the computing costs of developing their products, experts estimate that training GPT-3 consumed about 1,287 MWh and produced over 550 tons of carbon dioxide emissions, says SecurityLab. For comparison, Insider noted that an average car emits 4.6 tons of carbon dioxide per year.
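To put those two reported figures side by side, here is a quick back-of-envelope calculation; the numbers come from the article above, but the car-years comparison itself is my own arithmetic, not something the cited outlets computed:

```python
# Figures reported above (estimates, not disclosed by OpenAI or Google).
gpt3_training_emissions_tons = 550   # CO2 from GPT-3 training, per expert estimates
car_annual_emissions_tons = 4.6      # average car per year, per Insider

# How many car-years of driving produce the same emissions as one training run?
car_years = gpt3_training_emissions_tons / car_annual_emissions_tons
print(f"GPT-3 training ~ {car_years:.0f} car-years of emissions")  # ~ 120
```

By this rough measure, a single GPT-3 training run emitted about as much carbon dioxide as 120 average cars do in a year.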

It is worth noting that corporations are already thinking about reducing energy costs. Google, for example, recently announced plans to launch a lighter version of Bard that will require less computing power. Company spokesperson Jane Park said that combining efficient models, processors, and data centers with clean energy sources can cut the carbon footprint of a machine learning system by a factor of 1,000.

Insider suggests that this drive to lighten Bard's computational load, by using a lighter version of the LaMDA neural network, may have contributed to the error in the Google Bard demo that triggered a drop in the company's shares and wiped almost $100 billion off its capitalization, notes NIXsolutions.

According to journalists, corporations will release "stripped-down" versions of their products to cut the computational and energy costs of AI. This means the technology may fall short of expectations, which would undermine consumer confidence.