AI will make us more productive, studies and AI advocates said. The argument is by now familiar and sounds reasonable: models let us automate routine tasks and spend that time on other productive things, right? Well, the truth is that (at the moment) the answer is no. And what is happening is, curiously, the same thing that happened 40 years ago.
The productivity paradox. In 1987 the economist and Nobel Prize winner Robert Solow noticed a singular paradox in the so-called “information age”. The transistors, microprocessors, and integrated circuits developed from the 1960s onward were supposed to revolutionize businesses and dramatically increase productivity. What happened was just the opposite: productivity growth did not accelerate, but rather slowed down. Between 1948 and 1973 it averaged 2.9%, but after 1973 it fell to just 1.1%. All those chips for nothing? It seemed that way, at least in those first years.
History repeats itself: AI is of little use. As Fortune points out, that paradox has resurfaced now that we are living through exactly the same thing with AI. A new study published by the National Bureau of Economic Research (NBER) reaches a striking conclusion after surveying no fewer than 6,000 CEOs, CFOs and other managers across several countries: they see very little impact of AI on their real operations.
AI is not changing anything. Although two-thirds of the managers surveyed said they used AI in their processes, that use was very limited: about 1.5 hours per week. A quarter of participants said they did not use AI at work at all, and nearly 90% of the participating companies reported that AI has not influenced their hiring or productivity in the last three years.
But they are optimistic. These executives’ use of AI may be very limited for now, but those same companies are still expecting a substantial impact: in fact, they predict productivity will increase by 1.4% over the next three years. Another paradox: in these first years AI was supposed to cut hiring by 0.7%, yet respondents reported a 0.5% increase in hiring instead.
The data confirm it: for now, not much. The truth is that the vaunted AI revolution has yet to become a reality, at least in terms of productivity and economic returns. Economist Torsten Slok recently noted that “AI is everywhere except in macroeconomic data: you don’t see it in employment, productivity or inflation data.” His thesis: the impact of AI is currently almost zero. In fact, outside of tech’s “Magnificent Seven,” there are no signs of expanding profit margins or rising revenue expectations.
But these revolutions take time. The revolution that semiconductors brought took a while to crystallize, but it eventually did: the 1990s and 2000s delivered productivity gains, including growth of 1.5% between 1995 and 2005. Some experts point out that this change in trend has in fact already begun: in the US, fourth-quarter GDP grew by 3.7% despite job cuts, which points to an increase in productivity. Slok also raised this possibility, theorizing that the impact could end up following a “J” shape: a slowdown first, then an explosion.
Just ask the steam engine. Previous industrial revolutions, such as the one produced by the steam engine or, even more importantly, by electricity, took their time. The initial lag disappeared over the course of subsequent decades because those technologies needed time to spread to the rest of the productive sectors. Excessive optimism does not help, of course, and for now the reasonable position seems to lie somewhere in between: neither “AI is useless” nor “AI will do everything for us.” Perhaps the only thing AI needs, besides improving, is for us to give it time. Not for nothing do many describe it as “the new electricity.”
Image | The Standing Desk
In Xataka | Until now “software was eating the world.” Now AI is eating software
