Warren Buffett is the first to acknowledge his ignorance about artificial intelligence, which is consistent with his long-standing practice of avoiding technology he does not understand. At the Berkshire Hathaway annual meeting over the weekend, he said that his enormous stake in Apple, for instance, still his largest stock holding even after being trimmed, came about more as an epiphany about the company’s consumer success than as a technical bet. Even so, the billionaire investor and Berkshire chairman and CEO could not steer clear of the subject of artificial intelligence at this year’s keenly anticipated event in Omaha.

Buffett fielded a number of questions about AI. He called the technology “profound” and said that, like a “genie,” it could have devastating consequences if let loose. The greatest risks he sees range from AI’s enormous potential for scamming to a scientific breakthrough with unintended repercussions that could imperil civilisation in a way comparable to nuclear weapons. Buffett added that when it comes to AI’s influence on the world, which could drastically alter everyone’s everyday life, there is at least one question no one can truly answer, one that the greatest economists have struggled with for a century.

Buffett cited the work of John Maynard Keynes, one of the most influential economists of the modern era, who accurately predicted that production per capita would expand at an exponential pace but misjudged what people would do with that greater productivity. His book “The General Theory of Employment, Interest, and Money,” which Buffett suggested adding to one’s reading list, and his advocacy of government intervention through social and jobs programmes to stabilise the economy during downturns like the Great Depression made Keynes the best-known figure in macroeconomics.

Productivity has increased dramatically over the last few quarters. BLS data shows that after a significant rise during the Covid pandemic, productivity growth went through a prolonged slump before rebounding over the past four quarters to an annual rate of about 3%. Corporate leaders are now asking whether factors such as AI or return-to-office policies are behind the resurgence. Most argue, however, that it is still too early to draw meaningful conclusions about the technology, distinguishing between generative AI, which is the talk of the town right now and will take time to show up in the data, and the AI that has been in use for years, where gains are already measurable.

Even if it is not there yet, AI will eventually play a significant role in labour productivity. Gary Cohn, former director of the National Economic Council and vice chairman of IBM, said in an appearance last week on CNBC’s “Money Movers” that AI adoption is moving faster even as productivity improvements arrive more slowly. “Every company is looking at AI and deciding where it will help them,” he said.

“In the productivity game, we are going to evolve into this, and it will feed through the economy slowly,” Cohn said. “I don’t think we’ve seen the real productivity boost from AI,” he added.

Most businesses are still working out how much to budget for AI, developing a broad plan for how it might benefit both clients and staff, and trying to move into implementation mode.

MongoDB CEO Dev Ittycheria recently told CNBC that chief executives are starting to ask when they will see value and returns from AI. His company unveiled a set of tools last week to help organisations “overwhelmed by AI.” Now that the industry is moving past the stage where value is created only at the bottom layer, by Nvidia and ChatGPT maker OpenAI, for example, businesses need to prepare for the applications that will be built on top of that infrastructure.

There is a clear trend towards “agentic” processes, in which software agents act independently on behalf of the end user. Though that is still some way off, developers need to build applications, improve user experiences, cut costs, and find fresh strategies for growing their businesses.

Technology and productivity booms
Productivity booms are uncommon, often occurring only once in a generation. The previous one came in the late 1990s, just before the dotcom bust, and was characterised by strong economic growth that was notably not tied to the creation of new jobs.

Numerous surveys on job losses, or, to use the jargon often employed to reflect the uncertainty over the exact impact, jobs with “AI exposure,” have been published since ChatGPT was released in late 2022. The worry that technological advances will reduce the number of available jobs is a long-standing one.

Businesses prefer to say that artificial intelligence (AI) won’t replace jobs but will instead free human workers to focus on higher-value skills and activities, handing off the routine chores people find tedious to machines. Yet over 37% of business leaders polled by Resume Builder said the technology replaced workers in 2023, and 44% said AI advances would lead to more layoffs this year.

Historically, however, technological developments such as increased industrialisation have not proved to be the career-ending threats that experts claim they are.

Anticipating a shortage of work as AI advances, some academics, billionaires, and politicians have stressed the need for a universal basic income, or UBI, to replace reduced or lost pay and keep the economy afloat. Tech titans including Sam Altman, Mark Zuckerberg, and Elon Musk have debated the concept.

Nevertheless, there is cause for scepticism regarding the precise nature of the connection between jobs and technology.

Nobel Prize-winning economist Robert Solow, considered the founder of productivity research, famously observed in the late 1980s, “You can see the computer age everywhere but in the productivity statistics.”

The remark established Solow’s productivity paradox, which the boom of the late 1990s seemed to refute. Further studies, however, showed that the connection between those productivity gains and the dotcom era was in fact hazy. A co-author of a McKinsey study on the subject told the Harvard Business Review that applying Solow’s paradox to the tech boom of the 1990s was “oversimplified,” as was the theory that followed, which held that the internet was the primary driver of the productivity explosion.

In light of these worries and unanswerable questions, Buffett said that labour-intensive businesses such as Berkshire Hathaway must balance the efficiency gains technology can deliver against the need to prevent harm to people.

“I don’t know how you make sure that’s what happens any more than I know how to be sure that you knew that you hadn’t created something that could destroy the world later on when you used two atomic bombs in World War II,” remarked Buffett.
