AI Economic Profitability Tug-of-War: Where Does the Balance Lie?

Artificial Intelligence (AI) has raised hopes and fears in equal measure. Some say human-level AI is coming as early as next year, while others worry about the ethical dilemmas it presents or fear that commercial interest will fade as costs continue to exceed revenues. This article explores the key debates, clears up some of the confusion around the technologies, and offers my own thoughts.

From AI chips to high-tech talent, the industry has seen an unprecedented influx of money. Yet despite this rush of investment, AI seems to be losing momentum.

The costs of these chips, especially Nvidia's, are extremely high, and they have boosted the company's share price significantly, at least for now. Anyone following high-tech stocks knows what is happening on the exchanges, even though the shares look highly overvalued.

The rate of return and the limitations of AI are raising concerns that it might be time to temper expectations with a dose of reality. At the heart of the discussions is whether we are entering a new information age or heading towards a disappointment similar to the fiber-optic boom of the late 1990s—a boom that led to some of the biggest crashes of the first dot-com bubble.

However, there are major differences between fiber optics and AI in terms of applications and the potential to replace both skilled and unskilled labor. Fiber optics are just conduits for digital data with limited commercial use beyond that. In contrast, AI tools have the potential to replace human labor in significant ways. Thus, the two industries are quite different regarding quality, profitability, and durability.

AI Screeching to a Halt?

A look at today's large language model AIs, like OpenAI's ChatGPT and Google's Gemini, shows their talent for writing and analysis and for improving search results by drawing on enormous amounts of data. These tools are, in essence, advanced search engines loaded with all kinds of data, able to find and deliver the information one is looking for in seconds.

The threats to these AI tools include competition, sustainability, and abuse. Many companies are now training their own AIs and do not need to rely on the ones mentioned. Most human-generated data on the internet has likely already been used, and the remaining untouched data is becoming scarce. It seems the saturation point has been reached, and further scraping of data is becoming costly and unrealistic over time.

The next generation of AI is focusing on synthetic data, which is proving to be less effective and unprofitable. Evidence of the slowdown in AI improvements can be seen in research showing that the performance gaps between various AI models are closing. The best proprietary AI models are converging on similar scores in ability tests, and even free, open-source models, like those from Meta and Mistral, are catching up.

The AI Industry Is Killing Competition!

The commoditization and sustainability of any product depend on low entry costs, the ability to withstand pressure from big players, and access to affordable investment to generate new ideas and solutions. The cost of entry for AI can be relatively low, depending on the product. One only needs a laptop or desktop and access to a good amount of data to start developing solutions and addressing societal problems.

Acquiring the necessary skills is relatively easy, as free tutorials are widely available. However, coming up with the right solutions to societal problems is more challenging. Many startups fail because, while the founders may be passionate about their ideas, those ideas often have little societal value. Once in the market, they generate minimal commercial interest and naturally fade away.

A serious threat to new AI startups comes from big companies that don’t tolerate competition. When these big players realize a startup might eventually eat into their market share, they pursue the startup in several ways.

The first approach is an outright purchase; if that works, the startup's ideas are quickly integrated into the operations of the larger company. If the smaller company refuses to sell, the big players often resort to other tactics, such as stealing, copying, and using the ideas themselves.

This is why intellectual property rights are becoming critical in the industry. The major issue is that it is difficult for smaller players to codify their innovations in a way that is easily identifiable and legally protectable, so such cases are complex to litigate. This complexity, long familiar from the music industry, extends to AI as well.

If someone copies a few elements and adds their own twist, it becomes difficult for arbitration to award damages to the true originator. Given the evolving nature of the technology, many startups therefore aim to be acquired by, or hired into, prominent high-tech companies.

Having a successful high-tech startup attracts suitors who can buy, copy, or kill an idea, which dampens fair competition. Without vigorous competition, product variety, pricing, and quality all suffer.

Is Today’s AI Unprofitable?

The future of AI paradoxically lies outside the US because of the high cost of chips needed to run it. For example, Silicon Valley venture-capital firm Sequoia calculated that the industry spent $50 billion on Nvidia chips to train AI in 2023, but only brought in $3 billion in revenue.
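
To put those Sequoia figures in perspective, here is a minimal back-of-the-envelope sketch in Python. The two dollar amounts are the only inputs taken from the article; everything else is simple arithmetic.

```python
# Back-of-the-envelope check of the Sequoia figures quoted above.
chip_spend_usd = 50e9   # reported 2023 industry spend on Nvidia chips to train AI
ai_revenue_usd = 3e9    # reported 2023 revenue brought in by that AI

shortfall = chip_spend_usd - ai_revenue_usd
cost_recovery = ai_revenue_usd / chip_spend_usd

print(f"Shortfall: ${shortfall / 1e9:.0f} billion")          # Shortfall: $47 billion
print(f"Revenue covers {cost_recovery:.0%} of chip spend")   # Revenue covers 6% of chip spend
```

In other words, for every dollar spent on training chips in 2023, the industry earned roughly six cents back, and that gap is what the next paragraphs turn on.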

Nobody makes money by keeping revenue below the cost of doing business, and operating at a loss undermines the commercial viability of the industry. American chips are heavily overpriced, which is why the entry of Chinese chips, top-notch in quality and fairly cheap, offers hope amid the gloomy news elsewhere.

Equally concerning are the high operational costs, compounded by the expense of training models, which pose a significant barrier to entry in this complex industry. Another looming challenge is the need for AI systems to sift through vast amounts of unnecessary data, adding costly inefficiencies.

Analysts predict that for companies like Google, which rely heavily on advertising revenue and now provide AI-generated summaries across billions of search results, delivering AI-driven answers could erode profit margins.

What Are AI Chips?

The future of artificial intelligence is heavily reliant on the advancement of AI chips. These specialized computing hardware units are crucial for developing and deploying sophisticated AI systems. As AI capabilities have evolved, so too has the demand for higher processing power, speed, and efficiency in computers.

AI chips encompass a range of specialized integrated circuits designed to handle AI tasks. These include Graphics Processing Units (GPUs), Field-Programmable Gate Arrays (FPGAs), and Application-Specific Integrated Circuits (ASICs). Each type of AI chip is tailored to efficiently manage the complex computational requirements of AI algorithms.

While Central Processing Units (CPUs) can handle simpler AI tasks, they are becoming less effective as technologies advance. Therefore, GPUs, FPGAs, and ASICs play increasingly critical roles in meeting the demanding computational needs of AI applications.

In general, a chip refers to a microchip—an integrated circuit manufactured at a microscopic scale using semiconductor material. These chips contain components like transistors, which control the flow of electrical current within a circuit and power computing functions such as memory and logic. Memory chips manage data storage and retrieval, while logic chips process the data.

AI chips primarily operate on the logic side, handling the intensive data-processing demands of AI workloads, which exceed the capabilities of general-purpose chips like CPUs. They are designed with a high density of faster, smaller, and more efficient transistors, enabling them to execute more computations per unit of energy. This design results in faster processing speeds and lower energy consumption than chips with larger, less efficient transistors.

AI chips also possess architectural features that significantly accelerate the computations AI algorithms need, most notably parallel processing. This capability allows them to perform many calculations simultaneously, which is crucial for efficiently managing complex tasks in artificial intelligence, as the sketch below illustrates.
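
To make the parallel-processing point concrete, here is a minimal sketch, assuming nothing beyond standard Python and NumPy. It computes the same matrix product one output element at a time and then as a single batched call; the batched form is the kind of work that GPUs and other AI chips spread across many parallel units. The matrix size and timings are purely illustrative and will vary by machine.

```python
# Illustration of why parallelism matters for AI workloads: the same matrix
# multiplication done one output element at a time versus as one batched
# operation that optimized (and, on accelerators, massively parallel)
# hardware can execute concurrently.
import time
import numpy as np

n = 256
a = np.random.rand(n, n)
b = np.random.rand(n, n)

# One output element at a time: a long chain of small, sequential steps.
start = time.perf_counter()
c_loop = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        c_loop[i, j] = np.dot(a[i, :], b[:, j])
loop_time = time.perf_counter() - start

# One batched call: the whole product is handed to optimized parallel routines.
start = time.perf_counter()
c_batched = a @ b
batched_time = time.perf_counter() - start

assert np.allclose(c_loop, c_batched)   # same result, very different cost
print(f"element-by-element: {loop_time:.3f}s, batched: {batched_time:.4f}s")
```

The gap between the two timings is already large on an ordinary CPU; specialized AI chips widen it further by executing many of these multiply-accumulate operations at the same time.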

Due to this specialized design, AI chips are particularly effective for AI workloads and for training AI models, providing substantial advantages in speed and efficiency.

Where Does Tanzania Stand in All of This?

Tanzania has made little effort to enter this industry, even as many leading countries actively seek low-wage nations with highly educated youth to which they can relocate some chip-manufacturing operations. The variations in chip forms and usability warrant a separate discussion.

From semiconductors to the templates used for molding chips, this is an industry Tanzania should explore in order to start producing electronic products on a large scale. China, a leader in low-cost chip and semiconductor manufacturing, could be a viable partner; it might consider shifting some production tasks abroad to manage rising costs, including labor expenses.

Labor costs in China are escalating and driving up production expenses, and, as noted above, these costs hinder the realization of AI chips' potential. Tanzania should strategize to establish a foothold in this promising industry despite the discouragements and challenges. It is a leap of faith we must collectively take sooner rather than later to avoid playing catch-up.

Additionally, Tanzania urgently needs to enact laws that protect intellectual property rights in the AI industry and that specifically address cyberbullying. Existing law defines cyberbullying only in general terms and fails to target the specific challenges posed by AI.

For instance, the trend of using AI to graft real faces onto nude bodies violates victims' privacy, causing fear, shame, and low self-esteem, even though perpetrators and their audiences know these AI-generated images are false.

As we embrace new technologies, deliberate efforts are needed to address and mitigate the negative impacts arising from these powerful tools. Rather than remaining an assembler of foreign parts, Tanzania should boldly invest in chip and semiconductor manufacturing, targeting versatile tools with a wide range of applications.

The author is a Development Administration specialist in Tanzania with over 30 years of practical experience and has been writing articles for local print and digital newspapers for some time.
