Special Edition - The Global AI Ecosystem
This week I’ve decided to dedicate the entire edition to the AI Ecosystem Map below. A higher-res image (which looks great on A3) is available at this LinkedIn post.
The reason for starting this exercise was simply to plot the relationships between the major listed conglomerates and most relevant private companies developing AI tools - the most famous being Microsoft and OpenAI (which you’ll see at the top of the page).
However, as this ‘simple’ exercise progressed, it very quickly escalated into a bit of a behemoth. A behemoth that was rapidly expanding while the world was seemingly distracted by the hype of blockchain and metaverse.
Before I dive in, I’ll give you a moment to absorb some of the information in this map.
I won’t go specifically into companies but will break down some of the key themes that I feel are most relevant.
North America: As expected, Microsoft, Nvidia, Google, Meta and IBM are extremely invested across the board (large language models, software, semiconductors etc). Crucially, they also own a lot of the data which feeds their language models and AI products. Further, I expect considerable synergy within their ecosystems, which should play into their hands financially over the long run.
China: Considerable investment from Alibaba, Tencent, Baidu and Huawei…but also a lot coming from the research/academic sector, such as Tsinghua University (Zhipu AI) and the Beijing Academy of AI. I would expect China to be toe-to-toe with the US in developing AI; however, there are challenges. In particular, and as I write about extensively, the US, Netherlands and Japan are doing their best to starve China of advanced AI resources. Without these resources, China will be at a significant disadvantage. Of course, China is investing heavily in building its own sovereign capabilities in advanced chips; however, regardless of investment and talent, the development of advanced semiconductors (5nm and beyond) is likely the most difficult manufacturing process in the world.
Open-Source: In parallel to conglomerate-controlled LLMs, there is an abundance of development in open-source (and more independent) LLMs and AI software. Examples include Falcon (UAE’s Technology Innovation Institute), Bloom, AI21’s Jurassic and EleutherAI. The primary question for me is funding: LLMs are considerably expensive to develop and then operate. If you ask ChatGPT, it will say “While open-source models can leverage community support, the development and maintenance of large language models generally involve substantial investments in infrastructure, computational resources, data collection, research, and engineering.” Note, this is the only time I’ve used ChatGPT in this report!
Semiconductors: Even though Nvidia is leading, there is considerable development being made by Alphabet, Microsoft, Amazon and IBM in building their own advanced processing units. Further to this, there is a cohort of emerging semiconductor companies which are likely to play a big part in powering AI over the coming decade…look to SambaNova, Graphcore, Lightmatter, Tenstorrent, Cerebras Systems, SiMa.ai and Groq (to name a few!). Other startups like Mythic have simply run out of money, and I expect cost pressures to see a number of startups and early-stage businesses run into trouble (and likely fall prey to larger operators like Intel and Qualcomm).
Data: Those with all of the data (big tech) have a significant advantage, and those without the data (or who have disorganised data) will be at a significant disadvantage. This isn’t just the case for conglomerates like Google and Meta, but also Tesla (self-driving data), Siemens and GE (medical imaging data) and Shopify (retail data). On that note, although companies like Shopify are utilising OpenAI plug-ins today, I would expect them to be keen to develop their own proprietary language models in time.
Cash Flows: The conglomerates will likely see significant synergy with the vertical integration of data, LLMs, software, semiconductors and cloud. As such, they are likely to see better margins and economies of scale with AI. Those dependent on other LLMs and ecosystems will likely see sub-optimal margins and returns. Further, the investment required to build your own LLM will be considerable (and likely crippling) for many.
Investment: I’ve outlined here, with gold, silver and bronze dots, the flow of investment proceeds into the private AI sector. Again, you’ll notice that the conglomerates are very invested here. Expect ongoing consolidation as businesses frantically jostle for position.
Regulation: Another reason why there is likely to be a short-term surge in consolidation is that the regulators are frantically trying to keep up. We saw this with blockchain and we’re seeing it with AI. How will they define markets and set the rules? For example, how would regulators define AI sub-markets if Adobe were to buy Midjourney, or GitHub (Microsoft) were to buy Hugging Face? It seems there’s a little window of opportunity where opportunists can make a number of acquisitions before the regulators really get their heads around things!
This model will continue to be a work in progress (I already had to update it yesterday for the Databricks deal!), so expect to see it referred to in a few future editions of WIRE.
As usual, I welcome any comments or constructive feedback.
Have a fantastic weekend.
Charlie Nave
LinkedIn or E-Mail (cnave@granitebaycap.com)
Associate Professor (Practice) Monash Business School and Monash Centre for Financial Studies (MCFS)
Granite Bay Capital is an innovation focussed consultancy with a deep focus on the companies at the leading edge of innovation across major themes such as AI, ubiquitous computing, sustainability, automation and longevity. Any views expressed in this article are those of the author and do not constitute financial advice.