Since the dawn of the internet, innovation has often come from unlikely sources. Today’s most successful companies’ origin stories are often equal parts brilliance and serendipity. Silicon Valley fondly remembers how a group of PhDs turned consultants built Qualcomm into the largest mobile chip maker on earth, or how a Stanford research project turned into the world’s favourite search engine. Even Meta and Microsoft came out of left field.
However, this trend may not continue if recent developments in the field of artificial intelligence (AI) are anything to go by. Tech industry incumbents are beginning to hold a vice-like grip on one of the most exciting technological sectors of recent times. Nvidia, Meta, and Microsoft have so far made the biggest strides in this emerging sector.
Just last week, Meta CEO and founder Mark Zuckerberg took to Instagram to announce a strategic partnership between Meta and Microsoft. Microsoft will serve as Meta's preferred partner for Llama 2, Meta's large language model (LLM). Furthermore, Meta’s fledgling enterprise social networking software, Workplace, will be integrated into Microsoft Teams, the market-leading collaboration software for companies of all sizes.
The watershed moment for AI came in late 2022 when OpenAI launched ChatGPT. The chatbot, built on the company's GPT-3 family of LLMs, used a simple chat interface to take prompts from users and generate responses, and it took the world by storm, briefly holding the title of fastest web application to reach 100 million users, a title surpassed by Meta’s Threads earlier this month.
Unbeknownst to many, Microsoft reportedly holds a 49% stake in OpenAI, and the two have since partnered to integrate OpenAI’s most advanced LLM, GPT-4, into Microsoft products such as Bing and the Office 365 suite, including Word, PowerPoint and Teams. Considering its recent partnership with Meta, the enterprise software giant seems very well positioned for an AI future.
Generative artificial intelligence is underpinned by serious processing power. Nvidia, widely known for its gaming graphics cards, first pivoted into general-purpose computing in 2007 with CUDA, its parallel computing platform and programming model. Since then, founder and CEO Jensen Huang has continued to push the envelope to ensure Nvidia’s position as a hardware and software powerhouse in the field.
Today, almost all LLMs are powered by Nvidia’s ultra-powerful A100 and H100 chips. The company posted record revenue of $26.91 billion for fiscal year 2022, up 61 percent, with computing and networking becoming its largest source of revenue for the first time, proof that its investment in this emerging sector is beginning to bear fruit.
As you can imagine, developing the LLMs that underpin AI chatbots and other tools is very expensive. Some industry estimates suggest it costs upwards of $1 billion once hardware and model training are combined. It comes as no surprise, then, that well-capitalized industry incumbents are taking the earliest and largest swings in the sector. Startups can only bridge this gap with massive financing, which often depends on demonstrating early success; yet achieving product-market fit without the funding to build the necessary infrastructure may be impossible. An interesting catch-22.
Despite this, a few startups have made waves recently and are capturing early market share in the generative technology space. Midjourney is a San Francisco-based generative artificial intelligence program and service that launched in 2022. The software, which is still in beta, has shown impressive possibilities in image generation. That is an impressive feat for a company with fewer than 20 employees, and a reminder that there is exciting potential in the space for startups.
As we continue to observe the ever-shifting tides in this nascent industry, the question of whether non-incumbents can build successful companies remains open. New players will be hoping that Moore’s Law, the observation that the number of transistors on a chip doubles roughly every two years, making computing power steadily cheaper and more plentiful, will continue to hold. Until then, the incumbents are firmly in the lead.
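To give a feel for why Moore's Law matters to new entrants, here is a minimal back-of-the-envelope sketch of the doubling effect. The starting figure (the Intel 4004's roughly 2,300 transistors in 1971) and the two-year doubling period are well-known reference points used purely for illustration; the function name is my own.

```python
def projected_transistors(start_count: int, years: float, doubling_period: float = 2.0) -> int:
    """Project a transistor count forward, assuming it doubles every `doubling_period` years."""
    return int(start_count * 2 ** (years / doubling_period))

# Roughly 2,300 transistors in 1971; projecting 50 years of two-year doublings
# lands in the tens of billions, which is the right order of magnitude for
# flagship chips of the early 2020s.
print(projected_transistors(2_300, 50))
```

If the exponential holds, the same compounding that took chips from thousands to billions of transistors would keep pushing down the cost of the compute that startups currently cannot afford.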
I have since written a follow-up to this piece here.