🎲 5 data points that explain the state of AI in 2024 | Part 1/3
AI data and trends for business leaders | AI systems series
The world of AI is experiencing a gold rush. Investment in generative AI, a powerful branch capable of creating entirely new content, has skyrocketed.
Google, Microsoft, and Nvidia are at the forefront of this race, and their massive resources have positioned them as dominant forces in developing these complex "foundation models."
However, this progress comes with complications. While impressive, these models are often closed-source, meaning their inner workings remain a secret.
This secrecy contrasts with the earlier open-source approach, raising concerns about transparency and accessibility.
The sheer computational power required to train these models has also driven their cost through the roof. Perhaps most worryingly, their environmental impact is significant: a hefty carbon footprint raises concerns about the sustainability of this technology.
The immense resources required for training have also shifted the landscape. Industry is now leading the charge in development, with academia playing a smaller role.
The few companies that release foundation models see a double benefit: they push the boundaries of AI capabilities, and they give developers a robust base on which to build innovative products and services.
↓↓↓ Some facts below ↓↓↓
The state of AI in 2024
📌 Fact 1: Generative AI boom
Investment in generative AI, which can create entirely new content like text, images, or code, has exploded in 2024. This indicates a growing belief in its potential to revolutionize various industries.
📌 Fact 2: Google leads the foundation model race
Google is currently at the forefront of developing complex "foundation models," the underlying technology powering many AI applications. This dominance raises questions about competition and the accessibility of this technology.
Will it last?
📌 Fact 3: Trade-offs emerge
While AI advancements are impressive, challenges are surfacing. Closed-source models are becoming more prevalent, raising transparency concerns. Additionally, the high cost and environmental impact of training these powerful models require solutions for sustainable growth.
Takeaway
As generative AI continues its meteoric rise, these are just some of the burning questions that must be addressed.
1. Generative AI investment skyrockets
While overall corporate AI investment dipped last year, generative AI is experiencing an explosive boom. This surge reflects a broader global trend.
The public and policymakers are grappling with the power and potential risks of tools like ChatGPT and DALL-E 2, while industries are responding with significant investments. In fact, the US is leading the charge in private equity funding for generative AI.
Explore more on "Private equity-backed investment surge in generative AI defies 2023 deal slump."
2. Google is dominating the foundation model race
In 2023, Google emerged as a dominant force, releasing the most foundation models and showcasing the fierce competition and rapid progress in this critical area of AI development. This presents an exciting opportunity for investors to help shape the future of AI by supporting the companies leading this revolutionary charge.
Explore more on the AI Index
3. Closed models outperform open ones
While some fervently believe open models pose dangers, others champion their ability to fuel innovation. The data reveals a clear preference for open models in 2023, with 98 released compared to 28 closed models. Interestingly, closed models seem to outperform open ones on benchmark tests.
This raises intriguing questions about the balance between transparency and raw capability.
Explore more on "Shifting tides: the competitive edge of open source LLMs over closed source LLMs."
4. Foundation models have gotten super expensive
Training these behemoths requires deep pockets, and companies are tight-lipped about the astronomical costs. Still, estimates can be pieced together by analyzing factors such as training duration, hardware used, and resource consumption reported across various sources. For context, consider Google's groundbreaking 2017 transformer model, the foundation for most modern large language models. Back then, training cost a mere $930.
This stark contrast underscores the exponential rise in resource demands and the financial muscle required to compete in today's AI landscape.
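To illustrate how such estimates are assembled, here is a minimal back-of-the-envelope sketch in Python. Every figure in it is a hypothetical assumption for illustration, not a reported number for any actual model:

```python
# Minimal sketch of a compute-cost estimate for a training run.
# All inputs are hypothetical assumptions, not reported figures.

def estimate_training_cost(num_gpus: int, hours: float, usd_per_gpu_hour: float) -> float:
    """Rough cloud cost: GPU count x wall-clock hours x hourly rental rate."""
    return num_gpus * hours * usd_per_gpu_hour

# Example: an assumed run of 1,000 GPUs for 30 days at $2 per GPU-hour.
cost = estimate_training_cost(num_gpus=1_000, hours=30 * 24, usd_per_gpu_hour=2.0)
print(f"Estimated compute cost: ${cost:,.0f}")  # $1,440,000 under these assumptions
```

Real analyses refine this with details like hardware utilization and energy prices, which is why published cost estimates carry wide error bars.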
Explore more on "Open foundation models: implications of contemporary artificial intelligence."
5. Foundation models have a hefty carbon footprint
Amid the focus on the impressive power of foundation models, a critical question arises: what is the environmental cost? Several factors contribute to this footprint, including the model's size, data center efficiency, and the energy grid's reliance on fossil fuels. A few reports acknowledge the lack of data on emissions during "inference," when the model is actually being used. While a single query might have a minimal impact, constant use by millions of people can significantly magnify these models' environmental footprint.
This calls for greater transparency from developers to understand the true cost of running these powerful tools.
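To make those factors concrete, here is a minimal sketch of a common way to approximate operational emissions: hardware energy multiplied by data center overhead (PUE) and the grid's carbon intensity. All inputs are illustrative assumptions, not measured values:

```python
# Sketch of an operational-emissions estimate for model training.
# Approach: hardware energy x data center overhead (PUE) x grid carbon intensity.
# All inputs below are illustrative assumptions, not measured values.

def estimate_emissions_kg(num_gpus: int, gpu_power_kw: float, hours: float,
                          pue: float, kg_co2_per_kwh: float) -> float:
    """Return estimated CO2-equivalent emissions in kilograms."""
    hardware_energy_kwh = num_gpus * gpu_power_kw * hours
    return hardware_energy_kwh * pue * kg_co2_per_kwh

# Example: an assumed run of 1,000 GPUs drawing 0.4 kW each for 30 days,
# in a facility with a PUE of 1.2 on a grid emitting 0.4 kg CO2 per kWh.
kg = estimate_emissions_kg(1_000, 0.4, 30 * 24, 1.2, 0.4)
print(f"Estimated emissions: {kg / 1_000:.0f} tonnes CO2e")  # ~138 tonnes here
```

The same arithmetic applies per query at inference time, which is why usage at the scale of millions of users matters so much.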
Explore more on "A computer scientist breaks down generative AI's hefty carbon footprint."
Resources
Tracking the whole world's carbon emissions -- with satellites & AI | Gavin McCormick
Continue exploring
🎲 Data and trends
This email is sent to you because you signed up for Wild Intelligence by Yael Rozencwajg. Thank you for your interest in our newsletter!
Data and trends are part of Wild Intelligence, along with its approaches and strategies.
We share tips to help you lead, launch, and grow your sustainable enterprise.
Become a premium member, and get our tools to start building your AI-based enterprise.