Anthropic’s “Dictionary learning”, Nvidia boosts chip production, OpenAI’s new deal
Improving our understanding of AI models.
Hello, Starters!
Fast development requires fast production, and as companies keep bringing new AI innovations to our plates daily, the main player in chip production must keep pace!
Here’s what you’ll find today:
Anthropic researches the mind of an LLM
Nvidia’s chip-making rhythm is on the rise
OpenAI reaches deal with News Corp
Yann LeCun’s opinion on LLMs
Baidu’s CEO doesn’t believe AGI is close
And more.
🔎 Anthropic researches the mind of an LLM (2 min)
Yesterday, one of our quick links mentioned Anthropic making its Claude 3 Sonnet model believe it is the Golden Gate Bridge. Diving deeper, this is part of a research project that maps how artificial neural networks represent ideas as activity patterns within their layers. The method employed is called "dictionary learning," which trains a separate neural network to summarise the activity of a specific layer in the model.
As a result, they get a set of patterns called "features," which represent every concept the model has learned, forming a "dictionary" of activation patterns. When an individual feature is artificially amplified, it influences Claude's generation.
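To make the idea concrete, here is a minimal toy sketch of the approach described above: a one-layer sparse autoencoder trained on synthetic "layer activations," whose learned features form a small dictionary, plus a final step that artificially boosts one feature, in the spirit of the Golden Gate Bridge demo. This is an illustrative assumption-laden simplification, not Anthropic's actual method or scale; all names and numbers here are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "layer activations": 200 samples of 8-dim vectors built from
# 3 ground-truth directions (the "concepts" we hope to recover).
true_features = rng.normal(size=(3, 8))
coeffs = rng.random((200, 3)) * (rng.random((200, 3)) < 0.3)  # sparsely used
acts = coeffs @ true_features

# One-layer sparse autoencoder: encode -> ReLU -> decode, trained with
# reconstruction loss plus an L1 penalty that encourages sparse features.
n_feat, dim, lr, l1 = 16, 8, 0.1, 0.01
W_enc = rng.normal(scale=0.1, size=(dim, n_feat))
W_dec = rng.normal(scale=0.1, size=(n_feat, dim))

for _ in range(2000):
    f = np.maximum(acts @ W_enc, 0.0)            # feature activations
    recon = f @ W_dec
    err = recon - acts                           # d(loss)/d(recon)
    grad_dec = f.T @ err / len(acts)
    grad_f = (err @ W_dec.T + l1 * np.sign(f)) * (f > 0)  # ReLU mask
    grad_enc = acts.T @ grad_f / len(acts)
    W_enc -= lr * grad_enc
    W_dec -= lr * grad_dec

f = np.maximum(acts @ W_enc, 0.0)                # the learned "dictionary" use
mse = np.mean((f @ W_dec - acts) ** 2)

# "Steering": amplify one learned feature before decoding, which shifts the
# reconstructed activations toward that feature's direction.
steered_f = f.copy()
steered_f[:, 0] += 5.0
steered = steered_f @ W_dec
```

Each row of `W_dec` plays the role of one dictionary entry; in the real work the features are interpreted by inspecting which inputs activate them, and steering is done on the live model's activations rather than on a reconstruction.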
🏭 Nvidia's chip-making rhythm is on the rise
Carrying the entire AI industry on your shoulders isn't an easy task, especially when a shortage of chips slows down innovation and you're one of the main companies in the field. Nvidia's CEO, Jensen Huang, has recently announced they are accelerating the production of their AI chips, adopting a "one-year rhythm."
Yet, they're not stopping there. The company will also speed up the development of every chip they make, including CPUs, GPUs, networking NICs, and more.
🗞️ OpenAI reaches deal with News Corp (1 min)
OpenAI has sealed another deal with a major figure in the journalism field. News Corp, the company behind publications such as The Wall Street Journal, the New York Post, and The Daily Telegraph, has agreed to provide access to current and archived articles from its publications for AI training.
Although OpenAI has encountered several challenges with some media outlets in the past due to copyright infringement, the company is committed to gathering valuable data to feed its models and improve their performance. This deal is expected to be worth over $250 million in the next five years.
🧐Newer generations benefit from listening to those with more experience. Yann LeCun recently sparked a discussion among users of X by advising developers to stop working on large language models and focus on next-generation AI systems that address their limitations. A trendy example? Multimodal AI.
🧠No one is sure when AGI will emerge. Elon Musk believes we're on the verge of achieving it. However, Robin Li, CEO of China's Baidu, claims that this kind of intelligence, similar to or even smarter than humans, is more than 10 years away, as AI development is, in his view, progressing slowly.
⚡️Quick links
What did you think of today's newsletter?