
Inside GPT-4’s thinking patterns, DeepMind’s approach to ASI, Apple’s WWDC 2024

Apple enthusiasts rejoice!

Hello, Starters!

We're starting the week with a major event. Apple is finally unveiling its work on AI, and we'll see how its recent partnership with OpenAI will change the user experience. How exciting!

Here’s what you’ll find today:

  • Extracting concepts from GPT-4

  • Open-ended AI is the path for ASI

  • Get ready for Apple’s WWDC

  • Claude 3's character training

  • Hugging Face’s progress with “Le Robot”

  • And more.

Further developing its interpretability research, OpenAI has published a paper on the methods it is using to understand the inner workings of GPT-4. Although fully grasping what happens inside a neural network remains difficult, OpenAI has made use of "sparse autoencoders," a technique that identifies the "features" that activate within GPT-4 and maps them to concepts and patterns that humans can interpret.

This is another way OpenAI is showing its commitment to AI safety, as interpretability lets researchers inspect the inner workings of its models and assess their robustness. Still, the method has its challenges: because of the sheer size of large language models, sparse autoencoders struggle to scale and may not capture every concept the model represents.
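To make the idea concrete, here is a toy NumPy sketch of a sparse autoencoder's forward pass. This is not OpenAI's implementation: the dimensions are tiny illustrative assumptions (real activations are thousands of dimensions wide), the weights are random rather than trained, and the top-k sparsity rule is just one common way to force most features to zero.

```python
import numpy as np

# Hypothetical, tiny dimensions for illustration only.
d_model, d_features, k = 8, 32, 4

rng = np.random.default_rng(0)
W_enc = rng.normal(scale=0.1, size=(d_model, d_features))
W_dec = rng.normal(scale=0.1, size=(d_features, d_model))
b_enc = np.zeros(d_features)

def encode(x):
    """Map a model activation to a sparse feature vector,
    keeping only the k strongest (post-ReLU) activations."""
    pre = np.maximum(x @ W_enc + b_enc, 0.0)   # ReLU
    idx = np.argsort(pre)[:-k]                 # all but the k largest
    sparse = pre.copy()
    sparse[idx] = 0.0
    return sparse

def decode(f):
    """Reconstruct the original activation from the sparse features."""
    return f @ W_dec

x = rng.normal(size=d_model)   # stand-in for a GPT-4 internal activation
f = encode(x)
x_hat = decode(f)
print(np.count_nonzero(f))     # at most k features are active
```

After training (minimizing the reconstruction error between `x` and `x_hat`), each of the surviving feature directions tends to correspond to a human-interpretable concept, which is what makes the technique useful for peering inside a model.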

Researchers at Google DeepMind have proposed "open-ended" AI as the most viable path to artificial superintelligence. According to their study, the race for bigger datasets and more computing power should give way to AI systems that can generate and learn new ideas on their own, rather than relying on humans.

The way to get there is through methods like reinforcement learning, self-improvement, task generation, and evolving algorithms, approaches that systems such as AlphaGo, AdA, and POET already demonstrate, though only in limited ways.

The day has finally arrived. After months of waiting and rumor-tracking, Apple's WWDC 2024 is happening today at 10 a.m. PT, with a live stream available. We'll see what the company has in store and how its take on "Apple Intelligence" (which conveniently abbreviates to AI) is set to transform its devices' productivity.

We've said it a few times before, but one last reminder: Apple is not as aggressive as its competitors in integrating AI into its ecosystem. It remains focused on practical ways to enhance iOS's existing features with the technology. We'll also get to see the results of the company's partnership with OpenAI, so exciting things are coming either way.

🔎In a recent blog post, Anthropic shared insights into the training process for Claude 3, specifically how it leveraged "character training" for the first time in the AI assistant, and how this part of training is what makes Claude what it is: a thoughtful, curious, and open-minded model.

🤖Hugging Face is making progress with its "Le Robot" program, presenting "Reachy2," a humanoid robot created in collaboration with Pollen Robotics. Reachy2 has been trained on a series of household chores and can perform a diverse set of them while being "tele-operated" by a human wearing a VR headset.

What did you think of today's newsletter?


Thank you for reading!