
LLMs vs. Programming Languages, Apple’s next step, Improving AI's reasoning

AI learns to think like us.


Hello, Starters!

Grab a seat, sip your coffee, and get ready to face Monday by checking out everything that's been happening in the world of AI over the past few days. We've noticed a recurring theme: Apple wants everyone to know they're not lagging in the race.

Here’s what you’ll find today:

  • Do LLMs eliminate the need for programming languages?

  • Apple seeks a partnership with Google or OpenAI

  • Stanford University presents “Quiet-STaR”

  • Grok is now open source

  • Apple’s research into multimodal AI

  • NVIDIA’s GTC 2024 starts today!

  • And more.

As AI becomes more prevalent in developers' workflows, thanks to its ability to streamline processes and make coding more "efficient," Modular poses a thought-provoking question: will LLMs eliminate the need for programming languages?

Diving into the article, the answer is one that many in the AI field, experts and amateurs alike, will likely reach on their own. Despite the automation and assistance that AI-powered coding tools bring, developers are unlikely to fully trust work generated by an LLM. That leads to a familiar conclusion: AI will always need the human touch to ensure it's producing the right results.

According to reports, Apple is exploring the integration of Gemini's capabilities into the iPhone, engaging in active discussions with Google about potential collaboration. It's also been stated that they've considered OpenAI's ChatGPT.

This showcases Apple's ongoing efforts to keep up with the AI landscape after CEO Tim Cook mentioned they were investing significantly in AI functions. If the partnership with Google takes place, it may power cloud-based AI features such as text and image generation. On-device features are likely to be based on Apple's in-house models, which are expected to debut with iOS 18

The saying "think before you speak" also applies to LLMs, as researchers from Stanford University have introduced a method called “Quiet-STaR,” which helps models understand things that aren't explicitly stated in a text.

Based on the "Self-Taught Reasoner" (STaR) approach, the model infers different ways the text may continue, learns from these guesses, and improves its responses over time. This technique, akin to human contemplation, might help future models solve complex tasks more effectively.
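The core loop can be sketched in a toy example. This is a hypothetical simplification, not Stanford's implementation: `predict_prob` is a stand-in for a real model's forward pass, and the candidate "thoughts" are hard-coded. The idea it illustrates is that before predicting the next token, the model samples several hidden rationales, scores how much each improves its prediction of the true continuation, and the gain over a no-thought baseline becomes the learning signal.

```python
# Toy sketch of the Quiet-STaR idea (hypothetical simplification).
# A real model would score thoughts with a forward pass; here we just
# reward thoughts that mention the true next token.

def predict_prob(context, thought, next_token):
    """Stand-in scorer: probability assigned to next_token given the
    context plus an inserted hidden thought."""
    base = 0.2
    bonus = 0.6 if next_token in thought else 0.0
    return base + bonus

def quiet_star_step(context, next_token, candidate_thoughts):
    """Pick the thought that most improves prediction of next_token,
    and return its gain over predicting with no thought at all."""
    scored = [(predict_prob(context, t, next_token), t)
              for t in candidate_thoughts]
    best_prob, best_thought = max(scored)
    baseline = predict_prob(context, "", next_token)
    return best_thought, best_prob - baseline

thoughts = ["irrelevant musing", "the answer is 12", "think about apples"]
best, gain = quiet_star_step("3 * 4 =", "12", thoughts)
print(best, round(gain, 2))
```

In the full method, that gain is fed back as a reinforcement signal so the model gradually learns to generate useful rationales on its own, rather than being handed candidates as in this sketch.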

Data Power-Up with Bright Data

Bright Data elevates businesses by collecting web data and turning it into actionable insights. Our global proxy network and web-unblocking tools enable businesses to build datasets in real time and at scale, providing a competitive edge in Ecommerce, Travel, Finance, and beyond.

Tap into the value of clean and structured data for market research, ML/AI development, and strategic decision-making. With our scalable solutions, you can efficiently enhance your data strategy, ensuring you're always one step ahead. Start with our free trial and see the difference yourself.

🤖After CEO Elon Musk's announcement, xAI released the weights and architecture of Grok. The model is now available for everyone on GitHub. The open-source version of Grok-1 is not fine-tuned, allowing users to harness its powers for whichever applications they prefer.

🍎Apple continues to make progress in its AI endeavours. Researchers have recently developed a new method for training LLMs called "MM1," which combines various types of data and model architectures to enhance the AI system's performance. This paves the way for future applications in Apple products.

💥Nvidia kicks off its annual GTC conference today in San Jose, California, with over 300,000 attendees. CEO Jensen Huang opens the show with a keynote sharing the company's roadmap and potentially revealing new products. You can register on Nvidia's website to watch it live, starting at 1 PM PDT.

What did you think of today's newsletter?


Thank you for reading!