Apple explores AI model for potential smart glasses

Apple’s new FastVLM visual language model marks a significant advance in on-device AI for wearable technology and could power future Apple smart glasses. The lightweight, high-speed model processes high-resolution images with minimal computing resources, suggesting Apple is building the foundational AI technology for its rumored 2027 smart eyewear, which would compete with Meta’s Ray-Ban smart glasses.

The big picture: Apple’s Machine Learning Research team has developed FastVLM, a visual language model designed specifically for Apple Silicon that processes high-resolution images far more efficiently than comparable models.

  • The model is built on MLX, Apple’s open-source machine learning framework released in 2023, which enables local AI processing on Apple devices (a minimal on-device inference sketch follows this list).
  • This development aligns with Apple’s reported plans to release AI-enabled smart glasses around 2027, alongside camera-equipped AirPods.
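
To make the local-processing point concrete, here is a minimal sketch of on-device visual-language inference on Apple Silicon. It assumes the community mlx-vlm package, which is built on MLX; the checkpoint name and the exact call shapes are illustrative assumptions drawn from mlx-vlm’s examples, not Apple’s documented FastVLM tooling.

```python
# Minimal sketch: local VLM inference with MLX via the community mlx-vlm
# package (pip install mlx-vlm). The checkpoint name is a placeholder and
# the call shapes follow mlx-vlm's published examples; this is not Apple's
# documented FastVLM API.
from mlx_vlm import load, generate
from mlx_vlm.prompt_utils import apply_chat_template
from mlx_vlm.utils import load_config

MODEL_ID = "apple/FastVLM-0.5B"  # hypothetical MLX-ready FastVLM weights

model, processor = load(MODEL_ID)   # weights load into unified memory on-device
config = load_config(MODEL_ID)

images = ["camera_frame.jpg"]       # e.g., a frame from a wearable camera
prompt = apply_chat_template(
    processor, config,
    "Describe what is in front of the wearer.",
    num_images=len(images),
)

# Inference runs entirely on the Apple Silicon device, with no cloud round trip.
print(generate(model, processor, prompt, images))
```

The detail that matters for glasses is the data path: the camera frame is encoded and described without ever leaving the device.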

Key technical advances: FastVLM achieves significant performance improvements through its specialized FastViTHD encoder designed for high-resolution image processing.

  • The encoder is 3.2 times faster and 3.6 times smaller than comparable models, making it ideal for on-device processing without cloud dependence.
  • According to Apple, the model produces its first response token (its time to first token) 85 times faster than similar systems, sharply reducing the lag between a user’s prompt and the AI’s reply; a rough timing sketch follows this list.
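
To clarify what that 85x figure measures, the sketch below times the gap between submitting a prompt and receiving the first generated token, which is the usual definition of time to first token. The streaming generator is a hypothetical stand-in for any model API that yields tokens as they are produced; nothing here is specific to FastVLM.

```python
import time
from typing import Iterator, Tuple

def time_to_first_token(stream: Iterator[str]) -> Tuple[str, float]:
    """Return the first generated token and the seconds elapsed before it.

    `stream` is any iterator that yields tokens as the model produces them,
    a hypothetical stand-in for a streaming generate() call.
    """
    start = time.perf_counter()
    first = next(stream)                  # blocks until token 1 is emitted
    return first, time.perf_counter() - start

# Dummy generator standing in for a real model's streaming output.
def dummy_stream() -> Iterator[str]:
    time.sleep(0.12)                      # pretend prompt + vision encoding
    yield "The"
    yield " street"

token, ttft = time_to_first_token(dummy_stream())
print(f"first token {token!r} after {ttft * 1000:.0f} ms")
```

For a vision model, much of that delay typically comes from encoding the image, which is why a faster, smaller encoder like FastViTHD translates directly into a quicker first response.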

Why this matters: The introduction of FastVLM suggests Apple is developing the fundamental AI infrastructure needed for future wearable devices that will require real-time visual processing.

  • Local processing capability is crucial for AR glasses that need to interpret what users are seeing without constant cloud connectivity.
  • The efficiency improvements address key limitations in wearable technology: battery life, processing power, and response time.

Reading between the lines: While Apple hasn’t explicitly connected FastVLM to its rumored smart glasses, the technical specifications address precisely the challenges that AR wearables face.

  • The emphasis on high-resolution image processing with minimal computing resources aligns perfectly with the requirements of lightweight, all-day wearable glasses.
  • The focus on speed and efficiency suggests Apple is prioritizing responsive, natural interactions for its future AI wearables.
