
Apple Accelerates AI Wearables Development, Smart Glasses Become Top Priority
According to a recent Bloomberg report, Apple is ramping up development of AI-powered wearables, with AI glasses, an AI pendant, and new AirPods as the key focus areas. All three devices will be built around the Siri digital assistant, which will leverage visual context to perform various tasks.

Insiders reveal that these AI wearables will be deeply integrated with Apple’s iPhone ecosystem, each equipped with different camera systems. AirPods and the AI pendant follow a minimalist design with only low-resolution cameras primarily for AI assistance, not for photography or video recording. In contrast, the AI glasses are positioned as a higher-end product with richer functionality.

According to Bloomberg, Apple has made significant progress in recent months on the smart glasses project, codenamed N50, and has distributed many prototypes internally within its hardware engineering division. Apple plans to begin mass production as early as December, with an official public launch expected in 2027. Similar to most of Meta’s current products, these glasses will not have a display. Interaction will rely on speakers, microphones, and cameras, allowing users to make calls, access Siri, perform actions based on their surroundings, play music, and take photos.

Apple is focusing on two key areas to differentiate its product: premium craftsmanship and advanced camera technology. Early in development, Apple embedded electronics and cameras into existing frames from well-known brands and even considered a co-branded launch (similar to Meta with EssilorLuxottica or Google with Warby Parker). Recently, however, Apple decided to design its own frames in various sizes and colors. Early prototypes required a wired connection to an external battery pack and iPhone, but newer versions embed these components into the frame. Premium materials, such as acrylic, give the glasses a high-end feel, and Apple is already exploring additional frame styles for the future.

The glasses feature two cameras: one for high-resolution imaging and another dedicated to computer vision (similar to the technology used in Vision Pro). The second sensor provides rich environmental data, helping the device interpret surroundings and measure distances between objects. The goal is for the glasses to serve as an all-day AI companion, capable of understanding what the user sees, hears, and does in real time. Wearers can simply look at an object, ask about it, and receive answers or assistance with daily tasks—for example, asking about ingredients in a dish.
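Nothing is publicly known about how Apple's computer-vision sensor measures distance, but the underlying geometry such a sensor could rely on is the standard pinhole-camera model: an object of known real-world size appears smaller in the image the farther away it is. As a minimal, purely illustrative sketch (all numbers hypothetical, not Apple's implementation):

```python
def estimate_distance(focal_length_px: float,
                      real_width_m: float,
                      pixel_width: float) -> float:
    """Pinhole-model range estimate.

    distance (m) = focal length (px) * real object width (m) / apparent width (px)
    """
    return focal_length_px * real_width_m / pixel_width

# Example: an object 0.5 m wide that spans 100 px in an image taken with a
# 1000 px focal length is about 5 m away.
print(estimate_distance(1000.0, 0.5, 100.0))  # 5.0
```

In practice a dedicated depth sensor (or stereo pair, as the dual-camera layout hints) avoids having to know the object's real size in advance, but the single-camera formula shows why pixel-level scene understanding and ranging go hand in hand.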

Apple is also exploring more advanced use cases, such as:

  • Reading printed text and converting it into digital data (e.g., adding event poster details directly to a calendar)
  • Creating context-aware reminders (e.g., prompting a user to pick up an item while searching in a supermarket)
  • Enhancing Siri navigation by referencing real-world landmarks (e.g., instructing a user to pass a specific building or vehicle before turning)
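The first use case above — turning recognized poster text into a calendar entry — can be sketched end to end. The snippet below assumes the OCR step has already produced plain text (the recognition itself is omitted); the parsing rules and field choices are illustrative assumptions, not Apple's pipeline. It extracts a title and an ISO-style date and emits a minimal iCalendar VEVENT:

```python
import re
from datetime import datetime

def poster_to_ics(ocr_text: str) -> str:
    """Convert OCR'd poster text into a minimal iCalendar event.

    Assumes the first line is the event title and the poster contains a
    date in YYYY-MM-DD form; real posters would need far richer parsing.
    """
    title = ocr_text.strip().splitlines()[0]
    match = re.search(r"\d{4}-\d{2}-\d{2}", ocr_text)
    if not match:
        raise ValueError("no date found on poster")
    start = datetime.strptime(match.group(), "%Y-%m-%d")
    return "\n".join([
        "BEGIN:VEVENT",
        f"SUMMARY:{title}",
        f"DTSTART;VALUE=DATE:{start:%Y%m%d}",
        "END:VEVENT",
    ])

print(poster_to_ics("Jazz Night\nDoors open 2027-06-12 at the Blue Room"))
```

A production system would also need time-of-day parsing, location extraction, and ambiguity handling, which is exactly where the visual-context AI described in the report would earn its keep.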

Apple has already introduced some visual AI features, such as Visual Intelligence on the iPhone, but the technology applied to smart glasses is expected to gain broader adoption due to its more convenient form factor and wider range of use cases.

Summary:
Apple accelerates AI wearables, with smart glasses codenamed N50 launching in 2027. Screenless, dual-camera design leverages Siri and computer vision for object recognition, text reading, and context-aware reminders, emphasizing premium craftsmanship.
