Is this Meta's smartphone moment?
OpenAI sheds more talent and its non-profit status; James Cameron joins Stability AI's board.
$10,000 per pair. Meta's new Orion AR glasses carry a hefty price tag. Yet the true innovation lies not in the lenses, but on your wrist.
Forget grand gestures à la Apple Vision Pro.
Meta's EMG wristband decodes electrical signals from subtle hand movements and uses that, along with eye tracking, to let people swipe, click, and scroll through content. You could navigate interfaces with your hands tucked away in your pockets or by your side.
This feels like a UI “flip”. If it works, there’s no going back. Your eyes act as the pointer, while pinching your fingers together acts as the click.
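To make that flip concrete, here is a minimal, hypothetical sketch of how gaze and wrist EMG might be fused into pointer-and-click events. Meta hasn't published Orion's software, so the function names, threshold, and loop rate below are all invented for illustration.

```python
# Hypothetical sketch: eye tracking drives the pointer, an EMG pinch drives the click.
# read_gaze, read_emg_window, detect_pinch, and ui are assumed callables/objects,
# not real Orion APIs.
import time

PINCH_THRESHOLD = 0.8  # assumed confidence above which an EMG window counts as a pinch

def interaction_loop(read_gaze, read_emg_window, detect_pinch, ui):
    """Fuse eye tracking and wrist EMG into pointer and click events."""
    pinched_last_frame = False
    while True:
        # The eye tracker supplies the "cursor": where the user is looking in UI space.
        gaze_x, gaze_y = read_gaze()
        ui.move_pointer(gaze_x, gaze_y)

        # The wristband supplies the "click": a classifier scores the latest
        # window of muscle signals for a finger pinch.
        pinch_confidence = detect_pinch(read_emg_window())
        pinched = pinch_confidence > PINCH_THRESHOLD

        # Fire a click on the rising edge, at wherever the eyes are pointing.
        if pinched and not pinched_last_frame:
            ui.click(gaze_x, gaze_y)
        pinched_last_frame = pinched

        time.sleep(1 / 120)  # poll at roughly display refresh rate
```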
This tech was built by CTRL Labs, an EMG-wristband startup that Meta acquired in 2019.
It rivals the impact of the first graphical user interfaces on personal computers or the first fully touchscreen phones with no buttons. We're entering an era where thought translates to action.
The “invisible click” demands intensive prediction and user modeling that adapts to the user's context in real time. All the micro-interactions looked intuitive in the demos, which is an impressive computing feat given the real-time feedback required.
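As a rough illustration of what that per-user adaptation could look like, here is a hypothetical sketch in which the pinch detector's decision threshold drifts toward each user's behaviour over time. This is not Meta's method; the class, the feedback signal, and the constants are assumptions made purely to illustrate online user modeling.

```python
# Hypothetical per-user adaptation for an "invisible click":
# nudge the pinch threshold based on whether the user accepted the resulting action.
# Purely illustrative; not based on any published Meta implementation.

class AdaptivePinchDetector:
    def __init__(self, base_threshold: float = 0.8, adapt_rate: float = 0.01):
        self.threshold = base_threshold  # starting decision boundary
        self.adapt_rate = adapt_rate     # how quickly we adapt to the user

    def is_pinch(self, pinch_confidence: float) -> bool:
        return pinch_confidence > self.threshold

    def update(self, pinch_confidence: float, user_accepted: bool) -> None:
        """Adapt the threshold from implicit feedback.

        user_accepted is an assumed proxy label, e.g. the user did not
        immediately undo or dismiss whatever the click triggered.
        """
        if user_accepted:
            # Accepted clicks near the boundary: ease the threshold down a little.
            target = pinch_confidence * 0.95
        else:
            # Spurious clicks: push the threshold above the offending score.
            target = pinch_confidence * 1.05
        self.threshold += self.adapt_rate * (target - self.threshold)
```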
Orion is Meta's bid to forge the next ubiquitous computing platform. A device potentially replacing your phone, laptop, even your TV. All commanded by a finger twitch and an eye movement.
And Meta’s even being quite conservative about it, possibly having learnt from the big pushback in 2022 after billions were poured into the metaverse without much to show for it. They are presenting Orion as a demo, with a possible market launch in 3 to 5 years' time.
Ambitious? Absolutely. Audacious? Without question. If successful, it could redefine our digital existence as profoundly as smartphones did.
Meta isn't playing catch-up anymore. They're attempting to vault past the competition entirely.
The Latest
OpenAI’s senior leadership departures continue. CTO Mira Murati resigned alongside Chief Research Officer Bob McGrew and Vice President of Research Barret Zoph, citing personal reasons and a desire to pursue their own interests. OpenAI is also reportedly restructuring its core business into a for-profit entity, moving away from its non-profit roots, with CEO Sam Altman taking equity for the first time.
Meta releases its first open AI model that can process images: The new model, Llama 3.2, could allow developers to create more advanced AI applications, like augmented reality apps that provide real-time understanding of video, visual search engines that sort images based on content, or document analysis that summarizes long chunks of text for users.
Ai2’s new Molmo open-source AI models beat GPT-4o, Claude on some benchmarks: The Allen Institute for AI (Ai2) unveiled Molmo, an open-source family of state-of-the-art multimodal AI models that outperform top proprietary rivals including OpenAI’s GPT-4o, Anthropic’s Claude 3.5 Sonnet, and Google’s Gemini 1.5 on several third-party benchmarks. Notably, Ai2 also said in a post on X that Molmo uses “1000x less data” than those proprietary rivals.
Jony Ive is back with his first major tech project since leaving Apple, teaming up with Sam Altman for an AI-powered hardware device. The project has already raised private funding and plans to raise up to $1 billion by the end of the year.
Legendary filmmaker James Cameron joined the Stability AI Board of Directors.