My hands-on with the Ray-Ban Meta Smart Glasses' multimodal AI. A bunch of new features dropped today too — excited to test those out. https://lnkd.in/envxBDJQ
I want them!!
Chief Revenue Officer (CRO) & Chief Technology Officer (CTO) | Co-host RelaXR 🎙| Spatial Computing Enthusiast and a huge single malt whisky fan 🥃 |
"The #eyes of #AI" - The future is now! People who know me well will recognize this #vision of mine: "Extended Reality will be the eyes of AI." With this in mind, you can imagine I liked the latest update to the #Raybanstories. Great to see the first steps taken where #AI and #XR meet! This is definitely not the final stage yet, and there is so much room for improvement. However, it clearly shows the potential of #XR meeting #AI, and I personally believe this kind of device has much more potential than, for example, an AI Pin or a Rabbit R1. The future is bright! https://lnkd.in/epdXUqt4
A.I. Enthusiast | Digital Accessibility Evangelist | VP, Director of Interactive Technology AbelsonTaylor Group
Ray-Ban #Meta Smart Glasses now offer multimodal #AI, allowing users to interact with their surroundings using photos, audio, and text. While the AI is not perfect, it can effectively identify objects, translate languages, and provide information. The familiar form factor and integration with existing habits make the AI accessible and practical, particularly for outdoor activities. However, limitations in zoom capabilities and context-dependent performance require users to adjust their behavior for optimal results. https://lnkd.in/gJ-vxwgz
Building Existence | Into AI, IOT & Metaverse | Trying to solve the problem of existence through exploration
Meta's Ray-Ban Smart Glasses: Unlocking AI Magic 👓 Meta is giving us a sneak peek into the future with early-access testing of advanced AI features for its Ray-Ban smart glasses! 🚀 Today's announcement unveils the power of multimodal AI, allowing the glasses to see and hear through built-in cameras and microphones 🎤 Mark Zuckerberg showcased the update on Instagram, asking the glasses for fashion advice. The AI not only described the shirt he held but also suggested matching pants 👖 This is just the tip of the iceberg - the glasses can translate text, provide image captions, and more 💬 In a recent interview, Zuckerberg hinted at the potential for users to interact with the Meta AI assistant throughout the day, seeking answers about what they see or where they are 💡 CTO Andrew Bosworth demonstrated the glasses accurately describing a California-shaped wall sculpture, showcasing additional features like captioning photos and language translation 📸 Ready to experience this AI marvel? The test is starting with a limited group in the US. More details here: https://lnkd.in/dSzWmGQb #meta #smartglasses #ai #innovation #iot #techupdate #technews
Meta reveals AI Ray-Ban ‘smart glasses’ that scan what you’re seeing with camera and even talk to you https://lnkd.in/eHsvwz4D #Tech #AI #Smartglasses #Meta #Rayban
Meta's Ray-Ban Glasses Added AI That Can See What You're Seeing - CNET: Meta is rolling out features for its second-gen Ray-Ban glasses, utilizing generative AI to identify images. The glasses can take photos and provide AI analysis in response to voice prompts. The early-access beta aims to improve over time, although issues such as wordy prompts and occasional errors have been noted. Wearable AI is evolving, with potential for diverse applications and implications for privacy. - Artificial Intelligence topics! #ai #artificialintelligence #intelligenzaartificiale
Our latest feature addition to the Ray-Ban Meta glasses adds AI capabilities by combining camera and voice. It's very simple to use: Look... for example, at the ingredients for dinner. Ask... "How long should I cook this for?" ... and the Ray-Ban Meta will answer. https://lnkd.in/g7KCnze4
BREAKING! Ray-Ban Meta Smart Glasses now support multimodal AI! 🤯 You can now ask your glasses to: → record a video for you → give you useful information about your surroundings → play some of your favorite tracks ... and all of this without lifting a finger. Here are all 7 of its new features: 1. Ask it to play some music 2. Ask it to translate what you are seeing 3. Send messages via WhatsApp and Messenger 4. Video-call your friends and family 5. Livestream what you are seeing 6. Take photos & videos 7. Ask for more information about something Meta began testing AI in the glasses as early as December, but has only now released the multimodal AI option, which handles multiple types of content such as text and images. -------------- Found this interesting? ♻ Repost to be the first to inform your network. 🔔 Follow Pete Skarzynski for more AI content. #meta #rayban #llama
Ray-Ban Meta Smart Glasses Update Multimodal AI 🧠: The Ray-Ban Meta Smart Glasses now feature multimodal AI, allowing them to process photos, audio, and text. AI Capabilities 🕶️: Users can ask the glasses to identify plants, translate signs, and more. The AI communicates with the cloud to provide answers. Accuracy & Limits 🎯: The AI is sometimes correct and sometimes confidently wrong. It's good at identifying certain car models and plants but has its limitations. User Experience 👓: The glasses offer a comfortable and familiar form factor, with the AI being an additional feature on top of the existing product. #rayban #metaglasses #metaai #meta #glasses👓 #smartstreetwear #smartgadgets #ai
The Ray-Ban Meta Smart Glasses Have Multimodal AI Now: The Ray-Ban Meta Smart Glasses now feature support for multimodal AI -- without the need for a projector or $24 monthly fee. (We're looking at you, Humane AI.) With the new update, the Meta AI assistant will be able to analyze what you're seeing, and it'll give you smart, helpful answers or suggestions. The Verge reports: First off, there are some expectations that need managing here. The Meta glasses don't promise everything under the sun. The primary command is to say "Hey Meta, look and..." You can fill out the rest with phrases like "Tell me what this plant is." Or read a sign in a different language. Write Instagram captions. Identify and learn more about a monument or landmark. The glasses take a picture, the AI communes with the cloud, and an answer arrives in your ears. The possibilities are not limitless, and half the fun is figuring out where its limits are. [...] To me, it's the mix of a familiar form factor and decent execution that makes the AI workable on these glasses. Because it's paired to your phone, there's very little wait time for answers. It's headphones, so you feel less silly talking to them because you're already used to talking through earbuds. In general, I've found the AI to be the most helpful at identifying things when we're out and about. It's a natural extension of what I'd do anyway with my phone. I find something I'm curious about, snap a pic, and then look it up. Provided you don't need to zoom really far in, this is a case where it's nice to not pull out your phone. [...] But AI is a feature of the Meta glasses. It's not the only feature. They're a workable pair of livestreaming glasses and a good POV camera. They're an excellent pair of open-ear headphones. I love wearing mine on outdoor runs and walks. I could never use the AI and still have a product that works well.
The fact that it's here, generally works, and is an alright voice assistant -- well, it just gets you more used to the idea of a face computer, which is the whole point anyway. Read more of this story at Slashdot.
Product Marketing @ Meta
I was literally waiting for your take, Victoria! I always love hearing your take on the Ray-Ban Meta smart glasses. 1. The Tabby Cat query and photo will forever live in my core memories. Li-Chen Miller will appreciate it, no doubt. 2. Did you get a chance to try out the new frames? We just announced them, and I feel like you might like some of them. 3. Thanks for these reviews! Always love reading the good, the bad, and where we can get sharper!