The Real Reason Meta’s $800 Ray-Ban Demo Failed and Why You Can’t Buy Them Online
A major Zuckerberg stumble, an oddly familiar smartglasses strategy, and a telling signal of what’s to come.
When Meta unveiled its Ray-Ban Display AI glasses on Wednesday, the impressive features were nearly overshadowed by two major demo fails Mark Zuckerberg suffered live on stage. And while the product has received positive reviews, some are wondering what went wrong during the demo, especially since Zuckerberg has done these kinds of live demos dozens of times over the years. Well, now we have an answer.
The day after the tech mishap, Meta’s chief technology officer revealed exactly what they discovered after investigating, and it turns out the problems were two-fold.
“Obviously, I don't love it, but I know the product works. So it really was just a demo fail and not like a product failure,” said Andrew “Boz” Bosworth during a question-and-answer session on Instagram. “When the chef said, ‘Hey Meta, start Live AI,’ it started every single Ray-Ban Meta’s Live AI in the building. And there were a lot of people in that building. That obviously didn't happen in [the empty auditorium space] rehearsal.
“The second thing is, we had routed Live AI traffic to our dev server, in theory, to isolate it. But we had done it for everyone in that building on those access points, which included all of those headsets. So we DDoS’ed ourselves, basically with that demo. And it didn't happen in rehearsal, because we hadn't had as many people with the glasses in the building at the time, so the server was holding up fine.”
So that’s why everyone on stage kept blaming the Wi-Fi. It was a connectivity issue, but not simply “bad Wi-Fi” as one might experience at a local cafe. The auditorium had roughly 2,500 to 3,000 Meta Ray-Ban devices connected at the time, according to Bosworth. In the end, this was a blind spot on Meta’s part: the device is pushing into new territory, and that territory demands a different kind of checklist for live demo logistics.
The other point of failure wasn’t Wi-Fi at all; it was a legitimate software bug.
“The video call issue was actually quite a bit more obscure. A never-before-seen bug in a new product,” said Bosworth. “The display had gone to sleep at the very instant the notification had come in that a call was coming. And so it was a *race condition which caused it. Even when Mark woke the display back up, we didn't show the answer notification to him, and we'd never run into that bug before. It was a race condition, that’s the first time we'd ever seen it. It's fixed now, and that's a terrible, terrible place for that bug to show up. We had run that video call 100 times… Obviously, we're bummed about that. I think we missed an opportunity for legendary status with those things. But I do think the launch is still good and people still love it.”
[*In software, a “race condition” occurs when a program’s behavior depends on the unpredictable timing or ordering of concurrent operations, such as multiple processes or threads attempting to access or modify the same resource (a file, memory, or network data) at the same time.]
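To make the idea concrete, here is a minimal sketch of the kind of race Bosworth describes: a display that silently drops an incoming-call notification if it goes to sleep at the exact instant the call arrives. This is an illustrative simulation, not Meta’s actual code; the class and method names are invented, and the two possible orderings are replayed deterministically rather than with real threads.

```python
# Hypothetical sketch of a sleep-vs-notification race. In a real device,
# the "sleep" event and the "notify" event would come from separate
# concurrent processes, and which one runs first is a matter of timing.

class Display:
    def __init__(self):
        self.awake = True
        self.pending_notification = None

    def sleep(self):
        self.awake = False

    def wake(self):
        self.awake = True
        # Bug: waking does not re-check for a notification that arrived
        # while the display was asleep, so that notification is lost.

    def notify(self, message):
        if self.awake:
            self.pending_notification = message
        # Bug: if the display is asleep, the notification is dropped.

# Ordering A: the notification lands while the display is awake. Shown.
d = Display()
d.notify("Incoming call")
assert d.pending_notification == "Incoming call"

# Ordering B: the display sleeps in the same instant the call comes in.
d = Display()
d.sleep()
d.notify("Incoming call")   # dropped, display is asleep
d.wake()                    # waking does not recover it
assert d.pending_notification is None
```

Both orderings are valid interleavings of the same two events, which is why the bug survived a hundred rehearsals: ordering A happened every time until, on stage, ordering B finally did.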
The Face Test
One thing you might notice is how the Meta Ray-Ban Display AI glasses (yes, that’s a mouthful, we need a nickname… “Displays” maybe?) look on different faces. On some faces, the AI glasses look like normal, albeit thicker, glasses. On other faces, the frames look just plain awkward. I suspect this is part of the reason Meta will not allow buyers to purchase the $800 device online, and instead will require consumers to visit Ray-Ban stores and try the glasses on in person. Dealing with a large number of $800 returns could be disastrous if many people bought them online and, upon delivery, discovered they looked terrible when wearing them.
This lesson was learned years ago by another smartglasses company that had a remarkably similar product dynamic. In 2018, Canadian startup North (subsequently acquired by Google) released Focals, a pair of smartglasses that included a tiny display in one of its lenses, and a wireless control mechanism in the form of a ring that allowed you to subtly control what you saw on the display with your fingers. Sounds familiar, right? Mentioning North’s earlier play on the same kind of system isn’t a slap against Meta. The Meta wrist-worn Neural Band is indeed an engineering masterpiece. However, it is worth noting that this basic smartglasses setup has been tried before.
Similar to the Displays rollout, North’s Focals required a visit to a storefront to be fitted with the correct frame, ensuring customers were happy with the look before buying. I visited the company’s former New York store in Brooklyn, got fitted for my glasses and ring, and used the glasses for a decent amount of time. I doubt Meta would acknowledge this, but the Displays strategy is so similar to how Focals handled things that I believe Zuckerberg took notes years ago and is now trying to do it bigger and better with Displays.
I didn’t use Focals for very long because, unlike other AR/VR devices I’ve used, they eventually became a bit painful. Focusing on the graphic display in the lens strained my eyes over time, and I even developed mild headaches for a short period. Don’t worry, my eyes are fine, but that eye strain experience with Focals put me off single-eye display systems from then on. Will I break my policy for Meta’s Displays? Maybe. Stay tuned.
When Launch Meets Reality
The day after the launch of Displays, I visited Ray-Ban's U.S. flagship store in New York City to get some on-the-ground intel from the staff who will be tasked with directly selling this new product to the masses. I won’t reveal my source, but I had a conversation with one staffer who said that the store is already getting a decent number of phone calls inquiring about when Displays will be on the showroom floor and available for in-person try-ons.
So there’s interest, but I must temper that salesperson’s anecdotal account with the reality that the massive store was almost empty in the middle of the day, in one of the busiest shopping districts (Soho) in one of the most populated cities in the U.S. That lack of foot traffic a day after Displays’ big debut doesn’t bode well for the wearable’s future.
Still, this is one of the most positive reactions to a Meta wearable product I’ve ever seen, so I’m hopeful that this is just the beginning for Meta’s next phase of immersive computing. And I’m sure Apple was taking notes, as it prepares its inevitable Apple Vision Air smartglasses.