
AR glasses translate Bad Bunny in real time during the Super Bowl halftime
A Tom’s Guide writer tested the Ray-Ban Meta Display smart glasses during the Super Bowl halftime show, using their built‑in AI translation to follow Bad Bunny’s performance in real time. The in‑lens overlay delivers concise translations without pulling your eyes away from the stage, though accuracy isn’t perfect and lines can be missed if the camera doesn’t catch the right moment. Gesture controls are available, but voice input remains the easier way to interact. Overall, the experience deepened the writer’s appreciation for the performance and showed that the glasses can translate lyrics on the fly, even if not flawlessly; the glasses cost about $799.