[CES 2025] Meta: the evolution from AI glasses to AR glasses
Dan Reed, COO, Reality Labs at META, joins James Kotecki in the CES C Space Studio
Key Takeaways
Ray-Ban Meta smart glasses
- Equipped with cameras and speakers
- Enhanced AI capabilities that draw on what the user sees and hears
- Features such as live translation and parking-location reminders
Evolution from AI glasses to AR glasses
- Display technology expected to advance
- A range of form factors in development, from mixed reality to full AR
Convergence of AI and the metaverse
- AI as a key driver of change in the metaverse business
- Spatial computing and virtual reality transitioning into a general computing platform
- "Augmented abilities" realized through AI
Social impact and considerations
- Bystander signaling: notifying people nearby when the glasses are recording
- Accessibility: inclusive product design for a diverse range of users
- Potential shifts in social norms as new technologies change how people interact
Business strategy
- Reality Labs transitioning from an R&D organization to a customer-focused business
- VR expanding from gaming into entertainment, fitness, learning, and productivity
- Growing the market through the shift from AI glasses to AR glasses
Outlook
- The "Orion" project: building the AR glasses seen in movies
- Devices planned for a variety of use cases
- Exploring new possibilities through the convergence of AI and AR
In this C Space Studio interview at CES 2025, the world's largest innovation and technology show, host James Kotecki speaks with Dan Reed, COO of Reality Labs at Meta. Their conversation covers Meta's vision for augmented reality (AR), virtual reality (VR), AI integration, and how these emerging technologies will shape future computing platforms, innovations expected to have a significant impact on personal life, social interaction, and accessibility.
- Role of Reality Labs at Meta
- Reality Labs is Meta’s division focused on developing AR, VR, and metaverse technologies.
- They believe the next wave of computing platforms will be “spatial computing,” offering more effective, immersive ways for people to interact with technology.
- Evolution from VR to AR (and the Metaverse)
- VR started primarily with gaming but is expanding to entertainment, fitness, productivity, and beyond.
- AR is being introduced through smart glasses (e.g., Ray-Ban Meta Smart Glasses) and will evolve toward full AR display glasses.
- The “metaverse” is described as the software or experience layer connecting these hardware innovations.
- AI Integration: “AI Glasses”
- Reed highlights how AI capabilities transform these devices from simple camera/audio tools into intelligent assistants.
- With built-in cameras, speakers, and microphones, AI can “see” and “hear” the user’s surroundings, responding in real time.
- Examples of AI-enabled features include live language translation, contextual reminders (e.g., where you parked your car), and advanced assistance on the go.
- Future Capabilities & Mixed Reality
- Mixed Reality (MR) headsets can present full virtual worlds, pass-through views of the real world, or a blend of both.
- Over time, AR glasses might overlay digital information onto one’s view, augmenting daily tasks and experiences.
- The conversation acknowledges this will likely change social norms and interactions, and Meta is focused on “bystander signaling” (e.g., visual indicators that a user is recording).
- Accessibility and Inclusivity
- Meta places a high priority on inclusive design for billions of users, including features like partnering with “Be My Eyes” to assist the blind or visually impaired.
- These technologies, from VR to AI-enabled glasses, aim to accommodate diverse needs.
- Lessons from Reed’s Background
- Dan Reed previously worked at the NBA before joining Meta.
- Understanding consumer needs and maintaining a fan/customer-focused mindset has guided him in both organizations.
- Hype Cycles vs. Real Progress
- VR and AR have gone through peaks of public hype, but the technology has steadily advanced in the background.
- Reed stresses that while mass awareness may fluctuate, actual adoption and use cases continue to grow.
- Science Fiction Influences
- Many are inspired by science fiction like Snow Crash (coining the term “metaverse”) and movies (e.g., Iron Man) depicting futuristic AR headsets.
- Reed says Meta is effectively “creating our own science fiction,” building advanced prototypes (codenamed “Orion”) that resemble these fictional devices.
Full Interview Transcript: Dan Reed, COO, Reality Labs at META, joins James Kotecki in the CES C Space Studio
Welcome back to the C Space Studio here at CES 2025. I am James Kotecki, and I'm very excited about our next conversation with Dan Reed. So Reality Labs is the VR, AR, and metaverse arm of Meta, and we believe that the work we're doing in virtual reality and spatial computing is really making the transition to a general computing platform that we think over time will be more effective, and that AI glasses will turn into AR glasses, which we think are the next mobile computing platform. We're really seeing great traction in both of those businesses, as well as in the metaverse, which is the software layer, the experience layer, across all of it.
And really, the growth of AI over the last year or two has transformed this business; AI is a critical part of our vision here, so it's a very exciting time for us. Thanks for letting me try the glasses.
Can you explain to folks the basic components of what you're wearing right now? Sure. These are the Ray-Ban Meta smart glasses, and really they're transitioning to be AI glasses.
So they have cameras in here, so they take photos and video. They have speakers in the temples here, so you can listen to podcasts, listen to music, take calls. But the really exciting thing is the AI in the glasses, where it's not just a chatbot, an assistant, which is of course useful, but that's something you can do on the phone. The AI in the glasses is really exciting because the AI can see what you see, it can hear what you hear, and it can speak to you privately.
So it unlocks these amazing capabilities, which I'm sure we'll get into. But I imagine a feature not too far off where I'm wearing a set of those glasses too, and I get hints about what to ask my guests, and they probably get hints about the answers, whether that's from the AI or our PR teams behind the scenes. You can imagine a lot of interesting context for that. But I'm curious about the AI components that you mentioned,
and the way that AI is kind of driving this forward. I can't remember when the exact name change was, but I know that in past conversations here in the C Space Studio, we talked about the metaverse, which continued to rise as a really prominent trend.
So now, when we add the latest version of AI and generative AI into the mix for the AR/VR, augmented reality conversation, how does that change the game? It changes the game because it really unlocks, we talk about augmented abilities in this form factor, because, like I described, when the AI can understand your surroundings and speak to you privately while you're on the go, without having to pull out your phone, it unlocks amazing things. So one example is live translation.
So you can talk to me in Spanish. I don't understand Spanish, I don't speak Spanish, but it will translate it live in my ear. And as screens come online, it will translate it on a screen.
It will give you reminders. You know, you often forget where you parked your car; it can remind you where you parked your car. And then there's live AI capability, where in real time, while you're doing things, the AI can assist you based on what you're seeing. This is just the tip of the iceberg; you can imagine a lot more capabilities that will be coming. Like you described, we're getting to a place where it's anticipatory, understanding the situation and just giving you a little help. So it's really transforming.
And how much of an overlay, now that we're in the future, will the glasses be able to put on the reality that I'm seeing? Will it be extremely augmented, or just partially, or could I black out everything and just create a solo virtual world? Yeah, well, I think that's the continuum from mixed reality to eventually AR glasses, right? Mixed reality is a screen, but it passes through: you can go full virtual, and that's it, but you can also do pass-through and see the world around you.
And there will be a range within the glasses category. These have no display; over time, there will be glasses with displays, and there are slightly different use cases depending on what you're using them for. So I think there will be a type of device for any use case in any type of situation. Do you think it will change society, or social conventions, around the expectations of what the other person is able to hear and see when they're wearing these devices? Like, I know, because you told me, that I have cameras in these, and I can hear things in these that you might not be able to hear, right? Does it change social interactions? It's hard to say. I mean, we're in the very early stages of this.
What we talk about is bystander signaling, which is very important to us. So for example, when you take a picture, you see a pretty bright flash; if you take a video, it pulses. And I think there will be a lot of these norms that emerge over time, to your point, because these glasses do give you entirely new capabilities that are very exciting, like eliminating the barrier of language when you're traveling, which is pretty amazing, right? That's obviously something that, as this technology gets better, will change, and I do think there will be a bunch of those things, but it's kind of early to predict exactly what those will be.
And there are so many accessibility elements to this that could be extreme benefits for people who need assistance. We're seeing this now, and this is really important to us as a company. We have nearly 4 billion people on our apps, and these literally go on your face. So making sure that it works for every type of person really matters.
So we have an inclusive product council that every product goes through, where we understand how it works for a wide range of people. We co-design with diverse communities and disabled communities, and one example is our partnership with Be My Eyes, which helps blind and low-vision people understand what's in front of them.
And you can do that now with the glasses, hands-free. Okay, so it sees what you're looking at, what's happening. That's right. So, for example: before this you've had a substantial career, and before that you had a fairly substantial career at the NBA. That's right. What have you learned in both of those careers? Listening to audiences. I mean, look, these are both consumer brands, consumer businesses, and so understanding your audience is critical. It just is.
And it's funny, the transition we're making in Reality Labs is instructive here, because Reality Labs has historically been, effectively, an R&D department.
We are inventing this technology; these things don't exist in the world. Now we're transitioning to, okay, we need to get very customer-focused in terms of different customer segments, different use cases.
Because now we're scaling this business to millions and hundreds of millions of devices, so we're moving into that phase. And obviously at the NBA, this was critical: understanding who your fans are, whether it's ticket sales or sponsorships or the product on the floor. Yeah, I think a casual observer of the space might have thought, oh, VR was hot, and then it seemed to have diminished a little bit.
And what I'm understanding from you, and maybe from other conversations, is: no, it's continued to evolve, and now a lot of the focus is on the utility, on actually useful things. We haven't talked as much about the fun side in this conversation. When we put the glasses on earlier, I told the AI to tell me a joke.
It was pretty funny. So there is still fun. But I'm just curious whether, right now, the focus of the industry is more on the utility side versus the, quote unquote, fun side. Where does the fun side fit into it for you? It's all of the above. You know, we're human beings.
We want to unlock capabilities for ourselves, improve productivity, or whatever. But we're human beings: I want to socialize, I want to make my life better on all fronts. So, like in mixed reality, you know, these things tend to go in hype cycles. There was a big hype cycle around VR, and then, if you're not paying close attention, it sort of dies down.
But the reality is that the business continues. And what's interesting is that it's making this transition from gaming, which has been the first major use case in VR, and you understand why, because gaming immersively is amazing, and now it's making this transition into entertainment, into fitness, into learning, into productivity. Over time, again, that's the path of becoming a general compute platform. And that's happening; we're seeing growth in all those areas. But if you're not in the space, it's a little under the radar.
But we're seeing the growth, and the AI glasses to AR glasses transition is the same thing. We've invented the technology.
We have product-market fit for this product, and now it's about: how do we make them useful? How do we help people become aware of them? How do we tell the story of how useful they are? I heard a similar argument in the C Space Studio this year about blockchain, too, which people might have thought was a similar kind of thing, but the person who was sitting where you're sitting was like, no, actually, this is really useful, and it's really pervasive, but people aren't thinking about it as much.
So the hype cycle is really more about what people are paying attention to at the moment versus what's actually useful. That's right: virtual reality and augmented reality are so much woven into the action. Yeah. Is there a piece of science fiction that especially inspires you as you think about what you're building and where you think this is going? I mean, let's start with the metaverse: the term was inspired by the book Snow Crash, right?
And so we get inspiration from a variety of different books and movies, but at this point we are sort of trying to create our own science fiction in many ways. We just showed the most advanced AR glasses in the world; we call them Orion.
And these are the things that you see in movies like Iron Man, where Tony Stark has the AR glasses, where you have a wide field of view and holographic displays.
You can watch a movie, you can have multiple screens, you can talk to a virtually embodied person, and it has all the same capabilities that these do.
So every movie that you've seen that has those AR glasses, and all the capabilities they have, that's really inspiring, and we're building it.
I've seen it, I've tried it, it's pretty cool. I love that quote, "we are creating our own science fiction." Put that up on a poster. Thanks so much. Thanks for having me. And we really appreciate you watching. This is the C Space Studio here at CES 2025. Don't go anywhere, because more great conversations with thought leaders are just ahead.