audio
audioduration (s) 0.08
52.4
| transcript
stringlengths 2
993
| __index_level_0__
int64 0
480
|
---|---|---|
I don't know the answer to that, then I'd be some computer construct and not the person who created that meta company. But that would truly be meta. | 0 |
|
I mean, it's not gonna be four decades before we have photorealistic avatars like this. So I think we're much closer to that. | 1 |
|
Well, I think that's, this is like the key question, right? Because the thing that's different about virtual and hopefully augmented reality compared to all other forms of digital platforms before is this feeling of presence, right? The feeling that you're right, they are in an experience and that you're there with other people or in another place. | 2 |
|
And that's just different from all the other screens that we have today, right? Fones, TVs, all the stuff. It's, you know, they're trying to, in some cases, deliver experiences that feel high fidelity, but at no point do you actually feel like you're in it, right? | 3 |
|
At some level, your content is trying to sort of convince you that this is a realistic thing that's happening, but all of the kind of subtle signals are telling you now you're looking at a screen. So the question about how you develop these systems is like, what are all of the things that make the physical world all the different cues? So I think on visual presence and spatial audio, we're making reasonable progress. | 4 |
|
Spatial audio makes a huge deal. I don't know if you've tried this experience, work rooms that we launched where you have meetings. And I basically made a rule for all of the top management folks at the company that they need to be doing standing meetings in work rooms already, right? | 5 |
|
I feel like we got a dog food this. This is how people are going to work in the future. So we have to adopt this now. | 6 |
|
And there are already a lot of things that I think feel significantly better than, than like typical zoom meetings, even though the avatars are a lot lower fidelity. You know, the idea that you have spatial audio, you're around a table in VR with people. If someone's talking from over there, it sounds like it's talking from over there. | 7 |
|
You can see, you know, the, the, the arm gestures and stuff feel more natural. You can have side conversations, which is something that you can't really do in zoom. I mean, I guess you can text someone over, like out of band. | 8 |
|
But if you're actually sitting around a table with people, you know, you can lean over and whisper to the person next to you and like have a conversation that you can't, you know, that you can't really do with in, in, in, in just video communication. So I think it's interesting in what ways some of these things already feel more real than a lot of the technology that we have. Even when the visual fidelity isn't quite there, but I think it'll get there over the next few years. | 9 |
|
Now, I mean, you were asking about comparing that to the, the true physical world, not the zoom or something like that. And there, I mean, I think you have feelings of like temperature, you know, all factory, obviously touch. Right. | 10 |
|
We're working on haptic gloves. You know, the, the sense that you want to be able to, you know, put your hands down and feel some pressure from the table. You know, all these things, I think we're going to be really critical to be able to keep up this illusion that you're in a world and that you're fully present in this world. | 11 |
|
But I don't know, I think we're going to have a lot of these building blocks within, you know, the next 10 years or so. And even before that, I think it's amazing how much you're just going to be able to build with software that sort of masks some of these things. I, I, I'm going along, but I, you know, I was told we have a few hours here. | 12 |
|
Yeah, we're here for five to six hours. Yeah. So I mean, it's, look, I mean, that's, that's on the shorter end of the congressional testimonies I've done. | 13 |
|
But it's, um, but | 14 |
|
You know, one of the things that we found with hand presence, right? So the earliest VR, you just have the headset and then, um, and that was cool. You could look around. | 15 |
|
You feel like you're in a place, but you don't feel like you're really able to interact with it until you have hands. And then there was this big question where once you got hands, what's the right way to represent them? And initially, all of our assumptions was, okay, when I look down and see my hands in the physical world, I see an arm and it's going to be super weird if you see, you know, just your hand. | 16 |
|
Um, but turned out to not be the case because there's this issue with your arms, which is like, what's your elbow angle? And if the elbow angle that we're kind of interpolating based on where, um, your hand is and where your headset is, actually, as an accurate, it creates this very uncomfortable feeling where it's like, oh, like my arm is actually out like this, but it's like showing it in here and that actually broke the, the feeling of presence a lot more. | 17 |
|
Whereas it turns out that if you just show the hands and you don't show the arms, um, it actually is fine for people. So I think that there's a bunch of these interesting psychological cues where it'll be more about getting the right details right. And I think a lot of that will be possible even over, you know, a few year period or five year period and we won't need like every single thing to be solved to deliver this like full sense of presence. | 18 |
|
Yeah, and the way that I come to all of this stuff is, I basically studied psychology and computer science. So all of the work that I do is sort of at the intersection of those things. I think most of the other big tech companies are building technology for you to interact with. | 19 |
|
What I care about is building technology to help people interact with each other. So I think it's a somewhat different approach than most of the other tech entrepreneurs and big companies come at this from. And a lot of the lessons in terms of how I think about designing products come from some just basic elements of psychology, right? | 20 |
|
In terms of, you know, our brains, you know, you can compare to the brains of other animals, you know, we're very wired to specific things, facial expressions, right? I mean, we're very visual, right? So compared to other animals, I mean, that's clearly the main sense that most people have. | 21 |
|
But there's whole part of your brain that's just kind of focused on reading facial cues. So when we're designing the next version of Quest where the VR headset, a big focus for us is face tracking and basically eye tracking so you can make eye contact, which again, isn't really something that you can do over a video conference. It's sort of amazing how much, how far video conferencing has gotten without the ability to make eye contact, right? | 22 |
|
It's sort of a bizarre thing if you think about it. You're like looking at someone's face, you know, sometimes for an hour when you're in a meeting and like you looking at their eyes to them doesn't look like you're looking at their eyes. So it's... | 23 |
|
Yeah. | 24 |
|
We are trying to. Right, you're trying to. Like a lot of times I mean, I, or at least I find myself, I'm trying to look into the other person's eyes. | 25 |
|
Yeah, so then the question is, all right, am I supposed to look at the camera so that way you can, you know, have a sensation that I'm looking at you? I think that that's an interesting question. And then, you know, with VR today, even without eye tracking and knowing what your eyes are actually looking at, you can fake it reasonably well, right? | 26 |
|
So you can look at like where the head poses and if it looks like I'm kind of looking in your general direction, then you can sort of assume that maybe there's some eye contact intended and you can do it in a way where it's okay, maybe not it's like a, maybe it's not a fixated stare, but it's somewhat natural, but once you have actual eye tracking, you can do it for real. And I think that's really important stuff. | 27 |
|
So when I think about Meta's contribution to this field, I have to say it's not clear to me that any of the other companies that are focused on the metaverse or on virtual and augmented reality are going to prioritize putting these features in the hardware because like everything they're trade -offs, right? I mean, it adds some weight to the device, maybe it adds some thickness. You could totally see another company taking the approach but just make the lightest and thinnest thing possible. | 28 |
|
But I want us to design the most human thing possible that creates the richest sense of presence and because so much of human emotion and expression comes from these like micro movements. If I like move my eyebrow, you know, millimeter, you will notice and that like means something. So the fact that we're losing these signals and a lot of communication I think is a loss. | 29 |
|
So it's not like, okay, there's one feature and you add this, then it all of a sudden is going to feel like we have real presence. You can sort of look at how the human brain works and how we express and kind of read emotions and you can just build a roadmap of that. You know, of just what are the most important things to try to unlock over a five to 10 year period and just try to make the experience more and more human and social? | 30 |
|
Yeah, I think it's a really good question. | 31 |
|
Someone, you know, I read this piece that frame this as a lot of people think that the metaverse is about a place, but one definition of this is it's about a time when basically immersive digital worlds become the primary way that we, that we live our lives and spend our time. I think that that's a reasonable construct. And from that perspective, you know, I think, um, you also just want to look at this as a continuation because it's not like, okay, we are building digital worlds, but we don't have that today. | 32 |
|
I think, you know, you know, you and I probably already live a very large part of our life in digital worlds. They're just not 3D immersive virtual reality, but you know, I do a lot of meetings over video or I spend a lot of time writing things over email or WhatsApp or whatever. So what is it going to take to get there for kind of the immersive presence version of this, which I think is what you're asking? | 33 |
|
And for that, I think that there's just a bunch of different use cases, right? And then, um, I think when you're building technology, I think you're, a lot of it is just you're managing this duality where on the one hand, you want to build these elegant things that can scale and, you know, have billions of people use them and get value from them. And then on the other hand, you're fighting this kind of ground game where it's just there are just a lot of different use cases and people do different things and like you want to be able to unlock them. | 34 |
|
So the first ones that we basically went after were gaming, um, with Quest and social experiences. And this is, you know, it goes back to when we started working on virtual reality. My theory at the time was basically, people thought about it as gaming, but if you look at all computing platforms up to that point, you know, gaming is a huge part. | 35 |
|
It was a huge part of PCs. It was a huge part of mobile, but it was also very decentralized, right? There wasn't, you know, for the most part, you know, one or two gaming companies. | 36 |
|
There were a lot of gaming companies and gaming is somewhat hits based. And we're getting some games that are that have more longevity, but, um, but it put in general, you know, there were a lot of different games out there. But on PC and, um, and on mobile, the companies that focused on communication and social interaction, there tended to be a smaller number of those. | 37 |
|
And that ended up being just as important of a thing as all of the games that you did combined. I think productivity is another area. That's obviously something that we've historically been less focused on, but I think it's going to be really important for us. | 38 |
|
I think there's a workroom aspect of this, like a meeting aspect, and then I think that there's like a word, Excel, productivity. You're working or coding or what knowledge work, right? It's as opposed to just meetings. | 39 |
|
So you can kind of go through all these different use cases. You know, gaming, I think we're well in our way. Social, I think, we're just the kind of preeminent company that focuses on this. | 40 |
|
And I think that that's already on quest becoming the, you know, if you look at the list of what are the top apps, you know, social apps are already, you know, number one, two, three. So that's kind of becoming a critical thing. But I don't know, I would imagine for someone like you, it'll be, you know, until we get, you know, a lot of the work things dialed in, right? | 41 |
|
When this is just like much more adopted and clearly better than Zoom for VC, when, you know, if you're doing your, your coding or your writing or whatever it is, in VR, which it's not that far off to imagine that because it's pretty soon, you're just going to be able to have a screen that's bigger than, you know, it'll be your ideal setup and you can bring it with you and put it on anywhere and have your, your kind of ideal work station. | 42 |
|
So I think that there are a few things to work out on that. But I don't think that that's more than, you know, five years off. And then you'll get a bunch of other things that like aren't even possible or you don't even think about using a phone or PC for today, like fitness, right? | 43 |
|
So I mean, I know that you're, you know, we're talking before about how you're, you're into running and like I'm really into, you know, a lot of things around fitness as well, you know, different things in different places. I got really into hydrofoiling recently and thanks. Thanks. | 44 |
|
Yeah, and surfing and I used to fence competitively. I like run so. | 45 |
|
Is that a trick? | 46 |
|
Yeah, no, I mean, I took that seriously. I thought that that was a real suggestion. | 47 |
|
Well, give me a year to train and then and then we can do it. | 48 |
|
The idea of me as Rocky and like fighting is um... | 49 |
|
He dies. Sorry. | 50 |
|
I mean... | 51 |
|
But I mean, a lot of aspects of fitness, you know, I don't know if you've tried supernatural on Quest or | 52 |
|
Yeah. | 53 |
|
Yeah. | 54 |
|
Yeah. | 55 |
|
I think that in building this, we sort of need to balance. There are going to be some new things that you just couldn't do before, and those are going to be the amazing experiences. Teleporting to any place, whether it's a real place or something that people made. | 56 |
|
Some of the experiences around how we can build stuff in new ways, where a lot of the stuff that when I'm coding stuff, it's a cart, you code it, and then you build it, and then you see it afterwards. But increasingly, it's going to be possible to, you know, you're in a world, and you're building the world as you are in it, and kind of manipulating it. You know, one of the things that we showed inside the lab for recent artificial intelligence progress is this builder bot program, where now you can just talk to it and say, hey, I'm in this world, put some trees over there, and it'll do that. | 57 |
|
And like, all right, put some bottles of water on our picnic blanket, and it'll do that, and you're in the world, and I think there are going to be new paradigms for coding. So yeah, there are going to be some things that I think are just pretty amazing, especially the first few times that you do them, but that you're like, whoa, like I've never had an experience like this. But most of your life, I would imagine, is not doing things that are amazing for the first time. | 58 |
|
A lot of this in terms of, I mean, just answering your question from before around, what is it going to take before you're spending most of your time in this? Well, first of all, let me just say it as an aside, the goal isn't to have people spend a lot more time in computing. And that's to make it to that. | 59 |
|
It's to make computing more natural, but it is. | 60 |
|
But I think you will spend most of your computing time in this when it does the things that you use computing for somewhat better. So maybe having your perfect workstation is a 5 % improvement on your coding productivity. It maybe it's not like a completely new thing. | 61 |
|
But I mean, look, if I could increase the productivity of every engineer in meta by 5%, we'd buy those devices for everyone. And I imagine a lot of other companies would too. And that's how you start getting to the scale that I think makes this rival some of the bigger computing platforms that exist today. | 62 |
|
Yeah, I mean, I think that there's going to be a range, right? So we're working on, for expression and avatars, on one end of the spectrum are kind of expressive and cartoonish avatars. And then on the other end of the spectrum are photorealistic avatars. | 63 |
|
And I just think the reality is that they're going to be different use cases for different things. And I guess there's another axis. So if you're going from photorealistic to expressive, there's also like representing you directly versus like some fantasy identity. | 64 |
|
And I think that there are going to be things on all ends of that spectrum too, right? So you'll want photo, like in some experience you might want to be like a photorealistic dragon, right? Or, or you know, if I'm playing onward, or just this military simulator game, it's, you know, I think getting to be more photorealistic as a soldier in that could enhance the experience. | 65 |
|
There are times when I'm hanging out with friends where I want them to, you know, know it's me. So kind of cartoonish or expressive version of me is good. But there are also experiences like, you know, VR chat does this well today, where a lot of the experiences kind of dressing up and wearing a fantastical avatar that's almost like a meme or is humorous. | 66 |
|
So you come into an experience and it's almost like, you have like a built -in icebreaker because like you see people and you're just like, all right, I'm cracking up at what you're wearing because that's funny and it's just like, where'd you get that or oh you made that? That's, you know, it's awesome. | 67 |
|
Whereas, you know, okay, if you're going into a work meeting, maybe a photorealistic version of your real self is gonna be the most appropriate thing for that. So I think the reality is there aren't going to be, it's not just gonna be one thing. You know, my own sense of kind of how you wanna express identity online has sort of evolved over time in that early days in Facebook, I thought, okay, people are gonna have one identity. | 68 |
|
And now I think that's clearly not gonna be the case. I think you're gonna have all these different things and there's utility and being able to do different things. So some of the technical challenges that I'm really interested in around it are, how do you build the software to allow people to seamlessly go between them? | 69 |
|
So say, so you could view them as just completely discrete points on a spectrum, but let's talk about the metaverse economy for a second. Let's say I buy a digital shirt for my photorealistic avatar, which by the way, I think at the time where we're spending a lot of time in the metaverse doing a lot of our work meetings in the metaverse, et cetera, I would imagine that the economy around virtual clothing as an example is going to be quite as big. | 70 |
|
Why wouldn't I spend almost as much money in investing in my appearance or expression for my photorealistic avatar for meetings as I would for whatever I'm gonna wear in my video chat? But the question is, okay, so let's see you buy some shirt for your photorealistic avatar. Wouldn't it be cool if there was a way to basically translate that into a more expressive thing for your kind of cartoonish or expressive avatar? | 71 |
|
And there are multiple ways to do that. You can view them as two discrete points and, okay, maybe if a designer sells one thing, then it actually comes in a pack and there's two and you can use either one on that, but I actually think the stuff might exist more as a spectrum in the future. And that's what I do think the direction on some of the AI advances that is happening to be able to, especially stuff around like style transfer, being able to take a piece of art or express something and say, okay, paint me this photo in the style of go -gan or whoever it is that you're interested in, take this shirt and put it in the style of what I've designed for my expressive avatar. | 72 |
|
I think that's gonna be pretty compelling. | 73 |
|
I mean, you could do that, although I think that's, I think some people will, but I think like, I think there's going to be a huge aspect of just people doing creative commerce here. So I think there is going to be a big market around people designing digital clothing. But the question is, if you're designing digital clothing, do you need to design, if you're the designer, do you need to make it for each kind of specific discrete point along a spectrum, or you're just designing it for kind of a photo -realistic case or an expressive case, or can you design one and have it translate across these things? | 74 |
|
If I buy a style from a designer who I care about and now I'm a dragon, is there a way to morph that so it goes on the dragon in a way that makes sense? And that I think is an interesting AI problem because you're probably not going to make it so that that designers have to go design for all those things. But the more useful the digital content is that you buy in a lot of uses, in a lot of use cases, the more that economy will just explode. | 75 |
|
That's a lot of what all of the, we were joking about NFTs before, but I think a lot of the promise here is that if the digital goods that you buy are not just tied to one platform or one use case, they end up being more valuable, which means that people are more willing and more likely to invest in them, and that just spurs the whole economy. | 76 |
|
Well, let's break that down into a few different cases. I mean, because knowing that you're talking to someone who has good intentions is something that I think is not even solved in... | 77 |
|
right, and pretty much anywhere. But, I mean, if you're talking to someone who's a dragon, I think it's pretty clear that they're not representing themselves as a person. I think probably the most pernicious thing that you want to solve for is, I think probably one of the scariest ones is how do you make sure that someone isn't impersonating you? | 78 |
|
Right, so you like, okay, you're in a future version of this conversation. And we have photo realistic avatars, and we're doing this in work rooms or whatever the future version of that is. And someone walks in who looks like me. | 79 |
|
How do you know that that's me? And... | 80 |
|
One of the things that we're thinking about is, it's still a pretty big AI project to be able to generate photorealistic avatars that basically can like, they work like these codecs of you. Right? So you kind of have a map from your headset and whatever sensors of what your body is actually doing and it takes the model and it kind of displays it in VR. | 81 |
|
But there's a question which is should there be some sort of biometric security so that like when I put on my VR headset or I'm going to go use that avatar, I need to first prove that I am that. And I think you probably are going to want something like that. So as we're developing these technologies, we're also thinking about the security for things like that because people aren't going to want to be impersonated. | 82 |
|
That's a huge security issue. Then you just get the question of people hiding behind fake accounts to do malicious things which is not going to be unique to the metaverse although certainly in a environment where it's more immersive and you have more of a sense of presence, it could be more painful, but more painful. But this is obviously something that we've just dealt with for years in social media and the internet more broadly. | 83 |
|
And there I think | 84 |
|
There have been a bunch of tactics that I think we've just evolved to, you know, we've built up these different AI systems to basically get a sense of, is this account behaving in the way that a person would? And it turns out, you know, so in all of the work that we've done around, you know, we call it community integrity. And it's basically like policing harmful content and trying to figure out where to draw the line. | 85 |
|
And there are all these like really hard and philosophical questions around like, where do you draw the line on some of this stuff? And the thing that I've kind of found the most effective is as much as possible trying to figure out who are the inauthentic accounts or where the accounts that are behaving in an overall harmful way at the account level, rather than trying to get into like policing what they're saying, right? | 86 |
|
Which I think the metaverse is going to be even harder because that the metaverse I think will have more properties of, it's almost more like a phone call, right? Or like, or you're, you know, it's not like I post a piece of content and is that piece of content good or bad. So I think more of this stuff will have to be done at the level of the account. | 87 |
|
But... | 88 |
|
This is the area where between the counterintelligence teams that we built up inside the company and years of building different AI systems to basically detect what is a real account and what isn't. I'm not saying we're perfect, but this is an area where I just think we are years ahead of basically anyone else in the industry in terms of having built those capabilities. And I think that that just is going to be incredibly important for this next wave of things. | 89 |
|
like agent smith | 90 |
|
How does flooding the world with Lex's help me know in our conversation that I'm talking to the real Lex? | 91 |
|
I think that one's not going to work that well for you. | 92 |
|
I mean, for the original copy, it probably fits some things. Like if you're a public figure and you're trying to have, you know, a bunch of, if you're trying to show up in a bunch of different places in the future, you'll be able to do that in the metaverse. So that kind of replication, I think will be useful. | 93 |
|
But I do think you're going to want a notion of like, | 94 |
|
I am talking to the real one. | 95 |
|
Yeah. | 96 |
|
I think that there's different solutions for strategies where it makes sense to have stuff kind of put behind a fortress, right, so the centralized model versus decentralizing. Then I think both have strengths and weaknesses. So I think anyone who says, okay, just decentralize everything that'll make it more secure. | 97 |
|
I think that that's tough because, you know, I mean, the advantage of something like, you know, encryption is that, you know, we run the largest encrypted service in the world with WhatsApp and one of the first to roll out a multi -platform encryption service. And that's something that I think was a big advance for the industry. And one of the promises that we can basically make because of that, our company doesn't see when you're sending an encrypted message and an encrypted message, what the content is of what you're sharing. | 98 |
|
So that way if someone hacks meta servers, they're not going to be able to access, you know, your sending to your friend. And that I think matters a lot to people because obviously if someone is able to compromise a company's servers and that company has hundreds of millions or billions of people, then that ends up being a very big deal. The flip side of that is, okay, all the content is on your phone. | 99 |
End of preview. Expand
in Dataset Viewer.
README.md exists but content is empty.
- Downloads last month
- 41