
Jensen Huang in conversation with Zuckerberg: The future of AI

Huang and Zuckerberg shared the stage to talk about AI and swapped coats to express their "brotherly affection."

On July 29, local time, at the SIGGRAPH 2024 conference held in Denver, United States, NVIDIA CEO Jensen Huang and Meta founder Mark Zuckerberg sat down to discuss the future of artificial intelligence, simulation, robotics, smart glasses, and open-source philosophy.

In this hour-long conversation, Zuckerberg laid out his vision for the future of AI. He pointed out that AI will be used not only in content recommendation systems but also to generate content on the fly and to synthesize new content from existing material, which will transform the information flows and recommendation systems of platforms such as Instagram and Facebook.

Huang also praised Meta's newly open-sourced model, Llama 3.1. Zuckerberg recounted how the constraints of Apple's closed platform made him appreciate the importance of open source: "Open source makes our AI ecosystem stronger, not because it's altruistic, but because it's easier to use."

On smart glasses, Zuckerberg said he had expected holographic AR glasses to arrive before AI did, but smart glasses are nonetheless the mobile form of the next generation of computing platforms. "We're still a while away from holographic glasses. I don't think it's too far off to achieve this in a sleek pair of glasses with somewhat thicker frames, though."

During the conversation, Huang mentioned that when Zuckerberg visited his home, the two prepared cheesesteak sandwiches together; when Zuckerberg cut the tomatoes, every slice was equally thin and neatly arranged. Huang laughed: "That's why he needs an AI that won't judge him."

Toward the end of the conversation, Huang and Zuckerberg exchanged leather jackets. Zuckerberg quipped that Huang's was worth more because it had already been worn by him.

The following is the full text of the conversation:

Jensen Huang

Welcome, Mark, to your first SIGGRAPH. As one of the pioneers of computing and a driving force of modern computing, I had to invite you here.

You know, 90% of the people here have PhDs. The really great thing about SIGGRAPH is that it showcases the combination of computer graphics, image processing, artificial intelligence, and robotics. Over the years, companies including Disney, Pixar, Adobe, Epic Games, and of course NVIDIA have shown and revealed amazing things here. We've done a lot of work this year and presented 20 papers at the intersection of AI and simulation. We're using AI to help simulate at larger scale and higher speed, for example differentiable physics, and we're using simulation to create environments for synthetic data generation. The two fields are really coming together.

The work Meta has done is something to be proud of; you've done amazing work on AI. The media often writes as if Meta entered the AI space only in the last few years, but as you know, we've been using Meta's PyTorch, and you've done groundbreaking work in computer vision, language models, real-time translation, and more. My first question is: what do you think of the advances in generative AI at Meta today, and how do you apply them to enhance your operations or introduce new capabilities?

Zuckerberg

Yes, there's a lot to answer there. First of all, it's a pleasure to be here. Meta has done a lot of work in AI, and we're constantly improving. Thank you for having me.

Back in 2018, we showed some of our early hand-tracking work for VR and mixed-reality headsets. I think we've talked a lot about the progress we've made with Codec Avatars; we're hoping to drive photorealistic avatars from consumer-grade headsets, and we're getting closer and closer, so we're really excited about that. We've also done a lot of work on display systems, showing future prototypes and research aimed at making mixed-reality headsets very thin while still having fairly advanced optical stacks, display systems, and integrated systems.

Zuckerberg

This year isn't just about the metaverse work; it's also about all the AI work. When we set up FAIR, our AI research center, the company was still Facebook; now it's Meta. We'd been at it for a while before we started FAIR, and it's been an interesting revolution watching everything change in the age of AI.

I think it will end up changing every product we make in interesting ways. Look at the major product lines we already have, like the feed and recommendation systems on Instagram and Facebook. We've been evolving them the whole time, starting from simply connecting you with friends. Ranking has always been important, because even if you only follow friends, when somebody does something really important, like your cousin having a baby, you want that at the top. If we buried it in your feed, you'd be angry. Those moments matter to you, so the recommendation system matters.

Zuckerberg

Over the past few years it has gotten to the point where more of the feed is just different public content. The recommendation system becomes very important, because now it's not just a few hundred or a few thousand candidate posts from friends but millions of pieces of content, which makes for a very interesting recommendation problem. And with generative AI, I think we're quickly going to get to a point where most of the content you see on Instagram is recommended to you from across the world based on your interests, whether or not you follow the people who made it.

I think in the future a lot of this content will be created with AI tools. Some of it will be creators using the tools to make new content, and eventually some of it will be content generated on the fly for you, or synthesized by pulling together existing content. That's just one example of how a core part of what we do has evolved, and it's been evolving for 20 years.

Jensen Huang

It's not widely appreciated that one of the largest computing systems in the world is a recommendation system.

Zuckerberg

I mean, it's a completely different path. It's not the generative AI hotspot people talk about, but with all the Transformer architectures there are similarities; it's about building more and more general models.

Jensen Huang

Embedding unstructured data into features.

Zuckerberg

Yes, and a big driver of the quality improvements is that in the past we had a different model for each type of content. A recent example: we had one model for ranking and recommending short videos and another for longer videos, plus product work to stitch the systems together so anything that met the criteria could be shown. The more you can build a single, general recommendation model that covers everything, the better it gets.

I think that keeps getting better as the models get bigger and more general. So I have this dream that one day you can almost imagine all of Facebook or Instagram being a single AI model that unifies all these different content types and systems, with different objectives over different time frames. Some of it is just showing you interesting content today, but some of it is helping you build your network over the long term, like people you may know or accounts you might want to follow.

Jensen Huang

These multimodal models tend to be better at recognizing patterns and weak signals. AI runs so deep at your company; you've been building GPU infrastructure to run these large recommendation systems for a long time.

But once you start getting into this field, you really get into it. Nowadays, when I use WhatsApp, I feel like I'm collaborating with it. I type and it generates images as I go; I go back and change my words, and it generates different images.

Zuckerberg

Yes, that shipped last week, and I'm excited about it. I've spent a lot of time over the past week with my daughters imagining them as mermaids and things like that; it's been a lot of fun. And I mean, that's the other half of it.

On the one hand, a lot of generative AI stuff, it's going to be a major upgrade to all of our long-standing workflows and products.

But on the other hand, there are completely new things that can now be created. The idea behind Meta AI is an AI assistant that can help you with different tasks and be very creative. Like you said, these models are very general, so you're not limited to that; it will be able to answer any question.

As we move from something like the Llama 3 class of models to Llama 4 and beyond, it won't feel like a chatbot, where you give it a prompt and it responds, then you give another prompt and it responds, back and forth. I think it will evolve quickly toward you giving it an intent, and it working on that across multiple time frames. Eventually, I think, it will kick off compute jobs that take weeks or so, then come back to you and tell you what happened. That's going to be very powerful.

Jensen Huang

Today's AI, as you know, is turn-based and rather one-shot. When we're given a task or a question, we consider multiple options: we build an option tree, a decision tree, and walk it, simulating in our minds the different outcomes of each decision we might make. That's planning. In the future, AI will do the same.

Zuckerberg

Yes, and we're actually working on that; today we're starting to roll it out more broadly. A lot of our vision isn't that there's just one AI model; it's about building integrated AI agents. Much of our vision is to empower everyone who uses our products to create agents for themselves.

Whether it's the millions of creators on the platform or a small business, we want to let them pull in all of their content, quickly stand up a business agent, and have it interact with customers, handle sales, do customer support, and so on. So we're just starting to roll out AI Studio, which is basically a set of tools that will eventually let every creator build an AI version of themselves, an agent or assistant their community can interact with.

There's a basic problem here: there aren't enough hours in the day. If you're a creator who wants to engage more with your community, you have limited time. So the next step is letting people create these agents: you train an agent on your own material to represent you the way you want. I think it's a very creative endeavor, almost like a piece of art or content you put out there. To be very clear, it's not the creator themselves interacting. But I think it will be another interesting way for creators to engage, just as they post content on these social systems.

I also think people will fine-tune and train agents for all sorts of different purposes. Some will be entertainment; some people will create things that are just funny in different ways, or that have a silly attitude, things we probably wouldn't build into Meta AI as an assistant but that people are interested in and want to interact with.

Then there's the interesting use case of people using these agents for support. One of the main uses of Meta AI already is people simulating social situations, whether professional, like "I want to ask my manager how I get a promotion or a raise," or an argument with a friend, or a difficult situation with a girlfriend. They simulate the conversation, see how it might go, and get feedback.

A lot of people don't just want to interact with the big agents, whether it's Meta AI or ChatGPT or whatever someone else uses; they want to create something of their own. That's roughly what we're aiming for with AI Studio, and it's all part of a larger vision: we don't think there should be just one big AI that people interact with. We think the world will be better and more interesting if there's a whole diversity of them.

Jensen Huang

And those are all different things. I think it's cool that if you're an artist with your own style, you can take your entire body of work, fine-tune an AI model on it, and have the AI create in your own artistic style.

Zuckerberg

Yes. I mean, just as every business today has an email address, a website, and social media accounts, in the future every business will have an AI agent that engages with its customers. Historically, these things have been hard to unify. In any company, customer support is usually separate from sales, and as CEO that's not what you want; it's just that they're different skill sets.

Zuckerberg

I guess it only comes together at the CEO, who has to do everything. But as you build layers of abstraction in an organization, these departments often stay separate because they're optimized for different goals. Ideally, they'd be one thing. As a customer, you don't care about taking a different path when you're buying something versus when you have a problem; you just want one place where your questions get answered and you can interact with the business. The same applies to creators.

Jensen Huang

And those interactions with customers, especially their complaints, make your company better. The AI captures institutional knowledge, and it can all flow into analytics so the AI keeps improving, and so on.

Zuckerberg

Right, and the business version needs a lot more integration, so we're still at a pretty early stage there. But AI Studio gives people the ability to create their own agents for all kinds of things and gets this flywheel started. I'm very excited about it.

Jensen Huang

So, can I use AI Studio to fine-tune my collection of images?

Zuckerberg

Yes, we will do it.

Jensen Huang

And I can load in everything I've written so it can use it as source material. Basically, every time I come back to it, it loads its memory again, so it remembers where we left off and we continue the conversation as if nothing happened.

Like any product, it will get better over time, and the training tools will get better. It's not just about what you want it to say. I think creators and businesses generally want fine-grained control, not passive acceptance, so it's improving on all of these fronts.

Zuckerberg

Ideally, it's not just text; you almost want to be able to video chat with it, which intersects with some of the Codec Avatars work we're doing. We'll make that happen. These things aren't far away, and the flywheel is spinning fast. It's exciting; there's a lot of new stuff to build. Even if progress on the foundation models stopped today, we'd have five years of product innovation just figuring out how to make the most effective use of what already exists. But in reality, progress on the foundation models and the basic research is accelerating. It's a pretty wild time.

Jensen Huang

Well, let me put it this way. I love your vision that everyone can have an AI and every business can have an AI. In our company, I want every engineer and every software developer to have an AI. And I love that you also believe everyone and every company should be able to build their own AI. That's why you open-sourced Llama, which I think is great. By the way, I think Llama 2 was probably the biggest event in AI last year. Here's why.

Zuckerberg

I thought it was the H100, but, you know.

Jensen Huang

This is a classic chicken-and-egg problem. Yes, which one comes first?

Zuckerberg

The H100. Although, actually, Llama 2 was trained on the A100, not the H100.

Jensen Huang

I say it was the biggest event because it activated every company, every enterprise, and every industry. Suddenly every healthcare company was building AI; every big company, small company, and startup was building AI. It gave every researcher a starting point to re-engage with AI.

And now that Llama 3.1 is out, it's exciting that we're working together to deploy it. We're bringing it to enterprises around the world, and I think it's going to enable all kinds of applications.

But tell me, where did your open-source philosophy come from? You open-sourced PyTorch, which is now the framework AI gets built on. Then you open-sourced Llama 3.1, and there's a whole ecosystem building around it. I think it's terrific. Where did it all come from?

Zuckerberg

Yes, there's a long history behind a lot of this. We've done a lot of open-source work over the years. Part of it, frankly, is that we started building our distributed computing infrastructure and data centers after some of the other tech companies. Because of that, by the time we built these things they were no longer a competitive advantage. So it was like, okay, we might as well open them up and benefit from the ecosystem around them. We've had a lot of projects like that. The biggest is probably Open Compute, where we published our server designs, our network designs, and eventually our data center designs. Because that became something of an industry standard, the supply chain basically organized around it, which saves everyone money. By open-sourcing it, we've saved billions of dollars.

Jensen Huang

Open Compute is also what made it possible for us to design the NVIDIA HGX for one data center and have it work in every data center. Awesome.

Zuckerberg

It was a great experience for us. Since then we've done it with a bunch of infrastructure tools, like React and PyTorch. So by the time Llama came along, we were already inclined to do the same for AI models. There are a couple of ways to look at it. It's been really fun building things at this company for the last 20 years, and one of the hardest things has been having to ship our apps through competitors' mobile platforms. On the one hand, the mobile platforms were a huge boost to the industry as a whole, which is fantastic. On the other hand, it's challenging to launch your products through a competitor's platform. I also grew up with the first version of Facebook being on the web, which was open. Then as it moved to mobile, the upside was that everyone now has a computer in their pocket, which is great. The downside is that we became much more limited in what we could do.

So when you look at these computing eras, there's a strong near-term bias where everyone just looks at mobile and thinks, okay, because the ecosystem is closed, Apple basically won and gets to set the terms. Yes, I know there are numerically more Android phones, but Apple basically has the premium end of the market, and all the profits concentrate there, while Android has mostly followed Apple's lead in development. So I think Apple is the clear winner of this generation, but it wasn't always that way. If you go back a generation, Apple was running its closed approach, but Microsoft, while not a completely open company, shipped Windows on all the different OEMs' hardware with all kinds of software, which formed a much more open ecosystem than Apple's. In the PC era, Windows was the leading ecosystem, and it can be seen as the open one.

My hope for the next generation of computing is that we return to a time when open ecosystems win. There will always be both closed and open options; both have their rationale and their benefits, and I'm not dogmatic about it. I mean, we'll do some closed-source projects too; not everything we release is open source. But for the computing platforms the whole industry builds on, I think there's tremendous value in the software being open. That really shaped how I think about AI and Llama, and about what we're doing in the AR and VR space, where we're building Horizon OS, an open operating system for mixed reality in the spirit of Android or Windows. Basically, we want all kinds of different devices to be able to work with it; we just want the ecosystem to get back to being an open platform, and I'm optimistic that the open one wins in the next generation. Selfishly, I also just want to make sure we have access. Over the next 10 to 15 years, my goal is to make sure we can build the fundamental technology we use to build our social experiences. There were too many things I wanted to build and was told by the platform provider, "No, you can't really build that," and at some point I just said, nah, fuck that. For the next generation, we're going to build all the way down and make sure of where it goes.

Jensen Huang

There go our broadcast opportunities.

Zuckerberg

I'm sorry. I guess you'll have to bleep that. You know, we were doing fine for 20 minutes, but get me talking about closed platforms and I get angry.

Jensen Huang

Hey, look, it's awesome. I think it's a great world where some people are committed to building the best AI and offering it to the world as a service, however they build it, and where, if you want to build your own AI, you still can. There's a lot of stuff I'd rather not do myself. I'd rather not make this jacket; I prefer to have this jacket made for me. The fact that leather could be open source isn't a useful concept to me. So you can have great closed services, incredible services, and open models too, and we basically get the whole spectrum. But what you did with 3.1 is really great: you have the 405B, the 70B, the 8B; you can use the larger models to generate synthetic data and to teach the smaller ones. The larger model is more general and less brittle, but you can still build a smaller model that fits whatever domain or operating cost you want.

And you ship Llama Guard with it, the guardrail model, which is amazing. The way you build the models is transparent: you have a world-class safety team and an ethics team, and you build it in a way that everyone can see it was put together properly. I really like that.

Zuckerberg

Yes, and I meant to finish a thought earlier before I got sidetracked. I do think there's an alignment here: we're building this because we want the thing to exist and we don't want to be cut off from some closed model, right? But it's not just a piece of software you build once. You need an ecosystem around it. If we hadn't open-sourced it, it would hardly work as well. We don't do this because we're altruists, although I think it helps the ecosystem; we do it because we think it will make what we're building the best it can be, by having a strong ecosystem around it.

Jensen Huang

Look how many people have contributed to the PyTorch ecosystem. Exactly. At NVIDIA alone, we probably have a few hundred people working on making PyTorch better, more scalable, more performant, and so on.

Zuckerberg

And when something becomes the industry standard, others build around it. All the silicon in these systems ends up being optimized to run it well, which benefits everybody, and it also runs well on the systems we're building. That's just one example of how this ends up working very well. So, I mean, I think the open-source strategy will be a good strategy. As a business strategy, I think people still don't quite appreciate it.

Jensen Huang

We love it, and we've built an ecosystem around it.

Zuckerberg

You guys have been great. Every time we ship something, you're the first to release it optimized and make it work. So, I'm grateful, man.

Jensen Huang

What can I say? We have excellent engineers.

Zuckerberg

And you always jump on these things quickly.

Jensen Huang

Yes, well, I'm an old guy, but I'm agile. That's what a CEO has to do. I recognized something important: I think Llama is genuinely important. We built this concept we call the AI factory, the AI Foundry, so we can help everyone build. A lot of people have a desire to build AI, and it's very important that they own it, because once they put it into their flywheel, their data flywheel, that's how their company's institutional knowledge gets encoded and embedded into an AI. They can't afford to have that AI flywheel, data flywheel, experience flywheel live somewhere else. Open source lets them own it, but many of them don't know how to turn the whole thing into an AI.

So we created this thing called the AI Foundry. We provide the tools, we provide the expertise, the Llama technology, and we help them turn the whole thing into an AI service. When we're done, they take it and they own it. The output is what we call a NIM, an NVIDIA Inference Microservice: they download it, take it, and run it anywhere they like, including on premises. We have a whole ecosystem of partners, from OEMs that can run NIMs to GSIs like Accenture, whom we've trained and work with to create Llama-based NIMs and pipelines. Now we're helping businesses all over the world do exactly that. It's a really exciting thing, and it was all sparked by Llama being open source.

Zuckerberg

Yes, and I think the ability to help people distill their own models out of the big models in particular will be a very valuable new thing. Because, as we discussed on the product side, at least, I don't think there's going to be one major AI agent that everybody talks to. I don't think everyone will use one model.

Jensen Huang

Right. We have a chip-design AI. We have a software-coding AI. Our coding AI understands USD, because we use USD for Omniverse and related work. We have one that understands Verilog, our Verilog. We have one that knows our bug database and helps us triage bugs and route them to the right engineers. Each of these AIs is fine-tuned from Llama, so we fine-tune them and we guardrail them.

You know, if we have an AI for chip design, we're not interested in it answering questions about politics, religion, and things like that. So we guardrail it. I think every company is basically going to have, for every function it has, an AI built for that purpose, and they'll need help doing it.
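The guardrailing Huang describes can be sketched as a check that runs before the domain model ever answers. Everything below is illustrative, not NVIDIA's implementation: real systems typically use a classifier model (e.g. Llama Guard) rather than keywords, and `answer_chip_design` is a hypothetical stand-in for a fine-tuned domain model.

```python
# Minimal sketch of a domain guardrail: refuse off-topic prompts before
# the fine-tuned model is invoked. Keyword matching is purely illustrative;
# production guardrails use a dedicated classifier model.
OFF_TOPIC = {"politics", "religion", "election"}

REFUSAL = "I can only help with chip-design questions."

def answer_chip_design(prompt: str) -> str:
    # Hypothetical stand-in for the fine-tuned chip-design model.
    return f"[chip-design answer to: {prompt}]"

def guardrail(prompt: str) -> str:
    # Refuse if any deny-listed topic appears in the prompt.
    words = set(prompt.lower().split())
    if words & OFF_TOPIC:
        return REFUSAL
    return answer_chip_design(prompt)
```

The same pattern generalizes: each per-function AI gets its own deny-list or classifier tuned to what that function should and should not discuss.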

Zuckerberg

Yes, and I think a big question going forward is to what extent people just use the bigger, more sophisticated models versus training their own models for their use cases. At the very least, I'd bet there's going to be a massive proliferation of different models.

Jensen Huang

The biggest reason is our engineers, whose time is so valuable. So we're optimizing the 405B now. As you know, the 405B doesn't fit on any single GPU, no matter how big. That's why NVLink performance is so important. Every one of our GPUs is connected through the NVLink Switch, a non-blocking switch; in an HGX, for example, there are two of those switches, and we make all of the GPUs work together to run the 405B

at full performance. We do this because the engineers' time is so valuable to us. We want to use the best models. If it costs a few cents more, who cares? We just want to make sure they get the best results.

Zuckerberg

Yes. I think the 405B is about half the cost of GPT-4 model inference, so at that level it's already pretty good. But people running on devices, or who want smaller models, are just going to distill it down. That's a whole different set of services.
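Distilling a smaller model from a larger one, as Zuckerberg describes, usually means training the student to match the teacher's temperature-softened output distribution. This is a toy sketch of that objective only; the vocabulary, logits, and temperature are all illustrative, and real distillation runs this loss over billions of tokens.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T softens the distribution.
    z = np.asarray(logits, dtype=float) / T
    z -= z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(teacher_logits, student_logits, T=2.0):
    # KL(teacher || student) on softened distributions, the classic
    # knowledge-distillation objective.
    p = softmax(teacher_logits, T)   # soft targets from the large model
    q = softmax(student_logits, T)   # student predictions
    return float(np.sum(p * (np.log(p) - np.log(q))))

# Toy next-token logits over a 5-token vocabulary.
teacher = [4.0, 1.0, 0.5, 0.2, 0.1]
student = [3.0, 1.5, 0.5, 0.3, 0.2]
loss = distillation_loss(teacher, student)
```

Minimizing this loss pulls the student toward the teacher's full distribution, which carries more signal than the single correct token alone.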

Jensen Huang

Think of the AI as on staff. Suppose the chip-design AI we "hire" costs $10 an hour. If you use it constantly and share it across a large group of engineers, then every engineer has an AI sitting with them, and the cost isn't high. We pay engineers a lot of money, so a few dollars an hour that amplifies someone's capabilities is really valuable.

Zuckerberg

You don't need to convince me.

Jensen Huang

If you haven't hired an AI yet, do it now; that's all we're saying. Let's talk about the next wave. One of the things I really love about your work is computer vision. One of the models we use a lot internally is Segment Anything. We're now training AI models on video so we can understand the world, for robotics and industrial digitalization, and connecting those models to Omniverse so we can better model and represent the physical world, and robots can operate better in those Omniverse worlds. And your application, the Ray-Ban Meta glasses, your vision of bringing AI out into the world, is really interesting. Tell us about that.

Zuckerberg

Okay, there's a lot in there. The Segment Anything model you mentioned, we're actually showing, I think, the next version of it here at SIGGRAPH, Segment Anything 2. It works now; it's faster; it works on video. And these are actually cattle from my ranch in Kauai.

Jensen Huang

By the way, those are Mark's cows.

So Mark came to my house and we made Philly cheesesteaks together. Next time, you bring something.

Zuckerberg

We did. It was really good.

Jensen Huang

High praise from the sous chef. Okay, listen, so then.

Zuckerberg

At the end of the night, you said, hey, you've eaten enough, right? And I said, I don't know, I could eat another one. And you went, really?

Jensen Huang

You know, usually when your guest says that, you make another one.

Zuckerberg

Yes, and I would definitely say, yes, we're making more. Did you get that?

Jensen Huang

Have you had enough to eat? Usually the guest says, oh yes, I'm fine.

Zuckerberg

Make me another cheesesteak, Jensen.

Jensen Huang

So, just to let you know how exacting he is. I turned around, I was preparing the cheesesteaks, and I said, Mark, cut the tomatoes. And I handed him a knife.

Zuckerberg

I am a precision cutter.

Jensen Huang

So he cut the tomatoes, every slice precise to the millimeter. But what's really interesting is that I expected all the tomato slices to be stacked, like a deck of cards. When I turned around, he said he needed another plate, because once he had separated one slice of tomato from another, they were not to touch again.

Zuckerberg

Look, man, if you want them to touch, you need to tell me you want them to touch. Good.

Jensen Huang

That's why he needs an AI that doesn't judge.

Zuckerberg

Yes, like.

Jensen Huang

So, that's cool. Okay, so here it's identifying the cows' trajectories. It identifies and tracks the cows.

Zuckerberg

Right, and you can make a lot of fun effects with this. And because it will be open, there will be lots of more serious applications across industries too. So, yes, I mean, scientists use these things to study coral reefs and natural habitats, the evolution of landscapes, and so forth. But being able to do this on video, zero-shot, interacting with it and telling it what you want to track, that's really cool research.

Jensen Huang

For example, here's how we use it. You have a warehouse with a whole bunch of cameras, and a warehouse AI is monitoring everything that's happening. Say a bunch of boxes fall, or somebody spills water on the floor, or some accident is about to happen — the AI recognizes it, generates text, and sends someone to help. That's one way to use it: instead of recording everything continuously and then, after an accident, going back through every nanosecond of video to retrieve the moment, it just records what's important, because it knows what it's looking at. So a video understanding model — a video language model — is very powerful for all these interesting applications. Now, what else are you going to do besides the Ray-Bans? Talk to me.

Zuckerberg

Then there are the smart glasses. When we think about the next computing platform, we break it down into mixed-reality headsets and smart glasses. Smart glasses are easier for people to understand and wear, because almost everyone who wears glasses today will eventually upgrade to smart glasses. That's the equivalent of more than a billion people in the world, so it's going to be quite a big deal.

VR/MR headsets — some people find them compelling for gaming or other uses, some don't. My view is that both will exist in the world. I think smart glasses will be the mobile version of the next computing platform, and a mixed-reality headset will be more like your workstation or game console, for when you sit down for a more immersive session and want access to more compute. Glasses have a very small form factor, so they'll have a lot of constraints, just as you can't do the same level of computing on a phone.

Jensen Huang

And it's happening right as all these generative AI breakthroughs arrive.

Zuckerberg

So basically, with smart glasses, we've been approaching this from two different directions. On one hand, we've been building what we believe is the technology needed for the ideal holographic AR glasses: all the custom silicon work, all the custom display-stack work — everything you need to make it fit in glasses, right? This is not a headset. It's not like a VR/MR headset. They look like glasses, but they're still quite far from the glasses you're wearing now; those are very thin. Even with the Ray-Bans that we make, you can't yet fit in all the technology you'd need for holographic AR. But we're getting closer, and over the next few years I think we'll basically get there. It will still be quite expensive, but I think it will start to be a product.

On the other hand, we started from the other direction: good-looking glasses, in collaboration with EssilorLuxottica, the best eyewear maker in the world. They manufacture, and they own, all the big brands you use — Ray-Ban, Oakley, Oliver Peoples, and a few others. They're kind of like the EssilorLuxottica of eyewear — the NVIDIA of it.

Jensen Huang

The NVIDIA of glasses.

Zuckerberg

I think they might like that analogy. But who wouldn't, at this moment? So we've been working with them on the Ray-Bans; we're on the second generation. The goal is: okay, let's constrain the form factor to something that looks great. Within that, let's pack in as much technology as we can, understanding that we won't technically get to where we ultimately want to be, but that in the end they'll be great-looking glasses.

We now have camera sensors, so you can take photos and videos. You can actually go live on Instagram. You can make video calls on WhatsApp and stream what you're seeing to the other person. It has a microphone and speakers — the speakers are actually very good. They're open-ear, so many people find them more comfortable than earbuds. You can listen to music, like a private experience; that's pretty good, people love it. You can take phone calls on them. And then it turns out the sensor package is exactly what you need to talk to an AI. If you had asked me five years ago whether we would get holographic AR before conversational AI, I would have said, yes, probably. It seemed like just steady progress in graphics and displays — in all the virtual and mixed-reality work and building new display stacks — and we were constantly moving in that direction.

And then the breakthrough with LLMs happened, and the result is that we now have very high-quality AI, improving at a very fast rate, before holographic AR.

Yes, it's an inversion I didn't expect. We're fortunate to be well positioned because we were working on all these different products. I think you'll end up with a range of eyewear products at different price points and different levels of technology. So based on what we're seeing with the Ray-Ban Metas right now, I would guess that display-less AI glasses at a $300 price point are going to be a really big product that tens or hundreds of millions of people will eventually own.

Jensen Huang

So you'll have super-interactive AI to talk to. Yes, visual — you've just demonstrated visual language understanding. You have real-time translation: you can talk to me in one language, and I hear another.

Zuckerberg

And then a display would obviously be great too. But it adds weight to the glasses and makes them more expensive. So I think there will be a lot of people who want the holographic display, and there will also be a lot of people who want something that ends up being very thin glasses.

Jensen Huang

Well, for industrial applications and some work settings, we need that.

Zuckerberg

I think there's a consumer case too. What do you think? I thought about this a lot during Covid, when everyone was somewhat remote. You were spending all your time on video calls — it's great that we have that. But in the future, we're not far from being able to have virtual meetings where, say, I'm not physically here; it's just my hologram, and it feels like we're both there, and we can work and collaborate together. I think that's going to be particularly important.

Jensen Huang

With the AI, I could live with a device like that — one I don't have to wear all the time.

Zuckerberg

But I think we'll get to the point where it really works. It will fit in glasses — thinner frames, thicker frames, all kinds of styles. We're still a while away from full holographic glasses, but I don't think a stylish pair with slightly thicker frames is that far off.

Jensen Huang

Face-sized sunglasses are in style now. I can see it.

Zuckerberg

That would be a really useful style.

Jensen Huang

Yes, very helpful.

Zuckerberg

Maybe I'll become a style influencer, so I can influence the glasses before they hit the market.

Jensen Huang

Well, I can see you're trying. How is the style-influencer career working out for you?

Zuckerberg

It's still early. I think a big part of the future of the business is making fashionable eyewear that people want to wear. It's something I should start paying more attention to now. So, yes — we may have to retire my habit of wearing the same thing every day. But that's the thing about glasses: even more than with a watch or a phone, people really don't want to look the same, right? So I do think it's a platform, and — going back to what we talked about earlier — I think it's going to be an open ecosystem, because the diversity of form factors and styles people will want is going to be enormous. Not everyone is going to want to wear the same glasses, no matter who designed them. I don't think that's viable.

Jensen Huang

I think that's right. Well, Mark, we're living through a time when the entire computing stack is being reinvented — how we think about software, which is what Andrej Karpathy calls Software 1.0 and Software 2.0; now we're basically in Software 3.0. We're moving from general-purpose computing to generative computing done by neural networks. The features and applications we can develop now were unimaginable in the past.

And this technology, generative AI — I can't remember any other technology that has impacted consumers, enterprises, industries, and science at such a rapid pace, spanning every field of science we touch, from climate tech to biotech to the physical sciences. Generative AI is at the center of this fundamental shift. Beyond that, as you were saying, generative AI will have a profound impact on society — on the products we're making. One thing that really excites me: someone asked me whether there will be a Jensen AI. Well, that's exactly the creator AI you were talking about. We just build our own AI — I load in everything I've written and fine-tune it on the way I answer questions. Hopefully, over time and with use, it becomes a very good assistant and companion for a lot of people who just want to ask questions or bounce ideas around.

It will be a version of Jensen that, as you said earlier, isn't judgmental, so you're not afraid of being judged and you can come interact with it anytime. I think these are incredible things. And we write a lot — just give it three or four topics, the basic themes I want to write about, have it draft in my voice, and use that as a starting point. There's so much we can do now. It's been really great working with you. I know it's not easy to build a company. You've moved your company from desktop to mobile, to VR, to AI, across all these devices — that's really unusual. I've watched it happen multiple times, and I know how hard it is. We've both taken plenty of kicks over the years, but that's what it takes to be a pioneer and to innovate. So it's great to watch you.

Zuckerberg

Likewise. I'm not even sure it counts as a pivot if the thing you were doing before keeps going — you add to it. There are many more chapters to all of this. And it's been interesting to watch your journey. We went through this period where everybody thought everything was going to move to these small devices and computing would get super cheap. And you kept at it — it turned out you actually want these big systems that can parallelize. We went the other way.

Jensen Huang

Yes — instead of building smaller and smaller devices, we built big computers.

Zuckerberg

It was a little unfashionable for a while.

Jensen Huang

Super unfashionable. But now it's cool. We started out building graphics chips — GPUs. Now, when you deploy a GPU, you still call it that, a Hopper H100. But when Mark deploys H100s, he has hundreds of thousands of them in his data centers, on the verge of reaching 600,000.

Zuckerberg

We're good customers. That's how you get a Jensen Q&A.

Jensen Huang

Wow, wait a minute — a Q&A with Mark Zuckerberg. You're my guest, and I appreciate you doing this.

Zuckerberg

It was like, "we'll be doing this thing at SIGGRAPH in a few weeks, in Denver." And I was like, I don't think I'm doing anything that day. Sounds fun.

Jensen Huang

Exactly — he wasn't doing anything that afternoon, so he just showed up. It's been incredible. These systems you've built are huge systems; they're hard to orchestrate, hard to run. You say you got into the GPU journey later than most, but you operate at a bigger scale than almost anyone, and it's incredible to watch. Congratulations on everything you've done. And you're quite the fashion icon.

Zuckerberg

It's a work in progress.

Jensen Huang

Ladies and gentlemen, Mark Zuckerberg.

Zuckerberg

Thank you.

Jensen Huang

Hold on. The last time Mark and I got together, after dinner —

Zuckerberg

We did.

Jensen Huang

We swapped jackets, took a picture, and it went viral. Now, I don't think he has any problem wearing my jacket. I don't know — is that what I look like? I don't think so.

Zuckerberg

Should be.

Jensen Huang

Is that right?

Zuckerberg

Yes, actually, I made one for you.

Jensen Huang

You made one for me?

Zuckerberg

I mean, hey, let's see. We have a box here. It's black leather with sheepskin. Oh — I didn't make it myself. I just ordered it online.

Jensen Huang

Wait a minute, it's a bit cold in here. I think I'll try this one on.

Zuckerberg

Oh, my God. This is the vibe you need.

Jensen Huang

Is this me?

Zuckerberg

It looks great. I'm into it. Somebody get this man a chain.

Jensen Huang

All right.

Zuckerberg

Next time I see him, he'll be wearing one.

Jensen Huang

Fair enough, right? I'll tell you — I told everyone that Lori bought me a new jacket to celebrate this year's SIGGRAPH. SIGGRAPH is a big deal at our company; RTX was launched here. So this is a brand-new jacket — it's literally two hours old. So I think we've made another jacket swap.

Zuckerberg

That one's yours. And this one is worth more, because it's used.

Jensen Huang

Let's see. I don't know — I think Mark is pretty strong. Look, the guy's strong.

This article is from Wall Street News.