E2: HUGE OpenAI News, Qualcomm Beats Apple M3, Nvidia's Next Chips & More

Captions
There has been a lot of incredible news around generative AI over the last few weeks, so I figured we'd put together another quick podcast episode talking about OpenAI's DevDay presentation, what it could mean for the technology of the future, and of course how it's going to affect the stock market. Your time is valuable, so let's get right into it. I want to give a little context for how fast things are moving. OpenAI released ChatGPT less than a year ago, on November 30th, 2022, and at the time it was powered by GPT-3. Less than a year later, ChatGPT has 100 million weekly active users and 2 million developers using their API. They shipped GPT-4 back in March, plugins in May, and ChatGPT Enterprise in August of 2023. So yesterday, at their first DevDay event, they announced GPT-4 Turbo, which comes with some pretty big upgrades. The context window for GPT-4 Turbo is going to be a factor of 16 bigger, which means it can now remember an entire 300-page book. You can now seed the model, which means the same exact answers given the same exact inputs, so that introduces an element of reliability into the model. The knowledge cutoff was moved from September 2021 to April 2023, and OpenAI has promised to keep this knowledge cutoff more up to date over time. GPT-4 Turbo can take images as inputs and give audio as outputs via speech-to-text and text-to-speech. They increased the rate limits, which basically doubles the number of calls you can make per minute, and they have a much better pricing structure: GPT-4 Turbo will be over two and a half times cheaper than GPT-4 was when it came out, on a per-token basis. So not only is GPT-4 Turbo a much more capable model, it's also a much cheaper model. I think that's a good place to kick off the conversation. What do you think about these insane developments from DevDay?

The OG in generative AI is back, right, and taking the lead. What I found, and probably our viewers are similar, is that we ended up trying a few different tools. I used ChatGPT often because it had plugins and things, but I also had Claude, because Claude could take a much bigger prompt. Even in terms of research for our channel, we use it to summarize videos. I had a 40-minute video once that I wanted to summarize based on the transcript; it was too long to do in ChatGPT, but I was able to dump it into Claude because Claude could take a larger prompt. So some of this is them closing the gap. But the knowledge cutoff move is a really big step forward too, because September 2021 was always a real weak point. That was when you felt like, ah, this is still experimental, this is still beta: you'd ask for any kind of factual information and it was over two years out of date. So I'm glad they've improved that. I think this is no longer a science experiment; now it's about monetization. How can we make it more appealing, how can we come up with different price points to make it work? And I think that's a theme for some of the stuff we'll talk about in a while as well. It feels like OpenAI, you know, ChatGPT, is growing up, basically, is how I'd summarize it.

I want to double-click on that for a minute, because with lower prices comes way more adoption, right? I do think that some of GPT-4's pricing structure made it cost-prohibitive for, say, smaller creators and small businesses, but this new pricing structure lowers the barrier to entry for people trying to use generative AI. For me, one of the biggest upgrades is definitely the context window, the ability to give it a lot more data. I think that's going to be a huge game changer, because that's the other way we were limited in the past: when we were using GPT for research, we would run out of space to feed it new information about a topic we wanted to cover, and it would start forgetting things as we rolled through the conversation.

On top of that, they announced a whole new paradigm of computing altogether. A lot of DevDay was focused on these AI agents, which are also called GPTs, and the app store that OpenAI plans to launch and then maintain over the next few months. They talked about GPTs being new versions of ChatGPT tailor-made for a specific purpose, including custom instructions, custom expanded knowledge, and the ability to take actions via third-party APIs: think Zapier, Canva, Google Maps, Microsoft Outlook, and so on. So now these GPT-powered AI agents can plug into different apps and run different actions on your behalf, with your permission. And then they announced their GPT Store, which is where people can publicly list the GPTs they build for others to download, and they promised a revenue-sharing model for people who build the most useful and most-used GPTs. So not only are we talking about GPT-4 Turbo, we're talking about these AI agents becoming the future of apps. What do you think about that?

There are a lot of interesting thoughts there. I think about how Apple revolutionized things by launching a marketplace: by doing that, it opened up and democratized, if you like, the app building, but it also created a really key revenue driver for them, and it looks like OpenAI is doing something similar. The other side to this is I kind of feel sorry for people who tried building businesses after the first version of ChatGPT came out, because as OpenAI expands what GPT can do itself, there are probably entire companies or business ideas being taken out. We've talked about brand-new businesses being launched, but some of this functionality is going to step directly on the toes of that. The pace of change is going to require some businesses to pivot strongly and change their approach. Interesting time frame.

It's funny, when you think about the cycles of innovation, you have the
innovators and then the people getting disrupted, and usually those legacy companies are much older. But what you're basically saying is that yesterday's innovators are today's laggards because of how fast things are changing. You have to really pick your spot and be able to predict where the market is going, to make sure you're not getting disrupted by the time you even find product-market fit in the first place. So I do think we're going to see a lot more adoption of these GPTs than we did with the early models, the prompt engineering, and the wrappers that came on top of ChatGPT, because these AI agents will allow actions, a lot more diversity in the knowledge base, and a lot more creative use of these large language models overall.

Is the tail wagging the dog in some cases? Because gen AI is the hottest show in town, everyone's talking about it, we've built this capability, and now we're trying to connect things into it. But sometimes maybe it should be thought of another way. I'll give you an example with Zapier: Zapier can be used to integrate with stuff further downstream, but actually, should ChatGPT be the one coordinating that? Should these large language models just be embedded within some of these other applications? We've seen software companies do that. So what will things look like in a few years' time? Will we see lots of different specialized versions of these models as their own apps hooked into other stuff, or is this really a way to get the message out there and get things circulated? Or, and maybe I'm speaking more from an enterprise view, will every app have its own embedded version of a large language model? I guess it could go either way, really.

My current prediction is that both things will happen. For example, on the Microsoft side, we're seeing them launch copilots, whether that's the Windows Copilot or the Microsoft Excel instance of that copilot that you can talk to right in Excel and have it do things for you, have it help you problem-solve, but also teach you how to solve the problem in the first place: take this average, format your data like this, make things look like that in this report. In that example, you have the copilot or GPT agent embedded in a specific app, and its knowledge base and the actions it can take are all tied directly to Microsoft Excel. On the flip side, you can imagine a more generalized GPT agent, sort of like a virtual assistant, that's able to plug into many different apps and coordinate things for you, just like we saw in that DevDay demo where it was accessing the demonstrator's work calendar but also her cell phone, to send texts to whoever she needed to about meeting conflicts and leaving early. So I do think we're going to see copilots at every level of the work we do, and they won't necessarily compete; they'll augment each other, and people will have many assistants depending on the tasks they do, plus sort of one master virtual assistant that knows a lot about them and knows exactly how they want to get those things done.

Great take. One that's personalized, that knows you, knows your history, knows your role, and so on, and can connect into these other ones to get the job done. Yeah, I agree with that.

Right, and knows how to orchestrate between all these other in-app copilots. What was the biggest takeaway from the show for you, between the app store, these new GPT agents, GPT-4 Turbo? Was there something that stood out above the rest?

Yeah, probably something that's neglected and maybe seen as a small thing, but just being able to interact with voice and get voice back is actually a big deal. At the moment ChatGPT appeals to those coming from an engineering background; we were fine with a black screen with a limited text prompt. But in terms of having things widely adopted, being able to speak to this and get a response raises the game. So what they unveiled around voice input and image input, but also text-to-voice, the synthesis of that voice was really accurate. And even for content creators, that's a really interesting thing. Is there a different paradigm now? On our next podcast, could one of the guests be ChatGPT? We just ask its opinion on something and it tells us, in real time, in voice. Who knows?

That's a really good idea. We should have ChatGPT on and see how well we can get it to converse as if it were a regular podcast guest. That's a great idea. For me, I was really impressed with all of the upgrades they added for GPT-4 Turbo, combined with the price cuts, and I think that's going to push the whole industry forward in terms of, whether you want to call it capabilities per dollar or tokens per dollar, whatever you think the metric is. I think we're going to see ChatGPT on GPT-4 Turbo overtake all of these models in short order.

Oh, I forgot one last thing. The very last thing Sam Altman said was that we're already planning the future, and it will make the stuff we're building now look quite quaint, I think were his words. So it's almost like this is going to look like beta version 0.1. But the fact that he committed to more DevDays, and the fact that they're at the stage where they're making price cuts, should give everyone confidence that this technology is here to stay. And the reason I say that is there were some articles saying, you know, OpenAI is burning billions of dollars every month, and the cost of running this stuff made it sound unsustainable. But from what I see here, they're in this for the long term. There is a business model here. I don't think it's just being propped up
with Microsoft investment dollars. It sounds like they're quite confident there's a scalable business here as well.

I think one thing we should talk about is generative AI as a fad versus generative AI as a real productive tool. What DevDay really showed us is 100 million users and two million developers using this API for a wide variety of tasks across a wide variety of industry verticals. It's really hard to keep calling generative AI a fad, in my opinion, based on how much it's controlled the narrative of technology going forward and how much investment some of the biggest companies in the world have made into it, whether we're talking about Google launching Bard and PaLM or Amazon's Bedrock. At what point do we stop thinking of generative AI as the next metaverse or the next Web3 and move it from fad and flash-in-the-pan to the next era of computing? What needs to happen for us to do that?

I think it's about a revenue story. Investors are nervous because they've seen flashes like this before, like the dot-com bubble. Maybe it's a cliché, but they've seen things where there's a lot of hype and a lot of movement in the market; take 2021, maybe, as well. There are times when things happen very quickly, and that makes people nervous about whether it's sustainable. The way Nvidia's stock price jumped, the way their revenues jumped, has people wondering: is this sustainable? I think what will move the needle is when the providers on the software side also start seeing revenue increases off the back of this. It'll be when Microsoft Copilot starts to get wide-scale enterprise adoption and you start seeing a new line in their earnings call about Copilot, or maybe they lump it into Office 365, but you see a revenue spike. That's when people will say, okay, this is real and it's here to stay.

I definitely agree with that. Let's talk about some of the companies in the stock market that we think will benefit the most, first in the near term, say over the next year, and then in the far term, as generative AI becomes more of a mainstay technology that both consumers and enterprises use. Where do you think the short-term winners are?

Right now everyone's going to look at Nvidia to start with; it's the technology that enabled this whole area, so it's the picks-and-shovels company. Beyond the Nvidias, there's a massive reliance on cloud data centers for you to be able to do any of this, so all of the companies in the cloud, either providing the cloud infrastructure or the connectivity between clouds, are the ones who will capture it first. Second to that will be those monetizing this, or able to increase their revenue from it. Personally, I think of Microsoft first, because they've got this into an enterprise state and at least have an offering they can charge for, but Meta and these other companies are going to start having products based on generative AI too. Also AMD; we're going to talk about this in a bit, but there are other chip companies trying to get in on this as well. So I think they will have meaningful revenue, probably in 2024, as a result of this gen-AI wave.

So you're talking about AI training in the cloud, and capturing the revenue that comes from building businesses on top of these large AI models. But I think a lot of consumers, enterprises, and businesses will also make a lot of money just by using these models. So I do think there's a case for looking for the winners in the inference markets, where the real value is that increase in productivity, that new capability to quickly generate results from the model that help you generate value as a business.
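To put the DevDay price cuts mentioned earlier in rough numbers, here is a minimal sketch in Python. The per-1K-token prices are the figures announced at DevDay (GPT-4 at $0.03 input / $0.06 output, GPT-4 Turbo at $0.01 / $0.03); treat them as assumptions and check OpenAI's current pricing page before relying on them.

```python
# Per-1K-token prices in USD, as announced at DevDay (assumed figures).
GPT4_IN, GPT4_OUT = 0.03, 0.06      # original GPT-4
TURBO_IN, TURBO_OUT = 0.01, 0.03    # GPT-4 Turbo

def request_cost(input_tokens: int, output_tokens: int,
                 in_price: float, out_price: float) -> float:
    """Cost of one API call given per-1K-token input and output prices."""
    return input_tokens / 1000 * in_price + output_tokens / 1000 * out_price

# Example: summarizing a long transcript (~100K input tokens, ~2K output),
# a request the old 8K/32K context windows couldn't even accept in one call.
old = request_cost(100_000, 2_000, GPT4_IN, GPT4_OUT)
new = request_cost(100_000, 2_000, TURBO_IN, TURBO_OUT)
print(f"GPT-4: ${old:.2f}  GPT-4 Turbo: ${new:.2f}  ({old / new:.1f}x cheaper)")
```

On these assumed prices, the same long-context request drops from about $3.12 to about $1.06, which is roughly the "over two and a half times cheaper" figure discussed above.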
So I think the actual timeline will look a little more like cloud computing, and in parallel with companies generating revenue by using these models, the chip providers and infrastructure providers on the inference side of the equation will benefit. And that's actually a really good segue into our next topic, which is that the PC CPU race is heating up. Let me give a little context here. The MacBook switched from Intel's x86 chips to Apple's custom Arm chips back in 2020, and since then the MacBook has roughly doubled its market share in laptops. I think this really legitimizes Arm chips for laptops and PCs in general, not just phones and tablets. Microsoft is coming out with new Arm-based Windows PCs to compete with the new MacBooks, and Qualcomm's deal to be Microsoft's exclusive chip provider for mobile devices ends in 2024. So I want to spend a minute or two comparing all of the new chips that have been announced over the last few weeks and talking about what this could mean for consumers and for generative AI at the edge. Qualcomm just announced their Snapdragon X Elite processor at their Snapdragon Summit on October 24th. It's 13.5% faster per thread than Apple's M2 chips, and it can match those chips' performance using 30% less power, so a big step function up in terms of performance per watt. It's also a little faster than Intel's best chip at the time, or it can perform as well as that chip at a whopping 70% lower power. This performance-per-watt step function upwards is exactly what we need to run larger and larger models at the edge. Nvidia and AMD are also reportedly developing Arm-based CPUs for Windows PCs, so they'll be competing with Qualcomm and Intel. And separately, IBM has a new prototype chip called NorthPole, which does its computation right in memory; that prototype showed a 30x efficiency boost for inference, and it was just shown off on October 19th. So what do you think about all of the new competition entering the race for PCs starting in 2024 and 2025?

It all seems to have come out of nowhere, all at the same time. I don't know if this was planned by AMD and Nvidia, but there was something leaked in the press to say they're planning to build Arm-based CPUs in 2025, and the same week, I think, Qualcomm had a launch event for their Snapdragon processors as well. Suddenly everyone was talking about Arm processors. It's a really key topic, and historically, if you say the word Arm, I picture mobile or tablet: a small chip that's very power-efficient, built using the latest technology nodes, with really long battery life. I don't think of a PC. But that's already been challenged, with Apple bringing out their earlier M1 processors, then M2, and they announced M3 recently. So things are already changing towards having these Arm processors in really powerful systems. But honestly, when I heard the opening statements from the Qualcomm CEO where he was announcing these chips, he was so bullish. He said, we've got the fastest chips in the world, with no caveat: including anything on x86, including Apple, he covered everything. Some of us were looking at these figures going, are they real? So out of nowhere, and maybe I was sleeping on this, Arm processors are at the top, everyone's talking about them, and suddenly they seem to be the future, with energy efficiency being such a crucial thing these days. It seems like overnight things have morphed quickly.

Yeah, I wonder if this is a little bit of the tail wagging the dog. Previously we would use PCs for things like knowledge work, whether that means creating content or writing reports, and we would use them for gaming, and for those use cases it seems like all of these advancements aren't really necessary. So do you think that generative AI is a whole new use case for edge devices and PCs, or do you think this is just a new way to extract more dollars without really providing additional value for the consumer?

Yeah, probably the latter. It's probably the fact that we're moving towards mobile, for gaming, for productivity, even for what you just described. Work used to mean a desktop PC sat in a physical space that never moved; now work is very much a mobile device on the move, and mobile can mean your phone as well as your notebook. As we shift into that paradigm, battery life and performance at a low power point become more important, and it feeds perfectly, like you were saying, into the fact that we can do more at the edge now, more AI workloads at the edge. Slightly off topic, but also 5G and 6G, these improvements in communications, mean you can do more and more things without being tied to a physical place. So it's morphing what's possible, because the nature of work has become mobile, and gaming is potentially moving in that direction as well.

So the bet is that these AI-based use cases are coming so fast that by the time these chips hit the market, there will be a real need for them: a need that drives more of the workforce mobile and allows more types of computing to happen at the edge, especially in laptops, so there will be a real market by the time these hardware upgrades enter the PC space in 2024 and 2025. Is that right?

Yeah, exactly. And one of the demos shown on the Qualcomm day was the integration between your mobile, your tablet, and a laptop: just seamless connectivity. They're all speaking the same language, they're all on Arm, and it makes things more seamless. It felt like, actually, that's how it should work: you should be able to plug your phone directly in and have everything just work and sync up seamlessly, and be able to take calls and things like that in a really easy-to-use way. When I saw that, I felt like they're on to something. This is where things are moving; it will be more of a seamless mobile experience, and the line between laptop, tablet, and phone is already a bit blurred and will probably get even more blurred as we go forward.

That's a great point, and I've always struggled with understanding how I should be thinking about this market. When we talk about the winners in PC CPUs, my question is: should we also be including Apple, or is Apple its own separate ecosystem? Let me explain why I'm asking. Almost right after that Qualcomm event, Apple announced their M3 family of chips during their Scary Fast event on October 30th. So because Apple is on its own separate ecosystem, is it its own market entirely? Or, since we're always talking about consumers making the choice between Apple's ecosystem and the Windows ecosystem, should they be considered part of the same market? How do you think about this market?

Tricky, isn't it? It's Apple's fault for being so proprietary, but it's kind of the way they roll. You're right, they literally lock you in; it's almost like a religion. There are the designers and content creators who will only use Apple, and there are the knowledge workers and so on who are tied into Windows-based platforms, and they have been forever. It doesn't need to be that way; there are commercial reasons for that rather than anything technology-wise. It should be seamless, it should be one combined market, but the fact that this behemoth company has gone down that route means we probably do need to think about them as two separate worlds. And it's weird: I didn't realize Apple was quite behind on gaming; from a gaming point of view, there were improvements on the Apple side where they were just catching up with Windows PCs. So it is two separate worlds, and I don't see any movement where they're going to be merged
together in the near future I think so when we talk about Winners it's kind of an apple and conversation right for example on uh open AI Dev day that even though he uh Sam Alman was standing next to SAA Nadella right Microsoft CEO right behind him was a Macbook which is how they were running all of the demos and the whole event so I thought that was pretty funny but that just goes to show I think people make the choice based on ecosystem first and what Windows would really have to do is give people a compelling reason to choose them over Apple for their next computer and I think that's going to be really a tall order but things like maybe the GPT app store or the windows 11 co-pilots and things like that will start moving certain pockets of knowledge workers over to Windows PCS in which case that would grow that market what do you think is uh the sleeper winner uh in this area I just love how Intel just brushed this off how Pat Ginger when he directly about arm CPUs went yeah we've been talking about arm and windows for ages whereas Intel and apple breaking up and apple moving away like that's been a big issue Intel lost a load of Revenue as a result of that so for me it's almost like I think Pat's words are going to come back to haunt him and I think qual comp their um exclusivity is about to run out but the chips that they announced just now are going to be available in 2024 they get at least kind of about a year ahead start we think before envidian AMD start releasing into this area so I think kcomm is really well positioned with some really strong offerings and I'm really Keen to see when these are real when they come out in laptops from from HP ASA Lenova and others how does that really Stack Up against a you know a an x86 based processor from AMD or Intel and I think it's exciting because another thing they announced along with these announcements was there's there's AI accelerators within these chips as well so they're also focusing on this whole AI at the 
edge thing so you know we might get laptops that are really exceptional in the next in the next six months I think the consumer wins the person who wins really from this is the consumer There's real competition now like hotting up in this space it's been a two horse race for too long in terms of windows with AMD and Intel it looks like Qualcomm is going to really give them a a run for their money if I had to make a guess today I would be long Qualcomm and long Nvidia and I would be short AMD and Intel right so short Intel because of what you just said Pat Ginger and probably all of Intel's leadership right now underestimating the power of arm CPUs for Windows PCS and the amount of damage that they could do to Intel's grasp on the laptop processor Market AMD I think can actually come out with really good power efficient arm chips amd's problem is that'll cannibalize some of their existing x86 business so they're going to be trading $1 doll for another for a while as they ramp up their arm production so that's going to be a tough order for them they're going to be gaining business in some areas and while they cannibalize it and lose it in others I agree with you completely that Qualcomm is currently like The Benchmark setting arm-based mobile processor uh Builder and designer so I think that's an easy easy win and I think Nvidia would be the second place for me they've had the Nvidia tgra chips in the past but importantly they're good at building on the arm architecture we know they wanted to BU buy arm uh a couple years ago for $40 billion would have been one of the biggest Tech Acquisitions ever if it happened and on top of that they don't currently have a big x86 business in laptop processors right so when Nvidia makes new processors even if they don't get a large Market market share this is a new Revenue stream for NVIDIA it's qualcomm's one of their main revenue streams right now as opposed to Intel and AMD which will have to at least cannibalize some of an 
existing Revenue stream to keep up in this market so I think another reason there's a risk for AMD here is they in their last quarterly results it was the PC segment that really bounced I think the revenues in there were up over 40% so they're just experiencing an acceleration in that area that client area so yes there's an opportunity because they can build on process but there's a risk there that this new area that's where an area where they're getting more Revenue in and is growing nicely um or it did in the last quarter you know that potentially is at risk and it's an area that has just grown for them with Intel they've already seen declining shares or declining sales or the trend has been down in terms of PCS over a period of time the latest quarter I think they looked better but it was still quite low from where it's been whereas for AMD if that's part of their growth story it could be disrupted quite a lot yeah and and just just people listening who may not care about x86 versus Arm specifically what really matters is arm is a much more power efficient architecture that's why you find it a lot more in tablets and cell phones today so when Intel is ignoring those benefits and Intel has shown time and time again that that's where they struggle to perform on that performance per watt with their x86 architecture so I do think that if they don't change their tune and start at least acknowledging hey we need to come out with very power efficient processors even if it's on x86 that's where they'll start end up ending up losing market share to people who want to buy new PCS specifically for these chat GPT agents the GPT App Store and ecosystem running GPT for Turbo and all the uh AI co-pilots that we're seeing Microsoft talk about and come out with so just connecting it to our earlier topic about open AI Dev day and that actually brings me to one more question I want to ask before we move on to the data center side do you think that the device of the future is still 
going to be smartphones or do you think that apps turning into gpts or agents is going to change the nature of mobile Computing alog together yeah really I mean even the form factors of what mobile Computing even means are changing you know wearable technology like glasses smart glasses and things like that it's really broadening out to the point where it's really hard to predict right now we're seeing real wearable technology that has advanced you know capabilities in the device itself like we said that will be connected to the cloud like we said with the increase in processing on the edge it'll be able to do so much stuff in the device itself so I think we we will maybe the mobile phone form factor I mean why do we even have it the reason it's a phone is for us to be able to make like audio calls almost the actual form factor is back from when it was a physical phone whereas uh you know most people probably hardly spend much of their time actually making phone calls on their phones these days funly enough so maybe it will change maybe it be integrated into glasses or headsets that are becoming Slimmer and more affordable or you know maybe we'll have a brain chip implanted into our head like you know some people are working on and that will be the human human AI interface I do think that there is a case for I think we will always have phones even if they just sit in our back pocket and sort of act as like the big processor that we have with us but I can definitely see the case for smart glasses uh that tie in they see what you see they hear what you hear you can talk to them and uh what they do is they take your audio and video and location uh in context of the world around you as inputs and they can provide you value added information hey here's what you're staring at here are the directions for where you need to go here's a visual answer to the question you just asked me about whatever's right in front of you so you can think about a wide variety of use cases 
where all of a sudden LLMs play a major role in both the compute and the structure of the problem in ways we're not getting with PCs today. So I do think we're going to keep getting more and more of these edge devices, like the Apple Watch and so on, and we'll have a total computing ecosystem that we wear and that helps us interpret the world around us. I think that would be really cool. Yeah, controlled by one kind of overarching AI running on our phone, where all the resources are, controlling it all. Yeah, that sounds really interesting. And if you think about it that way, I do think Apple has a huge leg up there, because nobody's better at having multiple peripherals talk to each other. They seem to own that space of a multi-device ecosystem working together to provide a really smooth and elegant solution. So, moving completely away from the edge devices: even though the edge device market and PC CPUs are heating up, the AI data center market is already on fire, right? Nvidia has been dominating the data center GPU market with their H100 GPUs. For a little context there, they doubled their total revenue last quarter year over year and almost tripled their data center revenue. They became a trillion-dollar company on the back of the H100s, and they've expanded their gross margins to over 70%, which is absolutely insane for a hardware company. So no doubt a lot of different players are going to try to dethrone Nvidia and capture some of those margins for themselves. To that end, AMD's CEO Lisa Su predicted that their MI300X GPU will be the fastest product in AMD history to ramp up to a billion dollars in sales. Separately, Intel's Gaudi 2 chip has been shown to outperform the H100s for training multimodal AI models by about 40%, and Qualcomm's Cloud AI 100 chips outperform Nvidia's H100s by about a factor of two on the inference side. How worried do you think Nvidia should be? Are they still going to be the leader? What do you think? I think,
so first of all, Nvidia is winning and will win for a while. I think what's going to play into this, though, is supply constraints. Right now they can't make these H100s quickly enough; the demand is there, but it's about the supply. So if Nvidia can't fulfill the supply in time and there's excess demand, that opens the door for competition to get in. Now, we've kind of joked about how Intel managed to find at least one way where you can measure the Gaudi 2 chip as quicker than the H100, but generally the H100 beats Gaudi and it beats the other chips out there. So the performance leadership is there, it's real, Nvidia's numbers are real, but the question is whether that's enough. And if it's not enough, that opens the doors for other competitors who are maybe not as good as Nvidia but are good enough, especially if they come in at a lower price point. In terms of performance per dollar or performance per watt, there are other ways you can chip into this. So I think it would be wrong for investors to think that because Nvidia has got such a strong roadmap, even if they are the leader for the next two to three years, it means AMD, Intel, and Qualcomm cannot have a significant increase in revenue from this area. I think all of those chip makers will benefit from increased data center sales, and they'll each find their niche. Let's make a prediction: Qualcomm will have the most energy-efficient server chips out there, while AMD will manage to find a way to come in cheaper on price and make the performance-per-dollar equation work, so they're a viable solution as well. And Intel, well, Intel have got Gaudi 2 now, but Gaudi 3 is around the corner, and they're able to find partnerships with the right companies to get this stuff out there. What about the other massive megatech cloud providers as well, right? What's their story? What are they going to do?
They're not thrilled about being tied into having everything on Nvidia, so those cloud providers will want to diversify. Will they want to try and build their own chips as well? There are multiple areas where Nvidia's share could be bitten into, but then part of me feels like there's enough pie to go around, where Nvidia can still have a really strong story and continue to grow and dominate, while these other providers still earn meaningful revenue from this area. That's what I think. I've been thinking about this problem more from a data center product owner's perspective. I think the way it'll shake out is that Nvidia will be the best general platform for high-performance computing, for AI training, and for all of these massive high-intensity workloads, right? And then data centers will have to balance how much of that they want versus more specific use cases for certain chips, and that'll depend on their client mix, the size of their clients, and the demand from their clients. Let's say multimodal models end up being the mainstream type of large language model. All of a sudden, we could see way more demand for Intel's Gaudi chips, just because that's something data centers will want to provide more compute for specifically, and it'll move those workloads off of Nvidia's H100s, which will free up those resources for workloads more appropriate to each chip, right? And that's just based on the demand that each data center is seeing as well. I think that's a really good way of looking at it. And also, if you were to go on Microsoft Azure or AWS today and order a virtual machine, you'd get a whole list of CPUs: different price points, different processor speeds, form factors, power efficiencies, and costs as well. So I agree with you, the data centers will want to offer all of that to their users. And one thing we should think about as
well is the amount of demand for the training of the models versus the inference, which is, you know, the calling of the model. Maybe right now training is the really heavy workload, and maybe it will remain so, but maybe in time the inference side will be even more important. The number of users or calls requiring inference might increase, which means there may be more revenue for those able to provide good-quality, power-efficient inference chips, and it won't just all be about Nvidia. Yeah, I really think there will be multiple winners as we stop treating the whole AI market as one market and start treating it as multiple markets: the training market, the inference market, the single-modal LLM market versus the multimodal, however we end up breaking it into submarkets, and then choosing winners within each, right? Just like it used to be one computing market, but now it's a CPU market, a separate GPU market, a separate server infrastructure market, and so on. And Nvidia isn't just built on the back of these H100 GPUs; they're not standing still either. They have the TensorRT-LLM software they announced and released earlier this summer, which doubled the inference performance of their H100 chips. They have the Grace Hopper superchips coming in Q2 of 2024, the H20s and the L4 chips for training and inference, and the InfiniBand and Spectrum-X networking solutions as part of their overall AI infrastructure package. So Nvidia isn't really just about GPUs; they're offering an entire AI infrastructure ecosystem that data centers can build their clusters on top of, right? When I look at Qualcomm, AMD, and Intel, I haven't seen the same extensive ecosystem as I do for Nvidia. Do you think that's part of the equation, or do you think it really is about GPU versus GPU in terms of performance per dollar? Yeah, Nvidia have got such a strong offering here because they've been delivering on this roadmap. They were already bringing revenue
in, and the numbers are popping already from this, so they're a trusted company to deliver on what this kind of technology promises. The others are going to be playing catch-up for quite a while. I think Nvidia is in a really strong position, and they have their earnings coming up in a few weeks; I know everyone in the market is watching to see, can they do it again? I agree with you, and I think what that means for investors is that the way to think about Nvidia is that they are not worried about competition; they are worried about supply. They're worried about meeting as much of the demand as they can, and that's how they're going to take market share away from their competitors on both the training side and the inference side today. The more demand they can meet, the less will spill over to their competitors to sate that demand with more specialized chips. So I would call Nvidia the number one winner for sure, at least in terms of offerings today. Do you have a sneaking suspicion of who's going to be the runner-up? Yeah, I'm an AMD bull, so I'm going to have to go with AMD. I think it's quite unusual for Lisa Su, who's normally quite restrained, to make a prediction about revenue for a product that hasn't launched yet. And I feel like that's what investors were waiting for; we needed to hear what their plans were for the MI300X. I have confidence in AMD to deliver on what they say in terms of the roadmap. I think it will be a strong product, competitive enough to get them into enough conversations that they will be the number two in this space. Yeah, so you're thinking the demand for the H100s is just so high that the MI300X can sort of do no wrong? Yes, and I also think one trick AMD often plays is they'll come in cheaper, cheaper than, say, the Nvidia part, to make it a performance-per-dollar question,
so they'll be able to say, we are the best performance per dollar you can get in AI training. That's what they'll go for. I think that's a really good point, and AMD is really known for playing those computing tricks, you know, their 3D V-Cache and their extended L2 caches. Even between versions of their chips, they're able to squeeze out more performance by making one or two fundamental changes. So I'm curious to see how that strategy plays out with the MI300X. Right, what'll happen is they'll close the gap with these kinds of tricks and then they can raise what they charge while still coming in cheaper than Nvidia in terms of compute per dollar, and that might catch the market by surprise in terms of how close it is to the equivalent Nvidia chip of its time. I'm curious to see how that shakes out; it might change my opinion. I was actually going to say Intel, not because they're coming in close to the H100 overall, but because they're focusing on inference, not training, and on CPUs, not GPUs. That would be a great augmenting technology to move certain workloads off of the H100s and onto chips that are much better at handling them, making better use of the existing H100s in data centers in the process. So I was thinking more like, hey, what menu item would best complement the H100, and you're thinking about which menu item most closely competes with it. I'm curious to see what the thought process of data centers will be moving forward. So I hope you've found this discussion valuable. Let us know if you enjoy these kinds of discussions tying together the future of technology, AI, software and hardware, and the markets. I think this is one of the most important spaces to pay attention to as an investor, and I'm excited to keep talking about it with you each week. Yeah, thanks so much for everyone's kind comments on the previous podcast, and we commit, we will continue to do these. Yeah, it's going to be an amazing rest of 2023, 2024, and
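The performance-per-dollar framing the hosts keep returning to is just a ratio of throughput to price. Here is a minimal sketch of that comparison; the chip names, throughputs, and prices below are entirely hypothetical placeholders, not real benchmarks or list prices, chosen only to show how a slower but cheaper part can still rank first on this metric:

```python
# Toy performance-per-dollar comparison. All numbers are illustrative
# placeholders, not real chip specs or prices.

def perf_per_dollar(throughput_tokens_per_s: float, price_usd: float) -> float:
    """Tokens per second delivered per dollar of hardware cost."""
    return throughput_tokens_per_s / price_usd

# Two hypothetical accelerators: chip_a is faster outright,
# chip_b is slower but significantly cheaper.
chips = {
    "chip_a": {"throughput": 1000.0, "price": 30000.0},
    "chip_b": {"throughput": 800.0, "price": 18000.0},
}

# Rank by performance per dollar, best first.
ranked = sorted(
    chips.items(),
    key=lambda kv: perf_per_dollar(kv[1]["throughput"], kv[1]["price"]),
    reverse=True,
)

for name, spec in ranked:
    ratio = perf_per_dollar(spec["throughput"], spec["price"])
    print(f"{name}: {ratio:.4f} tokens/s per dollar")
```

With these made-up figures, chip_a wins on raw throughput, but chip_b delivers more tokens per second per dollar (0.0444 versus 0.0333), which is exactly the kind of positioning the hosts describe AMD going for.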
beyond. Until next time, this is Ticker Symbol: YOU. My name is Alex, and I'm joined by Arif, and we're reminding you that the best investment you can make is in you. Amazing, my name made it into your famous outro!
Info
Channel: Ticker Symbol: YOU
Views: 27,608
Keywords: openai, chatgpt, gpt-4 turbo, gpt-4, gpt, msft, msft stock, microsoft stock, aapl, aapl stock, apple stock, apple m3, apple scary fast event, qcom, qualcomm stock, qcom stock, qualcomm snapdragon x elite, snapdragon, intc, intc stock, intel stock, amd, amd stock, best ai stocks, nvidia arm chips, nvidia arm, intel gaudi
Id: Ah1ZiEfqheg
Length: 42min 6sec (2526 seconds)
Published: Thu Nov 09 2023