As Nvidia's (NVDA) stock rallies, we are conscious that the "story" of the company is complicated and often misunderstood. So we have decided to give you a deeper dive into the greatness of Nvidia. We have written much on it in the past few days (here and here), but what prompts this exercise is a widely disseminated "short" call against Nvidia earlier this week by Andrew Left of Citron, who has issued several largely futile assaults on the company.
Left placed a short bet on the stock himself, only to watch it fly up after a brief post-earnings decline. He said he covered most of his short while also noting that he is "probably about even" across his many short calls on the stock - by our count, this is his fifth (failed) attempt to short this fundamentally rock-solid company.
Last night, he made his case against the stock on air, and we found it largely uninformed. Members who missed the interview can catch a replay here. Additionally, members who have yet to see our response to his research note can view it here.
That inspired us to trace out what Nvidia REALLY does, how much BETTER it is than the competition, and to debunk Left's arguments, which have been leveled against this tremendous company for several years now. We are not saying the stock is cheap; we are saying that whenever we have looked back, the stock turned out to be cheap on the out-year numbers.
So, without further ado, our Nvidia Deep Dive!
During the interview, there were three points Left chose to focus on regarding his call for the stock to see $200 again: ray tracing, competition and self-driving. We address most of the other points in our prior alert, linked above. However, we want to take a moment to explain why we think Left is very ill-advised in his thinking and why his view of Nvidia as purely a gaming company (something it may have been in the past) is misguided. While we will address his competition and autonomous-driving arguments, we want to be brief, as we believe the real misunderstanding on Left's part relates to Nvidia's real-time ray tracing technology.
First, to Left's point on competition: he notes that companies are racing to develop their own chips and that even the Chinese government is looking to bring AI technology domestic. Bottom line, we do not believe competition alone to be a valid reason for a short call, as any company doing anything remotely successful or groundbreaking should expect competition. It is certainly something to monitor, but that can be said of any company in any industry. If anything, we view this as a validation of Nvidia's dominance. We address this in more detail in our prior alert and will simply put it this way: we would not short the stock of Amazon (AMZN) because Microsoft's (MSFT) Azure continues to grow its share of the public cloud space and increase competition with AWS. By the same token, we do not feel it justified to sell Nvidia because of increasing competition in the data center, even though we would posit that Azure gives AWS much more of a run for its money than any of Nvidia's competitors give Nvidia.
As for autonomous driving, we were honestly a bit shocked to hear him say that "we saw auto go bust," referring to Tesla's decision to develop its own in-house chip. We expect nothing less from Elon Musk; the man is a genius and we wish him the best of luck. However, it is important to keep in mind that this is not the first time Tesla has bailed on a chip maker - the company previously ended its partnership with Mobileye, for reasons we think were distinctly related to the mercurial, demanding nature of Musk - so this really doesn't come as much of a shock. In fact, Nvidia CEO Jensen Huang addressed this on the second quarter conference call, stating, "we used the 3-year-old Pascal GPU for the current generation of autopilot computers," and noting that the company's new Xavier line of chips is far better equipped to handle the needs of autonomous vehicles than the chips Musk was referring to. For this reason, we don't believe Left's argument that auto has gone "bust" to be sound. Note that during the on-air interview, Left points to Tesla's claim of developing a more powerful chip without acknowledging that Musk's comparison was to the Pascal-based GPU noted above, with no mention whatsoever of Xavier.
Moreover, to call this the end is to completely dismiss the hundreds of other auto OEMs that rely on Nvidia's chips - automakers that, by the way, make up a much larger share of the vehicle space. To this point, we remind members that just last month, Daimler (Mercedes) and Bosch selected the Nvidia DRIVE platform for their robotaxi fleets. This is only the latest in a slew of OEMs looking to take advantage of Nvidia technology; other large car manufacturer customers include Audi, Volkswagen, Volvo and Toyota. Even super-high-end automakers like Rimac Automobili are jumping on board, implementing the Nvidia DRIVE system in the C_Two, their all-electric hypercar equipped with Level 4 autonomy.
OK, on to the elephant in the room: real-time ray tracing. During his interview, Left made a few points regarding this ground-breaking technology that we want to address here. First, he said nobody was talking about it before this week, meaning that it wasn't part of the bull thesis until now and therefore is not a justification to own the stock. He also noted that today's games aren't equipped for ray tracing. Lastly, the real kicker is what he didn't mention at all: the fact that real-time ray tracing exposes Nvidia to a previously untapped market - the visual effects industry, think movie and TV computer-generated imagery (CGI). Given his comments on the new technology, we believe Left may not have a full understanding of what real-time ray tracing is or why it is such an important step.
First, to Left's point that ray tracing had not been on anyone's radar until now, we quite simply don't believe this to be accurate. Ray tracing was discussed on prior earnings calls and was referred to by Nvidia's brilliant CFO Colette Kress as the "holy grail of graphics" on the company's first quarter conference call back on May 10 of this year. So, it may not have been on Left's radar, but it's something we have been watching for quite a few months now, fully expecting commentary to increase with the introduction of new chips.
As for his commentary regarding today's games' lack of ray tracing implementation, if you look here, Nvidia's senior vice president of Content and Technology, Tony Tamasi, clearly states, "The NVIDIA RTX platform and GeForce RTX 20-series GPUs bring real-time ray tracing to games 10 years sooner than anyone could have ever imagined," adding that "thanks to the AI and hardware light-ray acceleration built into GeForce RTX GPUs, games using these futuristic features are right around the corner." So, of course it wasn't being widely discussed or implemented - nobody thought it possible for another decade. It is, in a sense, the same as when Fortnite caught the gaming community off-guard because nobody thought a 100-player battle royale mode (especially on mobile) was possible either - but we bet you wish Fortnite and PUBG were individual stocks.
So, while Left is correct in his assessment that the games slated to release later this year are far from fully optimized to take advantage of the new RTX chips, we believe this to be short-sighted, as the games of the future will almost certainly include this feature as often as possible - why wouldn't developers look to make their games as photorealistic as possible given the chance? Therefore, we feel he is misunderstanding why the new technology is so important: it's not about this year's games, it's about games next year and beyond. The result will be true movie-like quality, according to Strauss Zelnick, CEO of Take-Two Interactive (TTWO), the development studio behind hit titles such as Grand Theft Auto V and NBA 2K, and one that works closely with Nvidia, as do all the game makers, on their newest iterations. Clearly, developers are taking these new developments into account, and we would not be surprised to see ray tracing result in a preference by developers to optimize games for Nvidia chips, potentially to the misfortune of Advanced Micro Devices (AMD) (though we love Dr. Su and cannot wait to see her answer to Nvidia's ray tracing chips - competition is what got us to where we are today and what will take both companies to even greater heights). For an example of the difference this technology makes, see here, where a video demo allows for a side-by-side comparison of Battlefield V gameplay with and without ray tracing enabled - the graphics speak for themselves.
One last point we will make regarding community support: it's coming not only from game developers but from engine makers and operating system developers as well. In fact, Microsoft is already supporting the technology via its DirectX Raytracing platform, and Epic Games has already integrated RTX technology into the Unreal Engine, the most popular game engine in the world and the backbone of countless titles over the past 20 years, including Street Fighter V, Gears of War 4, PUBG, Fortnite and more.
Furthermore, speaking to our belief that Left is being short-sighted by focusing on this year's games, Huang has already told us regarding the new technology that "it has to be amazing at today's applications, but utterly awesome at tomorrow's." His focus is clearly not on the present but on the future.
Left also noted, as a reason for his view that the stock is tough to own, that developers do not yet have the technology in hand, the workstation RTX chips having only just been released. However, we would argue that this is where gaming is headed, as evidenced by developer demand for ray tracing and the integration by Microsoft and Epic, and we are happy to sit along for the ride as more games are announced with the technology. Remember, we aren't traders; we are not trying to time this story and jump in and out based on when we can expect to see more ray tracing games hit the market. We are steadfast in our conviction that this is the direction of the industry and are taking a longer-term view, drowning out those making shorter-term predictions.
By the way, ask any gamer: when new consoles come out (PlayStation, Xbox, etc.), it takes years before developers learn to truly harness the power of the new hardware. So today's games not being fully optimized for the new RTX chips really comes as no surprise. How could they be, when the RTX workstation chips only arrived last week and these games have been in development for years, optimized for Nvidia's Pascal architecture? You can't optimize a game for hardware that doesn't even exist at the time of development. In our view, his argument that ray tracing shouldn't factor into the bull case is not a good enough reason to sell the stock and, at best, justifies the consolidation we are currently seeing around the $260 level.
All of that said, we believe the biggest misunderstanding by Left is evident in his view that Nvidia is no more than a gaming stock. We could not disagree with this more. Beyond gaming, the company has exposure to medical fields such as pharmaceutical research, 3D cell modeling, cancer diagnosis and disease research, as well as the offshore oil industry and even weather forecasting - an area where we desperately need advancements, especially given the increase in hurricane activity and intensity these past few years, and one that is seeing significant benefits thanks to GPU acceleration. On top of all that, Left is failing to account for the currently untapped market that ray tracing will expose the company to: the visual effects industry, which according to analysts at Morgan Stanley and Needham represents a total addressable market (TAM) of roughly $250 billion! Left failed to note this crucial development in both his research note and his on-air interview.
Speaking on the company's second quarter conference call, Huang stated, "Turing has the ability to do ray tracing, accelerated ray tracing, and it also has the ability to combine very large frame buffers because these data sets are extremely large. And so that marketplace [the visualization market requiring photorealistic imagery] is quite large and it's never been served by GPUs before until now. All of that has been run on CPU render farms, gigantic render farms in all these movie studios and service centers and so on and so forth." This is only possible because, as noted here, "Turing accelerates real-time ray tracing operations by up to 25x that of the previous Pascal generation, and GPU nodes can be used for final-frame rendering for film effects at more than 30x the speed of CPU nodes." For additional information on how the visual effects industry stands to benefit, and for actual video demonstrations - including one showing a "Star Wars-themed Reflections ray-tracing demo" that was originally shown running on a $70,000 DGX Station equipped with four Volta GPUs but this time around was demonstrated on a single Quadro RTX GPU - see here. By the way, that demo also runs on Epic's Unreal Engine via Microsoft's DirectX Raytracing (noted above).
Commenting on this key development, the analysts noted that while the movie opportunity is more near term, checks with management indicate that "the TV opportunity for Quadro could end up being 2-3X that of movies," adding that "the company has been working with movie studios for the past few months and expects new Quadro products to reduce rendering time as well as costs for customers."
So, as we noted above, it is our view that Left is simply failing to account not only for the significant advancements ray tracing will bring to gaming, but also for what we see as one of the biggest drivers of previously unaccounted-for earnings growth: the new markets that Turing has opened up for the company - markets that were simply not priced into shares before the introduction of Turing and ray tracing.
The last point we want to make is that Left has not even acknowledged that Moore's Law has essentially come to an end. Recall, as we noted in our earnings alert, that Moore's Law - the observation that the number of transistors on an integrated circuit, a key driver of performance, doubles roughly every two years - has stopped holding, creating a gap between demand and performance. That is a significant issue given the rapidly growing demand for computing power, and one best addressed via Nvidia's GPU-accelerated computing solutions. To not account for this critical development in the world of computing is to completely underestimate the intense need for GPU-accelerated computing. And when it comes to GPU acceleration, no conventional semiconductors even come close to the performance of Nvidia's chips. Given the rapidly increasing amount of data being stored in the cloud and the exceedingly complex computations required for artificial intelligence, machine/deep learning, neural networks and inferencing, GPU acceleration is not an option; it is a necessity.
Moreover, Left also forgot to mention the cost and temperature aspects of setting up and maintaining a data center. And in case you think cooling is an afterthought and performance the only thing that matters, consider this: the largest names in cloud computing, including Alphabet (GOOGL), Microsoft and even Facebook (FB), have all sought out ways to reduce cooling costs. Alphabet has a data center in Hamina, Finland, with a "high-tech cooling system, which uses sea water from the Bay of Finland, reduces energy use and is the first of its kind anywhere in the world." Facebook has one in Lulea, Sweden, roughly 100 km from the Arctic Circle. Microsoft may be the most creative thanks to Project Natick, which involves fully submerging a data center offshore in order to both keep data closer to users and increase cooling efficiency. As Microsoft puts it, "Deepwater deployment offers ready access to cooling and a controlled environment, and has the potential to be powered by co-located renewable power sources." Why mention this? Because we view it as yet another crucial aspect of the Nvidia story. Speaking on Mad Money in early 2018, Huang noted that Nvidia's DGX-2, which packs 16 Volta processors into a single box "designed specifically for AI researchers," is capable of replacing 300 servers!
From a pure upfront-cost perspective, Huang noted that this essentially means those 300 servers, which would cost several million dollars to set up, can now be replaced with a $399,000 box that takes up a fraction of the space. As this relates to power and heat, Huang goes on to note that "300 servers consume something along the lines of call it a 160,000 watts, a 160,000 watts. We reduced all of that to 6,000 watts. We save money, we save power, we save floor space and just as importantly we make it incredibly easy for AI researchers to just buy one of these boxes, install it and start developing their models." So, basically, not only does Nvidia's tech reduce upfront costs by millions of dollars, it also cuts the power draw by roughly 96%! We cannot stress enough the importance of this when you consider that data centers will only increase in size as more data is collected - especially in the coming age of cloud gaming, self-driving and 5G.
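For those who want to check the math on that 96% figure, it falls straight out of the two wattage numbers Huang cited. A quick sketch, using only the round figures from the quote above (the variable names are ours, purely for illustration):

```python
# Power savings of a single DGX-2 versus the ~300 conventional servers
# it replaces, using the round figures Huang cited above.
servers_watts = 160_000  # approximate draw of ~300 servers
dgx2_watts = 6_000       # approximate draw of one DGX-2 box

# Fraction of power eliminated: 1 - 6,000/160,000 = 0.9625
savings = 1 - dgx2_watts / servers_watts
print(f"Power reduction: {savings:.1%}")
```

The exact figure is 96.25%, which Huang's "roughly 96%" rounds down conservatively.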
We believe Left is undervaluing the significance of this, or perhaps is confused by the differences between GPUs and CPUs, and is therefore missing why Nvidia's chips are so crucial and not easily replaced. Others have made that mistake in the past, including former executives at Intel (INTC) who pondered buying Nvidia years ago, so we are willing to forgive Left his confusion even as we do not disagree that the stock is expensive. Then again, if you look back at any point in the last five years, you will discover that the stock turned out to be ridiculously cheap on the out-year numbers.
Now, normally we would not bother to make note of Left's position - or his almost-closed position! - but we do believe that the bear case as articulated by Mr. Left demonstrates just how difficult a "story" Nvidia is, and we want to be sure you understand it, so that when the stock goes down - and it may, given its volatility - you know why we say "buy it!"