The surge was likely related to Nvidia's introduction of new AI-centric products the previous day, including a new high-powered supercomputer and a platform that will put generative AI to work in video game development.
Yahoo Finance further reports that the company expects revenue of $11 billion, plus or minus 2%, for the second quarter. Analysts are "overwhelmingly bullish" on the stock, with 49 buy recommendations, 8 holds, and 1 sell. This is backed up in a feature story examining Nvidia's huge growth, in which Michael Spencer at AI Supremacy points out additional significant statistics:
- Quarterly revenue of $7.19 billion, up 19% from the previous quarter
- Record Data Center revenue of $4.28 billion
- Second quarter fiscal 2024 revenue outlook of $11.00 billion
GPUs, Artificial Intelligence, and the Banking Industry
Artificial intelligence & machine learning in the banking industry are not new concepts. We've recently reported on the importance of AI tech for community banks and credit unions, how AI & machine learning are driving automation & resiliency for banks and their customers, and a variety of ways that AI & machine learning are helping banks stop fraudulent checks.
Powering all of these technologies are, of course, Graphics Processing Units, or GPUs -- which are also a major factor in the surge of Nvidia's stock.
The importance of GPUs cannot be overstated when banks are deploying artificial intelligence & machine learning technologies. As noted in a previous blog post entitled GPUs vs. CPUs: Understanding Why GPUs are Superior to CPUs for Machine Learning, GPUs are the "heavy lifters" when it comes to computing power:
If you consider a CPU as a Maserati, a GPU can be considered as a big truck. The CPU (Maserati) can fetch small amounts of packages (3-4 passengers) from RAM quickly, whereas the GPU (the truck) is slower but can fetch large amounts of memory (~20 passengers) in one trip.
With banks processing millions of transactions daily, CPUs do not have the computing power to match GPUs -- which is why banks attempting to utilize CPUs see heavy latency or "lagging" in their systems' performance.
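The throughput difference behind the truck-vs-Maserati analogy can be sketched in a few lines. This is an illustrative example (not from the original post, and run on a CPU): scoring transactions one at a time mimics serial, per-item processing, while a single batched matrix operation mimics the GPU-style approach of moving a large batch through the hardware in one pass. The transaction data, weights, and sizes here are all made up for the demonstration.

```python
import time

import numpy as np

# Hypothetical workload: score 200,000 transactions against a fraud-model
# weight vector (16 features per transaction). All values are synthetic.
rng = np.random.default_rng(0)
transactions = rng.random((200_000, 16))
weights = rng.random(16)

# "Per-item" style: handle one transaction at a time in a Python loop.
start = time.perf_counter()
loop_scores = np.array([t @ weights for t in transactions])
loop_time = time.perf_counter() - start

# "Batched" style: score the entire batch in one vectorized operation,
# analogous to a GPU processing many items per pass.
start = time.perf_counter()
batch_scores = transactions @ weights
batch_time = time.perf_counter() - start

print(f"per-item loop: {loop_time:.4f}s  batched: {batch_time:.4f}s")
```

Both approaches produce identical scores; only the throughput differs, and that gap widens as the batch grows -- the same reason a bank processing millions of daily transactions benefits from GPU-scale parallelism.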
If your neural network has around 10, 100, or even 100,000 parameters, a computer would still be able to handle this in a matter of minutes, or hours at the most.
But what if your neural network has more than 10 billion parameters? It would take years to train this kind of system employing the traditional approach. Your computer would probably give up before you're even one-tenth of the way.
A neural network that takes search input and predicts from 100 million outputs, or products, will typically end up with about 2,000 parameters per product. So you multiply those, and the final layer of the neural network is now 200 billion parameters. And I have not done anything sophisticated. I’m talking about a very, very dead simple neural network model. — a Ph.D. student at Rice University.
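The back-of-the-envelope multiplication in the quote above is easy to verify:

```python
# Figures quoted from the Rice University example above.
products = 100_000_000       # 100 million possible outputs (products)
params_per_product = 2_000   # ~2,000 parameters per product

# Final-layer parameter count: 100 million x 2,000 = 200 billion.
final_layer_params = products * params_per_product
print(f"{final_layer_params:,} parameters")  # 200,000,000,000 parameters
```

Even this "very, very dead simple" model lands at 200 billion parameters in its final layer alone, which is why training at that scale is impractical without GPU-class parallel hardware.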
Nvidia GPUs for Check Processing and Fraud Detection
As you are aware, OrboGraph's latest OrboAnywhere releases leverage our OrbNet AI technology. And -- you guessed it -- the technology was developed and tested utilizing a variety of Nvidia GPUs, including the A2, A10, A30, and A100. These GPUs are part of our recommended hardware list provided to customers during the early stages of installation. Additionally, we continue to evaluate the newest options for both on-premises and cloud deployments.
As banks turn to AI & machine learning to automate check processing and increase check fraud detection capabilities, they will become more reliant on GPUs to power their systems -- ensuring accuracy, eliminating latency or "lagging," reducing losses, and removing the need for manual review of false positives.