
NVIDIA Data Centers’ Importance to Real-Time Check Processing

In what is very likely a bellwether for machine learning and AI, NVIDIA topped Wall Street estimates for both its fiscal 2021 second quarter and its forecast, with data-center sales surpassing its core gaming business for the first time.

Jefferies analyst Mark Lipacis, who has a buy rating and hiked his price target to $570, likened NVIDIA to Apple Inc. (AAPL) in the smartphone era, or to the software/hardware duo of Microsoft Corp. (MSFT) and Intel Corp. (INTC) in the PC era, when it comes to dominating an ecosystem.

[Image: NVIDIA data center. Source: nvidia.com]

In NVIDIA’s case, the ecosystem in question is that of parallel processing, which powers the heart of AI and machine learning. Not only does NVIDIA make the graphics processing units that are in high demand at data centers, but those GPUs run on NVIDIA’s proprietary CUDA programming platform.
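To give a sense of what programming against the CUDA platform looks like in practice, here is a minimal sketch in Python. It assumes the Numba package and a CUDA-capable NVIDIA GPU with its driver are installed; the kernel and array sizes are illustrative only:

```python
# Minimal CUDA kernel written in Python via Numba, launched across
# thousands of GPU threads. Assumes numba + a CUDA-capable NVIDIA GPU.
import numpy as np
from numba import cuda

@cuda.jit
def vector_add(a, b, out):
    # Each GPU thread computes one element of the result in parallel.
    i = cuda.grid(1)
    if i < out.size:
        out[i] = a[i] + b[i]

n = 1_000_000
a = np.random.rand(n).astype(np.float32)
b = np.random.rand(n).astype(np.float32)

# Copy inputs to the GPU, allocate the output there, and launch the kernel.
d_a = cuda.to_device(a)
d_b = cuda.to_device(b)
d_out = cuda.device_array_like(a)

threads_per_block = 256
blocks = (n + threads_per_block - 1) // threads_per_block
vector_add[blocks, threads_per_block](d_a, d_b, d_out)

out = d_out.copy_to_host()
assert np.allclose(out, a + b)
```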

Lipacis said NVIDIA “is best positioned to become the de facto standard of the Parallel Processing Era and capture 80% of this ecosystem’s value.”

“We think the company will continue to surprise on the upside, and wouldn’t be surprised to see NVDA undertake more M&A to build out its data center system capabilities,” Lipacis said.

Additionally, Forbes.com notes:

The Nvidia A100 data center GPUs are top-of-the-line data analytics processors, which are used by companies in advanced data analysis and AI applications. Another plus related to this is the rising margins, as the data center business is more profitable than the gaming GPU segment, given the higher selling prices (gross margins came in at 65% in Q1 ’21 vs 58% for the same period last year).

Importance of GPUs in Machine Learning

By now, many of you have heard the arguments about utilizing a GPU vs. a CPU. towardsdatascience.com provides a simple explanation of why GPUs are needed for machine learning:

GPUs are optimized for training artificial intelligence and deep learning models as they can process multiple computations simultaneously.

They have a large number of cores, which allows for better computation of multiple parallel processes. Additionally, computations in deep learning need to handle huge amounts of data — this makes a GPU’s memory bandwidth most suitable.
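To make that concrete, here is a rough sketch (assuming PyTorch is installed and a CUDA GPU is present) that times the same large matrix multiplication on the CPU and on the GPU. The operation is identical; the GPU simply spreads it across thousands of cores with far higher memory bandwidth:

```python
# Time an identical matrix multiplication on CPU and GPU.
# Assumes PyTorch is installed; the GPU branch runs only if CUDA is available.
import time
import torch

size = 4096
a = torch.randn(size, size)
b = torch.randn(size, size)

# CPU timing
start = time.time()
c_cpu = a @ b
cpu_s = time.time() - start

if torch.cuda.is_available():
    # Same operation, dispatched across the GPU's parallel cores.
    a_gpu, b_gpu = a.cuda(), b.cuda()
    torch.cuda.synchronize()          # make sure the host-to-device copy is done
    start = time.time()
    c_gpu = a_gpu @ b_gpu
    torch.cuda.synchronize()          # wait for the kernel to finish before timing
    gpu_s = time.time() - start
    print(f"CPU: {cpu_s:.3f}s  GPU: {gpu_s:.3f}s")
else:
    print(f"CPU: {cpu_s:.3f}s  (no CUDA device found)")
```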

For a simple "illustration" of CPU vs. GPU, check out the following video by the MythBusters:

What Does This Mean for Banking & Check Processing?

There has been significant progress in platform modernization over the past 10 years. The combination of virtual machines and cloud availability has delivered economies of scale and reduced unit costs. However, there are still many legacy systems in place running on mainframes and minicomputers.

The next major wave -- which has already started -- involves the integration of Artificial Intelligence, Deep Learning, and Machine Learning technologies into the IT environment. This will soon deliver near-100% recognition rates, high-speed check processing, and real-time fraud detection.

GPUs matter for these technologies not only to train the deep learning models on millions of check images, but also to provide the processing speed and accuracy needed for real-time or batch processing (inference). Whether you plan to migrate to a new data center provider built primarily on NVIDIA hardware, or continue to work with your existing AWS or Microsoft Azure environments, rest assured that AI will soon be a part of every workflow across the omnichannel.
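As an illustration of the inference side, here is a hedged sketch of running a batch of check image snippets through a convolutional model on the GPU. The architecture, image size, and class count below are placeholders for illustration only, not any particular vendor's recognition pipeline:

```python
# Hedged sketch: batched GPU inference over check image snippets.
# The model is a placeholder CNN standing in for a real, trained
# check-recognition model; shapes and classes are illustrative.
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"

model = nn.Sequential(
    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 8 * 8, 10),        # e.g. 10 digit classes
).to(device).eval()

# In production the weights would come from training on a large labeled
# set of check images; here the untrained network just shows the data flow.
batch = torch.rand(64, 1, 32, 32, device=device)   # 64 grayscale snippets

with torch.no_grad():                 # inference only, no gradients needed
    logits = model(batch)
    predictions = logits.argmax(dim=1)

print(predictions.shape)              # one predicted class per snippet
```

In a real deployment, the same forward pass would run against a model trained offline, with batching used to keep the GPU fully utilized for both real-time and batch workloads.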

