Nvidia has acquired Arm. What does this mean for the future of AI, edge computing, and the people who write software for these chips?
In a move with significant implications for the tech industry, U.S.-based graphics chip maker Nvidia announced last week that it would purchase U.K.-based Arm Holdings from Japanese investment firm SoftBank for $40 billion. For anyone programming for AI, data processing, or embedded systems, this could mean your data-intensive applications will soon be running on Arm-designed chips with native Nvidia GPU support.
Arm Holdings is the enterprise behind Arm processors, the chip designs that power over 90% of the world’s smartphones as well as everything from autonomous vehicles to toasters to washing machines. While the company has no manufacturing capability of its own, it describes itself as the “Switzerland” of technology: it licenses its chip designs to any company that wants them and lets others handle the actual manufacturing.
Under the terms of the deal, Arm will remain headquartered in Cambridge as a U.K.-based company. Nvidia has also announced that it will open an AI research center in Cambridge, near the Arm headquarters, to serve as a central hub for collaboration among AI scientists and researchers from across the globe.
So, what does this mean for our readers, the folks writing code every day? The acquisition could have major implications for developers working on embedded systems, and it may be worth learning about platforms like CUDA (Compute Unified Device Architecture) and its SDK. Your ability to process large amounts of data in the cloud may speed up, while at the same time powerful machine learning models may fit into less and less on-device memory. Read on to learn the backstory behind this deal and what it will mean for the world of computing and programming.
From graphics to AI
While Nvidia is best known for graphics cards that enable modern video games, the last few years have seen a wealth of new applications for its technology: AI, data processing, and cryptocurrency mining have all turned to Nvidia GPUs. Demand from these new areas exploded so fast that, in 2018, a run on Nvidia cards by bitcoin miners led to a global shortage. While the headiest days of the cryptocurrency boom may have passed, the applications for large-scale parallel data processing continue to proliferate.
Where CPUs are designed for complex, branching logic, GPUs are optimized to perform many floating-point computations in parallel. 3D rendering requires a massive number of arithmetic operations as vertices are rotated and transformed. For machine learning and other large-scale data processing workloads, the GPU’s focus on parallel arithmetic is a match made in heaven.
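To make that concrete, here is a rough sketch in Python with NumPy: a single vectorized matrix multiply rotates a large batch of vertices at once, which is the same style of bulk arithmetic a GPU parallelizes across thousands of cores (NumPy on a CPU merely stands in for that hardware here).

```python
import numpy as np

# Rotate 100,000 2-D vertices by 30 degrees in one vectorized step.
theta = np.radians(30)
rotation = np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

vertices = np.random.rand(100_000, 2)   # (x, y) pairs
rotated = vertices @ rotation.T         # one matmul rotates them all

# Sanity check: rotation preserves each vertex's distance from the origin.
assert np.allclose(np.linalg.norm(vertices, axis=1),
                   np.linalg.norm(rotated, axis=1))
```

The per-vertex multiply-adds are independent of one another, which is exactly why this workload maps so well onto a GPU's parallel arithmetic units.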
Nvidia already offers products that marry its GPUs with Arm chip designs to create data processing workhorses. In late 2019, the company released an Arm-based, GPU-accelerated server designed for high-throughput data processing. Purchasing Arm may be a way to double down on the bet that GPUs will become a mainstream data processing tool.
As a result of the acquisition, Nvidia will be placed at the forefront of Arm’s IoT ecosystem and cloud-connected AI edge computing. Edge computing refers to an approach where information is stored and processed locally rather than in a central data warehouse many miles away. Intel made a similar move when it acquired Movidius.
If edge computing matures the way these chip companies hope, companies can save money gathering information locally and acting upon critical data immediately. For any applications that are latency-sensitive, such as autonomous vehicles, even a millisecond delay in data processing is unacceptable.
With the addition of Arm, it’s possible Nvidia will become a dominant force in everything from microprocessors to tablets, mobile phones to street lamps, washing machines to autonomous vehicles.
What changes in the computing landscape made this possible?
Cloud computing has reshaped software architecture in recent years: distributed software can now automatically scale its computing resources based on immediate demand. For example, 86% of all enterprises are expected to depend on SaaS (one of the three primary cloud computing categories, along with IaaS and PaaS) for most or all of their software needs by 2022.
At the same time, Moore’s law has largely held and semiconductors have continued to shrink (though transistor sizes may plateau around 2021), allowing more and more devices to include real computing power. Combine that with easy access to scalable cloud computing, and you have a world where everything is a computer and needs specialized hardware.
As more and more sophisticated applications were developed for these devices, the idea of edge computing took hold. Instead of sending data off to a remote server for processing, devices now process it locally to get around network latency issues or the absence of a network altogether, both of which are prevalent in the fastest-developing countries.
As mentioned earlier, the GPUs that Nvidia specializes in are very good at the parallel manipulation and processing of data. One of the most noteworthy advances in the computing industry overall has been the proliferation of GPUs into numerous different solutions (such as 3D mapping, image processing, and deep learning), to the point that more traditional CPU power has not been able to keep up.
What could change with the developer landscape
For developers, this may mean that new realms of data processing speed open up. These data powerhouses could be offered as cloud add-ons, bringing a seamless benefit to complex ETL pipelines. For integrated chipsets on smaller devices, graphics and data processing limits could grow, allowing mobile apps with improved graphics and IoT devices with more sophisticated AI.
Speaking of AI, neural nets may become simultaneously more complex and smaller. Specialized AI hardware could be developed to support consumer applications, and neural net software currently used in heavyweight data processing and forecasting could find new use cases, or new applications could arise that bring small-scale AI benefits to a wide range of people.
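One common trick for shrinking neural nets onto memory-constrained edge devices is weight quantization. As a minimal sketch (the function names here are illustrative, not from any particular library), mapping float32 weights to int8 cuts storage by 4x at the cost of a small, bounded rounding error:

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Map float32 weights onto int8 codes plus a single float scale."""
    scale = np.abs(weights).max() / 127.0
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Reconstruct approximate float32 weights from the int8 codes."""
    return q.astype(np.float32) * scale

weights = np.random.randn(1000).astype(np.float32)
q, scale = quantize_int8(weights)

# int8 storage is 4x smaller than float32...
assert q.nbytes * 4 == weights.nbytes
# ...while the per-weight reconstruction error stays within one scale step.
assert np.max(np.abs(dequantize(q, scale) - weights)) <= scale
```

Production toolchains (TensorFlow Lite, ONNX Runtime, and similar) apply more sophisticated per-channel and calibration-based variants of this same idea.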
But the biggest effect may be that more developers need to know the CUDA (Compute Unified Device Architecture) SDK. Similar to SIMD intrinsics, CUDA gives programs direct access to the GPU’s parallel processing hardware. And if Nvidia manages to unify the physical memory of the CPU and GPU, it could open up numerous new avenues for optimization and advancement.
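The core idea behind CUDA’s programming model is that you write one kernel function and the hardware runs it once per thread, with each thread deriving its own data index from its block and thread coordinates. Real kernels are written in CUDA C/C++ (or via wrappers like Numba or PyCUDA); the pure-Python sketch below only simulates that indexing model, serially, to show the shape of it:

```python
def saxpy_kernel(block_idx, thread_idx, block_dim, a, x, y, out):
    """One 'thread': computes out[i] = a * x[i] + y[i] for its own index i."""
    # Same index formula as CUDA's blockIdx.x * blockDim.x + threadIdx.x
    i = block_idx * block_dim + thread_idx
    if i < len(x):            # bounds guard, just as real kernels need
        out[i] = a * x[i] + y[i]

def launch(kernel, grid_dim, block_dim, *args):
    """Serial stand-in for launching grid_dim * block_dim GPU threads."""
    for b in range(grid_dim):
        for t in range(block_dim):
            kernel(b, t, block_dim, *args)

x = list(range(10))
y = [1.0] * 10
out = [0.0] * 10
launch(saxpy_kernel, 3, 4, 2.0, x, y, out)   # 12 threads cover 10 elements
```

On a GPU those twelve kernel invocations would run concurrently rather than in a loop, which is the entire point of the model.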
In the end, though, many of these details may be abstracted away from everyday coders by libraries and high-level programming languages. The only developers sure to be affected are those working directly with embedded systems.
Conclusion
Nvidia’s acquisition of Arm is likely to have a lasting effect on the tech industry as a whole. Not only has Nvidia become a much bigger player in IoT and cloud-based edge computing (to the point that it could be argued it will become the single most influential player), but major corporations that license Arm designs, such as Apple, Intel, and Samsung, may look to alternative sources for their chip and microprocessor designs.
The biggest likely benefit is that major corporations and smaller startups alike have begun developing AI-focused microprocessors that can handle complex neural networks. We are likely to see continued innovation in microprocessors on a scale never seen before.
Tags: ai, arm, big data, nvidia
18 Comments
Earlier this year, Nvidia also acquired Mellanox, makers of high-performance low-latency interconnect solutions (Ethernet Switches, NICs, and Switch ICs up to 400G, Infiniband).
Now, they have CPUs, GPUs (AI Compute Units), and networks. AI Compute Clusters anyone?
Quit Technology.
Based upon Abuse and Slavery.
The world is invincible!
I’m more interested in what it means for the Raspberry Pi.
I would think this translates to more powerful/affordable Raspberry Pi boards just in time for the realisation of edge computing devices and even autonomous vehicles/robots.
The Nvidia Jetson Nano is essentially a Pi-like Single-Board Computer with a powerful Nvidia GPU. There are two possibilities: Nvidia closes off ARM. Tough luck. That’ll be the end of Pi, Android phones, Apple, XScale, … This would be a very stupid move.
However, so far, Nvidia have publicly committed to ARM’s “Switzerland of CPU designers” philosophy, so it is more likely that any improvements that Nvidia adds to ARM that may benefit the Jetson will also benefit the Pi.
It is important to understand that ARM doesn’t build CPUs. They *design* Instruction Sets and *license* them to CPU designers. Those CPU designers, in turn, typically don’t actually fabricate CPUs (with the exception of Intel), they contract fabs.
Right. So invest in Intel.
These are just hacks with stolen IP.
Nvidia is a great company, but I think future chipsets will mainly come from these architectures, and competition among them will be tough:
x86,
RISC-V,
AVR,
ARM
Me too.
I am very worried
I hope it doesn’t have an adverse effect – Apple still need ARM technology so I guess we won’t see any problems going forward.
Whilst “Nvidia acquired ARM from Softbank”, in order to pay for the deal, Softbank became the largest shareholder in Nvidia at just under 10%.
Also interesting – “Arm’s Internet-of-Things Services Group (ISG) business will be carved out from Arm prior to the closing of the Transaction. Accordingly, the ISG business is not within the scope of the Transaction.”
Source: https://group.softbank/en/news/press/20200914_0
Also could be a middle finger to Apple
What makes the ARM design so special that it could not be replaced by say, RISC-V designs/architectures/instruction sets? Those at least are open source so there will never be one company controlling its future.
Good point.. unless it’s just time for technology to die.
Ironic that all these waste peddlers in tech, wasting electricity as well as precious metals, mining rare earth elements with child labor, and creating tons of plastic cases that are trash in three years, also invest in global warming. They are global warming.
interesting!
We British are fed up with Americans hoovering up British companies and paying zero corporation tax. Arm will be ruined by Nvidia; SoftBank had already reneged on promises it made, and American companies are no better: you only have to look at Kraft’s purchase of Cadbury.
I hope the best brains at Arm leave and start a new British company.
“We British are fed up with Americans hovering up British companies and paying zero corporation taxes.” Obviously you are not, since you’re willingly selling… Nobody forced the sale of Cadbury at gunpoint either. What Brits owned it (if any, that is) wanted to cash out and not deal with it anymore. So whoever you speak of when you say “we British” is clearly not anyone actually involved with ownership of said companies… so it sounds disingenuous a wee bit.
Nvidia shifted to “AI” four years ago. The stock price went from a P/E ratio of 15 to 100. One could have thought that autonomous cars with Nvidia’s “AI” would be driving all over the roads. In fact, the hardware is used for a lot of AI training financed by dreams, but no consumer products whatsoever.