Over the past several years, artificial intelligence (AI) has quietly become part of our daily lives. The smart replies generated by your email or smartphone, the tagging of photos on social media, product recommendations on e-commerce sites, the directions provided by mapping apps, and the music and video recommendations on streaming services all use the predictive power of AI.

Graphics processing units (GPUs) from NVIDIA (NVDA 0.61%) were the early beneficiary of the trend, resulting in staggering growth as AI adoption accelerated. The massive parallel processing capability that lets GPUs render images also turned out to be the best available solution for AI systems. The company's rivals have been scrambling to build a better mousetrap in an effort to seize control of the lucrative AI chipset market.

Three recent developments show just how heated the competition has become.

A circuit board with a chip labeled AI at the center.

Image source: Getty Images.

Billions of dollars to gain an edge

Intel (INTC -2.00%) has long been the worldwide leader in the CPUs used in personal computers (PCs), but slowing demand for PCs and stiff competition from the likes of Advanced Micro Devices (AMD 0.41%) have stunted Intel's growth. So far in 2019, the company's revenue has been flat year over year, and net income has fallen by 11% -- with no relief in sight.

The company, having missed out on the initial AI boom, has been scrambling to play catch-up with a series of AI-related acquisitions. Intel is said to be in "advanced talks" to acquire Israeli chipmaker Habana Labs for between $1 billion and $2 billion, according to Israel-based publication Calcalist. Habana develops processors that are optimized for AI applications, according to the report. The company said that its Gaudi AI processor "will deliver an increase in throughput of up to four times over systems built with [an] equivalent number [of] GPUs."

This isn't the first such acquisition. Intel paid $16.7 billion for Altera in 2015 to gain control of its field-programmable gate arrays (FPGAs) for use in AI. In 2016, the company bought AI chipmaking start-up Nervana for $400 million, and in 2017 acquired Mobileye for $15.3 billion to gain a foothold in self-driving car processors. Unfortunately for investors, Intel has a questionable track record when it comes to these deals, so time will tell if buying Habana Labs will yield different results.

A different approach

Microsoft (MSFT -1.66%) has enjoyed something of a renaissance in recent years, after originally rising to prominence for its Windows operating system and its Office suite of software. The company repackaged its most popular offerings as software-as-a-service (SaaS) and has been a rising star in the realm of cloud computing, coming in second only to industry leader Amazon.com (AMZN -1.05%).

The company is banking on a cutting-edge new processor -- the Colossus intelligent processing unit (IPU), developed in collaboration with British start-up Graphcore -- to continue the expansion of its Azure cloud operations. The chips were designed from the ground up for AI-centric tasks like facial recognition, speech processing, natural language understanding, and self-driving cars. Microsoft and Graphcore released benchmarks early last month that suggest their state-of-the-art processor meets or exceeds the performance of similar chips from NVIDIA and Alphabet's Google. They even claim that some tasks -- like language processing -- are completed much more quickly using Graphcore's chip and software stack.

Microsoft Azure is the first public cloud provider to deploy the IPU and offer it to customers for testing and use.

A man in business attire touching a virtual cloud icon

Image source: Getty Images.

re:Invent-ing the AI chip

At its annual re:Invent conference in Las Vegas this week, Amazon announced it would make its custom-designed Inferentia chip available to AWS customers. In a press release, Amazon said the processor delivered "high performance and the lowest-cost machine learning inference in the cloud."

AI systems operate in two distinct phases. The first -- the training phase -- develops and trains the algorithms for a specific task, while the second -- the inference phase -- deploys the trained system to perform that task. Each phase demands processors with different capabilities to run at peak efficiency. As the name implies, the Inferentia chip was developed specifically with inference in mind, so GPUs are still necessary during the training phase. By making Inferentia available to its customers, Amazon is helping to bring cost-effective AI to the masses.
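The split between the two phases can be illustrated with a toy model (plain NumPy here, purely a hypothetical stand-in, not Amazon's actual stack): training is the compute-heavy step that produces the model's parameters, while inference simply applies those frozen parameters to new inputs -- which is why a chip tuned only for inference can be cheaper and more efficient.

```python
import numpy as np

# --- Training phase: learn parameters from labeled examples ---
# Toy data following y = 2x + 1; in a real system this is the
# compute-heavy step that GPUs typically handle.
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])

# Closed-form least squares with a bias column appended to X.
A = np.hstack([X, np.ones((len(X), 1))])
w, *_ = np.linalg.lstsq(A, y, rcond=None)  # w ends up ~[2.0, 1.0]

# --- Inference phase: apply the frozen model to new inputs ---
# This lighter-weight step is what a chip like Inferentia targets.
def predict(x):
    return w[0] * x + w[1]

print(round(predict(4.0), 2))  # the trained model extrapolates: 9.0
```

The point of the sketch is the asymmetry: once `w` is learned, inference is just a multiply-and-add per input, which is why dedicated inference silicon can undercut general-purpose GPUs on cost.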

This was part of a massive announcement by the technology specialist that introduced nine new computing and networking options for AWS customers. By continually developing new and innovative products and services, Amazon is working to maintain its lead in cloud computing and stave off the competition for as long as possible.