And then there is Qualcomm. Its approach offers a single platform built on a distributed solution the company calls the AI Engine. Betting that no single processor type will win AI, Qualcomm combines its latest Snapdragon Kryo CPU, Adreno GPU, and Hexagon DSP cores across successive Snapdragon generations, tying them all together through a common software platform.

Xilinx, the largest standalone FPGA chipmaker, has spent over $1B in the past four years to compete in the AI race, and it plays an increasingly significant role in enabling data center workloads associated with machine learning. Using a heterogeneous computing platform, it combines multiple processing resources into a single AI solution, with an emphasis on data centers. Xilinx’s Adaptive Compute Acceleration Platform (ACAP) is designed to deliver 20x and 4x performance increases on deep learning and 5G processing, respectively. The first chip, called Everest, is scheduled to tape out later this year on a 7-nanometer process. Because traditional processors lack the computational power to support many of these intelligent features, Xilinx’s AI solution for developing neural networks has expanded to offer ML applications for the cloud and the edge, particularly through the recent acquisition of DeePhi, a Beijing-based startup. The DeePhi integration will be important to Xilinx’s AI portfolio, as the development of its deep learning processing units (DLPUs) will span both FPGA and ASIC chips.

Moving forward, we believe the market will remain fiercely competitive. VCs invested more than $1.5B in chip startups in 2017, nearly double the amount invested two years earlier, according to a CB Insights report. At least 45 startup chip companies are focused on NLP, speech recognition, and self-driving cars. Silicon Valley startup Cerebras and the UK’s Graphcore are quietly building chips to power bots that can carry on conversations and systems that can automatically generate video and virtual reality imagery. Not only do these newcomers have strong backing from leading VCs, they have also been on an active hiring spree, cherry-picking key executives from older, established chipmakers. Cerebras has hired dozens of engineers from Intel, notably bringing in its CTO from Intel’s Data Center Group. Graphcore was founded by semiconductor veterans with multiple prior startups behind them, including Icera, a mobile chip company sold to Nvidia in 2011 for $367M. Another promising startup, SambaNova, funded by Google Ventures and co-founded by an Oracle veteran and Stanford professors, is working on integrating hardware and software to maximize the performance and efficiency of AI chips.

With such a crowd of innovators focused on a single target, innovation should continue at a rapid clip. Perhaps the biggest differentiator at the moment is the development of key software tightly integrated into complete solution sets. So far, Nvidia appears to hold a clear advantage, and its equally weighted software and hardware development teams reflect the importance of software integration in next-generation AI chipsets. But each new advancement brings an opportunity for new leaders of the pack, and the specialization of AI chips for different segments is already evolving faster than most analysts expected. As each of these companies fights for its share of this $35B chip market opportunity, watch for accelerating M&A activity, and for more opportunities for investors across every area of the space.

By Lisa Chai, Senior Research Analyst, ROBO Global

For more trends in robotics and artificial intelligence, visit the Robotics & AI Channel.