Will the AI chip war between Nvidia and Google spread to the edge and "crush" AI startups?

Source: 博客园 | 2019/3/27 14:11:45

The AI boom continues, and the AI war keeps escalating. As one of the most closely watched companies in this wave of AI, Nvidia has a major influence on how the battle plays out. At last week's GTC 2019 in the United States, Jensen Huang spent a great deal of time presenting Nvidia's advances in AI software and computing power, yet the product that drew the most attention was the Jetson Nano, an AI computer priced at just $99 (about RMB 664). At the TensorFlow Dev Summit earlier this month, Google likewise released an Edge TPU development board, priced at $149.99 (roughly RMB 1,000).

This means the giants' AI chip war has spread from the cloud to the edge. But why is that a mixed blessing?

The Cloud AI Chip War Spreads Downward

Although it is the most closely watched AI chip company, Nvidia did not have a smooth 2018. First, the collapse of the cryptocurrency mining boom left it with high GPU inventory; then demand from the Chinese market and the server market came in below expectations and dragged down its stock price. Over the course of 2018, Nvidia's market value shrank by nearly half. Against that backdrop, and with AMD having already gotten a 7nm GPU out the door first, outside observers were all the more hopeful that Nvidia would unveil its latest 7nm GPU at GTC 2019.

However, Jensen Huang did not unveil a new 7nm GPU. Instead, he spent a large part of the keynote on RTX and CUDA-X AI.

CUDA-X AI brings together Nvidia's acceleration libraries. According to Jensen Huang, CUDA-X AI unlocks the flexibility of Tensor Core GPUs and can speed up machine learning and data science workloads by up to 50 times. It can also accelerate every step of a typical AI workflow, including training speech and image recognition systems with deep learning.
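CUDA-X AI is an umbrella for GPU-accelerated libraries such as RAPIDS (cuDF for dataframes, cuML for machine learning). As a rough illustration of the kind of data science workload it targets, here is a minimal sketch using cuDF and cuML; the file name and column names are hypothetical, and it assumes a CUDA-capable GPU with the RAPIDS packages installed.

```python
# Minimal sketch: GPU-accelerated data science with RAPIDS (part of CUDA-X AI).
# Assumes a CUDA-capable GPU and the cudf/cuml packages; "data.csv" and its
# column names are hypothetical placeholders.
import cudf
from cuml.cluster import KMeans

# Load the CSV straight into GPU memory (a drop-in analogue of pandas.read_csv).
df = cudf.read_csv("data.csv")

# Fit K-means entirely on the GPU; the API mirrors scikit-learn.
model = KMeans(n_clusters=8)
model.fit(df[["feature_a", "feature_b"]])

print(model.cluster_centers_)
```

The point of the stack is that code written against familiar pandas/scikit-learn-style APIs runs on the GPU with little or no change.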

Nvidia also announced that seven leading server vendors will launch servers based on the NVIDIA T4 GPU and specially optimized for the NVIDIA CUDA-X AI acceleration libraries. Matt Garman, vice president at Amazon AWS, also announced that the new EC2 G4 instances will use Nvidia T4 Tensor Core GPUs and become available in the coming weeks.

Although Nvidia has not launched a more powerful GPU, it is using CUDA-X AI to improve the performance and appeal of its products in the cloud. Even so, one of its key customers, Google, has launched its own cloud AI chip, the TPU.

Google began using TPUs internally in 2015 and publicly acknowledged their existence for the first time in 2016. It released the second-generation TPU in 2017 and TPU 3.0 in 2018. In other words, Google's relationship with Nvidia in the cloud AI chip market has shifted from cooperation to competition.

Lei Feng Network has learned that when the subject of Google's TPU came up, Jensen Huang strongly disputed the idea that it is a threat. The competition between Google and Nvidia in the cloud AI chip market will not be settled any time soon, but it is clear that their chip rivalry has already spread to the edge.

AI Chip Warfare on the Edge

As a veteran chip giant, Nvidia entered the edge computing market long ago. Its Jetson line already includes the Jetson AGX Xavier for fully autonomous machines and the Jetson TX2 for edge AI, but prices running to hundreds or even thousands of dollars have kept many users away. The key reason the Jetson Nano, also launched at GTC 2019, has drawn so much attention is precisely its price.

Looking at how different industries have developed, beyond technological maturity, bringing product prices down to a level the market can accept is also critical for an industry to take off. The Jetson Nano launched at GTC 2019 is surprisingly cheap and compact, yet its performance is not low: it reportedly delivers 472 GFLOPS (billions of floating-point operations per second) while consuming only 5 watts. It also supports high-resolution sensors, can process multiple sensors in parallel, and can run multiple modern neural networks on each sensor stream.
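To give a sense of how such a board is typically programmed, below is a minimal sketch of classifying camera frames on a Jetson device using NVIDIA's open-source jetson-inference Python bindings (TensorRT under the hood). This is an illustration rather than NVIDIA's official demo: the model name and camera device are assumptions, and exact signatures can vary between releases of the library.

```python
# Minimal sketch: image classification on a Jetson board with the open-source
# jetson-inference bindings. Assumes the library is built on the device and a
# camera is attached at /dev/video0; signatures may differ between releases.
import jetson.inference
import jetson.utils

net = jetson.inference.imageNet("googlenet")              # pretrained ImageNet model
camera = jetson.utils.gstCamera(1280, 720, "/dev/video0")

img, width, height = camera.CaptureRGBA()                 # grab one frame into GPU memory
class_idx, confidence = net.Classify(img, width, height)
print(net.GetClassDesc(class_idx), confidence)
```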

To suit different needs, Nvidia has launched two versions of the Jetson Nano: a $99 developer kit for developers, makers and technology enthusiasts, and a $129 production-ready module for companies building mass-market edge systems.

Nvidia Jetson Nano

Similar in positioning to Nvidia's Jetson Nano is Google's Coral Dev Board, a development board carrying an Edge TPU released earlier this month for about $150. The board has 1GB of LPDDR4 memory and 8GB of eMMC storage, runs Mendel Linux or Android, and can perform inference locally and offline at up to 4 trillion operations per second.

In addition to the Coral development board, Google has released a $75 Coral USB Accelerator, which also contains an Edge TPU and works with any 64-bit ARM or x86 platform running Debian Linux.

Google Edge TPU Development Board

Jensen Huang may not consider Google's TPU a threat, but the two giants appear remarkably in sync on low-cost edge products: first Google launched its $150 Edge TPU development board and $75 USB accelerator, and not long after, Nvidia launched the Jetson Nano at $99 and $129.

They compete not only on price; the edge computing markets they target also overlap. According to Nvidia, the Jetson Nano can power millions of intelligent systems and serve as a module for embedded applications such as network video recorders, home robots and smart gateways with full analytics capabilities. Nvidia hopes to spare customers the time spent on hardware design, testing and verification of complex, robust and power-efficient AI systems, shortening overall development time and getting products to market faster.

The Coral development board likewise emphasizes privacy, low latency, efficiency and offline deployment for embedded devices. In terms of concrete applications, Google demonstrated an interesting image classification application built on Coral and says it provides a simple API for performing image classification and object detection on Edge TPU devices. In other words, the Edge TPU leans toward image-related edge applications.
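For a concrete picture of what running a model on an Edge TPU device looks like, here is a minimal sketch using the TensorFlow Lite runtime with the Edge TPU delegate. The "simple API" Google refers to is its separate, higher-level Edge TPU Python library, so treat this as one possible route rather than the exact demo Google showed; the model and image file names are hypothetical, and the model must be compiled for the Edge TPU.

```python
# Minimal sketch: image classification on an Edge TPU via the TensorFlow Lite
# runtime and the Edge TPU delegate. The model must be Edge TPU-compiled;
# file names below are placeholders.
import numpy as np
from PIL import Image
from tflite_runtime.interpreter import Interpreter, load_delegate

interpreter = Interpreter(
    model_path="mobilenet_v2_edgetpu.tflite",
    experimental_delegates=[load_delegate("libedgetpu.so.1")],
)
interpreter.allocate_tensors()

inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# Resize the input image to the shape the quantized model expects, then run inference.
_, height, width, _ = inp["shape"]
image = Image.open("cat.jpg").convert("RGB").resize((width, height))
interpreter.set_tensor(inp["index"], np.expand_dims(np.asarray(image), 0))
interpreter.invoke()

scores = interpreter.get_tensor(out["index"])[0]
print("top class index:", int(np.argmax(scores)))
```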

So whether in terms of positioning, performance, applications or price, Google and Nvidia are once again competing head-on, this time at the edge.

Popularizing AI, or Upending AI Chip Startups?

To some extent, the competition between Google and Nvidia can help popularize AI, especially at the edge. The Jetson Nano and Coral development boards greatly lower the barrier to developing AI products and speed up time to launch. They give existing AI application companies more options and, of course, give companies and individuals who want to innovate with AI a more accessible choice, which is good news for the spread of AI at the edge.

For many AI chip startups, however, it may be bad news. Lei Feng Network (WeChat public account: Lei Feng Network) counted in 2018 that 11 of 13 AI chip startups founded in China were targeting autonomous driving and security, and all of them were building AI chips for the edge. Startups mostly choose the edge AI computing market because Intel and Nvidia are utterly dominant in the cloud, where it is very hard for newcomers to succeed.

Although edge AI offers startups bigger market opportunities, Nvidia already performs well in autonomous driving as things stand. Now that both Nvidia and Google are launching development boards that are easier to use and more affordable, AI chip startups have gained two more competitors, and powerful ones at that.

What troubles AI chip startups even more is that both giants also have cloud AI chips, which, combined with their edge AI chips, makes them more competitive still. Moreover, more and more people are recognizing the important role software plays in AI chips, and unfortunately both Nvidia and Google have powerful software.

As mentioned at the beginning of this article, Nvidia's release of CUDA-X AI will improve the AI performance of its GPUs, and at the same time the Jetson Nano is itself billed as an NVIDIA CUDA-X AI computer capable of running all AI models.

On Google's side, TensorFlow Lite, a cross-platform solution for mobile and embedded devices, was released alongside the Coral development board. This lightweight framework helps deploy machine learning models on mobile and IoT devices. Google says that with TensorFlow Lite's optimizations, CPU performance reached 1.9 times the baseline, while performance on the Edge TPU increased by 62 times.
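As an illustration of the deployment path TensorFlow Lite enables, here is a minimal sketch of exporting a Keras model to a quantized .tflite file. It assumes a TensorFlow 2.x environment; targeting the Edge TPU would additionally require full integer quantization and Google's separate edgetpu_compiler step.

```python
# Minimal sketch: convert a Keras model to a TensorFlow Lite flatbuffer with
# post-training quantization, the kind of lightweight artifact deployed to
# mobile and IoT devices. Assumes TensorFlow 2.x.
import tensorflow as tf

model = tf.keras.applications.MobileNetV2(weights="imagenet")

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]   # enable post-training quantization
tflite_model = converter.convert()

with open("mobilenet_v2.tflite", "wb") as f:
    f.write(tflite_model)
```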

Lei Feng Network believes the giants now have AI chips spanning cloud to edge, plus powerful software that squeezes more performance out of their hardware, along with long-standing advantages in brand, channels and market presence. This will help popularize AI at the edge, while pitting them against many AI chip startups.

Still, there is plenty of uncertainty ahead. To what extent will Nvidia and Google affect AI chip startups?
