
Can Intel, having long neglected GPU technology, still stand apart?

via: 博客园     time: 2016/8/22 10:30:17     read: 976


Intel's discrete graphics technology has never received much attention, and even as the company has recently unveiled a series of important products aimed at AI and VR, it still shows little interest in GPUs.

Back in 2009, Intel abandoned its GPU development project, codenamed Larrabee, a product widely regarded at the time as a key move to strengthen Intel's position in PC gaming systems. Now many analysts have begun to question whether Intel needs to build a high-performance GPU at all, even though high-performance GPUs have become indispensable to the gaming, virtual reality, and artificial intelligence markets.

At the recently held Intel Developer Forum, Intel's VR and AI strategies remained the focus, and the company also discussed its CPU and FPGA (field-programmable gate array) technologies. GPU-related topics, however, continued to be given the cold shoulder, which inevitably raises doubts about the completeness of the chip giant's product line.

Over the years, Intel has never been a leader in the GPU field, and it has largely avoided head-on confrontation with specialists such as AMD and NVIDIA. This week, however, Intel did showcase some progress in the area: the company said the integrated graphics in its seventh-generation Core processor microarchitecture, codenamed "Kaby Lake", already support the most popular 4K display standards.

In addition, Diane Bryant, executive vice president of Intel's Data Center Group, announced on Wednesday that the company has developed "Xeon Phi", a high-end server chip for artificial intelligence. Moreover, the Chinese Internet company Baidu has already confirmed that it will use Xeon Phi chips for the deep-learning platform in its data centers.

It should be noted that this is already the third generation of Intel's Xeon Phi product line, codenamed "Knights Mill". And because it incorporates FPGA technology, the chip can be programmed independently for different machine-learning tasks.

In other words, Intel believes its AI strategy does not necessarily require a pure GPU product.

"In fact, the highest-performance computing does not require a GPU; it requires improving the parallelism of applications, and there are a number of ways to do that," explained Jason Waxman, vice president of Intel's Data Center Solutions group.

But the actual situation may not be so simple. Patrick Moorhead, founder and principal analyst at market research firm Moor Insights and Strategy, believes that a discrete GPU chip would be very useful in many multimedia applications. If Intel had cutting-edge graphics technology on par with NVIDIA or AMD, its standing in the AR and VR markets would improve considerably.

"For ordinary VR and AR applications, Intel's own CPUs are enough to do the job. But for high-end VR products such as the HTC Vive or Oculus Rift, users will probably still need an NVIDIA or AMD graphics card. Perhaps the arrival of Xeon Phi means Intel does not need a GPU today, but that could change over time."

Jim McGregor, founder of technology industry research firm Tirias Research, said that a high-performance Intel GPU would help the company better address the AI, gaming, VR, and AR markets, but that building a good GPU would be no simple matter for Intel.

McGregor believes Intel is not indifferent to the concept of a co-processor or a discrete processor. Moreover, at this stage the company could address the issue by cooperating with AMD or NVIDIA.
