Editor: Wen Fei
Source: Xinzhiyuan (ID: AI_era)
On the evening of March 2, word came of a major AI personnel change: Caffe author Jia Yangqing is leaving Facebook.
In just a few hours, nearly 100,000 people had viewed the topic. What's more, according to a report by AI Frontline, Jia Yangqing will join Alibaba's Silicon Valley research institute as a VP after leaving Facebook, officially starting on March 11 this year.
Xinzhiyuan asked Jia Yangqing and Alibaba for confirmation, but neither replied. On the agenda of the March 27 ScaledML conference, Jia Yangqing is still listed as Director of Facebook AI Architecture.
Meanwhile, news of Jia Yangqing's departure has spread through the industry. Duke University professor Chen Yiran posted on Weibo that Alibaba's Silicon Valley research institute now has a "one hard, one soft" leadership model headed by two top experts, and that its AI research will advance with unstoppable momentum.
By "hard", Chen Yiran means a professor of computer architecture at a university in the United States; by "soft", he means Jia Yangqing.
For Jia Yangqing himself, this may be just another ordinary job change. But considering his current position at Facebook, Director of AI Architecture, and his rumored next stop, the Silicon Valley lab of Alibaba's DAMO Academy, this will be one of the most significant AI personnel moves of 2019.
From Caffe to PyTorch: Jia Yangqing's path to AI architecture mastery
Jia Yangqing is best known as the author of the deep learning framework Caffe. After graduating from Tsinghua University, he earned a Ph.D. in computer science from the University of California, Berkeley. He has worked at the National University of Singapore, NEC Labs America, and Google Brain, and joined Facebook as a research scientist in 2016. He is currently Director of Facebook AI Architecture, responsible for cutting-edge AI platform development and artificial intelligence research.
During his Ph.D. at Berkeley, Jia Yangqing developed the deep learning framework Caffe, short for "Convolutional Architecture for Fast Feature Embedding". With its excellent structure, performance, and code quality, Caffe became one of the most popular and successful open-source deep learning frameworks, greatly advancing and shaping the development of the field.
Later, Jia Yangqing interned at Google Brain, where he participated in developing the TensorFlow platform. In February 2016 he joined Facebook, news that likewise sparked extensive discussion in the community; see the earlier Xinzhiyuan report "Caffe author Jia Yangqing: why I left Google for Facebook".
Since then, Jia Yangqing has forged ahead on the path of AI architecture:
In November 2016, Facebook launched Caffe2Go, a lightweight, modular deep learning framework for running deep neural network models on mobile phones. Jia Yangqing posted on the Facebook website: "We developed a new deep learning platform on mobile devices that can capture, analyze, and process pixels in real time, putting this state-of-the-art technology in the palm of your hand. ... Caffe2Go, along with research tools such as Torch, forms the core of Facebook's machine learning products."
In April 2017, Facebook announced the open-sourcing of Caffe2, a production-grade deep learning framework bringing cross-platform machine learning tools. Caffe2 is a re-engineering of Caffe that focuses on performance, speed, and modularity. In a September 2017 benchmark, Caffe2 ranked first among frameworks including TensorFlow, PyTorch, MXNet, and CNTK, with 79% accuracy and a compute time of 149 seconds.
In May 2018, Facebook officially announced PyTorch 1.0, a framework built on the merger of PyTorch 0.4 and Caffe2. It also integrates the ONNX format, aiming to unify research and production capabilities within a single framework.
In the open-source deep learning landscape today, TensorFlow and PyTorch stand as rivals, and Jia Yangqing has gone from author of Caffe to Director of Facebook AI Architecture.
He co-leads the PyTorch 1.0 project and created the first prototype of an open model format in 2017, which later became ONNX, released jointly by Facebook, Microsoft, Amazon, and many hardware vendors.
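The core idea behind an open model format like ONNX is a framework-neutral description of a model as a computation graph that any back end can load and execute. The toy sketch below, in plain Python, illustrates that idea only; the `Node` and `Graph` names and the tiny operator set are hypothetical and are not the real ONNX API:

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    op_type: str     # operator name only, e.g. "Mul", "Add", "Relu" -- no code
    inputs: list     # names of input tensors
    outputs: list    # names of output tensors

@dataclass
class Graph:
    nodes: list = field(default_factory=list)

    def run(self, values):
        # Any back end that implements the shared operator set can execute
        # the graph, regardless of which framework produced it.
        ops = {
            "Mul": lambda a, b: a * b,
            "Add": lambda a, b: a + b,
            "Relu": lambda a: a if a > 0 else 0.0,
        }
        for node in self.nodes:
            args = [values[name] for name in node.inputs]
            values[node.outputs[0]] = ops[node.op_type](*args)
        return values

# y = relu(x*w + b), expressed as data rather than framework-specific code.
g = Graph([
    Node("Mul", ["x", "w"], ["t0"]),
    Node("Add", ["t0", "b"], ["t1"]),
    Node("Relu", ["t1"], ["y"]),
])
out = g.run({"x": 2.0, "w": 3.0, "b": -10.0})
print(out["y"])  # relu(2*3 - 10) = relu(-4.0) -> 0.0
```

A real exporter serializes such a graph (ONNX uses Protocol Buffers) so that a model trained in one framework can be deployed by another.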
From Google Brain and Facebook to Alibaba: continuing to fly elsewhere
On Jia Yangqing's departure, a Facebook software engineer going by Xiaofei offered this answer on Zhihu:
A top engineer is never obsessed with the aura of a big company; the platform is just a springboard for one's professional development.
Either grow in an environment surrounded by masters, learn the core technology, and eventually become a technical leader who can take charge of an area independently; or solve problems with excellent emotional intelligence, create more value with the team, and eventually become a team leader who can lead everyone.
So amid the Caffe2+PyTorch merger and the company's internal turmoil and constant stream of negative news, Jia Yangqing's choice is simply to find a platform better suited to his own development, and to continue flying elsewhere.
Jia Yangqing is a native of Shaoxing, Zhejiang. In an earlier interview with Bloomberg, asked whether he might return to work in China in the future, he said: "This is a rather complicated question; it involves family considerations and various other factors."
In the past few years, companies such as BAT have established numerous AI research institutes and opened branches in the United States, especially in tech hubs such as Silicon Valley and Seattle, attracting many AI researchers in America, a large share of them Chinese.
This is not just BAT going global; a large part of the motivation is to lower the barrier for Chinese engineers and researchers who have worked and lived abroad for years to return to Chinese companies. It is much like Ma Huateng opening a research institute next door to Microsoft, because many researchers "did not want to leave Seattle".
At the Yunqi Conference in October 2017, Alibaba announced the establishment of a global research institute, the Alibaba DAMO Academy, pledging to invest RMB 100 billion over three years to attract talent. DAMO Academy has since set up cutting-edge research centers around the world, including in San Mateo (in the San Francisco Bay Area) and Bellevue, Washington.
Jia Yangqing's rumored next stop is the DAMO Academy's Silicon Valley research institute in San Mateo.
Source: Google Earth
In a recent Facebook post, Jia Yangqing shared a photo of his visit to the Baochu Pagoda in Hangzhou.
Although the Baochu Pagoda is still a fair distance from Alibaba's headquarters, anyone joining Alibaba, even to work at the Silicon Valley institute, would surely have to visit the Hangzhou headquarters to complete onboarding formalities, so this photo lends further credibility to the rumor that Jia Yangqing is joining Alibaba.
The AI open-source framework ecosystem is in the ascendant, and the industry has plenty of drama ahead
According to Jia Yangqing's LinkedIn page, since joining Facebook he has created, developed, and spun off multiple teams, now collectively known as the Facebook AI Infrastructure group.
As Director of AI Architecture, he leads a team of researchers and engineers building Facebook's large-scale artificial intelligence platform, supporting AI product use cases in advertising, feeds, search ranking, computer vision, speech, natural language processing, and VR/AR applications.
Machine learning has the potential to solve problems at world scale, but supporting major engineering challenges requires machine learning that itself scales massively. Today, R&D on AI architectures and platforms carries ever more practical significance.
In April 2018, Jia Yangqing presented at ScaledML on how Facebook uses machine learning, covering Facebook's hardware as well as its platforms and frameworks for machine learning.
Facebook's machine learning system is a platform serving 2 billion users.
On the merger of Caffe2 and PyTorch, Jia Yangqing said on Zhihu that integrating the two frameworks can greatly improve development efficiency and make life easier for developers:
PyTorch has an excellent front end and Caffe2 an excellent back end; combining them further maximizes developer efficiency. At present, more than half of FAIR's projects use PyTorch while the product lines use Caffe2, so both sides have strong motivation to integrate their strengths.
Development efficiency is one of the things I value most at Facebook: I started the ONNX project in the middle of last year (I wrote the first version of the code by hand myself), and then helped build the ONNX team to strengthen collaboration across frameworks and even across companies. The code-level merger of Caffe2 and PyTorch has likewise been pushed forward step by step since then.
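The complementarity Jia describes (an eager, define-by-run front end for research; a static graph for production) can be sketched with a toy tracer in plain Python. This is purely illustrative and reflects neither PyTorch's nor Caffe2's actual internals; all names here are hypothetical:

```python
class Var:
    """A value that computes eagerly while recording each op onto a tape."""
    tape = []  # shared record of executed ops: (op, lhs, rhs, result)

    def __init__(self, value, name):
        self.value = value
        self.name = name

    def _record(self, op, other, result):
        out = Var(result, f"{op}({self.name},{other.name})")
        Var.tape.append((op, self.name, other.name, out.name))
        return out

    def __mul__(self, other):
        return self._record("mul", other, self.value * other.value)

    def __add__(self, other):
        return self._record("add", other, self.value + other.value)

# Front end: the model is ordinary imperative Python, easy to write and debug.
x, w, b = Var(2.0, "x"), Var(3.0, "w"), Var(-1.0, "b")
y = x * w + b
print(y.value)   # 5.0, computed immediately

# Back end: the tape is now a static graph that a production runtime could
# optimize, serialize, and execute without the Python interpreter in the loop.
print(Var.tape)
```

Running the model once both produces its result and captures a graph, which is the bridge between research-style flexibility and production-style deployment.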
In December 2018, Alibaba open-sourced the deep learning framework XDL, targeting advertising, recommendation, search, and similar scenarios. It focuses on building industrial-grade distributed computing capability and interfaces seamlessly with existing open-source frameworks such as TensorFlow and PyTorch.
Jia Yangqing commented on WeChat Moments that XDL is "very practical work: rather than reinventing the wheel at the framework level, it focuses on the applications above the framework and the large-scale compute and system abstractions below it, which should be a major trend in the industry once the open-source frameworks stabilize".
Perhaps compared with two years ago, the ecosystem of open-source deep learning frameworks and compilers seems to have entered a relatively stable period, but the competition has not subsided. Next, as AI technology is deployed at scale and penetrates more fields, more AI engines and industry-grade deep learning frameworks will emerge.
More importantly, the current AI framework landscape still lacks a strong product from China. Neither Baidu's PaddlePaddle nor Tencent's Angel matches the strength of TensorFlow, PyTorch, or MXNet.
If Jia Yangqing really does join Alibaba, perhaps we can look forward to a Chinese framework taking the stage in large-scale AI architecture. In any case, we await this master's choice and his future.