Facebook News Research

Facebook Open-sources Its Artificial Intelligence Hardware Design


In the past few months, we have seen Facebook open-source much of its AI software. Today, however, the social networking giant took the open-sourcing trend up a notch: Facebook announced that it plans to open-source its latest AI server design.

The server, dubbed Big Sur, is the company's latest development on the hardware front, and it reportedly powers several of the company's applications. The detailed design of this hardware will soon be available for anyone to explore through the Open Compute Project. The blog post announcing the open-sourcing of Big Sur reads:

Big Sur is our newest Open Rack-compatible hardware designed for AI computing at a large scale. In collaboration with partners, we’ve built Big Sur to incorporate eight high-performance GPUs of up to 300 watts each, with the flexibility to configure between multiple PCI-e topologies. Leveraging NVIDIA’s Tesla Accelerated Computing Platform, Big Sur is twice as fast as our previous generation, which means we can train twice as fast and explore networks twice as large. And distributing training across eight GPUs allows us to scale the size and speed of our networks by another factor of two.

The server is built for deep learning, a technique that trains artificial neural networks on large amounts of diverse data. That data can include images, audio, and speech, powering tasks such as voice and image recognition.
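To make the workload concrete, here is a minimal sketch (not Facebook's code) of the kind of computation deep-learning hardware like Big Sur accelerates: a tiny neural network trained with forward passes, a loss function, and backpropagation. The network, dataset (the XOR problem), and hyperparameters are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: the XOR function, a classic non-linear learning problem.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer of 8 sigmoid units (sizes chosen arbitrarily).
W1 = rng.normal(size=(2, 8))
b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1))
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
losses = []
for _ in range(5000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(float(np.mean((out - y) ** 2)))

    # Backward pass: gradients of mean squared error w.r.t. each layer
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient-descent parameter update
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

Each training step is dominated by dense matrix multiplications, which is exactly the arithmetic that GPUs such as the ones in Big Sur are built to parallelize.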

This is a way of saying, 'Look, here is what we use, here is what we need. If you make hardware better than this, we'll probably buy it from you,'

said Yann LeCun, head of the Facebook Artificial Intelligence Research lab, during a conference call about the announcement.

The server can house up to eight GPUs, each drawing up to 300 watts. While Big Sur can accept any GPU on the market, it was designed specifically around Nvidia's Tesla M40 GPU.

We want to make it a lot easier for AI researchers to share techniques and technologies. As with all hardware systems that are released into the open, it’s our hope that others will be able to work with us to improve it. We believe that this open collaboration helps foster innovation for future designs, putting us all one step closer to building complex AI systems that bring this kind of innovation to our users and, ultimately, help us build a more open and connected world.

Facebook researchers Kevin Lee and Serkan Piantino wrote in the blog post.



