Microsoft Partners with Elon Musk's OpenAI
This week, Microsoft announced a partnership with OpenAI, the nonprofit research organization co-founded by futurist Elon Musk. The goal? Nothing less than the democratization of AI.
“We’re excited to announce a new partnership with OpenAI focused on making significant contributions to advance the field of AI, while also furthering our mutual goal of using AI to tackle some of the world’s most challenging problems,” Microsoft Executive Vice President Harry Shum says. “We’re also excited that OpenAI chose Microsoft Azure as their primary cloud platform.”
OpenAI’s use of Azure will help the organization advance its research and create new tools and technologies that are only possible with the cloud, a Microsoft representative told me. And while the two firms are aligned generally, Microsoft’s recent embrace of open technologies hasn’t hurt.
The firm has outlined three Azure services that will play a key role in the partnership. These are:
Azure Bot Service. Available now in preview, the Azure Bot Service helps developers accelerate the development of bots using the Microsoft Bot Framework and then deploy and manage them in a serverless environment on Azure. Microsoft notes that over 50,000 developers are already creating bots with the Microsoft Bot Framework for companies as diverse as AllRecipes.com, Lowe’s, and Uber.
Azure Functions. Now generally available, Azure Functions can maximize development agility and operational efficiency for nearly any app or service, and do so at lower cost. Azure Bot Service also runs on Azure Functions, which allows the bots to scale on demand, Microsoft says.
Azure N-series virtual machines. Coming in December, the Azure N-series VMs are designed for the most intensive compute workloads, including deep learning, simulations, rendering and the training of neural networks.
“We’ve made major strides in artificial intelligence just in the past five years, achieving milestones many people who have devoted their lives to the field wouldn’t have thought possible,” Mr. Shum adds. “Now, we have the opportunity to help our partners and customers use these breakthroughs to achieve their goals.”
For its part, OpenAI notes that moving to Azure is helping it scale further and faster than before.
“Azure has impressed us by building hardware configurations optimized for deep learning,” OpenAI explains. “They offer K80 GPUs with InfiniBand interconnects at scale. We’re also excited by their roadmap, which should soon bring Pascal GPUs onto their cloud.”
OpenAI promises to share the results of its research with everyone, and to release software that will allow anyone to run large-scale AI projects in the cloud. It will also be working with Microsoft to help improve Azure going forward as the firms’ collective understanding of AI evolves. “We’re looking forward to accelerating the AI community through this partnership,” OpenAI says.