The two companies said Tuesday that an unspecified number of engineers are collaborating to make Facebook’s open source machine learning framework, PyTorch, work with Google’s custom machine learning chips, dubbed Tensor Processing Units, or TPUs. The collaboration marks one of the rare instances of the technology rivals working together on a joint project.
“Today, we’re pleased to announce that engineers on Google’s TPU team are actively collaborating with core PyTorch developers to connect PyTorch to Cloud TPUs,” Google Cloud director of product management Rajen Sheth wrote in a blog post. “The long-term goal is to enable everyone to enjoy the simplicity and flexibility of PyTorch while benefiting from the performance, scalability, and cost-efficiency of Cloud TPUs.”
Facebook product manager for artificial intelligence Joseph Spisak said in a separate blog post that “Engineers on Google’s Cloud TPU team are in active collaboration with our PyTorch team to enable support for PyTorch 1.0 models on this custom hardware.”
Google debuted its TPUs in 2016 at its annual developer conference, pitching them as a more efficient way for companies and researchers to power their machine learning projects. The search giant sells access to its TPUs through its cloud computing business rather than selling the chips directly to customers, unlike Nvidia, whose graphics processing units, or GPUs, are popular with researchers working on deep learning projects.
Artificial intelligence technologies like deep learning have grown in popularity in recent years with tech giants like Google and Facebook, which use them to create software that can automatically perform tasks such as recognizing objects in photos.
As more businesses explore machine learning, companies like Google and Facebook have created their own AI software frameworks, essentially coding tools intended to make it easier for developers to build machine learning-powered software. These companies have also released the frameworks for free as open source in order to popularize them with coders.
For the past few years, Google has been courting developers with its TensorFlow framework as the preferred coding tool for AI projects, and it designed its TPUs to work best with TensorFlow. That Google is willing to make its TPUs work with Facebook’s PyTorch software shows the company wants to support more than its own AI framework, potentially winning cloud computing business from customers and researchers who use competing frameworks.
“Data scientists and machine learning engineers have a wide variety of open source tools to choose from today when it comes to developing intelligent systems,” said Information Services Group principal analyst Blair Hanley Frank. “This announcement is a critical step to help ensure more people have access to the best hardware and software capabilities to create AI models.”
Frank said that he expects “more collaboration like this to crop up in the AI market.”
“Expanding framework support can help cloud providers like AWS, Google and Microsoft drive additional usage of their platforms,” Frank said. “That means it makes sense for them to support as broad a set of development tools as possible, to try and attract as many customers as they can.”
Besides Google, Facebook also said that Amazon and Microsoft are “deepening their investment” in its PyTorch software.