Lightning AI, the startup behind the open source PyTorch Lightning framework, today announced that it raised $40 million in a Series B round led by Coatue with participation from Index Ventures, Bain, and First Minute Capital. CEO William Falcon told TechCrunch that the new money will be used to expand Lightning AI’s 60-person team while supporting the community around PyTorch Lightning development.
Lightning AI, formerly Grid.ai, is the culmination of work that began in 2018 at the New York University Computational Intelligence, Learning, Vision, and Robotics (NYU CILVR) Lab and Facebook AI Research (now Meta AI Research). Falcon started developing PyTorch Lightning as an undergrad at Columbia in 2015 and open-sourced it while working on his Ph.D. at NYU and Facebook AI Research; according to him, the project quickly gained traction. In 2019, he founded Lightning AI with Luis Capelo, the former head of data products at Forbes.
“[W]e realized that the biggest challenge holding back AI adoption at scale was fragmentation of the AI ecosystem,” Falcon said. “I first noticed the impact of the fragmented AI ecosystem back in 2019. Because AI adoption at scale was and still is so nascent, every few months we would discover a ‘missing’ part of the machine learning stack. Why this matters is that every missing piece of the puzzle slows down the overall pace of AI innovation … Just getting a model to the point where it can be pushed into production takes hundreds if not thousands of developer hours spent purely on infrastructure.”
AI dev framework
With PyTorch Lightning, Falcon sought to decouple R&D workflows from engineering, making AI experiments easier to read and reproduce (in theory). PyTorch Lightning provides a high-level interface for PyTorch, Facebook’s popular deep learning framework, abstracting away the boilerplate code normally required to set up and maintain AI systems.
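For context (this sketch is illustrative, not from the article or Lightning’s docs), the abstraction works roughly like this: the model and its training logic live in a LightningModule, while the Trainer supplies the training loop, device placement, checkpointing, and logging that would otherwise be hand-written. The toy model and random stand-in data below are purely hypothetical.

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl


class LitClassifier(pl.LightningModule):
    """Toy classifier: the module bundles the model with its training logic."""

    def __init__(self):
        super().__init__()
        self.model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = nn.functional.cross_entropy(self.model(x), y)
        self.log("train_loss", loss)
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)


# Random stand-in data so the snippet runs end to end (hypothetical, for illustration).
train_loader = DataLoader(
    TensorDataset(torch.randn(64, 1, 28, 28), torch.randint(0, 10, (64,))),
    batch_size=32,
)

# The Trainer replaces the hand-written loop, hardware handling, and logging code.
trainer = pl.Trainer(max_epochs=1, accelerator="auto")
trainer.fit(LitClassifier(), train_dataloaders=train_loader)
```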
PyTorch Lightning includes a collection of tools relevant to machine learning, like workflow scheduling for distributed computing (i.e., spreading workloads across multiple machines) and infrastructure provisioning and management through code. A gallery of AI apps curated by the Lightning team is available to be used or built upon further, as is a library of components that add capabilities to AI apps, such as extracting data from streaming video.
Apps built using PyTorch Lightning can run on private cloud infrastructure or in on-premises environments. Alternatively, Lightning AI provides a hosting platform for deploying and monitoring apps in the cloud.
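As a rough illustration (a minimal sketch assuming the lightning package’s app API as publicly documented around this launch; the component names here are hypothetical), a Lightning app is composed of Works, independent units of compute, orchestrated by a Flow, and the same script can be run locally or pushed to Lightning AI’s hosted platform.

```python
import lightning as L


class DemoWork(L.LightningWork):
    # A Work is an independent unit of compute; on the hosted platform each
    # Work can be scheduled onto its own machine.
    def run(self):
        print("running a workload")


class RootFlow(L.LightningFlow):
    # The Flow orchestrates Works and defines how the app's pieces interact.
    def __init__(self):
        super().__init__()
        self.demo = DemoWork()

    def run(self):
        self.demo.run()


# Per the docs at the time: `lightning run app app.py` runs this locally,
# while adding `--cloud` deploys it to Lightning AI's hosted platform.
app = L.LightningApp(RootFlow())
```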
“The vision I’ve been pursuing since my time at NYU has always been to build something like an operating system for AI that allowed all the disparate pieces of the AI ecosystem to work together,” Falcon said. “You don’t have to know anything about the internal combustion engine to drive to the grocery store; why should you have to know about [containers], cloud infrastructure, distributed file systems, and fault-tolerant training to simply bring your AI project to life? Current solutions hand you the disparate pieces of a working car and hope that you’ll be able to assemble them into something that you can use to take a drive.”
To this end, Lightning’s app gallery contains ready-to-deploy AI apps designed to perform tasks like diagnosing cancer in pets, running workloads on the cloud, and spinning up cloud AI projects. Falcon sees the Lightning app gallery as a way to launch “multi-cloud, distributed” AI systems at “enterprise scale” and, moreover, as a building block for the “next generation” of applied AI startups.
“We’ve found that many startups launched in the last two years could have been built as Lightning apps,” Falcon said. “We think of ourselves as the Apple App store but for AI. Some of the world’s biggest companies first launched as Apple Apps: we’re doing the same for AI.”
A budding ecosystem
PyTorch Lightning and Lightning apps slot into the category of MLOps software, which supports the AI lifecycle by orchestrating experimentation, model training, deployment into production, and model tracking. There’s high interest in MLOps: in a recent survey, Forrester found that 73% of companies believe MLOps adoption would keep them competitive, while 24% say it would make them an industry leader. Deloitte predicts that MLOps will be a $4 billion market by 2025.
Plenty of alternatives to PyTorch Lightning exist, PyTorch Ignite and Fast.ai among them. But Falcon points to metrics as evidence that Lightning AI’s project has pulled ahead of the pack: to date, PyTorch Lightning has been downloaded four million times and is used by an estimated 10,000 companies in production. In 2019, NeurIPS, one of the world’s largest AI conferences, adopted PyTorch Lightning as a standard for submitting PyTorch code.
“[With PyTorch Lightning,] companies can understand what the systems they build are doing, and even loop in non-engineer team members — from compliance, for example — to help eliminate the business risk of a system that bankrupts the firm or becomes racist on Twitter,” Falcon said. “Companies that build solutions with Lightning AI won’t be locked into any specific cloud provider or hardware vendor … Finally, Lightning AI is powered by the open source AI community, which means companies can leverage the latest and greatest open source tools without spending months integrating any of them.”
Despite its origins in open source, Lightning AI could be perceived as competing with startups like Hugging Face, which also provides AI app hosting services. Companies including Comet, Iterative, Weights & Biases, and InfuseAI offer a comparable mix of paid and free MLOps solutions.
Falcon isn’t concerned about the competition, though. He claims that Lightning AI is on track to become cash-flow positive within the next 36 months; the company primarily makes money through its fully managed PyTorch Lightning product. While he wouldn’t commit to hiring plans, Falcon noted that Lightning AI doubled its team to 60 people in the past year.
“I grew up in an inflation-ravaged Venezuela, during some of the darkest days of the financial markets. As a result, I always err on the side of caution and ensure that we’re well-capitalized for potential long-term drawdowns,” Falcon said. “While many anticipate a recession to shake our economy over the next few years, we’ve maintained our burn low and raised enough capital to provide us with a runway of at least several years.”