Flower Labs is announcing today its $20M Series A to accelerate the mainstream adoption of federated and decentralized AI. This new AI paradigm, led by Flower, challenges conventional GPU-datacenter-driven methods: it can utilize larger amounts of training data and is less dependent on GPUs, while also aligning more closely with upcoming AI regulation. Flower is the leading open-source community and ecosystem for training AI on distributed data, and this round is the largest investment to date in decentralized machine learning alternatives to existing centralized systems. The Series A is led by Felicis; with this new round, investors in Flower Labs include First Spark Ventures, Factorial Capital, Betaworks, Y Combinator, Pioneer Fund, Mozilla Ventures, and notable angels such as Hugging Face CEO Clem Delangue and GitHub co-founder Scott Chacon. The investor group includes many with a strong track record of building and investing in open-source community initiatives like Flower.
Flower is at the forefront of the democratization of federated learning and, more broadly, decentralized AI. Early adopters of the Flower framework include dozens of Fortune 500 and Fortune Global 500 companies, including Samsung and Nokia Bell Labs, alongside technology innovators such as Brave and Banking Circle. Flower Labs is known for having one of the strongest technical teams in the domain, spanning the distributed systems, machine learning, and systems engineering skills necessary to make decentralized AI turnkey and accessible to the wider community.
The Flower Mission
The mission of Flower is to cause a fundamental shift in how AI models are trained and used. The current de facto standard in AI is “centralized training,” which requires large-scale data collection in the cloud. But there are proven decentralized alternatives, like federated learning, that offer a radically different approach. Under federated learning, data remains where it naturally arises: in a company, on a factory line, in a car, or even on a smartphone. AI training computation is then performed on the data at its original location; instead of the data itself being collected, only the results of this computation are transferred from each location. As a result, AI systems can be designed not only to protect data privacy but also with a flexible, decentralized architecture that goes beyond what is possible with a centralized datacenter approach. Using Flower, AI can safely leverage otherwise inaccessible training data (e.g., data spread across millions of corporate desktops). Building AI also becomes easier and even faster, because data no longer needs to be collected and can still be leveraged for training within an organization (or between organizations) while protecting privacy. Finally, Flower enables AI that is compliant with emerging regulations by providing more control over how distributed data is accessed for training, and even over which countries training is performed in.
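To make this pattern concrete, here is a minimal sketch of what one federated participant can look like with the open-source flwr package, using Flower's NumPyClient API. The toy dataset, the simple linear model, the class name, and the server address are illustrative placeholders, not part of any specific deployment; a coordinating server started elsewhere (e.g., with fl.server.start_server) would aggregate the weights each participant returns, for example with federated averaging.

```python
import flwr as fl
import numpy as np

# Toy local dataset: in a real deployment this could be a hospital's records,
# a factory line's sensor logs, or data on a phone. It never leaves this process.
rng = np.random.default_rng(seed=42)
X_local = rng.normal(size=(256, 10))
y_local = X_local @ rng.normal(size=10) + 0.1 * rng.normal(size=256)


class LocalParticipant(fl.client.NumPyClient):
    """One federated participant: raw data stays local, only model weights move."""

    def __init__(self):
        self.weights = np.zeros(10)  # simple linear model for illustration

    def get_parameters(self, config):
        # Share model parameters (NumPy arrays), never the underlying data.
        return [self.weights]

    def fit(self, parameters, config):
        # Receive the global model, run a few local gradient steps, return the update.
        (self.weights,) = parameters
        for _ in range(10):
            grad = 2 * X_local.T @ (X_local @ self.weights - y_local) / len(y_local)
            self.weights = self.weights - 0.01 * grad
        return [self.weights], len(y_local), {}

    def evaluate(self, parameters, config):
        # Evaluate the global model on local data and report only the metrics.
        (self.weights,) = parameters
        loss = float(np.mean((X_local @ self.weights - y_local) ** 2))
        return loss, len(y_local), {"mse": loss}


if __name__ == "__main__":
    # Connect to a coordinating Flower server; the address is a placeholder.
    fl.client.start_numpy_client(
        server_address="127.0.0.1:8080", client=LocalParticipant()
    )
```

Every data holder runs a client like this against the same server, so the training signal from all locations flows into one shared model while the data itself stays put.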
More than 1000 open-source projects are already built on top of the Flower framework, and significant collaborations and code contributions have come from large corporations such as Intel and Bosch Research. By working closely with a community of over 3000 open-source developers, Flower is the leading ecosystem for advances in the rapidly evolving field of federated and decentralized machine learning. Top universities, including MIT, Stanford, Harvard, Berkeley, Oxford, and Cambridge, have research labs using Flower, and many contribute their implementations back to the Flower community for broader usage. This virtuous cycle was further reinforced by the Flower Summer of Reproducibility initiative, which distributed up to $100k to groups and individuals contributing new solutions to the platform. In March, members of the Flower community will converge on London for the annual Flower AI Summit, where hundreds of researchers and developers will discuss the latest advances in various forms of decentralized AI.
“Flower’s novel approach to federated machine learning will make model training more secure, safer, and friendly to enterprises of all sizes,” says Niki Pezeshki, General Partner at Felicis. “Top universities and multi-national companies already use Flower’s open source technology, and the team is committed to making federated learning accessible and efficient for a wide range of users and applications. We’re confident that Flower will play a pivotal role in shaping the future of AI, and we’re proud to be part of this journey.”
Flower has progressed quickly to this Series A, only nine months after raising a $3.6M pre-seed round led by First Spark Ventures upon completing the Y Combinator Winter 2023 batch. This new round of funding will allow Flower to build a platform that works alongside the open-source framework to further simplify federated AI solutions. Furthermore, large language models (LLMs), and generative AI more broadly, represent new opportunities for federated deployments; while the FedGPT technology recently developed by Flower already offers a range of enabling solutions, further investment will enable a faster rollout to a wider set of use cases.
Train Different
The aspiration for Flower is to change the way the world approaches AI. By simplifying the use of decentralized technologies like federated learning, Flower unlocks a range of advantages over centralized alternatives. Chief among these is the safe access it provides to large volumes of distributed data, such as the data present in hospitals, corporations, production lines, cars, and phones. Such data remains relatively untapped and will be a catalyst for progress in a range of AI application domains, such as healthcare, manufacturing, finance, and automotive. In this next step for AI, Flower is poised to be the critical open-source framework and ecosystem as AI software evolves to support this new generation of decentralized systems.
– Daniel, Taner and Nic