Business interest in AI continues to rise. Nearly two-thirds of companies plan to increase or maintain spending on artificial intelligence and machine learning this year, a recent study found. But these companies often face obstacles in deploying artificial intelligence into production.
Inspired by the challenge, Salman Avestimehr, founding director of the USC-Amazon Center for Trusted Machine Learning, co-founded FedML, a startup that helps companies train, deploy, monitor, and improve AI models in the cloud or at the edge using federated machine learning. FedML raised $11.5 million in seed funding at a $56.5 million valuation, led by Camford Capital with participation from Road Capital and Finality Capital.
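To make the underlying idea concrete: federated machine learning trains a shared model across clients that each keep their own data private, exchanging only model weights. The sketch below is a minimal, generic illustration of federated averaging (FedAvg) on a toy regression task; it is not FedML's actual API, and all function and variable names here are invented for illustration.

```python
# Minimal federated averaging (FedAvg) sketch: each client trains locally
# on its private data; a server averages the resulting weights. Raw data
# never leaves the clients. Generic illustration only, not FedML's API.
import numpy as np

rng = np.random.default_rng(0)

def local_step(w, X, y, lr=0.1, epochs=20):
    """One client's local training: gradient descent on its private data."""
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

# Three clients, each holding private samples of the same underlying task
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + 0.01 * rng.normal(size=50)
    clients.append((X, y))

w_global = np.zeros(2)
for _ in range(10):  # communication rounds
    # Each client starts from the current global weights and trains locally
    local_ws = [local_step(w_global, X, y) for X, y in clients]
    # Server aggregates by averaging the weights (equal weighting here)
    w_global = np.mean(local_ws, axis=0)

print(w_global)  # converges toward true_w without pooling any raw data
```

In a production system the aggregation would be weighted by each client's sample count and secured against inspection of individual updates; this toy version only shows the train-locally, average-centrally loop.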
“Many businesses are eager to train or fine-tune custom AI models on company-specific or industry data so they can use AI to address a range of business needs,” Avestimehr told TechCrunch in an email interview. “Unfortunately, custom AI models are cost-prohibitive to build and maintain due to high data, cloud infrastructure, and engineering costs. Additionally, the proprietary data used to train custom AI models is often sensitive, regulated or isolated.”
Avestimehr claims that FedML overcomes these barriers by offering a “collaborative” AI platform that allows companies and developers to work together on AI tasks by sharing data, models, and computing resources.
FedML can run any number of custom AI models or models from the open source community. Using the platform, customers can create a group of collaborators and automatically sync AI applications between their devices, such as PCs. Collaborators can add devices for AI model training, such as servers or even mobile devices, and track training progress in real time.
More recently, FedML introduced FedLLM, a training pipeline for building “domain-specific” large language models (LLMs) on proprietary data, similar to OpenAI’s GPT-4. Compatible with popular LLM libraries like Hugging Face and Microsoft DeepSpeed, FedLLM is designed to increase the speed of custom AI development while preserving security and privacy, Avestimehr said. (To be clear, the jury is still out on whether it actually achieved that goal.)
In this way, FedML isn’t much different from other MLOps platforms — “MLOps” refers to tools used to simplify the process of putting AI models into production, then maintaining and monitoring them. These include Galileo and Arize, as well as Seldon, Qwak, and Comet (to name a few). Incumbents like AWS, Microsoft, and Google Cloud also offer MLOps tools in some form (see: SageMaker, Azure Machine Learning, etc.).
But FedML’s ambitions go beyond developing tools for artificial intelligence and machine learning models.
According to Avestimehr, the goal is to build a “community” of CPU and GPU resources to host and serve models once they are ready for deployment. Details have yet to be determined, but FedML intends to incentivize users to contribute computation to the platform through tokens or other types of compensation.
Distributed, decentralized computing for AI model serving is not a new idea — Gensyn, Run.AI, and Petals have all experimented with it. Nonetheless, Avestimehr believes that by combining this computing paradigm with an MLOps suite, FedML can achieve greater impact and success.
“FedML enables developers and businesses to build large-scale, proprietary and private LLMs for custom AI models at lower cost,” said Avestimehr. “What sets FedML apart is its ability to train, deploy, monitor, and improve ML models anywhere, and to collaborate on combining data, models, and computations, dramatically reducing costs and time-to-market.”
Today, FedML has 17 employees and about 10 paying customers, including a “Tier 1” auto supplier, with total funding of $13.5 million (including the new round). Avestimehr claims the platform is used by more than 3,000 users worldwide and has performed more than 8,500 training jobs on more than 10,000 devices.
“For data or technology decision makers, FedML makes custom, affordable AI and large language models a reality,” said Avestimehr. “Building custom models becomes an easy best practice to implement thanks to the foundation of federated learning techniques, an MLOps platform, and collaborative AI tools that help developers train, serve, and observe custom models.”