During today’s Web3 Summit in Berlin, SubQuery made a significant announcement, introducing its latest breakthrough: decentralized AI inference hosting. In a live demonstration, James Bayly, the COO of SubQuery, showcased the seamless operation of the latest Llama model on SubQuery’s internal test network. The test network, run by a fully decentralized set of Node Operators, proved efficient and reliable.
SubQuery has a clear vision of empowering developers to shape the future by embracing decentralization. The company aims to enable the next generation of Web3 applications for a broad user base, with decentralization as its guiding principle.
The SubQuery Network serves as a sophisticated infrastructure layer that supports this vision. It currently supports decentralized data indexers and RPCs, essential components for developers creating decentralized applications (dApps). SubQuery has established itself as a trustworthy alternative to centralized services, providing an inclusive network that allows anyone to join as a node operator or delegator.
The growing recognition of AI’s impact on various industries, including Web3, is becoming more evident. SubQuery has been closely following these developments and diligently working behind the scenes to integrate AI capabilities into its decentralized platform. “The Web3 Summit in Berlin provides an ideal platform for us to unveil this new capability and showcase it in action,” stated James Bayly.
SubQuery is focused on AI inference, the process of using pre-trained models to make predictions on new data, rather than on model training. “While there are commercial services that offer inference hosting for custom models, few exist within the Web3 space,” James explained. “Our decentralized network is ideally suited for reliable, long-term AI model hosting.”
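To make the distinction concrete, the sketch below illustrates what inference means in the sense described above: a pre-trained model’s weights are loaded and used as-is to score new data, with no training loop and no weight updates. The weights and the tiny logistic model are hypothetical stand-ins, not anything from SubQuery or Llama.

```python
import math

# Hypothetical pre-trained parameters (in practice these would be loaded
# from a model file produced by a separate, earlier training run).
WEIGHTS = [0.8, -0.5, 0.3]
BIAS = 0.1

def infer(features):
    """Inference is a forward pass only: weights are read, never updated."""
    z = BIAS + sum(w * x for w, x in zip(WEIGHTS, features))
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid -> probability in (0, 1)

# New, unseen input is scored against the frozen model.
prediction = infer([1.0, 2.0, 0.5])
```

Hosting inference therefore means serving many such forward passes reliably, which is a different operational problem from training the model in the first place.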
Currently, the market for AI inference is dominated by large centralized cloud providers who charge high fees and often use user data to improve their proprietary models. “Providers like OpenAI and Google Cloud AI are not only expensive but also leverage your data to enhance their closed-source offerings,” James noted. SubQuery is committed to providing an affordable, open-source alternative for hosting production AI models. “Our goal is to make it possible for users to deploy a production-ready LLM model through our network in just 10 minutes,” he added.
“Relying on closed-source AI models risks consolidating power in the hands of a few large corporations, creating a cycle that perpetuates their dominance,” James warned. “By running AI inference on a decentralized network, we ensure that no single entity can control or exploit user data. Prompts are distributed across hundreds of node operators, ensuring privacy and supporting an open-source ecosystem.”
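The article does not describe SubQuery’s actual routing protocol, but the idea of spreading prompts across many independent operators can be sketched with a toy round-robin router. All names here (operator IDs, the `route` helper) are hypothetical illustrations of the principle that no single operator sees every prompt.

```python
import itertools

# Hypothetical pool of independent node operators.
operators = [f"operator-{i}" for i in range(5)]

# Track which prompts each operator has seen.
assignment = {op: [] for op in operators}
_rotation = itertools.cycle(operators)

def route(prompt):
    """Round-robin routing: each prompt goes to the next operator in turn,
    so the full prompt stream is never concentrated at one party."""
    op = next(_rotation)
    assignment[op].append(prompt)
    return op

for n in range(10):
    route(f"prompt-{n}")
```

A production network would use stake-weighted or reputation-based selection rather than plain round-robin, but the privacy property is the same: each operator handles only a fraction of the overall traffic.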
The SubQuery Network aims to offer cutting-edge hosting for the most up-to-date open-source AI models, making scalable and easily accessible AI services available for Web3. By embracing a community-driven approach, SubQuery seeks to empower a diverse network of independent node operators, enabling decentralized AI inference at scale.