Dec. 18th, 2024
The DeAI manifesto, introduced during a live event by DFINITY, emphasizes the importance of decentralized AI in creating a more open, secure, and accessible future. The manifesto’s seven core principles focus on safety, security, and verifiability, promoting a decentralized, responsible approach to AI.
On Monday the 16th of December, we tuned in to a live event on the DFINITY YouTube channel where the DeAI manifesto was introduced. The event focused on the importance of decentralized AI and outlined the steps the community is taking toward creating a more open, accessible, and secure future for artificial intelligence.
The event featured four hosts, who are just a small part of the broader community driving the DeAI manifesto forward. You can view the full list of participants on the manifesto page.
What is Onicai (On Internet Computer AI)?
Onicai is a platform that integrates advanced AI tools to help users with tasks such as content generation, productivity enhancement, and workflow streamlining. Its interface simplifies the process of using AI in your work, making it easier to foster creativity and tackle complex challenges.
The Seven Core Principles of the DeAI Manifesto:
- DeAI Is Safe AI
- DeAI Is Self-Sovereign AI
- DeAI Is Secure AI
- DeAI Is Accessible AI
- DeAI Is Participatory AI
- DeAI Is Responsible AI
- DeAI Is Verifiable AI
In August 2023, the DFINITY Foundation made a major leap by integrating Large Language Models (LLMs) with the Internet Computer. A milestone achievement was the deployment of llama.cpp, an open-source LLM inference engine, within an ICP canister. This demonstrated the feasibility of running sophisticated AI models directly on the blockchain. By integrating LLMs with ICP, decentralized applications can harness advanced AI capabilities without relying on centralized cloud services. Source: DFINITY forum
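To make the idea of an LLM running inside a canister more concrete, here is a minimal, purely illustrative Python sketch of the kind of interface such a canister might expose. The class and method names (LlmCanister, generate, max_tokens) are assumptions for illustration only, not the actual API of the llama.cpp canister.

```python
# Hypothetical sketch of an LLM-in-a-canister interface (not the real llama.cpp canister API).

from dataclasses import dataclass, field


@dataclass
class LlmCanister:
    """Simulates a canister that wraps an on-chain inference engine."""
    model_id: str
    # Stands in for canister stable memory holding the model weights, loaded at install time.
    weights: bytes = field(default=b"", repr=False)

    def generate(self, prompt: str, max_tokens: int = 64) -> str:
        """Stands in for an update call: on ICP this would run deterministically in the
        canister's WebAssembly sandbox and consume cycles for the instructions executed."""
        # Placeholder for the real token-by-token decoding loop.
        return f"[{self.model_id}] response to: {prompt[:40]}..."


if __name__ == "__main__":
    canister = LlmCanister(model_id="llama-demo")
    print(canister.generate("What is decentralized AI?"))
```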
In addition to LLM integration, the DFINITY Foundation has launched a $5 million grant program to foster innovation in decentralized AI applications. This initiative supports the development of AI solutions that align with the principles of the DeAI manifesto and helps grow the broader AI ecosystem on the Internet Computer blockchain.
Source: PR Newswire
DeAI (decentralized AI) stands apart from traditional AI models in the crypto space in several fundamental ways, particularly in its computation mechanisms and its paradigm of control, ownership, and interaction. Let’s explore the distinctions:
Core Philosophy: Decentralization vs. Centralization
Traditional AI Models in Crypto
These often rely on centralized AI computation, even when incorporating decentralized components like tokenized economies or distributed GPU networks. For example:
- Rendering/Compute Sharing Projects: Platforms like Render or Bittensor focus on pooling GPU or computational resources. However, the AI models themselves remain centrally trained and owned, with distributed resources primarily used to enhance efficiency or scalability.
- Token-Ledger AI Plugins: In configurations where tokens are integrated with AI (e.g., ChatGPT plugins), computation generally remains centralized—conducted by proprietary models like those of OpenAI—with crypto mechanisms serving as a payment layer.
DeAI
True DeAI emphasizes decentralizing the training, ownership, and governance of AI models. This approach enables models to be collaboratively built, trained, and owned by the community. It leverages blockchain for verifiability and uses decentralized computation networks for execution.
The Impact of Decentralized AI (DeAI) on the Cycle Burn Rate on ICP
DeAI’s integration into the Internet Computer Protocol (ICP) could significantly influence the cycle burn rate, primarily through the capability to perform AI inference directly within canister smart contracts. Here’s a closer look:
Inference Intensity
AI inference, which involves running trained models on new data in real time, is highly compute-intensive. By enabling inference within canister smart contracts, ICP facilitates:
- Faster, more seamless AI interactions.
- Decentralized hosting of AI services that previously relied on external APIs or centralized infrastructure.
As developers increasingly deploy AI-powered applications (e.g., chatbots, recommendation systems, image recognition) on ICP, the demand for computational resources—and therefore cycle consumption—will rise substantially.
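As a rough illustration of why inference drives cycle consumption, the sketch below multiplies an assumed per-instruction fee by an assumed instruction budget per generated token. Every constant here is a placeholder chosen for the example, not official ICP pricing.

```python
# Back-of-envelope sketch of how on-chain inference could drive cycle burn.
# All numbers below are illustrative assumptions, not official ICP pricing.

CYCLES_PER_INSTRUCTION = 0.4         # assumed execution fee per Wasm instruction
INSTRUCTIONS_PER_TOKEN = 50_000_000  # assumed cost to decode one token of a small model
TOKENS_PER_RESPONSE = 200            # assumed average response length
REQUESTS_PER_DAY = 10_000            # assumed traffic for one AI dapp


def cycles_per_inference(tokens: int = TOKENS_PER_RESPONSE) -> float:
    """Cycles consumed by a single inference call under the assumptions above."""
    return tokens * INSTRUCTIONS_PER_TOKEN * CYCLES_PER_INSTRUCTION


daily_burn = cycles_per_inference() * REQUESTS_PER_DAY
print(f"cycles per inference : {cycles_per_inference():,.0f}")
print(f"cycles burned per day: {daily_burn:,.0f}")
```

Under these assumptions a single AI dapp would burn on the order of 40 trillion cycles per day; real figures depend entirely on model size, traffic, and actual subnet pricing.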
Increased Developer Activity
Running DeAI workloads natively on ICP is likely to attract AI developers and blockchain enthusiasts eager to integrate inference with decentralized infrastructure.
- More developers deploying and running AI systems → Higher demand for canisters consuming cycles.
This surge in activity increases the cycle burn rate and strengthens ICP’s tokenomics, since developers must purchase ICP and convert it into cycles.
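To see how that feeds back into token demand, the sketch below converts a daily cycle burn into the amount of ICP a developer would need to convert. The 1 XDR to 1 trillion cycles rate reflects ICP’s fixed conversion, while the ICP/XDR price and the daily burn figure are placeholder assumptions.

```python
# Rough sketch of the ICP -> cycles conversion behind the burn mechanism.
# The 1 XDR -> 1 trillion cycles rate is ICP's fixed conversion; the price is a placeholder.

CYCLES_PER_XDR = 1_000_000_000_000   # fixed conversion: 1 XDR worth of ICP -> 1T cycles
ICP_PRICE_XDR = 7.5                  # assumed ICP price in XDR (placeholder)


def icp_needed(cycles: float) -> float:
    """ICP a developer must convert to cover a given cycle consumption."""
    xdr_needed = cycles / CYCLES_PER_XDR
    return xdr_needed / ICP_PRICE_XDR


daily_burn = 4e13  # 40 trillion cycles/day, the illustrative figure from the previous sketch
print(f"ICP converted per day: {icp_needed(daily_burn):.2f}")
```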
Exponential Growth Potential
The adoption of DeAI on ICP could drive exponential growth in the cycle burn rate. Because each inference operation is compute-intensive, every interaction consumes more resources, which in turn burns more cycles and increases demand for ICP.
Caffeine.ai: Simplifying AI Deployment
Another hot topic raised during the live event was Caffeine.ai. Patrick described it as accessible AI: a single click to get your own personal AI solution.
Caffeine.ai potentially allows developers to quickly deploy large language models (LLMs) in a decentralized environment. Its approach keeps both state and computation within a single platform, improving security and efficiency. This contrasts with protocols like Render, which use a distributed network of GPUs and CPUs to perform the actual computation while storing data on external cloud services or other off-chain infrastructure.
Decentralized AI offers several advantages. One example is security. In traditional AI platforms, sensitive data like strategic business decisions or employee information is often stored in centralized cloud servers, creating potential targets for cyberattacks. Decentralized AI, on the other hand, distributes data across multiple canisters, making it much harder for malicious actors to compromise the system.
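The sketch below illustrates that idea in miniature: records are deterministically routed to different canisters, so compromising any single one exposes only a slice of the data. The canister IDs and the sharding rule are hypothetical, chosen only to show the pattern.

```python
# Minimal sketch: data spread across many canisters has no single point of compromise.
# Canister IDs and the sharding rule are made up for illustration.

import hashlib

CANISTERS = ["canister-a", "canister-b", "canister-c", "canister-d"]


def shard_for(record_key: str) -> str:
    """Deterministically route a record to one of several canisters."""
    digest = hashlib.sha256(record_key.encode()).digest()
    return CANISTERS[digest[0] % len(CANISTERS)]


storage = {cid: {} for cid in CANISTERS}
for key, value in {"employee:42": "salary data", "policy:7": "internal memo"}.items():
    storage[shard_for(key)][key] = value

# Compromising one canister exposes only the records routed to it.
print({cid: list(records) for cid, records in storage.items()})
```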
The concept of Proof of Inference is also crucial for the cost structure of decentralized AI. Proof of inference ensures that the results of AI computations are accurate, verifiable, and trustworthy: it confirms that a model’s output is valid and consistent with the inputs used to generate it. The proof is stored on the blockchain, making it transparent and tamper-proof, so in decentralized systems no single party can manipulate an AI output after the fact.
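One simple way to picture such a proof is a hash that binds a model version, an input, and an output into a record anyone can recheck. The sketch below is only a conceptual illustration; real DeAI systems may rely on replicated re-execution or cryptographic proofs rather than this exact scheme, and all names here are made up.

```python
# Conceptual sketch of a tamper-evident proof-of-inference record.
# Real protocols may use re-execution or zero-knowledge proofs; this shows only the hash-and-log idea.

import hashlib
import json


def inference_proof(model_hash: str, prompt: str, output: str) -> dict:
    """Bind a model version, an input, and an output into one verifiable record."""
    payload = json.dumps(
        {"model": model_hash, "prompt": prompt, "output": output},
        sort_keys=True,
    )
    return {"model": model_hash, "prompt": prompt, "output": output,
            "proof": hashlib.sha256(payload.encode()).hexdigest()}


def verify(record: dict) -> bool:
    """Anyone can recompute the hash; a mismatch means the record was tampered with."""
    expected = inference_proof(record["model"], record["prompt"], record["output"])
    return expected["proof"] == record["proof"]


record = inference_proof("sha256:abc123", "Summarize Q3 sales", "Sales grew 12%...")
assert verify(record)
print(record["proof"])
```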
Another important topic discussed during the event was safety. Kyle cited the example of centralized AI systems, such as Coca-Cola utilizing a proprietary GPT-2 model hosted on Google’s cloud. In a centralized system, data is often stored and processed in one location, making it more vulnerable to breaches. In contrast, a decentralized system distributes data across multiple nodes, offering a much higher level of security, particularly for sensitive information such as proprietary business strategies.
For instance, if someone were to hack Coca-Cola’s model, a third party could gain access to employee data, customer information, internal policies, and potentially industrial secrets.
Decentralized AI presents a solution to this problem. Although it requires time, effort, patience, and investment, the benefits of implementing such a system are significant.
The DeAI manifesto presented is a step toward reshaping the future of artificial intelligence. By emphasizing the principles of safety, security, accessibility, and decentralization, we can ensure that AI is developed responsibly, empowering individuals and organizations alike. If you support the vision for a decentralized AI future, we encourage you to sign the manifesto and join the movement.