At first glance, AI and Web3 appear to be independent technologies, each built on fundamentally different principles and serving different functions. A closer look, however, reveals that the two can balance each other's trade-offs, and that their unique strengths can complement and enhance one another. Balaji Srinivasan articulated this idea of complementary capabilities at the SuperAI Conference, prompting a detailed comparison of how the two technologies interact.
Crypto emerged from the decentralized efforts of anonymous cypherpunks and has evolved over the past decade through the collaboration of countless independent entities. Artificial intelligence, by contrast, has developed top-down, dominated by a handful of tech giants. These companies dictate the industry's pace and dynamics, and the barrier to entry is set more by resource intensity than by technical complexity.
The two technologies are also fundamentally different in nature. Crypto is essentially a deterministic system that produces immutable, reproducible results, as seen in the predictability of hash functions or zero-knowledge proofs. This contrasts sharply with the probabilistic and often unpredictable nature of artificial intelligence.
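To make the contrast concrete, here is a minimal Python sketch (the toy next-token distribution is purely illustrative): hashing the same input always yields the same digest, while sampling from a generative model's output distribution generally does not.

```python
import hashlib
import random

# Deterministic: the same input always produces the same digest,
# so any party can independently verify the result.
digest_1 = hashlib.sha256(b"AI x Web3").hexdigest()
digest_2 = hashlib.sha256(b"AI x Web3").hexdigest()
assert digest_1 == digest_2  # always holds

# Probabilistic: sampling the "next token" from a toy distribution
# can give a different output on every run.
next_token_probs = {"the": 0.5, "a": 0.3, "an": 0.2}  # illustrative only
tokens = list(next_token_probs)
weights = list(next_token_probs.values())
sample_1 = random.choices(tokens, weights=weights)[0]
sample_2 = random.choices(tokens, weights=weights)[0]
# sample_1 and sample_2 may differ -- outputs are not reproducible by default.
```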
Similarly, cryptography excels at verification, ensuring the authenticity and security of transactions and establishing trustless processes and systems, whereas artificial intelligence focuses on generation, creating rich digital content. In the process of producing this digital abundance, however, verifying the source of content and preventing identity theft become real challenges.
Fortunately, crypto offers the contrasting concept of digital scarcity. It provides mature tools that can be extended to AI to guarantee the provenance of content and to prevent identity theft.
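As a simple illustration of how such tools extend to AI-generated content, the sketch below signs a piece of content with an Ed25519 key so that anyone holding the published public key can check its source. It assumes the Python `cryptography` package; the key handling and payload are illustrative, not any particular project's scheme.

```python
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

creator_key = Ed25519PrivateKey.generate()   # held privately by the content creator
public_key = creator_key.public_key()        # published, e.g. on-chain

content = b"an AI-generated article or image payload"
signature = creator_key.sign(content)        # attached alongside the content

# Anyone can verify the source without trusting an intermediary.
try:
    public_key.verify(signature, content)
    print("content provenance verified")
except InvalidSignature:
    print("content does not come from the claimed creator")
```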
One notable strength of crypto is its ability to attract large amounts of hardware and capital into coordinated networks that serve specific goals. This is particularly beneficial for resource-intensive AI: mobilizing underutilized resources to provide cheaper computing power can significantly improve AI's efficiency.
By comparing the two technologies, we can not only appreciate their individual contributions but also see how, together, they can open new paths for technology and the economy. Each can compensate for the other's shortcomings, creating a more integrated and innovative future. In this post, we explore the emerging AI x Web3 industry landscape, focusing on several verticals at the intersection of these technologies.
Source: IOSG Ventures
2.1 Computing Networks
The industry landscape opens with computing networks, which aim to address the constrained supply of GPUs and to reduce computing costs in different ways. Three approaches are worth noting:
Non-uniform GPU interoperability: This is a highly ambitious attempt, carrying significant technical risk and uncertainty, but if it succeeds it could create substantial scale and impact by making all computing resources interchangeable. The idea is to build the compilers and other prerequisites that let any hardware resource be plugged in on the supply side, while on the demand side the non-uniformity of that hardware is fully abstracted away, so that a computing request can be routed to any resource in the network (a simplified sketch of this abstraction appears after this list). If this vision succeeds, it would reduce the current dependence on CUDA software, which today completely dominates among AI developers. Beyond the high technical risk, many experts remain deeply skeptical of this approach's feasibility.
High-performance GPU aggregation: Integrating the world’s most popular GPUs into a distributed and permissionless network without worrying about interoperability issues between non-uniform GPU resources.
Consumer-grade GPU aggregation: Targeting the aggregation of lower-performance GPUs that may be available in consumer devices, which are the most underutilized resources on the supply side. It caters to those who are willing to sacrifice performance and speed for cheaper and longer training processes.
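To illustrate the abstraction described in the first item above, here is a deliberately simplified Python sketch of a router that hides hardware non-uniformity from the demand side: heterogeneous backends register behind one uniform interface, and requests are routed to whatever resource can serve them. The backend names, the scoring rule, and the interface are assumptions for illustration, not any project's actual design.

```python
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class ComputeBackend:
    name: str                       # e.g. "nvidia-h100", "amd-mi300", "apple-m2"
    free_memory_gb: float
    run: Callable[[bytes], bytes]   # compiled entry point for this hardware


class ComputeRouter:
    """Demand side sees one pool; supply-side heterogeneity is abstracted away."""

    def __init__(self) -> None:
        self.backends: List[ComputeBackend] = []

    def register(self, backend: ComputeBackend) -> None:
        self.backends.append(backend)

    def submit(self, payload: bytes, memory_gb: float) -> bytes:
        # Route to any backend that satisfies the request, regardless of vendor.
        eligible = [b for b in self.backends if b.free_memory_gb >= memory_gb]
        if not eligible:
            raise RuntimeError("no backend can serve this request")
        chosen = max(eligible, key=lambda b: b.free_memory_gb)
        return chosen.run(payload)


router = ComputeRouter()
router.register(ComputeBackend("nvidia-h100", 80, lambda p: b"result-from-cuda"))
router.register(ComputeBackend("amd-mi300", 192, lambda p: b"result-from-rocm"))
print(router.submit(b"matmul-job", memory_gb=60))  # the caller never names a vendor
```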
2.2 Training and Inference
Computing networks serve two main functions: training and inference. Demand for these networks comes from both Web 2.0 and Web 3.0 projects. In the Web 3.0 space, projects such as Bittensor use computing resources for model fine-tuning. On the inference side, Web 3.0 projects emphasize the verifiability of the process. This emphasis has given rise to verifiable inference as a market vertical, where projects explore how to integrate AI inference into smart contracts while preserving decentralization principles.
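The following is a minimal sketch of the verifiable-inference flow: an off-chain worker runs the model and publishes a commitment binding the model, input, and output, which an on-chain verifier checks before a contract acts on the result. In a real system the hash commitment would be replaced by a zero-knowledge proof or an optimistic fraud-proof game that actually attests the computation; all names here are illustrative.

```python
import hashlib
import json


def run_inference(model_id: str, prompt: str) -> str:
    # Stand-in for an actual model call executed on the compute network.
    return f"answer-to:{prompt}"


def commit(model_id: str, prompt: str, output: str) -> str:
    # Binds the (model, input, output) triple; a real deployment would use a
    # ZK proof or fraud-proof game rather than a plain hash commitment.
    payload = json.dumps({"model": model_id, "input": prompt, "output": output})
    return hashlib.sha256(payload.encode()).hexdigest()


# Off-chain worker
prompt = "What is the network's current demand?"
output = run_inference("model-v1", prompt)
proof = commit("model-v1", prompt, output)

# On-chain verifier (simulated): recompute the commitment and compare
# before the smart contract acts on the claimed result.
assert commit("model-v1", prompt, output) == proof
print("inference accepted by the contract:", output)
```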
2.3 Intelligent Agent Platforms
Next come intelligent agent platforms. The landscape outlines the core problems that startups in this category need to address:
Agent interoperability, discovery, and communication: agents should be able to discover and communicate with one another.
Agent cluster building and management capabilities: Agents should be able to form clusters and manage other agents.
Ownership of and marketplaces for AI agents: providing mechanisms for owning and trading AI agents.
These features highlight the importance of flexible and modular systems that can seamlessly integrate into various blockchain and artificial intelligence applications. AI agents have the potential to fundamentally change the way we interact with the internet, and we believe agents will rely on infrastructure to support their operations. We envision AI agents relying on infrastructure in the following ways:
Utilizing distributed crawling networks to access real-time web data.
Using DeFi channels for inter-agent payments.
Requiring economic deposits, not only to penalize misconduct but also to improve agent discoverability (deposits act as an economic signal during discovery); a toy sketch of this mechanism follows the list.
Utilizing consensus to decide which events should lead to slashing.
Open interoperability standards and agent frameworks to support the construction of composable collectives.
Evaluating past performance based on immutable data history and selecting the appropriate collective of agents in real-time.
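As a toy illustration of the deposit and slashing mechanics listed above, the sketch below ranks agents by stake and track record during discovery and cuts the stake of an agent that the network's consensus has judged to have misbehaved. All numbers and names are illustrative assumptions, not any protocol's actual design.

```python
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class Agent:
    agent_id: str
    deposit: float            # economic signal + slashable collateral
    completed_tasks: int = 0  # would be backed by immutable on-chain history


class AgentRegistry:
    def __init__(self) -> None:
        self.agents: Dict[str, Agent] = {}

    def register(self, agent_id: str, deposit: float) -> None:
        self.agents[agent_id] = Agent(agent_id, deposit)

    def discover(self, top_n: int = 3) -> List[Agent]:
        # Rank by deposit and track record: a higher stake is a stronger signal.
        ranked = sorted(self.agents.values(),
                        key=lambda a: (a.deposit, a.completed_tasks),
                        reverse=True)
        return ranked[:top_n]

    def slash(self, agent_id: str, fraction: float) -> None:
        # Invoked only after consensus decides that misconduct occurred.
        self.agents[agent_id].deposit *= (1 - fraction)


registry = AgentRegistry()
registry.register("research-agent", deposit=500.0)
registry.register("trading-agent", deposit=1200.0)
print([a.agent_id for a in registry.discover()])  # trading-agent ranks first
registry.slash("trading-agent", fraction=0.5)     # penalty after misconduct
```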
Source: IOSG Ventures
2.4 Data Layer
In the fusion of AI x Web3, data is a core component. Data is a strategic asset in the AI race and, together with computing resources, constitutes a critical resource. Yet this category is often overlooked, because most of the industry's attention is focused on the compute layer. In reality, Web3 primitives open up many interesting directions of value in data acquisition, chiefly along two high-level directions:
Access to public internet data: This direction aims to build distributed web-crawling networks that can crawl the entire internet, acquiring massive datasets within days or accessing very specific internet data in real time. Crawling a large dataset from the internet, however, places heavy demands on network resources, requiring at least several hundred nodes before the work becomes meaningful. Fortunately, Grass, a distributed crawling-node network, already has more than two million nodes actively sharing internet bandwidth with the goal of crawling the entire internet, which demonstrates the enormous potential of economic incentives for attracting valuable resources.
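To illustrate the idea of pooling bandwidth for crawling, here is a simplified Python sketch in which a coordinator fans URLs out to participating nodes (threads stand in for network nodes). It is an illustration of the concept only, not Grass's actual protocol, and the URLs are placeholders.

```python
import concurrent.futures
import urllib.request

URLS = [
    "https://example.com/",
    "https://example.org/",
    "https://example.net/",
]  # in practice, millions of URLs sharded across millions of nodes


def crawl(url: str) -> tuple[str, int]:
    # Each node contributes its own bandwidth to fetch pages.
    with urllib.request.urlopen(url, timeout=10) as resp:
        return url, len(resp.read())


# The coordinator fans work out to nodes and collects the results.
with concurrent.futures.ThreadPoolExecutor(max_workers=3) as pool:
    for url, size in pool.map(crawl, URLS):
        print(f"{url} -> {size} bytes fetched")
```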
Although Grass provides a level playing field for public data, the challenge of exploiting other potentially valuable data remains, namely access to proprietary datasets. A significant amount of data is still kept protected because of its sensitive nature. Many startups are leveraging cryptographic tools that allow AI developers to build and fine-tune large language models on proprietary datasets while keeping the sensitive information confidential.
Technologies such as federated learning, differential privacy, trusted execution environments, fully homomorphic encryption, and multi-party computation offer different levels of privacy protection with different trade-offs; Bagel's research article (https://blog.bagel.net/p/with-great-data-comes-great-responsibility-d67) provides an excellent overview of them. These technologies not only protect data privacy during machine learning but also enable comprehensive privacy-preserving AI solutions at the compute level.
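As a small, concrete example of one technique named above, the sketch below releases a differentially private mean using the Laplace mechanism, which adds calibrated noise so that no single record can be inferred from the output. The dataset, clipping bounds, and epsilon are illustrative.

```python
import random

salaries = [52_000, 61_000, 58_000, 75_000, 49_000]  # sensitive records


def dp_mean(values, lower, upper, epsilon):
    true_mean = sum(values) / len(values)
    # Sensitivity of the mean when one clipped record changes.
    sensitivity = (upper - lower) / len(values)
    scale = sensitivity / epsilon
    # Laplace noise sampled as the difference of two exponentials with mean `scale`.
    noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
    return true_mean + noise


# Repeated queries give noisy answers centered on the true mean.
print(dp_mean(salaries, lower=0, upper=100_000, epsilon=1.0))
```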
2.5 Data and Model Sourcing
Data- and model-sourcing technologies aim to establish processes that assure users they are interacting with the intended models and data, and they provide guarantees of authenticity and provenance. Watermarking, for example, is a model-sourcing technique that embeds a signature directly into the machine learning algorithm, specifically into the model weights, so that at retrieval time it can be verified whether an inference came from the intended model.
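The following toy sketch conveys the flavor of white-box weight watermarking: a secret key selects weight positions whose signs encode the owner's signature bits, which can later be read back to check a model's origin. Production schemes (for example, regularizer-based embedding during training) are far more robust to fine-tuning and pruning; everything here is illustrative.

```python
import random


def embed_watermark(weights, key, signature_bits, nudge=1e-3):
    # The key deterministically selects which weights carry the signature.
    rng = random.Random(key)
    positions = rng.sample(range(len(weights)), len(signature_bits))
    for pos, bit in zip(positions, signature_bits):
        magnitude = max(abs(weights[pos]), nudge)
        weights[pos] = magnitude if bit == 1 else -magnitude  # sign encodes the bit
    return weights


def extract_watermark(weights, key, n_bits):
    rng = random.Random(key)
    positions = rng.sample(range(len(weights)), n_bits)
    return [1 if weights[pos] > 0 else 0 for pos in positions]


weights = [random.uniform(-1, 1) for _ in range(1_000)]  # stand-in for model weights
signature = [1, 0, 1, 1, 0, 0, 1, 0]                     # owner's signature bits
embed_watermark(weights, key="owner-secret", signature_bits=signature)
assert extract_watermark(weights, key="owner-secret", n_bits=8) == signature
```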
2.6 Applications
In terms of applications, the possibilities are limitless. In the industry landscape above, we list some of the developments that are especially exciting as AI converges with Web 3.0. Since these use cases are largely self-explanatory, we will not comment on them further here. It is worth noting, however, that the convergence of AI and Web 3.0 has the potential to reshape many verticals, as these new primitives give developers more freedom to create innovative use cases and to optimize existing ones.
Source: IOSG Ventures
Summary
The fusion of AI x Web3 brings a promising future full of innovation and potential. By leveraging the unique advantages of each technology, we can address various challenges and pave the way for new technological paths. In exploring this emerging industry, the synergy between AI and Web3 can drive progress and reshape our future digital experiences and interactions on the web.
The fusion of digital scarcity and digital abundance, the mobilization of underutilized resources to achieve computational efficiency, and the establishment of secure, privacy-preserving data practices will define the next era of technological evolution.
However, we must recognize that this industry is still in its early stages, and the current industry landscape may become outdated in a short period of time. The rapid pace of innovation means that today’s cutting-edge solutions may soon be replaced by new breakthroughs. Nevertheless, the foundational concepts discussed, such as computing networks, agent platforms, and data protocols, highlight the tremendous potential of the fusion of artificial intelligence and Web 3.0.