At first glance, AI and Web3 appear to be independent technologies, each built on fundamentally different principles and serving different functions. A closer look, however, reveals that the two can balance each other's trade-offs, and that their unique strengths can complement and enhance one another. Balaji Srinivasan eloquently laid out this idea of complementary capabilities at the SuperAI Conference, prompting a detailed comparison of how the two technologies interact.
Crypto emerged from the decentralized efforts of anonymous cypherpunks and has evolved over more than a decade through the collaboration of countless independent contributors. Artificial intelligence, by contrast, has been developed top-down, led by a handful of tech giants. These companies set the pace and dynamics of the industry, and the barriers to entry are defined more by resource intensity than by technical complexity.
The two technologies are also fundamentally different in nature. Crypto is essentially deterministic, producing immutable and reproducible outcomes, as in the predictability of hash functions or zero-knowledge proofs. This stands in stark contrast to the probabilistic and often unpredictable nature of artificial intelligence.
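To make the contrast concrete, here is a minimal illustrative snippet (our own, not from the original article): the hash of a fixed input is bit-for-bit reproducible and anyone can re-verify it, whereas a generative model's sampling step, stood in for here by a toy weighted choice rather than a real model, can return different outputs for identical inputs.

```python
import hashlib
import random

# Deterministic: hashing the same input always yields the same digest,
# so any party can independently re-verify the result.
payload = b"transfer 10 tokens to alice"
digest_1 = hashlib.sha256(payload).hexdigest()
digest_2 = hashlib.sha256(payload).hexdigest()
assert digest_1 == digest_2

# Probabilistic: a generative model samples from a distribution, so repeated
# runs on the same prompt can differ. (Toy stand-in for an LLM's token sampling.)
vocabulary = ["yes", "no", "maybe"]
weights = [0.5, 0.3, 0.2]
sample_1 = random.choices(vocabulary, weights=weights, k=1)[0]
sample_2 = random.choices(vocabulary, weights=weights, k=1)[0]
print(digest_1)
print(sample_1, sample_2)  # may or may not match across runs
```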
Similarly, cryptography excels at verification, ensuring the authenticity and security of transactions and establishing trustless processes and systems. Artificial intelligence, on the other hand, focuses on generation, creating rich digital content. But in producing this digital abundance, ensuring content provenance and preventing identity theft becomes a challenge.
Fortunately, crypto offers the counterpart concept of digital scarcity, along with mature tools that can be extended to AI to guarantee the provenance of content and avoid identity-theft problems.
Another notable advantage of crypto is its ability to attract large amounts of hardware and capital into coordinated networks serving a specific goal. This is especially valuable for resource-intensive AI: mobilizing underutilized resources to provide cheaper compute can significantly improve AI's efficiency.
Comparing the two technologies this way not only highlights their individual contributions but also shows how together they open new paths for technology and the economy. Each can compensate for the other's shortcomings, creating a more integrated and innovative future. In this post, we explore the emerging AI x Web3 industry landscape, focusing on several verticals taking shape at the intersection of these technologies.
Source: IOSG Ventures
2.1 Computing Networks
The landscape opens with computing networks, which try to address the constrained supply of GPUs and to reduce computing costs in different ways. The following are worth highlighting:
Non-uniform GPU interoperability: This is a very ambitious attempt, with high technical risk and uncertainty, but if it succeeds it could create significant scale and impact by making all computing resources interchangeable. The idea is to build compilers and other prerequisites so that any hardware resource can be plugged in on the supply side, while on the demand side the non-uniformity of that hardware is fully abstracted away and a compute request can be routed to any resource in the network (a minimal routing sketch follows this list). If this vision succeeds, it would reduce the current dependence on the CUDA software stack that dominates AI development today. Beyond the technical risk itself, many experts are deeply skeptical of this approach.
High-performance GPU aggregation: Integrating the world’s most popular GPUs into a distributed and permissionless network without worrying about the interoperability issues between non-uniform GPU resources.
Consumer-grade GPU aggregation: Aggregating the lower-performance GPUs found in consumer devices, which are the most underutilized resources on the supply side. This serves users willing to trade performance and speed for cheaper, longer training runs.
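As referenced above, here is a minimal sketch of the demand-side abstraction such networks aim for: the caller specifies requirements, and the network matches them to whatever hardware is available, regardless of vendor. `ComputeNode`, `ComputeRequest`, and `route_request` are hypothetical names for illustration, not any project's actual API.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ComputeNode:
    node_id: str
    vendor: str        # e.g. "nvidia", "amd", "apple" -- hidden from the caller
    vram_gb: int
    price_per_hour: float

@dataclass
class ComputeRequest:
    min_vram_gb: int
    max_price_per_hour: float

def route_request(req: ComputeRequest, supply: List[ComputeNode]) -> Optional[ComputeNode]:
    """Pick the cheapest node that satisfies the request, ignoring hardware vendor.

    In a real network this matching would be far more involved (compilation
    targets, latency, reputation, payments); this only illustrates the idea
    that the demand side never needs to know which hardware serves it.
    """
    candidates = [n for n in supply
                  if n.vram_gb >= req.min_vram_gb
                  and n.price_per_hour <= req.max_price_per_hour]
    return min(candidates, key=lambda n: n.price_per_hour) if candidates else None

# Example: the caller asks for 24 GB of VRAM and gets whatever hardware is cheapest.
supply = [
    ComputeNode("n1", "nvidia", 80, 2.50),
    ComputeNode("n2", "amd", 24, 0.90),
    ComputeNode("n3", "apple", 32, 1.10),
]
print(route_request(ComputeRequest(min_vram_gb=24, max_price_per_hour=2.0), supply))
```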
2.2 Training and Inference
Computing networks serve two main functions: training and inference. Demand for them comes from both Web 2.0 and Web 3.0 projects. In Web 3.0, projects such as Bittensor use compute resources for model fine-tuning. On the inference side, Web 3.0 projects emphasize the verifiability of the process. This focus has given rise to verifiable inference as a market vertical, with projects exploring how to integrate AI inference into smart contracts while preserving decentralization principles.
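To illustrate what "verifiable" can mean at the simplest level, here is a toy sketch (our own, not any specific project's protocol) of the commitment bookkeeping: the provider commits to its model, and every inference returns a receipt binding model, input, and output. Real designs replace the remaining trust assumptions with zero-knowledge or optimistic fraud proofs; the names below are illustrative.

```python
import hashlib
import json

def commitment(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# The model provider publishes a commitment to the exact weights it will serve.
model_weights = b"...serialized weights..."
model_commitment = commitment(model_weights)

def run_inference(prompt: str) -> dict:
    # Placeholder for the actual model call.
    output = f"echo: {prompt}"
    # The receipt binds model, input and output together so a contract (or anyone)
    # can later check that a claimed result refers to the committed model.
    return {
        "model_commitment": model_commitment,
        "input_hash": commitment(prompt.encode()),
        "output_hash": commitment(output.encode()),
        "output": output,
    }

def verify_receipt(receipt: dict, expected_model_commitment: str) -> bool:
    return (receipt["model_commitment"] == expected_model_commitment
            and receipt["output_hash"] == commitment(receipt["output"].encode()))

receipt = run_inference("what is 2 + 2?")
print(json.dumps(receipt, indent=2))
print(verify_receipt(receipt, model_commitment))  # True for an honest receipt
```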
2.3 Intelligent Agent Platforms
Next are intelligent agent platforms. The landscape outlines the core problems that startups in this category need to solve:
Agent interoperability, discovery, and communication: agents can discover and communicate with one another.
Agent cluster building and management: agents can form clusters and manage other agents.
Ownership and marketplaces for AI agents: providing ownership of AI agents and marketplaces where they can be exchanged.
These features emphasize the importance of flexible, modular systems that can integrate seamlessly into a variety of blockchain and AI applications. AI agents have the potential to fundamentally change how we interact with the internet, and we believe agents will rely on infrastructure to support their operations. We envision AI agents depending on infrastructure in the following ways:
Accessing real-time internet data using a distributed crawling network
Using DeFi channels for inter-agent payments
Requiring economic deposits, both to penalize misconduct and to improve agent discoverability (i.e., using deposits as an economic signal in the discovery process; see the registry sketch after this list)
Utilizing consensus to determine which events should lead to slashing
Open interoperability standards and agent frameworks to support building composable collectives
Evaluating past performance based on immutable data history and selecting appropriate agent collectives in real-time
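The sketch below (hypothetical names, not an existing protocol) ties several of these ideas together: a registry where agents post a deposit that both ranks them during discovery and serves as slashable collateral once some off-sketch consensus process decides misconduct occurred.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class AgentRecord:
    agent_id: str
    capabilities: List[str]
    deposit: float                 # economic signal + slashable collateral
    completed_tasks: int = 0
    slashed: float = 0.0

class AgentRegistry:
    def __init__(self) -> None:
        self._agents: Dict[str, AgentRecord] = {}

    def register(self, agent_id: str, capabilities: List[str], deposit: float) -> None:
        self._agents[agent_id] = AgentRecord(agent_id, capabilities, deposit)

    def discover(self, capability: str) -> List[AgentRecord]:
        """Rank matching agents by deposit at risk, then by track record."""
        matches = [a for a in self._agents.values() if capability in a.capabilities]
        return sorted(matches, key=lambda a: (a.deposit, a.completed_tasks), reverse=True)

    def slash(self, agent_id: str, amount: float) -> None:
        """Called after consensus (outside this sketch) decides misconduct occurred."""
        agent = self._agents[agent_id]
        penalty = min(amount, agent.deposit)
        agent.deposit -= penalty
        agent.slashed += penalty

registry = AgentRegistry()
registry.register("scraper-01", ["web-data"], deposit=500.0)
registry.register("scraper-02", ["web-data"], deposit=50.0)
print([a.agent_id for a in registry.discover("web-data")])  # higher deposit ranks first
registry.slash("scraper-01", 100.0)
```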
Source: IOSG Ventures
2.4 Data Layer
In the fusion of AI and Web3, data is a core component. Alongside compute, data is a strategic asset in the AI race, yet this category is often overlooked because most of the industry's attention goes to the computing layer. In reality, crypto primitives open up many interesting value propositions in data acquisition, chiefly along two high-level directions:
Access to public internet data: This direction aims to build distributed web-crawling networks that can crawl the entire internet within days to assemble massive datasets, or access very specific internet data in real time. Crawling a large dataset from the internet, however, places heavy demands on the network, requiring at least several hundred nodes before meaningful work can begin. Fortunately Grass, a distributed crawler-node network, has already attracted over 2 million nodes actively sharing internet bandwidth toward the goal of crawling the entire internet, demonstrating the enormous potential of economic incentives for attracting valuable resources.
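At its core, such a network has to split a huge crawl frontier across many contributing nodes. The toy sketch below shows only that partitioning step, deterministically sharding URLs by hash; it is our illustration, not Grass's actual protocol, which also has to handle node churn, deduplication, proof of work done, and rewards.

```python
import hashlib
from collections import defaultdict
from typing import Dict, List

def assign_urls(urls: List[str], node_ids: List[str]) -> Dict[str, List[str]]:
    """Deterministically shard a crawl frontier across participating nodes."""
    assignments: Dict[str, List[str]] = defaultdict(list)
    for url in urls:
        # Hash the URL and map it to one of the available nodes.
        bucket = int(hashlib.sha256(url.encode()).hexdigest(), 16) % len(node_ids)
        assignments[node_ids[bucket]].append(url)
    return assignments

frontier = ["https://example.com/a", "https://example.com/b", "https://example.org/c"]
print(dict(assign_urls(frontier, ["node-1", "node-2", "node-3"])))
```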
While Grass levels the playing field for public data, there remains the challenge of accessing potentially valuable proprietary datasets, which are often kept locked down because of their sensitive nature. Many startups are leveraging cryptographic tools that let AI developers build and fine-tune large language models on proprietary datasets while keeping the sensitive information private. Technologies such as federated learning, differential privacy, trusted execution environments, fully homomorphic encryption, and multi-party computation offer different levels of privacy protection and different trade-offs. Bagel's research article (https://blog.bagel.net/p/with-great-data-comes-great-responsibility-d67) provides an excellent overview of these technologies, which not only protect data privacy during machine learning but also enable comprehensive privacy-preserving AI at the computing layer.
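As a flavor of the trade-offs involved, here is a minimal illustration of just one of the listed techniques, differential privacy: an aggregate statistic is released with calibrated Laplace noise so the result stays useful without revealing much about any single record. This is a toy of our own; federated learning, TEEs, FHE, and MPC make very different trade-offs, and Bagel's article covers them in depth.

```python
import random

def dp_count(values, threshold, epsilon=1.0, sensitivity=1.0):
    """Release a thresholded count with Laplace noise so the aggregate can be
    shared without revealing whether any single record crossed the threshold.
    """
    true_count = sum(1 for v in values if v > threshold)
    # A Laplace(0, sensitivity/epsilon) sample is the difference of two Exp(1)
    # samples scaled by sensitivity/epsilon.
    scale = sensitivity / epsilon
    noise = scale * (random.expovariate(1.0) - random.expovariate(1.0))
    return true_count + noise

salaries = [52_000, 61_000, 75_000, 98_000, 120_000]
print(dp_count(salaries, threshold=70_000))  # noisy count, differs from run to run
```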
2.5 Data and Model Provenance
Data and model provenance technologies aim to establish processes that assure users they are interacting with the intended model and data, providing guarantees of authenticity and origin. Watermarking, for example, is a model provenance technique that embeds a signature directly into the machine learning algorithm, specifically into the model weights, so that at retrieval time an inference can be verified as coming from the intended model.
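The toy sketch below conveys the intuition with a simple spread-spectrum-style weight watermark: a key-derived ±1 pattern is added to a subset of weights, and detection checks whether the weights correlate with that pattern. Practical schemes are far more sophisticated (and often embed watermarks during training or into generated outputs); the functions here are illustrative only.

```python
import hashlib
import random

def keyed_pattern(key: str, indices):
    """Derive a reproducible ±1 pattern for the chosen weight positions from a secret key."""
    return [1.0 if hashlib.sha256(f"{key}:{i}".encode()).digest()[0] % 2 else -1.0
            for i in indices]

def embed_watermark(weights, key: str, strength: float = 0.5):
    indices = list(range(0, len(weights), 4))          # a fixed subset of weights
    pattern = keyed_pattern(key, indices)
    marked = list(weights)
    for i, p in zip(indices, pattern):
        marked[i] += strength * p                      # small, key-dependent perturbation
    return marked

def detect_watermark(weights, key: str, threshold: float = 50.0) -> bool:
    indices = list(range(0, len(weights), 4))
    pattern = keyed_pattern(key, indices)
    score = sum(weights[i] * p for i, p in zip(indices, pattern))
    return score > threshold                           # correlation is high only if marked

random.seed(0)
weights = [random.uniform(-1, 1) for _ in range(1024)]
marked = embed_watermark(weights, key="model-owner-secret")
print(detect_watermark(marked, key="model-owner-secret"))   # almost certainly True
print(detect_watermark(weights, key="model-owner-secret"))  # almost certainly False
```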
2.6 Applications
On the application side, the possibilities are endless. In the landscape above, we list some of the Web 3.0 use cases we are most excited to see developed as AI technology advances. Since these use cases are largely self-explanatory, we will not comment on them further here. It is worth noting, however, that the intersection of AI and Web 3.0 has the potential to reshape many verticals, as these new primitives give developers more freedom to create innovative use cases and to optimize existing ones.
Source: IOSG Ventures
Summary
The fusion of AI and Web3 holds enormous promise for innovation. By leveraging each technology's unique strengths, we can address a broad range of challenges and open new technological paths. The synergy between AI and Web3 can drive progress and reshape our future digital experiences and how we interact on the internet.
The combination of digital scarcity with digital abundance, the mobilization of underutilized resources for computing efficiency, and the establishment of secure, privacy-preserving data practices will define the next era of technological evolution.
However, we must recognize that this industry is still in its early stages, and the current industry landscape may become outdated in a short period of time. The rapid pace of innovation means that today’s cutting-edge solutions may soon be replaced by new breakthroughs. Nevertheless, the foundational concepts discussed here—such as computing networks, agent platforms, and data protocols—highlight the tremendous potential of the fusion of artificial intelligence and Web 3.0.