
Opentensor Foundation

From Bitcoin to Bittensor :: Building Open Markets for Intelligence // Tsinghua University, Beijing

Published: March 9, 2026
Duration: 33:15

AI summary



KEY TAKEAWAYS

  • Incentive Computing Paradigm: Bittensor founder Jacob Steeves frames Bitcoin as the world’s largest "incentive computer," leveraging decentralized hash production as a blueprint for general-purpose optimization. Bittensor extends this to AI by creating markets for intelligence (e.g., coding, gradients, robotics).
  • Decentralized AI Labs: Bittensor enables permissionless, borderless AI development. Example: a subnet (likely Cortex.t, though unnamed) evolved a 7,000-line coding AI outperforming Claude/OpenAI on SWE-bench without centralized engineering; top miners earned $60K/day.
  • Distributed Model Training: Subnets like Gradients (Subnet 56) train a 70B-parameter model via decentralized gradient contributions, challenging centralized AI labs’ compute monopolies.
  • Physical-Digital Hybrid Use Cases: Bittensor optimizes GPU rentals (cheapest rates globally), stock trading signals, sports betting, and robotics (e.g., drone control models).
  • Market Efficiency: Bitcoin’s 450,000 exaflops of hash compute (vs. centralized providers’ 1,000) exemplifies incentive computing’s scalability. Bittensor replicates this for AI, avoiding corporate overhead (HR, bias).
  • Open vs. Closed AI: Critiques OpenAI’s closed model, advocating Bittensor’s transparent, participatory alternative where users own/contribute to AI assets.

SUMMARY
Jacob Steeves traces AI’s evolution from pre-2012 handmade feature engineering to today’s self-learning systems, drawing parallels to Bitcoin’s decentralized compute market. Bittensor generalizes Bitcoin’s incentive mechanics to create "markets for intelligence," where miners compete to provide value (coding, gradients, GPU compute) and validators reward performance with tokens. Projects like Cortex.t (coding AI) and Gradients (distributed training) demonstrate decentralized AI’s viability, while subnets for robotics and trading highlight its versatility. Steeves positions Bittensor as a counter to centralized AI monopolies, emphasizing ownership transparency and permissionless participation.

ALPHA SIGNALS

  • Catalysts: Launch of Gradients’ 70B model (weeks away), Cortex.t’s SWE-bench dominance, and GPU subnet’s cost efficiency.
  • Tokenomics: Dynamic TAO mechanism (liquidity rewards for high-performing subnets) may drive TAO demand.
  • Risks: Centralization critiques (e.g., OpenAI parallels if validator power consolidates).
  • Opportunity: Undervalued subnets like Gradients (Subnet 56) and Cortex.t (unlisted) could gain prominence.

TECHNICAL DEEP DIVE

  • Architecture: Bittensor’s L1 blockchain coordinates subnets (specialized markets) where miners (providers) and validators (evaluators) interact via consensus rules.
  • Innovation: Gradient mining (Subnet 56) uses loss reduction as the reward metric—a novel decentralized training primitive.
  • Performance: Coding subnet miners iteratively improve AI agents via competitive benchmarking (SWE-bench), achieving SOTA without explicit engineering.
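The loss-reduction reward mechanic above can be sketched in a few lines. This is an illustrative toy, not the actual Bittensor protocol or API: the function name, the miner IDs, and the idea of measuring one validation loss per miner contribution are all assumptions made for clarity.

```python
# Illustrative sketch (NOT the real Bittensor implementation): a validator
# scores each miner by how much that miner's gradient contribution reduces
# validation loss, then normalizes the scores into token-emission weights.

def reward_weights(loss_before: float, losses_after: dict) -> dict:
    """Map each miner's post-contribution loss to a share of the reward pool.

    loss_before: validation loss before applying any contribution.
    losses_after: miner_id -> validation loss after applying that miner's
                  contribution (hypothetical per-miner measurement).
    """
    # Loss reduction is the raw score; contributions that worsen loss earn 0.
    raw = {m: max(loss_before - l, 0.0) for m, l in losses_after.items()}
    total = sum(raw.values())
    if total == 0.0:
        # No useful contributions this round: everyone gets weight zero.
        return {m: 0.0 for m in raw}
    return {m: s / total for m, s in raw.items()}

weights = reward_weights(
    loss_before=2.50,
    losses_after={"miner_a": 2.30, "miner_b": 2.45, "miner_c": 2.60},
)
# miner_a cut loss by 0.20, miner_b by 0.05, miner_c worsened it (weight 0).
```

The design choice worth noting is that the metric is outcome-based: validators never inspect how a gradient was produced, only whether it lowers loss, which is what makes the market permissionless.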

ECOSYSTEM IMPACT

  • Competition: Challenges centralized AI (OpenAI, Anthropic) by lowering entry barriers for contributors/owners.
  • Miner Economics: High rewards ($60K/day for top miners) attract talent but risk short-term speculation.
  • Regulatory Edge: Permissionless design may face scrutiny but aligns with crypto’s ethos of open access.

ACTION ITEMS

  • Monitor: Gradients’ 70B model release, Cortex.t’s benchmark updates.
  • Research: Dynamic TAO’s liquidity mechanics (Subnet 120).
  • Engage: Bittensor’s Discord/GitHub for subnet development.
  • Watch: GPU subnet’s price war vs. centralized providers (Lambda Labs, etc.).

