Jason Calacanis, angel investor and host of the "All-In Podcast," has a new pitch: Bittensor, the decentralized AI protocol and marketplace behind the $TAO token.
Calacanis, who was early to Uber and Airbnb, isn't just interested in this relatively obscure crypto. He's alarmingly bullish on it, telling "This Week in Startups" that it could go "200x" in 5 to 10 years.
At the time of writing, $TAO is trading around a $3.75 billion market cap, with the token up from lows of $145 to its current $380, driven largely by Calacanis's out-of-nowhere endorsement. A 200x from here puts the market cap at roughly $750 billion. With a circulating supply of ~9.6 to 10.8 million TAO, that implies a price of roughly $69,000 to $78,000 per token. So yeah, quite the call from Jason.
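The back-of-envelope math is simple enough to sanity-check. Using the article's own figures (a ~$3.75B market cap and a ~9.6M to 10.8M circulating supply):

```python
# Back-of-envelope check on the 200x call.
# All inputs come from the article; nothing here is a forecast.
current_mcap = 3.75e9            # ~$3.75B market cap at time of writing
multiple = 200                   # Calacanis's hypothetical 200x
implied_mcap = current_mcap * multiple   # $750B

supply_low, supply_high = 9.6e6, 10.8e6  # circulating TAO estimate
price_high = implied_mcap / supply_low   # fewer tokens -> higher price
price_low = implied_mcap / supply_high

print(f"Implied market cap: ${implied_mcap / 1e9:.0f}B")
print(f"Implied price range: ${price_low:,.0f} - ${price_high:,.0f} per TAO")
```

Running it gives an implied market cap of $750B and a per-token range of roughly $69,000 to $78,000.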
He's not alone in the enthusiasm. Nvidia CEO Jensen Huang called Bittensor's recent Covenant-72B training run a "pretty crazy technical accomplishment," and the underlying pitch is genuinely interesting: Bittensor runs an open-source protocol powering a decentralized, blockchain-based machine learning network where AI models train collaboratively and get rewarded in TAO based on the informational value they contribute. Instead of OpenAI and Anthropic training in isolation behind closed doors, users stake $TAO to subnets or validators, which function as votes determining where daily emissions flow to miners. Those miners, like their Bitcoin counterparts, do the actual work: running models, providing compute, generating outputs, competing to do it best.
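The stake-as-votes idea can be sketched as a pro-rata split. This is a deliberately simplified toy, not Bittensor's actual consensus or emission formula; the subnet names, stake amounts, and daily emission figure below are all hypothetical:

```python
# Toy illustration of stake-weighted emission routing.
# NOT Bittensor's real mechanism -- all numbers and names are made up.
def split_emissions(daily_emission: float, stakes: dict[str, float]) -> dict[str, float]:
    """Split one day's token emission across subnets pro rata by staked TAO."""
    total_stake = sum(stakes.values())
    return {name: daily_emission * s / total_stake for name, s in stakes.items()}

# Hypothetical stake distribution across three subnets.
stakes = {"SN3-Templar": 400_000, "SN1-Text": 250_000, "SN5-Images": 150_000}
alloc = split_emissions(7_200, stakes)  # 7,200 TAO/day is an invented figure

for subnet, tao in alloc.items():
    print(f"{subnet}: {tao:,.0f} TAO")
```

The point of the sketch is the incentive shape: stakers steer where emissions flow, and miners in the favored subnets compete for that share by doing the work.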
The miners front all the computing costs themselves, betting on TAO rewards on the back end. In the Templar (SN3) training run Jensen flagged, 70 contributors scattered across the globe used their own hardware to earn a share of TAO under the subnet's incentive mechanism. The result was Covenant-72B, a language model roughly the size of Meta's Llama 2 70B and performing in the same neighborhood as early GPT-4-class models on standardized benchmarks.
The kicker is how it got built. OpenAI, Anthropic, and Meta spent millions on comparable runs. Covenant-72B was trained by 70 strangers on the internet with their own GPUs and home internet connections.
If that trend has legs, Bittensor's trajectory starts to feel less speculative and more structural, especially as the cost of compute and inference continues to fall. Ironically, Google may have just accelerated that timeline by dropping TurboQuant on Wednesday.