The Future is Decentralized Verifiable Compute
The future is decentralized compute. More specifically, decentralized verifiable compute.
You might be wondering, "What is that? And why should I care?"
Before I get into the "what," let's first review, at a high level, three trends (or predictions).
If you believe them to be true, or at least highly probable, then you should read on.
The three trends are:
- DeFi, or decentralized finance, will expand across all financial assets, not just cryptocurrencies and NFTs.
- Smart contracts will govern how nearly every industry works in some fashion.
- The metaverse will be about extreme edge computing, much more so than VR.
What is decentralized verifiable compute?
Let's break it down by defining "decentralized" and "verifiable" separately.
What is decentralized compute?
The definition of "decentralized" varies depending on whom you ask, and the differences are sometimes philosophical or polemical. The ethos and technical design draw from blockchain technology.
My working definition is simple: decentralized compute is compute infrastructure not controlled by a single business entity.
AWS, by contrast, is centralized compute. Even though you, as an individual, can rent an instance, AWS controls it. If AWS decides to cut off access to your instance, it can.
One could argue that if you had two nodes -- one in Amazon Web Services (AWS) and one in Microsoft Azure or Google Cloud -- your computing resources would be less centralized. But that still wouldn't mean you had set up a decentralized system.
One litmus test for decentralized compute is the degree to which the services hosted on it are censorship resistant, in the same way that the Bitcoin network is largely censorship resistant: no single entity, or even a single country, could take it down. (I realize some argue that the Bitcoin network is now effectively centralized. That's not the debate or the point here, though at some point it's worth revisiting with a finer point.)
I offer an alternative nomenclature and definition later.
The key: compared to a single compute provider, decentralized compute is "lots" of resources controlled by different, geographically dispersed entities.
What does it mean to be verifiable?
To define this, let's look at a typical execution run on AWS that is not verifiable.
When you run code on a remote server you are renting in AWS, how do you know for sure that any given job was executed properly?
The typical answer: it performed to spec, or at least it didn't crash.
But how would you know, for sure, there wasn't some slight modification that was done to the code, either accidentally or maliciously?
How do you know nothing was injected into your function's parameters through a man-in-the-middle attack?
Imagine your application processes millions of financial transactions a day, and a bad actor has a strong financial incentive to tamper with the execution of your code.
Sure, there are plenty of DevSecOps tools development teams can use to reduce the risk when deploying, but there's still no way to know for sure, at the execution level, for a specific job.
Verifiable compute removes that guesswork: it provides a cryptographic proof that the code you intended to execute, with the inputs as you understood them, was indeed executed.
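To make the idea concrete, here is a minimal, purely illustrative sketch: the provider returns the result along with an attestation binding together a hash of the code, a hash of the inputs, and the output, which the client can then check against what it intended. (Real systems use zk-SNARKs, TEE attestation, or replicated execution; the shared-secret HMAC here is just a stand-in for a real proof.)

```python
import hashlib
import hmac
import json

# Hypothetical shared key standing in for a real proof system
# (production systems use zk-proofs or hardware attestation instead).
ATTESTATION_KEY = b"demo-key"

def run_and_attest(code: bytes, inputs: dict) -> dict:
    """'Execute' a job and return the result with an attestation."""
    output = sum(inputs.values())  # stand-in execution: just sum the inputs
    record = {
        "code_hash": hashlib.sha256(code).hexdigest(),
        "input_hash": hashlib.sha256(
            json.dumps(inputs, sort_keys=True).encode()).hexdigest(),
        "output": output,
    }
    payload = json.dumps(record, sort_keys=True).encode()
    record["proof"] = hmac.new(ATTESTATION_KEY, payload, hashlib.sha256).hexdigest()
    return record

def verify(record: dict, code: bytes, inputs: dict) -> bool:
    """Check the proof, and that it covers the code and inputs we intended."""
    claimed = dict(record)
    proof = claimed.pop("proof")
    payload = json.dumps(claimed, sort_keys=True).encode()
    expected = hmac.new(ATTESTATION_KEY, payload, hashlib.sha256).hexdigest()
    return (hmac.compare_digest(proof, expected)
            and claimed["code_hash"] == hashlib.sha256(code).hexdigest()
            and claimed["input_hash"] == hashlib.sha256(
                json.dumps(inputs, sort_keys=True).encode()).hexdigest())

record = run_and_attest(b"def job(a, b): return a + b", {"a": 2, "b": 3})
assert verify(record, b"def job(a, b): return a + b", {"a": 2, "b": 3})
assert not verify(record, b"tampered code", {"a": 2, "b": 3})
```

The point isn't the specific cryptography; it's that the client can reject any result whose code or inputs were silently altered, rather than trusting the operator's word.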
Why does decentralized verifiable compute matter?
Most people support decentralized verifiable compute as an alternative to BigTech's monopolistic rent-seeking or censorship.
If such compute were a direct alternative, it would need value propositions that differ sharply from BigTech's. And BigTech has a lot of advantages over decentralized cloud services, such as the breadth of services they offer developers (databases, AI, identity management) and their economies of scale.
Some commonly mentioned, generalized value propositions of decentralization (note: I don't know whether they are generally true, and won't go into depth here):
- Lower cost
- Greater privacy
- Better reliability
- Lower latency
For general cloud compute use cases, running a SaaS, for example, it's unclear whether decentralized compute offers meaningful advantages in these areas.
Instead of these generalized value propositions, I would suggest that decentralized verifiable compute becomes inevitable in the context of the business-model innovations that will create demand for it.
If we look at legacy businesses already running on centralized compute by BigTech, the problems to solve are minimal.
So let's go back to the three trends at the beginning of this article.
IF (and it's a big "if") those three come true, they each introduce new business requirements that create a forcing function for trust-minimized compute services (decentralized verifiable compute).
Let's look at each one in more detail.
DeFi will encompass all existing assets (real-world and crypto)
At its heart, decentralized finance gives users -- the owners of assets -- lower risk, higher transparency, and lower costs through disintermediation.
The blockchain has enabled assets to be trustlessly recorded and transacted, removing the need to trust that the bank, stock exchange, or trading app will behave in a transparent and uncompromised fashion.
However, keeping a ledger of the assets is just part of the equation.
Full applications will be needed to support these new ways of doing business. But decentralizing just the ledger while keeping the rest of the applications that interact with the assets centralized would defeat the purpose of decentralization.
A centralized app could still limit access to assets, compromise pricing and other information, and obfuscate information on the ledger. That wouldn't eliminate the problems decentralization solves, and it defeats the purpose and principles behind DeFi.
The underlying compute that decentralized applications (dApps) run on also needs to be decentralized.
To do this, the apps now need compute features to support true decentralization:
- Security (through cryptographic assurances and consensus mechanisms)
- Verification of execution integrity (through cryptographic proof)
- Governance decentralization (no single entity can make top-down decisions)
Imagine a world where all the applications we know today that support the financial infrastructure -- mutual funds, stock exchanges, banks, bond trading, hedge funds -- move their asset ledgers to the blockchain.
As a result, all of the applications that work with those assets -- algorithmic applications, compliance and regulatory reporting, transaction software (institutional and individual) -- will need to support the demand for decentralization.
This will drive the demand for compute.
But it doesn't stop there.
Smart contracts will govern how every industry runs
The "core" of decentralized finance is the "smart contract," which permits trading to occur in a transparent and reliable way on the blockchain.
Companies will move to smart contracts, perhaps for some cost benefits, but more likely because they need to gain or keep customers by offering trust-minimized services. In other words, their customers will no longer trust a centralized entity.
For a decentralized exchange (DEX), as an example, smart contracts power 24/7 trading, with functions that let wallets check their balances, buy, sell, and get algorithmic pricing through automated market making. DEXs remove the risk that a centralized exchange "shuts down" someone's account.
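For intuition, the "algorithmic pricing" a DEX contract performs is often a constant-product automated market maker (the x * y = k design popularized by Uniswap). Here's a toy sketch of the idea, ignoring trading fees, slippage protection, and the fixed-point arithmetic real contracts use:

```python
class ConstantProductPool:
    """Toy constant-product AMM: reserve_a * reserve_b stays constant."""

    def __init__(self, reserve_a: float, reserve_b: float):
        self.reserve_a = reserve_a
        self.reserve_b = reserve_b

    def price_a_in_b(self) -> float:
        # Marginal price of token A, quoted in token B.
        return self.reserve_b / self.reserve_a

    def swap_a_for_b(self, amount_a: float) -> float:
        # Price the trade by holding k = reserve_a * reserve_b constant.
        k = self.reserve_a * self.reserve_b
        new_reserve_a = self.reserve_a + amount_a
        new_reserve_b = k / new_reserve_a
        amount_b_out = self.reserve_b - new_reserve_b
        self.reserve_a, self.reserve_b = new_reserve_a, new_reserve_b
        return amount_b_out

pool = ConstantProductPool(1000.0, 1000.0)  # 1 A = 1 B initially
out = pool.swap_a_for_b(100.0)
# The buyer receives ~90.9 B for 100 A: large trades move the price
# automatically, with no central market maker setting quotes.
```

No operator decides the price or approves the trade; the formula does, which is what makes the exchange hard to shut down or manipulate selectively.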
Similarly, customers of legacy industries -- insurance, for example -- will increasingly distrust that the insurance company will execute the contract fairly. That same ethos will extend to the off-chain compute needed to support the smart contracts.
Most smart contracts run on blockchains themselves, and so arguably would never need the off-chain compute provided by trust-minimized compute.
However, as more data is needed and more computation is required, that work is best performed off-chain, with the output then used by the smart contract and written to the blockchain. The blockchain would likely be too slow, and the risk of congestion too high, not to leverage off-chain compute.
For example, the machine learning models that support a new hedge fund product would need a lot of compute and a lot of off-chain data (say, Twitter sentiment or NLP analysis of analyst reports). These models would best run off-chain on decentralized compute.
Similarly, when other industries adopt smart contracts, they will probably have lots of integrations with outside data, workflows, and computation (like AI and ML). This computation will not occur on blockchains, but on decentralized compute.
Insurance companies will run smart contracts to determine premiums and settle claims, but these may require drone-based photographs, integrations with APIs, and significant compute and inputs.
Imagine a similar form of smart-contract decision-making for our court system, manufacturing, farming, and oil and gas. These industries use a lot of computational power now; as they adopt smart contracts to increase customer trust and lower operational costs, they will pull in decentralized compute.
In both of these cases, the primary value is to support the new business models around decentralization. If DeFi and smart contracts run on the blockchain, then the enabling applications and supporting off-chain compute should also be decentralized.
However, there's a case which, if it comes to pass, will dramatically change the way we relate to compute resources altogether.
The Metaverse's core tech will be extreme edge, not VR
Most of the Metaverse narrative seems to focus on VR (or in some cases, which I think is far more likely, AR).
Regardless of the presentation layer, we can be confident that the compute resources needed to support the Metaverse will be significant: graphics, motion graphics, machine learning, AI. All of these compute-intensive services will need to run fast.
Latency matters in the Metaverse. So the computations will increase in complexity while the latency threshold shrinks.
Now physics becomes a limitation.
According to Next Platform:[1]
The two Amazon Web Services regions that are relatively close to each other – in Northern Virginia and Ohio – are still 300 or more miles away with a latency of no lower than 28 milliseconds.
But according to the same article, quoting Matt Baker from Dell, the expected latency for real-time operations is around 5-9 milliseconds.
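A quick back-of-the-envelope calculation shows why distance alone makes a 5-9 ms target hard (the ~200 km/ms figure is the approximate speed of light in optical fiber, roughly two-thirds of c):

```python
# Propagation delay over fiber between two distant data centers.
distance_km = 480                # ~300 miles, e.g. Northern Virginia to Ohio
fiber_speed_km_per_ms = 200      # light in fiber travels ~200 km per ms

one_way_ms = distance_km / fiber_speed_km_per_ms   # ~2.4 ms
round_trip_ms = 2 * one_way_ms                     # ~4.8 ms
# Physics alone nearly consumes the whole 5-9 ms budget before any
# routing, queuing, or server processing time is added.
```

That's why the answer has to be moving compute closer to users, not making distant data centers faster.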
Building more data centers closer to users is unlikely to close that gap.
Even cell towers have limitations, as we have seen with 5G roll-outs: telecommunications companies have indicated that line-of-sight requirements hinder full adoption in major urban centers.
What will drive a new level of Metaverse capabilities? Extreme edge.
Extreme edge means the number of PoPs (points of presence) goes from the current hundreds for any given major provider, past the hundreds of thousands of cell sites or towers,[2] to millions of compute nodes.
Extreme edge will change our relationship to compute resources because many households will own the nodes themselves.
In the same way households buy a smart speaker like an Amazon Alexa or Apple HomePod, households will purchase a Metaverse node (more likely branded, such as a "Roblox Home").
These nodes will be decentralized (obviously), but because they are out in the wild, they will need blockchain-based verification to prevent bad actors from compromising the servers.
These home node servers will not only accelerate compute locally for the home (so its Roblox players enjoy the best Metaverse experience) but can also run compute for other Metaverse players nearby, bringing the compute within thousands, if not hundreds, of feet.
Not only will this type of fully decentralized verifiable compute enable the full realization of the Metaverse; it will change the relationship we have with computing resources themselves.
How will it change our relationship to computing resources?
It will allow those who own the compute nodes to benefit financially from the spend to execute jobs on their servers. They become rent collectors, paid through tokens.
But that's a different topic. If interested, let me know in my newsletter.
Conclusion
The mistake many make when envisioning a future of decentralized compute is assuming it will displace existing centralized clouds.
But new technology needs new business models to drive demand.
And the three possible trends that will do so are:
- DeFi will be adopted across all asset classes
- Smart contracts will run in every industry
- The metaverse will demand extreme edge
What do you think?
Let me know here: https://twitter.com/timfong888/status/1633196600058314753?s=61&t=zkq_cJBXvstQ2uHAw_IbFA
Unlock Your Business with Tokens
If you want to learn about how blockchain tokens can unleash new opportunities for you personally or for your business, subscribe here: Unleashing the Incentive Machine: Unlocking Tokens for GTM | Timbo | Substack
Thread
Decentralized verifiable compute is inevitable. It's just not been well-defined.
(Personally, I think it should be called "trust-minimized compute" to capture the true value it brings.)
But even that needs some context.
It should not be defined against Big Tech (even though it definitely provides more positive social and economic benefits than these monopolies do).
Instead, inevitable demand comes IF these three trends come true:
- DeFi will be adopted across all asset classes
- Smart contracts will run in every industry
- The metaverse will demand extreme edge
If you believe these three will come true, then the necessity of trust-minimized compute becomes apparent.
Changelog
- 2023-03-07: First draft and first tweet
- 2023-03-07: updated for SEO check