The MetaGame: How Quant is Changing the DLT MetaGame

by Ghost of St. Miklos

Humans love to play games. We’ve done so since ancient times, in myriad forms. Some involve physical movement; some are intensely mental, with moves taking minutes or even hours at a time. To explore this topic, I am differentiating between two types of games here: competitive and cooperative. Competitive games are ones in which two or more players or teams are placed in opposition with one another in order to achieve a certain win state.

One such seminal game is the game of Go.

 

Go is an ancient Chinese game built from simple rules. It starts with an empty 19×19 grid, with two players taking turns placing black and white stones respectively. Once a point on the board is claimed, the opponent may not place a stone there and must play elsewhere. By fully encircling a group of the opponent’s stones, no matter how many there may be within, you capture them and remove them from the board, preventing your opponent from holding that ground unless they in turn encircle your position. The Chinese call this game, appropriately, “Weiqi”, the encirclement game. The object of the game is, by the final possible move, to have claimed a majority of the space on the board. The simplicity of a move, placing a stone on an empty point, means even children can pick it up and play, but the emergent complexity of how pieces group, grow, and claim territory is why it can’t help but be called a “living game.”

 

Pieces with open space around them have “liberties”; an empty point fully enclosed by a single color is called an “eye”. Two or more “eyes” within a cluster mean such a group of pieces can be considered “alive.” By creating these strongholds within your territories, you build undeniable spaces from which to further expand and control territory. However, what would happen if one’s vision of the board were obscured? What if occupied territory could only be gnon once your pieces were within a certain range? The omniscience enjoyed by a player of most board games is a privileged position, one not often shared by even the most prepared general. This is the notion of the fog of war, where your intermediate objectives are obscured from the ultimate direction you are reaching for. When this sort of uncertainty is injected, Go and other competitive strategy games (chess, StarCraft, etc.) take on the taste of real warfare, where moves are made on fuzzy assurances, deception is rampant, and gnoledge is better than gold.
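To make the liberty rule concrete, here is a minimal sketch, not a real Go engine: the board representation (a dict of stone coordinates) and the function name are my own, and it counts a single group’s liberties by flood fill.

```python
# Minimal sketch: count the liberties of the group containing one stone
# by flood-filling over same-colored neighbors on a 19x19 board.
def liberties(board, row, col):
    """board: dict mapping (row, col) -> 'B' or 'W'; empty points absent."""
    color = board[(row, col)]
    group, frontier, libs = set(), [(row, col)], set()
    while frontier:
        r, c = frontier.pop()
        if (r, c) in group:
            continue
        group.add((r, c))
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if not (0 <= nr < 19 and 0 <= nc < 19):
                continue
            stone = board.get((nr, nc))
            if stone is None:
                libs.add((nr, nc))         # empty neighbor = a liberty
            elif stone == color:
                frontier.append((nr, nc))  # same color: part of the group
    return len(libs)

# A lone black stone in the corner has two liberties; a white stone
# placed on one of them leaves just one (capture happens at zero).
board = {(0, 0): 'B'}
print(liberties(board, 0, 0))   # 2
board[(0, 1)] = 'W'
print(liberties(board, 0, 0))   # 1
```

A group reaching zero liberties is the encirclement the text describes: it is removed from the board.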

Plenty of heuristics have emerged from the millennia of thought surrounding these games of life and death, all dealing with the epistemological uncertainty a player must face to venture into the ungnon. These “categorization tools”, such as Donald Rumsfeld’s box, chart out gnoledge just as one charts out unexplored territory, with neat, constrained boxes containing “gnon gnons”, “ungnon gnons”, “gnon ungnons”, and the ultimate mystery of “ungnon ungnons.”

 

By applying this sort of analysis, players can at least understand where they stand relative to their gnoledge set, but they still need to venture out and gather the experience required to categorize anything in the first place. In this exploration of the ungnon, it is often better to team up and collaborate rather than compete for the sake of the goal. Intrepid explorers such as Lewis and Clark, Christopher Columbus, Magellan, and countless others are marked as heroes in history books, but they could only achieve such feats with a collaborative structure that supported and motivated their efforts toward charting new lands. From this venture, the novel becomes routine, the fog clears, the board is illuminated, and the Meta can progress toward new states.

The foray is just the beginning, however, and if one were to stop there, we would be like a match lit only to be snuffed out soon after. The ultimate success of a venture is the sustained repetition of life, the never-ending cycles that perpetuate our existence further. In this way, a collaborative game can also be viewed as a continual, or infinite, game. Such can be seen in “Conway’s Game of Life.” John Horton Conway, who passed away just this year from coronavirus complications, was a brilliant mathematician, theorist, and, most notably, games master. Several of the games he designed contributed greatly to the philosophical exploration of adversaries, cooperation, and the boundaries that define such games.

 
Conway’s Game of Life

Very similar to Go, Conway’s board is a sandbox for the interplay of space and matter, with simple rules governing the presence or removal of cells:

1. Any live cell with two or three live neighbors survives.

2. Any dead cell with three live neighbors becomes a live cell.

3. All other live cells die in the next generation. Similarly, all other dead cells stay dead.
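The three rules above fit in a few lines of code. This is a minimal sketch using a set of live-cell coordinates (my own representation, not Conway’s notation):

```python
from collections import Counter

def step(live):
    """Advance the Game of Life one generation; live is a set of (x, y)."""
    # Count how many live neighbors each candidate cell has.
    counts = Counter(
        (x + dx, y + dy)
        for x, y in live
        for dx in (-1, 0, 1) for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # Rules 1-3: birth on exactly 3 neighbors, survival on 2 or 3.
    return {
        cell for cell, n in counts.items()
        if n == 3 or (n == 2 and cell in live)
    }

# A "blinker" oscillates with period 2: three cells in a row flip
# between horizontal and vertical forever.
blinker = {(0, 1), (1, 1), (2, 1)}
print(step(step(blinker)) == blinker)   # True
```

Because the set is unbounded, the same function already exhibits the boundlessness discussed below: cells may wander arbitrarily far from the origin.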

The difference here between Go and Conway’s namesake is threefold:

  1. Conway’s game features no competition or players, just the extrapolation of future state from initial conditions; a chaotic plain of sorts.
  2. Conway’s game is boundless: cells can propagate for however far they reach; the only limit is how far you are willing to compute.
  3. Conway’s game is undecidable: there is no assurance that any state will converge toward an ultimate endgame.
 
An Automata Factory

Conway’s game rests on a base concept: the cellular automaton. Cellular automata can be thought of as a sufficient logical base for self-replication, where certain arrangements of “live cells” can persist or even replicate indefinitely over time.

 
They can look quite mesmerizing.

Such also is the DLT “Game of Life”. Directly derived from Bitcoin, or merely inspired by it, there are now thousands of differing chains, each embodying its own version, or class, of game. Each expressly works on its own board, oblivious to the existence of the other DLTs in their own worlds. This is preferred, as too much interdependence would introduce complexities compromising the security of each DLT game. You wouldn’t want the outcome of one tennis match to affect another directly; such a situation would compromise the integrity of the gameplay. But the end result of any game played repeatedly affects the strategies of said game. This collection of different competing strategies is called the MetaGame, a byproduct of the players interacting and learning from one another in the process of playing the game.

AXIOMS OF THE METAGAME

 

John Nash was a brilliant mathematician, statistician, and theorist. His work, like John Horton Conway’s, spans many different fields, but for our purposes we are focusing on a single gem from his treasure chest of ideas: the Nash Equilibrium. One could say the Nash Equilibrium is to Economics what DNA is to Biology; these fundamental mathematical relationships describe the eventual endgame the MetaGame is searching for. Within every game, player behavior converges towards a certain set of actions and reactions (in other words, a strategy!)

 
https://limenlemony.wordpress.com/2012/11/06/are-we-stuck-in-a-prisoners-dilemma/

Such equilibria describe the mathematical probabilities of the outcomes of competitive games, and their payouts. Nash constructed his equilibria under the assumption that the games played were:

Competitive

And

Convergent

Competitive, meaning that there was limited coordination among players; and convergent, meaning that there were distinct win states, with payoffs from said win states. This is extensible to quite a few games, but it is not universally applicable.
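A pure-strategy Nash equilibrium is simply a cell of the payoff matrix from which neither player gains by deviating alone. A brute-force sketch (the Prisoner’s Dilemma numbers below are the textbook illustration, not from this article):

```python
# Sketch: enumerate pure-strategy Nash equilibria of a two-player game
# given as two payoff matrices (rows = player A's moves, cols = B's).
def pure_nash(payoff_a, payoff_b):
    """Return (i, j) pairs where neither player gains by deviating alone."""
    rows, cols = len(payoff_a), len(payoff_a[0])
    eq = []
    for i in range(rows):
        for j in range(cols):
            best_row = all(payoff_a[i][j] >= payoff_a[k][j] for k in range(rows))
            best_col = all(payoff_b[i][j] >= payoff_b[i][k] for k in range(cols))
            if best_row and best_col:
                eq.append((i, j))
    return eq

# Prisoner's dilemma: move 0 = cooperate, 1 = defect. (Defect, Defect)
# is the unique equilibrium even though mutual cooperation pays more.
A = [[3, 0], [5, 1]]   # row player's payoffs
B = [[3, 5], [0, 1]]   # column player's payoffs
print(pure_nash(A, B))   # [(1, 1)]
```

The converged strategy pair is exactly the “endgame” the text describes: a point where unilateral deviation no longer pays.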

What was discovered in the 1970s was a greater theory, one more general for behavioral dynamics than Nash’s equilibrium. The Nash equilibrium imagines a situation where there is little to no coordinating structure accompanying the players. But there exists a superset of the Nash equilibrium, called a correlated equilibrium.

 
https://www.researchgate.net/figure/Relationship-between-Nash-correlated-and-coarse-correlated-equilibrium_fig1_258400586

Imagine some sort of signal-sending operator that can coordinate players so that they reach a more stable behavioral complex; it was discovered that with enough repetition, enough rounds of play, behavior can emerge within a game that appears to be coordinated by some centrally correlating entity. Now, when this theory was first imagined in the 1970s, people were thinking of functional entities like a Manhattan traffic light, a bulletin board for the New York Stock Exchange, or a NASDAQ-hosted execution engine for an order book. The list of cybernetic tools goes on and on; however, these instruments, especially the ones that required complex computation and coordination of different agents, might be more vulnerable to malicious distortion. Situations arise where, rather than the innate desire of people to cooperate with one another ensuring fair “play”, competition may be a corrupting influence on the outcome of the game. So there existed a need to centralize the operation of these tools, in order to ensure that a fair process was followed at scale.
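The traffic-light case can be made concrete. The payoffs below are invented for illustration; the point is only that obeying the signal is a best response whenever the other driver obeys too, which is the defining property of a correlated equilibrium:

```python
# Illustrative two-driver "go or stop" game coordinated by a traffic light.
PAYOFF = {  # (my action, other driver's action) -> my payoff
    ('go', 'go'): -10,    # crash
    ('go', 'stop'): 5,    # I get through
    ('stop', 'go'): 0,    # I wait safely
    ('stop', 'stop'): -1  # pointless standoff
}

# The light flips a fair coin and recommends opposite actions to the two
# drivers. Given the other driver obeys, deviating never helps:
assert PAYOFF[('go', 'stop')] > PAYOFF[('stop', 'stop')]   # told 'go'
assert PAYOFF[('stop', 'go')] > PAYOFF[('go', 'go')]       # told 'stop'

# Expected payoff per driver under the light's 50/50 correlation:
avg = 0.5 * PAYOFF[('go', 'stop')] + 0.5 * PAYOFF[('stop', 'go')]
print(avg)   # 2.5
```

No uncoordinated mixing of “go” and “stop” can avoid crashes entirely; the correlating signal is what removes the bad outcomes from play.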

 
https://portal.311.nyc.gov/article/?kanumber=KA-01047

With the inception of the Genesis block, everything changed. Synthesizing timestamps, Merkle trees, and fault-tolerant digital algorithms (Reusable PoW, Hashcash), Bitcoin incepted what could be the first autonomous, fully defined, correlated equilibrium. Starting from Genesis, Bitcoin began the exploration of ungnon game-theoretical territory: a vector of blocks with which the protocol, enlivened by sprites/oracles (the people who play roles within each DLT system), conducts virtuous cycles of consensus, refining and rewarding the players that are most effective at playing each game, with the ultimate goal of continual, never-ending growth. The correlated equilibrium engine is contained within the protocols themselves, replicated redundantly until corrupting such a process is virtually impossible, at least without incredible cost.
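The Merkle-tree ingredient is worth seeing on its own. This sketch mirrors Bitcoin’s double-SHA-256 construction in spirit (real consensus code handles serialization and many edge cases this toy ignores):

```python
import hashlib

def sha256d(data: bytes) -> bytes:
    """Bitcoin-style double SHA-256."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def merkle_root(txs):
    """Merkle root of a list of transaction byte strings."""
    level = [sha256d(tx) for tx in txs]
    while len(level) > 1:
        if len(level) % 2:              # Bitcoin duplicates a lone last hash
            level.append(level[-1])
        level = [sha256d(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

# Changing any transaction changes the root, so one 32-byte commitment
# timestamps the entire set at once.
root = merkle_root([b'alice->bob', b'bob->carol'])
tampered = merkle_root([b'alice->bob', b'bob->carol (forged)'])
print(root != tampered)   # True
```

It is this compression of an arbitrary transaction set into one tamper-evident hash, chained block to block with timestamps, that lets every player verify the game state independently.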

 
https://beesandbombs.tumblr.com/post/85375263689/wave

This foundation of Game Theory is what lies at the heart of every DLT: a similar repeated game that results in the proliferation of blocks along a linear or graph-like structure. Each DLT features its own flavor of the game theory first coherently implemented within Bitcoin. One derives from the next in succession, iterating, improving and innovating on what was established previously. Competition defines these games, with ultimate goals motivating action amongst their players, but, strangely enough, with enough repetition and development, sub-games, or strategies, emerge that supplement the finite, main goal. Now what does that mean? The notion of competition is essential for Bitcoin, Ethereum, and all Proof of Work chains. The Nash Equilibrium of this game is coupled with the block reward (the carrot for miners winning the game) and the competition itself: solving for a nonce that yields a block hash below a target set by the difficulty adjustment algorithm.
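The nonce-search game itself is easy to sketch. This toy miner (the header bytes and difficulty level are made up; a real Bitcoin header is a fixed 80-byte layout hashed with double SHA-256) searches for a hash below a target:

```python
import hashlib

def mine(header: bytes, difficulty_bits: int, max_nonce: int = 10**7):
    """Find a nonce whose SHA-256 hash has `difficulty_bits` leading zero bits."""
    target = 1 << (256 - difficulty_bits)   # lower target = harder game
    for nonce in range(max_nonce):
        digest = hashlib.sha256(header + nonce.to_bytes(8, 'big')).digest()
        if int.from_bytes(digest, 'big') < target:
            return nonce, digest.hex()
    raise RuntimeError('no nonce found within max_nonce')

# 16 leading zero bits: on average ~65,536 attempts, trivially checkable.
nonce, digest = mine(b'toy block header', difficulty_bits=16)
print(digest[:4])   # '0000' -- four leading hex zeros = 16 zero bits
```

The asymmetry is the point: finding the nonce costs many hash attempts, but any player can verify the winning ticket with a single hash.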

 
https://blockonomi.com/bitcoin-difficulty-target-adjustment/

The difficulty adjustment algorithm keeps the tempo of the game rhythmic relative to the skill of the players playing it. You can imagine a fabric of all these players and games, with each game operating independently of the others: Bitcoin in its own world, Ethereum in its own world, Monero in its world, etc. Ultimately, it is essential for each of these DLTs to operate autonomously, just as the creators of TCP/IP and the Border Gateway Protocol envisioned: a network-of-networks approach, where you have an ecosystem of different networks that are able to speak to one another through a protocol that allows them to transport objects, whether that’s data for TCP/IP, or standardized value and assets across ODAP.
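The tempo-keeping can be sketched in Bitcoin’s style: scale the target by the ratio of actual to scheduled time over the last window of blocks, clamped to a factor of four per adjustment (constants below follow Bitcoin’s 2016-block, two-week schedule; this is a simplified sketch, not consensus code):

```python
# Sketch of Bitcoin-style difficulty retargeting.
TWO_WEEKS = 14 * 24 * 60 * 60   # scheduled seconds per 2016-block window

def retarget(old_target: int, actual_seconds: int) -> int:
    """Raise the target (easier) if blocks were slow, lower it if fast."""
    # Clamp the adjustment to 4x in either direction, as Bitcoin does.
    actual_seconds = max(TWO_WEEKS // 4, min(actual_seconds, TWO_WEEKS * 4))
    return old_target * actual_seconds // TWO_WEEKS

old = 1 << 220
# Blocks arrived twice as fast as scheduled -> target halves (harder):
print(retarget(old, TWO_WEEKS // 2) == old // 2)   # True
```

Faster players shrink the target and slow the block rate back to schedule; departing players enlarge it, so the rhythm of the game survives swings in total skill.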

 
SRC: MIT & Council of Quamf/Yung AJ

As Mao Zedong said: let a hundred flowers bloom. In our case, we are letting a thousand DLT automata species bloom; so too will different DLTs proliferate, in nature and in number. If we are to build a truly coherent, resilient, and robust system, then there needs to be an ability to interchange DLTs automatically for the services they provide. If this really is a natural order, an ecosystem, then each DLT has resources it is competing for in order to maintain its internal metabolic rate, feeding upon the inputs and attention fed into it. Through a mix of utility and speculation, this is achieved. Utilities from one DLT should be interchangeable with those of another DLT of the same niche. Without that, you’re depending upon market participants to provide this aggregation service, leaving us vulnerable to exploitation by such a party were they to achieve a sufficiently powerful network effect.

A network of networks’ main goal is neutrality, and, through that neutrality, price normalization among different utilities. This is fundamental. It’s what allows Netflix to serve their video content and their content delivery platform at scale across jurisdictions, geographies, and service providers, ad infinitum. If Netflix had to negotiate with every single service provider that lays down fiber within the Americas, let alone the world, it would be a field day for their lawyers, but a nightmare for everyone else. It is essential that the main foundations for the Internet of trust be like the Internet of old that we know and love: Open, Interoperable, and Autonomous.

The plurality of protocols that have emerged from the tangled World Wide Web all converge on a single standard protocol which governs data packet transfer from gateway to gateway. IP serves as the waist of the hourglass, the single semantic point where software and hardware are able to meet. Use cases and specific applications can be built upon this common layer, but never diverge from this common standard of communication. There are many reasons why this is so: major institutional adoption, simplicity of implementation, and the seminality of the innovation are all great justifications for its dominance, but, whatever reason you resolve to, its dominance remains.

 
https://www.researchgate.net/figure/The-hourglass-architectural-model-of-the-Internet-Protocol_fig2_251419252

This is the power of standards-based approaches to designing systems. By considering the situation generally, you can design one-size-fits-all schemas, with wholly differently constructed gateways interoperable at scale across different networks. By constraining the method of communication to a single standard at a critical juncture, you actually allow for greater utility at the higher levels of the Internet Protocol stack. The “hourglass” nature of the IP stack is one of these abstract mappings of the logical space underlying the internet we use every day. By occupying the spot of convergence, you need not move to claim space above or below; the space you have claimed is at a critical juncture with which all other parts of the stack must deal.

This is the Ethos that lies at the heart of what Quant and MIT have begun with their ODAP and Gateway submissions to the IETF.

If Bitcoin can be thought of as incepting the unit of decentralized, cryptographic, digital value, much as Claude Shannon formalized the unit of entropy (the bit) that information theory, computation, and the internet are built upon, then ODAP (Open Digital Asset Protocol) is the formalized exchange of such digital units of value, just as the IP (Internet Protocol) suite is the formalized exchange of information. Recall the map of different DLTs, and therefore different forms of digital value, that our burgeoning crypto-ecosystem is incubating. A whole digital forest of value flourishes beneath our fingertips, never stopping for workdays, weekends or holidays, nor at the behest of any one governance authority. Each organism in this ecosystem needs to gather resources to survive and, in doing so, executes a number of cooperative, competitive, or complex strategies. Just as a natural forest can be thought of as an energy-capture game, so too can DLTs be thought of as a value-capture game! Even though the energy capture and catalyst processes may be species/product specific (think the Krebs and urea cycles, glycolysis, etc.), they all still work toward the same general goal: energy. A unit of glucose is a unit of glucose, whether a cow or a blade of grass is using it. What is necessary, then, for the corresponding ecosystem of value/trust to emerge is the adoption of value-exchange protocols such that the greater DLT ecosystem can interchangeably use fundamental units of value in the contexts where they are needed. Whether these fundamental units represent storage, computation, or data itself, convergence towards a single protocol of exchange (ODAP) allows for universal adoption and an incredible diversity of protocols and use cases built on top. An untold amount of utility is made possible where once there was nothing, generating substance from the seams between the protocols.
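To show mechanically what interchangeable units of value demand of a gateway, here is an illustrative lock-then-issue sketch. Every name in it (Gateway, transfer, the ledger dicts) is invented for illustration; ODAP’s actual message flow is specified in the IETF drafts, not here.

```python
# Illustrative only: the generic "escrow on the source chain, issue on the
# destination chain" pattern a gateway-to-gateway transfer relies on.
from dataclasses import dataclass, field

@dataclass
class Gateway:
    name: str
    ledger: dict = field(default_factory=dict)   # account -> balance
    locked: dict = field(default_factory=dict)   # transfer_id -> escrowed amount

    def lock(self, transfer_id, account, amount):
        assert self.ledger.get(account, 0) >= amount, 'insufficient funds'
        self.ledger[account] -= amount
        self.locked[transfer_id] = amount        # escrow until finalized

    def issue(self, account, amount):
        self.ledger[account] = self.ledger.get(account, 0) + amount

    def release(self, transfer_id):
        del self.locked[transfer_id]             # finalize: escrow never returns

def transfer(src, dst, transfer_id, sender, receiver, amount):
    src.lock(transfer_id, sender, amount)   # phase 1: escrow on source
    dst.issue(receiver, amount)             # phase 2: issue on destination
    src.release(transfer_id)                # phase 3: finalize the escrow

a = Gateway('chain-A', {'alice': 100})
b = Gateway('chain-B')
transfer(a, b, 'tx1', 'alice', 'bob', 40)
print(a.ledger['alice'], b.ledger['bob'])   # 60 40
```

The value of a standard is that both gateways agree on these phases and their failure handling in advance, so any two compliant ledgers can interoperate without bespoke negotiation.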

 
https://medium.com/@CryptoSeq/the-network-of-networks-scalable-interoperability-to-unleash-the-true-potential-of-blockchain-c54e7d373d2d

This is the MetaGame that Gilbert envisioned from the very beginning. Instead of blindly placing pieces on the board, expanding inch by inch the territory first established, Mr. Verdian chose to strategically seize select choke points which straddle the barrier between the digital new world and the existing legacy guardians: choice spots such as government standards bodies, private corporations, and national infrastructure, accumulating weight at the edges of the board. While many an aspiring Satoshi tried to emulate, one-up, or even refute what Bitcoin did, each failed to realize that, without interoperability (and institutional adoption to boot), blockchain/DLT/cryptocurrency would remain novelties of cryptographic construction. The likes of UCL, MIT, Standards Australia, the NHS, the BSI, Vocalink (which ran the real-time payments infrastructure for the Bank of England), the Bank of England itself, the Federal Reserve, the IETF, the ITU, the IEEE; the list goes on and on. It shows Gilbert coordinating and establishing the boundaries of the fabric of DLTs that is proliferating and growing at an exponential rate. So, by providing this correlated equilibrium through ISO TC 307, and its embodiment within Overledger, a network of networks can be created, where the utility of every distinct DLT is normalized and brought together, forming an interoperable, cooperative ecosystem of different architectures. Not one is chosen from the beginning; the best and most robust architectures win out.

 
Wei Qi

Coming full circle, we stand at the precipice of a new paradigm: one for digital assets, the internet, and trust itself. The foundations of our society are being shaken, and, from this sifting process, will emerge new axioms upon which we base national critical infrastructure. Quant is just one actor, fulfilling a very important role in this rearchitecting: encircling the world of DLT, standardizing it, and helping give birth to an open, autonomous, and interoperable ecosystem of ecosystems.

Your move.

References/Resources:

Nash, J., “Equilibrium Points in n-Person Games” (www.pnas.org)

 

https://arxiv.org/abs/1907.02143

https://www.cs.princeton.edu/courses/archive/fall06/cos561/papers/cerf74.pdf

http://economics.mit.edu/files/1082

https://medium.com/avalanche-hub/comparison-between-avalanche-cosmos-and-polkadot-a2a98f46c03b

https://bitcoin.org/bitcoin.pdf

“Bitcoin Miners Beware: Invalid Blocks Need Not Apply” (hackernoon.com)

 

https://github.com/SmithSamuelM/Papers/blob/master/whitepapers/KERI_WP_2.x.web.pdf

https://medium.com/selfrule/meta-platforms-and-cooperative-network-of-networks-effects-6e61eb15c586

Updated on June 20, 2021
