
Report

March 22, 2019

Cryptofinancial Valuation Series Part Two: The Quantity Theory of Money

This article provides an overview and analysis of one of the first macroeconomic theories on record, the Quantity Theory of Money (QTM). Complete with a review of the necessary model assumptions, the modifications needed to facilitate applications to crypto, and the specific challenges associated with velocity, Smith + Crown takes a close look at how the QTM continues to influence economic discourse to this day.

Overview

The Quantity Theory of Money (QTM) has become the de facto payment token valuation tool in the crypto industry, given the lack of well-established alternatives, despite interpretations of the theory varying greatly within traditional economics and a number of challenges in employing the model to value a cryptoeconomy. The QTM's ease of application may further explain its popularity, along with its pedigree, which confers a sense of legitimacy that can mask the oversimplification underlying certain applications of the model.

The inconspicuous transition of the QTM from macroeconomic model to valuation tool suggests it is important to understand what constitutes a valuation in the broadest sense. At a fundamental level, any particular type of valuation simply represents a specific ruleset that is generally accepted by market participants. While popular valuation methodologies such as those enumerated in Part One of this series are certainly justifiable from a theoretical perspective, there is no intrinsic reason why such models should be considered beyond reproach or strictly correct. Rather, the acceptance of these models is largely a product of convention.

In the case of crypto payment tokens or cryptoassets with payment or medium-of-exchange functions, analysts must attempt to model the demand for a money-like instrument. Such an exercise differs from an equity valuation for several reasons; it is more akin to determining the price levels associated with a specific monetary base and production environment. Analysts have defaulted to the QTM (assuming its assumptions are met) because it appears to be the closest equation in the economics literature that attempts to estimate demand for a non-interest-bearing payment instrument.

As mentioned in Part One of this valuation series, examining the origins of the QTM is paramount to better understanding its benefits and limitations. What follows is a review of the necessary model assumptions, the modifications needed to facilitate applications to crypto, the specific challenges associated with velocity, and best practices.

[Photograph of a bank interior]
The Quantity Theory of Money (QTM) has a long and complex history; many influential thinkers have struggled to understand its applicability and limitations. The viability and relevance of the QTM hinge on whether the model can be used as a tool for monetary policy prescription.

A Historical Perspective

The QTM is a formula that links the value of money in circulation to its respective economic output. The formula has been articulated and restated by a long line of influential thinkers: Nicolaus Copernicus, John Stuart Mill, Karl Marx, John Maynard Keynes, Milton Friedman, Irving Fisher, and Knut Wicksell. While its proponents suggest the model is useful when evaluating the impact of changes to the money supply on price levels, the requisite model assumptions and its historical context are important to consider before implementation.

Robert Skidelsky's Money and Government offers readers an excellent discussion of the history of the QTM and its various interpretations. Skidelsky explains that the QTM has spawned at least two major interpretations: the 'American' Friedman-esque view popularized by Irving Fisher and the 'European' Wicksellian view. Although both maintain that changes in price levels are directly caused by expansion and contraction of the money supply, Fisher considers this to be a direct function of the central bank, while Wicksell argues that the central bank could exert control only indirectly, given its reliance on commercial banks to lend out the newly minted money supply in turn. Although steadfast Wicksellians may emphasize the ability of commercial banks to interfere, both views can be very powerful from a policy prescription perspective: in a fiat system, since the central bank can (directly or indirectly) control the money supply, the central bank can, by extension, set price levels.

To the extent that volatility in price levels leads to unanticipated shifts in wealth distribution and activity levels that catalyze economic and social instability (a quintessential Monetarist view), price stability should be the primary goal of central bank policy. While the QTM may appear at first glance to offer a framework for calibrating price levels, on the perception that a desired increase in price levels can be effected through a proportional increase in the money supply, there are significantly more considerations to wrestle with before arriving at such a conclusion.

To better understand these considerations, one may examine the Fisher formulation and its assumptions.

The Fisher Formulation and Model Assumptions

While there are several versions of Fisher’s formula, this version emphasizes the original context, where money supply multiplied by circulation velocity equals the sum of each unit of economic output multiplied by its respective price:

MV = ΣpQ
M - Money supply
V - Velocity of circulation
ΣpQ - The quantity of each good in the economy multiplied by its respective price, summed across all goods

Fisher simplifies further by replacing the summation portion of the formula with PT, where P represents the average price of all goods and T represents the total number of transactions during the period:

MV = PT

Considered an identity equation, the above formulation is true by definition given that ‘goods cost what you pay for them.’ Several fundamental economic assumptions are required, however, before the equation can be used as a tool for monetary policy prescription:

  1. The price level (P) is an exogenous factor: P is a passive variable that is influenced by other variables but does not, in turn, influence them
  2. The velocity of money (V) is constant throughout the subject period and is not influenced by endogenous factors
  3. The volume of transactions (T) is constant throughout the subject period and is not influenced by endogenous factors
  4. People never hoard cash and never demand money directly. Rather, people demand goods and services. Any accumulation of savings is interpreted as demand for future consumables, not demand for money itself
  5. The free market will naturally establish and maintain full-employment equilibrium.

Needless to say, the acceptance of this host of assumptions has been anything but universal.

"Too large a proportion of recent ‘mathematical’ economics are merely concoctions, as imprecise as the initial assumptions they rest on, which allow the author to lose sight of the complexities and interdependencies of the real world in a maze of pretentious and unhelpful symbols."

— J.M. Keynes

Challenging the QTM

Although Keynes himself struggled to form a definitive position on the topic, he eventually challenged the assumptions of the QTM in his seminal 1936 book The General Theory of Employment, Interest, and Money. Successfully arguing that unemployment was likely but not inevitable and that money demand needed to be modeled as an independent factor of production, Keynes compellingly discredited assumptions four and five. Furthermore, Keynes recognized that velocity is not an independent variable but is affected by P, Q, and M, thereby effectively invalidating assumption two.

A generic modern-day example, in which GDP (i.e., PQ) results are reported much lower than anticipated, helps illustrate Keynes' point. In this scenario, many citizens and investors are likely to consider the news alarming and suggestive of future uncertainty. Investors who are particularly concerned are likely to reduce spending going forward, instead preferring the benefits of increased liquidity. The resultant increase in savings implies more people are holding money for longer, thereby reducing circulation velocity. This simple example demonstrates one instance where changes in PQ can ultimately impact V, a violation of assumption two. What is more, one need not go far for a concrete illustration of this phenomenon: the chart below shows the vertiginous decline in M1 velocity that followed the 2008 Financial Crisis.

Although Keynes may not have had access to a robust historical record of velocity such as the series charted below, he recognized significant issues with several QTM assumptions, ultimately supporting his conclusion that the QTM was an unreliable tool for analyzing prospective monetary policy. Keynes' arguments cast doubt on the Monetarist aspiration to achieve price stability through QTM-informed changes to the money supply. Adjusting the money supply so as to achieve a price-level change of equal sign and magnitude will likely prove unproductive, given the propensity of velocity to change simultaneously. While Keynes' criticisms were themselves never universally accepted, their most important impact was in presenting a number of well-argued, substantial challenges to this widely accepted formulation.

[Chart: Velocity of M1 Money Stock]
Source: Federal Reserve Bank of St. Louis, Velocity of M1 Money Stock [M1V], retrieved from FRED, Federal Reserve Bank of St. Louis; https://fred.stlouisfed.org/series/M1V, February 20, 2019.

While a definitive assessment of QTM approaches is beyond the scope of this work, the critical observation from the above discussion is that even in its original context the QTM has never achieved universal or unchallenged acceptance. The outstanding questions and issues noted above merely begin to illustrate some of the concerns that have been raised, and which remain contentious, debated points even amongst professional economists. Acknowledging these concerns is relevant here because it encourages caution towards analyses built on QTM foundations, a useful mindset before considering some of the applications of the QTM to the crypto world.

QTM - The Crypto Way

Despite its utility as an often directionally accurate but imperfect tool for evaluating the impact of monetary policy, the application of the QTM to the world of cryptofinance has caused no small amount of confusion. Widely popularized by a number of hedge funds and crypto enthusiasts, the QTM has become the most prevalent cryptoasset valuation approach. While it has clearly made a meaningful contribution as a model that has catalyzed public discourse surrounding cryptoasset valuation, applications of the QTM in crypto contexts often occur haphazardly and have frequently led to questionable conclusions.

The elegance and simplicity of the equation may be both its best and its most problematic aspects. How could such a basic equation confuse so many of its proponents? The issues can likely be traced to three major areas:

  1. A lack of attention to detail or misunderstanding of ecosystem ‘price levels’, exchange rates, and their units;
  2. The inherent challenges of approximating and understanding velocity; and
  3. Complexities associated with estimating the monetary base (M) given the uncertainty over how much is actively circulating for purposes of economic transaction.

The fact that the QTM is an identity equation that ‘must always be true’ appears to serve as a source of confusion for many. The confusion presumably stems from erroneously reasoning that, if one replaces USD references with references to a specific cryptoeconomy and divides both sides of the equation by V, a valuation should emerge. Many fail to realize the number of considerations and assumptions that are involved prior to calculating the final result.

The Units Issue & Proper Model Specification

Model assumptions aside, using the QTM in the context of a fiat-based economy is relatively straightforward, with consideration for the units on each side of the equation being fairly trivial when viewed in QTM’s original context:

MV = PQ
M - Quantity of money (USD)
V - Money velocity (a scalar, unitless variable)
P - Price level, in units of currency per unit of output
Q - Quantity of output, in units of output

Multiplying out the variables on each side yields units of USD on both sides of the equation — this makes sense for an identity equation. If the units on each side of the equation did not match, this would indicate that a mistake was made. When analyzing a cryptoeconomy, however, the specification of units requires strict attention, as failure to do so can result in erroneous calculations.
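To make the bookkeeping concrete, consider a minimal Python sketch with toy figures of our own choosing (not real economic data), confirming that both sides of the identity resolve to USD:

```python
# Toy illustration of unit bookkeeping in MV = PQ; all figures are
# hypothetical, chosen only to make the dimensional analysis visible.

M = 2_000_000_000   # money supply, in USD
V = 5.0             # velocity: a unitless scalar (turnovers per period)
Q = 500_000_000     # economic output, in units of goods

P = M * V / Q       # price level, in USD per unit of goods

lhs = M * V         # USD * (unitless) -> USD
rhs = P * Q         # (USD per unit) * units -> USD
assert abs(lhs - rhs) < 1e-6  # identity holds by construction

print(f"P = {P:.2f} USD per unit of output")  # P = 20.00
```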

Warren Weber has succinctly articulated the importance of correctly specifying units when utilizing the QTM in his article The Quantity Theory of Money for Tokens. A former Federal Reserve economist and researcher focused on the theory and history of money and banking and the implications of new payment technologies for the future of monetary policy, Weber is an expert on the subject matter. Weber explains how the QTM is used in the economic literature to obtain a relationship between the money supply and the price level. Using a superscript 'T' to denote a token economy and rearranging the equation to solve for P illustrates this relationship:

P^T = M^T V^T / Q^T

Note that P^T does not represent the token price; rather, it represents the price of a unit of 'project output' quoted in units of native project tokens. In addition, V^T does not represent the number of times that tokens are bought or sold for USD. Rather, V^T is the average number of times a token is used to purchase project output. (A review of freely available cryptofinancial valuations suggests some remain confused on these points.)

Alternatively, we can restate the above equation to solve for M^T:

M^T = P^T Q^T / V^T

Here M^T represents the quantity of tokens that is supported by the project economy at a specific moment in time. In other words, M^T is the number of tokens required to support all transactions in the project economy without a change in the level of output or the unit price of that output. Weber offers an even more intuitive and precise formulation relative to the typical goal of solving for a USD-denominated token price:

E = M^T V^T / (π Q^T)

Where π is the USD-denominated price of Q^T, M^T is the maximum number of tokens authorized, and E = P^T / π.

We then extend Weber's formulation one step further by taking the reciprocal, arriving at a formula that expresses the token price in USD terms:

USD Token Price = 1 / E = π Q^T / (M^T V^T)

While the above formula is helpful across a diverse set of cryptoeconomic applications, the quality of formula inputs surely limits the quality of its output. Furthermore, understanding which variables are independent and which are dependent is critical when attempting to utilize the model to make pro forma forecasts. This is of particular importance in the context of the specification of velocity.
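Before turning to velocity, a minimal sketch, using hypothetical placeholder inputs rather than estimates for any real project, shows how these formulas chain together and confirms that the two expressions for the USD token price agree:

```python
# Hypothetical placeholder inputs -- not estimates for any real project.

M_T = 100_000_000   # monetary base, in tokens
V_T = 8.0           # average times a token buys project output per period
Q_T = 40_000_000    # units of project output per period
pi  = 25.0          # USD price of one unit of project output

P_T = M_T * V_T / Q_T        # price of output, in tokens per unit
E   = P_T / pi               # exchange rate, in tokens per USD

usd_token_price = 1 / E
# Matches the reciprocal form above: pi * Q^T / (M^T * V^T)
assert abs(usd_token_price - pi * Q_T / (M_T * V_T)) < 1e-9

print(f"P^T = {P_T:.2f} tokens per unit of output")  # 20.00
print(f"USD token price = ${usd_token_price:.2f}")   # $1.25
```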

Theoretical Challenges with Velocity

Taking a quick step back, in the context of the Monetarist view of the QTM, velocity is assumed to be independent of the monetary base (M) and the GDP of the economy (PQ). Rather, Monetarists view velocity as a function of interest rates, trade activity, habits, population, preferences, and investment frictions, among other complex factors. The independence of velocity from endogenous QTM factors theoretically allows for the comparison of various states of the economy over time. If velocity is, in fact, affected by changes in other QTM variables, the probability of model misspecification rises dramatically.

More troubling is that while establishing velocity in a national economy is simplified by the fact that the other variables in the equation are known, this is not the case in a cryptoeconomy. In effect, solving for velocity requires making apt prior assumptions about the size of the monetary base, thereby making any output at best approximate and dependent upon the quality of those assumptions.


As previously mentioned, the QTM is an identity equation that must be true at any given point in time. To get the most out of the equation from a valuation perspective, analysts need to apply the QTM to discrete periods of time in the future. This is often where challenges arise, as analysts are prone to unknowingly make assumptions about the relationships among QTM variables (e.g., adopting the Monetarist view that changes in PQ do not affect V).

Although it may at first glance appear straightforward to calculate velocity based on 'observable' inputs for M and PQ, as with most valuation analysis, the devil is in the details. Chris Burniske's influential piece Cryptoasset Valuations represents a well-known and generally highly regarded example of cryptoasset valuation. Burniske models a fictional cryptoasset, INET, a payment token granting users access to a decentralized virtual private network. Using MV = PQ as a basis to determine utility value throughout the explicit forecast period, Burniske then discounts token utility to the present day and aggregates the results.

While a full critique of the INET model itself is not necessary for the time being, note that Burniske's article contains a bitcoin velocity calculation that is likely emblematic of the most widely implemented approach currently in practice. Burniske annualizes an estimated daily transaction value figure of $160 million (PQ), dividing the annualized number by the average size of bitcoin's asset base, $8.9 billion (M), to arrive at a velocity of 6.5. Although the article includes a fairly thoughtful discussion of the resulting figure, including an adjustment based on the proportion of bitcoin addresses that are assumed to utilize bitcoin strictly as an investment, the general methodology is problematic.
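For concreteness, the arithmetic behind that estimate can be reproduced in a few lines, using the figures quoted above (the small difference from 6.5 presumably reflects rounding or day-count conventions):

```python
# Back-of-envelope reproduction of the velocity arithmetic described above,
# using the figures quoted in the text.

daily_txn_value_usd = 160_000_000          # estimated daily transaction value (PQ, per day)
annual_pq_usd = daily_txn_value_usd * 365  # annualized PQ

avg_asset_base_usd = 8_900_000_000         # average USD size of bitcoin's asset base (M)

velocity = annual_pq_usd / avg_asset_base_usd
print(f"velocity ~= {velocity:.2f}")       # ~6.56, versus the 6.5 quoted above
```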

Firstly, as noted in the preceding section, when applying MV=PQ to a cryptoeconomy, M represents the number of currency units in circulation—it does not represent the USD value of said units. (Conflating the USD-denominated market capitalization with the total number of units authorized appears to be a common misspecification of the QTM and can lead to incorrect conclusions.) Secondly, utilizing average market capitalization figures in the calculation of historical velocity inextricably embeds the then-current price into the resulting velocity estimate. Using this velocity for analysis in other periods implicitly assumes that user behavior will not change as a result of what could be dramatic changes in the asset pricing environment. Moreover, using velocity calculated in one period for analysis in another also assumes that all other factors impacting velocity in the Monetarist view (e.g., population, trade activity, habits, preferences, and investment frictions) won't change.

For example, this suggests that in the case of a payment token with percentage-based transaction costs, no matter how expensive transaction fees become as asset prices increase, users will not seek to reduce the number of discrete transactions they execute. Moreover, this approach assumes no changes to velocity would result in a case where system GDP (i.e., PQ) increases substantially as a result of higher transactional load resulting from a large cohort of new users.

[Chart: Monetary Velocity in Advanced Economies]

Sources: Federal Reserve Bank of St. Louis (FRED); Exchange-Rates.org, World Currency Exchange Rates and Currency Exchange Rate History, retrieved from MBH Media, Inc.

Practical Considerations for Velocity

While it may be easy to specify the ideal objective of a velocity calculation—that is, to account for the totality of transactions that represent the genuine acquisition of project output and divide by the freely circulating monetary base—the execution of this objective is anything but straightforward. Specifically, the pseudonymous nature of most cryptosystems makes the identification and exclusion of transactions that do not contribute to economy GDP quite difficult in practice. Furthermore, the inherent nature of the monetary base of most cryptosystems presents additional challenges (discussed in the subsequent section).

The thoughtful methodology and technical acrobatics outlined in BlockSci: Design and applications of a blockchain analysis platform are laudable. Beyond the technical architecture, the paper outlines a practical method of calculating velocity through analysis of on-chain activity. In short, the calculation relaxes the requirement to remove transactions that are not associated with a purchase of project output, instead capturing every instance where a unit of currency changes possession in any manner.

In addition, BlockSci researchers attempt to adjust for ‘self-churn’ by eliminating transactions where outputs are controlled by an address linked to one of the input addresses, as well as cases where outputs are spent within four blocks (a scenario that may be indicative of self-churn whereby a large output is broken down into a series of smaller outputs in a series of transactions). Although this approach to calculating velocity is highly practical and may allow for the comparison of velocity across multiple cryptoassets, the reformulation modifies the fundamental notion of velocity without any offsetting adjustments to other QTM variables. This modification to velocity therefore renders it inappropriate as an input to the QTM.
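As an illustration of the mechanics only, and emphatically not BlockSci's actual implementation, a simplified sketch of the two self-churn heuristics described above might look as follows, with the types and the address-clustering step standing in as hypothetical placeholders:

```python
from dataclasses import dataclass
from typing import List, Optional, Set

@dataclass
class Transfer:
    input_addresses: Set[str]
    output_address: str
    value: float                    # amount transferred
    block_height: int               # block containing this transfer
    output_spent_at: Optional[int]  # block where the output was next spent, if known

def cluster_of(address: str) -> str:
    # Placeholder: real systems apply address-clustering heuristics here;
    # for this sketch, every address is simply its own cluster.
    return address

def is_self_churn(t: Transfer) -> bool:
    # Heuristic 1: output controlled by a cluster linked to an input address.
    if cluster_of(t.output_address) in {cluster_of(a) for a in t.input_addresses}:
        return True
    # Heuristic 2: output re-spent within four blocks, suggestive of churn.
    if t.output_spent_at is not None and t.output_spent_at - t.block_height <= 4:
        return True
    return False

def churn_adjusted_volume(transfers: List[Transfer]) -> float:
    # Transaction volume with apparent self-churn removed, as would feed
    # the numerator of a velocity estimate.
    return sum(t.value for t in transfers if not is_self_churn(t))
```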

Although the challenges posed by the pseudonymous nature of most cryptosystems and the need to carefully classify transactional activity may, at times, seem insurmountable, analysts may be able to utilize comparable economies to determine a range of velocity values sufficient to construct a reasonable valuation output. While such an approach is likely in its infancy in practice, a thoughtful triangulation of velocity based on comparable crypto and fiat economies remains theoretically sound. To the extent there are cryptoeconomies with similar characteristics that offer more easily observable transactional and monetary base data, it is logical to extrapolate that the calculated velocity could be used in a comparable cryptosystem.

Fiat economies offer robust, meticulously maintained financial datasets; they represent a viable source of comparison and can help analysts determine an appropriate range of velocity assumptions. The graph above suggests that velocity in fiat economies (at least in the advanced economies included) remains relatively stable over time when considered on an order-of-magnitude basis. While there are exceptions to this stability, at a minimum the data provides a certain level of comfort if we consider that genuine cryptosystem transactions involving the purchase of project output are generally originated by a human. To the extent velocity is a function of underlying human activity, its characteristics are likely not entirely dissimilar across fiat and crypto economies. This is not to suggest that one can indiscriminately utilize a velocity range of 0 to 10 based on the data above, but rather that it is potentially valuable to leverage these rich datasets when performing a payment token valuation using the QTM.
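As a hypothetical sketch of such triangulation, with placeholder figures throughout, one might bound velocity using comparables and observe the implied price range rather than commit to a point estimate:

```python
# Hypothetical sketch: bounding velocity with a range informed by comparable
# economies, then observing the spread of implied USD token prices.

M_T = 100_000_000   # tokens in circulation (placeholder)
Q_T = 40_000_000    # units of project output per period (placeholder)
pi  = 25.0          # USD price per unit of output (placeholder)

for v in (2.0, 5.0, 10.0):   # velocity range suggested by comparables
    implied_price = pi * Q_T / (M_T * v)
    print(f"V = {v:>4}: implied token price ~= ${implied_price:.2f}")
# V =  2.0: $5.00    V =  5.0: $2.00    V = 10.0: $1.00
```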

Quantifying the Interrelatedness

Given the large impact of the velocity estimate on overall valuation, and in light of the theoretical and practical challenges associated with estimating velocity for purposes of pro forma forecasts, assessing how changes in PQ affect V is imperative to a correct application of the QTM. Considering all aspects of variable interaction, including the certainty, direction, and magnitude of impact, is equally imperative. A cursory look at correlations is inadequate for understanding the interplay between these variables, as it does not account for the scale of impact. This is not to suggest that there is a way to circumvent making what will often constitute a major assumption about the interplay of these variables. Rather, it is to emphasize the importance of considering the relationship at present and understanding how various developments can change the nature of these relationships as time passes.

Utilizing a framework that quantifies the relationship between QTM variables appears to be a logical next step in the evolution of the payment token valuation dialogue. Fortunately, Johnny Antos has articulated a fairly accessible approach in his piece Cryptoasset Valuation: Introducing Beta of Velocity; the methodology employed is generally accepted in the traditional finance community as a tool to measure the sensitivity of stock prices to broad market movements. Antos leverages the well-known concept of beta as a basis to quantify the relationship between PQ and V. Termed the 'beta of velocity' and calculated per the formula below, the approach allows analysts to measure the impact of changes in PQ on V and M.

βV = Cov(rV, rPQ) / Var(rPQ)
βV - Beta of velocity
rV - Percentage change in velocity (V)
rPQ - Percentage change in cryptoasset GDP (PQ)

Substitution and simplification of the above formula using the relationships between standard deviation, variance and correlation allows for an alternative formulation that may more easily demonstrate the factors at play.

Given that:

βV = Cov(rV, rPQ) / Var(rPQ)

and

Cov(rV, rPQ) = ρV,PQ σV σPQ

And since:

Var(rPQ) = σPQ²

It follows that:

βV = ρV,PQ (σV / σPQ)

where ρV,PQ is the correlation of changes in velocity and cryptoasset GDP, and σV and σPQ are the respective volatilities. This formulation suggests that the extent of co-directional movement is a function of the correlation between changes in velocity and changes in cryptoasset GDP, adjusted to account for the relationship between the standard deviation of changes in velocity and the standard deviation of changes in cryptoasset GDP.
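A minimal estimation sketch, using synthetic placeholder series rather than real data, confirms that the covariance-ratio and correlation-times-volatility-ratio formulations agree:

```python
import numpy as np

# Synthetic placeholder series of period-over-period percentage changes.
r_v  = np.array([0.04, -0.02, 0.06, 0.01, -0.03])   # changes in velocity (V)
r_pq = np.array([0.05, -0.01, 0.08, 0.02, -0.04])   # changes in cryptoasset GDP (PQ)

# Beta of velocity: Cov(r_V, r_PQ) / Var(r_PQ)
beta_v = np.cov(r_v, r_pq, ddof=1)[0, 1] / np.var(r_pq, ddof=1)

# Equivalent form: correlation times the ratio of volatilities.
rho = np.corrcoef(r_v, r_pq)[0, 1]
beta_v_alt = rho * r_v.std(ddof=1) / r_pq.std(ddof=1)

assert np.isclose(beta_v, beta_v_alt)
print(f"beta of velocity ~= {beta_v:.3f}")
```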

In lieu of nonchalantly assuming that changes in PQ have no impact on V (without a reasonable basis for doing so), analysts can assume a relationship exists and use the beta of velocity as a way of predicting the expected impact. Depending on the proximity of βV to zero or one, the expected direction and magnitude of impact can be quantified. For example, where βV is greater than one, an increase in PQ will result in an increase in V that is larger on a percentage basis, thus resulting in a lower M value.

Extending this analysis with inspiration from the way in which financial analysts use beta to measure systematic risk, it follows that one can utilize βV to calculate the velocity that is expected after an instantaneous shift in PQ:

V* = V0 (1 + βV · rPQ)

Where V0 is the initial velocity, V* is the velocity immediately following the change in PQ, and rPQ is the percentage change in PQ.
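As a hypothetical usage example: with an initial velocity of 6.0, a beta of velocity of 1.2, and an instantaneous 10% increase in PQ, the formula yields a shifted velocity of 6.72:

```python
def shifted_velocity(v0: float, beta_v: float, r_pq: float) -> float:
    # V* = V0 * (1 + beta_V * r_PQ), per the relationship above
    return v0 * (1 + beta_v * r_pq)

print(shifted_velocity(v0=6.0, beta_v=1.2, r_pq=0.10))  # -> 6.72
```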

While a number of challenges remain in terms of accurately calculating velocity at any given point in time, formally considering the relationship between PQ and V is a worthwhile endeavor that may provide additional clarity when contemplating the myriad ways in which a cryptoeconomy may develop over time.

[Table: Changes in PQ and the beta of velocity]

The Monetary Base

Yet another variable that requires consideration is the monetary base (M). For the QTM to function correctly, the referenced velocity must correspond to the referenced monetary base. As the model assumes that the product of the monetary base and velocity represents the entirety of productive economic activity, the monetary base must include only units of currency that circulate freely.

Although most advanced economies have derivative and lending activity that further complicates monetary base calculations, most countries (such as the U.S.) have adopted conventions that allow for the precise specification of which currency units are considered to be freely circulating, a standardization effort that dramatically improves calculation accuracy. As with many aspects of the nascent crypto space, the equivalent concepts in cryptofinance are immature, and analysts are thus left to their own devices when performing such calculations. Crypto derivatives, lending, and the presence of layer 2 solutions further compound complexity and increase the need for discretion.

To make matters worse (and despite the fact that all records are digital and immutable), cryptosystems are inherently more difficult to account for given the diversity of transactional activity. Just as one must isolate and remove trading activity from system GDP (i.e., PQ) calculations, given that the QTM applies only to genuine economic activity, analysts must identify and exclude tokens that are used to support secondary trading activity. In addition, orphaned tokens resulting from lost private keys and the impact of periodic and unpredictable staking must also be accounted for; a failure to do so would result in an overstated monetary base. Given this plethora of issues, it is not clear that one can ever definitively determine the monetary base of a cryptosystem with any significant level of adoption.
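While no definitive procedure exists, a sketch of the kind of free-float adjustment described above, with all figures hypothetical and each line item itself an estimate, might look like the following:

```python
# Hypothetical free-float adjustment; every line item below is itself an
# estimate subject to the identification problems discussed above.

total_supply        = 100_000_000
exchange_inventory  = 15_000_000   # tokens supporting secondary trading, not output purchases
orphaned_estimate   =  4_000_000   # tokens orphaned by lost private keys
staked_at_snapshot  = 10_000_000   # tokens locked in staking at the measurement date

free_float_m = total_supply - exchange_inventory - orphaned_estimate - staked_at_snapshot
print(f"free-float monetary base ~= {free_float_m:,} tokens")  # 71,000,000
```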

Closing Thoughts

The preceding discussion is perhaps especially helpful in illustrating the challenges of valuing many of the utility tokens that dominated the industry through 2018, which, given their lack of defined economic rights, rely upon a delicate assemblage of assumptions that prove precarious in most real-world applications.

Given the host of challenges associated with properly implementing a QTM-based payment token valuation, the approach may raise as many questions as it resolves in most existing cryptofinancial applications. A successful valuation must at minimum apply the QTM with precision while simultaneously navigating a complex web of theoretical quandaries and practical limitations. The challenges associated with utilizing units consistently, correctly identifying token holdings that must be removed from the monetary base, and accurately parsing transactional history should not be underestimated. In addition, use of the model requires an ability to forecast velocity with a reasonable degree of accuracy, a feat that is problematic given that: 1) economists generally do not forecast velocity; 2) there are serious methodological concerns with treating velocity independently; and 3) it’s unclear which historical velocity values in crypto or fiat economies might apply.

A meticulous implementation that adjusts the QTM to appropriately account for the endogeneity of velocity with the appropriate caveats, however, can still provide valuable output. For example, while it may be difficult to accurately estimate velocity at a future point in time, a thorough review of current and anticipated system dynamics including transactional frictions, user adoption, user behavior, future development trajectory, and an assessment of the likely relationship between velocity and PQ may provide sufficient context to surmise a reasonable range of expected velocity values. As mentioned earlier, one may also use comparable economies (both fiat and crypto-based) to triangulate a range of reasonable expected velocity values that will facilitate the use of the QTM as a tool for valuation.

Furthermore, while it is undoubtedly challenging to produce a definitive valuation, considering the impact of various system design and token holder behaviors in the context of the QTM may provide some level of guidance when establishing system structure, system parameters, and KPI targets.

Thankfully there are other cryptofinancial valuation methodologies available that can help fill the gap. Furthermore, as mentioned in Part One of this series, the consummate analyst rarely relies on any single valuation tool, instead preferring to triangulate value across several distinct approaches.

Fortunately for practitioners, some cryptoeconomic structures are naturally more conducive to traditional valuation. Token economies that provide holders with a stream of value, either via dividend-like payments or by virtue of service discounts, provide analysts with more tangible value distribution mechanisms that can be more readily modeled. While the emerging security token ecosystem is likely to allow analysts to sidestep many of these critical issues by allowing for the attachment of clearly defined value streams to tokens, an emerging sub-class of utility tokens that provide their holder with service discounts is also alleviating some of the pressure to vet existing (or create new) payment token valuation approaches. S+C’s next publication will walk readers through how to value such an economy using a concrete example.

Smith + Crown provides cryptoeconomic, strategic, and technical advisory services to a wide array of best-in-class crypto projects and traditional enterprise clients.