
The Behavioral Side of Total Supply and Price per Token

This article is the second part of a series on Tokenomics from author & contributor Max K, CEO of Coinstruct.tech. Read the first part here: Supply-Side Tokenomics: Supply, Distribution, Optimization for Price Performance & Cost-Efficiency

Defining Total Supply

Once you’ve opted for a capped maximum supply, the next step is to define the exact number of tokens. Should your total supply be 10M, 100M, or even 100B tokens?

The answer, which may not align with expectations, is that the choice holds relatively little significance: aside from leveraging psychological biases, the selected number is essentially arbitrary.

There is no specific reason why Bitcoin had to be designated a 21 million maximum supply. It could just as viably have been 2.1 million, 210 million, or any other figure. However, Satoshi Nakamoto is known to have targeted a future Bitcoin price of similar magnitude to many popular fiat currencies, so that transactions would be more comfortable to execute. The primary impact of the maximum supply is actually observed in its influence on the price per token within a given market capitalization (MC).
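To see how mechanical this relationship is, here is a minimal sketch (the $50M market cap below is a purely illustrative assumption): the same valuation simply spreads across whatever supply you pick.

```python
def price_per_token(market_cap_usd: float, total_supply: float) -> float:
    """Implied price per token for a given market capitalization."""
    return market_cap_usd / total_supply

market_cap = 50_000_000  # hypothetical $50M market capitalization
for supply in (10_000_000, 100_000_000, 100_000_000_000):
    print(f"supply {supply:>15,} -> price ${price_per_token(market_cap, supply):,.6f}")
# supply      10,000,000 -> price $5.000000
# supply     100,000,000 -> price $0.500000
# supply 100,000,000,000 -> price $0.000500
```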

Defining Price

The price per token is another psychological factor that subtly shapes the perception of many retail investors and sometimes even influences price performance.

Some interesting insights can be drawn from TradFi. The price per share, akin to the price per token, holds no intrinsic value and provides little insight to a rational investor regarding the company’s relative worth, recent performance, or anticipated future performance. Still, there is a tendency for stocks to experience significant growth after stock splits. Stock splits reduce the price per share while increasing the number of shares, keeping market capitalization and dilution constant. For instance, in a 1:10 split, if an investor owns 1 share priced at $100/share before the split, they would own 10 shares priced at $10/share afterward, yet the $100 investment still represents the same proportion of ownership in the company in both scenarios. This post-split outperformance may be attributed to investors’ irrational preference for lower-priced shares.
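A few lines of Python make the invariance concrete (the 1,000,000 shares outstanding is a hypothetical figure added for the ownership check):

```python
def apply_split(shares: float, price: float, ratio: int):
    """Return (shares, price) after a 1:ratio stock split."""
    return shares * ratio, price / ratio

total_shares = 1_000_000                  # hypothetical shares outstanding
shares, price = 1, 100.0                  # the 1 share at $100 from the example

new_shares, new_price = apply_split(shares, price, 10)
print(new_shares * new_price)             # 100.0 -- position value unchanged
print(shares / total_shares == new_shares / (total_shares * 10))  # True -- same ownership
```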

The same dynamic seems to be at play with tokens. Although these perception biases typically imply that a lower token price is more favorable, it stands to reason that there is likely a limit to this advantage.

For instance, if consumers attach greater significance to the leftmost digit, there is probably a substantial psychological gap between a token priced at $0.01 and one at $1.00, compared to the difference between $0.000001 and $0.0001, despite both representing a 100x increase. When individuals find themselves needing to pause and carefully count leading zeros, it suggests that the price may have surpassed the optimal point to benefit from any bias towards lower prices.

At Coinstruct, after analyzing the top 1,000 projects by market cap, we found that the data roughly rejects the “lower price is better” hypothesis. The <$0.01 price-per-token range had statistically significantly worse returns than the ≥$10 to <$100 price range.

Tokens priced below $0.01 also exhibit statistically significantly higher volatility in relative monthly returns compared to tokens priced from $1.00 to under $10.00, from $10.00 to under $100.00, and above $100.00. In other words, sub-cent tokens show a consistently wider range of performance relative to the market than tokens in those higher price buckets.

For builders, this suggests avoiding a maximum supply that would result in an expected token price below $0.01. Below this threshold, the data indicates that tokens tend to underperform significantly and may experience heightened volatility. This could be attributed to psychological perceptions of sub-cent tokens as high-risk, or to the presence of a large number of low-quality coins attempting to leverage a low token price as a marketing gimmick. A quick way to sanity-check a candidate supply against this floor is sketched below.
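As a rough sketch (the $20M expected fully diluted valuation is a hypothetical input; $0.01 is simply the floor suggested by the data above), the inverse calculation bounds the supply:

```python
PRICE_FLOOR = 0.01  # the sub-cent threshold suggested by the data above

def max_supply_for_floor(expected_fdv_usd: float, floor: float = PRICE_FLOOR) -> float:
    """Largest maximum supply that keeps the implied price at or above the floor."""
    return expected_fdv_usd / floor

# Hypothetical project expecting a $20M fully diluted valuation at launch.
print(f"{max_supply_for_floor(20_000_000):,.0f} tokens")  # 2,000,000,000 tokens
```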

Optimizing Emissions

What should developers consider when determining their emission curve?

The data indicates that optimizing emissions doesn’t solely revolve around minimizing inflation at any cost, as many builders presume. Instead, it’s about striking a delicate equilibrium between supply and demand, establishing and sustaining the right incentives, defining valuable user actions, ensuring a well-distributed holder base, and fostering protocol stability, all contingent on the unique requirements of your project. There isn’t a universal “optimal emissions curve,” but rather a set of best practices and improvements over linear time-based models that warrant consideration.

Usually, vesting in crypto is time-based, meaning that tokens are distributed after a set time period passes. However, vesting can also follow the project’s progress, as with milestone vesting. In particular, utilizing “S”-shaped curves, such as the sigmoid curve depicted in the image below, may offer advantages over linear schedules.

Source: Tokenomics Design Canvas
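To make the comparison concrete, here is a minimal sketch of a cumulative sigmoid (logistic) emission schedule against a linear one over a five-year horizon; the total supply, midpoint, and steepness values are illustrative assumptions, not recommendations.

```python
import math

TOTAL_SUPPLY = 1_000_000_000   # hypothetical capped maximum supply
MONTHS = 60                    # 5-year emissions schedule

def linear_emitted(month: float) -> float:
    """Cumulative tokens emitted by a linear schedule."""
    return TOTAL_SUPPLY * month / MONTHS

def sigmoid_emitted(month: float, midpoint: float = 30.0, steepness: float = 0.2) -> float:
    """Cumulative tokens emitted by a logistic (S-shaped) schedule,
    rescaled to start at 0 and end at the full supply."""
    logistic = lambda m: 1.0 / (1.0 + math.exp(-steepness * (m - midpoint)))
    lo, hi = logistic(0), logistic(MONTHS)
    return TOTAL_SUPPLY * (logistic(month) - lo) / (hi - lo)

for month in (6, 12, 36, 60):
    print(f"month {month:>2}: linear {linear_emitted(month):>13,.0f} | "
          f"sigmoid {sigmoid_emitted(month):>13,.0f}")
```

Early on, the sigmoid schedule has emitted only a small fraction of what the linear one has, which is exactly where its value lies.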

As you can see, the rate of inflation, or dilution, tends to be highest during a token’s initial launch, because the existing supply is at its lowest-ever level. This holds true for most projects, even those employing linear emission schedules. As a result, many projects experience their peak inflation rates at the start, setting the stage for optimal conditions for a price decline. This can lead to the project being perceived as a pump-and-dump scheme, which is why it is important to consider supply curves other than linear.

A finely tuned sigmoid curve diminishes inflation during the periods when the project is most vulnerable to pump-and-dump dynamics, effectively smoothing the inflation rate across the entire schedule. However, it’s important to note that if a curve emits the same total as an equivalently long linear curve, it cannot emit fewer tokens early on without eventually emitting more tokens than the linear curve at some later point.

This phenomenon is evident in the curve depicted above, where the month-to-month token emissions of the sigmoid curve surpass those of the linear curve from approximately Year 1 to Year 3 within the 5-year emissions schedule. Yet, this deviation from linearity may be deemed an acceptable trade-off for achieving lower inflation rates during the project’s initial launch phase, and it might even represent an inherent strength of sigmoid curve emissions.
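Extending the sketch above (same illustrative parameters), a month-over-month comparison locates that crossover window; note that the exact span depends entirely on the steepness and midpoint chosen.

```python
import math

TOTAL_SUPPLY, MONTHS = 1_000_000_000, 60

def linear_emitted(m):
    return TOTAL_SUPPLY * m / MONTHS

def sigmoid_emitted(m, midpoint=30.0, steepness=0.2):
    logistic = lambda x: 1.0 / (1.0 + math.exp(-steepness * (x - midpoint)))
    lo, hi = logistic(0), logistic(MONTHS)
    return TOTAL_SUPPLY * (logistic(m) - lo) / (hi - lo)

def monthly(curve, m):
    """Tokens emitted during month m alone (difference of cumulative values)."""
    return curve(m) - curve(m - 1)

crossover = [m for m in range(1, MONTHS + 1)
             if monthly(sigmoid_emitted, m) > monthly(linear_emitted, m)]
print(f"sigmoid out-emits linear from month {crossover[0]} to {crossover[-1]}")
# with these parameters: roughly months 20 through 41 of the 60-month schedule
```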

It is also worth looking at more advanced vesting dynamics like AVV (Adoption Adjusted Vesting), proposed by Achim Struve. The main idea of this approach is to move away from traditional time-based vesting schedules, where the majority of the supply gets emitted into the ecosystem during the early building phase of the product suite. In other words, a lot of supply hits low market demand.

AVV takes a flexible and smooth approach, vesting tokens algorithmically according to certain KPIs. AVV is determined by controllers that increase emissions in lower-adoption phases and decrease them in higher-adoption phases. Vesting tokens could thus be issued into the ecosystem depending on TVL, user acquisition / token holder count, revenue thresholds, locking allocations, or the token price itself. Taking TVL as the example, the token vesting rate could be increased in phases of low protocol TVL to provide more tokens to incentivize deposits through higher rewards.
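As a loose illustration of that TVL example (the controller form, baseline emission, target TVL, and clamping bounds below are our own simplifying assumptions, not Struve’s specification):

```python
BASE_MONTHLY_EMISSION = 5_000_000   # hypothetical baseline tokens per month
TARGET_TVL = 50_000_000             # hypothetical adoption target, in USD
MIN_MULT, MAX_MULT = 0.5, 2.0       # clamp so emissions stay bounded

def adjusted_emission(current_tvl: float) -> float:
    """Scale the baseline emission inversely with adoption (TVL):
    more rewards when TVL is below target, fewer when above."""
    multiplier = TARGET_TVL / max(current_tvl, 1.0)
    multiplier = max(MIN_MULT, min(MAX_MULT, multiplier))
    return BASE_MONTHLY_EMISSION * multiplier

for tvl in (10_000_000, 50_000_000, 200_000_000):
    print(f"TVL ${tvl:>11,} -> emit {adjusted_emission(tvl):,.0f} tokens this month")
# low TVL emits 2x the baseline, on-target emits 1x, high TVL emits 0.5x
```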

Allocations

When launching a token, deciding on the recipients and distribution methods is among the pivotal tokenomics design choices. These crucial decisions hold significant sway over how your token is perceived and performs, serving as key indicators of whether the team and initial supporters have aligned incentives.

Source: Token Vesting and Allocation Benchmarks

For a data-driven approach to allocation, we should look at what is typical for projects and generally best practice; a quick sanity check of these benchmark figures follows the list below.

  1. For the Core Team, approximately 18.8% of tokens are set aside, encompassing allocations for founders, employees, and other contributors.
  2. The investor allocation stands at about 13%, though this average incorporates projects without investor presence, thereby reducing the overall category allocation. However, when exclusively considering projects with investor involvement, the token allocation percentage rises to 17%.
  3. Reserved for future product development and operational expenses, Company Reserves or Treasury funds hold approximately 22% of the token allocation.
  4. Community Incentives or Distributions hold the largest allocation at 40.5%. This elevated allocation is logical as it aids in achieving adequate decentralization and widespread network ownership. Additionally, it serves as the primary method to incentivize product usage and early engagement.
  5. A fraction of tokens, namely 4.2%, is designated for Public Sales, including public exchange listings and token sales. This figure has decreased from 55% in 2017, likely due to regulatory constraints, leading to a corresponding uptick in allocations to investors and community incentives.
  6. Lastly, partners and advisors typically receive an allocation of 1.5% of tokens.
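As promised, the benchmark categories above should account for the entire supply, and a few lines confirm that they do:

```python
# Benchmark allocation percentages quoted above, by category.
allocations = {
    "Core Team": 18.8,
    "Investors": 13.0,
    "Company Reserves / Treasury": 22.0,
    "Community Incentives": 40.5,
    "Public Sales": 4.2,
    "Partners & Advisors": 1.5,
}

print(f"total allocated: {sum(allocations.values()):.1f}%")  # total allocated: 100.0%
```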

With that being said, no “one-size-fits-all” token allocation exists for all crypto projects. Every project should tailor its token allocation plan to match its unique internal economic system.

Coinstruct is an official tokenomics partner of InnMind. It has developed 25+ tokenomics models, including such projects as Nomis Protocol (LayerZero, zkSync, Galaxe partner), Otcmarsbase.io (+$450M in published deals), AANN.ai (+$1M raised), and Dexodus.xyz, and has consulted for projects such as InnMind.com ($60M raised by startups), Claimr, CryptoDo, Піктограми (Linea partner), Qoomon Quests (team from Goldman Sachs and Tencent), and 20+ more across various blockchain ecosystems. The Coinstruct team knows how to approach tokenomics challenges with all the supply & demand factors in mind.

Coinstruct is a specialized agency for full-cycle Web3 tokenomics development, with a team of diverse specialists, from mathematicians and product managers to degens from leading DAOs, working together to create advanced token models, reward systems, economy audits, and scoring. Our focus is to create Web3-native, sustainable economic solutions that help projects achieve their goals through tokens: from efficient fundraising to growing their user base, retention, and loyalty.

If you have a web3 project that plans to launch a token and is struggling to design sustainable tokenomics, feel free to fill out the form and arrange a free strategic call with the Coinstruct team. Or you can join our new Telegram channel where we share tokenomics insights and experience.

