Tokenomics Reflections from EthCC Paris


Introduction

Recently, I had the privilege of attending EthCC Paris. At the invitation of the Token Engineering Academy team, I presented a workshop at the Token Engineering Barcamp side event and delivered a talk on their token engineering track at EthCC. It was an enriching experience that exposed me to a wealth of insights and innovations in tokenomics and token engineering. Today, I'd like to share some of my key takeaways from the conference.



EthCC Paris: A Brief Overview

EthCC, short for Ethereum Community Conference, has been an annual event since 2018. This year, it hosted over 5,000 attendees and 350 speakers, including yours truly. Billed as "the largest annual European Ethereum event focused on technology and community", the conference primarily caters to developers building on the Ethereum blockchain. The talks usually delve into the technical details of development on Ethereum.

A Shift in Focus

This year, however, there was a noticeable shift in the conference's focus. Many attendees who had been visiting for years observed that not only had the conference grown considerably, but the variety of topics discussed had also expanded. Several talks had a more economic and financial focus than in previous years.

This shift is crucial given the nature of the products being built on blockchains and smart contracts. While the technical aspects of building these products draw heavily on computer science, software engineering, and cryptography, the human aspect of these products cannot be overlooked. These products are used by humans who engage with them through economically motivated behaviours.

To fully understand the consequences of these complex behaviours and how they affect the product's performance, it becomes necessary to incorporate perspectives from conventional finance, economics, and their newly emerging, blockchain-specific subfields: tokenomics, crypto economics, and token engineering.

This shift in focus was corroborated by a review of the conference by Consensys, which stated:

"A shift in focus from infrastructure to application development is overdue. It is crucial for more people to concentrate on building and deploying real dapps with practical use cases for users, rather than creating replicated versions of existing functional infrastructure."



Conference Puts a Spotlight on Risk

The conference was a treasure trove of insightful talks, events, and ideas. The ones that stood out most, however, were those that focused on "risk". Risk is a topic I am currently studying, so it was interesting to see how often it popped up in talk titles throughout the conference.

Risk Management in Blockchain

The development of conventional risk management accelerated after the 2008 Global Financial Crisis. Similarly, the FTX crisis has spurred efforts to develop risk methodologies and practices suitable for blockchain. The discussions at the conference suggested that while these efforts are primarily based on approaches and expertise from traditional finance, there is also an openness to identifying and measuring new risks specific to the blockchain space.

Index Coop's Approach to Risk Management

A talk by David Garfunkel and Andrew Jones from Index Coop, an enterprise that creates tokenised, investible indices, highlighted how they've created risk scores to guide their asset management activities. They were later joined by Paul Lei from Gauntlet, who showcased some of the analytics and risk dashboards Gauntlet builds for major protocols.

Gauntlet's Data Analysis for Reward Programs

Another talk that left a significant impression was by Gabe Pohl-Zaretsky from Gauntlet. He demonstrated how data analysis could be used to evaluate the likely impact of a reward program and estimate how much value it could bring back. This approach aligns with my own view that reward programs need to be carefully costed and evaluated against performance metrics like CAC (Customer Acquisition Cost) and LTV (Lifetime Value) to ensure they pay back.
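
To make that concrete, here is a minimal sketch of the kind of payback arithmetic involved. All numbers are made up for illustration; nothing here is from Gabe's talk.

```python
# Minimal sketch of costing a reward programme against CAC and LTV.
# All figures are illustrative assumptions, not data from the talk.

def reward_programme_payback(budget_usd: float,
                             new_users: int,
                             avg_lifetime_value_usd: float,
                             organic_users_baseline: int = 0) -> dict:
    """Compare the cost of acquiring users via rewards with the value they return."""
    incremental_users = max(new_users - organic_users_baseline, 0)
    cac = budget_usd / incremental_users if incremental_users else float("inf")
    ltv_to_cac = avg_lifetime_value_usd / cac
    return {
        "CAC": cac,
        "LTV/CAC": ltv_to_cac,
        "pays_back": ltv_to_cac > 1.0,  # rule of thumb: aim well above 1
    }

# Example: a $500k incentive budget attracting 10,000 users, of whom
# 4,000 would have arrived anyway, each worth ~$80 over their lifetime.
print(reward_programme_payback(500_000, 10_000, 80, organic_users_baseline=4_000))
# CAC ≈ $83.3, LTV/CAC ≈ 0.96 -> this programme does not pay back.
```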

The Relevance of Higher-Level Statistical/ML Approaches

During his talk, Gabe made a crucial point about the choice of approach needed to check if a prospective reward programme would successfully pay off. While Gauntlet had visibility into wallet-level behaviours, these behaviours were so changeable and exotic that it wasn't possible to get any traction on the problem through low-level agent-based simulation. However, they discovered that macro-level statistical patterns offer signal and an opportunity for insight, despite the variety and mutability of micro-level behaviours. This insight validated my sceptical intuitions about the limits of agent-based modelling, and highlighted how higher-level statistical/ML approaches could contribute to design and optimisation challenges in the field.
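
As a toy illustration of this point (synthetic data, not Gauntlet's methodology): even when individual wallets behave erratically, the aggregate response to an incentive variable can be stable enough to regress on.

```python
# Sketch: erratic wallet-level behaviour can still yield stable macro-level signal.
# Purely synthetic data; this illustrates the idea, not any real analysis.
import numpy as np

rng = np.random.default_rng(0)
n_wallets, n_days = 5_000, 90
incentive = np.linspace(0, 1, n_days)  # daily reward intensity, ramping 0 -> 1

# Each wallet responds to incentives with its own noisy sensitivity.
sensitivity = rng.exponential(1.0, size=(n_wallets, 1))
noise = rng.normal(0, 5.0, size=(n_wallets, n_days))
wallet_activity = np.maximum(sensitivity * incentive + noise, 0)

# Individually the series are dominated by noise, but the aggregate is smooth.
daily_total = wallet_activity.sum(axis=0)
slope, intercept = np.polyfit(incentive, daily_total, 1)
print(f"Macro response per unit of incentive: ~{slope:,.0f}")
```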

S&P's Approach to Credit Rating for Protocols and Blockchain Enterprises

Charles Jansen from Standard & Poor's (S&P), one of the major credit rating agencies in traditional finance, shared how there is growing acceptance at S&P that the future of credit is on-chain. The increasing institutional flows into blockchain activity and DeFi necessitate the development of credit rating approaches for protocols and blockchain enterprises. S&P has already issued a rating for Compound Prime, the enterprise arm of Compound.

Block Analytica's Use of Risk Data Science in Optimising DeFi Protocols

Jan Osolnik from Block Analytica and MakerDAO shared how he is using risk data science to optimise DeFi protocols. He listed several data science and machine learning techniques, including EDA (Exploratory Data Analysis), regression, and clustering analysis, as relevant parts of the toolkit. Interestingly, he also mentioned that causal inference modelling, which focuses on substantiating the claim that "A is the cause of B", could have special relevance in the space. This is because blockchain projects are seldom able, for ethical reasons, to run the randomized controlled experiments needed for A/B testing. In such situations, the causal inference methods taught in statistics and econometrics could help projects estimate the likely impact of a future system change or intervention.
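
As a sketch of what this could look like in practice, here is a difference-in-differences estimate, one classic causal-inference technique, applied to synthetic data for a hypothetical protocol change. The variable names and effect sizes are invented for illustration.

```python
# Sketch: difference-in-differences on synthetic data for a hypothetical
# protocol change. Illustrative only; not Block Analytica's workflow.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 2_000
df = pd.DataFrame({
    "treated": rng.integers(0, 2, n),  # e.g. pools affected by a fee change
    "post": rng.integers(0, 2, n),     # observed after the change?
})
true_effect = 3.0
df["volume"] = (10 + 2 * df["treated"] + 1.5 * df["post"]
                + true_effect * df["treated"] * df["post"]
                + rng.normal(0, 2, n))

# The coefficient on the interaction term estimates the causal effect.
model = smf.ols("volume ~ treated * post", data=df).fit()
print(model.params["treated:post"])  # should come out close to 3.0
```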



The Evolution of Data Tools in the Blockchain Space

The evolution of tools in the blockchain space is a fascinating aspect of the field. The development of these tools is driven not only by increased computational power but also by the changing nature of the data we have to handle. Given the unique data characteristics of blockchain, there is a legitimate need for new tools that are specialised and optimised for these problem tasks.

Low-Level Modelling Tools

TokenSPICE: Optimising Smart Contract Designs

Trent McConaghy of Ocean Protocol discussed how he adapted methods used to optimise analogue circuits to the optimisation of smart contract designs, using a package called TokenSPICE. He shared a case study in which the simulation detected an unexpected exploit in their ecosystem and successfully delivered a solution to remove it.
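
I don't want to misrepresent TokenSPICE's actual API, but the general pattern behind this kind of simulation-based verification is easy to sketch: drive a simplified model of the economy with random agent actions and check a design invariant after every step, so that exploit-like states surface as violations.

```python
# Generic pattern behind simulation-based design verification (NOT
# TokenSPICE's actual API): random agent actions against a toy model,
# with an invariant asserted after every step.
import random

random.seed(42)

class ToyPool:
    """A deliberately simplified staking pool with a conservation invariant."""
    def __init__(self):
        self.balances = {}
        self.total = 0.0

    def deposit(self, agent, amount):
        self.balances[agent] = self.balances.get(agent, 0.0) + amount
        self.total += amount

    def withdraw(self, agent, amount):
        held = self.balances.get(agent, 0.0)
        amount = min(amount, held)          # cannot withdraw more than held
        self.balances[agent] = held - amount
        self.total -= amount

pool = ToyPool()
for step in range(10_000):
    agent = random.randrange(20)
    if random.random() < 0.5:
        pool.deposit(agent, random.uniform(0, 100))
    else:
        pool.withdraw(agent, random.uniform(0, 100))
    # Invariant: the ledger must always sum to the recorded total.
    assert abs(sum(pool.balances.values()) - pool.total) < 1e-6, f"exploit-like state at step {step}"
print("invariant held across all simulated steps")
```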

20 Squares: Game Theoretic Modelling Tool

Fabrizio Genovese and Daniele Palombi from 20squares demonstrated the potential of their game theoretic modelling tool. Their tool translates game theoretic descriptions into a modular framework, allowing for flexible composition of complex processes and relationships. It also enables the use of programming language-style expressions, making game theory more familiar to coders.

Their tool could eventually enable some powerful testing capabilities. If a circuit contains an auction process, for example, the tool could systematically test various auction designs drawn from an internal library in order to find the best one. It also allows users to solve for more generous notions of system equilibria that are more realistic and appropriate to the complexity of these systems than the Nash-style equilibria taught in undergraduate courses.
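
To give a flavour of what "systematically testing auction designs" can mean, here is a minimal Monte-Carlo revenue comparison of two sealed-bid formats. This is purely illustrative and far simpler than 20squares' compositional game-theoretic approach.

```python
# Monte-Carlo revenue comparison of first- vs second-price sealed-bid
# auctions with uniform private values. Illustrative only.
import numpy as np

rng = np.random.default_rng(7)
n_auctions, n_bidders = 100_000, 5
values = rng.uniform(0, 1, size=(n_auctions, n_bidders))  # private values

# Second-price: truthful bidding is dominant; revenue = 2nd-highest value.
second_price_revenue = np.sort(values, axis=1)[:, -2].mean()

# First-price: equilibrium bid shades value by (n-1)/n for uniform values.
first_price_revenue = (values.max(axis=1) * (n_bidders - 1) / n_bidders).mean()

print(f"second-price: {second_price_revenue:.4f}, first-price: {first_price_revenue:.4f}")
# Revenue equivalence: both land near E[2nd highest of 5 U(0,1)] = 4/6 ≈ 0.667.
```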

Data Extraction and Transformation Tools

Several tools are being developed to facilitate the extraction of data and its transformation into meaningful analytical measures and concepts.

Messari: Building Data Feeds Using Graph Protocol

Mihail Grigore from Messari showcased how he is building data feeds using The Graph protocol's indexing technology.
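
For readers unfamiliar with subgraphs, consuming such a data feed typically boils down to a GraphQL query against an indexed endpoint. The endpoint placeholder and field names below are illustrative assumptions, not a confirmed Messari schema; substitute a real subgraph URL before running.

```python
# Sketch of consuming an indexed data feed via a GraphQL query.
# Endpoint and entity names are hypothetical illustrations.
import requests

SUBGRAPH_URL = "https://api.thegraph.com/subgraphs/name/<org>/<subgraph>"  # hypothetical
query = """
{
  financialsDailySnapshots(first: 7, orderBy: timestamp, orderDirection: desc) {
    timestamp
    totalValueLockedUSD
  }
}
"""
resp = requests.post(SUBGRAPH_URL, json={"query": query}, timeout=30)
for snap in resp.json()["data"]["financialsDailySnapshots"]:
    print(snap["timestamp"], snap["totalValueLockedUSD"])
```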

Blockscout: Analytics Capabilities Within Block Explorers

Ulyana Skladchikova from Blockscout made the case for having more analytics capabilities natively available within block explorers, as Blockscout now offers.



Token Engineering Track Day: A Deep Dive into Tokenomics and Token Engineering

The Token Engineering Track Day was the event that brought me to EthCC in the first place. We had an impressive lineup of speakers, and the talks were well-attended throughout the day. Here are some quick highlights from the day.

Trent McConaghy: Building and Adapting Economies on the Go

Trent McConaghy of Ocean Protocol gave two talks, one of which highlighted the importance of being able to build and adapt economies on the go. His experiences with Ocean Protocol showed that no amount of modelling at the drawing board will ever capture the complexities and eventualities of real life.

Achim Struve: The Right Modelling Choice

Achim Struve from Outlier Ventures emphasised that the right modelling choice depends as much upon the client's situation as it does on the problem being modelled. He demonstrated a simpler, spreadsheet-based tool that could be more suitable for clients in an exploratory phase or with a limited budget.

Rohan Mehta: Natural Language Agents for Better Tokenomic Decisions

Rohan Mehta demonstrated a proof of concept of how natural language agents could aid in making better tokenomic decisions. He used my own fundraising model as the template to create the backend for this demo.

Dr. Mark Richardson: DEX Design Innovation

Dr. Mark Richardson, who works with Carbon and Bancor, gave a stunning talk on DEX design innovation. His talk demonstrated how conceptually obvious and intuitive design improvements could still be sitting just under our noses. Carbon’s innovative new trading system asks:

“Why do buyers and sellers have to trade at the same price?” (As they do in Uniswap v2 and v3.)

“Why not just give buying and selling separate bonding curves, and allow those to recreate the limit and stop-loss orders that most traders are familiar with from conventional and centralized exchanges?”
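
To see why this is so intuitive, here is a toy sketch of the idea (my own illustration, not Carbon's actual mathematics): a single position quotes one price region for buying and a different one for selling, so it behaves like a pair of resting limit orders rather than one shared curve.

```python
# Toy illustration of asymmetric two-sided liquidity (not Carbon's maths):
# one position with separate buy and sell price regions.
from dataclasses import dataclass

@dataclass
class TwoSidedPosition:
    buy_below: float    # willing to buy the asset at or below this price
    sell_above: float   # willing to sell the asset at or above this price

    def quote(self, market_price: float) -> str:
        if market_price <= self.buy_below:
            return f"BUY at {market_price} (limit-buy region <= {self.buy_below})"
        if market_price >= self.sell_above:
            return f"SELL at {market_price} (limit-sell region >= {self.sell_above})"
        return "no trade (spread between the two curves)"

# "Buy the dip at 90, take profit at 110" expressed as one on-chain position.
pos = TwoSidedPosition(buy_below=90.0, sell_above=110.0)
for p in (85.0, 100.0, 115.0):
    print(p, "->", pos.quote(p))
```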

I also recommend watching this video just to see a superb example of data storytelling in action. Mark's talk demonstrates how to go beyond using data and charts for internal analysis and turn them into effective storytelling tools for external and non-expert audiences.

Lukasz Szymanski and Tom Mellan: The Importance of Diversity of Expertise

Lukasz Szymanski from Tokenomia Pro and Tom Mellan of CryptoeconLab highlighted the importance of diversity of expertise and talent in larger teams. This suggests that one doesn't just have to be a data scientist or control engineer to do this work, broadening the opportunities for people to get involved as this field grows.

Lukasz also made a point that's often made about data science and analytics roles: being able to communicate with stakeholders and decision-makers is a vitally important complement to the technical skills these roles demand. Unfortunately, this mix proves just as hard to come by in web3 as it does in web2.

Dr. Kris Paruch: Breaking into the Emerging Field

Dr. Kris Paruch, one of the founding members of the Token Engineering Academy (TEA) and the CTO of Adim, shared various ways to break into this emerging field.


Token Engineering Barcamp: A Side Event Worth Attending

The Token Engineering Barcamp was another side event that I attended. Unfortunately, the sessions were not recorded, so attendees had to be there live. The event was well-attended, and it was great to be in a space with so many talented people working on problems in this field. Three speakers gave presentations about conceptualising, identifying, and modelling the fundamental economic processes at work in blockchain systems, each offering a view of the economic process that stands behind a token's utility.

Nate from Token Dynamics

Nate, the author of Eat Sleep Crypto and founder of Token Dynamics consulting, was one of the presenters. His exploration of the economic processes behind token utility was insightful and provided a fresh perspective on demand-side tokenomics.

Lisa Tan from Economics Design

Lisa Tan of Economics Design presented her views on how macroeconomic models of GDP could be mapped to analogues in blockchain economies. Her approach to tokenomics is always thought-provoking, and her presentation at the Barcamp was no exception.

Dr. Vasily Sumanov

Dr. Vasily Sumanov's presentation was particularly interesting. He proposed a modular framework for describing token utility that starts by identifying distinct "Origins of Value", then pairs them with token-use logic that successfully meshes with each process, thereby establishing a value relationship between the token and the underlying utility. My intuition is that this approach ultimately allows for a more granular, flexible, and economy-agnostic analysis of the economic processes that underlie token utility, and is likely to be more useful for it. I look forward to seeing more of his research on this topic!


Conclusion: A Rewarding Experience

In conclusion, attending EthCC Paris was a rewarding, intellectually stimulating experience. I am grateful to the Token Engineering Academy for inviting me to be part of it. The conference provided a wealth of insights into tokenomics and token engineering and showcased the exciting work being done in this field.

Next year, EthCC will take place in Belgium, due to the high prices in Paris caused by the Olympics. If it's within your budget, I would DeFinitely recommend attending. Even though all the material is available on YouTube, being at the event itself and making in-person connections is an experience worth having.

As the field of tokenomics and token engineering continues to grow and evolve, I look forward to participating in more opportunities like this in the future.
