# GIP-0026: Decaying Curation Tax

This GIP is based on ideas initially proposed by @juanmardefago and can be found on Radicle here. For convenience, the full text of the GIP is pasted below:

# Abstract

This proposal modifies the existing curation tax on subgraph deployment bonding curves to decay linearly over time.

# Background

The Graph currently has a curation tax aimed at discouraging too frequent upgrades of subgraphs. Indexing a subgraph from genesis implies a large upfront cost to Indexers before they have the opportunity to collect indexing rewards or serve queries. If subgraphs upgraded too frequently or unpredictably, then one might expect Indexer profitability, and by extension participation, to decline.

# Motivation

The curation tax as it exists today is a blunt instrument. It is levied irrespective of how long a Curator is signaled on the subgraph deployment bonding curve. This is in part due to the way curation royalties are distributed: into the reserves of the bonding curve. The protocol currently levies the tax on deposit (signal) rather than withdraw (unsignal) because the latter would also imply taxing a portion of the earned royalties.

With the new principal-protected bonding curve design introduced in GIP-0025, curation royalty distribution is no longer carried out via the reserves of the bonding curve, but rather through separate payouts.

This presents the opportunity to revisit the decision to levy the curation tax on deposit. By taxing on withdrawal we may incorporate the amount of time the Curator was signaled into the size of the tax that is levied.

Reducing the curation tax makes the curation market more economically efficient and reduces the costs of using the network to subgraph developers.

# High Level Description

This proposal requires the following additional logic:

1. Tracking the cost-weighted average time a Curator is signaled on a subgraph.
2. Modifying the curation tax to be levied on withdraw and decay linearly to zero as a function of cost-weighted average time signaled.

## Tracking Cost-Weighted Average Time Signaled

When a Curator signals on a subgraph deployment bonding curve, they deposit GRT and receive newly minted curation shares in return. These shares are ERC-20 compatible[1] and may be transferred to other Ethereum accounts. The ERC-20 standard leverages an account model, rather than a UTXO model[2], meaning that when the same account receives curation shares multiple times, these shares are combined into a single balance rather than tracked separately.

This raises the question of how to track the amount of time signaled for such a balance that has been added to multiple times. The approach proposed here, and the one that requires the least additional bookkeeping, is to track the cost-weighted average time basis of a balance of curation shares.

This is simply a weighted arithmetic mean of the times at which shares were minted, measured in blocks, across all the balances of shares that were combined into a single balance, whether through repeated signaling or transferring. The average is weighted by the amount that was deposited for each respective balance of shares, also known as the cost basis.

For reasons similar to those described above, when the same account receives shares in multiple distinct transfers, the respective cost bases are combined into a single share-weighted average cost basis. This bookkeeping was introduced in GIP-0025 to support principal-protected bonding curves.

In order to compute the cost-weighted average time signaled for a given amount of shares being burned, we simply subtract the cost-weighted average time basis from the current time.
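The update rule above can be sketched in a few lines of Python. This is an illustrative helper, not the protocol's Solidity implementation; the function name and use of floating point are my own (the contract would use fixed-point integer math):

```python
def combine_time_basis(x: float, a: float, y: float, b: float) -> float:
    """Cost-weighted average time basis when two share balances are combined.

    x, y -- cost bases (GRT deposited) of the existing and incoming shares
    a, b -- time bases (block numbers) of the existing and incoming shares
    A previously empty balance is represented by x == 0, so the result is b.
    """
    return (x * a + y * b) / (x + y)

# Alice holds shares that cost 100 GRT, with a time basis of block 1_000.
# She deposits another 300 GRT at block 5_000; her new time basis is pulled
# three quarters of the way toward the new deposit, to block 4_000.
new_time_basis = combine_time_basis(100, 1_000, 300, 5_000)
```

The time signaled for a later burn is then simply the current block number minus this stored basis.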

## Linearly Decaying Curation Tax

The curation tax, currently levied on deposit, must be replaced by a curation tax levied on withdraw that decays linearly to zero as a function of the cost-weighted average time signaled.

A new governance parameter must be introduced to the protocol to modify the time it takes for the curation tax to decay to zero. We will call this the curation tax decay time.
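The linear decay can be sketched as follows. The function name and the parameter values in the example are illustrative only; the initial tax rate and decay time are governance parameters, not values proposed by this GIP:

```python
def curation_tax(t: int, initial_tax: float, decay_time: int) -> float:
    """Curation tax rate after t blocks of cost-weighted average time
    signaled, decaying linearly from initial_tax to zero over decay_time."""
    return max(0.0, initial_tax - initial_tax * t / decay_time)

# With a hypothetical 2.5% initial tax and a 100_000-block decay time,
# the tax is half the initial rate halfway through the decay window
# and zero once the window has fully elapsed.
halfway = curation_tax(50_000, 0.025, 100_000)
expired = curation_tax(200_000, 0.025, 100_000)
```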

# Detailed Specification

## Pseudocode

#### Signaling. Alice deposits tokens to mint curation shares.

1. Alice has M curation shares with an average cost basis of x and an average time basis of a
2. Alice deposits y tokens to mint N shares at time b
3. Alice’s new average time basis is (x \cdot a + y \cdot b)/(x+y)
• If Alice’s previous balance was zero then we assign a previous average cost basis x=0.
4. Alice’s new average cost basis is (M \cdot x + N \cdot y)/(M+N).

#### Transfer. Alice has previously minted shares transferred to her.

1. Alice has M curation shares with an average cost basis of x and an average time basis of a
2. Alice receives N shares having an average cost basis of y and an average time basis of b.
3. Alice’s new average time basis is (x \cdot a + y \cdot b)/(x+y)
• If Alice’s previous balance was zero then we assign a previous average cost basis x=0.
4. Alice’s new average cost basis is (M \cdot x + N \cdot y)/(M+N).

#### Unsignaling. Alice burns previously minted shares to withdraw reserve tokens.

1. Alice has M curation shares with an average cost basis of x and an average time basis of a. Curation tax decay time is represented by \Delta and the initial curation tax before any decay is represented by \tau_{c,t=0}.
2. Alice unsignals N, where N \le M curation shares at time b.
3. The Cost-Weighted Average Time Signaled is simply b-a.
4. If N=M
• Alice’s new average cost basis is reset to 0.
• Alice’s new average time basis is undefined.
5. If N<M
1. Alice’s average cost basis remains x.
2. Alice’s average time basis remains a.
6. The curation tax percentage \tau_c is \max(0, \tau_{c,t=0} - \frac{\tau_{c,t=0}}{\Delta} \cdot t) where t is the cost-weighted average time signaled (b-a), measured in blocks.
7. The reserves to withdraw and the curation royalties to withdraw are computed separately, according to the logic in GIP-0025.
8. The curation tax is levied only on the reserves to withdraw and not on any curation royalties being withdrawn.
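The unsignaling steps above can be sketched end to end. This is a simplified model for exposition, assuming hypothetical names (`CurationPosition`, `unsignal`) and floating-point math; it returns only the tax rate and leaves the GIP-0025 reserve/royalty split out of scope:

```python
from dataclasses import dataclass

@dataclass
class CurationPosition:
    shares: float      # M
    cost_basis: float  # x, average cost basis
    time_basis: float  # a, cost-weighted average time basis (block number)

def unsignal(pos: CurationPosition, n: float, now: int,
             initial_tax: float, decay_time: int) -> float:
    """Burn n <= pos.shares at block `now`; return the tax rate applied
    to the withdrawn reserves (royalties are paid out untaxed)."""
    assert n <= pos.shares
    t = now - pos.time_basis  # cost-weighted average time signaled (b - a)
    tax = max(0.0, initial_tax - initial_tax * t / decay_time)
    if n == pos.shares:
        # Full exit: cost basis resets to 0; the time basis is undefined
        # per the spec, represented here as 0.0 for simplicity.
        pos.shares, pos.cost_basis, pos.time_basis = 0.0, 0.0, 0.0
    else:
        # Partial exit: both bases remain unchanged.
        pos.shares -= n
    return tax
```

Note that a partial burn leaves the average bases untouched, so a Curator cannot lower the tax on their remaining shares by unsignaling in pieces.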

# Prototype

The withDecayingCapitalGains mixin in this Observable HQ notebook of bonding curve prototypes has a calculation similar to the average time basis calculation described here, except that it uses a share-weighted average time basis rather than a cost-weighted average time basis.

This change should be implemented for newly created bonding curves but cannot be implemented retroactively for existing subgraph deployment bonding curves.

# Backwards Compatibility

This proposal doesn’t introduce any breaking changes in the bonding curve interfaces, though some UIs may need to be updated to display a different curation tax to users.

# Dependencies

This proposal relies on modifications to the bonding curve and curation royalties calculations as outlined as a part of the principal-protected bonding curves work in GIP-0025.

# Risks and Security Considerations

A possible risk is that we do not set the curation tax decay time long enough, or similarly that we set the initial curation tax before decay too low.

This could result in upgrades that are too frequent for Indexers to effectively manage their allocations and indexing activity in a way that is profitable and that doesn’t introduce a lot of wasted time and effort.

# Validation

The changes introduced here should go through a smart contract audit and receive community feedback, with special focus on Curators, Subgraph Developers and Indexers.

# Rationale and Alternatives

## Decay Function

We could choose different functional forms for the decay function, such as exponential, but using a linear decay function keeps the implementation simple to reason about and implement.

Copyright and related rights waived via CC0.

# Notes


The high-level approach presented appears well-chosen for increasing efficiency while still disincentivizing too frequent upgrades of subgraphs. Can you share any intuition for how to choose the curation tax decay time? At first read, it is unclear to me how to fix this a priori (unless it is pegged to another time window of interest).

As an alternative, exponential decay is mentioned, but this is said to be less favorable compared to linear decay (for simplicity of implementation and reasoning). Regarding implementation, it seems a plausible way to do it is to replace the choice of curation tax decay time with a decay parameter \alpha > 0 and use

\tau = \dfrac{\tau_{c,t=0}}{2^{\lfloor \alpha \cdot t\rfloor}},

where \lfloor x\rfloor is the floor function (which rounds x down to the nearest integer). This is, of course, not as gas efficient as the linear decay, but may not be much more costly to implement (as we just need a min function and a handful or two of divisions by 2). Second, we can reason about this using notions like that of a “half-life.” For example, we could choose \alpha such that the tax rate drops to \frac{1}{10} its value every X days, and so after 2X days the tax \tau is only 1% of \tau_{c,t=0}. I bring this up as I think it may be easier to justify a drop by a certain percentage (rather than linear decay to zero) if we do not know how much time is needed to satisfactorily mitigate the effects of frequent subgraph upgrades.
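For concreteness, the repeated-halving form proposed above can be sketched in Python (the function name and the \alpha value in the example are illustrative, not part of the proposal):

```python
import math

def exp_decay_tax(t: int, initial_tax: float, alpha: float) -> float:
    """Exponential decay via repeated halving: tau0 / 2**floor(alpha * t).

    Because floor(alpha * t) increases by 1 every 1/alpha blocks, the tax
    halves every 1/alpha blocks -- a natural "half-life" parameterization.
    """
    return initial_tax / 2 ** math.floor(alpha * t)

# With a hypothetical alpha = 0.001 (half-life of 1_000 blocks) and a
# 2.5% initial tax, the rate halves after 1_000 blocks and is an eighth
# of the initial rate after 3_500 blocks.
after_one_half_life = exp_decay_tax(1_000, 0.025, 0.001)
```

On-chain, dividing a fixed-point value by 2**k is a single right shift by k bits, which is what makes this form cheap in gas despite the exponential.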

minor typos

• In the motivation paragraph, “If subgraphs upgraded” → “If subgraphs upgrade”
• “trasnferred” in Alice Transfer example

From an intuitive standpoint, the decay time (assuming this is the time it takes to reach 0% tax, without considering the actual implementation details, like if it’s linear or exponential decay) would probably be decided as a factor of “reallocation cycle”.

What I mean by this is: given that indexers have a soft constraint as to how long they can extend their allocations (if they choose to participate in rewards distribution), we would like to incentivize subgraph upgrades to happen within the timeframe that indexers have to recycle their allocations (given that they would need to have the subgraph synced by then).

So if we took a decay time of more than 1 reallocation cycle (28 epochs), it would most certainly be more than needed for indexers to not be bothered by an upgrade, since they would have been required to sync it a long time before that, and would’ve already received payment for doing so.

On the other hand, if we took a small fraction of a reallocation cycle, for example 1/28 (or 1 epoch), it would most likely not be enough time to sync the original deployment, and might not even be enough time to close the allocation itself before signal gets drained from the original deployment.

In my own opinion, I think a decent enough decay time would be somewhere between 1/2 and slightly over 1 reallocation cycle, so as to give a reasonable amount of time to sync and gather rewards, but also not go too overly aggressive with the stickiness incentive (although the incentive does seem pretty minor in this case, given the tax values for curation being relatively low).

PS: I haven’t done extensive modelling on this, so take this with a grain of salt, as it’s mostly boiled down to intuitive reasoning I’ve been having while thinking about the basic proposal.

@juanmardefago, this is helpful! I was suspecting the time was tied to the 28 epochs, but didn’t fully grasp why that might be until this message.

I would be open to specifying the decay in terms of half-life depending on @ariel’s input on impact to gas costs. I agree there are some intuitive benefits.

I’ll echo Juan’s explanation on the timing and would note that if we make the decay too short (whether in terms of half-life or linear decay) and subgraphs upgrade too frequently, Indexers’ expected fixed costs of indexing a subgraph will increase, which may make Indexers less responsive to indexing rewards altogether, and could hurt quality of service.

Similarly, if Indexers feel their expected fixed costs are lower, they may become more responsive to indexing rewards, and by extension curator signal, and the average Curator may need to signal less to attract an equivalent amount of Indexers, even as they pay a higher percentage of that signal as a tax.

I would be in favor of a half-life of one allocation cycle (28 epochs) or a linear decay time of 3 allocation cycles (84 epochs). Of course, these are still largely magic numbers driven by intuition.
