Dynamic Curation Tax

I have reservations about this proposal as it introduces more complexity into an already complicated & still nascent system.

In general I am also opposed to anything that introduces higher costs & complexity for subgraph developers, without unambiguous net benefit.


Great ideas in here, thanks for kicking off this discussion @Slimchance!

A downside of a tax on unsignaling is that it penalizes Curators that signal for longer, because they will have accumulated more Curator royalties in the bonding curve that will subsequently be taxed.

I generally agree with Adam that this proposal in its current form imposes too many additional costs to get its intended benefit. It feels like a blunt instrument.

A variant on this idea, however, that we’re researching at E&N is a decaying capital gains tax. For example, the protocol could tax capital gains realized within a single block at ~100% and capital gains realized over a year at ~0%. This would virtually eliminate any profit opportunities from front-running/sandwich attacks, and would instead reward Curators whose shares appreciate in price over a longer period of time.

The major downside of the above approach is that Curator shares lose some fungibility (brings them closer to UTXOs), and there would be additional bookkeeping required to track the cost-basis of curation shares in order to levy the capital gains tax. My working assumption has been that we wouldn’t be able to do something like this until The Graph migrates parts of its protocol logic to an L2, but it could be worth prototyping sooner to check that assumption.
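As a rough sketch of how such a decaying capital gains tax could be computed (the linear decay shape, the one-year horizon, and the function name below are illustrative assumptions, not a confirmed E&N design):

```python
# Sketch: capital gains tax that decays linearly from 100% at t=0
# to 0% after one year of holding. Linear decay is an assumption;
# other decay shapes (e.g. exponential) would work similarly.

SECONDS_PER_YEAR = 365 * 24 * 3600

def capital_gains_tax(gain_grt, holding_seconds, decay_period=SECONDS_PER_YEAR):
    """Tax owed on `gain_grt` of realized capital gains after holding
    curation shares for `holding_seconds`."""
    if gain_grt <= 0:
        return 0.0  # no tax on losses
    rate = max(0.0, 1.0 - holding_seconds / decay_period)
    return gain_grt * rate

# A sandwich attacker realizing gains within one block pays ~100%;
# a Curator selling after six months pays ~50%.
```

Note the bookkeeping cost mentioned above: the protocol would need to track a cost basis and acquisition time per mint, which is exactly what makes the shares less fungible.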


A small group of curators have been discussing a potential solution to this and hope to have it refined a bit more. But we’re excited to present it to the community!


Thanks for your reply Brandon.

Regarding decaying capital gains tax

  • I’d love more details on how a decaying capital gains tax would work. I assume it would calculate capital gains as (GRT received from burning curation shares - GRT deposited into the bonding curve), and then tax this amount depending on how long the signal has stayed in the bonding curve.

  • I don’t think this will prevent front-running bots and low-quality signal from curators that try to be first on the bonding curve. The first curator on the bonding curve would still carry the least amount of risk, while also having a large potential upside:

    • If a bot is the first to signal on a subgraph that ends up with low signal, the capital gains tax would be low or 0, and the bot can still sell their shares at minimal loss.

    • On the flip side - if a bot is the first to signal on a subgraph that receives a large amount of signal: sure, their shares would be locked for a while. However, they will still have significant upside when they do decide to sell, several months later. This looming sell pressure might also affect the signal of other curators, as they would know early curators will be able to sell their shares at a low tax 8-12 months after the subgraph was published.

  • Another potential issue with a decaying capital gains tax is that it rewards staying in the bonding curve for a longer time. The curation markets should not just reflect current query volume. Curators should also be incentivized to act as predictors of future query volume, so indexers can allocate resources and optimize infrastructure and cost models accordingly.

    • A decaying capital gains tax might make the curation market less sensitive to expected future changes to query volume. (Similar to flattening the bonding curve).

      • Consider a dApp that plans on doing a temporary promotion: They expect a huge increase in query volume the following month. Afterwards, they expect the query volume to return to regular levels. If Curators are able to pick up on this, they can signal on the subgraph to prepare the network for the temporary increase in query volume. With a decaying capital gains tax, this might no longer be feasible.
    • A decaying capital gains tax incentivizes the best curators - those who are able to accurately predict future behaviour - to stay locked in on a few subgraphs. I think the curation market should incentivize the best curators to keep assessing subgraphs: signaling on those they find undercurated and - equally important - unsignaling on overcurated subgraphs.

Thanks for digging into this :slight_smile: Hearing from the E&N team is super helpful.

I think Capital Gains Tax would solve the issue of front running but would also cause unwanted side effects that would affect the goals and experience of curation.

  1. At the heart of curation is the ability to adjust to market changes at a moment’s notice. Curators have their ears to the ground as they monitor:
  • New subgraphs entering the network
  • Deprecation of subgraphs
  • Updates to data being stored in a subgraph
  • Events that could lead to increase/decrease in query traffic

Being able to adjust signal based on these factors would not be possible if Curators are disincentivized from doing so.

  2. Curators would be incentivized to keep signal on a subgraph that might not be balanced to the amount of query traffic coming in. Indexers would then be misled into thinking that the subgraph should have more resources allocated to it.

As DataNexus mentioned, we’ve been working on an idea and hope to share with the community soon and would love to get your feedback on it :slight_smile:


Reverse Auction

(Dynamic Curation Tax v2)

Working with @chK8r42z @Datanexus @graphgod1 @Cole @JamesTheBondsmith @jona and @Oliver, we have further refined the idea and renamed it to “Reverse Auction”, as this is closer to what we are trying to achieve. In a reverse auction, the price starts very high and decreases over time. This allows the market to assess the “fair value” of curation shares on a specific subgraph. As we can see in the examples toward the end of this post, this would also ensure a fairer distribution of curation shares. The Dynamic Curation Tax is used to facilitate this Reverse Auction.

In this post we will look at:
  • Challenges with the Reverse Auction, and how we propose to solve them
  • Parameters for the Reverse Auction, and why they were chosen
  • Example numbers, showing how the Reverse Auction might mitigate the 3 curation pain points explained in the original post.


Challenge: The Reverse Auction might increase the profit of Curators that are good at assessing subgraphs. However, the very first Curators might receive fewer shares, which can feel punitive.

Solution: Curation Fund

  • The Reverse Auction is there to ensure a fairer distribution of curation shares, and to encourage Curators to assess a subgraph before curating on it.

  • There is no need to burn the GRT that is collected in the process. Thus, we suggest that any tax over the baseline (2.5%) is not burned, but rather deposited into the Curation Fund.

  • This fund will be governed by The Graph Foundation. The Foundation will be directed by Curators using snapshot voting. Over time, we might see this fund evolve to be governed by a Curator DAO.

  • The fund might be used for:

    • Grants that improve the curator experience: dashboards, guides.
    • Initiatives that directly improve curator revenue.

Challenge: Increased complexity for Curators

Solution: Make sure all critical information is readily available in the UI.

  • Current Reverse Auction Fee

  • “Break Even Signal”: When the total signal on a subgraph reaches this value, the Curator will be able to burn their shares to retrieve the same amount of GRT they put into the protocol. If the total signal increases above this amount, the curator will be able to sell their shares at a profit.

  • Showing these two numbers in the UI would allow Curators to make an informed decision on whether they should signal or not.
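The “Break Even Signal” can even be computed in closed form if we assume a bonding curve with a reserve ratio of 1/2, where shares minted equal the square root of the GRT in the curve (this matches the share numbers in the examples later in this post, but is an assumption about the final design). A sketch:

```python
import math

def break_even_signal(deposit, tax, prior_reserve):
    """Total GRT signal at which burning your freshly minted shares
    returns exactly `deposit`, assuming shares = sqrt(reserve)."""
    reserve_in = deposit * (1 - tax)  # GRT actually entering the curve
    s = math.sqrt(prior_reserve + reserve_in) - math.sqrt(prior_reserve)
    # Solve T - (sqrt(T) - s)**2 == deposit for the total reserve T:
    root = (deposit + s * s) / (2 * s)
    return root * root

# e.g. a developer self-signalling 10 000 GRT at a 20% tax on an empty
# curve breaks even once total signal reaches 10 125 GRT.
```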

An example of how this might look:


The Reverse Auction parameters are still subject to change. The parameters we are currently looking at look like this:

Subgraph deployment tax (20%)

  • This tax is incurred when developers “publish and signal”. Developers are allowed the first spot on the bonding curve. The tax acts as a deterrent against malicious attacks by developers, e.g. the “Bait and switch” attack explained in the original post.

    • If this number is too low, malicious attacks will not be deterred.
    • If this number is too high, developers might find the tax punitive.
    • We are working with 20%. As can be seen in the example below, developers that expect other Curators to signal will quickly “recuperate” the deployment tax.
  • Example numbers:

    • A developer self-signals with 10 000 GRT. 8 000 GRT is deposited into the bonding curve, minting 89.4 shares.
    • Another curator deposits 2 125 GRT into the bonding curve, increasing the total signal to 10 125 GRT.
    • At this point, the developer already breaks even, and could sell their shares to retrieve the initial amount of 10 000 GRT.
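These example numbers can be reproduced with a quick script, again assuming a reserve ratio of 1/2 (shares minted = √reserve, which is what yields the 89.4 figure above):

```python
import math

deposit = 10_000
dev_tax = 0.20
reserve = deposit * (1 - dev_tax)          # 8 000 GRT enters the curve
dev_shares = math.sqrt(reserve)            # ≈ 89.4 shares for the developer

curator_deposit = 2_125
total_reserve = reserve + curator_deposit  # 10 125 GRT total signal
total_shares = math.sqrt(total_reserve)    # ≈ 100.6 shares outstanding

# The developer burns all their shares; the curve pays out the difference
# between the total reserve and what the remaining shares are worth:
remaining_reserve = (total_shares - dev_shares) ** 2
dev_payout = total_reserve - remaining_reserve  # ≈ 10 000 GRT: break even
```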

Starting tax (100%)

  • When a subgraph is published, the Reverse Auction will start. The dynamic curation tax starts at 100%. It will then linearly decrease over time, until it hits 2.5%.

    • A high starting tax ensures that front-running bots and low-quality curation will no longer be profitable, even for highly anticipated subgraphs.

    • A starting tax of 100% is also symbolic: it highlights one of the intentions behind the Reverse Auction - to encourage smart curation, instead of just being the first on the bonding curve.

Reverse Auction time period (10 days)

  • The Dynamic Curation Tax will decrease linearly, block by block, until it hits the 2.5% target. The current suggestion is that the tax will decrease by ~10% each day, allowing it to hit the target tax after 10 days.
    • A longer time period gives Curators more time to assess the subgraphs.
    • A longer time period allows the subgraph some time to index. This allows Curators to assess the data provided by the subgraph.
    • If the time period is too long, network participants might not know what the curation market deems to be a “fair signal” until towards the end of the Reverse Auction period.
    • Looking at the numbers, the “sweet spot” for early Curators that expect the signal to increase might still be just a few days after the subgraph is published. See the examples later in this post.
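A minimal sketch of the proposed decay schedule (the blocks-per-day constant is a rough Ethereum mainnet approximation, not part of the proposal):

```python
BLOCKS_PER_DAY = 6_500   # rough Ethereum mainnet average (assumption)
START_TAX = 1.00         # 100% at publication
TARGET_TAX = 0.025       # 2.5% baseline curation tax
AUCTION_BLOCKS = 10 * BLOCKS_PER_DAY

def curation_tax(blocks_since_publish):
    """Dynamic Curation Tax: decays linearly, block by block, from 100%
    at publication down to the 2.5% baseline after 10 days."""
    if blocks_since_publish >= AUCTION_BLOCKS:
        return TARGET_TAX
    progress = blocks_since_publish / AUCTION_BLOCKS
    return START_TAX - (START_TAX - TARGET_TAX) * progress

# Halfway through the auction (day 5) the tax is ~51%.
```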

Curve (Linear)

  • The Dynamic Curation Tax will decrease linearly.
    • A linear curve was chosen, as it would be easy for Curators to understand and predict the tax.


Let's first look at this example


Let us see how this affects the first two pain points.

  • Front-running bots and low quality signal.
    • With the current system, the first Curators carry the least amount of risk while also having the highest potential returns. Notice how the early Curators hold a vast majority of all curation shares.

    • In a Reverse Auction, the Dynamic Curation Tax starts very high, and then decreases over time. In the example with a Reverse Auction, being first is no longer a huge advantage: the first 5 Curators all receive the same amount of curation shares.

    • When comparing the two, we see that only curator 1 and curator 2 hold more shares under the current system. The following 8 Curators all hold more shares under the Reverse Auction, even though some of them paid a higher fee.

  • High volatility
    • With the current system, the first couple Curators hold a majority of all curation shares. If they choose to sell their curation shares, the signal will shift dramatically.

      • This will leave the remaining Curators in the red.
    • In the example with Reverse Auction, the curation shares are more evenly distributed. Even if a couple Curators sell their shares, the GRT valuation of the remaining shares will not decrease as much.

      • A less volatile signal increases the predictability in service for consumers
      • A less volatile signal benefits indexers: they can better optimize their allocations and tune their infrastructure and cost models. (By extension, this would also benefit delegators)

Another example - This one with a subgraph developer signalling on their own subgraph


  • Malicious Developers
    • A malicious developer can publish a subgraph they don’t intend for anyone to query. With “Batch GNS Transactions” they will be guaranteed the first spot on the bonding curve.

      • The highest potential rewards are found early on the bonding curve. Therefore, whenever new subgraphs are published, Curators make split-second decisions on whether or not to curate the subgraph.

      • Malicious developers can take advantage of Curators rushing into bad decisions.

      • With the Reverse Auction, developers pay a 20% developer tax to be guaranteed the first spot on the bonding curve. This increases the risk for malicious developers that are looking to exploit curators.

      • With a Reverse Auction, Curators are allowed more time to assess a subgraph - they no longer need to be the first to get a sizeable amount of curation shares. This increases the chances of discovering a “malicious developer” before a decision has to be made.

    • Developers can deploy a new subgraph instead of upgrading an old one. I explained this “Developer Bait & Switch Attack” in the original post.

      • Developers are economically incentivized to create a new subgraph instead of upgrading an existing one. Each time they do this, they would be able to sell their curation shares at a profit.

      • As with the previous example: With a 20% developer tax for “publish and signal”, developers are encouraged to upgrade their version, instead of publishing a new subgraph.

      • The Reverse Auction would allow more time to assess newly published subgraphs. It would also allow time for the subgraph to be indexed, so Curators can assess the data. If a developer tries to deploy a similar subgraph under a different account, Curators might recognize that the data served is the same.

  • Note: In the example with Reverse Auction + developer self-signal, we see that “Curator 1” gets fewer shares than subsequent Curators. This highlights how a Reverse Auction benefits Curators that are able to correctly assess a subgraph, instead of those racing to be the first on the bonding curve.

I fully support this refined proposal.

It does a great job of removing the incentive to just hop in and out of subgraphs.


It is probably no surprise, but I am very in favor of this system.

An additional point that is not touched on: this system also removes the anxiety of being the first to signal. From what I’ve seen, most curators have jobs outside their role as a curator, or are students.

Keeping this leg of the network open to the ‘average but enthusiastic joe’ keeps the scale of participation in play (delegator > curator > indexer).


I hate to sound like a broken record here, but I still firmly believe that the Continuous Organization smart contract provides a solution to every issue we’ve encountered since the beginning of Curation.

Protection from front-running via a Minimum Funding Goal (time based, not GRT quantity based):

Security while participating in the MFG:


Additional, optional protections:

Volatility mitigated by pre-minted shares:

Subgraph Deployer incentive for Curators:

Here is how this looks in my eyes, as a Curator:
Example 1: Legitimate Subgraph deployer (UNISWAP) w/ or without front-running bots

  • UNISWAP official subgraph deployed
  • UNISWAP pre-mints 100 Curation Shares, with the intention of burning them. This burn would increase the future value of that subgraph’s shares & promote long-term signals from current Curators while also helping to attract future Curators. A larger upfront investment from a project would quickly gain attention from Curators, as the Subgraph is more likely to be legitimate.
  • Along with the pre-minted shares, UNISWAP opts to offer a 7-day Minimum Funding Goal period (should be a mandatory minimum of 3 days, IMO), where all shares of the subgraph cost the same price. The impact of front-running bots will be nearly eliminated, as the “bonding curve” would not go into effect until after the 7-day period. Once the bonding curve is active, a bot’s unsignalling would not cause massive volatility as it does now. If it did, however, this is when the subgraph deployer could choose to burn some of their pre-minted shares to help make up any losses incurred by non-malicious Curators.

Example 2: Illegitimate Subgraph deployer (UNIFLOP) w/ or without front-running bots

  • UNIFLOP deploys subgraph with no pre-minted shares (hint #1)
  • UNIFLOP opts for a 3-day Minimum Funding Goal Period (hint #2)
  • Eager curators & bots “ape in” to the Subgraph, only to find out 24 hours after deployment that the Subgraph is illegitimate. All human Curators unsignal, receiving a refund for their full initial investment amount (less current 2.5% burn or re-allocation as discussed above & gas costs)
  • UNIFLOP cannot close the Minimum Funding Goal period prior to the end of the 3-day period, without first paying back the GRT signaled to it by all Curators. If bots do not unsignal prior to the end of the MFG, the malicious subgraph deployer essentially “rugs” the bots & we have the perfect outcome of wrong-doers being done wrong by other wrong-doers.

I have said it before, and I will gladly say it again - I am not technologically inclined. I just believe in doing what is right, and I want to see every facet of The Graph be sustainable and prosperous for all users. Thanks for reading :beers: :man_astronaut:t2:


Thanks for replying.

Continuous Organizations are interesting. However, most of the whitepaper is not applicable to The Graph:

“A Continuous Organization is an organization type that issues securities through a Continuous Securities Offering by funneling part or all of its realized revenues to a specific type of smart-contract called a Decentralized Autonomous Trust.”

Let us break down some of the features of Continuous Organizations:
  • Different Buy Price Function and Sell Price Function

    • An investor buys shares at a higher price, f(x), than the price, g(x), at which they are able to sell them. This can be seen as an immediate decrease in value, or a “tax”. Solving for this tax:

    • Given the following:

    • Buy Price Function f(x)= ax

    • Sell Price Function g(x)= bx

    • We see that the tax is (f(x) - g(x)) / f(x) = (ax - bx) / ax.

    • Since a and b are constants, this equals a constant tax of 1 - b/a.

    • In other words, having a different Buy Price Function and Sell Price Function would be the same as adjusting the curation tax. However, unlike Dynamic Curation Tax, which would be limited to x amount of days after a subgraph has been published, this tax would persist.

    • We could choose to use a completely different price function for either or both of the price functions. However, this may result in a “dynamic tax” that depends on how many curation shares have been minted. I believe this would be very difficult for curators to work with.

    • Having a high curation tax that persists throughout the lifecycle of a subgraph will decrease the sensitivity of the curation signal.

  • Initialization period with a flat price.

    • My concern here is not just with backrunning bots. Any Curators that are part of the “Showroom”/initialization period are able to sell their shares at a profit. And they will rush to do so, decreasing the value of the remaining curation shares (unless a steep tax is imposed on the showroom). I explained this attack here: Subgraph Showroom - #45 by Slimchance.

    • Regarding your example: We cannot open the protocol to economic attacks, and then expect subgraph developers to cover the cost by buying and burning shares on their subgraph.
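To make the constant-tax point above concrete, here is a small numeric check with hypothetical slopes (a = 2, b = 1.5 are arbitrary): buying shares and immediately selling them back always loses the fraction 1 - b/a, regardless of where on the curve you are.

```python
a, b = 2.0, 1.5  # hypothetical slopes: buy price f(x) = a*x, sell price g(x) = b*x

def buy_cost(x0, dx):
    """GRT paid to mint dx shares starting from supply x0 (integral of f)."""
    return a * ((x0 + dx) ** 2 - x0 ** 2) / 2

def sell_proceeds(x0, dx):
    """GRT received for burning dx shares starting from supply x0 (integral of g)."""
    return b * (x0 ** 2 - (x0 - dx) ** 2) / 2

paid = buy_cost(100, 10)           # mint 10 shares at supply 100
received = sell_proceeds(110, 10)  # immediately burn them again
implied_tax = 1 - received / paid  # = 1 - b/a, independent of x0 and dx
```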

Edit: Let’s try to keep this thread about the pros and cons of Reverse Auction / Dynamic Curation Tax. If you’d like to dive deeper into Continuous Organizations, let’s discuss it in the Thread about Continuous Organizations.


I support this proposal. I think any change will come with trade-offs. Yes, it will reduce the gains if you manage to signal very early, but the pros far outweigh the cons. It smooths things out, and helps steer curators into thinking of the bigger picture, rather than being in it to make a quick multiple before rugging everyone else afterwards (and it helps sort out them pesky bots). If Uniswap launches tomorrow, it will damage a lot more people than it will benefit within the first couple of days. This taxation brings calm to the whole process, where we as curators can do our job as we are supposed to without feeling rushed, and thus potentially making costly errors. The system we currently have pushes curators into feeling like they have to ape in without verifying the subgraph first. I believe the proposed system will still be lucrative: when there are thousands of subgraphs, even 10 days after launch many will still be under the radar, and if we do our research properly, we can find gems, get in very early and do well out of it.

Uniswap is like Apple, everyone knows what it is, our job as curators has to go further than that and look at what the future is. There will be subgraphs deployed with very little signal and attention that could be the next uniswap. There’s still a lot of value in the proposed system, it just balances things more and makes it fairer for everyone that wants to contribute with less risk


Thank you for the response. In regards to the original post, would it make more sense for the tax rates from the Deploy & Signal and the Reverse Auction to be the same? The higher tax rate of ~90% (10,000 GRT = 31.6 shares) applies only to day-1 Curators, while the first spot is guaranteed to a Deployer who only incurs a 20% tax (10,000 GRT = 89.4 shares).

What would Curator #2 signalling on day 1 look like in the Deploy & Signal example shown above (10,000 GRT, day 3)? Is this individual/bot subject to the 90% tax?

Some smaller Subgraphs need early attention from Curators/Indexers, and I see this tax as something that might discourage early signals - regardless of a Subgraph being legitimate or not. Additionally, is there any concern that this will simply have front-runners waiting for Day 10+ to signal? Could you help me to understand what this scenario would look like:

  • DEPLOYER - 10,000 GRT w/ 20% tax (89.4 shares)
  • CURATOR #1, DAY 1 - 10,000 GRT w/ 90% tax? ( ? shares)
  • CURATOR #3 - :robot:, DAY 10 - 10,000 GRT w/ 2.5% tax
  • CURATOR #4 - :whale2:, Day 11 - $50,000 GRT w/ 2.5% tax

A malicious deployer would be given a huge, immediate financial advantage over the #1 Curator based on the amount of shares received relative to their initial investments.

  • Remember that the tax starts at 100% when the subgraph is deployed. So the tax could not be the same, as developers would then get 0 shares.

  • A very high developer tax (90%+) would feel punitive for developers that are looking to migrate or deploy their subgraph to the decentralized network. If a developer is told they need to buy 100 000 GRT in order to secure a signal of 10 000 GRT on their subgraph, I believe many projects would stop using The Graph. This does not benefit anyone.

    • Developers would also be encouraged to find “smart” ways to avoid this tax (deploy 20 identical subgraphs, but don’t tell the community which one you are going to use before you signal on it 10 days later), etc.
  • With Batch GNS Transactions, subgraph developers will be guaranteed the first spot on the bonding curve. I agree that this opens the protocol to malicious subgraph developers. This is one of the challenges we are addressing with the Reverse Auction:

    • Higher risk - With the current system, a malicious developer only risks the 2.5% curation tax. With the Reverse Auction, they risk 20%.

    • With the current system, the highest potential rewards are found early on the bonding curve. Therefore, whenever new subgraphs are published, Curators make split-second decisions on whether or not to curate the subgraph. With a Reverse Auction, Curators are allowed more time to assess a subgraph - they no longer need to be the first Curators on the bonding curve to get a sizeable amount of curation shares. This increases the chances of discovering a “malicious developer” before a decision has to be made.

  • Front-running “Day 10” would not make sense, as the tax decreases linearly, block by block - there is no discontinuity to wait for.

Feel free to hit me up in Discord, and we can dive deeper into the numbers if you’d like :slight_smile:
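For a rough illustration of the scenario in your list above (again assuming a reserve ratio of 1/2, i.e. shares = √reserve, and approximating the day-1 tax as 90%):

```python
import math

def mint(reserve, deposit, tax):
    """Mint shares for `deposit` GRT at tax rate `tax` on a sqrt curve.
    Returns (new_reserve, shares_minted)."""
    added = deposit * (1 - tax)
    shares = math.sqrt(reserve + added) - math.sqrt(reserve)
    return reserve + added, shares

r = 0.0
r, deployer = mint(r, 10_000, 0.20)   # ≈ 89.4 shares
r, curator1 = mint(r, 10_000, 0.90)   # day 1, tax ≈ 90%: ≈ 5.4 shares
r, bot      = mint(r, 10_000, 0.025)  # day 10: ≈ 42.1 shares
r, whale    = mint(r, 50_000, 0.025)  # day 11: ≈ 122.9 shares
```

Even at the 2.5% baseline tax, the later entrants mint far fewer shares per GRT than the deployer, because they are buying higher up the curve.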


Thanks for sharing! Lots of interesting thoughts.

It seems that the primary outcome of the proposed Reverse Auction mechanism is to flatten the bonding curve, so there is less benefit (indeed a bit of a cost) to signalling early. In doing so it introduces quite a lot of complexity for subgraph developers and anyone looking to signal on subgraphs. It also reduces the total amount of GRT deposited (which may have broader knock-on effects which I haven’t fully thought through).

Would an alternative be to directly make the bonding curve flatter? That would address both the front-running and volatility concerns, while keeping things simpler for all participants.


I agree with Adam that perhaps we may be trying to think of creative new ways to work around the bonding curve, rather than looking to tweak the bonding curve itself. While I do see the benefit of a system like this, I also think it’s becoming overly complicated and may end up further benefitting those with a penchant for math and game theory, more so than driving the core role of a Curator.

What if the question was posed more as: “What kind of incentives do we actually need to keep Curators working/fulfilling their part of the network?”
Although, I don’t want to derail this thread, as this conversation and proposal are well thought out, have definite merit and should be considered.


There are a few reasons why a flatter bonding curve would not be enough:

  • Front-running bots and low-quality signal: Flattening the bonding curve will not change the current dynamics. Being the first on the bonding curve would still carry the least amount of risk, while also having the largest potential upside. As before:

    • Frontrunning bots will still be able to exploit the protocol
    • Curators are rewarded for being first, and not for assessing subgraphs
  • Volatility: The volatility would decrease. However, bots and early curators would still hold more curation shares than subsequent curators.

  • Malicious Developers: A flatter bonding curve would not address this issue.

    • The highest potential rewards are still found early on the bonding curve. Therefore, whenever new subgraphs are published, Curators make split-second decisions on whether or not to curate the subgraph.

    • Malicious developers can take advantage of Curators rushing into bad decisions.

    • Developers are economically incentivized to create a new subgraph instead of upgrading an existing one. Each time they do this, they would be able to sell their curation shares at a profit.

  • Less sensitive curation signal: A flatter bonding curve would have the added drawback of making the curation signal less sensitive.

In other words, this suggestion only partially addresses one of the three curation pain points. The other two challenges are not addressed at all. A flatter bonding curve also comes with the added drawback of changing the curation dynamics after the initial subgraph launch.

The bonding curve incentivizes Curators to predict future query volume. This, in turn, lets indexers allocate resources and optimize infrastructure and cost models accordingly. By making the curve flatter, you also decrease the potential earnings from being able to predict future query volume. This would make the signal less sensitive.

Tightening the bonding curve helps with the allocation of shares among curators, but it doesn’t eliminate front-running bots, as they will still occupy the highest-reward, lowest-risk position, and it likely will not assist in reducing signal volatility. With the dynamic tax, we change the ‘wild west of signaling first’ into a ‘game of inspecting, verifying and timing your entry’, which would result in higher-quality signal for the network.

I do agree that this appears to add complexity, but the tweaks to the UI showing the “profitable make/break point” at the time of signal should offer a curator a shortcut to understanding the needed end result. As they become more acclimated to curation, they will be able to ask questions like “what total signal do I need to reach profitability on new subgraphs?”.


I think the same considerations around there still being an opportunity for frontrunners could be applied to the suggested dynamic tax, it just changes the optimum entry point for bots - as @DataNexus says, it is still about timing your entry.

A flatter bonding curve also comes with an added drawback of changing the curation dynamics post the initial subgraph launch.

I think this drawback could also be applied to any of the suggested changes.

Perhaps there is an opportunity to keep the bonding curve as it is for signal query rewards, while making the GRT deposited more like staking (i.e. you get out what you put in, minus some tax, rather than selling shares on a curve). I think this would address the issues identified (1. frontrunning bots would have no incentive to buy then sell, 2. volatility should be low as signal should only come from users who either want the subgraph to be indexed, or who genuinely believe there will be profitable query fees, and 3. there would be no opportunity for malicious developers). It would incentivize users to signal for long enough for query fees to be accrued.

This would obviously also significantly change the short-term economic opportunity for Curators, so I appreciate that it is a radical suggestion, but I think it is worth discussing in the interests of solving the challenges identified. There might be other ways that signal could be rewarded (e.g. some % of the indexing rewards for those subgraphs, more akin to Delegators - though I am sure more thought is required here).


Thanks for the feedback on the decaying capital gains tax @Slimchance. I’m realizing now that our ideas are different enough that I’ll post a more fleshed-out description in its own thread so as not to distract from the conversation here.

For now, I’ll acknowledge that you are correct that a decaying capital gains tax could make the bonding curves less sensitive to short-term information changes, but that ultimately as protocol designers we’ll have to decide over what time horizon we want to maximally reward correct predictions.

I actually think @Ohmyjog’s observation that Curation (and also Delegation) can be modeled as Continuous Organizations is a very astute one. The settings are very similar to how we’ve been thinking about these mechanisms at E&N, irrespective of the suitability of all the specific design choices that the Fairmint team made. It’s worth noting that the BlockScience team (who is also working on The Graph) audited the Fairmint continuous organization design, and the “initialization phase” is very similar to the “hatching phase” that BlockScience specified in Augmented Bonding Curves, which I describe in this thread.

My thoughts on the proposed dynamic curation tax/“reverse auction”

  • I generally agree with the critiques that the above is overly complex relative to what it accomplishes and, given that this mechanism only focuses on initialization, simpler designs are available, such as a flat bonding curve during a hatching/initialization phase as referenced by @Ohmyjog and in the thread I linked above. (I believe your critiques of a flatter bonding curve @Slimchance were in reference to a permanently flatter bonding curve and are valid in that context.)


  • The high taxes in this design impose a large deadweight loss on the system and discourage important protocol actions, specifically:
    • 20% tax for subgraph deployers is too high. There is currently a proposal to reduce the curation tax which is based on feedback that the tax might already pose too large a cost on subgraph developers and could disincentivize migrating to the decentralized network from E&N’s hosted service.
    • A near-100% tax for Curators would also strongly disincentivize Curation during that early time period, which could be desirable in some contexts, but perhaps not in others (for example, if the deployed subgraph is an upgraded version of a named subgraph that already has a good reputation, and Curators had already reviewed previous versions of that named subgraph or other subgraphs by that developer).
  • Introducing a Curator fund controlled by a DAO at the core protocol level expands the governance surface area and goes against one of the governing principles of the network: progressive minimization of governance surface area.
  • Front-running, pump-and-dump, and sandwich attacks can happen at any time, but this design only addresses the initialization of the bonding curve, so would need to be paired with something like batched bonding curves, separate in/out curves, decaying curation tax, etc. to be effective, at which point the marginal value of the additional complexity introduced by this mechanism is reduced even further.

This is a poorly illustrated vision I had; I’d love to hear your thoughts on it.

All GRT, %, and TIME values used below are only examples, not necessarily suggestions

Let’s assume a 10-day period in which ALL Curation shares cost 200 GRT. Any/all signals are taxed 10% during this time, with half of this tax (5%) being reapplied on the “sell” side of the bonding curve, at either a predetermined future time OR via some kind of linear unlock/redistribution. The other 5% of the taxes pooled from this period could be allocated to The Graph Foundation for various uses, as mentioned in the original post (see below for a possible use-case). Perhaps this 5% is then split 50/50, with half (2.5%) going to The Graph Foundation and the other half locked for the Subgraph Deployer’s benefit, to be released after 1 year (tax kickback).

After this hypothetical 10-day period is finished, the normal 2.5% tax goes into effect & the bonding curve becomes active. To discourage users/bots from unsignalling upon the redistribution of the aforementioned 5% tax OR at the first sign of profit, an exit tax of 2.5% (for this example) would apply to all withdrawals. This tax would then be applied to the “sell” side of the bonding curve, to further lessen the financial impact of large transactions on other Curators (subgraph deployers included).

Lastly, a feature which allows Subgraph Deployers, The Graph Foundation, etc. to directly deposit GRT into the “sell” side of the bonding curve, with the intention of increasing the value of current shares as an incentive/reward. The GRT could come from the pool created during the fixed-share-price period mentioned above. This would be a beneficial tool for sustaining long-term Curation signals and/or recognizing positive behaviors, in addition to acting as a kind of tax kickback for self-signalling Subgraph Deployers.

I have tried to take some positive aspects of Continuous Organization & make it usable for The Graph.
