Dynamic Curation Tax

Since the launch of curation, several challenges have been observed:


  • Frontrunning bots - Due to the nature of the bonding curve, being the first to curate on a subgraph carries the least risk while also having large upside potential. The first curator takes on the following risks:

    • 2.5% curation tax
    • gas cost
    • opportunity cost

    While the risks for subsequent curators are:

    • All of the above
    • The GRT Valuation of their curation shares might decrease

    Frontrunning bots are currently signalling on every subgraph. Attempts to deter frontrunning bots have been made:

    • So called “Bot Bait Subgraphs” have been published to the network.
    • These are subgraphs that have been published with a single intention: to bait the bots into signalling on them.
    • Even if the bots are curating with 3,000 GRT each time, the incurred curation tax is just 75 GRT. Some bots are more than 600,000 GRT in profit. These bots would need to signal on 8,000 “bait subgraphs” to break even. (Not counting gas; however, there are gas costs on the deployment side as well.)

  • High volatility - With the bonding curve starting at “0”, there is a great deal of volatility at subgraph launch. This encourages curator behavior that might be less than ideal:

    • Low quality signal: As with the point above, earlier curators face significantly lower risk. Curators might signal on a subgraph without first assessing whether the subgraph contains valuable, complete and accurate data, and whether a dApp intends to use the subgraph in production.

    • Pump-and-dump: Early curators on a subgraph stand to profit from convincing fellow curators to curate on the same subgraph.

    • Migration help: This has both positive and negative aspects. Curators are incentivized to help, and are actively helping, developers migrate from the hosted service to the decentralized network, which is great. However, some developers might feel pressured into migrating without having critical knowledge about the decentralized network, such as:

      • Developers should be the first ones to curate on their own subgraphs. Being the first curator on the bonding curve, they would be able to mint a much higher amount of shares per GRT. This ensures a decent “minimum” amount of signal on their subgraph. This can be critical for smaller dApps. If query volume is smaller than expected, or hits a temporary dump, “independent” curators might sell their shares. DApp developers would be dependent on that minimum amount of curation shares to ensure that indexers keep serving queries.
      • Some developers are not aware of how versioning works on the decentralized network, and the cost of auto-migrating curation shares to a new version.
      • Several features are only available on the Hosted Service.

  • Developers switching subgraphs - This is not a huge issue as of now. However, I think it should be addressed:
    • Subgraph developers that wish to upgrade their subgraph pay the full 2.5% curation tax on their own shares, and a further 1.25% curation tax for all other curators that have signalled on the same subgraph (and subscribed to the newest version). With the current system, a subgraph developer can, instead of upgrading, choose to “bait and switch” their curators and publish an entirely new subgraph. They are economically incentivized to do so:

      • They won’t have to pay the curation tax of migrating to a new version
      • Assuming the subgraph developer is the first to signal on their own subgraph, their shares will have appreciated in value. They would retrieve more GRT than they put into the bonding curve.

      A counter-argument would be that repeat offenders would “get a bad reputation”. However, I believe this is a poor argument for the following reasons:

      • A decentralized network should try to minimize “human trust”
      • Word-of-mouth reputation systems do not scale well with the number of subgraphs The Graph expects.
      • It would be easy for developers to deploy the same subgraph under a different name.
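The bot-bait arithmetic above can be made concrete with a quick back-of-the-envelope script (the 3,000 GRT signal size and 600,000 GRT profit are the example figures from the frontrunning section; 2.5% is the current curation tax):

```python
# Back-of-the-envelope check of the "bot bait" economics described above.
CURATION_TAX = 0.025          # current flat curation tax
SIGNAL_PER_SUBGRAPH = 3_000   # GRT a bot signals on each subgraph (example figure)
BOT_PROFIT = 600_000          # GRT profit of a successful bot (example figure)

tax_per_bait = SIGNAL_PER_SUBGRAPH * CURATION_TAX     # GRT lost per bait subgraph
baits_to_break_even = BOT_PROFIT / tax_per_bait       # baits needed to erase the profit

print(tax_per_bait, baits_to_break_even)  # 75.0 8000.0
```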


Several solutions have been proposed. In this thread, I wish to look at two of them, as well as discuss a third one:

  • https://forum.thegraph.com/t/batch-gns-transactions/2285. Batch GNS Transactions. This proposal would allow subgraph developers to curate on their own subgraph in the same transaction the subgraph is published.

    • This would go a long way toward addressing frontrunning bots. It could also help deal with the volatility at subgraph launch: if the subgraph developer gets to be the first to curate, they could hold a significant amount of curation shares. Assuming they don’t burn their shares, this would reduce the price impact of subsequent curators.

      I think this is a much needed upgrade. This might also be enough to significantly improve the protocol. We might not need to further address these challenges. I will still explore an additional solution and propose one of my own.

  • https://forum.thegraph.com/t/subgraph-showroom/2202 Showroom. I believe the showroom approach does have merit. But I also think it has some significant drawbacks:

    • Introduces another layer of complexity for developers, curators and indexers.

    • Two days is not enough time for indexers to sync a subgraph, meaning curators will still have limited data to assess the subgraph.

    • Depending on implementation, “Resolving” the showroom could be very gas intensive.

    • The solution requires a considerable amount of UI development, smart contract code and auditing:

      • The showroom
      • Resolving logic
      • “Bid reclaiming” if a “ceiling” is used.
    • Needs to take a stand on the question: “Would the subgraph developers themselves be allowed to buy shares earlier (and cheaper) than the other showroom participants?”

      • If no - developers would no longer hold a decent “minimum” signal on their own subgraph, and the quality of the service they receive would be subject to curation volatility.
      • If yes - one risks developers switching subgraphs as outlined above.
    • The showroom can make frontrunning bots an even larger issue. (As outlined in Subgraph Showroom - #45 by Slimchance )

I think it is worthwhile to fully explore simpler solutions as well. I want to propose another solution:

Dynamic curation tax.

A simpler solution could be to have a very high curation tax for curators that signal when the subgraph is first published, then reduce the curation tax over time, until it hits the “target” tax of 2.5%. Two examples are provided below. Both the timespan and the starting curation tax are just examples.

Example A

Example B
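As a sketch of what such a decaying tax schedule could look like (the starting tax, target tax and timespan are purely illustrative, as noted above):

```python
# Sketch of a linearly decaying curation tax. All parameters are illustrative.
START_TAX = 0.80    # tax at the block the subgraph is published (assumption)
TARGET_TAX = 0.025  # the protocol's current flat tax
DECAY_DAYS = 10     # days until the tax reaches the target (assumption)

def curation_tax(days_since_publish: float) -> float:
    """Linearly interpolate from START_TAX down to TARGET_TAX."""
    if days_since_publish >= DECAY_DAYS:
        return TARGET_TAX
    frac = days_since_publish / DECAY_DAYS
    return START_TAX - (START_TAX - TARGET_TAX) * frac

print(round(curation_tax(0), 4))   # 0.8
print(round(curation_tax(5), 4))   # 0.4125
print(round(curation_tax(10), 4))  # 0.025
```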

A few points on why I think this solution has merit:

  • Deterring frontrunning bots: A high curation tax on “block 1” significantly increases the risk for bots that curate indiscriminately, especially combined with the ability for developers to “Deploy and Curate”.

  • Lower volatility: Due to a higher curation tax around subgraph publication, it is no longer just about being “early”. Instead, this encourages curators to assess each subgraph and choose their time of entry.

  • Higher Quality signal: With a showroom-type solution, I fear some curators might just “follow the masses”. They would curate with the same amount and “ceiling” as other, successful curators. With a decreasing curation tax, curators are rewarded for analyzing the subgraph, and entering the curve at “the right moment”.

  • Less Unknowns: With a “showroom” approach, (especially one without a “ceiling”), the final outcome would be subject to other curators’ behavior. With this solution, curators would know the exact amount of shares they will receive, before depositing their GRT.

  • Trustlessness between developers and curators. Developers that “publishAndCurate()” are deterred from creating a new subgraph instead of upgrading the previous one.

  • Higher commitment. If developers face a significant tax when initially publishing their subgraph, this sends a powerful signal to curators and indexers alike, that the developers intend to use the subgraph in production.

  • Less complex than a showroom solution. Compared to implementing a showroom, I believe this solution is both easier to understand, as well as much easier to code.

    • The UI would not need to accommodate a “two part” solution - one for “showroom subgraphs” and one for “production subgraphs”
    • No need for a step where the showroom math is “solved”

I also think there are some drawbacks to this approach:

  • Compared to keeping the protocol as-is, this introduces complexity

  • A high tax might feel punitive to new curators.


Edit: A variant of this solution would be to have a separate curation tax for developers that publish and curate in one transaction:

  • The protocol can settle on a developer tax that is just high enough to discourage “bait and switch” attacks.

  • With the initial deploy and curate transaction being exempt from the “Dynamic Curation Tax”, it would be possible to set the initial curation tax even higher. (To further discourage curators from signaling without doing their due diligence.)

  • The main drawback to this variant would be the added complexity.

  • An example would be:

    • Allow the subgraph developer to publish and curate at a 20% curation tax.
    • Have a dynamic curation tax that starts prohibitively high: 80%.
    • Decrease the curation tax block by block, roughly 10 percentage points per day, until it reaches the target of 2.5%.

I encourage community members to publish feedback to this approach below.

  • Do you think this approach can help solve the challenges our community is facing?

  • What are some of the drawbacks you see with this approach?

  • If you think this approach is worth exploring, what parameters would make sense, and why? (Starting curation tax, number of days to reach the “target tax”, etc.)

Note : This idea has been further refined. See the post called Reverse Auction (Dynamic Curation Tax V2) later in this thread.


Hey Slim :wave:t4:

I think that modifying the curation tax is the way to go, combined with “Deploy and Signal” or “Batch GNS transactions”

The showroom is a great idea, but adding complexity to an already complex system is not necessarily a good thing. Also, making things more difficult for developers in any way is less than ideal.

I wonder, instead of a curation tax upon depositing or in addition to a deposit curation tax, would it not also make sense to tax upon un-signaling in this dynamic way that you have proposed?

My understanding is that we want quality subgraphs to have a decent amount of signal quickly so that indexers can begin to sync. Would a heavy tax on early signal then deter curators from signaling at all, and thus extend the sync time because indexers are waiting for more signal? And if the sync time is extended, then developers must wait longer for their subgraph to do the job it needs to do.

Is the tax necessary? It is possible deploy and signal could defeat the bots on its own, which keeps the protocol simpler.

To summarize my views:

Adding in a dynamic tax complicates an already complicated system.

A tax on early signal could delay the sync process

Perhaps we should look at a dynamic tax upon un-signal

Deploy and signal may be enough of a fix


While I would hate to make changes and then see that the changes were not enough, I think this would be enough to allow bot baits to ruin the bots. It would change the bots from losing 75 GRT to losing that plus a large chunk of their stacks, and would incentivize individuals to go after bots (and maybe those that blindly signal).

Seems like a quick fix, which might be what is needed, or it could lead to other problems down the line. My first issue that I don’t see resolved: the bots may be able to withstand the bot baits and still be second into a major subgraph, and still first to unsignal.


Great thoughts Slim and Bondsmith!

I think the deploy and signal is also a great idea but, I am concerned it won’t be enough. I think it could be valuable for someone to look into a dynamic un-signal tax similar to the proposals Slim made for a dynamic curation tax.

My 2 cents on a possible dynamic un-signal tax.

  1. This would dis-incentivize poor signaling because the curator’s GRT would be locked up longer or they would have to take a larger immediate loss.
  2. This would incentivize longer-term curating and hopefully provide better signals where the cost of new shares would roughly equal the future value of discounted cash flows.
  3. It would not reduce the initial speed of (quality) curation like the dynamic curation tax might.

While I like the underlying idea of what you are proposing, it’s hard to get deeper into the proposal without having some specific metrics for target tax, time to reach target, and starting tax. Would need some modelling to be run.
But one of my main drawbacks is that I think this will just create a new math problem to be optimized: at which point is it most optimal to enter the curation curve. Which just means that those who can best work through the maths will have a leg up versus everyone else, rather than solving the problem at hand.
While every solution has tradeoffs, I am leaning more towards trying to ensure that whatever the solution may be, that those with a better grasp on the maths, or better able to code a bot, won’t be advantaged (I know this may be pie in the sky thinking, but that’s where my head’s at currently).

There is definitely merit to this approach though, I just need to sit with it some more to better opine.


I like this idea. Well done!


I totally agree with TheBondsmith, charging on un-signaling is a great solution and will give us higher rewards during our curation and before we un-signal.


Batching the signal and deploy txns is vital. That one change would make a significant proportional change to the bot’s impact. As is often the case, simple solutions are best implemented first.


I have reservations about this proposal as it introduces more complexity into an already complicated & still nascent system.

In general I am also opposed to anything that introduces higher costs & complexity for subgraph developers, without unambiguous net benefit.


Great ideas in here, thanks for kicking off this discussion @Slimchance!

A downside of a tax based on unsignaling is that it penalizes Curators that signal for longer, because they will have accumulated more Curator royalties in the bonding curve that will subsequently be taxed.

I generally agree with Adam that this proposal in its current form imposes too many additional costs to get its intended benefit. It feels like a blunt instrument.

A variant on this idea, however, that we’re researching at E&N is a decaying capital gains tax. For example, the protocol could tax capital gains earned over a period of a block at ~100% and capital gains earned over a year at ~0%. This would virtually eliminate any profit opportunities from front-running/ sandwich attacks, and would instead reward Curators whose shares appreciate in price over a longer period of time.

The major downside of the above approach is that Curator shares lose some fungibility (brings them closer to UTXOs), and there would be additional bookkeeping required to track the cost-basis of curation shares in order to levy the capital gains tax. My working assumption has been that we wouldn’t be able to do something like this until The Graph migrates parts of its protocol logic to an L2, but it could be worth prototyping sooner to check that assumption.
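The decaying capital gains tax described above could be sketched as follows. The linear decay shape, the blocks-per-year figure and the GRT amounts are all assumptions for illustration, not a proposed implementation:

```python
# Sketch of a decaying capital-gains tax: gains realized after one block are
# taxed at ~100%, gains realized after a year at ~0%. The linear decay and
# all numbers below are illustrative assumptions.
BLOCKS_PER_YEAR = 2_628_000  # ~12-second blocks; illustrative

def capital_gains_tax_rate(holding_blocks: int) -> float:
    """Tax rate decays linearly from 100% (instant flip) to 0% (held a year)."""
    frac_of_year = min(holding_blocks / BLOCKS_PER_YEAR, 1.0)
    return 1.0 - frac_of_year

def after_tax_proceeds(cost_basis: float, sale_value: float, holding_blocks: int) -> float:
    gain = max(sale_value - cost_basis, 0.0)
    return sale_value - gain * capital_gains_tax_rate(holding_blocks)

# A sandwich bot flipping within one block keeps essentially none of its gain:
print(after_tax_proceeds(1000, 1500, holding_blocks=1))
# A curator holding for a full year keeps the whole gain:
print(after_tax_proceeds(1000, 1500, holding_blocks=BLOCKS_PER_YEAR))
```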


A small group of curators have been discussing a potential solution to this and hope to have it refined a bit more. But we’re excited to present it to the community!


Thanks for your reply Brandon.

Regarding decaying capital gains tax

  • I’d love more details on how a decaying capital gains tax would work. I assume it would calculate capital gains as (GRT received from burning curation shares - GRT deposited into the bonding curve), and then tax this amount depending on how long the signal has stayed in the bonding curve.

  • I don’t think this will prevent front-running bots and low quality signal from curators that try to be first on the bonding curve. The first curator on the bonding curve would still carry the least amount of risk, while also having a large potential upside:

    • If a bot is the first to signal on a subgraph that ends up with low signal, the capital gains tax would be low or 0, and the bot can still sell their shares at minimal loss.

    • On the flip side - if a bot is the first to signal on a subgraph that receives a large amount of signal: sure, their shares would be locked for a while. However, they will still have significant upside when they do decide to sell, several months later. This looming sell-off might also affect the signal of other curators, as they would know early curators will be able to sell their shares at a low tax 8-12 months after the subgraph was published.

  • Another potential issue with a decaying capital gains tax is that it rewards staying in the bonding curve for a longer time. The curation market should not just reflect current queries. Curators should also be incentivized to act as predictors of future query volume, so indexers can allocate resources and optimize infrastructure and cost models accordingly.

    • A decaying capital gains tax might make the curation market less sensitive to expected future changes to query volume. (Similar to flattening the bonding curve).

      • Consider a dApp that plans on doing a temporary promotion: they expect a huge increase in query volume the following month. Afterwards, they expect the query volume to return to regular levels. If Curators are able to pick up on this, they can signal on the subgraph to prepare the network for the temporary increase in query volume. With a decaying capital gains tax, this might no longer be feasible.
    • A decaying capital gains tax incentivizes the best curators - those who are able to accurately predict future behaviour - to stay locked in on a few subgraphs. I think the curation market should incentivize the best curators to keep assessing subgraphs: signaling on those they find undercurated and - equally important - unsignaling on overcurated subgraphs.

Thanks for digging into this :slight_smile: Hearing from the E&N team is super helpful.

I think Capital Gains Tax would solve the issue of front running but would also cause unwanted side effects that would affect the goals and experience of curation.

  1. At the heart of curation is the ability to adjust to market changes at a moment’s notice. Curators have their ears to the ground as they monitor:
  • New subgraph entrance
  • Deprecation of subgraphs
  • Updates to data being stored in a subgraph
  • Events that could lead to increase/decrease in query traffic

Being able to adjust signal based on these factors would not be possible if they are disincentivized to do so.

  2. Curators would be incentivized to keep signal on a subgraph that might not be balanced to the amount of query traffic coming in. Indexers would then be misled into thinking that the subgraph should have more resources allocated to it.

As DataNexus mentioned, we’ve been working on an idea and hope to share with the community soon and would love to get your feedback on it :slight_smile:


Reverse Auction

(Dynamic Curation Tax v2)

Working with @chK8r42z @Datanexus @graphgod1 @Cole @JamesTheBondsmith @jona and @Oliver we have further refined the idea. We have renamed the idea to “Reverse Auction”, as this is closer to what we are trying to achieve. In a reverse auction, the price starts very high and decreases over time. This allows the market to assess the “fair value” of curation shares on a specific subgraph. As we can see in the examples toward the end of this post, this would also ensure a fairer distribution of curation shares. The Dynamic Curation Tax is used to facilitate this Reverse Auction.

In this post we will look at:
  • Challenges with the Reverse Auction, and how we propose to solve them
  • Parameters for the Reverse Auction, and why they were chosen
  • Example numbers, showing how the Reverse Auction might mitigate the 3 curation pain points explained in the original post.


Challenge: The Reverse Auction might increase the profit of Curators that are good at assessing subgraphs. However, the very first Curators might receive fewer shares, which can feel punitive:

Solution: Curation Fund

  • The Reverse Auction is there to ensure a fairer distribution of curation shares, and to encourage Curators to assess a subgraph before curating on it.

  • There is no need to burn the GRT that is collected in the process. Thus, we suggest that any tax over the baseline (2.5%) is not burned, but rather deposited into a curator fund.

  • This fund will be governed by The Graph Foundation. The Foundation will be directed by Curators using snapshot voting. Over time, we might see this fund evolve to be governed by a Curator DAO

  • The fund might be used for

    • Grants that improve the curator experience: dashboards, guides.
    • Initiatives that directly improve curator revenue.

Challenge: Increased complexity for Curators

Solution: Make sure all critical information is readily available in the UI.

  • Current Reverse Auction Fee

  • “Break Even Signal”: When the total signal on a subgraph reaches this value, the Curator will be able to burn their shares to retrieve the same amount of GRT they put into the protocol. If the total signal increases above this amount, the curator will be able to sell their shares at a profit.

  • Showing these two numbers in the UI would allow Curators to make an informed decision on whether they should signal or not.
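As a sketch of how a UI might compute the “Break Even Signal”, assuming The Graph’s default bonding curve with a 1/2 reserve ratio (so total shares = √(total GRT in the curve) — a simplification) and illustrative numbers throughout:

```python
import math

# Sketch: computing the proposed "Break Even Signal" for a prospective curator.
# Assumes a square-root bonding curve (reserve ratio 1/2); all numbers illustrative.

def mint_shares(reserve: float, deposit: float, tax: float) -> float:
    """Shares minted for `deposit` GRT when the curve already holds `reserve` GRT."""
    return math.sqrt(reserve + deposit * (1 - tax)) - math.sqrt(reserve)

def break_even_signal(deposit: float, shares: float) -> float:
    """Total signal S at which burning `shares` returns exactly `deposit`.

    Burning s of the T = sqrt(S) outstanding shares returns S - (sqrt(S) - s)^2.
    Setting that equal to the deposit D and solving gives sqrt(S) = (D + s^2) / (2s).
    """
    return ((deposit + shares ** 2) / (2 * shares)) ** 2

reserve = 50_000.0   # GRT already in the curve (illustrative)
deposit = 10_000.0   # this curator's deposit
tax = 0.30           # reverse-auction tax at the moment of signalling
shares = mint_shares(reserve, deposit, tax)
print(round(break_even_signal(deposit, shares)))
```

At a 30% tax, the break-even total signal sits well above the current signal, which is exactly the information a curator would need before deciding to enter.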

An example of how this might look:


The Reverse Auction parameters are still subject to change. The parameters we are currently looking at look like this:

Subgraph deployment tax (20%)

  • This tax is incurred when developers “publish and signal”. Developers are allowed the first spot on the bonding curve. The tax acts as a deterrent to malicious attacks by developers, e.g. the “bait and switch” attack explained in the original post.

    • If this number is too low, malicious attacks will not be deterred.
    • If this number is too high, developers might find the tax punitive.
    • We are working with 20%. As can be seen in the example below, developers that expect other Curators to signal will quickly “recuperate” the deployment tax.
  • Example numbers:

    • A developer self-signals with 10,000 GRT. 8,000 GRT is deposited into the bonding curve, minting 89.4 shares.
    • Another curator deposits 2,125 GRT into the bonding curve, increasing the total signal to 10,125 GRT.
    • At this point, the developer has already broken even, and could sell their shares to retrieve the initial amount of 10,000 GRT.
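The example numbers above can be reproduced under the assumption that the curve mints shares as the square root of the GRT reserve (reserve ratio 1/2), and ignoring the second curator’s own tax, as the example does:

```python
import math

# Reproducing the developer self-signal example, assuming a square-root
# bonding curve (reserve ratio 1/2) and ignoring the second curator's tax.
DEV_TAX = 0.20
dev_deposit = 10_000.0

reserve = dev_deposit * (1 - DEV_TAX)   # 8,000 GRT enters the curve
dev_shares = math.sqrt(reserve)         # shares minted for the developer
print(round(dev_shares, 1))             # 89.4

reserve += 2_125.0                      # second curator's deposit -> 10,125 GRT
total_shares = math.sqrt(reserve)

# GRT the developer would receive by burning all of their shares now:
dev_proceeds = reserve - (total_shares - dev_shares) ** 2
print(round(dev_proceeds))              # 10000 -> break even
```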

Starting tax (100%)

  • When a subgraph is published, the Reverse Auction will start. The dynamic curation tax starts at 100%. It will then linearly decrease over time, until it hits 2.5%.

    • A high starting tax ensures that front-running bots and low-quality curation will no longer be profitable. Even for highly anticipated subgraphs.

    • A starting tax of 100% is also symbolic: It highlights one of the intentions behind the Reverse Auction: To encourage smart curation, instead of just being the first on the bonding curve.

Reverse Auction time period (10 days)

  • The Dynamic Curation Tax will decrease linearly, block by block, until it hits the target tax. The current suggestion is that the tax will decrease by roughly 10 percentage points each day, allowing it to hit the target tax after 10 days.
    • A longer time period gives Curators more time to assess the subgraphs.
    • A longer time period allows the subgraph some time to index. This allows Curators to assess the data provided by the subgraph.
    • If the time period is too long, network participants might not know what the curation market deems to be a “fair signal” until towards the end of the Reverse Auction period.
    • Looking at the numbers, the “sweet spot” for early Curators that expect the signal to increase might still be just a few days after the subgraph is published. See the examples later in this post.

Curve (Linear)

  • The Dynamic Curation Tax will decrease linearly.
    • A linear curve was chosen, as it would be easy for Curators to understand and predict the tax.


Let's first look at this example


Let us see how this affects the first two pain points.

  • Front-running bots and low quality signal.
    • With the current system, the first Curators carry the least amount of risk while also having the highest potential returns. Notice how the early Curators hold a vast majority of all curation shares.

    • In a Reverse Auction, the Dynamic Curation Tax starts very high, and then decreases over time. In the example with a Reverse Auction, being first is no longer a huge advantage: the first 5 Curators all receive the same amount of curation shares.

    • When comparing the two, we see that only curator 1 and curator 2 hold a larger amount of shares with the current system. The following 8 Curators all hold more shares after the Reverse Auction, even though some of the Curators paid a higher fee.

  • High volatility
    • With the current system, the first couple Curators hold a majority of all curation shares. If they choose to sell their curation shares, the signal will shift dramatically.

      • This will leave the remaining Curators in the red
    • In the example with Reverse Auction, the curation shares are more evenly distributed. Even if a couple Curators sell their shares, the GRT valuation of the remaining shares will not decrease as much.

      • A less volatile signal increases the predictability in service for consumers
      • A less volatile signal benefits indexers: they can better optimize their allocations and tune their infrastructure and cost models. (By extension, this would also benefit delegators)

Another example - This one with a subgraph developer signalling on their own subgraph


  • Malicious Developers
    • A malicious developer can publish a subgraph they don’t intend for anyone to query. With “Batch GNS Transactions” they will be guaranteed the first spot on the bonding curve.

      • The highest potential rewards are found early on the bonding curve. Therefore, whenever new subgraphs are published, Curators make split-second decisions on whether or not to curate the subgraph.

      • Malicious developers can take advantage of Curators rushing into bad decisions.

      • With the Reverse Auction, developers pay a 20% developer tax to be guaranteed the first spot on the bonding curve. This increases the risk for malicious developers that are looking to exploit curators.

      • With a Reverse Auction, Curators are allowed more time to assess a subgraph. They no longer need to be the first in order to get a sizeable amount of curation shares. This increases the chances of discovering a “malicious developer” before a decision has to be made.

    • Developers can deploy a new subgraph instead of upgrading an old one. I explained this “Developer Bait & Switch Attack” in the original post.

      • Developers are economically incentivized to create a new subgraph instead of upgrading an existing one. Each time they do this, they would be able to sell their curation shares at a profit.

      • As with the previous example: With a 20% developer tax for “publish and signal”, developers are encouraged to upgrade their version, instead of publishing a new subgraph.

      • The Reverse Auction would allow more time to assess newly published subgraphs. It would also allow time for the subgraph to be indexed, so Curators can assess the data. If a developer tries to deploy a similar subgraph under a different account, Curators might recognize that the data served is the same.

  • Note: In the example with Reverse Auction + developer self-signal, we see that “Curator 1” gets fewer shares than subsequent Curators. This highlights how a Reverse Auction benefits Curators that are able to correctly assess a subgraph, instead of those racing to be the first on the bonding curve.

I fully support this refined proposal.

It does a great job of taking away the incentive to just hop in and out of subgraphs.


It is probably no surprise, but I am very in favor of this system.

An additional point that is not touched on: this system also removes the anxiety of being the first to signal. From what I’ve seen, most curators have jobs outside their role as a curator, or are students.

Keeping this leg of the network open to the ‘average but enthusiastic joe’ keeps the scale of participation in play (delegator > curator > indexer).


I hate to sound like a broken record here, but I still firmly believe that the Continuous Organization smart contract provides a solution to every issue we’ve encountered since the beginning of Curation.

Protection from front-running via a Minimum Funding Goal (time based, not GRT quantity based):

Security while participating in the MFG:


Additional, optional protections:

Volatility mitigated by pre-minted shares:

Subgraph Deployer incentive for Curators:

Here is how this looks in my eyes, as a Curator:
Example 1: Legitimate Subgraph deployer (UNISWAP) w/ or without front-running bots

  • UNISWAP official subgraph deployed
  • UNISWAP pre-mints 100 Curation Shares, with the intention of burning them. This burn would increase the future value of that subgraph’s shares & promote long-term signals from current Curators while also helping to attract future Curators. A larger upfront investment from a project would quickly gain attention from Curators, as the Subgraph is more likely to be legitimate.
  • Along with the pre-minted shares, UNISWAP opts to offer a 7-day Minimum Funding Goal period (there should be a mandatory minimum of 3 days, IMO), where all shares of the subgraph cost the same price. The impact of front-running bots will be nearly eliminated, as the bonding curve would not go into effect until after the 7-day period. Once the bonding curve is active, a bot’s unsignalling would not cause massive volatility as it does now. If it did, however, this is when the subgraph deployer could choose to burn some of their pre-minted shares to help make up any losses incurred by non-malicious Curators.

Example 2: Illegitimate Subgraph deployer (UNIFLOP) w/ or without front-running bots

  • UNIFLOP deploys subgraph with no pre-minted shares (hint #1)
  • UNIFLOP opts for a 3-day Minimum Funding Goal Period (hint #2)
  • Eager curators & bots “ape in” to the Subgraph, only to find out 24 hours after deployment that the Subgraph is illegitimate. All human Curators unsignal, receiving a refund for their full initial investment amount (less current 2.5% burn or re-allocation as discussed above & gas costs)
  • UNIFLOP cannot close the Minimum Funding Goal period prior to the end of the 3-day period, without first paying back the GRT signaled to it by all Curators. If bots do not unsignal prior to the end of the MFG, the malicious subgraph deployer essentially “rugs” the bots & we have the perfect outcome of wrong-doers being done wrong by other wrong-doers.

I have said it before, and I will gladly say it again - I am not technologically inclined. I just believe in doing what is right, and I want to see every facet of The Graph be sustainable and prosperous for all users. Thanks for reading :beers: :man_astronaut:t2:


Thanks for replying.

Continuous Organizations are interesting. However, most of the whitepaper is not applicable to The Graph:

“A Continuous Organization is an organization type that issues securities through a Continuous Securities Offering by funneling part or all of its realized revenues to a specific type of smart-contract called a Decentralized Autonomous Trust.”

Let us break down some of the features of Continuous Organizations:
  • Different Buy Price Function and Sell Price Function

    • An investor buys shares at a higher price f(x) than the price g(x) at which they are able to sell them. This can be seen as an immediate decrease in value, or a “tax”. Solving for this tax:

    • Given the following:

    • Buy Price Function f(x)= ax

    • Sell Price Function g(x)= bx

    • We see that the per-unit tax is (f(x) − g(x)) / f(x) = (ax − bx) / (ax) = 1 − b/a

    • Since a and b are constants, this equals a constant tax of 1-b/a.

    • In other words, having different Buy and Sell Price Functions would be equivalent to adjusting the curation tax. However, unlike the Dynamic Curation Tax, which would be limited to the first x days after a subgraph has been published, this tax would persist.

    • We could choose a completely different price function for either or both of the price functions. However, this may result in a “dynamic tax” that depends on how many curation shares are minted. I believe this would be very difficult for curators to work with.

    • Having a high curation tax that persists throughout the lifecycle of a subgraph will decrease the sensitivity of the curation signal.
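The derivation above can be checked with a couple of lines of code. This is a sketch of the arithmetic only; `persistent_tax` is an illustrative name, and the linear price functions f(x) = ax, g(x) = bx are the ones assumed in the post.

```python
def persistent_tax(a: float, b: float) -> float:
    """Fraction of value lost by buying at f(x) = a*x and selling at
    g(x) = b*x: (f(x) - g(x)) / f(x) = 1 - b/a, independent of x."""
    return 1 - b / a

# E.g. buying at f(x) = 2x and selling at g(x) = 1.9x is a flat 5% tax,
# paid whenever shares are sold -- unlike a Dynamic Curation Tax that
# decays after launch.
tax = persistent_tax(2.0, 1.9)  # 0.05
```

Because x cancels out, the tax is the same whether a curator exits early or late, which is exactly why it persists throughout the subgraph's lifecycle.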

  • Initialization period with a flat price.

    • My concern here is not just with backrunning bots. Any Curator that is part of the “Showroom”/initialization period is able to sell their shares at a profit, and they will rush to do so, decreasing the value of the remaining curation shares (unless a steep tax is imposed on the showroom). I explained this attack here: Subgraph Showroom - #45 by Slimchance.

    • Regarding your example: we cannot open the protocol to economic attacks and then expect subgraph developers to cover the cost by buying and burning shares on their subgraph.

Edit: Let’s try to keep this thread about the pros and cons of the Reverse Auction / Dynamic Curation Tax. If you’d like to dive deeper into Continuous Organizations, let’s discuss it in the thread about Continuous Organizations.


I support this proposal. Any change will come with trade-offs: yes, it will reduce the gains if you manage to signal very early, but the pros far outweigh the cons. It smooths things out and helps steer curators toward thinking of the bigger picture, rather than being in it to make a quick multiple before rugging everyone else afterwards (and it helps sort out them pesky bots). If Uniswap launches tomorrow, it will damage a lot more people than it will benefit within the first couple of days. This taxation brings calm to the whole process, so we as curators can do our job as we are supposed to without feeling rushed and potentially making costly errors. The current system pushes curators into feeling like they have to ape in without verifying the subgraph first. I believe the proposed system will still be lucrative: when there are thousands of subgraphs, many will still be under the radar even 10 days after launch, and if we do our research properly we can find gems, get in very early, and do well out of it.

Uniswap is like Apple: everyone knows what it is. Our job as curators has to go further than that and look at what the future holds. There will be subgraphs deployed with very little signal and attention that could be the next Uniswap. There’s still a lot of value in the proposed system; it just balances things more and makes it fairer for everyone that wants to contribute with less risk.


Thank you for the response. In regards to the original post, would it make more sense for the tax rates from Deploy & Signal and the Reverse Auction to be the same? The higher tax rate of 90% (10,000 GRT = 31.6 shares) applies only to day-1 Curators, yet a Deployer is guaranteed to incur only a 20% tax (10,000 GRT = 89.4 shares).

What would Curator #2 signalling on day 1 look like in the Deploy & Signal example shown above (10,000 GRT, day 3)? Is this individual/bot subject to the 90% tax?

Some smaller Subgraphs need early attention from Curators/Indexers, and I see this tax as something that might discourage early signals - regardless of a Subgraph being legitimate or not. Additionally, is there any concern that this will simply have front-runners waiting for Day 10+ to signal? Could you help me to understand what this scenario would look like:

  • DEPLOYER - 10,000 GRT w/ 20% tax (89.4 shares)
  • CURATOR #1, DAY 1 - 10,000 GRT w/ 90% tax? ( ? shares)
  • CURATOR #3 - :robot:, DAY 10 - 10,000 GRT w/ 2.5% tax
  • CURATOR #4 - :whale2:, Day 11 - 50,000 GRT w/ 2.5% tax

A malicious deployer would be given a huge, immediate financial advantage over the #1 Curator based on the amount of shares received relative to their initial investments.
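For what it's worth, the share counts quoted in this thread (31.6 shares at 90% tax, 89.4 shares at 20% tax, for a 10,000 GRT deposit) are consistent with first-mint shares equal to the square root of the post-tax GRT deposit. That relationship is an assumption on my part, reproduced here only to show where the numbers come from; `first_mint_shares` is an illustrative name.

```python
import math

def first_mint_shares(grt: float, tax: float) -> float:
    """Assumed first-mint relationship: shares = sqrt(post-tax GRT)."""
    return math.sqrt(grt * (1 - tax))

day1  = round(first_mint_shares(10_000, 0.90), 1)   # 31.6 (90% tax)
deploy = round(first_mint_shares(10_000, 0.20), 1)  # 89.4 (20% tax)
late  = round(first_mint_shares(10_000, 0.025), 1)  # 98.7 (2.5% tax)
```

Under this assumption the deployer's 89.4 shares vs. the day-1 Curator's 31.6 shares quantifies the advantage the question above is pointing at: nearly 3× the shares for the same 10,000 GRT.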
