Dynamic Curation Tax

  • Remember that the tax starts at 100% when the subgraph is deployed, so the developer tax could not be set to the same rate; developers would then receive 0 shares.

  • A very high developer tax (90%+) would feel punitive for developers who are looking to migrate or deploy their subgraph to the decentralized network. If a developer is told they need to buy 100,000 GRT in order to secure a signal of 10,000 GRT on their subgraph, I believe many projects would stop using The Graph. This does not benefit anyone. (A quick sketch of this arithmetic follows this list.)

    • Developers would also be encouraged to find “smart” ways to avoid this tax (deploy 20 identical subgraphs, but don’t tell the community which one you are going to use until you signal on it 10 days later, etc.).
  • With Batch GNS Transactions, subgraph developers will be guaranteed the first spot on the bonding curve. I agree that this opens the protocol to malicious subgraph developers. This is one of the challenges we are addressing with the Reverse Auction:

    • Higher risk - With the current system, a malicious developer only risks the 2.5% curation tax. With the Reverse Auction, they risk 20%.

    • With the current system, the highest potential rewards are found early on the bonding curve. Therefore, whenever new subgraphs are published, Curators make split-second decisions on whether or not to curate the subgraph. With a Reverse Auction, Curators are allowed more time to assess a subgraph. They no longer need to be the first Curators on the bonding curve to get a sizeable amount of curation shares. This increases the chances of discovering a “malicious developer” before a decision has to be made.

  • Frontrunning “Day 10” would not make any sense, as the tax is linearly decreasing, block by block.
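To make the arithmetic behind the points above concrete, here is a minimal sketch (Python, purely illustrative; it assumes the dynamic tax is simply skimmed off the deposit before it reaches the curve, which is how I read the proposal):

```python
# Minimal sketch: how much effective signal a deposit buys under the dynamic tax.
# Assumption (mine, not from the proposal text): the tax is skimmed off the
# deposit before it reaches the bonding curve.
def effective_signal(deposit_grt: float, tax_pct: float) -> float:
    return deposit_grt * (1 - tax_pct / 100)

for tax in (100, 90, 20, 2.5):
    signal = effective_signal(100_000, tax)
    print(f"{tax:>5}% tax: a 100,000 GRT deposit yields {signal:,.0f} GRT of signal")
# 100% tax -> 0 GRT (why deployers cannot be charged the same starting rate as curators)
#  90% tax -> 10,000 GRT (the "buy 100,000 GRT to secure 10,000 GRT of signal" case)
```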

Feel free to hit me up in Discord, and we can dive deeper into the numbers if you’d like :slight_smile:

2 Likes

Thanks for sharing! Lots of interesting thoughts.

It seems that the primary outcome of the proposed Reverse Auction mechanism is to flatten the bonding curve, so there is less benefit (indeed a bit of a cost) to signalling early. In doing so it introduces quite a lot of complexity, for subgraph developers and anyone looking to signal on subgraphs. It also reduces the total amount of GRT deposited (which may have broader knock-on effects that I haven’t fully thought through).

Would an alternative be to directly make the bonding curve flatter? That would address both the front-running and volatility concerns, while keeping things simpler for all participants.

3 Likes

I agree with Adam that perhaps we are trying to think of new creative ways to work around the bonding curve, rather than looking to tweak the bonding curve itself. While I do see the benefit of a system like this, I also think it’s becoming overly complicated and may end up further benefitting those with a penchant for math and game theory more so than driving the core role of a Curator.

What if the question was posed more as: “What kind of incentives do we actually need to keep Curators working/fulfilling their part of the network?”
That said, I don’t want to derail this thread; this conversation and proposal are well thought out, have definite merit, and should be considered.

3 Likes

There are a few reasons:

  • Frontrunning bots and low-quality signal: Flattening the bonding curve will not change the current dynamics. Being the first on the bonding curve would still carry the least amount of risk, while also having the largest potential upside (see the sketch after this list). As before:

    • Frontrunning bots will still be able to exploit the protocol
    • Curators are rewarded for being first, and not for assessing subgraphs
  • Volatility: The volatility would decrease. However, bots and early curators would still hold more curation shares than subsequent curators.

  • Malicious Developers: A flatter bonding curve would not address this issue.

    • The highest potential rewards are still found early on the bonding curve. Therefore, whenever new subgraphs are published, Curators make split-second decisions on whether or not to curate the subgraph.

    • Malicious developers can take advantage of Curators rushing into bad decisions.

    • Developers are economically incentivized to create a new subgraph instead of upgrading an existing one. Each time they do this, they would be able to sell their curation shares at a profit.

  • Less sensitive curation signal: A flatter bonding curve would have the added drawback of making the curation signal less sensitive.
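To illustrate the first point above, here is a rough sketch (Python, not the actual protocol implementation) comparing the first curator's share advantage on the current square-root-style curve with a hypothetical flatter one; the exponents are stand-ins, not the protocol's parameters:

```python
# Rough sketch: shares minted for the first vs a later curator on a bonding
# curve where shares = reserve ** exponent. Exponent 0.5 approximates the
# current 50% reserve-ratio curve; 0.9 stands in for a hypothetical flatter one.
def shares_minted(reserve: float, deposit: float, exponent: float) -> float:
    return (reserve + deposit) ** exponent - reserve ** exponent

for exponent in (0.5, 0.9):
    first = shares_minted(0, 1_000, exponent)        # curator 1, empty curve
    later = shares_minted(10_000, 1_000, exponent)   # a later curator, 10k GRT already signalled
    print(f"exponent {exponent}: first curator gets {first / later:.1f}x the shares of a later one")
# Prints roughly 6.5x (exponent 0.5) and 1.4x (exponent 0.9): the first-mover
# advantage shrinks on the flatter curve but never disappears.
```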

In other words - this suggestion only partially addresses one out of three curation pain points. The other two challenges are not addressed at all. A flatter bonding curve also comes with an added drawback of changing the curation dynamics post the initial subgraph launch.

The bonding curve incentivizes Curators to predict future query volume. This, in turn, lets indexers allocate resources and optimize infrastructure and cost models accordingly. By making the curve flatter, you also decrease the potential earnings from being able to predict future query volume. This would make the signal less sensitive.

Tightening the bonding curve assists with the share allocation among curators, but it doesn’t eliminate the front-running bots: they would still sit in the position with the highest reward potential and the lowest risk, and it likely would not help reduce signal volatility. With the dynamic tax, we change the ‘wild west of signalling first’ into a ‘game of inspecting, verifying and timing your entry’, which would result in higher-quality signal for the network.

I do agree that this appears to add complexity, but tweaks to the UI showing the “profitable make/break point” at the time of signalling should offer a Curator a shortcut to understanding the end result they need. As they become more acclimated to curation, they will be able to ask questions like “what total signal do I need for my position to reach profitability on new subgraphs?”.

2 Likes

I think the same considerations around there still being an opportunity for frontrunners could be applied to the suggested dynamic tax, it just changes the optimum entry point for bots - as @DataNexus says, it is still about timing your entry.

A flatter bonding curve also comes with an added drawback of changing the curation dynamics post the initial subgraph launch.

I think this drawback could also be applied to any of the suggested changes.

Perhaps there is an opportunity to keep the bonding curve as it is for signal query rewards, while making the GRT deposited more like staking (i.e. you get out what you put in, minus some tax, rather than selling shares on a curve). I think this would address the issues identified (1. frontrunning bots would have no incentive to buy then sell, 2. volatility should be low as signal should only come from users who either want the subgraph to be indexed, or who genuinely believe there will be profitable query fees, and 3. there would be no opportunity for malicious developers). It would incentivize users to signal for long enough for query fees to be accrued.

This would obviously also significantly change the short-term economic opportunity for Curators, so I appreciate that it is a radical suggestion, but I think it is worth discussing in the interests of solving the challenges identified. There might be other ways that signal could be rewarded (e.g. some % of the indexing rewards for those subgraphs, more akin to Delegators - though I am sure more thought is required here).
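For what it's worth, here is one very rough reading of the "get out what you put in, minus some tax" idea above, with made-up numbers; the 2.5% exit tax and the fee amounts are placeholders of mine, not part of the suggestion:

```python
# Hypothetical sketch of the "deposit-as-stake" idea: query-fee rewards are still
# split pro rata by signal, but exit value is simply the deposit minus a tax,
# with no bonding-curve gain or loss. All numbers are illustrative.
EXIT_TAX = 0.025  # assumed 2.5%, mirroring the current curation tax

deposits = {"curator_a": 5_000, "curator_b": 1_000}   # GRT staked as signal
total_signal = sum(deposits.values())
query_fees_for_curators = 300                         # GRT accrued, example number

for name, grt in deposits.items():
    fee_share = query_fees_for_curators * grt / total_signal  # pro rata, as on the curve today
    exit_value = grt * (1 - EXIT_TAX)                          # get out what you put in, minus tax
    print(name, round(fee_share, 1), exit_value)
```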

1 Like

Thanks for the feedback on the decaying capital gains tax @Slimchance. I’m realizing now that our ideas are different enough that I’ll post a more fleshed-out description in its own thread so as not to distract from the conversation here.

For now, I’ll acknowledge that you are correct that a decaying capital gains tax could make the bonding curves less sensitive to short-term information changes, but ultimately, as protocol designers, we’ll have to decide over what time horizon we want to maximally reward correct predictions.

I actually think @Ohmyjog’s observation that Curation (and also Delegation) can be modeled as Continuous Organizations is a very astute one. The settings are very similar, and it is close to how we’ve been thinking about these mechanisms at E&N, irrespective of the suitability of all the specific design choices that the Fairmint team made. It’s worth noting that the BlockScience team (who is also working on The Graph) audited the Fairmint continuous organization design, and the “initialization phase” is very similar to the “hatching phase” that BlockScience specified in Augmented Bonding Curves, which I describe in this thread.

My thoughts on the proposed dynamic curation tax/“reverse auction”

  • I generally agree with the critiques that the above is overly complex relative to what it accomplishes and, given that this mechanism only focuses on initialization, simpler designs are available, such as a flat bonding curve during a hatching/initialization phase, as referenced by @Ohmyjog and in the thread I linked above. (I believe your critiques of a flatter bonding curve, @Slimchance, were in reference to a permanently flatter bonding curve and are valid in that context.)

Additionally:

  • The high taxes in this design impose a large deadweight loss on the system and discourage important protocol actions, specifically:
    • A 20% tax for subgraph deployers is too high. There is currently a proposal to reduce the curation tax, based on feedback that the tax might already pose too large a cost on subgraph developers and could disincentivize migrating to the decentralized network from E&N’s hosted service.
    • A near-100% tax for Curators would also strongly disincentivize curation during that early time period, which could be desirable in some contexts, but perhaps not in others (for example, if the deployed subgraph is an upgraded version of a named subgraph that already has a good reputation, and Curators had already reviewed previous versions of that named subgraph or other subgraphs by that developer).
  • Introducing a Curator fund controlled by a DAO at the core protocol level expands the governance surface area and goes against one of the governing principles of the network: progressive minimization of governance surface area.
  • Front-running, pump-and-dump, and sandwich attacks can happen at any time, but this design only addresses the initialization of the bonding curve, so it would need to be paired with something like batched bonding curves, separate in/out curves, a decaying curation tax, etc. to be effective, at which point the marginal value of the additional complexity introduced by this mechanism is reduced even further.
5 Likes

This is a poorly illustrated vision I had; I’d love to hear your thoughts on it.

All GRT, %, and time values used below are only examples, not necessarily suggestions.

Let’s assume a 10-day period in which ALL Curation shares cost 200 GRT. Any and all signals are taxed 10% during this time, with half of this tax (5%) being reapplied on the “sell” side of the bonding curve at either a predetermined future time OR via some kind of linear unlock/redistribution. The other 5% of the taxes pooled from this period could be allocated to The Graph Foundation for various uses, as mentioned in the original post (see below for a possible use case). Perhaps this 5% is then split 50/50, with half (2.5%) going to The Graph Foundation and the other half locked for the Subgraph Deployer's benefit, to be released after 1 year (tax kickback).

After this hypothetical 10-day period is finished, the normal 2.5% tax goes into effect & the bonding curve becomes active. To discourage users/bots from unsignalling upon the redistribution of the aforementioned 5% tax OR at the first sign of profit, an exit tax of 2.5% (for this example) would apply to all withdrawals. This tax would then be applied to the “sell” side of the bonding curve to further lessen the financial impact of large transactions on other Curators (subgraph deployers included).

Lastly, a feature which allows Subgraph Deployers or The Graph Foundation etc. to directly deposit GRT into the “sell” side of the Bonding Curve, with the intention of increasing the value of current shares as an incentive/reward. The GRT could come from the pool created during the fixed-share-price period mentioned above. This would be a beneficial tool for sustaining long-term Curation signals and/or recognizing positive behaviors etc., in addition to acting as a kind of tax-kickback for self-signalling Subgraph Deployers.
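A minimal sketch of the tax flows described above, using the example percentages from this post (the assumption that the 10% is skimmed off the deposit, and the 20,000 GRT figure, are mine):

```python
# Illustrative only: where a signal's tax goes during the hypothetical 10-day
# fixed-price period, plus the exit tax afterwards. All rates are the example
# values from this post.
deposit = 20_000                      # GRT signalled during the fixed-price period
period_tax = deposit * 0.10           # 10% tax while shares cost a flat 200 GRT

to_sell_side_later  = period_tax * 0.50   # 5% of the deposit, re-applied to the "sell" side
to_graph_foundation = period_tax * 0.25   # 2.5%, to The Graph Foundation
locked_for_deployer = period_tax * 0.25   # 2.5%, released to the Subgraph Deployer after 1 year

shares_bought = (deposit - period_tax) / 200   # 18,000 GRT at 200 GRT per share = 90 shares
exit_tax_on_withdrawal = 0.025                 # applied later, redirected to the "sell" side

print(to_sell_side_later, to_graph_foundation, locked_for_deployer, shares_bought)
# -> 1000.0 500.0 500.0 90.0
```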

I have tried to take some positive aspects of Continuous Organization & make it usable for The Graph.

1 Like

PS everyone I’m really sorry for steering this thread every which way. No harm intended :v:t2:

2 Likes

I think the same considerations around there still being an opportunity for frontrunners could be applied to the suggested dynamic tax, it just changes the optimum entry point for bots - as @DataNexus says, it is still about timing your entry.

To be able to time your entry, you would need to correctly assess the future signal on a subgraph; in other words, predict the behaviour of future Curators. This is not something a bot can easily achieve (I am not talking about frontrunning single transactions with 0 slippage protection). I believe a Reverse Auction would benefit human curators who are able to correctly assess the total signal on a subgraph.

A flatter bonding curve also comes with an added drawback of changing the curation dynamics post the initial subgraph launch. I think this drawback could also be applied to any of the suggested changes.

The Reverse Auction will definitely flatten the bonding curve in the period it is active. However, our suggestion was that it would only be active for 10 days, after which it would no longer impact the price of curation shares.

Perhaps there is an opportunity …

I would love to hear a more fleshed out proposal. (With numbers :pray: ).
I have been hesitant to propose radical changes to curation, as it would not only change the whole curation market, but could potentially change the dynamic between curators and other stakeholder groups. Though, I do like the idea.

I have also entertained another idea: Curators stake a certain amount of GRT to become Curators. They then signal to the network the expected query volume of different subgraphs, and earn query fees according to how well they perform as Curators. However, this would again require a radical change in how curation works, and many stakeholder groups might be reluctant to give their support.

1 Like

Brandon, I appreciate your feedback on this.

This option was discussed; our concern was that a complete remodel would lead to a longer implementation runway. If you feel the development time would be a similar estimate, I am open to digging further into this.

This tax would purely be for those who want to guarantee the 1st position on the bonding curve. For any subgraph that is going to receive traffic, this is a no-brainer, and the subgraph dev is oftentimes going to be the closest to knowing how much use it will see. If they don’t wish to pay this tax, they can opt not to self-signal their subgraph.

As an additional point, this would help address the issue of encouraging developers to upgrade an existing subgraph deployment rather than deploy a new GNS (causing disruption on an existing curve).

1 Like

This idea is great and has my full support. My only question is whether the dynamic curation tax has to start at 100%. I would love to see what this would look like starting with the curation tax at 50% and decreasing by 5% per day instead of 10%. I see the appeal of starting it at 100%, but it may be just as symbolic to have the tax start at 50% while also keeping the needed incentives for curators to signal early and precisely. Would love any feedback or constructive criticism on this idea!

1 Like

If we are set on the tax starting at 100%… maybe it doesn’t have to decrease linearly as currently proposed. Could we model the tax as something like the picture below for the days after deployment? I know keeping it linear makes it easy for curators to understand the tax, but if the function being used is readily available to all curators, this function is not too complex for anyone to grasp. I’m not sure this is the exact function that should be used. In my opinion, the function should still provide high incentives for being an early curator while deterring “apeing” and encouraging curators to spend more time verifying a subgraph and evaluating what a fair signal allocation is. Would love to hear some thoughts on this!

1 Like

Just an example to spark conversation… this may be a better function, and I would love to hear suggestions on how to improve this idea. In this case, after day 9, on day 10 the tax would drop to the standard 2.5% from there on out.
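Since the picture referenced above isn't reproduced here, the following is just one hypothetical schedule with the endpoints described (100% at deployment, standard 2.5% from day 10 onward); the actual function in the picture may look different:

```python
# Hypothetical quadratic tax schedule: 100% at deployment, staying high in the
# early days and dropping to the standard 2.5% curation tax from day 10 onward.
# Illustrative only; this is not the function shown in the picture above.
def tax_pct(day: float) -> float:
    if day >= 10:
        return 2.5                        # standard curation tax from day 10 onward
    return 100 * (1 - (day / 10) ** 2)    # stays high early, falls quickly near day 10

for d in range(11):
    print(d, round(tax_pct(d), 2))
# Day 0: 100.0, day 5: 75.0, day 9: 19.0, day 10: 2.5
```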

2 Likes

With all of the scenarios involving a Dynamic Tax, my concerns mainly centre around multiple Signals within the same day. As we have seen, “Day 1” of Curation is not typically going to consist of just one Signal. When used in conjunction with an active bonding curve, a transaction-based Dynamic Tax would have a compounding effect on the value of subsequent Signals which occur on any given day (i.e. Day 1, Signal 2, 3, 4, etc.). I personally prefer your time-based proposal, as it would keep the tax rate constant for the entire duration of a day. The bonding curve is still active, but Curators 2, 3 and 4 are not subject to further devaluation caused by a tax rate based on their transaction position.

1 Like

We discussed starting at 100% as a symbolic point. In other words, you are not supposed to signal without researching the subgraph first.

We also talked about the rate at which the tax reduces. I’m open to a quadratic rate as opposed to linear. This point, and the period (10 days vs 5 days), were things we were intending to get community feedback on.

2 Likes

I imagine this would be calculated on a ‘per ETH block’ basis. Five people signalling would experience consecutively lower tax rates even if they all signalled on the same day, because with each ETH block the tax reduces.
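A rough sketch of what that per-block decay could look like, assuming a 10-day window, roughly 6,500 Ethereum blocks per day (about 13-second blocks), and the linear schedule f(x) = 100 - 10x from the proposal; the block-time figures and the fallback to the standard 2.5% tax are assumptions of mine:

```python
# Rough sketch of per-block tax decay under the linear schedule f(x) = 100 - 10x,
# with x measured in days since deployment. Block counts are assumed, not exact.
BLOCKS_PER_DAY = 6_500

def dynamic_tax_pct(blocks_since_deploy: int) -> float:
    days = blocks_since_deploy / BLOCKS_PER_DAY
    return max(2.5, 100 - 10 * days)  # falls back to the standard 2.5% tax (assumption)

# Five curators signalling roughly an hour apart on day 1 each see a slightly lower rate:
for i in range(5):
    b = i * 275  # ~1 hour apart at ~13-second blocks
    print(b, round(dynamic_tax_pct(b), 3))
# 0 -> 100.0, 275 -> 99.577, 550 -> 99.154, 825 -> 98.731, 1100 -> 98.308
```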

2 Likes

Without starting at 100%, I believe we would still see frontrunning bots and low quality curation. A few example numbers:

Example 1

  • At a 90% tax, a curator signalling with 1,000 GRT will break even at 3,025 GRT total signal.

  • The same curator will be able to burn their curation shares for 2,000 GRT (doubling their initial deposit) if the total signal hits 11,000 GRT.



Example 2

  • A curator does not wait for the tax to drop to 90%, and buys shares when the dynamic tax is at 98%. This curator will break even if the total signal hits 13,004 GRT. If the total signal increases above this, they will be able to sell their shares at a profit.
    Note: I use just 2 curators in my examples. The profit/loss for curator 1 would be the same no matter how many curators signal after them, as long as the total GRT in the bonding curve adds up to the same total signal.

In other words - if a subgraph is expected to reach a total signal of more than 10,000 GRT, curators may start signalling on the very first day, even if that means a dynamic tax of over 90%.
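For anyone who wants to check these figures, here is a small sketch that reproduces them (to rounding), assuming curation shares follow a square-root bonding curve (shares proportional to the square root of the GRT reserve) and that the dynamic tax is skimmed off the deposit before it reaches the curve:

```python
# Sketch reproducing the break-even figures above, assuming shares = sqrt(reserve).
from math import sqrt

def payout_at_total_signal(deposit: float, tax_pct: float, total_signal: float) -> float:
    effective = deposit * (1 - tax_pct / 100)   # GRT reaching the curve after the dynamic tax
    my_shares = sqrt(effective)                 # first curator on an empty curve
    all_shares = sqrt(total_signal)             # later curators fill the curve to total_signal
    return total_signal - (all_shares - my_shares) ** 2   # GRT received for burning my shares

print(payout_at_total_signal(1_000, 90, 3_025))    # ~1,000 GRT: break-even (Example 1)
print(payout_at_total_signal(1_000, 90, 11_025))   # ~2,000 GRT: initial deposit doubled
print(payout_at_total_signal(1_000, 98, 13_005))   # ~1,000 GRT: break-even (Example 2)
```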

The linear decrease was chosen for its simplicity. It would be easy for curators to understand and predict. Though, I’d be happy to explore other functions.


Instead of looking at the dynamic curation tax itself, we can look at how it affects the “effective curation share price”:

  • If the tax is linearly decreasing over 10 days, we get a dynamic curation tax of f(x) = 100 - 10x (x = days since deployment).

  • If we look at how this affects the price of curation shares, we find g(x) = 1/(0.1x), the effective price multiplier relative to an untaxed curve.

    • We see that 1 day after a subgraph has been published, the effective price of curation shares is 10 times what it would be without a Reverse Auction (90% tax). On day 2, the price is 5 times the non-auction price.

    • In other words - even if the dynamic curation tax is linearly decreasing, the impact on the price follows a curve that is very steep during the first couple of days, then flattens out.
      This was also noted by @Cole in a workshop on the Reverse Auction.

    • Highly anticipated subgraphs might see a lot of activity during the first couple days, as the relative change is highest.
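For completeness, the relation I am using between the tax schedule and the effective share price is simply:

```latex
% Only a (1 - f(x)/100) fraction of each deposit reaches the curve,
% so a curator pays g(x) times the untaxed price per share:
g(x) = \frac{1}{1 - f(x)/100},
\qquad f(x) = 100 - 10x \;\Rightarrow\; g(x) = \frac{1}{0.1\,x},
\qquad g(1) = 10,\quad g(2) = 5,\quad g(10) = 1.
```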


We could look at other dynamic curation tax functions that would give curators more or less time to assess subgraphs. Example B in my original post features an “S-curve” that aims to do both: increase the time it takes for highly anticipated subgraphs to start receiving signal, while also decreasing the time for smaller subgraphs. The drawback is the increased complexity.

Example 3

  • Tax: f(x) = 100 - x^2

  • Relative share price: g(x) = 1/(1 - f(x)/100) = 100/x^2
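To compare the two schedules side by side, here is a quick sketch of the effective price multiplier for each over the 10-day window (my own illustration, using the same relation as above):

```python
# Effective price multiplier g = 1 / (1 - tax/100) for the linear and quadratic
# tax schedules above, day by day over the 10-day window.
def g(tax_pct: float) -> float:
    return 1 / (1 - tax_pct / 100)

for day in range(1, 11):
    linear = g(100 - 10 * day)       # f(x) = 100 - 10x
    quadratic = g(100 - day ** 2)    # f(x) = 100 - x^2
    print(day, round(linear, 2), round(quadratic, 2))
# Day 1: 10.0 vs 100.0, day 5: 2.0 vs 4.0, day 9: 1.11 vs 1.23 - the quadratic
# schedule keeps early signalling much more expensive during the first few days.
```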

2 Likes

Reading through the responses on the proposal has led me to think about how we can help each other better understand the origin of our ideas and perspectives.

Guide to Understanding You (Referenced below)

  1. Your perspective/role in relation to the curation system
  2. Your expectations/assumptions of how curation should work
  3. Your implementation timeline tolerances for a given solution
  4. Your definition and tolerance of complexity within the system

If we can share the Guide to Understanding You with each other we can start to understand the context of where each of our voices is coming from. I believe this is a crucial piece to the health and wellness of The Graph as the protocol grows and responsibilities become more and more decentralized.

Key benefits:

  • Closer knit community that understands many people’s contexts
  • Helps discussions have more clarity and momentum
  • Gives opportunity to help lift up someone else’s context/way of thinking
  • Allows boundaries/structure to be set at the start
  • Setting structure can ignite innovation by having people think outside that structure
  • Proposals can move through this process more efficiently

Here’s a list of questions to help as we discuss this proposal:

Implementation

  • What implementation timelines are appropriate?
  • Who will have to be involved in the implementation?
  • What would be the ratio of change to current codebase?

UX

  • What stakeholder in curation do you consider most important?
  • Does it improve the experience for some but degrade it for others?

Vision

  • What vision was initially created for the curator experience?
  • What pain points were expected and allowed to exist?
  • What vision do you have for the curator experience?
  • What are the long term goals you view for the curator experience?
  • What are the short term goals you view for the curator experience?

@Brandon @Josh-StreamingFast @adamfuller I’d love to hear about your Guide to Understanding You :rocket: :slight_smile:

4 Likes

I think this is a pretty good idea actually, Jona!
I would recommend starting up a new thread though, so as not to bury this within the dynamic curation thread. I see great value in an exercise like this for any topic that is being discussed. I also think it shouldn’t be limited as to who could/should be responding. I think that anyone taking part in the discussion should fill it out, as it will add more colour to who we are debating/discussing with, which is super important.

4 Likes