Has changing the per-block rate to a per-epoch rate been considered? If not, why isn’t that a possible solution?
In addition to what Dave said, this carries the risk of letting an attacker time their signal and allocation towards the very end of an epoch and receive a full epoch’s worth of rewards instead of just a few blocks’ worth. There might be other book-keeping we could do to mitigate this, but it adds complexity.
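To make the timing risk concrete, here is a minimal sketch comparing the two accrual schemes for a late signaler. The epoch length and per-block reward are illustrative assumptions, not actual protocol parameters, and `naive_per_epoch_reward` models the simplest per-epoch accounting (any signal present at epoch close earns the whole epoch) rather than any proposed design:

```python
EPOCH_BLOCKS = 6646        # assumed blocks per epoch (illustrative)
REWARD_PER_BLOCK = 1.0     # assumed reward issued per block (illustrative)

def per_block_reward(blocks_signaled: int) -> float:
    """Rewards accrue only for the blocks the signal was actually present."""
    return REWARD_PER_BLOCK * blocks_signaled

def naive_per_epoch_reward(blocks_signaled: int) -> float:
    """Naive per-epoch accounting: any signal present at epoch close
    earns the full epoch's rewards, regardless of when it arrived."""
    return REWARD_PER_BLOCK * EPOCH_BLOCKS if blocks_signaled > 0 else 0.0

# An attacker who signals 10 blocks before the epoch boundary:
late = 10
print(per_block_reward(late))        # 10.0
print(naive_per_epoch_reward(late))  # 6646.0
```

Under the naive scheme the attacker collects roughly 665x the rewards their signal time actually covered, which is the book-keeping gap that a per-epoch design would need to close.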
I am also struggling to see the direct tie-in to lowering the curation tax rate. I have looked through the maths and understand the point being made, but I still don’t see how it ties back directly to a “need” to lower the curation tax rate.
You’re correct that the math doesn’t justify a need to lower the curation tax; it only establishes a lower bound. At this point, as @davekaj noted, the need is anecdotal, based on feedback that has come in from subgraph developers.
I don’t know of a way to mathematically prove a “correct” level of curation tax, though one thing we could do with more time would be a full accounting of the costs to subgraph developers of using the decentralized network, to make sure they are not excessive compared to centralized alternatives.
If we do adopt the change, we can assess whether we have lowered the curation tax too much by watching for the following:
1. We see instances of the attacks that the curation tax was designed to mitigate.
2. Indexers provide feedback that the costs of indexing subgraphs that upgrade too frequently for a given amount of signal are excessive.
In some ways, #2 actually seems self-correcting: I would expect Indexers to simply raise the threshold of signal and query fees they require of a subgraph before indexing it, which in turn would again increase the total costs to subgraph developers.