GIP-0058: Replacing Bonding Curves with Indexing Fees

The cost of the average indexer is largely fixed rather than based on compute units. This is because the main cost lies in operating archive nodes or a firehose, which constantly grow in size, and the number of supported networks keeps growing as well.

Regarding the technical aspects, the proposal introduces the ‘Unit of Work’ as a central concept. While promising, the details matter. Currently, it seems the unit of work only accounts for tasks carried out by graph-node, overlooking other essential tasks like running RPC nodes and maintaining a firehose block set.

@czarly and @ellipfra, you seem to be raising the same issue. I’d like to dig a little deeper so I understand better. But you are right that this only considers the cost on the graph-node side.

The way I think of it is that the Archive/Firehose is a fixed cost that indexers incur before indexing any subgraph (assuming the indexer is running their own firehose/archive)… The actual indexing then incurs costs at the graph-node level. But realistically, the graph-node setup probably runs on some fixed infrastructure, so indexing just takes up capacity of that hardware (disk, compute, etc.). So from an indexer’s perspective, they need to somehow “amortize” that fixed cost across all subgraphs and “charge” for the capacity on Graph Node. Maybe the amortization of the fixed cost is based on graph-node usage, maybe not, but allowing indexers to price based on graph-node usage enables new pricing strategies for indexers to amortize their fixed costs.

Another interpretation in the current regime is that graph-node usage/work should be roughly correlated with time to sync, which is how long an indexer must wait to collect indexing rewards. So if an indexer can charge more based on the time to sync, it can be compensated for its time.
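To make the amortization idea concrete, here is a rough sketch of how an indexer might price a subgraph under a unit-of-work model. Everything here is hypothetical (the names, the amortization key, and the numbers are mine, not from the GIP); it's just one way to combine a fixed-cost share, a variable graph-node charge, and compensation for sync time:

```ts
// Illustrative only: one hypothetical way an indexer might price a subgraph under a
// unit-of-work model. None of these names or numbers come from the GIP.

interface SubgraphEstimate {
  unitsOfWork: number; // estimated graph-node work to sync and keep the subgraph current
  syncDays: number;    // estimated time to sync, during which no fees are collected
}

const MONTHLY_FIXED_COST_GRT = 50_000;    // archive/firehose + baseline hardware, paid regardless
const VARIABLE_COST_PER_UNIT = 0.002;     // marginal graph-node cost per unit of work
const OPPORTUNITY_COST_PER_SYNC_DAY = 10; // compensation for waiting out the sync
const TARGET_MARGIN = 0.15;

function priceSubgraph(s: SubgraphEstimate, all: SubgraphEstimate[]): number {
  // Amortize the fixed cost in proportion to each subgraph's share of total work.
  // Other amortization keys (equal split, disk footprint, sync time) are just as plausible.
  const totalUnits = all.reduce((sum, x) => sum + x.unitsOfWork, 0);
  const fixedShare = MONTHLY_FIXED_COST_GRT * (s.unitsOfWork / totalUnits);

  const variable = s.unitsOfWork * VARIABLE_COST_PER_UNIT;
  const syncCompensation = s.syncDays * OPPORTUNITY_COST_PER_SYNC_DAY;

  return (fixedShare + variable + syncCompensation) * (1 + TARGET_MARGIN);
}
```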

I am interested in your general thoughts and opinions but please correct me where my understanding is not correct.

1 Like

The current indexing rewards model is indeed effective in addressing fixed costs. Transitioning to a usage-based cost model could pose challenges for indexers operating on bare metal, given their substantial capital outlays and consistent fixed costs. This could also be a hurdle for emerging indexers. While amortizing fixed costs based on usage or variable revenues is a standard approach to resource allocation, The Graph has done an admirable job with the current system, so moving away from it might be regrettable. (To clarify, my understanding of Horizon is solely based on guesswork, as I haven’t seen anything about it yet.)

Let me stress that it’s important to consider the varied operations that occur outside the graph-node. For instance, certain RPC queries require significant computational resources. Also, substreams-powered subgraphs are CPU-intensive during substreams processing but demand less from the graph-node. These nuances should be factored in when evaluating the workload.

2 Likes

Wondering if you can expand a bit. Is it because an indexer can roughly calculate how much they will earn via indexing rewards with their budget of stake and then determine whether it covers the fixed cost?

1 Like

Yes, the current indexing rewards provide a level of predictability (excluding market volatility), enabling indexers to budget for capex (such as acquiring servers and storage) and make medium to long-term commitments (like salaries, service contracts, and rentals). This allows them to make projections, decide which chains to support, determine subgraph coverage, and plan investments. Ideally, any system that replaces indexing rewards should offer a similar level of revenue predictability. Otherwise, there’s a risk of pushing towards on-demand, cloud-based deployments. While these might seem advantageous in the short term, they come with centralization drawbacks and could prove to be more expensive for The Graph in the long run.

Again, I’d like to add a caveat that I’m indirectly criticizing a GIP that has yet to be published.

2 Likes

I love the fact there is so much discussion on this topic. I wonder if we should have a session on this during indexer office hours?

3 Likes

This is very helpful.

This (very preliminary) GIP post was inspired by the current problems with curation. Specifically, the inability of payments (indexing rewards in this case) to be predictably directed toward the subgraphs that demand indexing services.

Now, I see that from an indexer’s perspective, choosing which subgraphs to index is only one (small) way rewards impact an indexer’s business model. Instead, indexing rewards are used for capital budgeting decisions.

So, my new perspective is that if we want to replace curation (with direct payments or some other mechanism), it would be better for the indexing community if we maintained low uncertainty in the capital budgeting decisions.

Let me know if that doesn’t track but I think that’s the takeaway.

2 Likes

This shouldn’t be a problem. I’d like to also keep the discussion going here for documentation purposes but I think IOH is another good place to keep discussing. I’ll try to set something up, though it might not be immediate.

2 Likes

From the proposal itself and from the discussions, I see that there is a perception that the new system would decrease centralization by hurting big indexers and incentivizing active ones.

I just want to draw everyone’s attention to the fact that if data consumers have the ability to choose indexers, they will choose:

  • Foundation members’ indexers, as they are by default the most trustworthy (if I could choose between buying a service from its creator or from some 3rd party, I would choose the creator, because they know all the nuances of the product)
  • Big indexers yet again, simply because big indexers have big BD teams and offer multiple services apart from The Graph, and any deal can be arranged
  • And only then will they turn to active indexers and to the open market

So, overall, this proposal will only lead to the suffering of small indexers, as they will have to work harder to improve their QoS and do not have as much room to lower costs as the big players in the market.

Also, I do not see how this proposal helps new indexers join the network. They won’t be selected by consumers, they won’t be able to compete on QoS, and it seems they won’t be able to predict their possible income. With this proposal I can easily see a situation where an indexer loses out in all the selection algorithms and abandons the network afterwards. So, we might end up with only around 30-40 indexers left.

Does The Graph still want to decrease the inflation from 3% or not? If yes, then what role will staking & delegation as a whole play when inflation is, let’s say, 0.5% and most of the revenue is distributed via indexing fees? It seems that in this situation all delegators and extra stake will become irrelevant. The staking part and delegation as a whole are a bit of a mystery; I’m keen to hear about that.

Overall, I’m waiting to hear what the whole of Horizon will look like; maybe because we are currently exposed to only part of Horizon, we can’t appreciate its full brilliance. I just want to highlight that in general everyone cherishes predictability, and when the whole tokenomics are being changed, everyone is just scared, so we ask for more and more clarifications.

Anyhow, thanks for your work! Through dialog & discussions we will polish this proposal to a state where every participant, including potential new ones and solo indexers, can benefit!

3 Likes

Thank you for your thoughts. These are insightful.

One thing to note in the proposal that is not as obvious as it should be is that we actually don’t expect the consumer to do the choosing but to defer to an automated algorithm (similar to the current ISA). Now maybe that automated algorithm also favors big indexers, but perhaps it can be tweaked so that it doesn’t. We suspect that most people who come to The Graph value the benefits of decentralization, so they would be willing to pay a little bit more to an automated algorithm that provides decentralization benefits versus simply choosing the biggest, cheapest provider.
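As a rough illustration of what such tweaking could look like (the fields, weights, and scoring are purely hypothetical, not the actual ISA), a selection algorithm might blend price and QoS with a decentralization term so it doesn’t always route work to the largest or cheapest indexers:

```ts
// Illustrative sketch of how an ISA-like selection algorithm could be weighted away from
// "biggest and cheapest". Every field and weight here is hypothetical.

interface IndexerQuote {
  indexer: string;
  priceGrt: number;   // quoted price for the indexing job
  qos: number;        // 0..1, e.g. historical uptime / latency score
  stakeShare: number; // 0..1, indexer's share of total network stake
}

function score(q: IndexerQuote, cheapestPrice: number): number {
  const priceScore = cheapestPrice / q.priceGrt;  // 1.0 for the cheapest quote
  const decentralizationScore = 1 - q.stakeShare; // penalize stake concentration
  // The weights are a policy choice; tuning them is how the algorithm could avoid
  // always favoring the largest or cheapest indexers.
  return 0.4 * priceScore + 0.4 * q.qos + 0.2 * decentralizationScore;
}

function selectIndexers(quotes: IndexerQuote[], n: number): IndexerQuote[] {
  const cheapest = Math.min(...quotes.map((q) => q.priceGrt));
  return [...quotes].sort((a, b) => score(b, cheapest) - score(a, cheapest)).slice(0, n);
}
```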

1 Like

Every dApp is a business

And these days a financially struggling one: most dApps do not earn more revenue than their expenses and live only on investors’ money.

So, they will pick the 3 cheapest or most trusted indexers, and that is already decentralized enough to satisfy any beliefs.

1 Like

I think “determine subgraph coverage” kind of gets at this but I’d like to do a litmus test. In your opinion (knee jerk reaction, nothing is gospel), if we were to keep the current indexing rewards mechanism but also implement direct indexer payments, do you think an indexer would index a subgraph only for direct payments but no rewards on that subgraph (though they would be accruing indexing rewards elsewhere)?

The logic behind the question is this: If indexing rewards from other subgraphs are being used to cover capex, then the current curation system covers that uncertainty. A new subgraph that is willing to pay, say, 150 GRT per month to be indexed would be a bonus for the indexer, even if the indexer couldn’t collect indexing rewards by allocating to that subgraph.
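Put another way, the decision reduces to a marginal comparison. A tiny sketch of that framing (all numbers are made up for illustration):

```ts
// A rough way to frame the litmus test: if capex is already covered by indexing rewards
// earned on other subgraphs, a rewards-less subgraph only needs to beat its marginal cost.
// All numbers are made up for illustration.

const directPaymentGrtPerMonth = 150; // what the new subgraph offers to pay
const marginalCostGrtPerMonth = 90;   // extra disk/compute/ops for this one subgraph
const capexCoveredByRewards = true;   // rewards from other allocations pay the fixed costs

const worthIndexing = capexCoveredByRewards
  ? directPaymentGrtPerMonth > marginalCostGrtPerMonth
  : false; // otherwise the payment would also need to contribute to the fixed costs
```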

1 Like

Very much agree. Thank you.

This of course puts The Graph in a tough spot. We need to provide 3 indexers that are cheaper than 3 centralized competitors, but we can’t pick the 3 cheapest on the network.

1 Like

Moreover, we, as an indexer, can also be an investor in some project. Thus, we might be more than happy to provide free queries and free service to this project. And we will still benefit from it, because the project will grow and our investment will pay off better.

Other indexers do not have an investment fund of their own and cannot get this “indirect benefit”.

But with these actions we will hurt The Graph’s system and other, smaller indexers. The Graph itself won’t get anything, but we will.

Even if we don’t do it, out of our love for The Graph, others will, because the opportunity will be there. Just don’t forget that by introducing this Direct Indexer Payments initiative, The Graph itself could lose a source of revenue if consumers start to pay outside the system rather than through it. Even if the DIP is not 0, it could end up close to 0, and to avoid that, some threshold should be established.

2 Likes

That’s an interesting use case of The Graph (I’m referring to providing free queries because of externalities) and one I haven’t thought through. Thanks for bringing it up.

Just to be clear on one point though, “The Graph” does not earn revenue. I think you mean that the amount of payments flowing through The Graph might go down, but The Graph does not earn revenue by, for example, taking a cut of transactions.

2 Likes

Yeah, when I say “The Graph” I mean Indexers, Delegators, Curators and everyone in the system.

My main concern is to avoid introduction of loopholes with this proposal that can be exploited by some actors, even if those actors are big indexers, foundation indexers, or others.

The purpose of a decentralized system is not simply to be open by default, but to have tokenomics where a small entity can still make revenue, even by running a node at home, without having to compete with industry giants with big BD departments.

1 Like

Understood. Thank you for clarifying.

1 Like

I wonder how much of a goal this actually is? I’ve talked to many different people who hold very different views on this.

a) anyone with a small node should be able to participate (like a validator… set it up and forget it, except a software upgrade every few months)
b) [intermediate option]
c) [intermediate option]
d) indexing is a serious business. It is a full-time job to be an indexer
e) as we keep adding more features (GIP-0042 subgraphs, substreams, rpc, sql, etc) indexing is only practical for larger teams
f) some/all of the above

2 Likes

@mdarwin the key idea is that the introduction of this initiative might help not small entities, but small indexers with big entities behind them, like Blockdaemon, Allnodes, Kiln, InfStones.

Those are huge entities throughout the web3 space, but right now they cannot compete with Data Nexus, Pinax, Wavefive, Stakesquid or Elipfra in The Graph.

But this initiative in its current form potentially creates a big market, bigger than the staking part, where those huge entities like Blockdaemon & Allnodes will compete.

The Graph might become not a target, but a weapon.

Those huge entities, P2P included, might use it in bundle offers alongside RPC nodes, infrastructure management, ZK proof generation, audits, code support and many, many other services. And the DIP can be 0 in a package like this.

So, right now there is a kind of balance, where smaller entities can thrive to some extent, but we fear that new non-stake-based opportunities will open the door to those industry giants, and the competition will become more severe.

2 Likes

Hey all, just want to say that I really appreciate this discussion. Just want to reiterate that no decisions have been made when it comes to the protocol. The E&N Protocol Econ team has done some great research and is proposing some designs they’re considering, but this protocol is governed by the community. Robust input from all stakeholder groups is critical to any evolution in the protocol. Proposed changes could impact different participants differently, so it’s important to hear from lots of different participants so we can understand how changes might affect them and the health of the community as a whole. Thanks for taking the time to share your input! Keep it coming.

5 Likes

Hi! I’m Zac at Edge & Node, and I wrote the original Horizon proposal. It is my fault we’ve been silent about it because I’ve taken time to deal with a family emergency. I’m back now and will actively work toward publicly communicating about Horizon. I know there has been much uncertainty and angst. But Graph Horizon has given many renewed hope for the protocol’s future. There’s been more excitement and collaboration across the core devs lately than ever before. I’m thrilled to begin sharing these ideas with you finally.

Since launching the original protocol, we’ve learned a lot about our users’ needs and what steps the protocol must take to be the de facto standard for decentralized access to the world’s public data. The design of Graph Horizon aims to rebuild The Graph from the ground up using first principles. It solves problems like:

  • Reliance on oracles and other forms of centralized governance
  • Bureaucratic processes for integrating new data services and products into the protocol
  • Permissioned roles, like the arbitrator, council, and gateway
  • Inefficient tokenomics that punish people for using the protocol by burning their tokens
  • Confusing and intractable UX for users
  • Security holes and economic attacks
  • Rewarding lazy whales at the expense of those who provide value to consumers
  • Unencapsulated complexity that makes it difficult to evolve the protocol or publish MVP products without breaking everything
  • Incentives to disintermediate the protocol
  • Unscalable mechanisms and high per-subgraph overhead
  • Failure to find product market fit
  • And more

Part of the hesitance in talking about Horizon stems from the fact that most of the sales pitch is “It’s The Graph I thought we were trying to build, without the problems.” I don’t know if there is a way to sell you on it without tearing apart the current iteration of the protocol and exposing its fundamental issues publicly. But, we understand that to gain the community’s support necessary to improve the protocol, we will need to be radically transparent about the current state of the protocol.

So, that’s what I will do - starting with curation. Tomorrow I will begin writing about the problem curation attempts to address, why it fails, and how we can design a better system. The opinions will be mine and mine alone, not representative of Edge & Node nor The Graph (the latter is not an entity and has no opinion). Buckle up.

5 Likes