Diff Adjustment (Potential Design/Tradeoffs)

This is a thread to start a community discussion, and potentially also to educate about diff adjustments in general.

7 Likes

I'd like to hear the opinion of someone knowledgeable about the ASERT Difficulty Adjustment Algorithm (aserti3-2d), which BCH implemented after major issues with their previous DAA.

This is the original proposal, I believe; it includes a lot of commentary and tests against other DAAs:

Further info:

To be clear, I'm not convinced changing Ergo's DAA is worth a hard fork. These links show how complex implementing such a change would be.
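For reference, the core of ASERT is a single exponential formula: the target drifts by a factor of two for every halflife the chain runs ahead of or behind schedule. Below is a floating-point sketch; the real aserti3-2d implementation approximates 2^x with fixed-point integer math, and the parameter values shown are BCH's (600 s spacing, 2-day halflife), used purely as an illustration.

```python
# Floating-point sketch of the ASERT schedule. The production aserti3-2d
# algorithm uses a cubic fixed-point approximation of 2^x; parameters here
# are BCH's and are illustrative only.

IDEAL_BLOCK_TIME = 600      # seconds (BCH target spacing)
HALFLIFE = 2 * 24 * 3600    # 2 days: drifting one halflife doubles/halves the target

def asert_target(anchor_target: float, time_delta: int, height_delta: int) -> float:
    """Target for the next block, given elapsed time and blocks since an anchor block."""
    exponent = (time_delta - IDEAL_BLOCK_TIME * height_delta) / HALFLIFE
    return anchor_target * 2.0 ** exponent

# Blocks arriving exactly on schedule leave the target unchanged;
# running a full halflife behind schedule doubles the target (halves difficulty).
on_schedule = asert_target(1000.0, 600 * 100, 100)
behind = asert_target(1000.0, 600 * 100 + HALFLIFE, 100)
```

Because the adjustment depends only on an anchor block and the current time/height, it retargets every block with no window to game, which is the property BCH was after.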

4 Likes

Adjust difficulty based on average block times between "super blocks." I was reading up on NiPoPoWs and it made me think of this idea. This would randomize when difficulty adjustments occur. Adjustments would happen more quickly when hash rate spikes, since a "super block" becomes more likely to be hit sooner; when hash rate drops, it would take longer to bring block times back down. Similar to how things work now, but adjusting more quickly and with added randomness, which would make the system harder to game. Thoughts?

4 Likes

Superblocks happen too infrequently; if difficulty is adjusted high (as in the current situation), this would greatly lengthen the time before it's adjusted back down.

Add to that the fact that block times can end up at multiples of the target (as in the current situation), and you'd only be making a bad situation worse.

1 Like

Could use Dark Gravity Wave.

3 Likes

There are different types of "super blocks" of varying frequency.
Extract from https://eprint.iacr.org/2019/1444.pdf: "We measured the superblock distribution in the mainnet Bitcoin blockchain. Our results are illustrated in Figure 1. As expected, half the blockchain blocks are 1-superblocks, 1/4 of blocks are 2-superblocks and generally approximately 2^−µ of the blockchain blocks are µ-superblocks."

So we could potentially use 4-superblocks (one every 16 blocks on average), or something that occurs more or less frequently, but at random.

6 Likes

Hmmmm, interesting idea. Thanks for the follow up!

I'm not sure why, but I think Kushti is against difficulty retargeting every block. If we understood his reasoning, it could help generate ideas.

2 Likes

:+1:

Honestly, with the amount of GPU hash floating around after the Merge, it makes sense to have the difficulty adjust based on the most recent blocks.

Miners will be swapping to the most profitable algo no doubt.

A large enough sample of blocks is needed imo to slow the rate of change down, based on recent block times, toward the desired average.

After all, that is what is desired: a stable, regular block interval of 2 minutes.

Hence my suggestion of Dark Gravity Wave with a lookback over, say, the previous 100 blocks (200 minutes), adjusting the difficulty up or down as needed. The lookback window can of course be tuned to whatever parameters are preferred.
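A DGW-style retarget of the kind suggested can be sketched as an average of recent difficulties scaled by how fast blocks actually arrived. This is a simplification (real DGW uses a 24-block window and integer target arithmetic); the 120 s spacing and 100-block window follow the suggestion above.

```python
# Simplified DGW-style retarget: average difficulty over the window,
# scaled by actual vs expected elapsed time, with the timespan clamped
# to a 3x band (as DGW does) to damp timestamp games.

TARGET_SPACING = 120   # seconds (Ergo's 2-minute target)
WINDOW = 100           # lookback, ~200 minutes, per the suggestion above

def dgw_next_difficulty(difficulties, timestamps):
    """difficulties: last WINDOW block difficulties; timestamps: WINDOW+1 boundary times."""
    assert len(difficulties) == WINDOW and len(timestamps) == WINDOW + 1
    avg_diff = sum(difficulties) / WINDOW
    actual_span = timestamps[-1] - timestamps[0]
    expected_span = TARGET_SPACING * WINDOW
    actual_span = max(expected_span // 3, min(actual_span, expected_span * 3))
    return avg_diff * expected_span / actual_span

# Blocks on schedule leave difficulty unchanged; blocks coming twice as
# fast (60 s spacing) double it.
same = dgw_next_difficulty([1000.0] * 100, [i * 120 for i in range(101)])
doubled = dgw_next_difficulty([1000.0] * 100, [i * 60 for i in range(101)])
```

The window length trades responsiveness against noise: a shorter window reacts faster to hash hopping but is easier to swing with a few lucky or manipulated timestamps.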

Ravencoin adopted DGW with a 180-block lookback early in their lifecycle, when it became GPU-mineable and profit-switching miners were causing difficulty spikes on the RVN network. I know. I was there for it.

Their difficulty and hash follow each other pretty nicely now. Just offering suggestions.

I'd be interested to know why per-block difficulty retargeting is not on the table as well, technically. :pray:

3 Likes

Would it be possible to have a softfork that sends a percentage of the block rewards to a box when block time is under a certain threshold? The box empties again during slower blocks. It would help smooth out the overpayment of ERG in the first part of a sudden extreme hashrate spike and save some for later, to help keep miners around. My guess is that something like that would be doable by softfork, since the diff adjustment itself remains untouched.

4 Likes

This is similar to my thinking, but we donā€™t require any forks, just smartpools.

We can create smartpools that include a slush fund that pays out to miners when a sharp and sustained increase in difficulty is detected on the network.

The slush fund can be funded by either or both of: a constant fee on the mining pool, say 2-3% in addition to the dev fee; and an opposite rule when the difficulty drops quickly (say 20-50% of the block reward).

This, combined with a PPLNS reward model (or at least one that can track loyalty over a long period of time, pre-difficulty-increase), will enable miners to see smoother rewards as the difficulty rises and falls WITHOUT decreasing the security of the network or removing voting power from miners.

2 Likes

Yes, and the smart pool would have to track each miner, as the extra rewards paid during longer block times should only go to those miners that contributed during the shorter block time periods.

The above being said, I don't think such a pool will attract miners, as there is a lot more to gain by just hopping on when difficulty is low and hopping off after a large difficulty increase, either to save electricity or to mine something else.

2 Likes

Such a pool would attract loyal miners, the ones with a long term focus that are not chasing short term profits but the ethos of the Ergo Manifesto.

2 Likes

I think this could potentially work; we just need to figure out what percentage (fixed or variable) to use and how to apply it. A potential problem I foresee is that it may force constant or near-constant difficulty, which wouldn't necessarily be a good thing, as it would facilitate 51% attacks.

Variable: percentage determined by "distance" from the ideal 120-second block time. Example: if hashrate increases and block times decrease by 12 seconds, adjust the reward sent to the box by 10% (with a 48 ERG mining reward, send 4.8 ERG to the re-distribution box). If hashrate decreases and block times increase by 12 seconds, pay out from the box to increase rewards by 10% (with a 48 ERG mining reward, send an additional 4.8 ERG).

This would essentially remove any incentive for the additional hashrate… which would keep difficulty constant. It would facilitate a 51% attack.
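The variable scheme above amounts to a linear rule: the fraction diverted to (or paid from) the box is proportional to the block time's deviation from the 120 s target. A minimal sketch, using the numbers from the example:

```python
# Sketch of the variable redistribution rule: the share moved to (or paid
# from) the box scales linearly with deviation from the 120 s target.
# A 12 s deviation is 10%, matching the example (4.8 of a 48 ERG reward).

TARGET = 120.0  # seconds

def redistribution(block_time: float, reward: float):
    """Returns (miner_payout, box_delta); positive delta flows into the box."""
    fraction = (TARGET - block_time) / TARGET   # +10% at 108 s, -10% at 132 s
    box_delta = reward * fraction
    return reward - box_delta, box_delta

miner, to_box = redistribution(108.0, 48.0)     # fast block: 43.2 to miner, 4.8 to box
miner2, from_box = redistribution(132.0, 48.0)  # slow block: 52.8 to miner, box pays 4.8
```

Note the rule is exactly what makes it dangerous: it flattens the payout curve, so extra hashrate earns nothing extra, which is the constant-difficulty / 51% concern raised above.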

Fixed: probably better than variable, but what thresholds would be used? Example: if block times are less than 100 seconds, send 10% of the reward to the re-distribution box; if block times are greater than 140 seconds, pay out an extra 10% from the re-distribution box. This could dampen or partially mitigate the problem but would not resolve it. Once the re-distribution box is empty, then what?
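The fixed-threshold variant is a step function rather than a linear rule, and it makes the "box runs dry" problem explicit: the bonus is capped by whatever the box still holds. A sketch with the thresholds from the example (100 s / 140 s, flat 10%):

```python
# Sketch of the fixed-threshold variant: a flat 10% moves into the box
# below 100 s and out above 140 s, capped by the box balance. Thresholds
# and rate are the illustrative numbers from the example above.

def fixed_step(block_time: float, reward: float, box_balance: float):
    """Returns (miner_payout, new_box_balance) for one block."""
    if block_time < 100:
        cut = 0.10 * reward
        return reward - cut, box_balance + cut
    if block_time > 140:
        bonus = min(0.10 * reward, box_balance)   # box can run dry, as noted above
        return reward + bonus, box_balance - bonus
    return reward, box_balance

m1, b1 = fixed_step(90, 48.0, 0.0)    # fast block: 43.2 to miner, box grows to 4.8
m2, b2 = fixed_step(150, 48.0, 2.0)   # slow block, nearly-empty box: only 2.0 extra paid
```

The dead zone between the thresholds also leaves normal variance untaxed, which is why a fixed rule distorts incentives less than the fully variable one.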

1 Like

A redistribution box need never empty; it could award miners a percentage of its remaining ERG. Mining pools can charge a higher fee when block time is lower than target (on a scale, as you suggest in the variable example) and then repay it to those original miners when block time is higher than target.
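The "never empties" point is just geometric decay: paying out a fixed fraction of the remaining balance each slow block shrinks the box toward zero without ever reaching it. A minimal sketch (the 5% rate is an illustrative assumption):

```python
# Paying a fixed fraction of the remaining balance each payout decays the
# box geometrically: after n payouts, balance = initial * (1 - rate)**n,
# which is always positive. Rate of 5% is illustrative.

balance = 1000.0
for _ in range(100):
    pay = 0.05 * balance
    balance -= pay
# After 100 payouts: 1000 * 0.95**100, roughly 5.9, still positive.
```

This is the same diminishing-payout shape as a fixed-supply emission schedule, which is the comparison made to Bitcoin and EIP 27 further down.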

In a zero-sum situation where every miner pool-hops and coin-hops, sure. But the loyal miners who mine to secure the network, the ones mining now, the ones who can afford to mine and hold: those miners will always have an incentive to increase their hash rate, provided they can afford their electricity.

If we kick the can with a second global re-emission contract including a variable or fixed tax as you describe, we risk miner transaction censorship and another potential hard fork, as we did with EIP 27.

If we avoid soft forks and stick to voluntary smart contract mining pools, miners can choose their risk based on any combination of your variable and fixed solutions with numbers tweaked to fit their needs.

By keeping things voluntary, miners that want to coin-hop for the quickest short-term profit as Ergo's difficulty rises and falls are not disincentivized, and miners that stay loyal to Ergo can see smoother rewards should they choose to participate in a pool offering such a scheme.

If something like this is implemented and the long-haul miners migrate over to pools offering smooth rewards, there will still be profit for coin-hoppers to exploit around our long difficulty adjustment time, until Ergo represents a majority of the potential hashrate in the world.

There will always be an incentive for long-term holders of a fixed-supply asset to increase hashrate and secure the network.

2 Likes

True, but it can approach 0, which would essentially be the same as being empty: diminishing rewards over time.

Diminishing returns from a fixed supply is a very common distribution model in crypto. Bitcoin does it, and Ergo recently adopted it with EIP 27.

The existing difficulty adjustment formula should protect such an emission box from being emptied as long as it is funded by a percentage of every block reward, but I agree that a combination of fixed and variable would be ideal.

Can you give more design detail on this idea? Seems interesting and maybe useful. Randomness + periodicity = a better solution, I think.

Re-adjustment using super-blocks is NOT safe. The NiPoPoW paper explains ways that an adversary can easily mess with the distribution of super-blocks while still controlling less than 50% of the hashrate. The distribution of super-blocks is only guaranteed under the assumption that all miners are honest. The NiPoPoW paper explains ways to mitigate this effect in the context of proofs, but such techniques would not work in regards to difficulty adjustment.

Smart Pool based diff relief does sound interesting, I will look over some of the ideas posted here and give my own feedback in regards to this after some thought.

6 Likes

The redistribution box system seems like an interesting option, also avoiding HF?

What about the current system? Why was it chosen, how was it tested, what are the key pros? How do the key pros rank in evaluation, like the list mentioned in the conclusion of the BCH DAA article? How about the key cons & evaluation? Regarding the BCH DAA article, DAA system simulation with existing data (especially now with the Merge event last week) seems essential to evaluate how any proposed system change behaves in real-life stress tests.

Sorry for mostly asking questions, just scratching the surface of the topic today.

1 Like