WalletWasabi: Optimizing Anonymity Gain By Optimizing Timeouts
Background
There’s a constant 2h timeout in Wasabi; however, this could be made dynamic and optimized for anonymity gain.
Quantifying Anonymity Gains
Take a round where 3 peers participated. All 3 got back 1BTC mixed coins, and 2 of them also got back 2BTC mixed coins. In this case we quantify the quality of this coinjoin as `3*3*1 + 2*2*2`, as 3 peers gained an anonymity set of 3 on 1BTC and 2 peers gained an anonymity set of 2 on 2BTC. Let’s call this number the Wasabi Coinjoin Quality: WCQ. In this case WCQ = 17. The higher the WCQ is for a given coinjoin, the better it is.
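A minimal sketch of how this metric could be computed, assuming WCQ is the sum over each group of equal-valued mixed outputs of peers × anonymity set × value (illustrative Python; the `wcq` helper is hypothetical, not Wasabi’s actual code):

```python
from collections import Counter

def wcq(mixed_output_values_btc):
    # Hypothetical helper: every group of n equal-valued outputs gives each of
    # its n peers an anonymity set of n on that value, so the group
    # contributes n * n * value to the total.
    groups = Counter(mixed_output_values_btc)
    return sum(n * n * value for value, n in groups.items())

# The example above: three 1 BTC outputs and two 2 BTC outputs.
print(wcq([1, 1, 1, 2, 2]))  # 3*3*1 + 2*2*2 = 17
```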
Optimizing Timeouts
The idea is to aim for the highest possible WCQ by dynamically adjusting the timeout. After every coinjoin, we calculate WCQ and adjust the timeout by 1m, depending on the direction WCQ moved (a sketch of this rule follows the examples below).
Examples
- Start server.
- Coinjoin 1 happened with WCQ = 10 and timeout = 120m. Lower timeout by 1m.
- Coinjoin 2 happened with WCQ = 11 and timeout = 119m. Lower timeout by 1m.
- Coinjoin 3 happened with WCQ = 10 and timeout = 118m. Increase timeout by 1m.
- Coinjoin 4 happened with WCQ = 11 and timeout = 119m.
- …
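A minimal sketch of this adjustment rule, assuming the server keeps moving the timeout in the same direction while WCQ does not drop and reverses direction when it does; the step size, starting direction and lower bound are illustrative assumptions, not the proposed server code:

```python
def adjust_timeout(timeout_m, direction_m, prev_wcq, new_wcq, min_timeout_m=10):
    # Simple hill climbing: keep the same direction while WCQ does not drop,
    # reverse the direction when it does.
    if prev_wcq is not None and new_wcq < prev_wcq:
        direction_m = -direction_m
    return max(min_timeout_m, timeout_m + direction_m), direction_m

# Replaying the examples above, starting at 120m and lowering first:
timeout_m, direction_m, prev_wcq = 120, -1, None
for observed_wcq in [10, 11, 10, 11]:
    timeout_m, direction_m = adjust_timeout(timeout_m, direction_m, prev_wcq, observed_wcq)
    prev_wcq = observed_wcq
    print(timeout_m)  # 119, 118, 119, 120 -> timeout used for the next round
```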
Notes
- This is a purely server-side change, so it does not require client updates.
- With this algorithm we can make sure to deliver the greatest value to our users within our current architecture, as the timeouts would slowly converge towards the optimal value.
About this issue
- Original URL
- State: closed
- Created 4 years ago
- Reactions: 4
- Comments: 29 (1 by maintainers)
The trend is turning around.
Lowballing and increasing as needed is cheaper than overshooting, and we can fail at overshooting and end up in the same situation. It only seems ideal because it is easier to code.
On Mon., Jan. 13, 2020, 11:41 Aviv Milner, notifications@github.com wrote:
These are the daily incomes since 1 Jan:
| Day (Jan) | Income |
| --- | --- |
| 1 | 0.3429381 |
| 2 | 0.34836114 |
| 3 | 0.34360934 |
| 4 | 0.37555536 |
| 5 | 0.33264889 |
| 6 | 0.29230762 |
| 7 | 0.39711304 |
| 8 | 0.44784824 |
| 9 | 0.40725297 |
| 10 | 0.42587476 |
| 11 | 0.32236444 |
| 12 | 0.35203404 |
When we started the timeframe decrease there was an obvious jump, but the last two days show a slight drop. IMO this timeframe is still too short to base statistics on, so let’s wait a bit more and evaluate on 27 Jan as discussed previously.
Final Results
[Charts: Total WCQ and Total Income, plus the underlying data; not reproduced here.]
Sure, we set the 27th of January. I’m just leaving some partial reports here.
IMO, as we discussed earlier, we need to give it a longer timescale to have better statistics.
The automatic timeout proved to be incorrect, as there’s no drastic jump or drop caused by cutting the timeout in half, so however we mess with the timeouts, it would be just noise (see @lontivero’s argument on control theory: https://github.com/zkSNACKs/WalletWasabi/issues/2940#issuecomment-571727036).
A bit longer timeframe into the past.
So far, here are the 24h WCQ stats since we took the timeout down from 2h to 1h:
Note, the only thing that matters here is the WCQ Sum; the rest of the stats are just interesting.

Because it is a function of two variables, with a multiplication in the numerator (wcq) and a divisor (t), there are two ways to maximize it (make it grow without bound): make wcq arbitrarily large, or make t arbitrarily close to zero.
Given that it is easier to make wcq/t grow by decreasing t close to zero than by making the number of participants close to infinite (we control the interval but not the number of participants), that is why I said that.

Of course we don’t need to keep one variable constant; the optimum would be to decrease the interval as close to zero as possible while the number of participants also increases (a dream).

What we don’t know is whether those two variables are really independent. I mean, we don’t know if a reduction in the interval can cause a drop in users’ participation, because in that case we would have to discover the optimal combination of time and wcq.

IMO it doesn’t matter how long we wait for 100 participants: 10 minutes ago there were 79 waiting users and now there are 72, so two hours, three hours or more increase the “probability” of joining 100 people but decrease the real wcq that Wasabi provides at the end of the day.
This is why I think that if this algorithm works as expected we will see a reduction in the timeouts.
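A toy illustration of this trade-off. The `peers(t)` participation model below is entirely made up for the sake of the example (it only encodes “a longer timeout gathers more peers, with diminishing returns”), as is the single 0.1 BTC denomination; the point is just that whether a shorter interval wins depends on the unknown relation between timeout and participation:

```python
def peers(timeout_m):
    # Made-up participation model, not based on real data.
    return min(100, int(20 * (timeout_m / 30) ** 0.5))

def daily_wcq(timeout_m, denom_btc=0.1):
    n = peers(timeout_m)
    wcq_per_round = n * n * denom_btc      # one equal-value group, per the definition above
    rounds_per_day = 24 * 60 / timeout_m
    return wcq_per_round * rounds_per_day

for t in (30, 60, 120, 180):
    print(t, peers(t), round(daily_wcq(t)))
```

With this particular made-up curve the daily total comes out nearly flat across timeouts, which is exactly why real data over a longer window is needed before concluding anything.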
Notes:
What you want to do is known as control theory, and your suggested implementation is a variant of proportional control. In proportional control the next value of the variable (in this case, the timeout) is the current value multiplied by a factor, for example:
next_time_out = current_time_out * factor
The problem with this is that the algorithm has no memory and therefore oscillates, meaning that it doesn’t converge smoothly to the optimal value. That’s why the Bitcoin difficulty adjustment algorithm uses the expected number of blocks over two weeks and not simply how long it took to mine the latest block.
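A hedged sketch of the kind of memory being pointed at here: adjust from the average WCQ of a window of recent rounds rather than the single latest one, loosely analogous to Bitcoin retargeting over a 2016-block window. The window size, step and bounds are arbitrary illustration values, not a proposal from the thread:

```python
from collections import deque

class SmoothedTimeoutController:
    """Adjust the timeout from the average WCQ of the last N rounds.
    All constants are illustrative assumptions."""
    def __init__(self, timeout_m=120, window=10, step_m=1, min_timeout_m=10):
        self.timeout_m = timeout_m
        self.window = deque(maxlen=window)
        self.step_m = step_m
        self.min_timeout_m = min_timeout_m
        self.direction = -1                  # start by lowering, as proposed
        self.prev_avg = None

    def on_round_finished(self, wcq):
        self.window.append(wcq)
        if len(self.window) < self.window.maxlen:
            return self.timeout_m            # not enough history yet
        avg = sum(self.window) / len(self.window)
        if self.prev_avg is not None and avg < self.prev_avg:
            self.direction = -self.direction  # smoothed WCQ dropped: reverse
        self.prev_avg = avg
        self.timeout_m = max(self.min_timeout_m,
                             self.timeout_m + self.direction * self.step_m)
        return self.timeout_m
```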
That’s right - the optimal solution will likely not have t -> 0. But I have a different concern.

How do you know you are quantifying the value of a CoinJoin accurately? You could simplify it by simply considering the number of participants, or the sheer volume of the mix. Or you could simplify it to be the fees earned by the coordinator (which is closest to what Adam has right now).

Why do you think you should lower the timeout by 1 minute if the WCQ goes up, and vice versa? How are these things related at all? There doesn’t seem to be clear reasoning behind it.

Let me offer another idea entirely: you have a 120 minute countdown. As long as there are 40 or fewer participants registered, this countdown simply continues. For every person over 40 who registers, the server subtracts 1 minute from the countdown.
If we have 100 participants, the Coinjoin will happen 1 hour earlier. If we have 70 participants, it will happen 30 minutes earlier and so forth.
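A minimal sketch of this alternative (the 40-participant threshold, the 120-minute base and the 1 minute per extra registrant come from the comment above; the function name and everything else is an illustrative assumption):

```python
def remaining_countdown_m(registered_peers, base_countdown_m=120, threshold=40):
    # The countdown shortens by 1 minute for every registrant above the threshold.
    extra = max(0, registered_peers - threshold)
    return max(0, base_countdown_m - extra)

print(remaining_countdown_m(100))  # 60 -> the coinjoin happens 1 hour earlier
print(remaining_countdown_m(70))   # 90 -> 30 minutes earlier
```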
I am not sure I understand the last comment, so probably I will say the same as you. I think the idea should be to optimize anonymity per unit of time (wcq/t). That would be a good thing because we could provide more anonymity to users per day, so those that keep mixing would speed up the process.

The problem is that the easiest way to increase that number is by reducing the timeout closer to zero, and that can be done only if we agree on having more coinjoins with fewer participants.