yoroi-frontend: [ergo] Can we fix dust mining issue? Out of memory error

Can we fix the dust mining issue? I have over 300 transactions in one address and now the wallet grinds to a halt. I can't do anything; I can't even send all the assets to a new address.

It looks like it is caused by this error on the console:

ErgoApi::refreshTransactions error: "out of memory"
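For context on why a few hundred records can exhaust memory: if the full history is requested and held in memory in one go on every refresh, usage grows with the whole history rather than with what changed. Below is a minimal sketch of page-by-page fetching under that assumption; the endpoint path, field names, and page size are illustrative and not the actual Yoroi/Ergo backend API.

```ts
// Hypothetical sketch: fetch a large transaction history in fixed-size pages
// and process each page immediately, instead of holding everything at once.
// Endpoint, field names, and page size are illustrative, not Yoroi's real API.

type Tx = { id: string; inclusionHeight: number };

async function fetchHistoryPaged(
  baseUrl: string,
  address: string,
  pageSize = 50,
): Promise<number> {
  let offset = 0;
  let total = 0;
  for (;;) {
    const resp = await fetch(
      `${baseUrl}/transactions/byAddress/${address}?offset=${offset}&limit=${pageSize}`,
    );
    if (!resp.ok) throw new Error(`history request failed: ${resp.status}`);
    const page: Tx[] = await resp.json();
    // Process this page (e.g. persist it) and let it go out of scope, so
    // memory use stays proportional to the page size, not the full history.
    total += page.length;
    if (page.length < pageSize) break; // last page reached
    offset += pageSize;
  }
  return total;
}

// Usage (illustrative):
// fetchHistoryPaged('https://example-ergo-backend', '9f...address')
//   .then(n => console.log(`processed ${n} transactions`));
```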

About this issue

  • State: open
  • Created 3 years ago
  • Comments: 26 (9 by maintainers)

Most upvoted comments

There are more than 900 transaction records in my wallet now, and it is blocked. I tried the browser version and the Android phone version and both had this problem. My computer has 16GB of RAM and it still freezes, with the CPU going up to 100%.

This is a serious problem that you haven't solved for so many days.

Version 4.9.0 is being pushed to production at the moment. It is already available in Firefox, but Chrome takes some time. Once it's available, please check whether the situation is better in any way.

Just to update, I'm still having the out-of-memory issue as of 4.9.1. The issue persists.

Hi 🙂. Will v4.9.0 stop requiring the "read and write data on all sites" permission on Chrome? The browser disables the extension for that reason (unless you choose to accept giving it those very broad permissions).

Version 4.9.0 will not stop requiring the "read and write data on all sites" permission, as it already requires it. You can disable the actual access the browser gives to the extension in its settings after you accept the general permission; see the comment here: https://github.com/Emurgo/yoroi-frontend/issues/2655#issuecomment-1011864879

I am not sure what is going on, but my pool says "unknown pool" when it usually shows my pool's name. I am just sick of losing my $.

Is that somehow related to the Ergo memory issue, or are you just posting on random issues?

We are in the process of reworking the backend API and the local storage model to help resolve this problem. It's a deeper issue: Yoroi was not fundamentally designed with this kind of activity in mind. It is now being updated to be able to handle it, but it's not a one-day fix. The next version, 4.8, which is being prepared at the moment, unfortunately will not have this solved; I can tell you that right away. But version 4.9 after that might have the first step toward better handling of historical data. After that we have a plan for further improvements, but they can only be gradual.
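To illustrate the kind of local-storage rework being described, here is a minimal sketch of incremental sync, assuming a per-address checkpoint and a backend that can filter by height; the storage keys, endpoint, and field names are hypothetical and not the actual Yoroi implementation.

```ts
// Hypothetical sketch of incremental sync: keep a per-address checkpoint in
// local storage and only request transactions newer than it on each refresh,
// so a large existing history is not re-downloaded or held in memory again.
// Storage keys, endpoint, and fields are illustrative, not Yoroi's real code.

type Tx = { id: string; inclusionHeight: number };

const checkpointKey = (address: string) => `sync-height:${address}`;

async function refreshIncrementally(baseUrl: string, address: string): Promise<void> {
  const since = Number(localStorage.getItem(checkpointKey(address)) ?? 0);
  const resp = await fetch(
    `${baseUrl}/transactions/byAddress/${address}?sinceHeight=${since}`,
  );
  if (!resp.ok) throw new Error(`refresh failed: ${resp.status}`);
  const newTxs: Tx[] = await resp.json();

  // Persist only the new records; older ones already live in local storage.
  for (const tx of newTxs) {
    localStorage.setItem(`tx:${address}:${tx.id}`, JSON.stringify(tx));
  }

  // Advance the checkpoint so the next refresh starts where this one ended.
  const maxHeight = newTxs.reduce((h, tx) => Math.max(h, tx.inclusionHeight), since);
  localStorage.setItem(checkpointKey(address), String(maxHeight));
}
```

The point of the design is that each refresh does an amount of work bounded by what is new since the last sync, rather than by the total size of the wallet's history.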