atuin: Unable to sync on new devices

I have the same problem as Issue #362.

That is, I am unable to sync atuin on a new device. I have the error message

thread 'main' panicked at 'failed to decrypt history! check your key: failed to open secretbox - invalid key?

Location:
    atuin-client/src/encryption.rs:97:22', /build/source/atuin-client/src/api_client.rs:143:45
    note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
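The "failed to open secretbox - invalid key?" wording reflects how authenticated encryption fails: with a NaCl-style secretbox, a wrong key doesn't yield garbage plaintext, it makes the authentication-tag check fail, so the library reports it cannot "open" the box at all. A minimal stdlib sketch of that behaviour (an HMAC-based toy model, not atuin's actual XSalsa20-Poly1305 code):

```python
import hashlib
import hmac
import os

def seal(key: bytes, msg: bytes) -> bytes:
    # Toy model of a secretbox: prepend a MAC over the message, keyed by `key`.
    tag = hmac.new(key, msg, hashlib.sha256).digest()
    return tag + msg

def open_box(key: bytes, boxed: bytes) -> bytes:
    tag, msg = boxed[:32], boxed[32:]
    expected = hmac.new(key, msg, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        # Mirrors the spirit of atuin's panic message above.
        raise ValueError("failed to open secretbox - invalid key?")
    return msg

right, wrong = os.urandom(32), os.urandom(32)
boxed = seal(right, b"ls -la")
assert open_box(right, boxed) == b"ls -la"   # correct key: opens fine
try:
    open_box(wrong, boxed)                   # wrong key: tag check fails
except ValueError as e:
    print(e)  # prints: failed to open secretbox - invalid key?
```

The practical upshot is that this panic almost always means the key material on the failing machine does not match the key that encrypted the history, even if the key file looks plausible.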

I tried the same fix: atuin logout; atuin login -u usernamehere -p passwordhere -k keyhere; atuin sync. The machine where sync already worked always ran the commands successfully, while the machine that failed to sync always failed with the message above.

I also tried removing everything under ~/.local/share/atuin/ and running the commands again, but that didn't work either.

About this issue

  • Original URL
  • State: closed
  • Created 2 years ago
  • Reactions: 3
  • Comments: 22 (8 by maintainers)

Most upvoted comments

@Anagastes I’m really sorry but at no point have I ever said that “self hosted sucks and can’t be debugged…”.

I’ve said I’d rather that people open their own issue if they’re self hosting, and provide all the details about their own specific setup. I find large issues involving a whole bunch of people like this VERY difficult to follow. I’d need debugging cycles with each of the 11 people in this issue, all in one place, when GitHub issues do not support threading.

I’ve said that self hosted setups differ tremendously from user to user - some run the binary, some use docker, some use kubernetes, I know some people using Nomad. Some host on Linux, some host on mac, some use postgres, some use sqlite, etc etc etc. Without all of that information from each person with a problem, it’s really hard for me to do anything at all.

GitHub does not allow me to specify that I am closing this issue in favour of another, which I linked above - #1199. In #1199, people debugged the same problem and got further than this issue did, which is why I have closed this one in favour of that.

Screenshot 2024-02-01 at 19 57 27

See where “not planned” can also mean “dupe”.

I’ve spent countless hours reworking sync to be far more flexible, and to eliminate this problem entirely for everyone.

So to be honest, it really sucks to be called arrogant. I’m sorry you feel that way about me. I’ve done nothing but try and help. In your own words, I don’t like to be treated like this when I only wanted the best.

Can you clarify what is “not planned” here? For my part, I am on your hosted version, and several other users have confirmed that hooking up more than 2 machines is a problem.

I have a modified version of this story.

I initially set up my atuin install on one Arch Linux machine using the auto-setup script. I created my account there on the public server, and did my initial import and sync.

I then set up atuin on a second Arch Linux machine with a similar system setup to the first. The import and sync there worked as well.

Now I’m only able to sync on the second machine; all other machines - both the first, and any additional machine I configure thereafter - give the same error about the key. I compared the keys on the first two machines using sha256sum and md5sum, and both files have identical hashes. I use zsh on all machines.
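The hash comparison above can be sketched in a few lines (stdlib only; the file paths are placeholders for key files copied from each machine, not atuin defaults):

```python
import hashlib
from pathlib import Path

def file_sha256(path: str) -> str:
    """Hex SHA-256 of a file's raw bytes, equivalent to `sha256sum <path>`."""
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

# Identical digests prove the key *files* match byte for byte, so if sync
# still fails on one machine, the mismatch must lie elsewhere (e.g. server
# state or which account the client is logged into).
# print(file_sha256("key-machine1") == file_sha256("key-machine2"))
```

SHA-256 alone is sufficient here; comparing md5sum as well, as above, adds no information but is a harmless sanity check.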

I have the same error.

I have atuin running as my own server. Login and sync from the main account (the one the user was created with) work.

On the new PC the login works, but the sync with the key does not. Here is the debug output.

I’m running Manjaro stable with zsh and atuin 11.

The server runs on Debian with the latest atuin release from git.

RUST_LOG=debug RUST_BACKTRACE=full atuin sync
 DEBUG atuin_client::database > opening sqlite database at "/home/XXX/.local/share/atuin/history.db"
 INFO  sqlx::query            > PRAGMA page_size = 4096; …; rows affected: 0, rows returned: 2, elapsed: 522.622µs

PRAGMA page_size = 4096;
PRAGMA locking_mode = NORMAL;
PRAGMA journal_mode = WAL;
PRAGMA foreign_keys = ON;
PRAGMA synchronous = FULL;
PRAGMA auto_vacuum = NONE;

 DEBUG atuin_client::database > running sqlite database setup
 INFO  sqlx::query            > CREATE TABLE IF NOT …; rows affected: 0, rows returned: 0, elapsed: 14.062µs

CREATE TABLE IF NOT EXISTS _sqlx_migrations (
  version BIGINT PRIMARY KEY,
  description TEXT NOT NULL,
  installed_on TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
  success BOOLEAN NOT NULL,
  checksum BLOB NOT NULL,
  execution_time BIGINT NOT NULL
);

 INFO  sqlx::query            > SELECT version FROM _sqlx_migrations …; rows affected: 0, rows returned: 0, elapsed: 34.775µs

SELECT
  version
FROM
  _sqlx_migrations
WHERE
  success = false
ORDER BY
  version
LIMIT
  1

 INFO  sqlx::query            > SELECT version, checksum FROM …; rows affected: 0, rows returned: 2, elapsed: 23.528µs

SELECT
  version,
  checksum
FROM
  _sqlx_migrations
ORDER BY
  version

 DEBUG atuin_client::sync     > starting sync upload
 DEBUG reqwest::connect       > starting new connection: https://XXX/
 DEBUG hyper::client::connect::dns > resolving host="XXX"
 DEBUG hyper::client::connect::http > connecting to XXX:443
 DEBUG hyper::client::connect::http > connected to XXX:443
 DEBUG rustls::client::hs           > No cached session for DnsName(DnsName(DnsName("XXX")))
 DEBUG rustls::client::hs           > Not resuming any session
 DEBUG rustls::client::hs           > ALPN protocol is Some(b"h2")
 DEBUG rustls::client::hs           > Using ciphersuite TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384
 DEBUG rustls::client::tls12::server_hello > Server supports tickets
 DEBUG rustls::client::tls12               > ECDHE curve is ECParameters { curve_type: NamedCurve, named_group: X25519 }
 DEBUG rustls::client::tls12               > Server DNS name is DnsName(DnsName(DnsName("XXX")))
 DEBUG rustls::client::tls12               > Session saved
 DEBUG h2::client                          > binding client connection
 DEBUG h2::client                          > client connection bound
 DEBUG h2::codec::framed_write             > send frame=Settings { flags: (0x0), enable_push: 0, initial_window_size: 2097152, max_frame_size: 16384 }
 DEBUG h2::proto::connection               > Connection; peer=Client
 DEBUG h2::codec::framed_write             > send frame=WindowUpdate { stream_id: StreamId(0), size_increment: 5177345 }
 DEBUG hyper::client::pool                 > pooling idle connection for ("https", XXX)
 DEBUG h2::codec::framed_read              > received frame=Settings { flags: (0x0), max_concurrent_streams: 100 }
 DEBUG h2::codec::framed_write             > send frame=Settings { flags: (0x1: ACK) }
 DEBUG h2::codec::framed_read              > received frame=WindowUpdate { stream_id: StreamId(0), size_increment: 2147418112 }
 DEBUG h2::codec::framed_write             > send frame=Headers { stream_id: StreamId(1), flags: (0x5: END_HEADERS | END_STREAM) }
 DEBUG h2::codec::framed_read              > received frame=Settings { flags: (0x1: ACK) }
 DEBUG h2::proto::settings                 > received settings ACK; applying Settings { flags: (0x0), enable_push: 0, initial_window_size: 2097152, max_frame_size: 16384 }
 DEBUG h2::codec::framed_read              > received frame=Headers { stream_id: StreamId(1), flags: (0x4: END_HEADERS) }
 DEBUG h2::codec::framed_read              > received frame=Data { stream_id: StreamId(1), flags: (0x1: END_STREAM) }
 DEBUG reqwest::async_impl::client         > response '200 OK' for https://XXX/sync/count
 INFO  sqlx::query                         > select count(1) from history; rows affected: 0, rows returned: 1, elapsed: 74.208µs
 DEBUG atuin_client::sync                  > remote has 339, we have 149
 DEBUG atuin_client::sync                  > starting sync download
 DEBUG hyper::client::pool                 > reuse idle connection for ("https", XXX)
 DEBUG h2::codec::framed_write             > send frame=Headers { stream_id: StreamId(3), flags: (0x5: END_HEADERS | END_STREAM) }
 DEBUG h2::codec::framed_read              > received frame=Headers { stream_id: StreamId(3), flags: (0x4: END_HEADERS) }
 DEBUG h2::codec::framed_read              > received frame=Data { stream_id: StreamId(3), flags: (0x1: END_STREAM) }
 DEBUG reqwest::async_impl::client         > response '200 OK' for https://XXX/sync/count
 INFO  sqlx::query                         > select count(1) from history; rows affected: 0, rows returned: 1, elapsed: 27.647µs
 DEBUG hyper::client::pool                 > reuse idle connection for ("https", XXX)
 DEBUG h2::codec::framed_write             > send frame=Headers { stream_id: StreamId(5), flags: (0x5: END_HEADERS | END_STREAM) }
 DEBUG h2::codec::framed_read              > received frame=Headers { stream_id: StreamId(5), flags: (0x4: END_HEADERS) }
 DEBUG reqwest::async_impl::client         > response '200 OK' for https://XXX/sync/history?sync_ts=1970-01-01T00%3A00%3A00%2B00%3A00&history_ts=1970-01-01T00%3A00%3A00%2B00%3A00&host=8b729336f723b415fc5ad8fdec64b26949dbbbe4c52d114a268cd5b2d9b39b00
 DEBUG h2::codec::framed_read              > received frame=Data { stream_id: StreamId(5) }
 [… the same Data frame line repeated 58 more times …]
 DEBUG h2::codec::framed_read              > received frame=Data { stream_id: StreamId(5), flags: (0x1: END_STREAM) }
thread 'main' panicked at 'failed to decrypt history! check your key: failed to open secretbox - invalid key?

Location:
    atuin-client/src/encryption.rs:98:22', /build/atuin/src/atuin-11.0.0/atuin-client/src/api_client.rs:143:45
stack backtrace:
   0:     0x55fc1904bd4d - <unknown>
   1:     0x55fc1906fccc - <unknown>
   2:     0x55fc19046061 - <unknown>
   3:     0x55fc1904d4f5 - <unknown>
   4:     0x55fc1904d216 - <unknown>
   5:     0x55fc1904da86 - <unknown>
   6:     0x55fc1904d977 - <unknown>
   7:     0x55fc1904c204 - <unknown>
   8:     0x55fc1904d6a9 - <unknown>
   9:     0x55fc1875d183 - <unknown>
  10:     0x55fc1875d273 - <unknown>
  11:     0x55fc188946f9 - <unknown>
  12:     0x55fc187cc708 - <unknown>
  13:     0x55fc187f9bca - <unknown>
  14:     0x55fc1880425e - <unknown>
  15:     0x55fc18856c3c - <unknown>
  16:     0x55fc18836b96 - <unknown>
  17:     0x55fc18874c44 - <unknown>
  18:     0x55fc188ab120 - <unknown>
  19:     0x55fc187d2f73 - <unknown>
  20:     0x55fc18775639 - <unknown>
  21:     0x55fc1903ebde - <unknown>
  22:     0x55fc188ac442 - <unknown>
  23:     0x7f7a87a5d290 - <unknown>
  24:     0x7f7a87a5d34a - __libc_start_main
  25:     0x55fc1875d475 - <unknown>
  26:                0x0 - <unknown>