core: armv7 QNAP Docker: Restarting loop - Core finish process exit code 256 - Core finish process received signal 11 (exit code 139 and exit code 135)

Versions 2022.7+ currently boot loop on a QNAP NAS TS-231P.

INFO: Home Assistant Core finish process exit code 256
INFO: Home Assistant Core finish process received signal 11
…

What version of Home Assistant Core has the issue? 2022.7+

What was the last working version of Home Assistant Core? 2022.6.7

What type of installation are you running? Home Assistant Container

Additional information: Running on a QNAP NAS (TS-231P).

Processor: Annapurna Labs Alpine AL314 Quad-core ARM Cortex-A15 CPU @ 1.70GHz

About this issue

  • State: open
  • Created a year ago
  • Comments: 145 (16 by maintainers)

Most upvoted comments

So I managed to bring up 2023.10.5 (last version with orjson 3.9.7). Additionally:

apk add ffmpeg ffmpeg-dev openblas openblas-dev cmake
pip install ha-av==10.1.1 numpy=1.2.24(?) PyTurboJPEG
pip install --no-binary=:all: webrtc-noise-gain

Could you help me?

Currently the most recent builds of HA do not run on our hardware since they require Alpine 3.18. You can use my github project to build HA 2023.10.3 on your machine until we resolve the segmentation fault issue found on Alpine 3.18.

I have a project to maintain and update an HA Dockerfile for our hardware. It currently builds on my NAS but I can’t get it to build via GitHub Actions. Hopefully this project doesn’t need to be around for too long; I would like to hope that the underlying issues get addressed so official images work again. It might help if @magicse lays out a bullet-point list of what he found that needs to be fixed/addressed.

https://github.com/Wetzel402/home-assistant-docker-armv7

I noticed one of the reasons HA was failing to install dependencies was that it needed zlib. I removed the # Cleanup unnecessary files and caches section and the image starts up correctly. I was able to get to the onboarding screen in a clean container. Running it in my production container is TBD.
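
If the image size still matters, a middle ground might be to remove only the build toolchain while keeping runtime libraries such as zlib (a hedged sketch against the cleanup section of the Dockerfile quoted later in this thread, not a tested fix):

# keep runtime libs such as zlib; drop only the compilers and build tools
RUN apk add zlib && \
    apk del g++ gcc make cargo && \
    rm -rf /var/cache/apk/*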

Hello, I was testing different images and tags from the official armhf for a QNAP TS-231P3 (Alpine AL314 armv7) and I reached exit code 139. So the official images are definitely broken for that architecture. For now I’m surviving with a linuxserver image, but I think that for future releases I will need to use the snippet at: https://github.com/home-assistant/core/issues/86589#issuecomment-1673510507

Thank you for the thread and the help

The problem is still with auditwheel + patchelf.

I was able to use @magicse’s instructions to get an updated instance of HA running on my QNAP NAS as well. This proves there is an issue with the official HA builds that needs to be addressed, as we suspected. I hope the issue is addressed soon; this workaround makes updating far more complex.

Edit: I’ve been struggling to use this method with my production HA docker compose. The closest I get is the log looping a message about receiving traffic from a reverse proxy while my HTTP config is not set up for it. As far as I can tell this means it must be using a default config rather than my production config, but I can’t figure out why. Any tips or suggestions would be appreciated.

Got my own Container now running. It was a lot of work. Thx to Magicse 😉

@frenck: is there any chance of this getting fixed by the team in the future? The topic has been open since January.

I don’t know what the developers are doing with their container, but now the new Home Assistant container image throws error 139 even after I change the entrypoint to /bin/ash. Most likely the binaries are assembled in QEMU without taking into account the subtleties of the Alpine AL314 Quad-core armv7 hardware architecture. If you see “exit code = 135” or “exit code = 139”, then your device’s architecture is not supported.

RUN |6 BUILD_ARCH=armv7 QEMU_CPU= SSOCR_VERSION=2.22.1 LIBCEC_VERSION=6.0.2 QEMU_CPU= ??
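
For anyone unsure whether their box is affected, the page size is easy to check (illustrative commands, not from the original comment):

# report the kernel's memory page size in bytes (4096 = 4K, 32768 = 32K)
getconf PAGESIZE
# alternative if getconf is missing from the image
awk '/KernelPageSize/{print $2, $3; exit}' /proc/self/smaps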

Thx I will try it 😉

I have the same problem with Qnap TS-431X. Tested versions 2023.3.0.dev20230202, 2023.2.0.dev20230120, 2023.2.0b9 and current “stable”.

2022.6.7 works well.

Hi @Wetzel402, here are the strings that differ in zcat /proc/config.gz:

Alpine Linux 3.18 for TS-131K with 4K page size

# CONFIG_ARM_PAGE_SIZE_32KB is not set
CONFIG_ARM_PAGE_SIZE_4KB=y
CONFIG_ARM_PAGE_SIZE_LARGE_SHIFT=12

# CONFIG_HIGHMEM is not set
CONFIG_CMDLINE="pci=pcie_bus_perf console=ttyS0,115200 root=/dev/ram mtdparts=mx_nand:32M(boot1_kernel),216M(boot1_rootfs2),32M(boot2_kernel),216M(boot2_rootfs2),15M(config);spi0.0:1088K(loader)ro,384K(env) zswap.enabled=0 zswap.compressor=lz4 memmap=2M$0x7000000 ramoops.mem_address=0x7000000 ramoops.mem_size=0x200000 ramoops.console_size=0x100000"

Alpine Linux 3.17 from TS-231P with 32K page size

CONFIG_ARM_PAGE_SIZE_32KB=y
# CONFIG_ARM_PAGE_SIZE_4KB is not set
CONFIG_ARM_PAGE_SIZE_LARGE_SHIFT=15

CONFIG_HIGHMEM=y
# CONFIG_HIGHPTE is not set
CONFIG_CMDLINE="pci=pcie_bus_perf console=ttyS0,115200 root=/dev/ram mtdparts=mx_nand:32M(boot1_kernel),216M(boot1_rootfs2),32M(boot2_kernel),216M(boot2_rootfs2),15M(config);spi0.0:1088K(loader)ro,384K(env) vmalloc=560M zswap.enabled=0 zswap.compressor=lz4 memmap=2M$0x7000000 ramoops.mem_address=0x7000000 ramoops.mem_size=0x200000 ramoops.console_size=0x100000"
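
For reference, the relevant settings can be pulled from a running box in one shot (illustrative; /proc/config.gz is the same kernel-config location quoted above):

zcat /proc/config.gz | grep -E 'ARM_PAGE_SIZE|HIGHMEM'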

If I run zcat /proc/config.gz on my new machine on Alpine 3.18, will that provide the data we need, or does it need to come from the armv7-based NAS with the 32K page size?

It would be better to use an armv7-based NAS with a 4K page size. I will ask my friends about this, because they have a TS-131K with a 4K page size and a working Home Assistant on Alpine 3.18.

I recently picked up an x86-based 8-bay QNAP NAS and will be migrating this weekend. I can keep my old unit for now to continue troubleshooting this problem. If I run zcat /proc/config.gz on my new machine on Alpine 3.18, will that provide the data we need, or does it need to come from the armv7-based NAS with the 32K page size?

Hi @Wetzel402, I tried installing the latest original Home Assistant on a QNAP TS-131K (same CPU as the TS-231P3) but with 1 GB of memory, and it still works with a 4K page size. The latest Home Assistant version started and works without problems, so the problem is the page size. Now I’m trying to compile the Alpine Linux 3.18 kernel inside an Alpine Linux 3.17 container; let’s see what happens.

I also wrote to QNAP support, but they are stubborn: they say that 32K is good and they are not interested in third-party containers.

Commands to build Alpine inside container

adduser -D builder
addgroup builder abuild
su builder
cd /home/builder

git clone git://git.alpinelinux.org/aports

cd aports
git checkout -b custom-alpine
cd main/linux-lts
abuild-keygen -a
echo "PACKAGER_PRIVKEY=\"/home/builder/.abuild/your-key.rsa\"" >> /home/builder/.abuild/abuild.conf
cd /home/builder/aports/main/linux-lts
abuild -r
cd /home/builder/aports/scripts
./mkimage.sh --tag custom-alpine --repository /home/builder/aports/main/linux-lts

Once the build process is complete, the customized ISO image will be in the output/ directory.
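
If the goal is a kernel with 32K pages, the page-size change itself would go into the kernel config used by that APKBUILD. Judging from the QNAP configs quoted earlier in this thread, the relevant options would be:

CONFIG_ARM_PAGE_SIZE_32KB=y
# CONFIG_ARM_PAGE_SIZE_4KB is not set
CONFIG_ARM_PAGE_SIZE_LARGE_SHIFT=15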

As I understand it, PAGE_SIZE is set through the definition of PAGE_SHIFT in the configuration files, where PAGE_SHIFT = 12 by default (2^12 = 4096 = 4K). For 32K it must be 15 (2^15 = 32768).
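
The arithmetic is easy to sanity-check in any shell:

echo $((1 << 12))   # 4096  -> 4K page size
echo $((1 << 15))   # 32768 -> 32K page size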

An interesting article on the topic: https://lwn.net/Articles/822868/

Hi @Wetzel402, I’ll look and try to do something… I need to force myself)) We just have constant air raid warnings due to the war… and the mood is very sad.

The general problem is support for 32K pages.

Hello @magicse, sorry for the delay… life is crazy.

I tried the three points, but unfortunately they didn’t work. Then I tried the Dockerfile in @Wetzel402’s project, and it worked. I don’t know exactly why it didn’t work with your Dockerfile. Maybe there is some missing dependency, or, most probably, pip3 install homeassistant installs the latest version, which has some kind of trouble.

More testing is needed to understand what happened, to prevent any possible issues in the future.


@terox orjson … requires rustc 1.65+, but the latest we can have (thanks CentOS 6) is 1.62.

orjson … requires rustc 1.65+, but the latest we can have (thanks CentOS 6) is 1.62.

# We get the following error when compiling orjson on Centos 6:
# error: package `associative-cache v2.0.0` cannot be built because it requires rustc 1.65 or newer,
# while the currently active rustc version is 1.62.0-nightly
# Here's orjson switching to rustc 1.65:
# https://github.com/ijl/orjson/commit/ce9bae876657ed377d761bf1234b040e2cc13d3c
  1. You could try this: Got the latest orjson working on alpine 3.11

or

  2. Before the RUN pip3 install homeassistant command, add a command installing some lower version of orjson: RUN pip3 install orjson==xxx (see the sketch after this list).

or

  3. $ curl --proto '=https' --tlsv1.2 https://sh.rustup.rs -sSf | sh. The command downloads a script and starts the installation of the rustup tool, which installs the latest stable version of Rust.
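
A minimal sketch of option 2, assuming an otherwise working Dockerfile; 3.8.6 is used here only because a log later in this thread shows it building successfully:

# pin an older orjson before installing Home Assistant
RUN pip3 install orjson==3.8.6
RUN pip3 install homeassistant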

The slowdown occurs during the initial initialization stage; after initialization it works normally. This is because any additional wheels are now built locally.

Conclusion: the image works at the same speed as the original one. Slowdown occurs at first initialization and when installing new components, since the build now happens locally instead of downloading ready-made packages from the repository.

https://github.com/home-assistant/core/assets/13585785/a039be31-3a03-4a83-9b18-9c6ae63c877a

@Wetzel402 Dockerfile version with Python 3.11 to get the latest Home Assistant 2023.9:

# Container image that runs your code
FROM alpine:3.17
ENV CARGO_NET_GIT_FETCH_WITH_CLI=true
RUN apk add bash
RUN bash
RUN apk add g++ gcc make 
RUN apk add libcap libpcap-dev
RUN apk add ffmpeg-dev python3-dev sqlite-dev libffi-dev libftdi1-dev bzip2-dev openssl-dev jpeg-dev zlib-dev
RUN apk add git cargo build-base
RUN mkdir /config

# Download, extract, configure, compile, and install Python 3.11
RUN wget https://www.python.org/ftp/python/3.11.0/Python-3.11.0.tgz && \
    tar -xzf Python-3.11.0.tgz && \
    cd Python-3.11.0 && \
    ./configure --enable-optimizations --with-openssl-rpath=auto && \
    make -j 4 && \
    make install

RUN python3 -m ensurepip --upgrade
# RUN pip3 install --upgrade pip
RUN pip3 install aiohttp
RUN pip3 install ffmpeg
RUN pip3 install libpcap
RUN pip3 install tzdata
RUN pip3 install PyNaCl
RUN pip3 install Pillow
RUN pip3 install git+https://github.com/boto/botocore
RUN pip3 install homeassistant

# Set the volume directory
VOLUME /config
# Expose the Home Assistant port (8123)
EXPOSE 8123
# Start Home Assistant
CMD ["hass", "--config", "/config"]

@magicse, have you tested this Dockerfile? It builds fine, but I find that when HA starts it tries installing dependencies and fails.

My main concern is whether we could have some issue with “wheels”. I read at the start of this thread that maybe there is some issue with them.

The wheels will be built during the docker build process and will not use ready-made wheels from Home Assistant. This set of commands is what builds and installs the wheels without any patchelf involvement:

RUN apk add g++ gcc make
RUN apk add libcap libpcap-dev
RUN apk add python3
RUN python3 -m ensurepip --upgrade
RUN apk add git
RUN apk add python3-dev libffi-dev libftdi1-dev bzip2-dev openssl-dev cargo jpeg-dev zlib-dev
RUN pip3 install aiohttp
RUN pip3 install ffmpeg
RUN pip3 install libpcap
RUN pip3 install tzdata
RUN pip3 install PyNaCl
RUN pip3 install Pillow
RUN pip3 install git+https://github.com/boto/botocore
RUN pip3 install homeassistant
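
To be certain pip never falls back on a prebuilt binary wheel, the flag already used in the first comment of this thread could in principle be added here as well (untested sketch; it forces source builds for everything, so the image build gets much slower):

RUN pip3 install --no-binary=:all: homeassistant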

I also opened an issue about the 32K page size: https://github.com/alpinelinux/docker-alpine/issues/342

@terox

Hello, I was testing different images and tags from the official armhf for a QNAP TS-231P3 (Alpine AL314 armv7) and I reached exit code 139.

CODE 139 https://www.qnap.com/da-dk/how-to/faq/article/why-do-the-installed-third-party-containers-not-run-successfully-on-specific-32-bit-arm-devices

After they changed the memory page size from 4K to 32K, a problem appeared with images that were designed (compiled) for 4K memory pages. Some applications or libraries are sensitive to page-size mismatches, for example the jemalloc library (it must be compiled with, e.g., ./configure --with-lg-page=15 for a 32K page size, 2^15). With patchelf I think it is the same problem: it detects the memory page size from an ELF file compiled for a 4K page size and patches the binaries incorrectly, so we get errors with native Home Assistant wheels compiled under 4K page-size settings. https://github.com/NixOS/patchelf/pull/216 When Home Assistant starts using jemalloc we get the 139 or 135 error; this is resolved by setting the variable DISABLE_JEMALLOC=true (and the script inside Home Assistant that checks it). When Home Assistant runs auditwheel with patchelf, we get broken wheel library binaries.
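
For the jemalloc half of the problem, the workaround described above is just an environment variable on the official image (an illustrative docker run; the container name, port, and config path are placeholders):

docker run -d --name homeassistant \
    -e DISABLE_JEMALLOC=true \
    -v /path/to/config:/config \
    -p 8123:8123 \
    homeassistant/home-assistant:stable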

I have created a ticket with QNAP about this issue. You could also create an issue on the Alpine GitHub asking for an image with 32K page-size support. But this isn’t really a problem per se… it’s a more modern approach. It’s just that most docker images for arm32v7 continue to be created with a page size of 4K and not 32K.

Try Alpine 3.17, it works well.

I think the best variant is to create our own project on GitHub with a Dockerfile and use Actions to automatically build the image and push it to Docker Hub for the public. I’m not a big expert in GitHub Actions, and if anyone can help in creating an accurate Dockerfile and a correct build-and-push.yml, I would be very grateful and appreciative. We need to create something like this:

Dockerfile

# Container image that runs your code
FROM alpine:3.17
RUN apk add bash
RUN bash
RUN apk add g++ gcc make
RUN apk add libcap libpcap-dev
RUN apk add python3
RUN python3 -m ensurepip --upgrade
RUN apk add git
RUN apk add python3-dev libffi-dev libftdi1-dev bzip2-dev openssl-dev cargo jpeg-dev zlib-dev
RUN pip3 install aiohttp
RUN pip3 install ffmpeg
RUN pip3 install libpcap
RUN pip3 install tzdata
RUN pip3 install PyNaCl
RUN pip3 install Pillow
RUN pip3 install git+https://github.com/boto/botocore
RUN pip3 install homeassistant

# Cleanup unnecessary files and caches
RUN apk del \
    g++ \
    gcc \
    make \
    python3-dev \
    libffi-dev \
    libftdi1-dev \
    bzip2-dev \
    openssl-dev \
    cargo \
    jpeg-dev \
    zlib-dev && \
    rm -rf /var/cache/apk/*

# Set the working directory
WORKDIR /config
# Expose the Home Assistant port (8123)
EXPOSE 8123
# Start Home Assistant
CMD ["hass", "--config", "/config"]

We also need to create or update a GitHub Actions workflow configuration (e.g., .github/workflows/build-and-push.yml) to build the Docker image for ARMv7… Something like this (a sketch; the Docker Hub secrets and image name below are placeholders):

name: Build and Push Docker Image

on:
  push:
    branches:
      - main

jobs:
  build:
    runs-on: ubuntu-latest

    steps:
      - name: Checkout Code
        uses: actions/checkout@v2

      # QEMU is required to cross-build armv7 images on the x86_64 runner
      - name: Set up QEMU
        uses: docker/setup-qemu-action@v2

      - name: Set up Docker Buildx
        id: buildx
        run: |
          docker buildx create --use
          docker buildx inspect --bootstrap
        shell: bash

      # DOCKERHUB_USERNAME / DOCKERHUB_TOKEN are repository secrets you create
      - name: Log in to Docker Hub
        uses: docker/login-action@v2
        with:
          username: ${{ secrets.DOCKERHUB_USERNAME }}
          password: ${{ secrets.DOCKERHUB_TOKEN }}

      # <user>/homeassistant is a placeholder; --push is needed because an
      # armv7 image cannot be loaded into the x86_64 runner's local daemon
      - name: Build and Push Docker Image
        run: |
          docker buildx build \
            --platform linux/arm/v7 \
            -t <user>/homeassistant:latest \
            -f Dockerfile \
            --push \
            .
        shell: bash

I think the best variant for now is this:

Download the Alpine 3.16 or 3.17 official minimal clean docker image (only 5 MB in size): https://hub.docker.com/_/alpine and install everything in it from scratch: Python 3.10.10, g++, gcc, and Home Assistant with the pip install homeassistant command. I also added pip3 install git+https://github.com/boto/botocore due to the error https://github.com/home-assistant/core/issues/95192


apk add bash
bash
bash-5.1# apk add g++ gcc make
bash-5.1# apk add libcap libpcap-dev
bash-5.1# apk add python3
bash-5.1# python3 -m ensurepip --upgrade
bash-5.1# apk add git
bash-5.1# apk add python3-dev libffi-dev libftdi1-dev bzip2-dev openssl-dev cargo jpeg-dev zlib-dev
bash-5.1# pip3 install aiohttp
bash-5.1# pip3 install ffmpeg
bash-5.1# pip3 install libpcap
bash-5.1# pip3 install tzdata
bash-5.1# pip3 install PyNaCl
bash-5.1# pip3 install Pillow
bash-5.1# pip3 install git+https://github.com/boto/botocore 
bash-5.1# pip3 install homeassistant

bash-5.1# hass --config /config


I found a working Docker image on Docker Hub but have not tested it until now:

https://hub.docker.com/r/jkilloy/homeassistantnew

Perhaps it helps somebody…

I was trying to see if this user has a GitHub, but I’m not finding it. It would be useful to see what changes they made. Also, since I’m not finding source code, I’m a little leery of using the image.

@frenck: is there any chance of this getting fixed by the team in the future? The topic has been open since January.

That will not get addressed or any attention from our team. We suggest using our VM; we are not going to invest effort in the container installation. You will have to do this workaround forever, or someone keeps doing it and shares it.

Is the team deprecating the container installation?

My model of QNAP NAS does not support VM installs. Plus, since the VM still uses containers, I would be concerned the same problem might arise on my hardware. If I’m not mistaken, this issue also affects some single-board computers such as early RPis.

Edit: I do understand we are a very limited subset of users. If I have a little guidance I will gladly make a pull request to fix it.

Edit2: @magicse, you seem to understand the issue. Can you make a pull request to fix it?

@Wetzel402 QNAP updated its native reverse proxy and now there is no need to install a dockerized nginx. You can use the native one from QNAP, which automatically pulls up the certificate.


Try the instructions above.

Hi there,

I got the same problem with my QNAP TS-431P3.

Any solution suggested?

@magicse: Could we do it on our own? Could you give instructions? I’ve never done this before…

I think the best variant for now is this:

1. https://github.com/home-assistant/core/issues/86589#issuecomment-1436089123
2. https://github.com/home-assistant/core/issues/86589#issuecomment-1436499486

Download the Alpine 3.16 or 3.17 official minimal clean docker image (only 5 MB in size): https://hub.docker.com/_/alpine and install everything in it from scratch: Python 3.10.10, g++, gcc, and Home Assistant with the pip install homeassistant command.

apk add bash
bash
bash-5.1# apk add g++ gcc make
bash-5.1# apk add libcap libpcap-dev
bash-5.1# apk add python3
bash-5.1# python3 -m ensurepip --upgrade
bash-5.1# apk add python3-dev libffi-dev libftdi1-dev bzip2-dev openssl-dev cargo
bash-5.1# pip3 install aiohttp
bash-5.1# pip3 install ffmpeg
bash-5.1# pip3 install libpcap
bash-5.1# pip3 install tzdata
bash-5.1# pip3 install PyNaCl
bash-5.1# pip3 install homeassistant

bash-5.1# hass --config /config

@magicse, were you able to build a working image? I’ve been running the image provided by linuxserver.io, which did not have the problem, but they are now ending life for armhf, so the last version available is 2023.4.4.

Interestingly, it appears as though 2023.2.x is the last official version to do the looping (Home Assistant Core finish process exit code 256, Home Assistant Core finish process received signal 11). Versions 2023.3+ seem to completely crash the container.

I would like to reiterate for everyone here that there is clearly something wrong with the way the official container is built for our architecture, because the linuxserver.io container works just fine. I am currently back on their 2023.4.4 container.

For clarity, I am running a QNAP TS-431XeU, which has an Annapurna Labs Alpine AL314 Quad-core ARM Cortex-A15 CPU.

Sorry to hear that; however, it is not related and does not justify any of your actions above.

…/Frenck

@magicse, great work. Sounds like HA wheels need to be rebuilt.

Removing the site-packages folder inside the original Home Assistant image and rebuilding the packages gives a fully working Home Assistant without segfaults and the restarting loop.

  • Create a container from the official Home Assistant image and, instead of init, set /bin/bash
  • Start the container
  • Try a simple import numpy or aiohttp in Python and you will get a SEGFAULT
  • Remove all packages from /usr/local/lib/python3.10/site-packages/*
  • Reinstall the packages and you’ll get a working Home Assistant

Conclusion: the segfault and reboot-loop problem is 100% due to badly built whl packages for armv7.

bash-5.1# rm -rf /usr/local/lib/python3.10/site-packages/*
bash-5.1# apk add g++ gcc make cmake
bash-5.1# apk add cargo
bash-5.1# apk add python3-dev jpeg-dev libffi-dev libftdi1-dev bzip2-dev ffmpeg-dev openssl-dev libxml2-dev libxslt-dev
bash-5.1# python3 -m ensurepip --upgrade
bash-5.1# cd /usr/src/homeassistant
bash-5.1# pip3 install -e .
bash-5.1# python3 -c "import sys; print((sys._base_executable, sys.version))"
('/usr/local/bin/python3', '3.10.7 (main, Nov 24 2022, 13:02:43) [GCC 11.2.1 20220219]')
bash-5.1# uname -a
Linux 0b265030057f 4.2.8 #2 SMP Thu Jan 12 10:44:50 CST 2023 armv7l Linux
bash-5.1# uname -m
armv7l
bash-5.1# ldd /usr/local/lib/python3.10/site-packages/yarl/_quoting_c.cpython-310-arm-linux-gnueabihf.so
        /lib/ld-musl-armhf.so.1 (0x54278000)
        libc.musl-armv7.so.1 => /lib/ld-musl-armhf.so.1 (0x54278000)
bash-5.1# readelf -A /usr/local/lib/python3.10/site-packages/yarl/_quoting_c.cpython-310-arm-linux-gnueabihf.so
Attribute Section: aeabi
File Attributes
  Tag_CPU_name: "7-A"
  Tag_CPU_arch: v7
  Tag_CPU_arch_profile: Application
  Tag_ARM_ISA_use: Yes
  Tag_THUMB_ISA_use: Thumb-2
  Tag_FP_arch: VFPv3-D16
  Tag_ABI_PCS_wchar_t: 4
  Tag_ABI_FP_denormal: Needed
  Tag_ABI_FP_exceptions: Needed
  Tag_ABI_FP_number_model: IEEE 754
  Tag_ABI_align_needed: 8-byte
  Tag_ABI_enum_size: int
  Tag_ABI_VFP_args: VFP registers
  Tag_CPU_unaligned_access: v6
bash-5.1# hass --config /config

Just for information, so it is probably easier to debug: the core image is built starting FROM the homeassistant/docker image, which is built on top of the python3.10 homeassistant/docker-base image, which in turn is based on the armv7-base:

https://github.com/home-assistant/docker-base/tree/master/alpine (armv7-base)
https://github.com/home-assistant/docker-base/tree/master/python/3.10 (python3.10 - alpine 3.16 docker-base)
https://github.com/home-assistant/docker (homeassistant-base)
https://github.com/home-assistant/core (core)

You could probably try to check whether the problem is really in the wheels or in some misconfigured image among all the layers.

Before everything, it starts with arm32v7/alpine:3.16.

OK, so you’re trying to do alone all of what is done across multiple GitHub Actions… And it’s a good method: first make it work, then slowly replace each custom-built image with the “official” one to narrow down what is going wrong. You already did great work. I suspect that since the base image was 2022.11.0 and stayed that way for a long time, probably something was not working as expected, but it could also be something else.

As I said, I used the core one 😄 No problem, I was just curious, because I saw that the new version should have the base image with the new Python, so maybe that will solve the issue without compiling.

@Wetzel402 @jrieger @larsxschneider @lswysocki @boyarale @Gerigot

Next experiment inside the original Home Assistant docker container to get a working Home Assistant.

I removed the old Python with rm commands because apk del python3 can’t delete it, I think because of some wrong installation of Python in the Home Assistant container.

rm -f /usr/local/bin/python*
rm -f /usr/local/bin/pip*
rm -f /usr/local/bin/pydoc
rm -rf /usr/local/bin/include/python*
rm -f /usr/local/lib/libpython*
rm -rf /usr/local/lib/python*
rm -f /usr/local/share/man/python*
rm -rf /usr/local/lib/pkgconfig
rm -f /usr/local/bin/idle
rm -f /usr/local/bin/easy_install-*
rm -rf /usr/local/include/python*
rm -f /usr/local/share/man/python*

After that I reinstalled Python and Home Assistant inside the container.

apk add python3
ln -sf /usr/bin/python3 /usr/bin/python && ln -sf /usr/bin/python3 /usr/local/bin/python
python3 -m ensurepip --upgrade
apk add cargo libffi-dev libftdi1-dev python3-dev bzip2-dev openssl-dev
pip3 install ffmpeg
pip3 install libpcap
pip3 install tzdata
pip3 install PyNaCl
cd /usr/src/homeassistant
pip3 install -e .
hass --config /config

And voilà, Home Assistant works without problems:

bash-5.1# hass --config /config
2023-02-21 09:06:13.090 WARNING (SyncWorker_0) [homeassistant.loader] We found a custom integration rhvoice which has not been tested by Home Assistant. This component might cause stability problems, be sure to disable it if you experience issues with Home Assistant
2023-02-21 09:06:13.124 WARNING (SyncWorker_0) [homeassistant.loader] We found a custom integration browser_mod which has not been tested by Home Assistant. This component might cause stability problems, be sure to disable it if you experience issues with Home Assistant


Therefore, questions remain about the Python installed in the original Home Assistant container.

Listing of a working installation
bash-5.1# pip install -e .
Obtaining file:///usr/src/homeassistant
  Installing build dependencies ... done
  Checking if build backend supports build_editable ... done
  Getting requirements to build wheel ... done
  Preparing metadata (pyproject.toml) ... done
Collecting voluptuous-serialize==2.5.0
  Downloading voluptuous_serialize-2.5.0-py3-none-any.whl (6.8 kB)
Requirement already satisfied: pip<22.4,>=21.0 in /usr/lib/python3.10/site-packages (from homeassistant==2023.2.5) (22.3.1)
Collecting home-assistant-bluetooth==1.9.2
  Using cached home_assistant_bluetooth-1.9.2.tar.gz (10 kB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Preparing metadata (pyproject.toml) ... done
Collecting PyJWT==2.5.0
  Downloading PyJWT-2.5.0-py3-none-any.whl 
Collecting voluptuous==0.13.1
  Using cached voluptuous-0.13.1-py3-none-any.whl (29 kB)
Collecting astral==2.2
  Using cached astral-2.2-py2.py3-none-any.whl (30 kB)
Collecting awesomeversion==22.9.0
  Using cached awesomeversion-22.9.0-py3-none-any.whl (12 kB)
Collecting orjson==3.8.6
  Using cached orjson-3.8.6.tar.gz (655 kB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Preparing metadata (pyproject.toml) ... done
Collecting ifaddr==0.1.7
  Using cached ifaddr-0.1.7-py2.py3-none-any.whl (10 kB)
Collecting pyOpenSSL==23.0.0
  Using cached pyOpenSSL-23.0.0-py3-none-any.whl (57 kB)
Collecting python-slugify==4.0.1
  Using cached python-slugify-4.0.1.tar.gz (11 kB)
  Preparing metadata (setup.py) ... done
Collecting cryptography==39.0.1
  Using cached cryptography-39.0.1.tar.gz (603 kB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Preparing metadata (pyproject.toml) ... done
Collecting lru-dict==1.1.8
  Using cached lru-dict-1.1.8.tar.gz (10 kB)
  Preparing metadata (setup.py) ... done
Collecting atomicwrites-homeassistant==1.4.1
  Using cached atomicwrites_homeassistant-1.4.1-py2.py3-none-any.whl (7.1 kB)
Collecting jinja2==3.1.2
  Using cached Jinja2-3.1.2-py3-none-any.whl (133 kB)
Collecting certifi>=2021.5.30
  Downloading certifi-2022.12.7-py3-none-any.whl (155 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 155.3/155.3 kB 761.5 kB/s eta 0:00:00
Collecting pyyaml==6.0
  Using cached PyYAML-6.0.tar.gz (124 kB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Preparing metadata (pyproject.toml) ... done
Collecting typing-extensions<5.0,>=4.4.0
  Using cached typing_extensions-4.5.0-py3-none-any.whl (27 kB)
Collecting ciso8601==2.3.0
  Using cached ciso8601-2.3.0.tar.gz (26 kB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Preparing metadata (pyproject.toml) ... done
Collecting httpx==0.23.3
  Using cached httpx-0.23.3-py3-none-any.whl (71 kB)
Collecting async-timeout==4.0.2
  Using cached async_timeout-4.0.2-py3-none-any.whl (5.8 kB)
Collecting aiohttp==3.8.1
  Using cached aiohttp-3.8.1.tar.gz (7.3 MB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Installing backend dependencies ... done
  Preparing metadata (pyproject.toml) ... done
Collecting bcrypt==4.0.1
  Using cached bcrypt-4.0.1.tar.gz (25 kB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Preparing metadata (pyproject.toml) ... done
Collecting attrs==22.2.0
  Using cached attrs-22.2.0-py3-none-any.whl (60 kB)
Collecting yarl==1.8.1
  Using cached yarl-1.8.1.tar.gz (172 kB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Preparing metadata (pyproject.toml) ... done
Collecting requests==2.28.1
  Using cached requests-2.28.1-py3-none-any.whl (62 kB)
Collecting frozenlist>=1.1.1
  Downloading frozenlist-1.3.3.tar.gz (66 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 66.6/66.6 kB 537.9 kB/s eta 0:00:00
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Preparing metadata (pyproject.toml) ... done
Collecting multidict<7.0,>=4.5
  Downloading multidict-6.0.4.tar.gz (51 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 51.3/51.3 kB 436.4 kB/s eta 0:00:00
  Installing build dependencies ... done
  WARNING: Missing build requirements in pyproject.toml for multidict<7.0,>=4.5 from https://files.pythonhosted.org/packages/4a/15/bd620f7a6eb9aa5112c4ef93e7031bcd071e0611763d8e17706ef8ba65e0/multidict-6.0.4.tar.gz (from aiohttp==3.8.1->homeassistant==2023.2.5).
  WARNING: The project does not specify a build backend, and pip cannot fall back to setuptools without 'wheel'.
  Getting requirements to build wheel ... done
  Installing backend dependencies ... done
  Preparing metadata (pyproject.toml) ... done
Collecting aiosignal>=1.1.2
  Downloading aiosignal-1.3.1-py3-none-any.whl (7.6 kB)
Collecting charset-normalizer<3.0,>=2.0
  Downloading charset_normalizer-2.1.1-py3-none-any.whl (39 kB)
Collecting pytz
  Downloading pytz-2022.7.1-py2.py3-none-any.whl (499 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 499.4/499.4 kB 658.7 kB/s eta 0:00:00
Collecting cffi>=1.12
  Using cached cffi-1.15.1.tar.gz (508 kB)
  Preparing metadata (setup.py) ... done
Collecting httpcore<0.17.0,>=0.15.0
  Downloading httpcore-0.16.3-py3-none-any.whl (69 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 69.6/69.6 kB 562.2 kB/s eta 0:00:00
Collecting rfc3986[idna2008]<2,>=1.3
  Downloading rfc3986-1.5.0-py2.py3-none-any.whl (31 kB)
Collecting sniffio
  Downloading sniffio-1.3.0-py3-none-any.whl (10 kB)
Collecting MarkupSafe>=2.0
  Downloading MarkupSafe-2.1.2.tar.gz (19 kB)
  Preparing metadata (setup.py) ... done
Collecting text-unidecode>=1.3
  Downloading text_unidecode-1.3-py2.py3-none-any.whl (78 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 78.2/78.2 kB 767.2 kB/s eta 0:00:00
Collecting idna<4,>=2.5
  Downloading idna-3.4-py3-none-any.whl (61 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 61.5/61.5 kB 557.1 kB/s eta 0:00:00
Collecting urllib3<1.27,>=1.21.1
  Downloading urllib3-1.26.14-py2.py3-none-any.whl (140 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 140.6/140.6 kB 497.1 kB/s eta 0:00:00
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting h11<0.15,>=0.13
  Downloading h11-0.14.0-py3-none-any.whl (58 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 58.3/58.3 kB 239.7 kB/s eta 0:00:00
Collecting anyio<5.0,>=3.0
  Downloading anyio-3.6.2-py3-none-any.whl (80 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 80.6/80.6 kB 659.2 kB/s eta 0:00:00
Building wheels for collected packages: aiohttp, bcrypt, ciso8601, cryptography, home-assistant-bluetooth, orjson, pyyaml, yarl, frozenlist, multidict
  Building wheel for aiohttp (pyproject.toml) ... done
  Created wheel for aiohttp: filename=aiohttp-3.8.1-cp310-cp310-linux_armv7l.whl size=1224585 sha256=0b0d0d36dd3dacee924a8017468c95a11d2488965a45d83a3f7070741850e9d0
  Stored in directory: /root/.cache/pip/wheels/d4/50/eb/f51338c53367b838a0ec965f2c3169ea7e3e15a846345dcf51
  Building wheel for bcrypt (pyproject.toml) ... done
  Created wheel for bcrypt: filename=bcrypt-4.0.1-cp310-cp310-linux_armv7l.whl size=261298 sha256=ed2b2f4fbfc7131ac394f789144f150e791c371048877f614d74ba984db17d52
  Stored in directory: /root/.cache/pip/wheels/03/3b/4e/dcbf6b75a11a1ca8559ccdd1c72f3a5bccc06c1d4c446a910e
  Building wheel for ciso8601 (pyproject.toml) ... done
  Created wheel for ciso8601: filename=ciso8601-2.3.0-cp310-cp310-linux_armv7l.whl size=37163 sha256=f99a7f6050038a2343fa21582c4f0c850525a380f58c3c09c0c3fc27d802cb69
  Stored in directory: /root/.cache/pip/wheels/bf/6d/37/ade063f0371c2d99d0d89c573b5f8f8d2b65867cbbdecac23d
  Building wheel for cryptography (pyproject.toml) ... done
  Created wheel for cryptography: filename=cryptography-39.0.1-cp310-cp310-linux_armv7l.whl size=1724519 sha256=d2e969335c6f5b9e30197f3d12a9bb4151abc46ef5cdfdeedaa4ec0386ed5626
  Stored in directory: /root/.cache/pip/wheels/64/c4/cc/f550958c39c03b9a252e4d1b3d1f66a60465239087c76bb3cd
  Building wheel for home-assistant-bluetooth (pyproject.toml) ... done
  Created wheel for home-assistant-bluetooth: filename=home_assistant_bluetooth-1.9.2-cp310-cp310-musllinux_1_2_armv7l.whl size=9880 sha256=c3058c27d6f0ec4db05c3d4fc2a3929ab30192d0c00f360f257ae272e97b0f70
  Stored in directory: /root/.cache/pip/wheels/1b/45/f8/d15781d785e0954f95d7fa40a868d3af255d4a5dc6096e5777
  Building wheel for orjson (pyproject.toml) ... done
  Created wheel for orjson: filename=orjson-3.8.6-cp310-cp310-linux_armv7l.whl size=307962 sha256=287798bad47db17615d5aa484086c464761889f1e3670300455ca114e73017c1
  Stored in directory: /root/.cache/pip/wheels/27/60/74/750046bf9140da2969b4439a56111b0ed3214e18481f0bd7db
  Building wheel for pyyaml (pyproject.toml) ... done
  Created wheel for pyyaml: filename=PyYAML-6.0-cp310-cp310-linux_armv7l.whl size=45331 sha256=d5a116e6c94d6e7f9ba9374e7cc79ea39393461f9a80e5bdf9d8cc3bad5de494
  Stored in directory: /root/.cache/pip/wheels/06/72/f6/3f89f64cf1943a82e42cdd8e59d7b2aa98769fd48b08019fc7
  Building wheel for yarl (pyproject.toml) ... done
  Created wheel for yarl: filename=yarl-1.8.1-cp310-cp310-linux_armv7l.whl size=251195 sha256=a747d014b792db53a1d6fd10743c142dc9a31c000a4dafad2fda863b7dcd9728
  Stored in directory: /root/.cache/pip/wheels/1f/60/e8/b65794eedd75315d15fba8f46db710ce68e823664810ca5ce6
  Building wheel for frozenlist (pyproject.toml) ... done
  Created wheel for frozenlist: filename=frozenlist-1.3.3-cp310-cp310-linux_armv7l.whl size=140954 sha256=58f39c3b7ce9c72b99223e827deb5e787b1bc492d4f597d59e7eb125e391038d
  Stored in directory: /root/.cache/pip/wheels/37/77/4d/2330a0cc3e244d931a9e60859bb6ec66e09558b3806c382983
  Building wheel for multidict (pyproject.toml) ... done
  Created wheel for multidict: filename=multidict-6.0.4-cp310-cp310-linux_armv7l.whl size=108885 sha256=ec576deb44836fc0e71f0f60c84ff2d0446314eef02bfd256dcafce1310f5752
  Stored in directory: /root/.cache/pip/wheels/ae/d2/13/61a3897335dd417ee80bcf70d39fea8eda1f7761d28d5547f5
Successfully built aiohttp bcrypt ciso8601 cryptography home-assistant-bluetooth orjson pyyaml yarl frozenlist multidict
Installing collected packages: voluptuous, text-unidecode, rfc3986, pytz, lru-dict, ifaddr, ciso8601, voluptuous-serialize, urllib3, typing-extensions, sniffio, pyyaml, python-slugify, PyJWT, pycparser, orjson, multidict, MarkupSafe, idna, home-assistant-bluetooth, h11, frozenlist, charset-normalizer, certifi, bcrypt, awesomeversion, attrs, atomicwrites-homeassistant, async-timeout, astral, yarl, requests, jinja2, cffi, anyio, aiosignal, httpcore, cryptography, aiohttp, pyOpenSSL, httpx, homeassistant
  DEPRECATION: lru-dict is being installed using the legacy 'setup.py install' method, because it does not have a 'pyproject.toml' and the 'wheel' package is not installed. pip 23.1 will enforce this behaviour change. A possible replacement is to enable the '--use-pep517' option. Discussion can be found at https://github.com/pypa/pip/issues/8559
  Running setup.py install for lru-dict ... done
  DEPRECATION: python-slugify is being installed using the legacy 'setup.py install' method, because it does not have a 'pyproject.toml' and the 'wheel' package is not installed. pip 23.1 will enforce this behaviour change. A possible replacement is to enable the '--use-pep517' option. Discussion can be found at https://github.com/pypa/pip/issues/8559
  Running setup.py install for python-slugify ... done
  DEPRECATION: MarkupSafe is being installed using the legacy 'setup.py install' method, because it does not have a 'pyproject.toml' and the 'wheel' package is not installed. pip 23.1 will enforce this behaviour change. A possible replacement is to enable the '--use-pep517' option. Discussion can be found at https://github.com/pypa/pip/issues/8559
  Running setup.py install for MarkupSafe ... done
  DEPRECATION: cffi is being installed using the legacy 'setup.py install' method, because it does not have a 'pyproject.toml' and the 'wheel' package is not installed. pip 23.1 will enforce this behaviour change. A possible replacement is to enable the '--use-pep517' option. Discussion can be found at https://github.com/pypa/pip/issues/8559
  Running setup.py install for cffi ... done
  Running setup.py develop for homeassistant
Successfully installed MarkupSafe-2.1.2 PyJWT-2.5.0 aiohttp-3.8.1 aiosignal-1.3.1 anyio-3.6.2 astral-2.2 async-timeout-4.0.2 atomicwrites-homeassistant-1.4.1 attrs-22.2.0 awesomeversion-22.9.0 bcrypt-4.0.1 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-2.1.1 ciso8601-2.3.0 cryptography-39.0.1 frozenlist-1.3.3 h11-0.14.0 home-assistant-bluetooth-1.9.2 homeassistant-2023.2.5 httpcore-0.16.3 httpx-0.23.3 idna-3.4 ifaddr-0.1.7 jinja2-3.1.2 lru-dict-1.1.8 multidict-6.0.4 orjson-3.8.6 pyOpenSSL-23.0.0 pycparser-2.21 python-slugify-4.0.1 pytz-2022.7.1 pyyaml-6.0 requests-2.28.1 rfc3986-1.5.0 sniffio-1.3.0 text-unidecode-1.3 typing-extensions-4.5.0 urllib3-1.26.14 voluptuous-0.13.1 voluptuous-serialize-2.5.0 yarl-1.8.1

List of commands to install Home Assistant into the clean minimal Alpine 3.16 docker image.

apk add bash
bash
bash-5.1# apk add g++ gcc make
bash-5.1# apk add libcap libpcap-dev
bash-5.1# apk add ffmpeg 
bash-5.1# apk add python3
bash-5.1# apk add python3-dev libffi-dev libftdi1-dev bzip2-dev openssl-dev
bash-5.1# apk add cargo
bash-5.1# python3 -m ensurepip --upgrade
bash-5.1# pip3 install aiohttp
bash-5.1# pip3 install ffmpeg
bash-5.1# pip3 install libpcap
bash-5.1# pip3 install tzdata
bash-5.1# pip3 install PyNaCl
bash-5.1# pip3 install homeassistant

To start Home Assistant, run hass.
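
If these commands were run interactively inside a container rather than baked into a Dockerfile, the result can be preserved as a reusable image (illustrative names):

# snapshot the hand-built container so it can be recreated later
docker commit <container-id> ha-alpine316-manual
docker run -d --name homeassistant -v /path/to/config:/config -p 8123:8123 ha-alpine316-manual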

@Wetzel402 I downloaded the Alpine 3.16 official minimal clean docker image (only 5 MB in size): https://hub.docker.com/_/alpine and installed everything in it from scratch myself: Python 3.10.10, g++, gcc, and Home Assistant with the pip install homeassistant command.

After installation I ran the hass script, and Home Assistant 2023.2.5 started and everything works great. This indicates that the docker image https://hub.docker.com/r/homeassistant/home-assistant posted by Home Assistant is not built correctly.

QNAP is armv7l. The problem may be an armv7l <-> armv7hf mismatch: the package architecture (armv7l) does not match the system (armhf). Note also the variants armv7hf-musl, armv7hf, and armv7l-musl.

bash-5.1# cat /proc/cpuinfo
processor       : 0
model name      : Annapurna Labs Alpine AL314 Quad-core ARM Cortex-A15 CPU @ 1.70GHz
Speed           : 1.7GHz
Features        : half thumb fastmult vfp edsp neon vfpv3 tls vfpv4 idiva idivt vfpd32 lpae evtstrm
CPU implementer : 0x41
CPU architecture: 7
CPU variant     : 0x2
CPU part        : 0xc0f
CPU revision    : 4

@Wetzel402

Edit: They also use their own wheels. This makes me suspect it is a wheels issue as some have previously suspected…

Maybe you are right, and they use different wheels, for example for cryptography. I can see by the names that they didn’t use musllinux for the armv7l arch; musllinux is used for the aarch64 and x86_64 arches. Also, the aarch64 and x86_64 arches are built for Python 3.6, while armv7l and armv8l are built for Python 3.10.

[cryptography-39.0.0-cp310-cp310-linux_armv7l.whl](https://wheels.linuxserver.io/alpine-3.16/cryptography-39.0.0-cp310-cp310-linux_armv7l.whl)
[cryptography-39.0.0-cp310-cp310-linux_armv8l.whl](https://wheels.linuxserver.io/alpine-3.16/cryptography-39.0.0-cp310-cp310-linux_armv8l.whl)
[cryptography-39.0.0-cp36-abi3-musllinux_1_1_aarch64.whl](https://wheels.linuxserver.io/alpine-3.16/cryptography-39.0.0-cp36-abi3-musllinux_1_1_aarch64.whl)
[cryptography-39.0.0-cp36-abi3-musllinux_1_1_x86_64.whl](https://wheels.linuxserver.io/alpine-3.16/cryptography-39.0.0-cp36-abi3-musllinux_1_1_x86_64.whl)
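
To see which of those platform tags pip on the NAS will actually accept, pip ships a (still experimental) debug command (illustrative):

pip3 debug --verbose | grep -iE 'musllinux|armv7'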

Debugging the official image gives me an error in ld-musl-armhf.so.1 while importing (for example) numpy or cryptography:

bash-5.1# gdb --args python -c "import sys, numpy; print(numpy.__version__, sys.version)"
(gdb) r
Starting program: /usr/local/bin/python -c import\ sys,\ numpy\;\ print\(numpy.__version__,\ sys.version\)

Program received signal SIGSEGV, Segmentation fault.
0x75fb792c in ?? () from /lib/ld-musl-armhf.so.1

I think we could start from this point: for example, get a simple import of numpy inside the container to run without a segmentation fault.

bash-5.1# python3
Python 3.10.7 (main, Nov 24 2022, 13:02:43) [GCC 11.2.1 20220219] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import numpy
Segmentation fault
bash-5.1#
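
A quick way to sweep the preinstalled packages for the same failure (an illustrative loop, not from the thread; a segfault exits non-zero, so the || branch catches it):

for m in numpy cryptography aiohttp yarl orjson; do
    python3 -c "import $m" >/dev/null 2>&1 && echo "$m OK" || echo "$m FAILED (segfault or import error)"
done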

@magicse, right now I think comparing the repositories to find differences would be a good start. Why does linuxserver.io’s image run when the official one does not?

I was hoping that @Gerigot’s fix would also correct the issue with QNAP but it appears that isn’t the case. More investigation and research is needed…

@Wetzel402

This issue might be resolved finally. If the latest official container still isn’t working you can try linuxserver.io’s container.

linuxserver.io’s container with Home Assistant 2023.2.1 works well.

@magicse can you try with a pre-release version, like the new one from today, 2023.2.0b9?

So that we can ensure that it’s related to problem #75142.

Because the fix was for armv6 and I’m not sure the problem with your device is the same, although it seems similar.

This issue might be resolved finally. If the latest official container still isn’t working you can try linuxserver.io’s container.