core: Monoprice 6-Zone Amplifier fails after updating to 2023.11.1

The problem

• Home Assistant 2023.10.5
• Supervisor 2023.10.1
• Operating System 11.1
• Frontend 20231005.0 (latest)

(These are the versions running now, after the rollback described below.)

I’m running this as a VM on Unraid (qcow2, I believe).

The integration failing: Monoprice

What version of Home Assistant Core has the issue?

2023.11.1

What was the last working version of Home Assistant Core?

2023.10.5

What type of installation are you running?

Home Assistant OS

Integration causing the issue

Monoprice

Link to integration documentation on our website

https://www.home-assistant.io/integrations/monoprice/

Diagnostics information

There seems to be no debug link on the integration page. I have already rolled back to the previous HAOS to avoid the problem.

Example YAML snippet

No response

Anything in the logs that might be useful for us?

2023-11-02 08:45:04.448 ERROR (Recorder) [homeassistant] Error doing job: Task exception was never retrieved
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/serial/urlhandler/protocol_socket.py", line 203, in write
    n = self._socket.send(d)
        ^^^^^^^^^^^^^^^^^^^^
BrokenPipeError: [Errno 32] Broken pipe

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/src/homeassistant/homeassistant/helpers/entity.py", line 1219, in async_request_call
    return await coro
           ^^^^^^^^^^
  File "/usr/src/homeassistant/homeassistant/helpers/service.py", line 948, in _handle_entity_call
    result = await task
             ^^^^^^^^^^
  File "/usr/src/homeassistant/homeassistant/components/media_player/__init__.py", line 715, in async_turn_on
    await self.hass.async_add_executor_job(self.turn_on)
  File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/homeassistant/homeassistant/components/monoprice/media_player.py", line 200, in turn_on
    self._monoprice.set_power(self._zone_id, True)
  File "/usr/local/lib/python3.11/site-packages/pymonoprice/__init__.py", line 36, in wrapper
    return func(self, *args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/pymonoprice/__init__.py", line 207, in set_power
    self._process_request(_format_set_power(zone, power))
  File "/usr/local/lib/python3.11/site-packages/pymonoprice/__init__.py", line 153, in _process_request
    self._send_request(request)
  File "/usr/local/lib/python3.11/site-packages/pymonoprice/__init__.py", line 144, in _send_request
    self._port.write(request)
  File "/usr/local/lib/python3.11/site-packages/serial/urlhandler/protocol_socket.py", line 231, in write
    raise SerialException('write failed: {}'.format(e))
serial.serialutil.SerialException: write failed: [Errno 32] Broken pipe

Additional information

Strange behavior: the zone media_player entities (6 enabled, the remaining 12 disabled) fail to respond to any controls, and every command produces the BrokenPipeError: [Errno 32] shown in the log above. (A minimal sketch of the failing write path follows.)
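For context on what the traceback shows: pyserial's socket:// URL handler carries the serial bytes over TCP to the RS232-to-Ethernet adapter, and once that TCP session dies, the next write fails with BrokenPipeError, which pyserial re-raises as SerialException. A minimal sketch of that path, assuming a hypothetical adapter address and an illustrative Monoprice command string:

import serial

# pyserial's socket:// handler (the protocol_socket.py in the traceback)
# tunnels serial traffic over TCP; the address here is an assumption.
port = serial.serial_for_url("socket://192.168.1.50:4999", timeout=2)
try:
    port.write(b"<11PR01\r")  # illustrative "power on zone 11" command
except serial.SerialException as exc:
    # Once the adapter drops the TCP session, socket.send() raises
    # BrokenPipeError, which pyserial wraps as SerialException:
    # the exact chain in the log above.
    print(f"write failed: {exc}")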

If I delete the integration and reconfigure it from scratch, it works for a short period of time, and then fails again.
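That behavior is consistent with a stale socket: re-adding the integration opens a fresh TCP connection to the adapter, and nothing in the write path reopens it once the session drops. A hedged sketch (a hypothetical wrapper, not part of pymonoprice or the integration) of the reconnect-on-failure logic that would mask such drops:

import serial
from pymonoprice import get_monoprice

PORT_URL = "socket://192.168.1.50:4999"  # hypothetical adapter address

class ReconnectingAmp:
    """Retries a command once on SerialException by reopening the port."""

    def __init__(self, url: str) -> None:
        self._url = url
        self._amp = get_monoprice(url)

    def set_power(self, zone: int, power: bool) -> None:
        try:
            self._amp.set_power(zone, power)
        except serial.SerialException:
            # Stale socket: rebuild the connection once and retry.
            self._amp = get_monoprice(self._url)
            self._amp.set_power(zone, power)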

Since there’s an RS232-to-Ethernet adapter between the Monoprice amp and Home Assistant, I thought that might be the problem, but it appears to be working as expected, and no amount of rebooting those devices changes anything. (A standalone test of the adapter is sketched below.)
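For anyone wanting to rule the adapter in or out independently of Home Assistant, one rough test (the address, port, query string, and idle interval are all assumptions) is to hold a raw TCP session open and query a zone across long idle gaps, since an idle timeout on the adapter would produce exactly this broken-pipe pattern:

import socket
import time

# Hypothetical adapter address/port; "?11" queries zone 11 status in the
# Monoprice protocol, and the 60 s gap is an arbitrary idle interval.
with socket.create_connection(("192.168.1.50", 4999), timeout=5) as sock:
    for attempt in range(3):
        sock.sendall(b"?11\r")
        print(attempt, sock.recv(128))  # a healthy session returns a status line
        time.sleep(60)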

I crossed my fingers, rolled back to my most recent version with a full restore, and everything was back to normal, except for one big, unrelated problem it revealed: the full restore wiped out the mapped Media folder in my Storage setup, which lives on my NAS. I’ll save that issue for another day.

About this issue

  • State: closed
  • Created 8 months ago
  • Comments: 23 (8 by maintainers)

Most upvoted comments

I agree. Sounds likely. In every case it has resolved itself with an upgrade, but it certainly does color the stability experience. Thanks for the quick response.