core: Unable to find Vera entities
The problem
Using Vera Pro with the HASS integration (configured via YAML). Occasionally entities stop responding to on/off commands, and the logs say that HASS is unable to find the referenced entities. When I log in to the Vera directly, everything works normally. After restarting HASS the entities are usually available again, but they stop working again after some time.
Environment
arch | x86_64
dev | false
docker | false
hassio | false
os_name | Linux
os_version | 4.4.0-173-generic
python_version | 3.8.2
timezone | America/New_York
version | 0.107.5
virtualenv | true
- Home Assistant release with the issue: 0.107.5
- Last working Home Assistant release (if known): 0.106.4
- Operating environment (Hass.io/Docker/Windows/etc.): Ubuntu Server 16.04
- Integration causing this issue: Vera
- Link to integration documentation on our website: https://www.home-assistant.io/integrations/vera/
Problem-relevant configuration.yaml
```yaml
vera:
  vera_controller_url: http://[VERA IP]
```
Traceback/Error logs
Log Details (WARNING)
Logger: homeassistant.helpers.service
Source: helpers/service.py:371
First occurred: March 23, 2020, 8:04:45 PM (8 occurrences)
Last logged: 7:27:07 AM
Unable to find referenced entities switch.man_cave_lights_133
Unable to find referenced entities switch.back_spotlight_118, switch.deck_lights_108, switch.front_porch_115, switch.front_spotlight_135
Unable to find referenced entities switch.front_porch_115
Unable to find referenced entities switch.front_spotlight_135
Unable to find referenced entities switch.back_spotlight_118
Additional information
About this issue
- Original URL
- State: closed
- Created 4 years ago
- Comments: 26 (13 by maintainers)
I submitted a PR that I suspect will fix the issue of entities not getting updated. Since I have been unable to reproduce the issue, please test out the PR if you can. https://github.com/home-assistant/core/pull/35703
Based on the responses from many users, this issue appears to be caused either by the controller not returning accurate incremental data, or by a race condition between threads that could result in data being squashed. At any rate, this PR ensures that all status data is retrieved from the controller (not only incremental updates) and that all of this is done using proper hass event loops.
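To illustrate why full polls fix this class of bug: merging incremental deltas leaves state permanently stale if the controller omits a change or a racing thread squashes one, while a full snapshot is self-correcting on the next poll. A schematic sketch of the difference (illustrative only, not the actual pyvera API):

```python
# Illustrative sketch: incremental merges can drift out of sync,
# full snapshots cannot. Not the real pyvera/Home Assistant code.

class DeviceState:
    def __init__(self):
        self.states = {}  # device_id -> "on" / "off"

    def apply_incremental(self, changes):
        # Merges only the devices the controller reports as changed.
        # If a change is missing from the delta (bad controller data)
        # or overwritten by a racing update, the map silently drifts.
        self.states.update(changes)

    def apply_full(self, snapshot):
        # Replaces the whole mapping with the controller's full status,
        # so any previously missed or squashed update is corrected.
        self.states = dict(snapshot)
```

Usage: after a missed delta, one `apply_full` call brings the local map back in line with the controller.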
It is in the latest betas of 109. I tested home-assistant-109.0.dev20200411 and dev20200413 (in Docker), but the option to include devices (switches) as lights didn't work, which breaks about 40% of my devices. Excluding specific devices didn't work either: I excluded device 382 (the second half of a double switch I'm not using) and it was still imported and added to the GUI.
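For anyone comparing against their own YAML setup: the `lights` and `exclude` options documented for the Vera integration both take Vera device IDs. A hedged example (382 is the device from the comment above; the other IDs are illustrative):

```yaml
vera:
  vera_controller_url: http://[VERA IP]
  lights: [15, 17]   # Vera switch device IDs to expose as lights
  exclude: [382]     # Vera device IDs to skip entirely
```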
But adding Vera using the integration is much easier than YAML, particularly since you can delete and re-add Vera multiple times WITHOUT restarting Home Assistant!
Thanks for your work on this though, Robert. There are still a few of us who haven't thrown in the towel with Vera just yet (although, to be fair, I'm getting close).
On Sun, Apr 12, 2020 at 3:57 PM Robert Van Gorkom notifications@github.com wrote:
This PR adds support for multiple Vera hubs and also fixes the unique entity error (for newly added hubs only). You will have to remove and re-add your old hubs to get the fix applied. There is no migration option possible.
https://github.com/home-assistant/core/pull/33613
I have also been having this issue intermittently for the past 2 months, more often in the past 5 days or so. When it happens, I also see that Home Assistant tries to add the entity again. Example:

homeassistant.exceptions.HomeAssistantError: Entity id already exists: light.office_light_113. Platform vera does not generate unique IDs

I can still control other Vera devices with no issues. Sometimes it will add a duplicate entity by appending `_1` to it, so in the above example I end up with 2 entities in Home Assistant mapped back to the same device in Vera, like `light.office_light_113` and `light.office_light_113_1`.

About a month ago I tried removing Vera completely, cleaned up all the entities from Home Assistant, and added Vera back. Everything was imported correctly and I had no issues for about 2 weeks.