core: ruckus_unleashed: integration "Failed to set up" on 2023.6.0

The problem

After upgrading my HA instance to v2023.6.0 yesterday, my Ruckus Unleashed integration stopped working with a “Failed to set up” error on the integration page. Reloading the integration, restarting HA, and rebooting the entire device did not resolve the problem.

If I enable debug logging for the integration, reload the integration, and check the logs I see the following two entries which appear relevant.

The first entry is coming from the Ruckus integration itself:

Logger: homeassistant.config_entries
Source: components/ruckus_unleashed/__init__.py:31
First occurred: June 7, 2023 at 10:13:26 PM (3 occurrences)
Last logged: 10:09:47 AM

Error setting up entry mesh-XXXXXX for ruckus_unleashed
Traceback (most recent call last):
  File "/usr/src/homeassistant/homeassistant/config_entries.py", line 387, in async_setup
    result = await component.async_setup_entry(hass, self)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/homeassistant/homeassistant/components/ruckus_unleashed/__init__.py", line 31, in async_setup_entry
    ruckus = await Ruckus.create(
             ^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/pyruckus/__init__.py", line 44, in create
    await ruckus.connect()
  File "/usr/local/lib/python3.11/site-packages/pyruckus/__init__.py", line 50, in connect
    result = await ssh.login(
             ^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/pyruckus/RuckusSSH.py", line 48, in login
    i = await self.expect(login_regex_array, timeout=login_timeout, async_=True)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/pexpect/spawnbase.py", line 340, in expect
    return self.expect_list(compiled_pattern_list,
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/pexpect/spawnbase.py", line 366, in expect_list
    from ._async import expect_async
  File "/usr/local/lib/python3.11/site-packages/pexpect/_async.py", line 6, in <module>
    @asyncio.coroutine
     ^^^^^^^^^^^^^^^^^
AttributeError: module 'asyncio' has no attribute 'coroutine'
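The root cause of this traceback is that `@asyncio.coroutine` was deprecated in Python 3.8 and removed in Python 3.11 (which the 2023.6 image uses, per the paths above), so merely importing pexpect's `_async.py` now fails. A minimal sketch of the removed pattern versus its `async def` replacement; the function body below is a placeholder, not pexpect's actual logic:

```python
import asyncio
import sys

# pexpect's _async.py wrapped generator functions with @asyncio.coroutine.
# That decorator was removed in Python 3.11, so the module now fails at
# import time. The replacement is a native coroutine:

async def expect_async_modern(value):
    # placeholder body standing in for pexpect's real expect logic
    await asyncio.sleep(0)
    return value

result = asyncio.run(expect_async_modern(42))
print(result)  # 42

# On Python 3.11+ the old decorator is simply gone:
if sys.version_info >= (3, 11):
    assert not hasattr(asyncio, "coroutine")
```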

The second entry doesn’t appear to be directly related to the Ruckus integration, but it appears similar to other outstanding GH issues for the integration:

Logger: homeassistant.util.async_
Source: util/async_.py:166
First occurred: 10:00:24 AM (4 occurrences)
Last logged: 10:09:47 AM

Detected blocking call to sleep inside the event loop. This is causing stability issues. Please report issue for config doing blocking calls at homeassistant/components/config/config_entries.py, line 112: await hass.config_entries.async_reload(entry_id)

What version of Home Assistant Core has the issue?

core-2023.6.0

What was the last working version of Home Assistant Core?

core-2023.5.4

What type of installation are you running?

Home Assistant OS

Integration causing the issue

Ruckus Unleashed

Link to integration documentation on our website

https://www.home-assistant.io/integrations/ruckus_unleashed

Diagnostics information

No response

Example YAML snippet

No response

Anything in the logs that might be useful for us?

No response

Additional information

No response

About this issue

  • Original URL
  • State: closed
  • Created a year ago
  • Reactions: 1
  • Comments: 93 (40 by maintainers)

Most upvoted comments

I can confirm the same issue also exists on 2023.6.1.

@ms264556 Thanks a ton for working on the tests!

I’ll dig into your branch soon and update the PR!

I’ll second your sentiments @Bubbgump209 : @andornaut - You’re a god 😃

For anybody using HassOS, go to the console:

ha > login
# docker exec -it homeassistant /bin/bash
homeassistant:/config# pip install https://github.com/pexpect/pexpect/archive/master.zip
homeassistant:/config# exit
# shutdown -r now

Thanks for the solutions!

Hi @lanrat, I’ve ported over the existing integration’s tests, apart from one which seems to be testing old HA functionality. Hopefully that’s enough to get this accepted.

Can you have a look at my branch here, and apply the changes to your branch?

I created a library which is completely async, using aiohttp, specifically so it could replace pyruckus here (and I put AP LED, client blocking and WLAN disabling into the API). If you don’t want to reinvent the wheel then you might want to have a look.

https://pypi.org/project/aioruckus/

Hi @Coder84619 - In lieu of a legitimate response I can offer: The rewritten integration has been submitted as a PR and passed 49/51 tests. My head is too far underwater to provide insight into the failed tests, but I believe it may be political rather than technical … possibly regarding the ownership of the code base (?!?) I believe the former owner/writer/creator of the integration no longer uses Ruckus devices and is keen to hand over ownership. Under a strict “do not hassle the talent” policy, correspondence is via HA bots. We wait.

If, like myself, your HA automations require Device Tracker presence to trigger events, there are a couple of ways to make the old integration work: Bubbgump209’s surgical rewrite of the offending _async.py file in the pexpect library, or andornaut’s upgrade of the entire pexpect library to a later version. Preferably though, you can spare 10-15 minutes and follow lanrat’s suggestion of pulling the new code down into your custom_components folder to test the new integration that will (hopefully) soon be released to us all once it passes the final tests.

I’m new to much of this, but fortunately have some time to commit to bridging the knowledge gap between those who create and myself, who consumes. If you’re short on time but need the integration up and running, lay out your scenario and I will try to help you get things moving until it all becomes a simple click-and-consume integration once more.

@faithless01 I’m glad you figured it out.

As you found out the hard way, I forgot to mention that you will need to add "version": "0.0.1" to manifest.json.

You’re a beast @lanrat! I’ve raw-copied the files from your repo into my custom_components folder, and all appears well. Is there any way to tell if I’m running the custom version? Enabling debug only provides the following:

2023-06-25 04:35:41.758 DEBUG (MainThread) [homeassistant.components.ruckus_unleashed] Finished fetching ruckus_unleashed data in 0.666 seconds (success: True)

EDIT: I moved the /usr/src/homeassistant/homeassistant/components/ruckus_unleashed folder to my home folder and rebooted, and received the error: Setup failed for ruckus_unleashed: Integration not found. Once I moved the original ruckus_unleashed back and rebooted, all worked as before.

EDIT2: I trashed the device_tracker.py file in the custom_components folder, to brute-force test whether I’m using your new engine, restarted, and everything is working as before … I think I can safely say I’m not using your rewritten integration. I’ll go research how to manually add a custom component 😃

EDIT3: So, I replaced the original /usr/src/homeassistant/homeassistant/components/ruckus_unleashed folder with the contents from your repo, and received the following error: Setup failed for ruckus_unleashed: No setup or config entry setup function defined.

Time for me to undo my attempts and work with what little I understand at the moment. It feels like I’m trying to repair a pocket watch with a sledgehammer 😃

EDIT4: I promise I’ll internalise my journey going forward and leave the dialogue to those with talent… However, to clarify my failings, the root cause of my problems was mistyping __init__.py as _init_.py. Upon correcting the mistake, the logs show the following:

2023-06-25 05:33:15.807 WARNING (SyncWorker_4) [homeassistant.loader] We found a custom integration ruckus_unleashed which has not been tested by Home Assistant. This component might cause stability problems, be sure to disable it if you experience issues with Home Assistant
2023-06-25 05:33:15.808 ERROR (SyncWorker_4) [homeassistant.loader] The custom integration 'ruckus_unleashed' does not have a version key in the manifest file and was blocked from loading. See https://developers.home-assistant.io/blog/2021/01/29/custom-integration-changes#versions for more details

Followed shortly after by:

2023-06-25 05:34:18.055 DEBUG (MainThread) [homeassistant.components.ruckus_unleashed] Finished fetching ruckus_unleashed data in 0.658 seconds (success: True)
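The _init_.py-versus-__init__.py mistake described in EDIT4 can be reproduced with plain importlib; the package name and VALUE below are invented for the demo:

```python
import importlib
import importlib.util
import pathlib
import sys
import tempfile

tmp = pathlib.Path(tempfile.mkdtemp())
pkg = tmp / "ruckus_demo"  # hypothetical package name for the demo
pkg.mkdir()
(pkg / "_init_.py").write_text("VALUE = 1\n")  # the single-underscore typo
sys.path.insert(0, str(tmp))

# With only _init_.py present, Python sees a namespace package with no
# code to load (spec.origin is None), which is why the loader reports
# the integration as missing or lacking a setup function.
spec = importlib.util.find_spec("ruckus_demo")
print(spec.origin)  # None

# Renaming to __init__.py makes it a regular, loadable package:
(pkg / "__init__.py").write_text("VALUE = 1\n")
importlib.invalidate_caches()
mod = importlib.import_module("ruckus_demo")
print(mod.VALUE)  # 1
```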

EDIT5: Wooo! Apologies, but I persist with my sad rambling in the vain hope that some other muppet’s frustrated Google search results in a simple forehead slap and not hours of d*cking around with their slightly blunter sledgehammer. So, as per the error thrown above, I appended a version number to the custom_components/ruckus_unleashed/manifest.json file:

  "requirements": ["aioruckus==0.30", "xmltodict==0.13.0"],
  "version": "0.0.1"
}

…and voila - I was hit with masses of errors for trashing the device_tracker.py file earlier. I recopied the device_tracker.py file, and after a final reboot, a yellow “Custom Integration” icon appeared on my Ruckus Unleashed integration. I have ADDed a HUB, with the ability to ADD more HUBs! 😃 (The text input boxes inside the ADD HUB popup were not labelled ip/user/pw - but that’s probably just me mistyping something.) My Device Trackers are all responding perfectly. Thank You!

If anyone wants a simpler solution to get this integration working until we have the test coverage needed to get my PR merged, you can clone my repo/branch and mount the ruckus_unleashed folder from my repo’s components folder into your custom_components folder to override the built-in broken integration.

This is what I have been doing for the past few weeks and it’s been incredibly stable.

One way to work around this issue is by installing pexpect from the ‘master’ branch:

pip install https://github.com/pexpect/pexpect/archive/master.zip

Here’s an example from an Ansible role that works around this issue:

- name: "Install pexpect from 'master' branch. Workaround 1/2 for: https://github.com/home-assistant/core/issues/94264"
  community.docker.docker_container_exec:
    container: homeassistant
    argv:
      - /bin/bash
      - "-c"
      - "pip install https://github.com/pexpect/pexpect/archive/master.zip"

- name: "Restart the homeassistant container. Workaround 2/2 for: https://github.com/home-assistant/core/issues/94264"
  community.docker.docker_container:
    name: homeassistant
    state: started
    restart: true

I can confirm that downgrading HA core to v2023.5.4 restores the Ruckus Unleashed integration.

Sorry team. I changed the AP unique id in one PR and then fixed this in a second PR.

But they only took my first PR into this month’s release.

So I guess when the 2nd PR goes in next week then you’ll need to delete and re-add the AP again.

On Tue, 26 Sept 2023, 05:15 Coder84619, @.***> wrote:

@Bubbgump209 https://github.com/Bubbgump209 @faithless01 https://github.com/faithless01 Thanks guys…I deleted my AP, re-added it, and now all is well.


I’m trying it out, and it looks like something’s wrong with the UI at least (screenshot omitted):

Home Assistant 2023.7.2 
Supervisor 2023.07.1 
Operating System 10.3 
User interface: 20230705.1 - latest

That said, I think it found everything that’s on my wifi very quickly; it even matched some devices to existing entities and automagically set the rooms those entities are in… hass magic happening again.

It took me less than 10 mins to do; I don’t know why it took me weeks between seeing your comments and trying it out. Thanks a lot to @lanrat for the dev and @faithless01 for the quick guide.

FYI, just in case, my strings.json in the custom component has this:

{
  "config": {
    "step": {
      "user": {
        "data": {
          "host": "[%key:common::config_flow::data::host%]",
          "username": "[%key:common::config_flow::data::username%]",
          "password": "[%key:common::config_flow::data::password%]"
        }
      }
    },
    "error": {
      "cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
      "invalid_auth": "[%key:common::config_flow::error::invalid_auth%]",
      "unknown": "[%key:common::config_flow::error::unknown%]"
    },
    "abort": {
      "already_configured": "[%key:common::config_flow::abort::already_configured_device%]"
    }
  }
}

FWIW, the aioruckus integration seems great here. It was a seamless drop-in replacement, and it survived an update to 2023.7 just fine. For the 5 minutes of effort, yeah, I’d very much encourage folks to install it and rest easy until the rejigger is accepted.

Perhaps it’s just my kludged installation, but my aioruckus integration is not discovering any devices on my slave access points.

I’ll put some time into this tomorrow. I’ll add a note to specifically trace through this situation.

Thanks @andornaut No need to troubleshoot, as monkey patching the _async.py file is currently working. I received the following errors when I clobbered pexpect with the master branch after the last core and OS updates:

RuntimeError: Detected blocking call to sleep inside the event loop. Use `await hass.async_add_executor_job()`; This is causing stability issues. Please report issue
2023-06-20 10:57:50.135 WARNING (MainThread) [homeassistant.config_entries] Config entry 'Mesh-Backbone' for ruckus_unleashed integration not ready yet: Could not establish connection to host; Retrying in background
2023-06-20 10:57:50.640 WARNING (MainThread) [homeassistant.util.async_] Detected blocking call to sleep inside the event loop. This is causing stability issues. Please report issue for hassio doing blocking calls at homeassistant/components/hassio/handler.py, line 534: request = await self.websession.request(
2023-06-20 10:57:57.334 ERROR (MainThread) [homeassistant] Error doing job: Exception in callback _UnixReadPipeTransport._call_connection_lost(OSError(5, 'I/O error'))
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/asyncio/events.py", line 80, in _run
    self._context.run(self._callback, *self._args)
  File "/usr/local/lib/python3.11/asyncio/unix_events.py", line 596, in _call_connection_lost
    self._pipe.close()
  File "/usr/local/lib/python3.11/site-packages/pexpect/pty_spawn.py", line 328, in close
    self.ptyproc.close(force=force)
  File "/usr/local/lib/python3.11/site-packages/ptyprocess/ptyprocess.py", line 403, in close
    time.sleep(self.delayafterclose)
  File "/usr/src/homeassistant/homeassistant/util/async_.py", line 166, in protected_loop_func
    check_loop(func, strict=strict)
  File "/usr/src/homeassistant/homeassistant/util/async_.py", line 123, in check_loop
    raise RuntimeError(
RuntimeError: Detected blocking call to sleep inside the event loop. Use `await hass.async_add_executor_job()`; This is causing stability issues. Please report issue
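The traceback shows ptyprocess calling time.sleep() from close() while the event loop is running. The remedy the error message names is to push blocking work onto a thread pool. A minimal stdlib sketch of that pattern (hass.async_add_executor_job wraps the same mechanism); blocking_close is a stand-in, not the real ptyprocess call:

```python
import asyncio
import time

def blocking_close(delay: float) -> str:
    # stand-in for ptyprocess's close(), which calls time.sleep()
    time.sleep(delay)
    return "closed"

async def main() -> str:
    loop = asyncio.get_running_loop()
    # Offload the blocking call to the default thread pool so the event
    # loop keeps running while the sleep happens on a worker thread.
    return await loop.run_in_executor(None, blocking_close, 0.01)

result = asyncio.run(main())
print(result)  # closed
```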

Install procedure, as per previous working method:

ha > login
# docker exec -it homeassistant /bin/bash
homeassistant:/config# pip install https://github.com/pexpect/pexpect/archive/master.zip
homeassistant:/config# exit
# shutdown -r now

I’ll try to clobber the pexpect library with the master branch again, after the next core/os release and see how it goes. Thanks again - fantastic community!

Massive thank you to @lanrat and @ms264556 for pushing the aioruckus integration PR. (98% there!) I’m struggling to follow the process, but am learning a lot watching you guys work. Thank You.

For anybody still relying on the current broken integration in the interim, (perhaps having automations based on presence detection of wifi devices), you can still mend the broken implementation, using @Bubbgump209 's monkey patched _async.py file above.

The solution from @andornaut to clobber the pexpect library with the master branch has not worked in the last OS update, providing a different set of error messages. (Maybe just me?)

Unfortunately, for those running HASSOS, there is a challenge gaining access to the ‘broken’ Python libraries. While this can be performed relatively easily from a terminal shell, retyping the monkey patched file without copy/paste can be tedious.

Copy/paste is straightforward over SSH, which can be accessed easily from your own host OS, as with a Core, Container or Supervised installation of Home Assistant.

If, however, you are running the HASSOS implementation natively on a Pi, or virtualised in VMware or Proxmox, the native SSH add-on drops you into its own docker container, with no access to the Core container, and therefore no access to the ‘broken’ Python libraries. The following add-on from @adamoutler provides an SSH session to the host OS, from which you can jump into the Home Assistant Core docker container and paste the monkey patch over the broken _async.py file: https://community.home-assistant.io/t/add-on-hassos-ssh-port-22222-configurator/264109

Again - thanks to those higher up the tree that are making this work.

For sure, if you have a good plan/link for best-practices mocking, then I’ll have a go.

But you should definitely be able to mock [sub?]adequately as-is. I whacked the following classes into my test __init__.py:-

class MockSession(AjaxSession):
    """Mock Session"""
    def __init__(
        self,
        websession: aiohttp.ClientSession,
        host: str,
        username: str,
        password: str,
        auto_cleanup_websession=False,
    ) -> None:
        super().__init__(websession, host, username, password, auto_cleanup_websession)
        self.mock_results = {}
        self._api = MockApi(self)

    async def __aenter__(self) -> "AjaxSession":
        return self

    async def __aexit__(self, *exc: Any) -> None:
        pass

    async def login(self) -> None:
        pass

    async def close(self) -> None:
        pass

    @property
    def api(self) -> "MockApi":
        return self._api

    @classmethod
    def async_create(cls, host: str, username: str, password: str) -> "AjaxSession":
        return MockSession(None, host, username, password, True)
    
    async def get_conf_str(self, item: ConfigItem, timeout: int | None = None) -> str:
        return self.mock_results[item.value]

class MockApi(RuckusApi):
    """Mock Session"""
    def __init__(self, session: MockSession):
        self.session = session

    async def get_active_clients(self, interval_stats: bool = False) -> List:
        if interval_stats:
            raise NotImplementedError(self)
        else:
            result_text = self.session.mock_results["active-client-stats1"]
            return self._ruckus_xml_unwrap(result_text, ["client"])

then I was able to throw in some quick’n’dirty mocks for a few methods:-

async with MockSession.async_create("<my ZD ip>", "<my ZD user>", "<my ZD password>") as session:

    session.mock_results[ConfigItem.WLANSVC_LIST.value] = '<?xml version="1.0" encoding="utf-8"?><!DOCTYPE ajax-response><ajax-response><response type="object" id="wlansvc-list.0.5"><wlansvc-list><wlansvc name="MyFirstTestSSID" ssid="MyFirstTestSSID" description="" ofdm-rate-only="true" bss-minrate="11" tx-rate-config="4" authentication="open" encryption="wpa23-mixed" do-802-11w="1" is-guest="false" max-clients-per-radio="100" usage="user" acctsvr-id="0" acct-upd-interval="5" do-802-11d="enabled" do-wmm-ac="disabled" option82="0" option82-opt1="0" option82-opt2="0" option82-opt150="0" option82-opt151="0" option82-areaName="" force-dhcp="0" force-dhcp-timeout="10" dis-dgaf="0" parp="1" authstats="1" sta-info-extraction="1" enable-type="0" idle-timeout="true" max-idle-timeout="3000" called-station-id-type="0" policy-id="" policy6-id="" ci-whitelist-id="0" client-isolation="disabled" acl-id="1" pool-id="" vlan-id="1" https-redirection="disabled" local-bridge="1" dhcpsvr-id="0" bgscan="1" balance="1" band-balance="1" devicepolicy-id="" precedence-id="1" role-based-access-ctrl="false" client-flow-log="disabled" export-client-log="false" wifi6="true" dtim-period="2" directed-mbc="0" transient-client-mgnt="0" id="1"><rrm neighbor-report="enabled" /><smartcast mcast-filter="disabled" /><avp-policy avp-enabled="disabled" avpdeny-id="0" /><urlfiltering-policy urlfiltering-enabled="disabled" urlfiltering-id="0" /><qos uplink-preset="DISABLE" downlink-preset="DISABLE" perssid-uplink-preset="0" perssid-downlink-preset="0" /><wpa cipher="aes" x-sae-passphrase="mypassphrase" sae-passphrase="mypassphrase" x-passphrase="mypassphrase" passphrase="mypassphrase" dynamic-psk="disabled" /><queue-priority voice="0" video="2" data="4" background="6" /><wlan-schedule value="0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0" /></wlansvc><wlansvc name="RuckusWifi" ssid="RuckusWifi" description="" ofdm-rate-only="true" 
bss-minrate="0" tx-rate-config="4" authentication="open" encryption="wpa2" do-802-11w="1" is-guest="false" max-clients-per-radio="100" usage="autonomous" acctsvr-id="0" acct-upd-interval="10" do-802-11d="enabled" do-wmm-ac="disabled" option82="0" option82-opt1="0" option82-opt2="0" option82-opt150="0" option82-opt151="0" option82-areaName="" force-dhcp="0" force-dhcp-timeout="10" dis-dgaf="0" parp="0" authstats="0" sta-info-extraction="1" enable-type="0" idle-timeout="true" max-idle-timeout="300" called-station-id-type="0" policy-id="" policy6-id="" ci-whitelist-id="0" client-isolation="disabled" acl-id="1" pool-id="" vlan-id="1" https-redirection="disabled" local-bridge="1" dhcpsvr-id="0" bgscan="1" balance="1" band-balance="1" fast-bss="disabled" devicepolicy-id="" precedence-id="1" role-based-access-ctrl="false" client-flow-log="disabled" export-client-log="false" dtim-period="3" directed-mbc="0" transient-client-mgnt="0" id="2" wifi6="true"><rrm neighbor-report="disabled" /><smartcast mcast-filter="disabled" /><avp-policy avp-enabled="disabled" avpdeny-id="0" /><urlfiltering-policy urlfiltering-enabled="disabled" urlfiltering-id="0" /><qos uplink-preset="DISABLE" downlink-preset="DISABLE" perssid-uplink-preset="0" perssid-downlink-preset="0" /><wpa cipher="aes" x-passphrase="mypassphrase" passphrase="mypassphrase" dynamic-psk="disabled" /><queue-priority voice="0" video="2" data="4" background="6" /><wlan-schedule value="0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0" /></wlansvc></wlansvc-list></response></ajax-response>'
    session.mock_results[ConfigItem.AP_LIST.value] = '<?xml version="1.0" encoding="utf-8"?><!DOCTYPE ajax-response><ajax-response><response type="object" id="ap-list.0.5"><ap-list><ap mac="8c:7a:15:3e:01:02" application-reboot="2" user-reboot="0" push-reset-reboot="0" kernel-panic-reboot="0" watchdog-reboot="0" powercycle-reboot="1" reboot-reason="power cycle detect" reboot-detail="" rejoin-reason="Heartbeat Loss" mesh-last-good-ssid="67457921423000304" x-mesh-last-good-psk="xLgkZhXhaE-Io5p7YUwbSNxmgUX68xBwWagWCg_5osPGLBGfIfw1AvcaJHH3ouc" id="1" name="8c:7a:15:3e:01:02" devname="MyTestAp" model="r650" description="" location="" coordinate_source="0" gps="-33.52647,102.59308" group-id="7" ipmode="*" as-is="true" as-is-ipv6="true" bonjour-check="false" x-psk="" mesh-enabled="true" mesh-mode="*" max-hops="*" led-off="true" usb-installed="true" usb-port="true" working-radio="0" approved="true" poe-mode-setting="2" port-setting="*" support-11ac="true" poe-mode="2" last-seen="1680750614" ip="192.168.1.65" netmask="255.255.255.0" gateway="192.168.1.1" dns1="192.168.1.1" dns2="" ipv6-addr="fc00::1" ipv6-plen="7" ipv6-gateway="" ipv6-dns1="" ipv6-dns2="" version="10.5.1.0" build-version="240" strong-cert="normal" config-state="3" serial="302139001234" tunnel-mode="2" udp-port="12223" ext-ip="192.168.1.65" ext-ipv6="fc00::1" ext-port="12223" ext-family="2" support-11ax="true" auth-mode="psk" blocked="false" mesh-last-good-psk="xLgkZhXhaE-Io5p7YUwbSNxmgUX68xBwWagWCg_5osPGLBGfIfw1AvcaJHH3ouc" psk=""><radio radio-type="11na" ieee80211-radio-type="a/n" radio-id="1" channel="*" channel_seg2="*" tx-power="*" wmm-ac="*" prot-mode="*" vap-enabled="*" wlangroup-id="*" channel-select="*" enabled="1" channelization="*" /><radio radio-type="11ng" ieee80211-radio-type="g/n" radio-id="0" channel="*" channel_seg2="*" tx-power="*" wmm-ac="*" prot-mode="*" vap-enabled="*" wlangroup-id="*" channel-select="*" enabled="1" channelization="*" /><adv-mesh apply-acl="false" /><ports port-num="2" 
acctsvr-id="0" authsvr-id="0" mac-auth="false" supplicant="mac"><port id="1" enabled="true" tunnel="false" opt82="false" uplink="access" untag="1" members="1" guestvlan="" dvlan="false" dot1x="disabled" /><port id="2" enabled="true" tunnel="false" opt82="false" uplink="access" untag="1" members="1" guestvlan="" dvlan="false" dot1x="disabled" /></ports><venue-names /><bonjourfencing enable="*" policy="*" /></ap><ap mac="24:79:2a:3d:01:02" last-seen="1638930560" ip="192.168.1.2" netmask="255.255.255.0" gateway="192.168.1.1" dns1="202.180.64.10" dns2="202.180.64.11" ipv6-addr="2404:4404:2735:4e00:2679:2aff:fe3d:d70" ipv6-plen="64" ipv6-gateway="fe80::faca:59ff:febf:8ff1" ipv6-dns1="" ipv6-dns2="" application-reboot="17" user-reboot="0" push-reset-reboot="0" kernel-panic-reboot="0" watchdog-reboot="0" powercycle-reboot="24" reboot-reason="target fail detect" reboot-detail="" rejoin-reason="AP Restart" mesh-last-good-ssid="67457921423000304" x-mesh-last-good-psk="xLgkZhXhaE-Io5p7YUwbSNxmgUX68xBwWagWCg_5osPGLBGfIfw1AvcaJHH3ouc" ext-ip="119.224.68.105" ext-port="12223" ext-ipv6="2404:4404:2735:4e00:2679:2aff:fe3d:d70" ext-family="2" tunnel-mode="2" version="10.5.0.0" build-version="212" strong-cert="updated" id="2" name="24:79:2a:3d:01:02" devname="MyOtherTestAp" model="r600" description="" location="44a The Test Road" coordinate_source="0" gps="-33.582834,102.567143" group-id="4" ipmode="*" by-dhcp="true" as-is="false" as-is-ipv6="true" bonjour-check="false" x-psk="" mesh-enabled="true" mesh-mode="*" max-hops="*" led-off="*" usb-installed="false" working-radio="0" approved="true" port-setting="*" support-11ac="true" config-state="3" poe-mode="0" serial="241703001234" udp-port="42530" auth-mode="psk" cband-chann="*" cband-license="*" mesh-last-good-psk="xLgkZhXhaE-Io5p7YUwbSNxmgUX68xBwWagWCg_5osPGLBGfIfw1AvcaJHH3ouc" psk=""><radio radio-type="11na" radio-id="1" channel="*" tx-power="*" wlangroup-id="*" wmm-ac="*" vap-enabled="*" channel-select="*" channelization="*" 
ieee80211-radio-type="a/n" enabled="1" channel_seg2="*" prot-mode="*" /><radio radio-type="11ng" radio-id="0" channel="*" tx-power="*" wlangroup-id="*" wmm-ac="*" vap-enabled="*" channel-select="*" channelization="*" ieee80211-radio-type="g/n" enabled="1" prot-mode="*" /><adv-mesh apply-acl="false" channel="*" channelization="*" channel-select="*" /><ports port-num="2" acctsvr-id="0" authsvr-id="0" mac-auth="false" supplicant="mac" channel="*" channelization="*" channel-select="*"><port id="1" enabled="true" tunnel="false" opt82="false" uplink="access" untag="1" members="1" guestvlan="" dvlan="false" dot1x="disabled" igmp-snooping="disable" /><port id="2" enabled="true" tunnel="false" opt82="false" uplink="access" untag="1" members="1" guestvlan="" dvlan="false" dot1x="disabled" /></ports><venue-names channel="*" channelization="*" channel-select="*" /><bonjourfencing enable="*" policy="*" channel="*" channelization="*" channel-select="*" /></ap></ap-list></response></ajax-response>'
    session.mock_results["active-client-stats1"] = '<?xml version="1.0" encoding="utf-8"?><!DOCTYPE ajax-response><ajax-response><response type="object" id="DEH"><apstamgr-stat><client mac="Rec:0e:c4:af:01:02" vap-mac="8c:7a:15:3e:01:02" vap-nasid="8C-7A-15-3E-01-02" wlan-id="1" ap-name="MyTestAp" status="1" ext-status="0" first-assoc="1687043257" vlan="1" called-station-id-type="0" ssid="MyFirstTestSSID" favourite="0" blocked="0" wlan="MyFirstTestSSID" role-id="0" channel="1" description="" dvcinfo-group="9" channelization="20" ieee80211-radio-type="g/n" radio-type-text="11ng" rssi="49" received-signal-strength="-47" noise-floor="-96" num-interval-stats="0" location="" auth-method="Open" acct-multi-session-id="8c7a153e21d8ec0ec4af09fd648e3cb9100a" acct-session-id="648d4188-a7c" ap="8c:7a:15:3e:01:02" dpsk-id="0" user="" ip="192.168.1.45" ipv6="" dvcinfo="Gaming" dvctype="Gaming" model="Playstation 2" hostname="ec:0e:c4:af:01:02" oldname="Rec:0e:c4:af:01:02" radio-type="11ng" rssi-level="excellent" encryption="WPA2" /><client mac="30:e3:7a:7d:01:02" vap-mac="8c:7a:15:3e:01:02" vap-nasid="8C-7A-15-3E-01-02" wlan-id="1" ap-name="MyTestAp" status="1" ext-status="0" first-assoc="1687040776" vlan="1" called-station-id-type="0" ssid="MyFirstTestSSID" favourite="0" blocked="0" wlan="MyFirstTestSSID" role-id="0" channel="112" description="" dvcinfo-group="1" channelization="80" ieee80211-radio-type="a/n/ac" radio-type-text="11ac" rssi="38" received-signal-strength="-58" noise-floor="-96" num-interval-stats="0" location="" auth-method="Open" acct-multi-session-id="8c7a153e21dc30e37a7df17c648e33080ff5" acct-session-id="648d4188-a7b" ap="8c:7a:15:3e:01:02" dpsk-id="0" user="" ip="192.168.1.92" ipv6="" dvcinfo="Windows" dvctype="Laptop" model="Microsoft Windows/Windows 10.0.0" hostname="My-Test-PC" oldname="My-Test-PC" radio-type="11ac" rssi-level="excellent" encryption="WPA2" /><client mac="1a:97:a7:bf:01:02" vap-mac="2c:c5:d3:46:01:02" vap-nasid="2C-C5-D3-46-01-02" 
wlan-id="7" ap-name="Mum" status="1" ext-status="0" first-assoc="1687061025" vlan="1" called-station-id-type="0" ssid="RuckusWifi" favourite="0" blocked="0" wlan="RuckusWifi" role-id="0" channel="40" description="" dvcinfo-group="3" channelization="80" ieee80211-radio-type="a/n/ac" radio-type-text="11ac" rssi="39" received-signal-strength="-66" noise-floor="-105" num-interval-stats="0" location="" auth-method="Open" acct-multi-session-id="2cc5d38685d81a97a7bf802e648e82211082" acct-session-id="64e8cf09-ee" ap="2c:c5:d3:06:01:02" dpsk-id="0" user="" ip="192.168.7.124" ipv6="" dvcinfo="Apple iOS" dvctype="Smartphone" model="iOS Phone" hostname="1a:97:a7:bf:01:02" oldname="1a:97:a7:bf:01:02" radio-type="11ac" rssi-level="excellent" encryption="WPA3" /></apstamgr-stat></response></ajax-response>'

    ruckus = session.api
    wlans = await ruckus.get_wlans()
    aps = await ruckus.get_aps()
    active_clients = await ruckus.get_active_clients()

(edited to add login and close methods, since you directly use these)

Honestly, I didn’t 100% understand what HA required to be tested, so I failed miserably at that part of the PR.

@ms264556 I’ve pushed my current WIP changes to my fork here: https://github.com/lanrat/hass_core/tree/ruckus_unleashed_py3.11

Nevermind. I can repro the issue if I try to call AjaxSession.async_create() directly instead of using it as an async context manager.

Let me work up some code which works in that way.

Otherwise I’ll create a new aioruckus release which includes a non-async context, so this usage will work OK.
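A sketch of the shape being discussed: a session factory that works both as an async context manager and via direct login()/close() calls. All names here are illustrative, not aioruckus's actual API:

```python
import asyncio

class Session:
    """Illustrative session supporting both usage styles."""

    def __init__(self, host):
        self.host = host
        self.connected = False

    async def login(self):
        self.connected = True  # a real client would authenticate here
        return self

    async def close(self):
        self.connected = False

    async def __aenter__(self):
        # context-manager entry just delegates to login()
        return await self.login()

    async def __aexit__(self, *exc):
        await self.close()

async def main():
    # style 1: async context manager
    async with Session("192.168.0.1") as s:
        assert s.connected
    assert not s.connected  # closed on exit

    # style 2: direct calls, which also work because login()/close()
    # are exposed separately from the context-manager protocol
    s = Session("192.168.0.1")
    await s.login()
    assert s.connected
    await s.close()
    return True

ok = asyncio.run(main())
print(ok)  # True
```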

It was against an older version of aioruckus, which looked quite different (actually, looked quite like your current code 😀), and I killed it when I got feedback that the tests were no good and I should update pyruckus rather than start a new library.

You can see my attempt here:-

https://github.com/home-assistant/core/compare/dev...ms264556:home-assistant-core:ruckus_unleashed_updates

aioruckus looks better than my code and is what I think would be a good path forward.

I can start a draft PR to move this component to it, unless @ms264556 wants to do it.

@ms264556 You did a great job. The SSH CLI has some limitations; for example, it cannot add blocked clients to the system acl-list, but your solution can. This helped me create a switch for blocking a client.

I’ve been working on a python library to parse connected clients from Ruckus Unleashed APs using the web API, instead of scraping the SSH output like pyruckus does. So far I’ve only tested it with my R500 and R600, but if anyone else wants to help me test it, it should be far more stable than pyruckus and I can submit a PR to use it here.

https://github.com/lanrat/ruckus-clients

@faithless01 as I explained above, this is an issue with upstream Python. Newer Python versions changed some internals, and in turn this broke pexpect, the library all of these integrations use to drive SSH and pull back data structures. pexpect is a generally available Python library and has no relation whatsoever to Home Assistant. Interestingly, the master branch is ahead of the 4.8 release but still reports itself as 4.8 — otherwise this would be a very simple fix: bump the version requirement in pyruckus.
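For context on the root cause (my reading of it, so treat this as an assumption): pexpect 4.8’s `_async.py` still used the legacy `@asyncio.coroutine` decorator, which was deprecated in Python 3.8 and removed in 3.11 — and the paths in the traceback show HA 2023.6 runs Python 3.11. You can check your interpreter directly:

```python
import asyncio
import sys

# asyncio.coroutine was deprecated in Python 3.8 and removed in 3.11.
# Any library still relying on it breaks on a 3.11 runtime.
print(sys.version_info[:2], hasattr(asyncio, "coroutine"))
```

On Python 3.11+ this prints `False` for the second value, which is exactly why the monkey patch below rewrites the module with native `async def` syntax.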

So see above as far as the options go: wait for pexpect to merge the fix and put out a new release (4.9?), rewrite the integration to use different libraries, or monkey patch while waiting.

@andornaut - are you doing this from the HassOS installation, or a supervised install, like @Bubbgump209 ?

I’m running Home Assistant in a Docker container. A similar approach should work for a HassOS installation, but I haven’t used HassOS myself, sorry.

@andornaut You’re a god. I was about to start creating a playbook for this as I have a feeling this won’t be fixed upstream for quite some time.

The issue is 100% in pexpect, within _async.py. I monkey patched the file in place:

/usr/local/lib/python3.11/site-packages/pexpect/_async.py

import asyncio
import errno

from pexpect import EOF

async def expect_async(expecter, timeout=None):
    # First process data that was previously read - if it matches, we don't need
    # async stuff.
    previously_read = expecter.spawn.buffer
    expecter.spawn._buffer = expecter.spawn.buffer_type()
    expecter.spawn._before = expecter.spawn.buffer_type()
    idx = expecter.new_data(previously_read)
    if idx is not None:
        return idx
    if not expecter.spawn.async_pw_transport:
        pw = PatternWaiter()
        pw.set_expecter(expecter)
        transport, pw = await asyncio.get_event_loop()\
            .connect_read_pipe(lambda: pw, expecter.spawn)
        expecter.spawn.async_pw_transport = pw, transport
    else:
        pw, transport = expecter.spawn.async_pw_transport
        pw.set_expecter(expecter)
        transport.resume_reading()
    try:
        return (await asyncio.wait_for(pw.fut, timeout))
    except asyncio.TimeoutError as e:
        transport.pause_reading()
        return expecter.timeout(e)


class PatternWaiter(asyncio.Protocol):
    transport = None

    def set_expecter(self, expecter):
        self.expecter = expecter
        self.fut = asyncio.Future()

    def found(self, result):
        if not self.fut.done():
            self.fut.set_result(result)
            self.transport.pause_reading()
    
    def error(self, exc):
        if not self.fut.done():
            self.fut.set_exception(exc)
            self.transport.pause_reading()

    def connection_made(self, transport):
        self.transport = transport
    
    def data_received(self, data):
        spawn = self.expecter.spawn
        s = spawn._decoder.decode(data)
        spawn._log(s, 'read')

        if self.fut.done():
            spawn._buffer.write(s)
            return

        try:
            index = self.expecter.new_data(s)
            if index is not None:
                # Found a match
                self.found(index)
        except Exception as e:
            self.expecter.errored()
            self.error(e)
    
    def eof_received(self):
        # N.B. If this gets called, async will close the pipe (the spawn object)
        # for us
        try:
            self.expecter.spawn.flag_eof = True
            index = self.expecter.eof()
        except EOF as e:
            self.error(e)
        else:
            self.found(index)
    
    def connection_lost(self, exc):
        if isinstance(exc, OSError) and exc.errno == errno.EIO:
            # We may get here without eof_received being called, e.g on Linux
            self.eof_received()
        elif exc is not None:
            self.error(exc)

This works perfectly. The bigger issue seems to be upstream: there’s a blocker in pexpect where they’re waiting for PyPy to be updated so that the unit tests will pass. Here’s the PyPy issue https://foss.heptapod.net/pypy/pypy/-/issues/3931 which is blocking the pexpect fix https://github.com/pexpect/pexpect/pull/732.

Essentially this is a bit of dependency hell.

The quick fix is my monkey patch, if folks feel comfortable running a shell into the homeassistant container. EDIT: Understand that if you do monkey patch, you’ll need to reapply it after every HA upgrade until upstream is fixed, as the new container will nuke the change. /EDIT
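When reapplying the patch after an upgrade, the first step is finding where pip installed pexpect inside the new container. Here’s a small helper I’d use for that — the function name is mine, not part of any HA tooling:

```python
import importlib.util

def module_dir(name):
    """Return the directory a package is installed in, or None if absent."""
    spec = importlib.util.find_spec(name)
    if spec is None or not spec.submodule_search_locations:
        return None
    return list(spec.submodule_search_locations)[0]

# Inside the HA container you'd ask for "pexpect" and overwrite
# module_dir("pexpect") + "/_async.py" with the patched file.
# Demonstrated here with asyncio, which is always available:
print(module_dir("asyncio"))
```

Running this with the container’s own Python guarantees you patch the same site-packages the integration actually imports from.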

The longer fix is to wait for the whole dependency situation to work itself out, OR rewrite the integration to use an alternative to pexpect (Fabric?) that is better maintained, OR fork pexpect… or 100 other workarounds.