tesla: Tesla Vehicle Command Protocol required - Performing actions (such as lock, unlock etc) fails with a traceback

Version of the custom_component

v3.19.3

Configuration

Not sure what to attach here; I’m using the default config for everything.


Describe the bug

This is a fresh install of Home Assistant on the ‘Yellow’ board, and of this plugin. I’m able to sync data from Tesla correctly, but performing any action such as locking/unlocking the doors, setting the charge limit, etc. doesn’t seem to work. The traceback from the logs is attached below.

Debug log


2023-11-12 17:10:09.296 ERROR (MainThread) [homeassistant.components.websocket_api.http.connection] [548106961344] 
Traceback (most recent call last):
  File "/usr/src/homeassistant/homeassistant/components/websocket_api/commands.py", line 230, in handle_call_service
    await hass.services.async_call(
  File "/usr/src/homeassistant/homeassistant/core.py", line 2035, in async_call
    response_data = await coro
                    ^^^^^^^^^^
  File "/usr/src/homeassistant/homeassistant/core.py", line 2072, in _execute_service
    return await target(service_call)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/homeassistant/homeassistant/helpers/entity_component.py", line 235, in handle_service
    return await service.entity_service_call(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/homeassistant/homeassistant/helpers/service.py", line 876, in entity_service_call
    response_data = await _handle_entity_call(
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/homeassistant/homeassistant/helpers/service.py", line 948, in _handle_entity_call
    result = await task
             ^^^^^^^^^^
  File "/usr/src/homeassistant/homeassistant/components/lock/__init__.py", line 99, in _async_lock
    await entity.async_lock(**remove_entity_service_fields(service_call))
  File "/config/custom_components/tesla_custom/lock.py", line 36, in async_lock
    await self._car.lock()
  File "/usr/local/lib/python3.11/site-packages/teslajsonpy/car.py", line 854, in lock
    data = await self._send_command("LOCK")
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/teslajsonpy/car.py", line 775, in _send_command
    raise ex
  File "/usr/local/lib/python3.11/site-packages/teslajsonpy/car.py", line 760, in _send_command
    data = await self._controller.api(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/teslajsonpy/controller.py", line 1311, in api
    response = await self.__post_with_retries(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/tenacity/_asyncio.py", line 88, in async_wrapped
    return await fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/tenacity/_asyncio.py", line 47, in __call__
    do = self.iter(retry_state=retry_state)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/tenacity/__init__.py", line 325, in iter
    raise retry_exc.reraise()
          ^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/tenacity/__init__.py", line 158, in reraise
    raise self.last_attempt.result()
          ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/concurrent/futures/_base.py", line 449, in result
    return self.__get_result()
           ^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/concurrent/futures/_base.py", line 401, in __get_result
    raise self._exception
  File "/usr/local/lib/python3.11/site-packages/tenacity/_asyncio.py", line 50, in __call__
    result = await fn(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/teslajsonpy/controller.py", line 1334, in __post_with_retries
    return await self.__connection.post(command, method=method, data=data, url=url)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/teslajsonpy/connection.py", line 165, in post
    return await self.__open(url, method=method, headers=self.head, data=data)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/teslajsonpy/connection.py", line 216, in __open
    raise TeslaException(resp.status_code)
teslajsonpy.exceptions.TeslaException

About this issue

  • State: open
  • Created 8 months ago
  • Reactions: 3
  • Comments: 86 (11 by maintainers)

Most upvoted comments

Recent core added the Tessie integration and it works.

But it requires a Tessie subscription.

Unfortunately you’ll probably have to wait for someone with the new car that requires this to submit a PR or workaround. I’m not planning on buying a new Tesla (and doubt anyone would buy one for me 😉 ) and the old API will continue to work for me.

It won’t take long for yours. Update from ChargeHQ below:

UPDATE 2023-11-24

Tesla have announced that the old API will stop working on vehicles as follows:

November 2023 - newly delivered vehicles will only support the new API
Nov - Dec 2023 - the old API will stop working on existing vehicles that have not used the old API in the preceding 30 days
January 2024 - the old API will stop working on all vehicles

A third solution that scales (but with different tradeoffs):

  • Use the BLE integration from https://github.com/teslamotors/vehicle-command to have local polling of your Tesla. Benefit: local polling. Drawback: it only works when your Tesla is at home and you have BLE range where it’s parked.

@tugakufi The teslajsonpy fix has just been released; we’re waiting on llamafilm to finalize the changes to this integration. If all goes well it should be ready this weekend. You can keep an eye on this PR: https://github.com/alandtse/tesla/pull/853

Or you could pull my fork of this repo (https://github.com/thierryvt/tesla) if you really can’t wait.

Do however note that the new API set-up is quite a bit more involved than the current one.

Update for those who had issues with the refresh button not working: the latest update of this integration fixes that (just tested it myself - M3 2023).

Yes, especially the part about the log error to look for, which is not the log error that I’m getting. And I don’t have a new car. People keep insisting that all issues raised tie back to the fleet API.

And if they do, the README docs should be updated to reflect that.

If it is an auth issue and the fleet API does work for you, you can make a pull request to update the docs.

It would be great if we could easily signpost in error messages that people are hitting the API issue, but given the amount of work that was initially done just to get the fleet API working we’re lucky to even have what we’ve got.

Also, you say you don’t have a new car, but from what Tesla has said, at some point most models of car will have to migrate over. Some of us have been fortunate enough that it hasn’t happened yet (Model 3, 2023 version here), but it is entirely down to Tesla and how they go about forcing users from one system to the other.

@alandtse I’m still trying to work it all out, but I think teslajsonpy does not need any changes for my proxy to work. I can make it trust the local cert by changing util.py in this addon:

    import ssl
    # load_verify_locations() mutates the context in place and returns None,
    # so keep the context object itself as SSL_CONTEXT
    context = ssl.create_default_context()
    context.load_verify_locations(cafile='/share/tesla/selfsigned.pem')
    SSL_CONTEXT = context

It looks like this integration tries to exchange the refresh token for an access token, but the HTTP Proxy does not support that. So we need to split it into 2 different URLs. First hit https://auth.tesla.com/oauth2/v3/token to exchange the refresh token for an access token. Then use that access token to send commands to the local proxy. Does that sound reasonable?
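
For illustration, the split flow I have in mind could look roughly like this. It’s a minimal sketch, not this integration’s or the add-on’s actual code; the client_id value, the proxy hostname/port, and the command path are assumptions:

    import ssl
    import aiohttp

    async def get_access_token(refresh_token: str) -> str:
        # Step 1: exchange the refresh token for an access token at Tesla's auth endpoint
        async with aiohttp.ClientSession() as session:
            async with session.post(
                "https://auth.tesla.com/oauth2/v3/token",
                json={
                    "grant_type": "refresh_token",
                    "client_id": "ownerapi",  # assumption; a fleet setup would use its own client_id
                    "refresh_token": refresh_token,
                },
            ) as resp:
                return (await resp.json())["access_token"]

    async def send_command(access_token: str, vin: str, command: str) -> dict:
        # Step 2: send the vehicle command to the local HTTP proxy over HTTPS,
        # trusting its self-signed certificate
        ssl_ctx = ssl.create_default_context(cafile="/share/tesla/selfsigned.pem")
        async with aiohttp.ClientSession() as session:
            async with session.post(
                # hostname, port and path are assumptions for illustration
                f"https://tesla-http-proxy:4430/api/1/vehicles/{vin}/command/{command}",
                headers={"Authorization": f"Bearer {access_token}"},
                ssl=ssl_ctx,
            ) as resp:
                return await resp.json()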

There’s no need to have a “business”, just use your first and last name. My app was approved immediately.

I built an add-on that runs the official HTTP Proxy using your own personal Tesla auth and key. This is my first time building an add-on so feedback is welcome. I handle the initial authorization flow in a temporary Flask app, in conjunction with the Nginx SSL add-on. Perhaps this integration could be modified to use the proxy.

https://github.com/llamafilm/tesla-http-proxy-addon

The issue here is that we pre-2021 S/X owners are being drowned out by the 3/Y and newer S/X owners. From what I can gather from Tesla, what works for them WILL NOT work for us and vice versa. If we want pre-2021 cars to work at all, I think you can ignore the “other” fixes as they aren’t relevant, and we should focus on getting yours implemented. The fix for newer cars and 3/Y cars is going to be a lot more in depth, and waiting for their fix is unlikely to help us anyway.

I made a few changes to teslajsonpy and got it working for my Pre-2021 Model S/X. My fork can be found here.

https://github.com/gkwok1/teslajsonpy/archive/refs/heads/dev.zip

Copy the teslajsonpy folder inside the zip file to /config on HA and restart.

I did not create a PR as this is not a solution for everyone.

Note from Tesla: January 2024 - All vehicles* will require the Tesla Vehicle Command protocol. The REST API will be fully deprecated.
*Fleet accounts are excluded from these changes until further notice. Pre-2021 Model S/X are excluded from these changes.

Tesla has shut down the API we’ve been relying on…

@alandtse Is there any news, developments, or even a glimmer of hope regarding this integration working again in the future? Any updates or insights you can provide would be greatly appreciated.

Happy to take a PR. If someone wants to create a HA Addon for the proxy, we can leverage that.