versatile_thermostat: window_state does not change, configuration issue?
VTherm is not changing the window_state to “open” when the slope drops below the configured value:
I’ve configured all VTherm entities using the “window detection” option with the suggested temperature-slope value of 0.05 (YAML below):
Unfortunately, although the slope does change, the window_state never changes to open (on):
By the way, in the configuration you ask for a “decrease”, with the hint: “Recommended value: between 0.05 and 0.1. Leave empty if automatic window open detection is not use”. So I assume one has to enter a positive value there, because a negative value of a decrease would be an increase in temperature? If the configuration actually expects a negative value, you shouldn’t ask for a “decrease” but for a “change”, and the recommended values should be negative as well.
PS: Why does the slope jump around so much, if the room temperature is usually pretty stable? I don’t understand the slope when comparing it with the room temperature in the screenshot.
Here is the configuration:
```yaml
hvac_modes:
- "off"
- heat
min_temp: 16
max_temp: 26
preset_modes:
- none
- eco
- comfort
- boost
current_temperature: 19.4
temperature: 21
hvac_action: heating
preset_mode: comfort
hvac_mode: heat
type: null
eco_temp: 20
boost_temp: 22
comfort_temp: 21
eco_away_temp: 19
boost_away_temp: 21
comfort_away_temp: 20
power_temp: null
ext_current_temperature: 2.40000009536743
ac_mode: false
current_power: null
current_power_max: null
saved_preset_mode: comfort
saved_target_temp: 21
saved_hvac_mode: null
window_state: null
motion_state: null
overpowering_state: null
presence_state: "on"
window_auto_state: false
window_bypass_state: false
security_delay_min: 60
security_min_on_percent: 0.5
security_default_on_percent: 0.1
last_temperature_datetime: "2023-11-09T08:16:59.625031+01:00"
last_ext_temperature_datetime: "2023-11-09T08:16:31.993989+01:00"
security_state: false
minimal_activation_delay_sec: 10
device_power: 1
mean_cycle_power: null
total_energy: 380.69
last_update_datetime: "2023-11-09T08:16:59.637219+01:00"
timezone: …
window_sensor_entity_id: null
window_delay_sec: 30
window_auto_open_threshold: 0.05
window_auto_close_threshold: 0
window_auto_max_duration: 30
motion_sensor_entity_id: null
presence_sensor_entity_id: input_boolean.presence_someone
power_sensor_entity_id: null
max_power_sensor_entity_id: null
is_over_climate: true
start_hvac_action_date: "2023-11-08T16:00:40.167541+01:00"
underlying_climate_0: climate.schlafzimmer_eve_trv
underlying_climate_1: null
underlying_climate_2: null
underlying_climate_3: null
regulated_target_temperature: 22
regulation_accumulated_error: 0
friendly_name: Schlafzimmer Heizung
supported_features: 17
```
About this issue
- State: closed
- Created 8 months ago
- Comments: 39 (20 by maintainers)
Commits related to this issue
- Issue #181 - auto-window for over_climate doesn't work — committed to jmcollin78/versatile_thermostat by deleted user 8 months ago
Some additional info regarding alpha8: I’m still seeing strong spikes in the calculated slope:
Here’s the measured temperature data:
I agree, a slope measured in °C/hour is easier for humans to interpret. And anything users configure themselves should be as understandable as possible.
Regarding alpha8, which I installed yesterday: the calculated slopes unfortunately still cause false positives when the temperature changes by only 0.1°C, and looking at these charts I couldn’t say which threshold I would want to use. (I opened the windows in all three rooms at the same time and for the same duration, with 0°C outside.)
Unfortunately I’m currently rather busy and don’t have time to investigate possible solutions for a better calculation of the smoothed temperature. But as this smoothing is also partly relevant for an improved regulated temperature, I’m hoping that someone can come up with a solution. Otherwise I’d suggest changing the default cycle from 5 minutes to 1 minute (for everything, not just the slope); I’m not yet aware of any downside to shorter cycles.
I agree, it is promising. I also found this alpha calculation; it is probably the better choice and maybe you want to consider it instead (it’s in C++ and Ruby; isn’t there any AI translation service into Python?):
As for using the data: you could calculate the EMA and keep all calculated EMA values of the past 60 minutes in memory (or keep at most one per minute; it is smoothed anyway). This would allow calculating a slow and a fast slope (e.g. over 60 minutes and over 10 minutes). Having this in a central place would even allow using the EMA values for a regression-based slope calculation at some point in the future, which would make it possible to predict the temperature a few minutes ahead (e.g. in 5 minutes); that predicted value could then be used in the regulation algorithm.
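A rough sketch of what such a central EMA history could look like (the class and method names are made up for illustration, this is not existing VTherm code):

```python
from collections import deque
from datetime import datetime, timedelta


class EmaHistory:
    """Keep the smoothed (EMA) temperature values of the last 60 minutes in memory."""

    def __init__(self, max_age_minutes: int = 60):
        self._max_age = timedelta(minutes=max_age_minutes)
        self._points: deque[tuple[datetime, float]] = deque()  # (timestamp, EMA value)

    def add(self, when: datetime, ema_value: float) -> None:
        """Store a new smoothed value and drop everything older than the retention window."""
        self._points.append((when, ema_value))
        while self._points and when - self._points[0][0] > self._max_age:
            self._points.popleft()

    def slope(self, minutes: int) -> float | None:
        """Slope in °C/hour over the last `minutes`, from the oldest kept EMA
        value in that window to the newest."""
        if len(self._points) < 2:
            return None
        newest_time, newest_value = self._points[-1]
        cutoff = newest_time - timedelta(minutes=minutes)
        window = [(t, v) for t, v in self._points if t >= cutoff]
        if len(window) < 2:
            return None
        oldest_time, oldest_value = window[0]
        hours = (newest_time - oldest_time).total_seconds() / 3600
        return (newest_value - oldest_value) / hours if hours > 0 else None
```

slope(60) and slope(10) would then give the slow and the fast slope mentioned above; replacing the simple endpoint slope with a regression over the kept points would be the later refinement.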
As for the EMA calculation of irregular time series, I’ve seen that it might be useful to have an upper limit for alpha in case the last measurement was too long ago. For example, when using a half-life of 10 minutes, a measurement from 60 minutes ago (with nothing in between) would only contribute to the smoothed value with about 1.5%, giving the current measurement 98.5% relevance. It could be wise to cap alpha at the value reached after e.g. 4x the half-life (alpha_max = 1 - 2^-4 = 0.9375).

I’ve seen that it’s common to use EMA for irregular time series. In that case the input is a list of tuples containing the timestamp and the corresponding measured temperature value. The timestamps should be in ascending order.
The EMA is calculated by using the distance between two measurements to calculate the alpha. See this explanation, scroll down to “Irregular Intervals”. Also see this pdf paper (page 9).
EDIT: The decay/half-life (denominator G) in the first example is the one place where one can define whether the EMA should only look at very recent data (e.g. 10 minutes) or at longer time frames.
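For what it’s worth, a minimal sketch of such an irregular-interval EMA with a half-life and a capped alpha (just an illustration, not the exact formula from the linked explanation or paper):

```python
from datetime import datetime


def ema_irregular(samples: list[tuple[datetime, float]],
                  half_life_minutes: float = 10.0,
                  max_half_lives: float = 4.0) -> float:
    """EMA over an irregular series of (timestamp, temperature) tuples, timestamps ascending.

    alpha is derived from the time elapsed since the previous sample:
    alpha = 1 - 2 ** (-dt / half_life), capped at max_half_lives half-lives
    (with the default of 4: alpha_max = 1 - 2**-4 = 0.9375), so a single long
    gap cannot make the newest sample count for ~100%.
    """
    if not samples:
        raise ValueError("need at least one sample")
    smoothed = samples[0][1]
    prev_time = samples[0][0]
    for when, value in samples[1:]:
        dt_minutes = (when - prev_time).total_seconds() / 60
        half_lives = min(dt_minutes / half_life_minutes, max_half_lives)
        alpha = 1 - 2 ** (-half_lives)
        smoothed = alpha * value + (1 - alpha) * smoothed
        prev_time = when
    return smoothed
```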
I agree, it’s a goal to have one place in the code where measurements can be reported the moment they happen and are processed there, so they are available for the slope calculation, the regulation, etc. This must take the timestamps of the measurements into account, e.g. by creating a 1-minute timeframe: if multiple measurements are taken within that single minute we use their average, and if there is no measurement for a few minutes we calculate the sliding average between the available measurements.
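A minimal sketch of that per-minute bucketing, with made-up names just to illustrate the idea:

```python
from collections import defaultdict
from datetime import datetime
from statistics import mean


def bucket_per_minute(samples: list[tuple[datetime, float]]) -> list[tuple[datetime, float]]:
    """Collapse raw sensor readings into at most one value per minute.

    Multiple readings within the same minute are averaged; minutes without
    readings are simply absent, so a later step can slide/interpolate over them.
    """
    buckets: dict[datetime, list[float]] = defaultdict(list)
    for when, value in samples:
        minute = when.replace(second=0, microsecond=0)
        buckets[minute].append(value)
    return [(minute, mean(values)) for minute, values in sorted(buckets.items())]
```

The sliding average over gaps would then be a separate step on top of these buckets.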
I don’t have academic training in statistics, but statistics have been a rather large part of my profession over the past 20 years, from classical surveying to predictive analytics, financial markets and language models, and everything in between.
Jean-Marc, I have a couple of suggestions:
As a statistician I would say that smoothing the slope is the wrong approach; you should instead smooth the input that the calculations are based on.
Create a central place in the code for the EMA calculation (exponential smoothing) and/or other algorithms, instead of having it in multiple places (e.g. slope calculation, regulated temperature, …). Especially the regulated-temperature algorithm can be improved when the current slope is known. Or maybe even better: two slopes, one long-term (the past 60 minutes) and one short-term (the past 10 minutes).
The alpha (or: multiplier, smoothing factor, …) of an EMA is frequently defined via the number of periods that should go into the calculation: alpha = 2 / (num_periods + 1), which gives 0.66 at 2 periods, 0.5 at 3 periods, 0.4 at 4 periods and 0.33 at 5 periods. I find the number of periods easier to understand as an input than a raw factor like the 0.6 you’re suggesting. You could use something like the following (I’m just starting to learn Python, so I hope it’s valid):
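```python
# Rough sketch (not existing VTherm code): EMA with alpha derived from a period count.
def ema(values: list[float], num_periods: int) -> float:
    """Exponential moving average; alpha = 2 / (num_periods + 1)."""
    alpha = 2 / (num_periods + 1)
    smoothed = values[0]
    for value in values[1:]:
        smoothed = alpha * value + (1 - alpha) * smoothed
    return smoothed


# e.g. ema([19.4, 19.5, 19.3, 19.6, 19.5], num_periods=5) -> smoothed temperature
```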
Is the slope calculation run every 5 minutes, or is it run each time the measured temperature changes? It should really be based on a fixed cycle, because some temperature sensors update their measurement every second or every 10 seconds. We cannot do proper smoothing if we don’t know whether the last 3 measurements span the last 10 minutes or the last 10 seconds.

We should have a cycle of 1-2 minutes for dealing with temperature measurements. A cycle of 5 minutes gives at most 3 measurements in 10 minutes, which is not really enough for proper smoothing, especially as many sensors have a tolerance of +/- 0.2°C and only a few have +/- 0.1°C. With fewer than 5 measurements you’ll end up with noise only.
When calculating a slow (1 hour) and a fast (10 minutes) EMA, we could use the distance between the two for window-open detection.
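For illustration only (the function name and the threshold are placeholders), the detection itself could then be a simple comparison of the two EMAs:

```python
def window_seems_open(fast_ema: float, slow_ema: float, threshold: float = 0.3) -> bool:
    """Flag a probable open window when the fast (10 min) EMA has dropped
    clearly below the slow (1 h) EMA, i.e. the recent temperature is falling
    away from the longer-term average."""
    return (slow_ema - fast_ema) > threshold
```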
I will try that before I have to reboot HA the next time. Rebooting is always an issue, as the entire Thread mesh network needs to recreate itself; that takes a while and sometimes some devices don’t rejoin.
But in general I think this part of the code could be drastically improved by using more than just the last 2 temperature measurements and the time in between; that will always be inexact. I don’t have a pseudo-code solution yet, though.
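One possible direction, as a rough sketch only: fit a least-squares line through the buffered measurements of, say, the last 10 minutes and use its slope instead of the difference of the last two points.

```python
from datetime import datetime


def regression_slope(samples: list[tuple[datetime, float]]) -> float | None:
    """Least-squares slope in °C/hour over a list of (timestamp, temperature)
    measurements, using all points instead of only the last two."""
    if len(samples) < 2:
        return None
    t0 = samples[0][0]
    xs = [(when - t0).total_seconds() / 3600 for when, _ in samples]  # hours since first point
    ys = [value for _, value in samples]
    n = len(samples)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    denom = sum((x - mean_x) ** 2 for x in xs)
    if denom == 0:
        return None
    return sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / denom
```

That way a single noisy reading has much less influence on the slope than it does with only two points.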