openhab-addons: [influxdb] Exception when storing integer values

The following Exception is logged when integer values are stored by the OH3 influxdb persistence add-on:

[TRACE] [e.influxdb.InfluxDBPersistenceService] - Storing item Outside_Humidity (Type=NumberItem, State=91 %, Label=Humidity, Category=humidity, Groups=[Humidity]) in InfluxDB point org.openhab.persistence.influxdb.internal.InfluxPoint@2012713f
[ERROR] [org.influxdb.impl.BatchProcessor     ] - Batch could not be sent. Data will be lost
org.influxdb.InfluxDBException$FieldTypeConflictException: partial write: field type conflict: input field "value" on measurement "Outside_Humidity" is type integer, already exists as type float dropped=1
	at org.influxdb.InfluxDBException.buildExceptionFromErrorMessage(InfluxDBException.java:144) ~[bundleFile:?]
	at org.influxdb.InfluxDBException.buildExceptionForErrorState(InfluxDBException.java:173) ~[bundleFile:?]
	at org.influxdb.impl.InfluxDBImpl.execute(InfluxDBImpl.java:827) ~[bundleFile:?]
	at org.influxdb.impl.InfluxDBImpl.write(InfluxDBImpl.java:460) ~[bundleFile:?]
	at org.influxdb.impl.OneShotBatchWriter.write(OneShotBatchWriter.java:22) ~[bundleFile:?]
	at org.influxdb.impl.BatchProcessor.write(BatchProcessor.java:340) [bundleFile:?]
	at org.influxdb.impl.BatchProcessor$1.run(BatchProcessor.java:287) [bundleFile:?]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) [?:?]
	at java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304) [?:?]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
	at java.lang.Thread.run(Thread.java:834) [?:?]

This value is based on a humidity channel of the openweathermap binding (openweathermap:weather-and-forecast:account:local:current#humidity).

I queried InfluxDB (1.8.3) and it looked like the Outside_Humidity series only contained integer values. It also seems that the OH3 influxdb persistence now adds an item tag which previously wasn’t stored. When I manually updated the item to e.g. “91.0 %”, the value was stored.
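For context, InfluxDB infers a field's type from the line-protocol representation of the first point written to a series: a trailing `i` marks an integer field, a bare decimal a float field, and the type is then fixed for that measurement. A minimal sketch of that distinction (plain Java, not the add-on's actual code; the helper name is made up):

```java
import java.math.BigDecimal;

public class LineProtocolSketch {

    // Hypothetical helper illustrating how a numeric state ends up as an
    // integer or float field in InfluxDB line protocol. A whole number
    // (scale 0) is rendered with a trailing 'i' (integer field); anything
    // with a decimal part is rendered bare (float field).
    public static String toLineProtocol(String measurement, BigDecimal state) {
        String field = (state.scale() <= 0)
                ? state.toBigInteger() + "i"   // e.g. value=91i  -> integer field
                : state.toPlainString();       // e.g. value=91.0 -> float field
        return measurement + " value=" + field;
    }

    public static void main(String[] args) {
        // "91 %" produces an integer field, "91.0 %" a float field; once the
        // series exists with one type, writes of the other type are rejected.
        System.out.println(toLineProtocol("Outside_Humidity", new BigDecimal("91")));
        System.out.println(toLineProtocol("Outside_Humidity", new BigDecimal("91.0")));
    }
}
```

This is why manually updating the item to “91.0 %” succeeded: the decimal point turns the write into a float field, matching the existing series type.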

After I removed the series, it was possible to store new integer values. However, if I then stored a float, it failed with:

[TRACE] [e.influxdb.InfluxDBPersistenceService] - Storing item Outside_Humidity (Type=NumberItem, State=92.2 %, Label=Humidity, Category=humidity, Groups=[Humidity]) in InfluxDB point org.openhab.persistence.influxdb.internal.InfluxPoint@730a50c2
[ERROR] [org.influxdb.impl.BatchProcessor     ] - Batch could not be sent. Data will be lost
org.influxdb.InfluxDBException$FieldTypeConflictException: partial write: field type conflict: input field "value" on measurement "Outside_Humidity" is type float, already exists as type integer dropped=1
	at org.influxdb.InfluxDBException.buildExceptionFromErrorMessage(InfluxDBException.java:144) ~[bundleFile:?]
	at org.influxdb.InfluxDBException.buildExceptionForErrorState(InfluxDBException.java:173) ~[bundleFile:?]
	at org.influxdb.impl.InfluxDBImpl.execute(InfluxDBImpl.java:827) ~[bundleFile:?]
	at org.influxdb.impl.InfluxDBImpl.write(InfluxDBImpl.java:460) ~[bundleFile:?]
	at org.influxdb.impl.OneShotBatchWriter.write(OneShotBatchWriter.java:22) ~[bundleFile:?]
	at org.influxdb.impl.BatchProcessor.write(BatchProcessor.java:340) [bundleFile:?]
	at org.influxdb.impl.BatchProcessor$1.run(BatchProcessor.java:287) [bundleFile:?]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) [?:?]
	at java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304) [?:?]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
	at java.lang.Thread.run(Thread.java:834) [?:?]

I also saw that the add-on now uses a new client library, which may be causing this new behavior.

About this issue

  • State: closed
  • Created 4 years ago
  • Comments: 28 (25 by maintainers)

Most upvoted comments

I will check tomorrow if I can find the real cause. I only had time to do a little test, but with the new client it’s easy to reproduce: writing just two points (a double and an integer) to a clean database was enough to trigger the error.

I did some more testing and also ran into the issue without having rrd4j installed, so I think it has to do with the new client library. Storing both integers and floats was never a problem with the library in OH 2.5.x; I was able to store both without any issue:

> select * from Outside_Humidity where time > now()-1d
name: Outside_Humidity
time                value
----                -----
1603198335974000000 77
1603212705936000000 93
1603212858295000000 91.5
1603212866313000000 91
1603212888119000000 99
1603212894625000000 99.2

I also tested removing the tags, but that did not fix the issue either. However, when I remove the optimization in convertBigDecimalToNum and always use the doubleValue, the issue is resolved, and I still see only integers being stored in the database. So it might be that the library now performs this optimization itself and we can safely remove this code.
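The optimization mentioned above presumably looks something like the following sketch (reconstructed from the described behavior, not the add-on's verbatim source): whole-number BigDecimals are narrowed to long, which the client writes as an integer field, while the proposed fix returns a double unconditionally so the field type is always float.

```java
import java.math.BigDecimal;

public class ConversionSketch {

    // Sketch of the optimization described above: a BigDecimal with scale 0
    // is narrowed to long, which the client serializes as an InfluxDB integer
    // field; values with a fractional part become double (float field).
    public static Object convertBigDecimalToNum(BigDecimal value) {
        if (value.scale() == 0) {
            return value.longValue();   // -> integer field in line protocol
        }
        return value.doubleValue();     // -> float field in line protocol
    }

    // Proposed change per the comment above: always write a double so the
    // "value" field is consistently a float and can never conflict.
    public static Object convertAlwaysDouble(BigDecimal value) {
        return value.doubleValue();
    }

    public static void main(String[] args) {
        System.out.println(convertBigDecimalToNum(new BigDecimal("91")));   // Long
        System.out.println(convertBigDecimalToNum(new BigDecimal("92.2"))); // Double
        System.out.println(convertAlwaysDouble(new BigDecimal("91")));      // Double
    }
}
```

Under this reading, the same item alternating between whole and fractional states (91 %, then 92.2 %) flips the field type between writes, which is exactly the conflict the logs show.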