arrow: [CI][Java] Integration jobs with Spark fail with NoSuchMethodError: io.netty.buffer.PooledByteBufAllocator
Describe the bug, including details regarding any error messages, version, and platform.
It seems that https://github.com/apache/arrow/pull/36211, which updated PoolThreadCache to PoolArenasCache as part of the Netty upgrade, has made our nightly integration tests fail against both previous and current development versions of Spark:
- test-conda-python-3.10-spark-master
- test-conda-python-3.8-spark-v3.1.2
- test-conda-python-3.9-spark-v3.2.0
The error:
02:07:30.759 WARN org.apache.spark.scheduler.TaskSetManager: Lost task 0.0 in stage 24.0 (TID 30) (ab33723e6432 executor driver): java.lang.NoSuchMethodError: io.netty.buffer.PooledByteBufAllocatorL$InnerAllocator.threadCache()Lio/netty/buffer/PoolArenasCache;
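Since the JVM resolves methods by name and full descriptor, the runtime Netty version determines which return type `threadCache()` declares. A minimal diagnostic sketch (hypothetical `NettyCacheCheck` class; it assumes only that netty-buffer is on the classpath) to check which side of the rename a given environment is on:

```java
import java.lang.reflect.Method;

// Hypothetical diagnostic: print the declared return type of
// PooledByteBufAllocator.threadCache() for whichever Netty is on the
// classpath. Older releases declare io.netty.buffer.PoolThreadCache;
// 4.1.94.Final (the version Arrow upgraded to) declares
// io.netty.buffer.PoolArenasCache, which is the descriptor the failing
// call above was compiled against.
public class NettyCacheCheck {
    public static void main(String[] args) throws Exception {
        Class<?> alloc = Class.forName("io.netty.buffer.PooledByteBufAllocator");
        for (Method m : alloc.getDeclaredMethods()) {
            if (m.getName().equals("threadCache")) {
                System.out.println("threadCache() returns: "
                        + m.getReturnType().getName());
            }
        }
    }
}
```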
Spark hasn’t yet updated to Netty 4.1.94.Final. I am unsure how we can fix this, but does this mean we break backwards compatibility with previous Spark versions?
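On the backwards-compatibility question: a return-type rename like this is source-compatible but binary-incompatible, because JVM linkage matches the full method descriptor, return type included. A standalone sketch (hypothetical `DescriptorDemo` class; no Netty required) spelling out the two descriptors that collide:

```java
// Hypothetical illustration: the JVM treats these as two distinct
// methods, so a caller compiled against one Netty version cannot
// resolve the method provided by the other, and linkage fails at
// runtime with NoSuchMethodError rather than at compile time.
public class DescriptorDemo {
    public static void main(String[] args) {
        String before = "threadCache()Lio/netty/buffer/PoolThreadCache;";  // older Netty
        String after  = "threadCache()Lio/netty/buffer/PoolArenasCache;";  // 4.1.94.Final
        System.out.println("old descriptor: " + before);
        System.out.println("new descriptor: " + after);
        System.out.println("binary compatible: " + before.equals(after));
    }
}
```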
Component(s)
Continuous Integration, Java
About this issue
- State: closed
- Created a year ago
- Comments: 17 (16 by maintainers)
Commits related to this issue
- GH-36332: [Java] Fix Spark integration failure due to Netty version upgrade — committed to danepitkin/arrow by danepitkin a year ago
- GH-36332: [CI][Java] Patch spark to use Netty 4.1.94.Final on our integration tests — committed to raulcd/arrow by raulcd a year ago
- [SPARK-45781][BUILD] Upgrade Arrow to 14.0.0 (upgrades Apache Arrow from 13.0.0 to 14.0.0) — committed to apache/spark by LuciferYang 8 months ago
I ran crossbow on that PR - looks like it does pass now