arrow: [CI][Java] Integration jobs with Spark fail with NoSuchMethodError:io.netty.buffer.PooledByteBufAllocator

Describe the bug, including details regarding any error messages, version, and platform.

It seems that https://github.com/apache/arrow/pull/36211, which updated from PoolThreadCache to PoolArenasCache, has caused our nightly integration tests against both the previous and current development versions of Spark to fail.

The error:

 02:07:30.759 WARN org.apache.spark.scheduler.TaskSetManager: Lost task 0.0 in stage 24.0 (TID 30) (ab33723e6432 executor driver): java.lang.NoSuchMethodError: io.netty.buffer.PooledByteBufAllocatorL$InnerAllocator.threadCache()Lio/netty/buffer/PoolArenasCache;

Spark hasn’t yet updated to Netty 4.1.94.Final. I am unsure how to fix this, but does it mean we break backward compatibility with previous Spark versions?
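The error arises because code compiled against one Netty version references a method descriptor (here, one returning PoolArenasCache) that does not exist in the older Netty on Spark's classpath. A minimal sketch of one way to detect the mismatch at runtime is below: it probes for the two cache class names taken from the error message via Class.forName instead of linking against them. The class name NettyCacheProbe is hypothetical, and neither Netty class is expected to be on the classpath here, so both probes report "absent".

```java
// Sketch: probe which Netty cache class is available at runtime, rather
// than hard-linking against a method whose return type changed between
// Netty releases. Class names are taken from the error message above.
public class NettyCacheProbe {

    // Returns "present" if the named class can be loaded, "absent" otherwise.
    static String probe(String className) {
        try {
            // initialize=false: we only care whether the class exists.
            Class.forName(className, false, NettyCacheProbe.class.getClassLoader());
            return "present";
        } catch (ClassNotFoundException e) {
            return "absent";
        }
    }

    public static void main(String[] args) {
        // Older Netty ships PoolThreadCache; the newer release referenced in
        // the report (4.1.94.Final) renamed it to PoolArenasCache.
        System.out.println("PoolThreadCache: " + probe("io.netty.buffer.PoolThreadCache"));
        System.out.println("PoolArenasCache: " + probe("io.netty.buffer.PoolArenasCache"));
    }
}
```

A probe like this could gate a reflective fallback path so one binary works against both Netty versions, at the cost of extra indirection on the allocator hot path.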

Component(s)

Continuous Integration, Java

About this issue

  • State: closed
  • Created a year ago
  • Comments: 17 (16 by maintainers)

Most upvoted comments

I ran crossbow on that PR - looks like it does pass now