snowflake-jdbc: SNOW-379300: Exception when switching from jdk 15 to jdk 16

I’m running a simple query with the Snowflake JDBC driver.

    try (Connection connection = manufDatasource.getConnection();
         Statement statement = connection.createStatement();
         ResultSet resultSet = statement.executeQuery("select distinct COL1, COL2 from \"THE_TABLE\"")) {
      Set<Network.SubEntity> result = new HashSet<>();
      while (resultSet.next()) {
        result.add(new Network.SubEntity(resultSet.getLong(1), resultSet.getString(2)));
      }
      return result;
    }

It works with JDK 15. When I run it with JDK 16, I get the following exception:

org.jboss.resteasy.spi.UnhandledException: net.snowflake.client.jdbc.SnowflakeSQLLoggedException: JDBC driver internal error: Fail to retrieve row count for first arrow chunk: null.
        at org.jboss.resteasy.core.ExceptionHandler.handleApplicationException(ExceptionHandler.java:106)
        at org.jboss.resteasy.core.ExceptionHandler.handleException(ExceptionHandler.java:372)
        at org.jboss.resteasy.core.SynchronousDispatcher.writeException(SynchronousDispatcher.java:218)
        at org.jboss.resteasy.core.SynchronousDispatcher.invoke(SynchronousDispatcher.java:519)
        at org.jboss.resteasy.core.SynchronousDispatcher.lambda$invoke$4(SynchronousDispatcher.java:261)
        at org.jboss.resteasy.core.SynchronousDispatcher.lambda$preprocess$0(SynchronousDispatcher.java:161)
        at org.jboss.resteasy.core.interception.jaxrs.PreMatchContainerRequestContext.filter(PreMatchContainerRequestContext.java:364)
        at org.jboss.resteasy.core.SynchronousDispatcher.preprocess(SynchronousDispatcher.java:164)
        at org.jboss.resteasy.core.SynchronousDispatcher.invoke(SynchronousDispatcher.java:247)
        at org.jboss.resteasy.plugins.server.servlet.ServletContainerDispatcher.service(ServletContainerDispatcher.java:

About this issue

  • State: closed
  • Created 3 years ago
  • Reactions: 9
  • Comments: 33 (2 by maintainers)

Most upvoted comments

It would be awesome to get a new release cut from master so we can use Arrow 8 and retry updating our service to JDK 17!

Java 17 is now GA. https://www.oracle.com/news/announcement/oracle-releases-java-17-2021-09-14/

So is there an ETA on Snowflake support, or at least a correction to their requirement statement:

The Snowflake JDBC driver requires Java 1.8 or higher.

to: The Snowflake JDBC driver requires Java 1.8 through Java 15.

This should be addressed in #1017

@iamzafar it looks like you can also work around this in JDK 17 by adding the following Java option: --add-opens java.base/java.nio=ALL-UNNAMED

Apache Arrow is not going to fix it. You can switch to using JSON as the result format, but I assume that for huge selects it will be slower and the network load will be higher.
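A minimal sketch of the JSON workaround at the driver level, assuming the Snowflake JDBC driver accepts `JDBC_QUERY_RESULT_FORMAT` as a connection property (the name mirrors the session parameter; treat the property, URL, and helper names here as illustrative, not official guidance):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

public class JsonResultFormat {

    // Build connection properties that ask the driver for JSON result sets
    // instead of Arrow, sidestepping Arrow's reflective java.nio access on JDK 16+.
    static Properties buildProps(String user, String password) {
        Properties props = new Properties();
        props.put("user", user);
        props.put("password", password);
        props.put("JDBC_QUERY_RESULT_FORMAT", "JSON"); // assumed connection-level spelling of the session parameter
        return props;
    }

    public static Connection connect(String url, String user, String password) throws Exception {
        // e.g. url = "jdbc:snowflake://<account>.snowflakecomputing.com/" (placeholder)
        return DriverManager.getConnection(url, buildProps(user, password));
    }
}
```

Setting this at connection time would avoid issuing `ALTER SESSION` manually in every session.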

You can add the following two settings in the following file (macOS): /Applications/DBeaver.app/Contents/Eclipse/dbeaver.ini

    -Djdk.module.illegalAccess=permit
    --add-opens=java.base/java.nio=ALL-UNNAMED

Information from: https://support.dbvis.com/support/solutions/articles/1000309803-snowflake-fail-to-retrieve-row-count-for-first-arrow-chunk-


Another alternative that worked for me on my Mac M1 is to use JDK 11:

brew install openjdk@11

Edit /Applications/DBeaver.app/Contents/Eclipse/dbeaver.ini and change this line: …/Eclipse/jre/Contents/Home/bin/java to /opt/homebrew/opt/openjdk@11/bin/java

Restart DBeaver.

I cloned that PR locally and tested it. It doesn’t work. It’s quite a big leap in Arrow versions, but it looks like we also need to add the arrow-memory-netty dependency. I was working on this on my end and got the dependencies sorted out, but I found that the new Arrow library is now causing the issue rather than Netty: https://github.com/apache/arrow/blob/apache-arrow-6.0.0/java/memory/memory-core/src/main/java/org/apache/arrow/memory/util/MemoryUtil.java#L84

The error I got was:

    Caused by: java.lang.RuntimeException: Failed to initialize MemoryUtil.
            at net.snowflake.client.jdbc.internal.apache.arrow.memory.util.MemoryUtil.<clinit>(MemoryUtil.java:136)
            ... 17 more
    Caused by: java.lang.reflect.InaccessibleObjectException: Unable to make field long java.nio.Buffer.address accessible: module java.base does not "opens java.nio" to unnamed module @3c60b7e7
            at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:354)
            at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:297)
            at java.base/java.lang.reflect.Field.checkCanSetAccessible(Field.java:178)
            at java.base/java.lang.reflect.Field.setAccessible(Field.java:172)
            at net.snowflake.client.jdbc.internal.apache.arrow.memory.util.MemoryUtil.<clinit>(MemoryUtil.java:84)

Now MemoryUtil in Arrow is using reflection.
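To illustrate why this fails, here is a small self-contained sketch of the reflective access pattern the stack trace points at (opening `java.nio.Buffer.address`); the class and method names are mine, not Arrow’s. On JDK 16+ the `setAccessible` call throws `InaccessibleObjectException` unless the JVM is started with `--add-opens java.base/java.nio=ALL-UNNAMED`:

```java
import java.lang.reflect.Field;
import java.lang.reflect.InaccessibleObjectException;
import java.nio.Buffer;

public class NioAccessCheck {

    // Mirrors the pattern in Arrow's MemoryUtil static initializer:
    // reflectively make java.nio.Buffer.address accessible.
    static boolean canAccessBufferAddress() {
        try {
            Field address = Buffer.class.getDeclaredField("address");
            // Throws InaccessibleObjectException on JDK 16+ unless
            // --add-opens java.base/java.nio=ALL-UNNAMED is passed to the JVM.
            address.setAccessible(true);
            return true;
        } catch (InaccessibleObjectException e) {
            return false;
        } catch (ReflectiveOperationException e) {
            // Field missing would mean a very different JDK layout.
            throw new IllegalStateException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println("java.nio open to unnamed module: " + canAccessBufferAddress());
    }
}
```

The result depends on the JDK version and JVM flags, which is exactly why the same driver works on JDK 15 (illegal access merely warned) but breaks on JDK 16+ (strongly encapsulated by default).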

I’m looking into this @iamzafar

Hello, thanks for the workaround, but we still want a fix for this issue. In our product, we cannot ask our users to run ALTER SESSION SET JDBC_QUERY_RESULT_FORMAT='JSON' in every session.