seatunnel: [Bug] [Connector-hive-v2] Can't get table from Hive 3.x with the SeaTunnel 2.3.1 release

Search before asking

  • I had searched in the issues and found no similar issues.

What happened

There is a problem when I run against Hive 3.x: I can't get the table from Hive, even though Kerberos authentication succeeds and the connection is opened.

SeaTunnel Version

SeaTunnel 2.3.1 (Zeta engine)

SeaTunnel Config

Standalone mode:

env {
  execution.parallelism = 1
  job.mode = "BATCH"
  job.name = "mysql_hive_test"
}

source {

  Hive {
    table_name = "seatunnel_test.test_person_yxf_part2"
    metastore_uri = "thrift://ambari-31.snowleopard.cn:9083,thrift://ambari-32.snowleopard.cn:9083"
    kerberos_principal = "hive/ambari-34.snowleopard.cn@SNOWLEOPARD.CN"
    kerberos_keytab_path = "/etc/security/keytabs/hive.service.keytab"
    hdfs_site_path = "/etc/hadoop/conf/hdfs-site.xml"
  }
}

sink {
    Console {}
}
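
As a way to narrow this down, here is a minimal standalone check (not part of the original report; the class name MetastoreCheck and the hard-coded values are taken from the config above) that opens a HiveMetaStoreClient with the same metastore URIs, principal and keytab, using the same hive-exec client classes that SeaTunnel bundles. If it fails with the same set_ugi()/TTransportException shown under "Error Exception" below, the problem is between the Thrift client and the Hive 3.x metastore rather than in the SeaTunnel job config itself.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hive.conf.HiveConf;
import org.apache.hadoop.hive.metastore.HiveMetaStoreClient;
import org.apache.hadoop.security.UserGroupInformation;

public class MetastoreCheck {
    public static void main(String[] args) throws Exception {
        // Log in with the same keytab/principal used in the SeaTunnel config.
        Configuration hadoopConf = new Configuration();
        hadoopConf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(hadoopConf);
        UserGroupInformation.loginUserFromKeytab(
                "hive/ambari-34.snowleopard.cn@SNOWLEOPARD.CN",
                "/etc/security/keytabs/hive.service.keytab");

        // Point the client at the same metastore instances, with SASL enabled
        // so the connection is authenticated via Kerberos instead of set_ugi().
        HiveConf hiveConf = new HiveConf();
        hiveConf.set("hive.metastore.uris",
                "thrift://ambari-31.snowleopard.cn:9083,thrift://ambari-32.snowleopard.cn:9083");
        hiveConf.set("hive.metastore.sasl.enabled", "true");
        hiveConf.set("hive.metastore.kerberos.principal",
                "hive/ambari-34.snowleopard.cn@SNOWLEOPARD.CN");

        HiveMetaStoreClient client = new HiveMetaStoreClient(hiveConf);
        // Fetch the same table the SeaTunnel source fails on.
        System.out.println(client.getTable("seatunnel_test", "test_person_yxf_part2"));
        client.close();
    }
}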

Running Command

./bin/seatunnel.sh --config ./config/hive-mysql-tbds.conf -e local

Error Exception

(1):  
 WARN  hive.metastore - set_ugi() not successful, Likely cause: new client talking to old server. Continuing without it.
org.apache.thrift.transport.TTransportException: null
        at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132) ~[hive-exec-2.3.9.jar:2.3.9]
        at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86) ~[hive-exec-2.3.9.jar:2.3.9]
        at org.apache.thrift.protocol.TBinaryProtocol.readStringBody(TBinaryProtocol.java:380) ~[hive-exec-2.3.9.jar:2.3.9]
        at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:230) ~[hive-exec-2.3.9.jar:2.3.9]
        at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77) ~[hive-exec-2.3.9.jar:2.3.9]
        at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_set_ugi(ThriftHiveMetastore.java:4247) ~[hive-exec-2.3.9.jar:2.3.9]
        at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.set_ugi(ThriftHiveMetastore.java:4233) ~[hive-exec-2.3.9.jar:2.3.9]
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:496) [hive-exec-2.3.9.jar:2.3.9]
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:245) [hive-exec-2.3.9.jar:2.3.9]
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:128) [hive-exec-2.3.9.jar:2.3.9]
        at org.apache.seatunnel.connectors.seatunnel.hive.utils.HiveMetaStoreProxy.<init>(HiveMetaStoreProxy.java:58) [connector-hive-2.3.1.jar:2.3.1]
        at org.apache.seatunnel.connectors.seatunnel.hive.utils.HiveMetaStoreProxy.getInstance(HiveMetaStoreProxy.java:74) [connector-hive-2.3.1.jar:2.3.1]
        at org.apache.seatunnel.connectors.seatunnel.hive.config.HiveConfig.getTableInfo(HiveConfig.java:59) [connector-hive-2.3.1.jar:2.3.1]
        at org.apache.seatunnel.connectors.seatunnel.hive.source.HiveSource.prepare(HiveSource.java:123) [connector-hive-2.3.1.jar:2.3.1]
        at org.apache.seatunnel.engine.core.parse.ConnectorInstanceLoader.loadSourceInstance(ConnectorInstanceLoader.java:64) [seatunnel-starter.jar:2.3.1]
        at org.apache.seatunnel.engine.core.parse.JobConfigParser.sampleAnalyze(JobConfigParser.java:371) [seatunnel-starter.jar:2.3.1]
        at org.apache.seatunnel.engine.core.parse.JobConfigParser.parse(JobConfigParser.java:132) [seatunnel-starter.jar:2.3.1]
        at org.apache.seatunnel.engine.core.parse.MultipleTableJobConfigParser.parse(MultipleTableJobConfigParser.java:112) [seatunnel-starter.jar:2.3.1]
        at org.apache.seatunnel.engine.client.job.JobExecutionEnvironment.getLogicalDag(JobExecutionEnvironment.java:155) [seatunnel-starter.jar:2.3.1]
        at org.apache.seatunnel.engine.client.job.JobExecutionEnvironment.execute(JobExecutionEnvironment.java:147) [seatunnel-starter.jar:2.3.1]
        at org.apache.seatunnel.core.starter.seatunnel.command.ClientExecuteCommand.execute(ClientExecuteCommand.java:140) [seatunnel-starter.jar:2.3.1]
        at org.apache.seatunnel.core.starter.SeaTunnel.run(SeaTunnel.java:40) [seatunnel-starter.jar:2.3.1]
        at org.apache.seatunnel.core.starter.seatunnel.SeaTunnelClient.main(SeaTunnelClient.java:34) [seatunnel-starter.jar:2.3.1]
 ERROR org.apache.seatunnel.core.starter.SeaTunnel - Exception StackTrace:org.apache.seatunnel.core.starter.exception.CommandExecuteException: SeaTunnel job executed failed
        at org.apache.seatunnel.core.starter.seatunnel.command.ClientExecuteCommand.execute(ClientExecuteCommand.java:181)
        at org.apache.seatunnel.core.starter.SeaTunnel.run(SeaTunnel.java:40)
        at org.apache.seatunnel.core.starter.seatunnel.SeaTunnelClient.main(SeaTunnelClient.java:34)


(2):
Caused by: org.apache.seatunnel.connectors.seatunnel.hive.exception.HiveConnectorException: ErrorCode:[HIVE-03], ErrorDescription:[Get hive table information from hive metastore service failed] - Get table [seatunnel_test.test_person_yxf_part2] information failed
        at org.apache.seatunnel.connectors.seatunnel.hive.utils.HiveMetaStoreProxy.getTable(HiveMetaStoreProxy.java:87)
        at org.apache.seatunnel.connectors.seatunnel.hive.config.HiveConfig.getTableInfo(HiveConfig.java:60)
        at org.apache.seatunnel.connectors.seatunnel.hive.source.HiveSource.prepare(HiveSource.java:123)
        at org.apache.seatunnel.engine.core.parse.ConnectorInstanceLoader.loadSourceInstance(ConnectorInstanceLoader.java:64)
        at org.apache.seatunnel.engine.core.parse.JobConfigParser.sampleAnalyze(JobConfigParser.java:371)
        at org.apache.seatunnel.engine.core.parse.JobConfigParser.parse(JobConfigParser.java:132)
        at org.apache.seatunnel.engine.core.parse.MultipleTableJobConfigParser.parse(MultipleTableJobConfigParser.java:112)
        at org.apache.seatunnel.engine.client.job.JobExecutionEnvironment.getLogicalDag(JobExecutionEnvironment.java:155)
        at org.apache.seatunnel.engine.client.job.JobExecutionEnvironment.execute(JobExecutionEnvironment.java:147)
        at org.apache.seatunnel.core.starter.seatunnel.command.ClientExecuteCommand.execute(ClientExecuteCommand.java:140)
        ... 2 more

Flink or Spark Version

No response

Java or Scala Version

java8

Screenshots

No response

Are you willing to submit PR?

  • Yes I am willing to submit a PR!

Code of Conduct

About this issue

  • Original URL
  • State: closed
  • Created a year ago
  • Comments: 15 (4 by maintainers)

Most upvoted comments

Same problem. My versions: Hive 2.1.1, Hadoop 3.0.0.

SeaTunnel uses hive-exec-2.3.9.jar.
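
For what it's worth, a quick way to confirm which hive-exec build the connector actually loads at runtime is a small probe like the one below (hypothetical class name VersionProbe; run it with SeaTunnel's connector/lib jars on the classpath):

import org.apache.hive.common.util.HiveVersionInfo;

public class VersionProbe {
    public static void main(String[] args) {
        // Prints the Hive client version baked into the hive-exec/hive-common
        // classes on the classpath, e.g. "2.3.9" for the jar bundled with 2.3.1.
        System.out.println("Hive client version: " + HiveVersionInfo.getVersion());
        // Shows which jar the class was actually loaded from.
        System.out.println("Loaded from: "
                + HiveVersionInfo.class.getProtectionDomain().getCodeSource().getLocation());
    }
}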