What happened?
When using Flink to read an existing table, a Kerberos authentication error occurred.
Affects Versions
master
What engines are you seeing the problem on?
Flink
How to reproduce
Use the Amoro Flink runtime jar in Flink to create an Amoro table, then execute a "select * from xxx" query.
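A minimal reproduction sketch via the Flink Table API (Java). The catalog name, metastore URI, and table name are placeholders, and the catalog options ('type' = 'arctic', 'metastore.url') are assumed from the Arctic Flink connector conventions, not taken from this report:

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class AmoroKerberosRepro {
    public static void main(String[] args) {
        TableEnvironment tEnv =
                TableEnvironment.create(EnvironmentSettings.newInstance().inStreamingMode().build());

        // Register the Amoro (Arctic) catalog; the URI and options below are placeholders.
        tEnv.executeSql(
                "CREATE CATALOG amoro WITH ("
                        + " 'type' = 'arctic',"
                        + " 'metastore.url' = 'thrift://<ams-host>:1260/<catalog_name>'"
                        + ")");
        tEnv.executeSql("USE CATALOG amoro");

        // Reading an existing Kerberos-secured table triggers the error shown in the log output below.
        tEnv.executeSql("SELECT * FROM db.xxx").print();
    }
}
```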
Relevant log output
Caused by: org.apache.flink.table.catalog.exceptions.CatalogException: transformer from meta failed.
at com.netease.sloth.flink.connector.arctic.catalog.ArcticCatalogWrapper.getTable(ArcticCatalogWrapper.java:130)
at org.apache.flink.table.catalog.CatalogManager.getPermanentTable(CatalogManager.java:425)
at org.apache.flink.table.catalog.CatalogManager.getTable(CatalogManager.java:395)
at org.apache.flink.table.planner.delegation.PlannerBase.getTableSink(PlannerBase.scala:356)
at org.apache.flink.table.planner.delegation.PlannerBase.translateToRel(PlannerBase.scala:222)
at org.apache.flink.table.planner.delegation.PlannerBase.$anonfun$translate$1(PlannerBase.scala:182)
at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:237)
at scala.collection.Iterator.foreach(Iterator.scala:941)
at scala.collection.Iterator.foreach$(Iterator.scala:941)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1429)
at scala.collection.IterableLike.foreach(IterableLike.scala:74)
at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
at scala.collection.TraversableLike.map(TraversableLike.scala:237)
at scala.collection.TraversableLike.map$(TraversableLike.scala:230)
at scala.collection.AbstractTraversable.map(Traversable.scala:108)
at org.apache.flink.table.planner.delegation.PlannerBase.translate(PlannerBase.scala:182)
at com.netease.sloth.flink114.sql.table.FlinkTableEnvironmentImpl.translate(FlinkTableEnvironmentImpl.java:533)
at com.netease.sloth.flink114.sql.table.FlinkTableEnvironmentImpl.sqlUpdateWithoutUdfClassloader(FlinkTableEnvironmentImpl.java:289)
at com.netease.sloth.flink114.sql.table.FlinkTableEnvironmentImpl.lambda$sqlUpdate$2(FlinkTableEnvironmentImpl.java:239)
at com.netease.sloth.flink.sql.api.classload.ClassLoadWrap.run(ClassLoadWrap.java:88)
... 38 more
Caused by: java.lang.IllegalArgumentException: Can't get Kerberos realm
at org.apache.hadoop.security.HadoopKerberosName.setConfiguration(HadoopKerberosName.java:65)
at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:276)
at org.apache.hadoop.security.UserGroupInformation.setConfiguration(UserGroupInformation.java:312)
at com.netease.arctic.table.TableMetaStore.constructUgi(TableMetaStore.java:333)
at com.netease.arctic.table.TableMetaStore.getUGI(TableMetaStore.java:228)
at com.netease.arctic.table.TableMetaStore.doAs(TableMetaStore.java:343)
at com.netease.arctic.io.ArcticHadoopFileIO.exists(ArcticHadoopFileIO.java:192)
at com.netease.arctic.hive.catalog.MixedHiveTables.checkPrivilege(MixedHiveTables.java:84)
at com.netease.arctic.hive.catalog.MixedHiveTables.loadKeyedTable(MixedHiveTables.java:65)
at com.netease.arctic.hive.catalog.MixedHiveTables.loadKeyedTable(MixedHiveTables.java:38)
at com.netease.arctic.catalog.MixedTables.loadTableByMeta(MixedTables.java:79)
at com.netease.arctic.catalog.BasicArcticCatalog.loadTable(BasicArcticCatalog.java:154)
at com.netease.arctic.catalog.ArcticCatalog.tableExists(ArcticCatalog.java:105)
at com.netease.arctic.flink.catalog.ArcticCatalog.getTable(ArcticCatalog.java:173)
at com.netease.sloth.flink.connector.arctic.catalog.ArcticCatalogWrapper.getTable(ArcticCatalogWrapper.java:124)
... 58 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.security.authentication.util.KerberosUtil.getDefaultRealm(KerberosUtil.java:84)
at org.apache.hadoop.security.HadoopKerberosName.setConfiguration(HadoopKerberosName.java:63)
... 72 more
Caused by: KrbException: Cannot locate default realm
at sun.security.krb5.Config.getDefaultRealm(Config.java:1137)
... 78 more
Anything else
No response
Are you willing to submit a PR?
Yes I am willing to submit a PR!
Code of Conduct
I agree to follow this project's Code of Conduct
It seems that the 'sun.security.krb5.Config.refresh();' call deleted by #1819 causes some problems. sun.security.krb5.Config is refreshed when logging in, but not in "setConfiguration". However, I did not see this error in Trino or Spark.
The real cause of this problem is that sun.security.krb5.Config has already been initialized once before the "constructUgi" method is reached (the location of this initialization is not yet known). Because the krbConf file location has not been set at that point, the Config is empty. Since the "refresh" call was removed, the Config is still empty when "setConfiguration" is reached.
I only see this extra initialization in the Flink engine, which is why the same error does not appear in Spark or Trino.
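A rough sketch of what restoring the removed refresh would look like before UserGroupInformation.setConfiguration is called. This is not the actual Amoro code: the class, method, and parameter names are illustrative, and it assumes Java 8, where sun.security.krb5 is accessible without --add-exports:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;
import org.apache.hadoop.security.authentication.util.KerberosName;

public class UgiConstructionSketch {

    /**
     * Illustrative order of operations discussed above; the krb5.conf path and
     * Hadoop Configuration are supplied by the caller in the real code.
     */
    public static void setConfigurationWithRefresh(String krb5ConfPath, Configuration conf)
            throws Exception {
        // Point the JVM at the krb5.conf that was just materialized on disk.
        System.setProperty("java.security.krb5.conf", krb5ConfPath);

        // Re-parse the krb5 configuration in case sun.security.krb5.Config was already
        // initialized earlier (with no realm) by another code path.
        sun.security.krb5.Config.refresh();

        // KerberosName caches the default realm too, so reset it as well.
        KerberosName.resetDefaultRealm();

        // Hadoop can now resolve the Kerberos realm when setting up the UGI.
        UserGroupInformation.setConfiguration(conf);
    }
}
```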
HuangFru changed the title from "[Bug]: Flink can not set configuration when setting UGI" to "[Bug]: Can not set configuration when setting UGI" on Sep 5, 2023.
It has nothing to do with the engine; it is related to the Hadoop version. Newer Hadoop versions initialize the krb config once when the Configuration is created.
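A standalone demonstration of the caching behaviour behind this: once sun.security.krb5.Config is touched before a krb5.conf is available, setting java.security.krb5.conf later has no effect until refresh() re-parses it. This sketch assumes Java 8 and uses a placeholder krb5.conf path:

```java
public class KrbConfigCachingDemo {
    public static void main(String[] args) throws Exception {
        // First touch before any krb5.conf is configured: the default realm cannot be resolved.
        try {
            sun.security.krb5.Config.getInstance().getDefaultRealm();
        } catch (sun.security.krb5.KrbException e) {
            System.out.println("expected: " + e.getMessage()); // "Cannot locate default realm"
        }

        // Point the JVM at a krb5.conf afterwards (placeholder path).
        System.setProperty("java.security.krb5.conf", "/etc/krb5.conf");

        // Without refresh(), the cached (empty) Config is returned and the realm is still missing,
        // which is the state UserGroupInformation.setConfiguration() runs into in the stack trace above.
        sun.security.krb5.Config.refresh();
        System.out.println(sun.security.krb5.Config.getInstance().getDefaultRealm());
    }
}
```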