[Bug]: Can not set configuration when setting UGI #1912

Closed · 2 tasks done · Tracked by #1930 · Fixed by #1919
HuangFru opened this issue Sep 4, 2023 · 2 comments
Labels
type:bug Something isn't working

Comments

HuangFru (Contributor) commented Sep 4, 2023

What happened?

When using Flink to read an existing table, a Kerberos authentication error occurs.

Affects Versions

master

What engines are you seeing the problem on?

Flink

How to reproduce

Use the Amoro Flink runtime jar in Flink to create an Amoro table, then execute `SELECT * FROM xxx` against it.
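A minimal sketch of the reproduction in Flink SQL. The catalog name, AMS address, and table name below are placeholders for an actual secured deployment, not values from this report:

```sql
-- Hypothetical names: 'arctic_cat', the AMS thrift address, and 'db.tbl'
-- stand in for a real Kerberos-secured Amoro setup.
CREATE CATALOG arctic_cat WITH (
  'type' = 'arctic',
  'metastore.url' = 'thrift://<AMS-host>:<AMS-port>/<catalog-name>'
);

USE CATALOG arctic_cat;

-- Reading any existing table triggers the Kerberos path in the log below.
SELECT * FROM db.tbl;
```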

Relevant log output

Caused by: org.apache.flink.table.catalog.exceptions.CatalogException: transformer from meta failed.
	at com.netease.sloth.flink.connector.arctic.catalog.ArcticCatalogWrapper.getTable(ArcticCatalogWrapper.java:130)
	at org.apache.flink.table.catalog.CatalogManager.getPermanentTable(CatalogManager.java:425)
	at org.apache.flink.table.catalog.CatalogManager.getTable(CatalogManager.java:395)
	at org.apache.flink.table.planner.delegation.PlannerBase.getTableSink(PlannerBase.scala:356)
	at org.apache.flink.table.planner.delegation.PlannerBase.translateToRel(PlannerBase.scala:222)
	at org.apache.flink.table.planner.delegation.PlannerBase.$anonfun$translate$1(PlannerBase.scala:182)
	at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:237)
	at scala.collection.Iterator.foreach(Iterator.scala:941)
	at scala.collection.Iterator.foreach$(Iterator.scala:941)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1429)
	at scala.collection.IterableLike.foreach(IterableLike.scala:74)
	at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
	at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
	at scala.collection.TraversableLike.map(TraversableLike.scala:237)
	at scala.collection.TraversableLike.map$(TraversableLike.scala:230)
	at scala.collection.AbstractTraversable.map(Traversable.scala:108)
	at org.apache.flink.table.planner.delegation.PlannerBase.translate(PlannerBase.scala:182)
	at com.netease.sloth.flink114.sql.table.FlinkTableEnvironmentImpl.translate(FlinkTableEnvironmentImpl.java:533)
	at com.netease.sloth.flink114.sql.table.FlinkTableEnvironmentImpl.sqlUpdateWithoutUdfClassloader(FlinkTableEnvironmentImpl.java:289)
	at com.netease.sloth.flink114.sql.table.FlinkTableEnvironmentImpl.lambda$sqlUpdate$2(FlinkTableEnvironmentImpl.java:239)
	at com.netease.sloth.flink.sql.api.classload.ClassLoadWrap.run(ClassLoadWrap.java:88)
	... 38 more
Caused by: java.lang.IllegalArgumentException: Can't get Kerberos realm
	at org.apache.hadoop.security.HadoopKerberosName.setConfiguration(HadoopKerberosName.java:65)
	at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:276)
	at org.apache.hadoop.security.UserGroupInformation.setConfiguration(UserGroupInformation.java:312)
	at com.netease.arctic.table.TableMetaStore.constructUgi(TableMetaStore.java:333)
	at com.netease.arctic.table.TableMetaStore.getUGI(TableMetaStore.java:228)
	at com.netease.arctic.table.TableMetaStore.doAs(TableMetaStore.java:343)
	at com.netease.arctic.io.ArcticHadoopFileIO.exists(ArcticHadoopFileIO.java:192)
	at com.netease.arctic.hive.catalog.MixedHiveTables.checkPrivilege(MixedHiveTables.java:84)
	at com.netease.arctic.hive.catalog.MixedHiveTables.loadKeyedTable(MixedHiveTables.java:65)
	at com.netease.arctic.hive.catalog.MixedHiveTables.loadKeyedTable(MixedHiveTables.java:38)
	at com.netease.arctic.catalog.MixedTables.loadTableByMeta(MixedTables.java:79)
	at com.netease.arctic.catalog.BasicArcticCatalog.loadTable(BasicArcticCatalog.java:154)
	at com.netease.arctic.catalog.ArcticCatalog.tableExists(ArcticCatalog.java:105)
	at com.netease.arctic.flink.catalog.ArcticCatalog.getTable(ArcticCatalog.java:173)
	at com.netease.sloth.flink.connector.arctic.catalog.ArcticCatalogWrapper.getTable(ArcticCatalogWrapper.java:124)
	... 58 more
Caused by: java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.security.authentication.util.KerberosUtil.getDefaultRealm(KerberosUtil.java:84)
	at org.apache.hadoop.security.HadoopKerberosName.setConfiguration(HadoopKerberosName.java:63)
	... 72 more
Caused by: KrbException: Cannot locate default realm
	at sun.security.krb5.Config.getDefaultRealm(Config.java:1137)
	... 78 more

Anything else

No response

Are you willing to submit a PR?

  • Yes I am willing to submit a PR!

Code of Conduct

  • I agree to follow this project's Code of Conduct
@HuangFru HuangFru added the type:bug Something isn't working label Sep 4, 2023
HuangFru (Contributor, Author) commented Sep 4, 2023

It seems that the call to `sun.security.krb5.Config.refresh();` removed in #1819 causes this. `sun.security.krb5.Config` is refreshed when logging in, but not in `setConfiguration`. However, I did not see this error in Trino or Spark.

The root cause is that `sun.security.krb5.Config` is initialized once before the `constructUgi` method runs (where that initialization happens is still unknown), and because the krb5.conf file location has not been set at that point, the resulting `Config` is empty. With the `refresh` call removed, `setConfiguration` then sees that empty `Config`.

I only see this extra initialization in the Flink engine, which is why the same error does not appear in Spark or Trino.
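The failure mode described above can be modeled in isolation. This is a hypothetical, simplified stand-in for `sun.security.krb5.Config` (all class and property names here are invented for illustration), showing how a value cached at class initialization stays empty until an explicit `refresh()` re-reads it:

```java
// Simplified model of the bug mechanism: CachedConfig caches a system
// property at class-init time, so a property set afterwards is invisible
// until refresh() re-reads it. The real classes are sun.security.krb5.Config
// and UserGroupInformation; this only mirrors the initialization order.
final class CachedConfig {
    private static String realm = load();            // read once at class init

    private static String load() {
        return System.getProperty("krb5.realm.sim", ""); // empty if unset
    }

    static String getRealm() { return realm; }

    // The call removed in #1819 played the role of this refresh().
    static void refresh() { realm = load(); }
}

public class Main {
    public static void main(String[] args) {
        // Class initializes before the property is set -> empty config,
        // mirroring "Config is empty" in the comment above.
        String before = CachedConfig.getRealm();

        System.setProperty("krb5.realm.sim", "EXAMPLE.COM");
        String stillStale = CachedConfig.getRealm(); // still "" without refresh

        CachedConfig.refresh();                      // re-read, as refresh() did
        String after = CachedConfig.getRealm();

        System.out.println(before + "|" + stillStale + "|" + after);
        // → ||EXAMPLE.COM
    }
}
```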

@HuangFru HuangFru changed the title [Bug]: Flink can not set configuration when setting UGI [Bug]: Can not set configuration when setting UGI Sep 5, 2023
HuangFru (Contributor, Author) commented Sep 5, 2023

It has nothing to do with the engine; it depends on the Hadoop version. Newer versions of Hadoop initialize the Kerberos config once when a Configuration object is created.
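For reference, the refresh that #1819 removed can be approximated via reflection. This is a sketch of one possible guard, not the actual Amoro fix (that is the subject of #1919), and the krb5.conf path below is a placeholder. The idea is to force the JDK to re-read the realm after `java.security.krb5.conf` is set and before `UserGroupInformation.setConfiguration` runs:

```java
public class KrbRefreshSketch {
    public static void main(String[] args) {
        // Placeholder path; in TableMetaStore this would point at the
        // krb5.conf materialized for the catalog.
        System.setProperty("java.security.krb5.conf", "/path/to/krb5.conf");
        try {
            // Reflection avoids a compile-time dependency on the
            // JDK-internal class (and --add-exports flags).
            Class<?> cfg = Class.forName("sun.security.krb5.Config");
            cfg.getMethod("refresh").invoke(null);
            System.out.println("krb5 config refreshed");
        } catch (ReflectiveOperationException e) {
            // Internal API unavailable (strongly encapsulated JDK) or the
            // config file is unreadable; a caller would fall back here.
            System.out.println("refresh skipped: " + e.getClass().getSimpleName());
        }
        // Only after this point should UserGroupInformation.setConfiguration
        // be called, so HadoopKerberosName can resolve the default realm.
    }
}
```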
