[Improvement] generate credential according to the data path and metadata path #5648
Comments
Hi, I’d like to work on this. Please let me know if it’s okay.

@orenccl sure
Currently, the only solution I can think of is to pass the data and metadata locations alongside the table location:

```diff
 CredentialUtils.vendCredential(
-    credentialProvider, loadTableResponse.tableMetadata().location());
+    credentialProvider,
+    new String[] {
+      loadTableResponse.tableMetadata().location(),
+      loadTableResponse.tableMetadata().property(TableProperties.WRITE_DATA_LOCATION, ""),
+      loadTableResponse.tableMetadata().property(TableProperties.WRITE_METADATA_LOCATION, "")
+    });
```

```diff
-public static Credential vendCredential(CredentialProvider credentialProvider, String path) {
+public static Credential vendCredential(CredentialProvider credentialProvider, String[] path) {
   PathBasedCredentialContext pathBasedCredentialContext =
       new PathBasedCredentialContext(
-          PrincipalUtils.getCurrentUserName(), ImmutableSet.of(path), ImmutableSet.of());
+          PrincipalUtils.getCurrentUserName(), ImmutableSet.copyOf(path), ImmutableSet.of());
   return credentialProvider.getCredential(pathBasedCredentialContext);
 }
```

However, I'm not sure what to do next: how to test it, or what will be affected by these changes. Could you give me some hints?
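One detail worth noting in the sketch above: `tableMetadata().property(..., "")` returns an empty string when `write.data.path` or `write.metadata.path` is not set, so the array may contain empty entries. A minimal standalone sketch of filtering those out before building the path set (the class and method names here are hypothetical, and plain `java.util` collections stand in for Guava's `ImmutableSet`):

```java
import java.util.Arrays;
import java.util.Set;
import java.util.stream.Collectors;

public class PathFilterSketch {

    // Collapse candidate locations into a de-duplicated set, dropping
    // the empty-string defaults returned when write.data.path or
    // write.metadata.path is not set on the table.
    static Set<String> toPathSet(String... paths) {
        return Arrays.stream(paths)
                .filter(p -> p != null && !p.isEmpty())
                .collect(Collectors.toSet());
    }

    public static void main(String[] args) {
        // The metadata location is unset here, so only two paths survive.
        Set<String> paths =
                toPathSet("s3://bucket/table", "s3://bucket/table", "");
        System.out.println(paths.size()); // 1
    }
}
```

Deduplication also matters: when the data and metadata paths default to subdirectories of the table location, passing all three verbatim would otherwise request overlapping credentials.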
Yes, it should work if you pass the specific locations. Do you have an AWS or GCS account? If yes, you could try to create an Iceberg table with a specific metadata or data location and test whether you can write and read data with Spark; please refer to https://gravitino.apache.org/docs/0.7.0-incubating/iceberg-rest-service#exploring-the-apache-gravitino-iceberg-rest-catalog-service-with-apache-spark. If not, I could run some tests based on your PR.
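For reference, such a table can be declared in Spark SQL by setting the two write-location properties explicitly. A minimal sketch, in which the catalog name `rest`, the namespace, and the bucket paths are all placeholders, not values from this thread:

```sql
-- Hypothetical catalog, table, and bucket names.
CREATE TABLE rest.db.sample (id INT) USING iceberg
TBLPROPERTIES (
  'write.data.path' = 's3://my-bucket/custom-data',
  'write.metadata.path' = 's3://my-bucket/custom-metadata'
);
```

After an `INSERT`, the data files should land under `custom-data` rather than under the table location, which is exactly the case the proposed change needs to cover when vending credentials.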
I want to try testing it myself and will update you on the progress. Thanks for your help!

Please try to add a test in
Please contact me if you encounter any problems; credential vending is somewhat complicated.
What would you like to be improved?

Currently, Gravitino generates credentials according to the table location. We should also consider the Iceberg table properties:

- `write.data.path`
- `write.metadata.path`

See https://github.com/apache/iceberg/blob/main/core/src/main/java/org/apache/iceberg/TableProperties.java#L265-L273

How should we improve?

No response