
feat: add ability to use OCI per customer #144

Merged
merged 1 commit into master from customer-oci
Feb 14, 2025

Conversation

saghul
Member

@saghul saghul commented Feb 11, 2025

This is slightly different from the OpenAI / Azure OpenAI runners. Instead, OCI credentials are configured globally for a given Skynet runner, just like the local one, and if configured, requests will be sent to OCI rather than to the locally running llama.

The ability to run without a local LLM, targeting OCI directly, is still retained.
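The routing described above can be sketched roughly as follows. This is a hypothetical illustration, not Skynet's actual code: the `RunnerConfig` and `pick_backend` names, and the idea of representing OCI credentials as optional global settings, are assumptions made for the example.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class RunnerConfig:
    """Hypothetical global, per-runner settings (None means not configured)."""
    oci_endpoint: Optional[str] = None
    oci_api_key: Optional[str] = None

    @property
    def oci_configured(self) -> bool:
        return bool(self.oci_endpoint and self.oci_api_key)


def pick_backend(config: RunnerConfig) -> str:
    # If OCI is configured globally, every request goes to OCI;
    # otherwise fall back to the locally running llama.
    return "oci" if config.oci_configured else "local-llama"


print(pick_backend(RunnerConfig()))  # no OCI credentials: local llama
print(pick_backend(RunnerConfig("https://inference.oci.example", "key")))  # OCI
```

The key design point in the PR is that the choice is made once per runner from global configuration, rather than per request or per customer credential set as with the OpenAI / Azure OpenAI runners.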

@saghul saghul requested a review from quitrk February 11, 2025 08:50
@saghul saghul merged commit 700aa11 into master Feb 14, 2025
@saghul saghul deleted the customer-oci branch February 14, 2025 16:07
2 participants