
feat: AI Proxy, AWS Bedrock Converse-API Driver #13354

Open · wants to merge 20 commits into base: master
Conversation

tysoekong (Contributor)
Summary

Supersedes #13054, which was completely broken.

Adds AWS Bedrock "Converse API" support to Kong AI Gateway.

Checklist

  • The Pull Request has tests
  • A changelog file has been created under changelog/unreleased/kong, or the skip-changelog label has been added to the PR if a changelog is unnecessary
  • There is a user-facing docs PR against https://github.com/Kong/docs.konghq.com - PUT DOCS PR HERE

Issue reference

AG-14

subnetmarco (Member)

@fffonion let's start the review process on this large PR.

Files with review comments:
kong/llm/drivers/bedrock.lua
kong/llm/init.lua
kong/tools/aws_stream.lua
tysoekong changed the title from "Feat/ai proxy aws bedrock" to "feat: AI Proxy, AWS Bedrock Converse API Driver" (Jul 25, 2024)
jschmid1 (Contributor) left a comment:
1/n

kong/clustering/compat/checkers.lua (Outdated)
        has_update = true
      end
    end

    if plugin.name == 'ai-request-transformer' then
      local config = plugin.config
-     if config.llm.model.provider == "gemini" then
+     if config.llm.model.provider == "gemini" or config.llm.model.provider == "bedrock" then
        config.llm.model.provider = "openai"
Contributor:
You also seem to do quite a bit of comparison on models/providers. Consider doing something like:

local GEMINI_PROVIDER_NAME = "gemini"

local is_gemini = function()
  return config.llm.model.provider == GEMINI_PROVIDER_NAME
end

-- allows you to do compounds as well

local is_gemini_or_bedrock = function()
  return is_gemini() or is_bedrock()
end
tysoekong (Contributor, Author) · Jul 25, 2024:

But won't this JIT badly and hurt performance? Is there anywhere else in the codebase I can copy this pattern from?
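For context, one pattern that sidesteps the closure question entirely is to define the helpers once, at module level, taking the provider string as an argument. This is a sketch with hypothetical names, not code from this PR:

```lua
-- Hypothetical sketch, not code from this PR: plain module-level functions
-- that take the provider as an argument avoid per-request closure
-- allocation, which is the usual LuaJIT concern with the closure pattern.
local GEMINI_PROVIDER_NAME  = "gemini"
local BEDROCK_PROVIDER_NAME = "bedrock"

local function is_gemini(provider)
  return provider == GEMINI_PROVIDER_NAME
end

local function is_bedrock(provider)
  return provider == BEDROCK_PROVIDER_NAME
end

-- compound predicates compose the same way
local function is_gemini_or_bedrock(provider)
  return is_gemini(provider) or is_bedrock(provider)
end
```

Because nothing closes over per-request state, the functions are created once at module load rather than on every call.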

@@ -219,7 +243,7 @@ end
-- @param {string} frame input string to format into SSE events
-- @param {boolean} raw_json_mode sets application/json byte-parser mode
-- @return {table} n number of split SSE messages, or empty table
function _M.frame_to_events(frame, raw_json_mode)
Contributor:

Are you sure you patched all frame_to_events invocations? This is a shared library, and there may be false expectations elsewhere when the function signature changes.
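On backward compatibility: in Lua an omitted argument is nil, so any caller still passing only frame silently falls through to the non-raw path. A minimal sketch of that behaviour, with simplified parsing that is not the PR's actual implementation:

```lua
-- Hypothetical, simplified sketch of an optional-parameter signature change;
-- not the PR's actual parsing logic.
local function frame_to_events(frame, raw_json_mode)
  local events = {}

  if raw_json_mode then
    -- byte-parser mode: treat the whole frame as a single JSON payload
    events[#events + 1] = frame
    return events
  end

  -- default SSE mode: old single-argument callers land here,
  -- because an omitted argument is nil (falsy) in Lua
  for message in frame:gmatch("data:%s*([^\n]+)") do
    events[#events + 1] = message
  end

  return events
end
```

Old call sites like frame_to_events(chunk) keep their previous behaviour, which is exactly why auditing every invocation matters: a caller that should enable raw_json_mode will not fail loudly, it will just parse incorrectly.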

tysoekong (Contributor, Author):

Yeah, I checked all providers. This won't happen again after this release; the function will be moved into the respective drivers.

I have no idea why I put this function in shared in the first place...

tysoekong changed the title to "feat: AI Proxy, AWS Bedrock Converse AI Driver" (Jul 25, 2024)