LLM server coverage by agent exported as microservices #244

@corradodebari

Description

Checklist

  • I have searched the existing issues for similar feature requests.
  • I added a descriptive title and summary to this issue.

Summary

Template export for SpringAI (OpenAI API compliant/MCP) and LangChain (MCP) currently supports only the OpenAI and OLLAMA LLM server providers.
For on-premises deployment at scale, the OpenAI-compatible API, as well as the HuggingFaceEndpointEmbeddings and CompatOpenAIEmbeddings APIs, should be supported to enable a vLLM/TEI/TGI production deployment.
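As a rough illustration of why OpenAI-compatible support would cover vLLM/TEI/TGI: those servers expose OpenAI-style `/v1/chat/completions` and `/v1/embeddings` routes, so the exported templates only need a configurable base URL rather than a provider-specific client. The sketch below builds the request payloads with the standard library only; the host names and model names are hypothetical, not values from this issue.

```python
import json

# Hypothetical on-premises endpoints (assumptions for illustration):
# a vLLM server for chat completions and a TEI server for embeddings,
# both reachable through OpenAI-compatible routes.
VLLM_BASE_URL = "http://vllm-host:8000/v1"
TEI_BASE_URL = "http://tei-host:8080/v1"


def chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-compatible /chat/completions request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


def embeddings_request(model: str, texts: list[str]) -> dict:
    """Build an OpenAI-compatible /embeddings request body."""
    return {"model": model, "input": texts}


# The same payloads work against OpenAI, vLLM, or TGI's OpenAI-compatible
# layer; only the base URL changes.
body = json.dumps(chat_request("my-local-model", "Hello"))
```

In an exported LangChain template this would correspond to pointing the client at a custom `base_url` instead of the OpenAI default, and in a SpringAI template to overriding the provider base URL property; the exact configuration keys depend on the template being generated.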

Why?

No response

How?

No response

Additional Context

No response

Metadata

Assignees

Labels

enhancement (New feature or request)

Type

No type

Projects

No projects

Milestone

No milestone

Relationships

None yet

Development

No branches or pull requests