# Ecosystem Interoperability

Map which frameworks, SDKs, routers, runtimes, and serving engines support major AI providers and local models.
## How to use this dashboard

Use this map to check which AI tools, SDKs, frameworks, and deployment platforms support each provider or model family.
| Tool | Type | Provider 1 | Provider 2 | Provider 3 | Provider 4 | Provider 5 | Provider 6 | Local models | Last checked | How to verify |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Vercel AI SDK | SDK | Supported | Supported | Supported | Supported / provider-dependent | Via compatible providers / OpenAI-compatible APIs | Supported / compatible provider | Community/local pattern | 2026-04-29 | Official docs / package ecosystem check required |
| LangChain | Framework | Supported | Supported | Supported | Supported | Supported via integrations or compatible endpoints | Supported via community/provider patterns | Supported | 2026-04-29 | Official docs / integrations list check required |
| LlamaIndex | Framework | Supported | Supported | Supported | Supported | Via compatible endpoints / community integrations | Via compatible endpoints / community integrations | Supported | 2026-04-29 | Official docs / integrations list check required |
| LiteLLM | Router / proxy | Supported | Supported | Supported | Supported | Supported | Supported | Supported | 2026-04-29 | Official provider list / GitHub metadata check required |
| Ollama | Local runtime | Not a native provider target | Not a native provider target | Not a native provider target | Local/open model support | Local/open model support | Separate service | Native | 2026-04-29 | Local runtime / model library check required |
| vLLM | Serving engine | OpenAI-compatible serving API | Not a direct provider API | Not a direct provider API | Serves compatible open weights | Serves compatible open weights | Separate service | Separate local runtime | 2026-04-29 | Serving docs / model compatibility check required |
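Several cells above describe the "OpenAI-compatible endpoint" pattern: a tool reaches a provider or local runtime not through a dedicated integration but by pointing a generic client at a per-backend base URL. The sketch below illustrates that routing idea in plain Python. It is not LiteLLM's (or any listed tool's) actual API; the `provider/model` naming convention and the base URLs are assumptions for illustration, so check each runtime's docs for real defaults.

```python
# Minimal sketch of the router pattern: one call surface, many backends,
# each reached through its own (assumed) OpenAI-compatible base URL.
PROVIDER_ENDPOINTS = {
    "openai": "https://api.openai.com/v1",  # hosted provider API
    "ollama": "http://localhost:11434/v1",  # assumed local Ollama default port
    "vllm": "http://localhost:8000/v1",     # assumed local vLLM default port
}

def resolve_endpoint(model_id: str) -> tuple[str, str]:
    """Split a 'provider/model' id and return (base_url, model_name).

    The 'provider/model' convention here is hypothetical, chosen only to
    show how a router maps one id space onto multiple backends.
    """
    provider, _, name = model_id.partition("/")
    if provider not in PROVIDER_ENDPOINTS:
        raise ValueError(f"unknown provider: {provider}")
    return PROVIDER_ENDPOINTS[provider], name

# e.g. resolve_endpoint("ollama/llama3") yields the local runtime's base URL
# plus the bare model tag, which a generic client could then use directly.
```

A real router (LiteLLM's proxy, for example) layers auth, retries, and response normalization on top of this same dispatch step; the table's "Supported" cells for routers largely reflect how many backends expose such a compatible surface.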