Welcome to Core AI

Core AI is a type-safe abstraction layer over LLM provider SDKs for TypeScript. Write provider-agnostic code with a unified API for chat completion, streaming, embeddings, image generation, and tool calling.

Why Core AI?

Switch between OpenAI, Anthropic, Google Gemini, and Mistral without changing your application code. Core AI provides a consistent interface across all providers while maintaining full type safety and access to provider-specific features when you need them.

Key features

Unified API

Write once, run on any provider. Switch between OpenAI, Anthropic, Mistral, and Google GenAI without code changes.
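The "write once, run on any provider" idea can be sketched as application code that depends only on a shared interface. The interface and class names below are illustrative stand-ins, not Core AI's actual API:

```typescript
// Sketch of the provider-agnostic pattern: application code depends only on a
// shared interface, so swapping providers means swapping one constructor call.
interface ChatClient {
  complete(prompt: string): Promise<string>;
}

// Stand-in "providers" — real ones would wrap the OpenAI / Anthropic SDKs.
class FakeOpenAIClient implements ChatClient {
  async complete(prompt: string): Promise<string> {
    return `openai: ${prompt}`;
  }
}

class FakeAnthropicClient implements ChatClient {
  async complete(prompt: string): Promise<string> {
    return `anthropic: ${prompt}`;
  }
}

// Application code is written once against ChatClient and never
// mentions a concrete provider.
async function summarize(client: ChatClient, text: string): Promise<string> {
  return client.complete(`Summarize: ${text}`);
}
```

Swapping `new FakeOpenAIClient()` for `new FakeAnthropicClient()` changes nothing in `summarize` — that separation is what the unified API provides over the real SDKs.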

Full type safety

Strict TypeScript types and Zod-based tool definitions, with no any types in the public API.

Streaming support

Async iterable-based streaming with optional aggregation via toResponse() for both text and structured outputs.
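The streaming model can be illustrated with plain async iterables. Only the toResponse() name comes from the docs above; the chunk shape and aggregator body are assumptions for the sketch:

```typescript
// Illustration of async-iterable streaming with optional aggregation.
interface TextChunk {
  delta: string;
}

// A stand-in stream; a real provider wrapper would yield SDK events.
async function* streamChat(prompt: string): AsyncIterable<TextChunk> {
  for (const delta of ["Hello", ", ", "world"]) {
    yield { delta };
  }
}

// Aggregate a stream into a single response when you don't need
// incremental output — the pattern toResponse() presumably implements.
async function toResponse(
  stream: AsyncIterable<TextChunk>,
): Promise<{ text: string }> {
  let text = "";
  for await (const chunk of stream) {
    text += chunk.delta;
  }
  return { text };
}
```

Consumers can either `for await` over the stream directly for incremental rendering, or await the aggregated response when only the final text matters.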

Structured outputs

Schema-validated object generation and streaming with z.infer<TSchema> for type-safe results.
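The mechanism behind schema-validated outputs is a schema value that carries both runtime validation and a static type. Core AI uses Zod (z.infer<TSchema>); this dependency-free mini-schema only illustrates the idea:

```typescript
// A schema pairs a runtime parser with a static type.
interface Schema<T> {
  parse(value: unknown): T;
}

// Type-level inference, analogous to Zod's z.infer<TSchema>.
type Infer<S> = S extends Schema<infer T> ? T : never;

// A minimal object schema: checks required string/number fields at runtime.
const personSchema: Schema<{ name: string; age: number }> = {
  parse(value: unknown) {
    const v = value as { name?: unknown; age?: unknown } | null;
    if (typeof v?.name !== "string" || typeof v?.age !== "number") {
      throw new Error("invalid person");
    }
    return { name: v.name, age: v.age };
  },
};

type Person = Infer<typeof personSchema>; // { name: string; age: number }

// A structured-output call would run the model's JSON through the schema,
// so the result is both validated and statically typed.
function parseModelOutput(raw: string): Person {
  return personSchema.parse(JSON.parse(raw));
}
```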

Tool calling

Define tools with Zod schemas that automatically convert to JSON Schema for any provider.
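The conversion step can be sketched with a hand-rolled converter for a tiny schema subset; Core AI does the equivalent for full Zod schemas, and the tool shape below is an assumption:

```typescript
// A tool definition: a name, a description, and typed parameters.
type FieldType = "string" | "number" | "boolean";

interface ToolDef {
  name: string;
  description: string;
  params: Record<string, FieldType>;
}

// Convert a tool definition into the JSON Schema shape providers expect
// for function/tool calling.
function toJsonSchema(tool: ToolDef) {
  return {
    name: tool.name,
    description: tool.description,
    parameters: {
      type: "object",
      properties: Object.fromEntries(
        Object.entries(tool.params).map(([key, t]) => [key, { type: t }]),
      ),
      required: Object.keys(tool.params),
    },
  };
}

const weatherTool: ToolDef = {
  name: "get_weather",
  description: "Look up current weather for a city",
  params: { city: "string", celsius: "boolean" },
};
```

Because the output is plain JSON Schema, the same tool definition can be handed to any provider's tool-calling API.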

Multi-modal

Support for text, images (base64 and URL), and file inputs across compatible providers.

Embeddings & images

First-class support for embeddings and image generation, not just chat completions.
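Embeddings are just numeric vectors, and the usual downstream step is comparing them with cosine similarity. This helper is generic and not specific to Core AI:

```typescript
// Cosine similarity between two embedding vectors: 1 means identical
// direction, 0 means orthogonal (unrelated).
function cosineSimilarity(a: number[], b: number[]): number {
  if (a.length !== b.length) throw new Error("dimension mismatch");
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}
```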

Lightweight

Thin wrappers over native SDKs with no heavy runtime dependencies.

Supported providers

| Provider | Package |
| --- | --- |
| OpenAI | @core-ai/openai |
| Anthropic | @core-ai/anthropic |
| Google GenAI (Gemini) | @core-ai/google-genai |
| Mistral | @core-ai/mistral |

Next steps

1. Install Core AI

Choose your package manager and install the core package with your preferred provider. Go to installation →

2. Run your first example

Create a simple chat completion in under 10 lines of code. Go to quickstart →

3. Explore features

Learn about streaming, structured outputs, tool calling, and more. Browse examples →