# Running Examples

All examples are located in the `examples/` directory of the repository.
## Prerequisites
- Node.js 18+
- Dependencies installed from the repository root:
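The install step might look like this, assuming the repository uses npm (substitute pnpm or yarn if its lockfile says otherwise):

```shell
# From the repository root
npm install
```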
## Environment Variables

Create a `.env` file at the repository root:
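A sketch of what it might contain — the variable names below are conventional guesses, so check each example for the exact names it reads:

```bash
# Add only the keys for the providers you plan to use (names are assumptions)
OPENAI_API_KEY=...
ANTHROPIC_API_KEY=...
GOOGLE_API_KEY=...
MISTRAL_API_KEY=...
```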
You only need API keys for the providers you plan to use. Each example specifies which key it requires.
## Run an Example
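A typical invocation from the repository root might look like this, assuming the examples are plain TypeScript files executed with `tsx` — the runner is an assumption, so check the repository's `package.json` scripts:

```shell
# From the repository root (runner is an assumption)
npx tsx examples/01-chat-completion.ts
```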
Each example is a standalone script run from the repository root.

## Basic Features
### Chat Completion

#### Basic

`generate()` chat completion with OpenAI. Demonstrates the simplest way to get started with Core AI.

#### Streaming

Streaming output with `stream()` and `toResponse()`. Shows how to handle real-time text generation with event-based streaming.

### Multi-Modal Input
Multi-modal input using text and image URLs. Demonstrates how to send images alongside text for vision-enabled models.
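The request payload for this pattern typically pairs text parts with image parts in one message. A minimal sketch of the shape, modeled on common chat-completion APIs — the field names are assumptions, not confirmed Core AI types:

```typescript
// Hypothetical content-part shapes; real field names may differ in Core AI.
type ContentPart =
  | { type: "text"; text: string }
  | { type: "image_url"; url: string };

// A single user message mixing a question with an image URL.
const parts: ContentPart[] = [
  { type: "text", text: "What is in this picture?" },
  { type: "image_url", url: "https://example.com/cat.png" },
];

const message = { role: "user", content: parts };

console.log(message.content.length); // 2
```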
### Error Handling

Handling `LLMError` and `ProviderError`. Shows proper error-handling patterns for both local validation and provider errors.

## Advanced Features
### Tool Calling

Tool definition and a full tool-call round trip. A complete example of defining tools with Zod schemas and handling tool execution.
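The round trip can be sketched end to end with stand-in types — the `Tool` shape and the simulated model reply below are illustrative stubs, not the Core AI API (the real example defines its schemas with Zod):

```typescript
// Hypothetical tool shape: a name, a description, and an execute function.
interface Tool {
  name: string;
  description: string;
  execute: (args: Record<string, unknown>) => string;
}

const getWeather: Tool = {
  name: "get_weather",
  description: "Look up the weather for a city",
  execute: (args) => `Sunny in ${args.city}`, // stub implementation
};

// Step 1: the model answers with a tool call instead of text (simulated here).
const modelToolCall = { name: "get_weather", arguments: { city: "Paris" } };

// Step 2: the application executes the matching tool...
const tools: Tool[] = [getWeather];
const tool = tools.find((t) => t.name === modelToolCall.name);
const toolResult = tool ? tool.execute(modelToolCall.arguments) : "unknown tool";

// Step 3: ...and sends the result back to the model for a final answer.
const followUpMessage = { role: "tool", content: toolResult };
console.log(followUpMessage.content); // "Sunny in Paris"
```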
### Embeddings

Embeddings with `embed()`. Shows how to generate vector embeddings for text using OpenAI’s embedding models.

### Image Generation

Image generation with `generateImage()`. Demonstrates creating images from text prompts with configurable size options.

### Structured Outputs
Typed structured output with `generateObject()`. Shows how to generate strongly-typed JSON objects validated with Zod schemas.

### Streaming Objects

Streaming structured output with `streamObject()`. Demonstrates real-time streaming of structured JSON with incremental validation.

## Provider Examples

Core AI supports multiple AI providers with the same unified API.

### Anthropic Provider
Using Anthropic (Claude) with the same `generate()` API. Shows how to switch providers while keeping the same code structure.

### Google GenAI Provider

Using Google GenAI (Gemini) with the same `generate()` API. Demonstrates provider portability with Google’s models.

### Mistral Provider

Using Mistral with the same `generate()` API. Shows how easily you can integrate Mistral’s models into your application.

## Example Categories
### 1. Getting Started

Start with these examples if you’re new to Core AI:

- `01-chat-completion.ts` - Your first chat completion
- `02-streaming.ts` - Real-time streaming responses
- `07-error-handling.ts` - Proper error-handling patterns
### 2. Advanced Capabilities

Explore powerful features:

- `03-tool-calling.ts` - Enable models to call functions
- `05-embeddings.ts` - Generate vector embeddings
- `06-image-generation.ts` - Create images from text
- `04-multi-modal.ts` - Process text and images together
### 3. Structured Data

Work with typed, validated outputs:

- `11-generate-object.ts` - Generate structured JSON
- `12-stream-object.ts` - Stream structured data
### 4. Multi-Provider

Experience true provider portability:

- `08-anthropic-provider.ts` - Claude models
- `09-google-genai-provider.ts` - Gemini models
- `10-mistral-provider.ts` - Mistral models
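The portability these provider examples demonstrate comes down to one idea: application code targets a shared interface, so swapping providers is a one-line change. A self-contained sketch with stub providers — nothing below is the real Core AI API:

```typescript
// Hypothetical shared interface; real providers would call their HTTP APIs.
interface Provider {
  name: string;
  complete: (prompt: string) => string;
}

// Two stub providers implementing the same interface.
const openai: Provider = { name: "openai", complete: (p) => `[openai] ${p}` };
const anthropic: Provider = { name: "anthropic", complete: (p) => `[anthropic] ${p}` };

// Application code is written once against the interface...
function generateGreeting(provider: Provider): string {
  return provider.complete("Say hello");
}

// ...so switching providers changes only the argument.
console.log(generateGreeting(openai));    // "[openai] Say hello"
console.log(generateGreeting(anthropic)); // "[anthropic] Say hello"
```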