The embed() function generates vector embeddings for text input using embedding models. Embeddings are useful for semantic search, clustering, recommendations, and other AI tasks that require numerical representations of text.
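Embeddings are plain numeric vectors, so downstream tasks like semantic search boil down to vector math. As an illustrative sketch (this helper is not part of the SDK), search typically ranks candidates by cosine similarity between a query embedding and each document embedding:

```typescript
// Cosine similarity between two embedding vectors.
// Returns a value in [-1, 1]; higher means semantically closer.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}
```

To rank search results, compute the similarity of the query embedding against each stored document embedding and sort in descending order.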
```typescript
import { openai } from '@coreai/openai';
// embed() is imported from the SDK's core package (path depends on your setup)

const result = await embed({
  model: openai.embedding('text-embedding-3-small'),
  input: 'This is a test',
  dimensions: 256, // reduce from the default 1536 to 256
});

console.log(result.embeddings[0].length); // 256
```
```typescript
const result = await embed({
  model: openai.embedding('text-embedding-3-small'),
  input: 'Sample text for embedding',
});

if (result.usage) {
  console.log('Tokens used:', result.usage.inputTokens);
} else {
  console.log('Usage information not available');
}
```
```typescript
const articles = [
  'Python programming tutorial',
  'JavaScript web development',
  'Cooking pasta recipes',
  'Italian cuisine guide',
  'TypeScript type system',
];

const result = await embed({
  model: openai.embedding('text-embedding-3-small'),
  input: articles,
});

// Use the embeddings for clustering (e.g., k-means) to
// group similar articles together.
const embeddings = result.embeddings;
```
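The clustering step mentioned above can be sketched with a minimal k-means loop over the embedding vectors. This is an illustrative implementation, not part of the SDK; it uses deterministic, evenly spaced initial centroids to keep the sketch simple:

```typescript
// Minimal k-means sketch: assigns each vector to its nearest
// centroid, recomputes centroids as cluster means, and repeats.
// Returns the cluster index for each input vector.
function kMeans(points: number[][], k: number, iterations = 10): number[] {
  const dim = points[0].length;
  // Deterministic init: pick evenly spaced points as starting centroids.
  let centroids = Array.from({ length: k }, (_, i) =>
    points[Math.floor((i * points.length) / k)].slice(),
  );
  let assignments: number[] = new Array(points.length).fill(0);

  for (let iter = 0; iter < iterations; iter++) {
    // Assignment step: nearest centroid by squared Euclidean distance.
    assignments = points.map((p) => {
      let best = 0;
      let bestDist = Infinity;
      centroids.forEach((c, j) => {
        const d = p.reduce((sum, v, i) => sum + (v - c[i]) ** 2, 0);
        if (d < bestDist) {
          bestDist = d;
          best = j;
        }
      });
      return best;
    });

    // Update step: each centroid becomes the mean of its members.
    centroids = centroids.map((c, j) => {
      const members = points.filter((_, i) => assignments[i] === j);
      if (members.length === 0) return c; // keep empty clusters in place
      return Array.from(
        { length: dim },
        (_, i) => members.reduce((sum, p) => sum + p[i], 0) / members.length,
      );
    });
  }
  return assignments;
}
```

In practice you would pass `result.embeddings` as `points`; articles assigned the same cluster index are the ones whose embeddings sit close together.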
Different providers have different embedding models and capabilities:
```typescript
// OpenAI
import { openai } from '@coreai/openai';

const openaiEmbed = openai.embedding('text-embedding-3-small');
const openaiLarge = openai.embedding('text-embedding-3-large');

// Other providers may have their own embedding models;
// check the provider documentation for available models.
```
Use smaller dimension sizes when possible to reduce storage and computation costs. The text-embedding-3-small and text-embedding-3-large models support custom dimensions.
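A quick back-of-envelope calculation shows why this matters. Assuming embeddings are stored as 32-bit floats (4 bytes per dimension, a common choice, though actual storage depends on your vector store):

```typescript
// Storage for float32 embeddings: vectors * dimensions * 4 bytes each.
function storageBytes(numVectors: number, dims: number): number {
  return numVectors * dims * 4;
}

// One million vectors at the default 1536 dimensions vs. a reduced 256:
const full = storageBytes(1_000_000, 1536); // 6_144_000_000 bytes (~6.1 GB)
const small = storageBytes(1_000_000, 256); // 1_024_000_000 bytes (~1 GB)
console.log(full / small); // 6
```

Dropping from 1536 to 256 dimensions cuts both storage and per-comparison compute by a factor of six, at some cost in retrieval quality that you should measure on your own data.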