llm
asterai
LLM prompting across multiple providers
v1.0.0
Public
Unified LLM component supporting 12 providers through a single `prompt` function. Pass a model string in `provider/model` format and get a response back: no SDK juggling, no provider-specific code.
Supported providers: OpenAI, Anthropic, Mistral, Groq, Google, Venice, xAI, DeepSeek, Together, Fireworks, Perplexity, and OpenRouter. API keys are configured via environment variables (e.g. `OPENAI_KEY`, `ANTHROPIC_KEY`).
Switching models is a one-line change: go from `openai/gpt-5-mini` to `anthropic/claude-opus-4-6` or `deepseek/deepseek-chat` without touching any other code (see the comparison sketch at the end of this page). OpenRouter support also gives you access to hundreds of additional models through a single key.
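For illustration, here is a minimal sketch of a call from a Rust guest component. Everything beyond the listed parameter names is an assumption: the `llm` module below is a local stub standing in for bindings generated against `asterai:llm/llm@1.0.0`, and the `String` return type is inferred from "get a response back" rather than from the interface listing below.

```rust
// Sketch only. In a real guest, `prompt` would come from bindings generated
// for a world that imports asterai:llm/llm@1.0.0; this stub just mirrors the
// listed signature, with the string return type assumed.
mod llm {
    pub fn prompt(_prompt: &str, _model: &str) -> String {
        unimplemented!("provided by the asterai:llm component at runtime")
    }
}

fn answer(question: &str) -> String {
    // The matching provider key (e.g. OPENAI_KEY) must be set in the
    // component's environment, per the configuration note above.
    llm::prompt(question, "openai/gpt-5-mini")
}

fn answer_with_claude(question: &str) -> String {
    // Switching providers is just a different model string.
    llm::prompt(question, "anthropic/claude-opus-4-6")
}
```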
Interface
Exports
asterai:llm/llm@1.0.0
prompt(prompt: string, model: string)
Imports
wasi:io/poll@0.2.2
wasi:clocks/monotonic-clock@0.2.2
wasi:clocks/wall-clock@0.2.2
wasi:random/random@0.2.2
wasi:io/error@0.2.2
wasi:io/streams@0.2.2
wasi:cli/stdout@0.2.2
wasi:cli/stderr@0.2.2
wasi:cli/stdin@0.2.2
wasi:http/types@0.2.2
wasi:http/outgoing-handler@0.2.2
wasi:cli/environment@0.2.0
wasi:cli/exit@0.2.0
wasi:filesystem/types@0.2.0
wasi:filesystem/preopens@0.2.0
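The provider-agnostic signature also makes side-by-side comparisons trivial. The sketch below reuses the stubbed `llm::prompt` and the same assumptions as the example above, and only uses model strings named on this page.

```rust
// Run one prompt against several providers; only the model string changes.
fn compare(question: &str) -> Vec<(&'static str, String)> {
    ["openai/gpt-5-mini", "anthropic/claude-opus-4-6", "deepseek/deepseek-chat"]
        .into_iter()
        .map(|model| (model, llm::prompt(question, model)))
        .collect()
}
```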