Mastra is an opinionated TypeScript framework that helps you build AI applications and features quickly.
Features:
LLM Models: Uses the Vercel AI SDK for model routing, offering a unified interface to interact with LLM providers such as OpenAI, Anthropic, and Google Gemini. Allows selection of specific models and providers, with the option to stream responses.
Agents: Systems where the language model determines a sequence of actions. In Mastra, agents equip LLM models with tools, workflows, and synced data. They can access knowledge bases and call functions or third-party APIs (a minimal agent-and-tool sketch follows this list).
Tools: Typed functions that can be executed by agents or workflows, featuring built-in integration access and parameter validation. Each tool includes a schema defining inputs, an executor function implementing its logic, and access to configured integrations.
Workflows: Durable, graph-based state machines that support loops, branching, waiting for human input, embedding other workflows, error handling, retries, and parsing. They can be built in code or with a visual editor, and each step has built-in OpenTelemetry tracing (a small two-step workflow is sketched after the list).
RAG (Retrieval-Augmented Generation): Enables the construction of knowledge bases for agents through an ETL pipeline that includes chunking, embedding, and vector search (see the chunk-and-embed sketch after the list).
Integrations: Auto-generated, type-safe API clients for third-party services that can be used as tools for agents or steps in workflows.
Evals: Automated tests that evaluate LLM outputs using model-graded, rule-based, and statistical methods, providing a normalized score between 0 and 1 for logging and comparison. Evals can be customized with user-defined prompts and scoring functions (a custom rule-based scorer is sketched after the list).
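To make the Agents and Tools features concrete, here is a minimal sketch of a typed tool attached to an agent, with model routing through the Vercel AI SDK. The `createTool`, `Agent`, and `@ai-sdk/openai` imports follow Mastra's documentation at the time of writing but should be treated as assumptions against your installed version; the weather lookup itself is invented for illustration.

```typescript
import { Agent } from "@mastra/core/agent";
import { createTool } from "@mastra/core/tools";
import { openai } from "@ai-sdk/openai"; // model routing via the Vercel AI SDK
import { z } from "zod";

// A typed tool: inputSchema is validated before execute() runs.
const weatherTool = createTool({
  id: "get-weather",
  description: "Return the current temperature for a city",
  inputSchema: z.object({ city: z.string() }),
  outputSchema: z.object({ temperatureC: z.number() }),
  execute: async ({ context }) => {
    // context holds the validated input ({ city }); swap in a real API call here.
    return { temperatureC: 21 };
  },
});

// An agent that can decide to call the tool while answering.
export const weatherAgent = new Agent({
  name: "weather-agent",
  instructions: "Answer weather questions. Use the get-weather tool for current data.",
  model: openai("gpt-4o-mini"), // any provider/model supported by the AI SDK
  tools: { weatherTool },
});

// Usage: const reply = await weatherAgent.generate("How warm is it in Oslo right now?");
```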
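Similarly, a sketch of a small two-step workflow. Mastra's workflow API has changed across releases; the `createWorkflow`/`createStep` shape below follows the newer documentation, so verify it against the version you install. The order/discount steps are toy examples.

```typescript
import { createWorkflow, createStep } from "@mastra/core/workflows";
import { z } from "zod";

// Each step declares typed input/output schemas; every run is traced.
const fetchOrder = createStep({
  id: "fetch-order",
  inputSchema: z.object({ orderId: z.string() }),
  outputSchema: z.object({ total: z.number() }),
  execute: async ({ inputData }) => {
    // Look up inputData.orderId in a real system; hard-coded here.
    return { total: 42 };
  },
});

const applyDiscount = createStep({
  id: "apply-discount",
  inputSchema: z.object({ total: z.number() }),
  outputSchema: z.object({ discountedTotal: z.number() }),
  execute: async ({ inputData }) => ({ discountedTotal: inputData.total * 0.9 }),
});

// Chain the steps into a durable, graph-based workflow.
export const orderWorkflow = createWorkflow({
  id: "order-workflow",
  inputSchema: z.object({ orderId: z.string() }),
  outputSchema: z.object({ discountedTotal: z.number() }),
})
  .then(fetchOrder)
  .then(applyDiscount)
  .commit();
```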
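For the RAG pipeline, a sketch of the chunking and embedding stages. `MDocument` from `@mastra/rag` and `embedMany` from the AI SDK match the documented pattern, but the exact chunking options are assumptions; writing the vectors into a vector store for similarity search is left out.

```typescript
import { MDocument } from "@mastra/rag";
import { embedMany } from "ai"; // Vercel AI SDK
import { openai } from "@ai-sdk/openai";

// Split a raw document into overlapping chunks suitable for retrieval.
const doc = MDocument.fromText("Long source text about your product...");
const chunks = await doc.chunk({
  strategy: "recursive",
  size: 512,
  overlap: 50,
});

// Embed each chunk; the resulting vectors would then be stored in a
// vector database and queried by similarity at answer time.
const { embeddings } = await embedMany({
  model: openai.embedding("text-embedding-3-small"),
  values: chunks.map((chunk) => chunk.text),
});
```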
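Finally, a sketch of a custom rule-based eval that returns a score normalized to 0-1. Extending a `Metric` base class with a `measure` method is the pattern Mastra's docs describe for user-defined evals; the import path and result shape are assumptions to check against your version, and the keyword-coverage rule is just an illustration.

```typescript
import { Metric, type MetricResult } from "@mastra/core/eval";

// Rule-based eval: what fraction of required keywords appears in the output?
export class KeywordCoverageMetric extends Metric {
  constructor(private keywords: string[]) {
    super();
  }

  async measure(_input: string, output: string): Promise<MetricResult> {
    const text = output.toLowerCase();
    const missing = this.keywords.filter((kw) => !text.includes(kw.toLowerCase()));
    const score =
      this.keywords.length === 0 ? 1 : 1 - missing.length / this.keywords.length;
    // Score is normalized to 0-1 so it can be logged and compared like built-in evals.
    return { score, info: { missing } };
  }
}

// Usage:
// const { score } = await new KeywordCoverageMetric(["refund", "policy"]).measure(question, answer);
```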
Mastra can be run locally or deployed to a serverless cloud. To start using Mastra, you'll need Node.js (v20.0+) and an API key from an LLM provider such as OpenAI, Anthropic, or Google Gemini. You can quickly scaffold a new Mastra application using the create-mastra CLI tool.
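Once a project is scaffolded, the pieces above are wired together through a central Mastra instance, which the local dev server and deployment targets pick up. A rough sketch, reusing the agent and workflow from the earlier examples; the `Mastra` constructor comes from `@mastra/core`, while the local file paths are hypothetical:

```typescript
import { Mastra } from "@mastra/core";
// Hypothetical local modules containing the earlier sketches.
import { weatherAgent } from "./agents/weather-agent";
import { orderWorkflow } from "./workflows/order-workflow";

// Central registry: everything registered here is served by the local
// dev server and bundled when the project is deployed.
export const mastra = new Mastra({
  agents: { weatherAgent },
  workflows: { orderWorkflow },
});
```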