Version: v1.0.0

Runner (LEIA Customer API)

Repository: leia-org/leia-runner

The LEIA Runner is the AI session execution engine. It manages LEIA instances, handles student–AI conversations in real time, and integrates with LLM providers (OpenAI). It is consumed by both the Designer Backend and the Workbench Backend.


Tech Stack

| Technology | Purpose |
| --- | --- |
| Node.js + Express.js | Runtime and HTTP server |
| Redis | Session state and task queuing |
| OpenAI SDK | LLM provider integration |
| Zod | Request schema validation |
| Swagger UI | Interactive API documentation |
| Jest | Testing |
| nodemon | Dev server with auto-reload |
| Docker | Containerization |

Prerequisites

  • Node.js >= 16.x
  • npm
  • Redis running locally (default: redis://localhost:6379)
  • A valid OpenAI API key

Project Structure

leia-runner/
├── api/ # OpenAPI/Swagger spec files
├── config/ # Configuration and environment loading
├── controllers/ # Request handlers for each route
├── models/ # Data models and schemas
├── routes/ # Route definitions
├── services/ # Core business logic and LLM integration
├── tests/ # Jest test suites
├── utils/ # Utility functions
├── index.js # Application initialization
├── server.js # Server entry point
├── .env.example # Environment variable template
├── .oastoolsrc # OpenAPI tooling configuration
├── Dockerfile # Container build configuration
└── package.json # Dependencies and npm scripts

Environment Variables

Copy the example file and fill in your values:

cp .env.example .env
| Variable | Default | Description |
| --- | --- | --- |
| PORT | 5002 | HTTP server port |
| REDIS_URL | redis://localhost:6379 | Redis connection string |
| RUNNER_KEY | R2D2C3PO | Bearer token required by callers to authenticate requests |
| OPENAI_API_KEY | (required) | OpenAI API key for LLM calls |
| DEFAULT_MODEL | openai-assistant | Default LLM model used for new sessions |
warning

OPENAI_API_KEY has no default; the server will not function without it. Change RUNNER_KEY from its default value before any non-local deployment.
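A minimal .env for local development might look like the following (the values shown are illustrative placeholders; supply your own and never commit the file):

```bash
PORT=5002
REDIS_URL=redis://localhost:6379
RUNNER_KEY=change-me-before-deploying
OPENAI_API_KEY=<your-openai-api-key>
DEFAULT_MODEL=openai-assistant
```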


Local Development

  1. Fork and clone the repository:

    git clone <your-fork-url>
    cd leia-runner
  2. Install dependencies:

    npm install
  3. Copy the environment template and configure your values:

    cp .env.example .env

    At minimum, set OPENAI_API_KEY to your OpenAI key.

  4. Make sure Redis is running locally on port 6379.

  5. Start the development server with auto-reload:

    npm run dev

The API will be available at http://localhost:5002. Interactive Swagger documentation is served at http://localhost:5002/docs.


Available Scripts

| Script | Command | Description |
| --- | --- | --- |
| Dev server | npm run dev | Start with nodemon (auto-reload) |
| Production | npm start | Start the production server |
| Tests | npm test | Run all Jest tests |
| Install | npm run setup | Install all dependencies |
| Update deps | npm run update-deps | Update all dependencies |

API Reference

All endpoints are prefixed with /api/v1. Every request must include:

Authorization: Bearer <RUNNER_KEY>
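Every call therefore needs this bearer header. As a sketch, a small helper that assembles the common fetch() options for an authenticated Runner call (the function name and shape are illustrative, not part of the Runner's codebase):

```javascript
// Build fetch() options for an authenticated Runner API call.
// `runnerKey` must match the RUNNER_KEY the server was started with.
function runnerRequest(runnerKey, body) {
  return {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${runnerKey}`,
    },
    body: JSON.stringify(body),
  };
}

const opts = runnerRequest("R2D2C3PO", { message: "hi" });
console.log(opts.headers.Authorization); // "Bearer R2D2C3PO"
```

Requests without (or with a wrong) RUNNER_KEY are rejected, so wire the key in from the environment rather than hard-coding it.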

Sessions

| Method | Endpoint | Description |
| --- | --- | --- |
| POST | /leias | Create a new LEIA session instance |
| POST | /leias/:sessionId/messages | Send a message to an active session |

POST /leias request body:

{
  "sessionId": "unique-session-id",
  "leia": {
    "spec": {
      "persona": { },
      "behaviour": { },
      "problem": { }
    }
  },
  "runnerConfiguration": {
    "provider": "openai-assistant"
  }
}
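Putting the body together from Node might look like this sketch (the session ID and empty spec objects are placeholders; in practice the persona/behaviour/problem specs come from the Designer):

```javascript
// Assemble the POST /leias payload for a new LEIA session.
function buildSessionPayload(sessionId, spec, provider = "openai-assistant") {
  return {
    sessionId,
    leia: { spec },
    runnerConfiguration: { provider },
  };
}

const payload = buildSessionPayload("demo-1", {
  persona: {},
  behaviour: {},
  problem: {},
});

// To actually create the session (assumes the Runner is up on port 5002):
// await fetch("http://localhost:5002/api/v1/leias", {
//   method: "POST",
//   headers: {
//     "Content-Type": "application/json",
//     Authorization: `Bearer ${process.env.RUNNER_KEY}`,
//   },
//   body: JSON.stringify(payload),
// });
```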

POST /leias/:sessionId/messages request body:

{ "message": "Hello, I need help with this problem." }

Models

| Method | Endpoint | Description |
| --- | --- | --- |
| GET | /models | List available LLM models and the current default |

Available models:

| Model ID | Description |
| --- | --- |
| openai | Standard OpenAI chat completion |
| openai-assistant | OpenAI Assistants API (default) |
| openai-advanced | Advanced reasoning model |

Evaluation

| Method | Endpoint | Description |
| --- | --- | --- |
| POST | /evaluation | Evaluate a participant's final result against the LEIA's problem criteria |

Request body:

{
  "sessionId": "unique-session-id",
  "result": "The participant's final answer..."
}

Response:

{
  "evaluation": "Detailed evaluation text...",
  "score": 85
}
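A small sketch of interpreting this response on the caller's side (the 60-point pass mark is an arbitrary example, not something the API defines):

```javascript
// Interpret an /evaluation response: `score` is numeric and
// `evaluation` carries the written feedback.
function summarizeEvaluation(response, passMark = 60) {
  return {
    passed: response.score >= passMark,
    feedback: response.evaluation,
  };
}

const result = summarizeEvaluation({
  evaluation: "Detailed evaluation text...",
  score: 85,
});
console.log(result.passed); // true
```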

Problem Generation

| Method | Endpoint | Description |
| --- | --- | --- |
| POST | /problems/generate | Use AI to generate a new problem definition |

Cache

| Method | Endpoint | Description |
| --- | --- | --- |
| DELETE | /cache/purge | Clear cached session data. Accepts ?sessionId=<id> to target one session |
| GET | /cache/stats | Get Redis cache statistics |
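Since the session ID ends up in a query string, it is worth building the purge URL with the WHATWG URL API rather than string concatenation, so the value is encoded correctly (the base URL here assumes a local Runner):

```javascript
// Build the DELETE /cache/purge URL, optionally scoped to one session.
// URLSearchParams URL-encodes the sessionId automatically.
function purgeUrl(base, sessionId) {
  const url = new URL("/api/v1/cache/purge", base);
  if (sessionId) url.searchParams.set("sessionId", sessionId);
  return url.toString();
}

console.log(purgeUrl("http://localhost:5002", "demo-1"));
// "http://localhost:5002/api/v1/cache/purge?sessionId=demo-1"
```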

Transcription

| Method | Endpoint | Description |
| --- | --- | --- |
| POST | /transcriptions/generate | Generate a text transcription from an audio file (multipart/form-data) |
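A multipart upload can be sketched with the global FormData/Blob available in Node 18+ (the "file" field name is an assumption here; check the Swagger UI for the exact schema):

```javascript
// Sketch of building a multipart body for /transcriptions/generate.
// The "file" field name is assumed, not confirmed by the Runner docs.
function buildTranscriptionForm(audioBytes, filename) {
  const form = new FormData();
  form.append("file", new Blob([audioBytes], { type: "audio/mpeg" }), filename);
  return form;
}

const form = buildTranscriptionForm(new Uint8Array([0xff, 0xfb]), "clip.mp3");
// fetch("http://localhost:5002/api/v1/transcriptions/generate", {
//   method: "POST",
//   headers: { Authorization: `Bearer ${process.env.RUNNER_KEY}` },
//   body: form, // fetch sets the multipart boundary automatically
// });
```

Do not set Content-Type yourself when sending FormData; fetch generates the boundary header for you.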

Full request/response schemas are available in the interactive Swagger UI at http://localhost:5002/docs.


Contributing

  1. Fork the repository and create a branch off main:

    git checkout -b feat/my-feature
  2. Make sure Redis is running and your .env is configured before running any tests.

  3. Write or update Jest tests for any new or modified endpoints:

    npm test
  4. Use Conventional Commits for your commit messages (feat:, fix:, docs:, etc.).

  5. Open a Pull Request with a clear description of the changes to the session execution logic or API surface.