Cyrus now supports Cursor and OpenAI models

Pegah Vaezi · March 03, 2026

You drop a Linear issue. Cyrus picks it up. Thirty minutes later, there's a pull request in your repo — tested, reviewed, with a demo and a deploy preview, ready to merge.

Today, we're making that even faster.

Cyrus can now run Cursor Composer and OpenAI Codex — not just Claude. This means you can choose the right model for each task, optimize for speed or accuracy, and even bring your own OpenAI subscription.

Why this matters

The bottleneck in most teams isn't talent — it's capacity. Every sprint has five things competing for the same two developers. That feature request that should take an hour ends up waiting a week because the team is buried.

With Cursor and OpenAI support, Cyrus gives you more options to eliminate that bottleneck:

  • Cursor Composer 1.5 generates code at ~250 tokens/sec — roughly 4x faster than before
  • OpenAI Codex scores 77.3% on Terminal-Bench 2.0, a leading benchmark for agentic coding in the terminal
  • Codex Spark can run at up to 1,000 tokens/sec when you bring your own Codex subscription

You pick the model. Cyrus does the work. That feature that was going to wait till next week? It ships before lunch.

Cursor integration

Cursor has become the IDE of choice for AI-powered development. Now, Cyrus can run the Cursor agent directly to process your Linear issues — leveraging Cursor Composer's speed and code intelligence while orchestrating your full workflow.

How it works

When you configure Cursor as your model provider, Cyrus:

  1. Runs the Cursor agent on your issues using the cursor-agent CLI
  2. Generates code 4x faster with Cursor Composer 1.5's 250 tokens/sec throughput
  3. Maintains your workflow — running tests, creating PRs, updating Linear automatically
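The steps above can be pictured as a pipeline. This is an illustrative sketch only, not Cyrus's actual internals: the step names and signatures are hypothetical, and each step is injected so the flow can be shown without real side effects.

```typescript
// Hypothetical sketch of the orchestration loop; not Cyrus's real API.
// Each step is injected so the pipeline can run without side effects.
type Step = (issueId: string, log: string[]) => void;

interface Pipeline {
  runAgent: Step;        // e.g. shell out to the cursor-agent CLI
  runTests: Step;        // run the repo's test suite
  openPullRequest: Step; // open a PR with the generated changes
  updateLinear: Step;    // move the Linear issue and post a comment
}

function processIssue(issueId: string, p: Pipeline): string[] {
  const log: string[] = [];
  // Mirrors the three steps above: generate, verify, then report back.
  p.runAgent(issueId, log);
  p.runTests(issueId, log);
  p.openPullRequest(issueId, log);
  p.updateLinear(issueId, log);
  return log;
}
```

The point of the shape is that the model provider only changes the `runAgent` step; the surrounding test, PR, and Linear steps stay the same regardless of which model generates the code.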

The result: faster code generation without changing how Cyrus orchestrates your development process.

Setting up Cursor

In Cyrus:

  1. Navigate to your integrations settings
  2. Find the Cursor tile and click "Configure"
  3. Paste your Cursor API key

In Cursor:

  1. Go to the Cursor dashboard → Integrations tab
  2. Create a new User API Key named "Cyrus"
  3. Copy the key and paste it into Cyrus

For self-hosted deployments:

Install the cursor-agent CLI on your Cyrus server:

# Follow Cursor's official installation guide
# Verify installation
cursor-agent --version

# Make sure it's in your PATH
which cursor-agent

Once configured, you can set Cursor as your default model provider or use it for specific issue types. View full Cursor setup guide →

OpenAI Codex support

OpenAI Codex brings proven accuracy and flexibility. Whether you're using an OpenAI API key or bringing your own ChatGPT/Codex subscription, Cyrus can now route issues to OpenAI models.

Why use OpenAI Codex

  • 77.3% on Terminal-Bench 2.0 — an industry-leading score on agentic coding benchmarks
  • Up to 1,000 tokens/sec with Codex Spark when using your own subscription
  • Bring your own subscription — use your existing ChatGPT or Codex plan instead of API credits
  • Cost control — set usage limits to manage costs effectively

Authentication options

Cyrus supports two ways to connect OpenAI:

Option 1: API Key

  1. Create an API key at platform.openai.com
  2. Copy the key immediately (it won't be shown again)
  3. Add it to Cyrus integrations settings
  4. Make sure billing is configured in your OpenAI account

Option 2: ChatGPT/Codex Subscription

For self-hosted deployments:

# Authenticate with your subscription
codex login

For cloud-hosted deployments:

  1. Enable device code authentication in OpenAI security settings
  2. Use the device authorization flow in Cyrus
  3. Follow the prompts to authenticate your subscription
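The device authorization flow in the steps above follows the standard OAuth 2.0 pattern (RFC 8628). Here is a generic sketch of the polling half with the token endpoint stubbed out; this is not Cyrus's or OpenAI's actual client code, and a real client would `await` an HTTP POST and sleep between attempts.

```typescript
// Generic OAuth 2.0 device-authorization polling loop (RFC 8628) in outline.
// Not Cyrus's or OpenAI's real client code: `poll` stands in for a POST to
// the provider's token endpoint with the device code.
type PollResult =
  | { status: "pending" }              // user hasn't approved the device yet
  | { status: "ok"; accessToken: string };

function waitForAuthorization(
  poll: () => PollResult,
  maxAttempts = 60,
): string {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const result = poll();
    if (result.status === "ok") return result.accessToken;
    // "pending": keep polling until the user finishes approving in a browser.
  }
  throw new Error("device authorization timed out");
}
```

The appeal of this flow for cloud-hosted deployments is that the server never sees your password: it polls with a short-lived device code while you approve the login in your own browser.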

Configuring OpenAI as default

Once authenticated, you can:

  • Set OpenAI as your default model provider for all issues
  • Configure optional usage limits
  • Mix and match with Claude for different issue types

View full OpenAI setup guide →

Choosing the right model

Different models excel at different tasks. Here's how teams are using Cyrus with multiple providers:

Speed-first workflows:

  • Use Cursor Composer for quick iterations and rapid prototyping
  • Generate code 4x faster when speed matters more than complexity

Accuracy-first workflows:

  • Use OpenAI Codex for mission-critical features
  • Leverage 77.3% Terminal-Bench accuracy for production code

Cost-optimized workflows:

  • Bring your own OpenAI subscription and use Codex Spark at 1,000 tokens/sec
  • Set usage limits to control costs while maintaining automation

Hybrid workflows:

  • Use Claude Opus 4.6 for complex architectural decisions
  • Use Cursor Composer for rapid UI development
  • Use OpenAI Codex for API integrations and backend logic
  • Let Cyrus route issues to the right model automatically
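One way to picture that routing is a simple label-to-provider mapping. The example below is hypothetical; Cyrus's actual routing is configured in the dashboard, and the label names here are made up for illustration.

```typescript
// Hypothetical label-based router; Cyrus's real routing rules are configured
// in the product, not hard-coded like this.
type Provider = "claude" | "cursor" | "codex";

function pickProvider(labels: string[]): Provider {
  if (labels.includes("architecture")) return "claude"; // complex design work
  if (labels.includes("ui")) return "cursor";           // fast iteration
  if (labels.includes("backend")) return "codex";       // accuracy-sensitive
  return "claude"; // fallback default
}
```

For example, an issue labeled "ui" would route to Cursor Composer for speed, while an unlabeled issue falls through to the default provider.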

Why model flexibility matters

By supporting Cursor and OpenAI alongside Claude, Cyrus adapts to how you already work rather than forcing you into a single approach. This means:

  • Reduced vendor lock-in: Your automation workflows aren't tied to a single model provider
  • Cost optimization: Bring your own subscriptions or use API keys as needed
  • Performance tuning: Choose speed, accuracy, or cost based on each task
  • Future-proofing: As new models emerge, Cyrus can integrate them seamlessly

Getting started

Setting up Cursor

Navigate to your Cyrus integrations settings, find the Cursor tile, and click "Configure." You'll need a User API Key from your Cursor dashboard. For self-hosted deployments, install the cursor-agent CLI on your server.

View full Cursor setup guide →

Setting up OpenAI

Add your OpenAI API key through the Cyrus integrations panel, or authenticate with your ChatGPT/Codex subscription using device authorization flow. Once connected, you can set OpenAI as your default provider or use it selectively.

View full OpenAI setup guide →

If you're already using Cyrus, the new integrations are available now in your dashboard. Not using Cyrus yet? Get started today and see how model flexibility eliminates your development bottlenecks.

Trusted by product teams at Retool, Gamma, TinyFish, and more

Break free from the terminal

As your Claude Code-powered Linear agent, Cyrus can tackle whatever issues you throw at it, large or small. Get PMs, designers, and the CEO shipping product.