GPT CLI (Rust)
A lightweight command-line interface for chatting with AI models (OpenAI and Anthropic), written in Rust.
This is a Rust rewrite of the original Python gptCLI, providing the same functionality with improved performance and memory safety.
Features
- Multi-provider support - Works with both OpenAI and Anthropic models
- Session persistence - Conversations are automatically saved and can be resumed later
- Model switching - Change models on the fly with interactive selection
- Web search - Enable the web search tool for OpenAI models (when supported)
- Reasoning summaries - Enable reasoning summaries for compatible OpenAI models
- Interactive commands - Full set of slash commands for session management
- Cross-platform - Runs on Linux, macOS, and Windows
Installation
Prerequisites
- Rust 1.70 or later
- API keys for the providers you want to use:
  - OPENAI_API_KEY for OpenAI models
  - ANTHROPIC_API_KEY for Anthropic models
Build from source
git clone <repository-url>
cd gpt-cli-rust
cargo build --release
The binary will be available at target/release/gpt-cli-rust.
Usage
# Run with default session and model
./target/release/gpt-cli-rust
# Start with a specific session
./target/release/gpt-cli-rust --session my-session
# Start with a specific model
./target/release/gpt-cli-rust --model claude-3-5-sonnet-20241022
# Combine options
./target/release/gpt-cli-rust --session work --model gpt-4o
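The flags above are parsed in main.rs. As a rough sketch of what that could look like, assuming the clap crate with its derive feature (the struct, field names, and defaults below are illustrative, not the project's actual code):

```rust
// Hypothetical argument definitions, assuming `clap` with the "derive" feature enabled.
use clap::Parser;

#[derive(Parser, Debug)]
#[command(name = "gpt-cli-rust")]
struct Args {
    /// Session to load or create (a default session is used if omitted).
    #[arg(long, default_value = "default")]
    session: String,

    /// Model to start with; falls back to DEFAULT_MODEL or gpt-5 if omitted.
    #[arg(long)]
    model: Option<String>,
}

fn main() {
    let args = Args::parse();
    println!("session = {}, model = {:?}", args.session, args.model);
}
```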
Supported Models
OpenAI
- gpt-4.1
- gpt-4.1-mini
- gpt-4o
- gpt-5 (default)
- gpt-5-chat-latest
- o1
- o3
- o4-mini
- o3-mini
Anthropic
- claude-3-5-sonnet-20241022
- claude-3-5-haiku-20241022
- claude-3-opus-20240229
- claude-3-sonnet-20240229
- claude-3-haiku-20240307
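Each model name is mapped to its provider (handled in src/core/provider.rs). A minimal sketch of how such a lookup might work; the enum and function names are illustrative, not the crate's real API:

```rust
// Illustrative provider routing based on model-name prefixes; not the project's real code.
#[derive(Debug, PartialEq)]
enum Provider {
    OpenAI,
    Anthropic,
}

fn provider_for(model: &str) -> Option<Provider> {
    if model.starts_with("claude-") {
        Some(Provider::Anthropic)
    } else if ["gpt-", "o1", "o3", "o4"].iter().any(|&p| model.starts_with(p)) {
        Some(Provider::OpenAI)
    } else {
        None
    }
}

fn main() {
    assert_eq!(provider_for("gpt-5"), Some(Provider::OpenAI));
    assert_eq!(provider_for("claude-3-5-sonnet-20241022"), Some(Provider::Anthropic));
    assert_eq!(provider_for("unknown-model"), None);
}
```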
Commands
Chat Commands
- Type normally to chat with the AI
- Use /help to see all available commands
Session Management
- /list - List all saved sessions
- /new <name> - Create a new session
- /switch [name] - Switch to another session (interactive picker if no name)
- /delete [name] - Delete a session (interactive picker if no name)
- /clear - Clear current conversation
Model Management
- /model [name] - Switch model (interactive picker if no name)
- /models - List all supported models
Features (OpenAI only)
- /tool websearch on|off - Enable/disable web search
- /reasoning on|off - Enable/disable reasoning summaries
- /effort [low|medium|high] - Set reasoning effort level (GPT-5 only)
Other
- /help - Show help
- /exit - Exit the CLI
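Slash commands are parsed and dispatched by the main loop in src/cli.rs. A rough sketch of how dispatch for a subset of the commands above could be written; the type and function names are assumptions, not the project's actual code:

```rust
// Hypothetical command parsing; covers only a subset of the commands listed above.
enum Command<'a> {
    Help,
    Exit,
    List,
    New(&'a str),
    Model(Option<&'a str>),
    Chat(&'a str),
}

fn parse(input: &str) -> Command<'_> {
    let mut parts = input.splitn(2, ' ');
    match parts.next() {
        Some("/help") => Command::Help,
        Some("/exit") => Command::Exit,
        Some("/list") => Command::List,
        Some("/new") => Command::New(parts.next().unwrap_or("").trim()),
        Some("/model") => Command::Model(parts.next().map(str::trim)),
        // Everything else is treated as chat input and sent to the model.
        _ => Command::Chat(input),
    }
}

fn main() {
    match parse("/new work") {
        Command::New(name) => println!("create session: {name}"),
        _ => println!("other input"),
    }
}
```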
Environment Variables
- OPENAI_API_KEY - Your OpenAI API key (required for OpenAI models)
- ANTHROPIC_API_KEY - Your Anthropic API key (required for Anthropic models)
- OPENAI_BASE_URL - Custom base URL for the OpenAI API (optional, for proxies)
- DEFAULT_MODEL - Default model if not specified (default: gpt-5)
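A small sketch of how these variables could be read at startup using only the standard library; the Config struct and defaults shown are illustrative:

```rust
// Illustrative startup configuration based on the variables documented above.
use std::env;

struct Config {
    openai_key: Option<String>,
    anthropic_key: Option<String>,
    openai_base_url: String,
    default_model: String,
}

fn load_config() -> Config {
    Config {
        openai_key: env::var("OPENAI_API_KEY").ok(),
        anthropic_key: env::var("ANTHROPIC_API_KEY").ok(),
        // Optional override, e.g. when routing through a proxy.
        openai_base_url: env::var("OPENAI_BASE_URL")
            .unwrap_or_else(|_| "https://api.openai.com/v1".to_string()),
        default_model: env::var("DEFAULT_MODEL").unwrap_or_else(|_| "gpt-5".to_string()),
    }
}

fn main() {
    let cfg = load_config();
    println!(
        "model: {}, base URL: {}, OpenAI key set: {}, Anthropic key set: {}",
        cfg.default_model,
        cfg.openai_base_url,
        cfg.openai_key.is_some(),
        cfg.anthropic_key.is_some()
    );
}
```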
Session Storage
Sessions are stored as JSON files in ~/.chat_cli_sessions/. Each session contains:
- Conversation history
- Current model
- Feature settings (web search, reasoning)
- Metadata (last updated time)
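Because each session is a plain JSON file, it can be inspected or edited by hand. Below is a minimal sketch of saving a session with serde and serde_json; the struct layout and field names are assumptions, not the actual on-disk schema:

```rust
// Hypothetical session shape; the real files in ~/.chat_cli_sessions/ may use a different schema.
use serde::{Deserialize, Serialize};
use std::path::Path;

#[derive(Serialize, Deserialize)]
struct Message {
    role: String, // "user" or "assistant"
    content: String,
}

#[derive(Serialize, Deserialize)]
struct Session {
    model: String,
    web_search: bool,
    reasoning: bool,
    last_updated: String,
    history: Vec<Message>,
}

fn save(session: &Session, path: &Path) -> Result<(), Box<dyn std::error::Error>> {
    std::fs::write(path, serde_json::to_string_pretty(session)?)?;
    Ok(())
}

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let session = Session {
        model: "gpt-5".into(),
        web_search: false,
        reasoning: false,
        last_updated: "1970-01-01T00:00:00Z".into(),
        history: vec![Message { role: "user".into(), content: "hello".into() }],
    };
    save(&session, Path::new("example-session.json"))
}
```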
Differences from Python Version
While functionally equivalent, this Rust version offers:
- Better performance - Faster startup and lower memory usage
- Enhanced safety - Rust's type system prevents many common errors
- Improved error handling - More detailed error messages and recovery
- Modern UI - Better terminal colors and interactive selection
- Cross-platform - Single binary that works across platforms
Development
Project Structure
src/
├── main.rs # Entry point and CLI argument parsing
├── cli.rs # Main CLI loop and command handling
├── core/
│ ├── mod.rs # Core module exports
│ ├── session.rs # Session management and persistence
│ ├── client.rs # API clients for OpenAI and Anthropic
│ └── provider.rs # Provider definitions and model lists
└── utils/
├── mod.rs # Utility module exports
├── display.rs # Terminal display and formatting
└── input.rs # Input handling and interactive prompts
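src/core/client.rs wraps the provider HTTP APIs. As a rough illustration of the kind of request it issues, here is a minimal OpenAI chat-completions call; it assumes the reqwest crate (with the blocking and json features) and serde_json, and is not the project's actual client code:

```rust
// Hypothetical OpenAI request; the project's real client may differ in structure and endpoints.
use serde_json::{json, Value};

fn chat_once(api_key: &str, model: &str, prompt: &str) -> Result<String, Box<dyn std::error::Error>> {
    let body = json!({
        "model": model,
        "messages": [{ "role": "user", "content": prompt }]
    });

    let response: Value = reqwest::blocking::Client::new()
        .post("https://api.openai.com/v1/chat/completions")
        .bearer_auth(api_key)
        .json(&body)
        .send()?
        .error_for_status()?
        .json()?;

    // Pull the assistant's reply out of the first choice.
    Ok(response["choices"][0]["message"]["content"]
        .as_str()
        .unwrap_or_default()
        .to_string())
}

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let key = std::env::var("OPENAI_API_KEY")?;
    println!("{}", chat_once(&key, "gpt-4o", "Say hello in one word.")?);
    Ok(())
}
```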
Building
# Development build
cargo build
# Release build (optimized)
cargo build --release
# Run tests
cargo test
# Run with debug output
RUST_LOG=debug cargo run
# Format code
cargo fmt
# Check for issues
cargo clippy
License
This project maintains the same license as the original Python version.