v1.0.0

Your intelligent terminal companion, making command-line operations intuitive and safe.


> Requirements

  • Python 3.8+ - Required for core functionality
  • 4GB+ RAM - Recommended for local models
  • Internet Connection - Required for initial setup and API mode
  • Local Models - Retain 100% privacy; all data stays on your machine
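The Python requirement above can be verified before installing. A minimal preflight check, assuming `python3` is on your PATH:

```shell
# Preflight check (assumes python3 is on PATH); fails on interpreters older than 3.8
python3 -c 'import sys; assert sys.version_info >= (3, 8), "Python 3.8+ required"' \
  && echo "Python version OK"
```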

> Installation

$ git clone https://github.com/dheerajcl/Terminal_assistant.git

Cloning repository...

$ cd Terminal_assistant && ./install.sh

Dependencies installed

> Supported Platforms

Local Processing

Via Ollama integration

Ollama

Cloud Providers

Multiple API options

Groq
OpenAI
Anthropic
Fireworks
OpenRouter
DeepSeek

Local Configuration

Ollama-based private processing

# Install Ollama for local LLMs

$ curl -fsSL https://ollama.com/install.sh | sh

# Get base model

$ ollama pull <model_name>

# Configure local mode

$ shellsage config --mode local

> Interactive Setup Wizard

$ shellsage setup

? Select operation mode:
▸ Local (Privacy-first, needs 4GB+ RAM)
  API (Faster but requires internet)

? Choose local model:
▸ llama3:8b-instruct-q4_1 (Recommended)
  mistral:7b-instruct-v0.3
  phi3:mini-128k-instruct

? API provider selection:
▸ Groq
  OpenAI
  Anthropic
  Fireworks
  DeepSeek

✅ Configuration updated!

> See Shell Sage in action

Error Diagnosis

$ rm -rf /important-folder

🔎 Potentially dangerous absolute path detected

🛠️ Suggested: rm -rf ./important-folder

Natural Language Command

$ shellsage ask "show system memory usage"

🔍 Translating request...

free -h && top -bn1 | grep 'Mem'
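For reference, `free -h` prints human-readable memory totals and `top -bn1` adds a one-shot process snapshot. On Linux, the same figures come straight from `/proc/meminfo`; this illustrative one-liner (not Shell Sage output) summarizes the two headline numbers:

```shell
# Illustrative, Linux-only: total and available memory from /proc/meminfo
awk '/^MemTotal|^MemAvailable/ {printf "%s %.1f GiB\n", $1, $2/1048576}' /proc/meminfo
```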

Complex Workflow

$ shellsage ask "backup all .js files"

📝 Creating backup workflow...

mkdir -p ./backups && find . -name '*.js' -exec cp {} ./backups \;
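Note that the suggested one-liner copies every `.js` file into a flat `./backups` directory, so same-named files from different folders overwrite each other. A structure-preserving variant, assuming GNU coreutils for `cp --parents`:

```shell
# Structure-preserving backup variant (assumes GNU coreutils for cp --parents)
mkdir -p ./backups
# -prune keeps ./backups itself out of the search, so backups aren't re-backed-up
find . -path ./backups -prune -o -name '*.js' -print0 \
  | xargs -0 -I{} cp --parents {} ./backups/
```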

Safe Execution

Interactive confirmation for destructive commands with safety checks
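The idea behind such a confirmation gate can be sketched in a few lines of shell. This is an illustration only, not Shell Sage's actual implementation:

```shell
# Sketch of an interactive confirmation gate (illustrative; not Shell Sage's source)
confirm() {
  printf '%s [y/N] ' "$1"
  read -r reply
  case "$reply" in
    [yY]*) return 0 ;;  # proceed only on an explicit yes
    *)     return 1 ;;  # anything else refuses by default
  esac
}

# Non-interactive demo: pipe an answer instead of typing one
printf 'y\n' | confirm "Run 'rm -rf ./important-folder'?" && echo "confirmed"
```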

AI Diagnosis

Context-aware error analysis and automatic fixes for common issues

Hybrid Modes

Seamless switching between local privacy and cloud performance

Multi-Provider

Support for Groq, OpenAI, Anthropic, Fireworks, OpenRouter, and DeepSeek

Model Management

Easy switching between llama3, mistral, phi3, and other models

Privacy First

Local processing option keeps all data on your machine

> FAQ