
ask

A natural language CLI that converts queries to shell commands using a local LLM (Ollama).

$ ask find all files bigger than 100MB

  1. find / -size +100M -type f  # Search entire filesystem for files > 100MB
  2. find . -size +100M -type f  # Search current directory only

Select [1-2], default=1, [Q]uit: 2
>> find . -size +100M -type f
[E]xecute [C]opy [Q]uit (default=E): E

./downloads/archive.zip
./videos/demo.mp4
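The selection prompt above accepts an option number, a bare Enter for the default, or Q to quit. A minimal sketch of that parsing logic in Python (the function name and exact rules are illustrative assumptions, not ask's actual implementation):

```python
def parse_selection(raw: str, n_options: int, default: int = 1):
    """Parse input for a 'Select [1-N], default=1, [Q]uit' prompt.

    Returns a 1-based option index, or None when the user quits.
    """
    raw = raw.strip().lower()
    if raw == "":
        return default          # bare Enter picks the default option
    if raw == "q":
        return None             # quit without running anything
    if raw.isdigit() and 1 <= int(raw) <= n_options:
        return int(raw)
    raise ValueError(f"expected 1-{n_options} or Q, got {raw!r}")
```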

Installation

Step 1 — Install Ollama

brew install ollama

Step 2 — Pull the recommended model

ollama pull qwen2.5-coder:7b

This downloads ~4.7 GB. qwen2.5-coder:7b is the recommended model for shell-command generation (84.8% on HumanEval, and fast on a laptop).

Step 3 — Install ask

brew tap sdaas/tap
brew install ask

That's it. Ollama starts automatically when you run ask.


Usage

ask <natural language query> [options]

Examples

ask find all files bigger than 100MB
ask list running processes sorted by memory
ask show disk usage for each directory in /var
ask count lines in all python files recursively
ask compress the logs folder into a tarball
ask --dry-run delete all .tmp files older than 7 days
ask --verbose find duplicate files in this directory

Flags

Flag        Description
--dry-run   Show the command but do not execute it
--verbose   Print the LLM prompt, raw response, and timing
--help      Show usage information

Configuration

A config file is auto-created at ~/.ask/config.yaml on first run:

ollama_model: qwen2.5-coder:7b  # Ollama model to use
history_limit: 5                 # Number of past interactions sent as context
dry_run_default: false           # Set to true to always dry-run

Logs & Debugging

All interactions are logged to ~/.ask/ask.log:

2026-04-12 10:23:01 [INFO] Query: find big files
2026-04-12 10:23:02 [INFO] LLM returned 2 option(s)
2026-04-12 10:23:05 [INFO] Executed: find . -size +100M | exit=0 | stdout=42B

Interaction history (queries, commands, outputs) is stored in ~/.ask/ask.db (SQLite).

To inspect it:

sqlite3 ~/.ask/ask.db "SELECT user_query, executed_command, exit_code FROM interactions ORDER BY id DESC LIMIT 10;"
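The same query can also be run from Python with the stdlib sqlite3 module. A small sketch, assuming only the column names shown in the SELECT above (the rest of the schema is an assumption):

```python
import sqlite3

def last_interactions(db_path, limit=10):
    """Return the most recent (user_query, executed_command, exit_code) rows."""
    with sqlite3.connect(db_path) as conn:
        rows = conn.execute(
            "SELECT user_query, executed_command, exit_code "
            "FROM interactions ORDER BY id DESC LIMIT ?",
            (limit,),  # parameterized query; never interpolate user input
        ).fetchall()
    return rows
```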

How Memory Works

ask stores the last N interactions (history_limit, default 5) in SQLite and passes them as context to the LLM. This lets subsequent queries reference previous results:

ask find all .log files older than 30 days
# → find /var/log -name "*.log" -mtime +30

ask delete them
# → LLM sees previous query+output and generates: find /var/log -name "*.log" -mtime +30 -delete

Note: Full pronoun resolution ("them", "those") is best-effort in v1 — the LLM infers context from the truncated output passed in the prompt. Complex cases may require rephrasing.
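A sketch of how recent interactions might be folded into the LLM prompt. The exact template ask uses is not documented here, so this format (and the truncation length) is an assumption:

```python
def build_context(history, limit=5, max_output_chars=500):
    """Render recent interactions as context lines for the LLM prompt.

    history: list of (query, command, stdout) tuples, oldest first.
    Output is truncated so long command results don't blow up the prompt.
    """
    lines = []
    for query, command, stdout in history[-limit:]:
        lines.append(f"User asked: {query}")
        lines.append(f"Command run: {command}")
        lines.append(f"Output (truncated): {stdout[:max_output_chars]}")
    return "\n".join(lines)
```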


Limitations

  • Shell-affecting commands (cd, export, source) cannot change your current shell session when run via ask, since commands execute in a subprocess. ask will warn you and suggest running the command directly.
  • Model accuracy: LLM-generated commands are suggestions. Always review before executing, especially for destructive operations (rm, drop, etc.).
  • Ollama model size: qwen2.5-coder:7b requires ~4.7GB disk space and ~5GB RAM. For lower-spec machines, consider llama3.2:3b (1.9GB) — update ollama_model in ~/.ask/config.yaml.
  • Clipboard: [C]opy uses pbcopy (macOS only).
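The first limitation can be demonstrated directly: a cd inside a child process changes only that process's working directory, which dies with it, so the parent (your shell, or Python here) is untouched:

```python
import os
import subprocess

before = os.getcwd()
# The child shell changes directory, then exits; the change dies with it.
subprocess.run(["sh", "-c", "cd /tmp && pwd"], check=True, capture_output=True)
after = os.getcwd()
assert before == after  # parent process cwd is unchanged
```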

Development

git clone https://github.com/Sdaas/ask
cd ask
python3 -m venv venv && source venv/bin/activate
pip install -e .
pytest tests/ -v

Run directly after installing:

ask find all files bigger than 100MB

TODO / Future Enhancements

  • Cloud LLM backend (Claude): Add support for Anthropic's Claude API as an alternative to Ollama for users who prefer a cloud-hosted model or don't want to run Ollama locally. Would use claude-haiku-* for speed and cost. Requires ANTHROPIC_API_KEY and the anthropic Python package.
  • Smarter context resolution: Improve pronoun handling ("delete them", "move those") by parsing structured output from previous commands rather than passing raw truncated stdout to the LLM.
  • Linux clipboard support: [C]opy currently uses pbcopy (macOS only); add fallback to xclip/xsel for Linux.
  • Shell-state commands: Support cd, export, and source by emitting shell-eval-safe output that the wrapper script can eval, allowing these commands to affect the user's current session.
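For the Linux clipboard item, one possible fallback chain: probe for pbcopy, then xclip, then xsel, and pipe the text to whichever exists. The function names below are hypothetical, not part of ask today; the `which` parameter exists only to make the lookup injectable:

```python
import shutil
import subprocess

# Candidate clipboard commands, in preference order.
CLIPBOARD_TOOLS = [
    ["pbcopy"],                              # macOS
    ["xclip", "-selection", "clipboard"],    # Linux (X11)
    ["xsel", "--clipboard", "--input"],      # Linux (X11, alternative)
]

def find_clipboard_cmd(which=shutil.which):
    """Return the first available clipboard command, or None."""
    for cmd in CLIPBOARD_TOOLS:
        if which(cmd[0]):
            return cmd
    return None

def copy_to_clipboard(text, which=shutil.which):
    cmd = find_clipboard_cmd(which)
    if cmd is None:
        raise RuntimeError("no clipboard tool found (pbcopy/xclip/xsel)")
    subprocess.run(cmd, input=text.encode(), check=True)
```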
