RUST CLI + LLM REASONING + RAW MARKET DATA

eli

The terminal-first financial research agent that fetches raw data, writes Python to analyze it, and builds instincts through prediction. Runs locally. Uses any LLM.

$ cargo install eli-cli
OpenRouter · Anthropic · OpenAI · Ollama

eli ~ mistralai/devstral-2512:free

Analyze the correlation between AMD and INTC over the last 30 days. Identify the single biggest day of DIVERGENCE.

eli[1] → plan: MODE: READ | APPROVALS: AUTO
  checklist:
    - Fetch 30d/1h timeseries for AMD and INTC
    - Compute daily returns and identify divergence
    - Fetch news for the identified day

$ eli finance timeseries --tickers AMD,INTC --range 30d --granularity 1h

eli[2] → Writing Python to compute daily returns...

$ cat << 'EOF' > analyze_divergence.py
import json
from datetime import datetime

with open('amd_intc_30d.json') as f:
    data = json.load(f)

# Compute daily returns and find max divergence
for date in dates:
    amd_ret = (amd_close / amd_prev - 1) * 100
    intc_ret = (intc_close / intc_prev - 1) * 100
    divergence = abs(amd_ret - intc_ret)
EOF

$ python3 analyze_divergence.py
TypeError: 'str' object cannot be interpreted as an integer

eli[3] → Fixing timestamp parsing for ISO format...

$ python3 analyze_divergence.py
New max divergence: 9.72% on 2026-01-09 (AMD: -0.54%, INTC: +9.18%)

$ eli finance news --ticker INTC --date 2026-01-09

╭───────────────────────────────────────────────────────────────────╮
  INTC surged +9.18% while AMD was flat (-0.54%)
  Catalysts identified:
    • CEO meeting with President Trump
    • Launch of next-generation AI PCs at CES 2026
    • Numeric evidence: 9.72% divergence is 3x typical spread
╰───────────────────────────────────────────────────────────────────╯

status: ✓ done

How Eli Works

01

Rust CLI fetches raw data

No web scraping. No hallucinated numbers. OHLCV candles from Yahoo Finance and FRED, cached locally with SHA256 keys.

eli finance timeseries --tickers GLD,SLV --range 1y --granularity 1d
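To give a sense of what content-addressed caching looks like, here is a minimal sketch assuming the key is a SHA256 hash over the request parameters; the actual key derivation and cache layout in eli may differ.

import hashlib
import json
from pathlib import Path

def cache_key(tickers, range_, granularity):
    """Derive a deterministic cache key from the request parameters."""
    payload = json.dumps(
        {"tickers": sorted(tickers), "range": range_, "granularity": granularity},
        sort_keys=True,
    )
    return hashlib.sha256(payload.encode()).hexdigest()

# e.g. eli finance timeseries --tickers GLD,SLV --range 1y --granularity 1d
key = cache_key(["GLD", "SLV"], "1y", "1d")
print(Path("~/.cache/eli").expanduser() / f"{key}.json")  # hypothetical cache path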
02

LLM writes Python to compute

Need correlation? RSI? Daily returns? Eli doesn't estimate. It generates a Python script, executes it, and reads the actual output.
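As an illustration, here is a script of the kind Eli might generate for the AMD/INTC question in the demo; the input filename and JSON layout are assumptions, not Eli's actual data format.

import json

# Assumed layout: {"AMD": [{"date": "...", "close": ...}, ...], "INTC": [...]}
with open("amd_intc_30d.json") as f:
    data = json.load(f)

def daily_returns(candles):
    closes = [c["close"] for c in candles]
    return {
        candles[i]["date"]: (closes[i] / closes[i - 1] - 1) * 100
        for i in range(1, len(closes))
    }

amd, intc = daily_returns(data["AMD"]), daily_returns(data["INTC"])
divergence = {d: abs(amd[d] - intc[d]) for d in amd if d in intc}
worst = max(divergence, key=divergence.get)
print(f"Max divergence: {divergence[worst]:.2f}% on {worst}")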

03

Self-corrects on errors

If the Python throws an exception, Eli reads the traceback, fixes the code, and retries. Autonomous debugging until the math works.
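The pattern is roughly the following loop, sketched here with a hypothetical ask_llm_to_fix helper standing in for the model call.

import subprocess
from pathlib import Path

MAX_ATTEMPTS = 3

def run_with_self_correction(script: Path, ask_llm_to_fix):
    """Run a generated script; on failure, hand the traceback back to the model and retry."""
    for attempt in range(MAX_ATTEMPTS):
        result = subprocess.run(["python3", str(script)], capture_output=True, text=True)
        if result.returncode == 0:
            return result.stdout
        # Feed the broken code plus the traceback to the LLM, write the fix, retry
        script.write_text(ask_llm_to_fix(script.read_text(), result.stderr))
    raise RuntimeError(f"still failing after {MAX_ATTEMPTS} attempts:\n{result.stderr}")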

04

Confirms with news catalysts

Numbers without narrative are noise. Eli fetches news for the exact date of the anomaly to confirm the "why" behind the data.

eli finance news --ticker NVDA --date 2026-01-09
05

Builds instincts through reflection

After each research session, Eli can record predictions and later compare them to outcomes. Over time, it builds institutional memory.

instincts/INTC_reflection.md
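A minimal sketch of that bookkeeping, assuming a simple per-ticker markdown log; the fields are illustrative, not Eli's actual reflection format.

from datetime import date
from pathlib import Path

def record_prediction(ticker, prediction, horizon_days, outcome=None):
    """Append a prediction (and, later, its outcome) to a per-ticker reflection file."""
    path = Path("instincts") / f"{ticker}_reflection.md"
    path.parent.mkdir(exist_ok=True)
    with path.open("a") as f:
        f.write(f"\n## {date.today().isoformat()} ({horizon_days}d horizon)\n")
        f.write(f"- Prediction: {prediction}\n")
        if outcome is not None:
            f.write(f"- Outcome: {outcome}\n")

record_prediction("INTC", "momentum continues if catalyst follow-through appears", 10)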

Built-In Finance Tools

timeseries    OHLCV candles with flexible range and granularity (1m to 5y)
snapshot      Point-in-time data: price, market cap, shares outstanding, EV
fundamentals  Quarterly income, balance sheet, and cash flow statements
filings       SEC filings (8-K, 10-K, 10-Q) with full text extraction
news          News articles for a specific ticker on a specific date
search        Find tickers by keyword (e.g., "semiconductors", "gold miners")

Why This Architecture

No Web Search = No Hallucination

Web search is disabled by design. Every claim must trace back to structured data from the finance tools. No "I found an article that says..." guessing.

Python Execution = Verified Math

When Eli says the correlation is 0.87, it's because it wrote and ran the numpy calculation. The code is in your working directory. Audit it.
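For example, the entire correlation claim reduces to a few auditable lines; the price series below are made up for illustration.

import numpy as np

# Hypothetical closing prices for two tickers over the same dates
a = np.array([101.2, 102.8, 101.9, 104.4, 105.1])
b = np.array([99.8, 100.9, 100.1, 102.6, 103.0])

# Correlate daily returns, not raw prices
ret_a = np.diff(a) / a[:-1]
ret_b = np.diff(b) / b[:-1]
print(round(float(np.corrcoef(ret_a, ret_b)[0, 1]), 2))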

Any LLM, Your Choice

Works with OpenRouter (free models like Devstral), Anthropic, OpenAI, or fully local with Ollama. Bring your own API key. Switch mid-session.

Zoom & Correlate Method

Start with 5-year monthly data for context, zoom to 10-day hourly for the event, add correlated assets, confirm with news. A systematic investigation protocol.
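Sketched as a sequence of tool calls, using only flags shown elsewhere on this page; the 5y/10d range values and the daily granularity for the context pass are assumptions.

import subprocess

def eli(*args):
    """Thin wrapper around the eli CLI (flags as shown in the examples above)."""
    return subprocess.run(["eli", "finance", *args], capture_output=True, text=True).stdout

# 1. Context: multi-year view of the asset and a correlated peer
context = eli("timeseries", "--tickers", "INTC,AMD", "--range", "5y", "--granularity", "1d")

# 2. Zoom: hourly candles around the event window (range value is illustrative)
event = eli("timeseries", "--tickers", "INTC,AMD", "--range", "10d", "--granularity", "1h")

# 3. Confirm: news for the exact anomaly date
news = eli("news", "--ticker", "INTC", "--date", "2026-01-09")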

Approval Modes

Run fully autonomously with /auto, or require human approval for each command with /plan. Read-only mode for pure research.

Research Artifacts

Every session saves a markdown report to eli_research/ with the query, methodology, data sources, and conclusions. Institutional memory built in.