Releases: DevMando/MandoCode
## v0.10.0 — File-read and cloud sign-in reliability
Three reliability fixes for the file-system plugin and the cloud sign-in flow, plus one cosmetic UI tweak.
### Fixed
- Nested same-name folder reads — `read_file_contents`, `write_file`, and `edit_file` no longer mangle paths when a project's folder layout repeats the project root's last segment (e.g. the `MyApp/MyApp/MyApp.csproj` pattern from `dotnet new`). `GetFullPath` now resolves the literal path first and only falls back to `StripRedundantRootPrefix` when nothing exists at the un-stripped target. Previously the heuristic always stripped a matching trailing segment, so legitimate nested files appeared as "File not found" with the suggested "elsewhere" path identical to the requested one.
- `ollama signin` not launching after fresh install — `RunOllamaSigninAsync`, `TryStartOllamaProcess`, and `AutoPullAsync` now resolve `ollama` via the new `ResolveOllamaExecutable()` helper, which falls back to canonical install paths (`%LOCALAPPDATA%\Programs\Ollama\ollama.exe`, `/opt/homebrew/bin/ollama`, `/usr/local/bin/ollama`, etc.) when bare `ollama` isn't on the running process's PATH. Fixes the silent "browser never opens" failure when MandoCode is launched in the same shell that just installed Ollama (installer PATH updates only apply to new processes).
- First-run wizard imperative output getting repainted — `InitializeConnectionAsync` now wraps `RunOnboardingFlowAsync` in `_setupActive = true/false`, suppressing `HomeView` for the entire wizard run instead of just the 401 auto-recovery path. Closes the latent VDOM redraw race that could clobber `ollama signin` URLs, error messages, or progress lines during first-run setup.
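The resolve-first, strip-later ordering can be sketched as follows. This is an illustrative Python sketch of the heuristic described above, not the actual C# `GetFullPath` code; the function name and layout are assumptions.

```python
import os

def resolve_path(requested: str, root: str) -> str:
    """Illustrative sketch: try the literal path first; only strip a
    redundant repeated root segment when nothing exists at the
    un-stripped target."""
    literal = os.path.normpath(os.path.join(root, requested))
    if os.path.exists(literal):
        return literal  # nested MyApp/MyApp/... files resolve correctly

    # Fallback: drop a leading segment that repeats the root's last segment,
    # e.g. "MyApp/Program.cs" requested against a root already ending in "MyApp".
    root_name = os.path.basename(root)
    parts = requested.replace("\\", "/").split("/")
    if parts and parts[0] == root_name:
        stripped = os.path.normpath(os.path.join(root, *parts[1:]))
        if os.path.exists(stripped):
            return stripped
    return literal  # let the caller report "File not found" on the literal path
```

The old behavior corresponds to always taking the strip branch, which is what made legitimate nested files unreachable.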
### Added
- Auto-open browser on `ollama signin` — `RunOllamaSigninAsync` now captures stdout/stderr, scans for the first `https://` URL Ollama prints, and calls `OpenInBrowser` directly, so the sign-in page opens reliably regardless of Ollama's own browser-launch behavior. The URL is also re-emitted via `AnsiConsole.MarkupLine` as a clickable link with an "If your browser didn't open, copy the URL above" follow-up — it survives any subsequent VDOM redraw and gives users a copy/paste fallback.
### UI
- Green selection highlight in setup wizard — `OnboardingFlow.Select` now applies `HighlightStyle(new Style(foreground: Color.Green))` to every `SelectionPrompt`, replacing Spectre's default blue. Affects the model picker, "What would you like to do?", the "Sign me in now" branch, "Pick a starter model to install", and every other arrow-key choice in the wizard.
Install / update: `dotnet tool update -g MandoCode`
Full changelog: v0.9.9...v0.10.0
## 🎯 v0.9.9 — Onboarding overhaul
First-run is now a guided wizard instead of a static "install Ollama, run serve, pull a model, then come back" panel. The path from `dotnet tool install -g MandoCode` to first chat is auto-detected, auto-recovered, or auto-installed at every step where that's possible. New users hit zero dead-ends; existing users get a faster `/config`, a new `/model` quick-switch, and a `--doctor` preflight.
### Highlights
- 🪄 Guided first-run wizard — auto-fires on first run and via `/setup`. Walks you through Ollama install → daemon start → cloud sign-in → model pick → context size, all from inside the terminal.
- 📦 Auto-install Ollama — runs `winget install Ollama.Ollama` (Windows), `brew install ollama` (macOS), or the `curl install.sh` script (Linux) for you. Inherits stdio so you see the installer's progress and can answer prompts. Falls back to opening ollama.com/download if anything fails.
- 🔐 Auto-launch `ollama signin` — the wizard spawns the CLI command, the browser opens, and the local token is written automatically. Fixes the "I signed in on the website but chat still 401s" trap.
- 🚀 Auto-launch `ollama serve` — when the daemon's down, the wizard offers to start it with a live-progress spinner.
- 🛟 401 auto-recovery on chat — if a chat hits 401 mid-session, the cloud sign-in walkthrough fires immediately. No need to type `/setup`.
- 🩹 Trailing-slash heal — `http://localhost:11434/` is silently corrected.
- 🎯 Wrong-port auto-fallback — when `ollama serve` succeeds but your configured URL doesn't reach the daemon, the wizard auto-probes `http://localhost:11434` and switches if it's reachable. No retyping required.
- 🪟 PATH-refresh detection on Windows — checks canonical install paths (`%LOCALAPPDATA%\Programs\Ollama\ollama.exe`, etc.) when `where ollama` fails, so post-install detection works without relaunching `mandocode`.
- 🧠 Hardware-aware local model picker — `qwen3.5:0.8b` (CPU-only) → `2b` → `4b` → `9b` (8+ GB VRAM) with size and hardware expectations spelled out. Auto-pulls your selection.
- ☁️ Cloud upsell after local pull — an informational tip pointing to `minimax-m2.7:cloud` as the more capable alternative (free with `ollama signin`).
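The trailing-slash heal and wrong-port fallback amount to a little URL hygiene plus a probe. A Python sketch of the idea (`/api/tags` is Ollama's real model-list endpoint; the helper names here are invented for illustration):

```python
import urllib.request

DEFAULT_URL = "http://localhost:11434"

def heal_base_url(url: str) -> str:
    # Trailing-slash heal: avoid "http://host:11434//api/tags"-style URLs
    # when endpoint paths are appended to the configured base.
    return url.rstrip("/")

def reachable(base: str, timeout: float = 2.0) -> bool:
    try:
        with urllib.request.urlopen(heal_base_url(base) + "/api/tags", timeout=timeout):
            return True
    except OSError:
        return False

def pick_base_url(configured: str) -> str:
    # Wrong-port auto-fallback: if the configured URL can't reach the
    # daemon but the default port can, switch silently.
    if reachable(configured):
        return heal_base_url(configured)
    if reachable(DEFAULT_URL):
        return DEFAULT_URL
    return heal_base_url(configured)
```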
### New commands

| Command | Description |
|---|---|
| `/setup` | Guided wizard — reconnect to Ollama, install/sign in, or pick a different model |
| `/model` | Quick switch — pick a different model + context size |
| `mandocode --doctor` | Preflight check from the CLI: .NET runtime, Ollama status, models pulled, sign-in state. Exits 0 if green, 1 if anything's off. |
### UI improvements
- k-notation max-tokens picker (`32k`, `128k`, `200k`) with the current value highlighted at the top via a `← current` marker
- Tiered context-size guide with educational intros on temperature and max-tokens (what tokens are, their relationship to the model context window, per-tier usage examples)
- VDOM-aware text input for URL entry in `/setup` and `/config` — RazorConsole's `TextInput` instead of Spectre's `TextPrompt`, fixing dropped keystrokes alongside the live render loop. Pre-filled with the current URL ("press Enter to keep current")
- Combined cloud/local model picker with `(cloud)`/`(local)` badges. Replaces the old two-step "Cloud or Local? → list" flow.
- Inline color tags (`<red>`, `<green>`, `<yellow>`, `<cyan>`) in the markdown renderer for chat error responses. 401 errors now show `Error:` in red and the recommended action in green; backticked `ollama signin` keeps its purple inline-code styling.
### Changed
- Default cloud model: `minimax-m2.5:cloud` → `minimax-m2.7:cloud` (config defaults, README, `/learn`, system prompt, error messages)
- Default `MaxTokens`: 4k → 32k for new installs (existing configs preserved)
- `MaxMaxTokens`: 256k → 200k after testing showed many cloud models hit a practical limit closer to 200k once system prompts and tool definitions are accounted for. Existing configs with 256k saved get clamped on next load.
- `/help` table reorganized with `/setup` and `/model` next to `/config` and disambiguated descriptions
- 16+ failure messages across the app now consistently surface `/setup` and `/retry` as recovery paths
### Fixed
- Status-aware `/api/tags` fetch with auto-retry — transient daemon hiccups no longer misroute users with pulled models into the "no models yet" flow
- Post-pick cloud auth check via `/api/generate` — pulled cloud models that show in `/api/tags` after sign-out no longer cause the wizard to falsely report "Setup complete" before chat fails with 401
- Picked-model validation via `/api/show` before the wizard declares success — surfaces models listed in `/api/tags` but not actually loadable
- Misleading "Couldn't start Ollama" error when `Process.Start` succeeded but the URL didn't reach the daemon (port mismatch); now distinguishes "process didn't launch" from "process launched but URL unreachable"
- `<Markup>` Razor component bracket markup — failure-state lines that used `[yellow]…[/]` now render in actual color via `Foreground` props
- `LearnContent.Display()` double-render on every `StateHasChanged`
### Breaking changes
- Configs with `MaxTokens > 200k` (i.e. 256k saved) get clamped to 200k on next load. The picker shows 32k highlighted next time `/config` or `/model` opens — re-pick if you want.
- Configs without an explicit `maxTokens` key now default to 32k instead of 4k. Configs with `"maxTokens": 4096` are unaffected.
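The load-time behavior reduces to a default plus a clamp. A Python sketch (the `maxTokens` key matches the config examples above, but the numeric constants `32_000`/`200_000` are stand-ins for whatever "32k"/"200k" resolve to internally):

```python
DEFAULT_MAX_TOKENS = 32_000  # stand-in for "32k"; previously "4k"
MAX_MAX_TOKENS = 200_000     # stand-in for "200k"; previously "256k"

def load_max_tokens(config: dict) -> int:
    # Missing key: new 32k default. Oversized saved value (e.g. old 256k
    # configs): clamped on load. Explicit in-range values pass through.
    value = config.get("maxTokens", DEFAULT_MAX_TOKENS)
    return min(value, MAX_MAX_TOKENS)
```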
### Install

```
dotnet tool install -g MandoCode   # new install
dotnet tool update -g MandoCode    # upgrade from 0.9.8
mandocode                          # launch — wizard auto-fires on first run
mandocode --doctor                 # troubleshoot if anything's off
```

## v0.9.8
### Highlights
- 🧩 Skills support — auto-invoked by the LLM, with a force-skill command (#43)
- 🔌 MCP client support — interactive setup wizard (#44)
- 📋 Paste handling + batch-deny on tool denial (#45)
- 🖥️ Markdown rendering rebuilt on RazorConsole's translator pipeline (#47)
- ⏱️ Streamed `execute_command` output with idle-based timeout (#48)
### Install / Update
```
dotnet tool update -g MandoCode
```
### What's Changed
- v0.9.7 — Task planner redesign… (#42)
- Skills support (#43)
- MCP client support (#44)
- Paste handling + batch-deny (#45)
- README logo update (#46)
- Markdown rendering on RazorConsole (#47)
- Streamed execute_command output (#48)
- CHANGELOG for v0.9.8 (#49)
Full Changelog: v0.9.6...v0.9.8
## v0.9.6
Changelog
All notable changes to MandoCode will be documented in this file.
### [0.9.6] - 2026-04-06
### Added
- Unit test project with 65 tests covering `InputStateMachine` and `DiffService`
  - Text input, cursor movement, backspace/delete, submit
  - Command autocomplete filtering, dropdown navigation, accept/dismiss
  - Command history navigation with saved input restoration
  - Paste handling with newline-to-space conversion
  - Static helper tests (`IsCommand`, `GetCommandName`) using parameterized `[Theory]` tests
  - Diff computation for new files, modifications, deletions, identical content
  - Line ending normalization (Windows `\r\n`, old Mac `\r`)
  - Large file fallback with sampled diffs
  - Context collapse with configurable context lines
- Empty response recovery — when the model returns blank (context overflow), MandoCode now shows a helpful message instead of freezing with a blank screen
- Rendering timeout guard — markdown rendering runs with a 30-second timeout and a "Rendering..." spinner after 1 second, preventing permanent freezes on large responses. Falls back to raw text if rendering exceeds the limit.
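The timeout-guard pattern generalizes beyond markdown. A minimal Python sketch of "render with a deadline, fall back to raw text" (thread-based, which parallels rather than reproduces the C# implementation):

```python
from concurrent.futures import ThreadPoolExecutor, TimeoutError as RenderTimeout

def render_with_timeout(render, raw_text: str, timeout_s: float = 30.0) -> str:
    """Run `render` on a worker thread; if it overruns, show raw text instead."""
    pool = ThreadPoolExecutor(max_workers=1)
    future = pool.submit(render, raw_text)
    try:
        return future.result(timeout=timeout_s)
    except RenderTimeout:
        return raw_text  # fall back to unrendered text rather than freezing
    finally:
        pool.shutdown(wait=False)  # don't block waiting on the stuck renderer
```

Note the `shutdown(wait=False)`: using the executor as a `with` block would join the stuck worker and reintroduce the freeze.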
### Fixed
- Timeout retry deadlock — when a request timed out (5-minute limit), `RetryPolicy` was treating the timeout as a transient error and retrying the entire request, including all file reads. With 2 retries, this could silently hang for up to 15 minutes. Now the cancellation token is properly passed to `RetryPolicy` so timeouts fail fast.
- Silent exception suppression — replaced 6 bare `catch { }` blocks with `catch (Exception ex)` logging to `Debug.WriteLine` across `TerminalThemeService`, `App.razor`, `FileSystemPlugin`, `FunctionInvocationFilter`, and `ShellCommandHandler`
- Shell command injection surface — `ShellCommandHandler` and `FileSystemPlugin` now use `ProcessStartInfo.ArgumentList` for proper argument escaping instead of manual string concatenation with `cmd.Replace("\"", "\\\"")`
- Inconsistent config validation — `Program.cs` accepted any `maxTokens > 0` while `ValidateAndClamp()` enforced `[256, 131072]`. All validation now uses centralized constants and helpers on `MandoCodeConfig`
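The injection fix is the classic "argument vector, not command string" change. A Python analogy using `subprocess` (the real code is C# `ProcessStartInfo.ArgumentList`; `run_grep` is an invented example, not a MandoCode function):

```python
import subprocess

def run_grep(pattern: str, path: str) -> str:
    # List form: the OS receives the arguments pre-separated, so a pattern
    # containing quotes, spaces, or shell metacharacters needs no escaping.
    # The old hand-rolled equivalent spliced the pattern into one shell
    # string with quote replacement, which is where injection bugs live.
    result = subprocess.run(
        ["grep", "-n", pattern, path],
        capture_output=True, text=True,
    )
    return result.stdout
```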
### Changed
- Config validation constants (`MinTemperature`, `MaxTemperature`, `MinMaxTokens`, `MaxMaxTokens`) and helpers (`IsValidTemperature`, `IsValidMaxTokens`) are now defined once on `MandoCodeConfig` and referenced by `Program.cs`, `ConfigurationWizard.cs`, and `ValidateAndClamp()`
## MandoCode v0.9.5 – Unified Input, Smarter Onboarding, Slicker UX
### Highlights
- Unified input state machine – both the VDOM prompt and the imperative console loop now share the same `InputStateMachine`, so autocomplete, history, and paste behave identically everywhere.
- Seamless autocomplete – `/` and `@` dropdowns no longer glitch; selecting a file or command instantly snaps back to the VDOM prompt with your text intact.
- Robust paste handling – Windows' "large paste" confirmation no longer injects stray Enter keystrokes; pasted text is normalized and inserted safely.
### VDOM & input pipeline
- Async PromptInput bridge – `PromptInput` now awaits the autocomplete hand-off, drains buffered keystrokes, and feeds them back into the shared state machine, eliminating the "typing on an empty console" flash.
- `@` browsing polish – accepting directories/files keeps the dropdown state and immediately repopulates the VDOM prompt, so you can chain references without retyping.
### Smarter onboarding
- Connection lifecycle – a clear "🔄 Connecting…" → "✅ MandoCode is ready!" flow with `/retry` available at any time.
- Offline guidance – a yellow panel explains exactly how to start Ollama (install, `ollama serve`, pull a model) and keeps `/config` + `/learn` accessible.
- `/retry` command – rechecks the endpoint, re-validates the model, and updates the UI without restarting the CLI.
### UX & personality
- Modern banner – a clean gradient "MANDOCODE" figlet replaces the wolf animation for a sharper first impression.
- Help tweaks – `/exit` is the only exit command; `/retry` and an `@file` tip are surfaced in the help table.
- Spinner upgrades – loading messages pull from a new hip-hop / 90s tech / Matrix / CS 1.6 list, and every Spectre spinner style auto-loads so animations never break.
### Token & output polish
- Session totals still print above the prompt for quick reference.
- Per-response summaries remain right-aligned (`[~in, out: tok/s]`) without cluttering the scrollback.
### Docs
- Added `VDOMArchitecture.md`, `RazorConsoleComponents.md`, `AutocompleteSkill.md`, and `TaskPlanner.md`, explaining the new state machine + VDOM bridge and how the CLI renders messages.
Update with `dotnet tool update -g MandoCode` and enjoy the smoother experience!

Thanks to everyone who tried v0.9.2; it had quite a few major breaking bugs, and I hope this version serves you better. THANK YOU!
## v0.9.2

### What's New
- Arrow key navigation — move the cursor left/right within your input with arrow keys, Home/End to jump to start/end, Delete key support
- Fixed paste detection — fast typing no longer falsely triggers paste mode
- Fixed cursor after paste — pasting long text no longer freezes input or breaks Enter submission
- Pasted text shown inline — no more `[pasted N characters]` summary; the actual text is displayed
### Install / Update

```
dotnet tool update -g MandoCode
```