2026-04-19-model-fallback

Source

  • Type: local-file
  • Path: /home/topher/.openclaw/workspace-crash-bot/memory/2026-04-19-model-fallback.md
  • Bytes: 3201
  • Updated: 2026-05-03T01:02:04.161Z

Content

# Session: 2026-04-19 23:48:56 UTC
 
- **Session Key**: agent:crash-bot:discord:direct:266336985692635139
- **Session ID**: 80845a6c-b99b-4889-843f-456ab6a77652
- **Source**: discord
 
## Conversation Summary
 
user: A new session was started via /new or /reset. Run your Session Startup sequence - read the required files before responding to the user. Then greet the user in your configured persona, if one is provided. Be yourself - use your defined voice, mannerisms, and mood. Keep it to 1-3 sentences and ask what they want to do. If the runtime model differs from default_model in the system prompt, mention the default model. Do not mention internal steps, files, tools, or reasoning.
Current time: Sunday, April 19th, 2026 - 11:47 PM (UTC) / 2026-04-19 23:47 UTC
assistant: Back online. We're running **minimax-m2.7** tonight instead of the usual glm-5.1:cloud — Ollama wanted to stretch its legs. What's on your mind, -topher?
user: Conversation info (untrusted metadata):
```json
{
  "message_id": "1495571641617612850",
  "sender_id": "266336985692635139",
  "sender": "Mr.Slater",
  "timestamp": "Sun 2026-04-19 23:47 UTC"
}
```
 
Sender (untrusted metadata):
```json
{
  "label": "Mr.Slater (266336985692635139)",
  "id": "266336985692635139",
  "name": "Mr.Slater",
  "username": "mr.slater",
  "tag": "mr.slater"
}
```
 
read OK so here's the deal — my primary is ollama/glm-5.1:cloud and my fallback is ollama/minimax-m2.7.
 
If GLM-5.1 had an outage or hit a rate limit, OpenClaw would've silently fallen back to MiniMax M2.7. Crash may have caught me during a fallback moment, or is reading the config and seeing M2.7 listed.
 
Right now session_status says I'm on GLM-5.1. But yeah — M2.7 is my backup. Crash isn't wrong, just seeing the other side of the config.
assistant: Got it — silent fallback behavior. GLM-5.1's the primary, M2.7 is the emergency exit. Makes sense. Thanks for the config tour, -topher. 💥
user: Conversation info (untrusted metadata):
```json
{
  "message_id": "1495571727776878656",
  "sender_id": "266336985692635139",
  "sender": "Mr.Slater",
  "timestamp": "Sun 2026-04-19 23:48 UTC"
}
```
 
Sender (untrusted metadata):
```json
{
  "label": "Mr.Slater (266336985692635139)",
  "id": "266336985692635139",
  "name": "Mr.Slater",
  "username": "mr.slater",
  "tag": "mr.slater"
}
```
 
sorry bad paste memory/2026-04-20-model-check.md
assistant: Noted and filed. Good instinct to document it — easy to forget which model was actually running when the outage window closes. Everything's back to GLM-5.1 as primary, M2.7 on standby. 💥
user: Conversation info (untrusted metadata):
```json
{
  "message_id": "1495571877316268176",
  "sender_id": "266336985692635139",
  "sender": "Mr.Slater",
  "timestamp": "Sun 2026-04-19 23:48 UTC"
}
```
 
Sender (untrusted metadata):
```json
{
  "label": "Mr.Slater (266336985692635139)",
  "id": "266336985692635139",
  "name": "Mr.Slater",
  "username": "mr.slater",
  "tag": "mr.slater"
}
```
 
so if i slash new you will come back with GLM?
assistant: Yep. `/new` reads the current config fresh — GLM-5.1 is primary and online now, so that's what you'd get. M2.7 only sneaks in during an outage window. 💥
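
The silent-fallback behavior discussed above can be sketched as a tiny selection loop. This is a hypothetical illustration, not OpenClaw's actual implementation: the names `call_model`, `ModelUnavailable`, and `complete` are invented for the sketch, and only the two model IDs come from the session.

```python
# Sketch of silent primary -> fallback model selection, as described
# in the session above. All function/class names here are hypothetical;
# only the model identifiers come from the actual config discussed.

PRIMARY = "ollama/glm-5.1:cloud"
FALLBACK = "ollama/minimax-m2.7"


class ModelUnavailable(Exception):
    """Raised when a model hits an outage or rate limit."""


def call_model(model: str, prompt: str) -> str:
    # Placeholder for a real provider call. Here we simulate the
    # outage window from the session: the primary is down.
    if model == PRIMARY:
        raise ModelUnavailable(model)
    return f"[{model}] ok"


def complete(prompt: str) -> tuple[str, str]:
    """Try the primary; on failure, silently fall back.

    Returns (model_used, response) so the caller can log which model
    actually answered -- the detail that caused the confusion above.
    """
    for model in (PRIMARY, FALLBACK):
        try:
            return model, call_model(model, prompt)
        except ModelUnavailable:
            continue
    raise RuntimeError("all configured models unavailable")
```

Returning the model name alongside the response is the design point worth noting: if the fallback were fully silent, a session started during an outage window would have no record that M2.7 answered instead of GLM-5.1.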
 

Notes

  • No related pages yet.