psb-gemma-psb-thinking-mistakes

Source

  • Type: local-file
  • Path: /home/topher/.openclaw/workspace-psb-gemma/memory/psb-thinking-mistakes.md
  • Bytes: 2675
  • Updated: 2026-05-03T01:24:13.663Z

Content

# Mistakes Log - Active Corrections
 
**Purpose:** Track specific mistakes, incorrect inferences, and wrong assumptions. 
**Rule:** Before any non-trivial task, search this file + corrections.md for relevant patterns.
 
---
 
## 2026-03-31: Inference Errors
 
### Mistake: Assuming "hidden" when ports are open
- **What happened:** Agent claimed Shodan can't scan the Pi, assumed "hidden behind NAT"
- **Reality:** Port 80 IS open and serves the dashboard; DuckDNS runs on the Pi via the 2890-claw agent
- **Correction:** Never assume "hidden" when user has explicitly opened ports. Check actual exposure.
- **Pattern to watch:** Security/network claims → verify actual port status first
- **Source:** corrections.md (2026-03-19)
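The correction above ("check actual exposure") can be sketched as a quick TCP reachability check. This is an illustrative sketch only; the host and port values are placeholders, not the actual Pi configuration:

```python
import socket

def port_is_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Verify before claiming "hidden" -- host/port are placeholder values.
if port_is_open("127.0.0.1", 80):
    print("port 80 reachable: do NOT claim the host is hidden")
else:
    print("port 80 not reachable from here (still not proof of hiddenness)")
```

Note the asymmetry the entry warns about: a successful connection disproves "hidden", but a failed one proves nothing about external exposure.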
 
---
 
## Template for New Entries
 
```markdown
### Mistake: [Short description]
- **What happened:** [What did the agent do/say?]
- **Reality:** [What was actually true?]
- **Correction:** [What should happen instead?]
- **Pattern to watch:** [When should future agents check for this?]
- **Source:** [corrections.md | memory file | session transcript]
```
 
---
 
## Enforcement Rules
 
1. **Before responding to technical questions:** Search this file for relevant mistake patterns
2. **Before making claims about:** Security, network, hardware, system state → Check if similar mistakes exist
3. **After user correction:** Write here FIRST, then respond
4. **WAL Protocol:** Write to this file BEFORE responding to a correction (not after)
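Rule 1 above can be sketched as a small search helper. The filenames follow this document; the implementation itself is an illustrative assumption, not the actual lookup mechanism:

```python
import re
from pathlib import Path

def find_mistake_patterns(query_terms,
                          paths=("psb-thinking-mistakes.md", "corrections.md")):
    """Return (path, line_no, line) tuples for lines matching any query term."""
    pattern = re.compile("|".join(re.escape(t) for t in query_terms), re.IGNORECASE)
    hits = []
    for p in paths:
        path = Path(p)
        if not path.exists():  # skip files that are absent on this machine
            continue
        for i, line in enumerate(path.read_text().splitlines(), start=1):
            if pattern.search(line):
                hits.append((str(path), i, line.strip()))
    return hits

# Before a networking task, e.g.:
# find_mistake_patterns(["port", "NAT", "hidden"])
```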
 
---
 
*Last updated: 2026-04-10*
 
---
 
## 2026-04-10: node-llama-cpp Load Failure
 
### Mistake: Dismissed user's valid concern, failed to diagnose
- **What happened:** User asked about node-llama-cpp at 01:39 UTC. I installed it globally (`npm install -g node-llama-cpp`) and the install succeeded, but a load test threw `ERR_REQUIRE_ASYNC_MODULE`. I did not report this failure back to the user and kept investigating other paths.
- **Reality:** node-llama-cpp 3.18.1 IS installed at /usr/local/lib/node_modules/node-llama-cpp but CANNOT be loaded under Node 24 because of an ESM module-graph issue (`ERR_REQUIRE_ASYNC_MODULE` means `require()` hit an ES module that needs asynchronous evaluation). This is why OpenClaw cannot use the "local" embedding provider (600s timeout). It falls back to Ollama (120s timeout), CPU embeddings time out, and memory breaks.
- **Correction:** When `npm install` succeeds but the module fails to load, that is a failure, not a success. Report it immediately. node-llama-cpp not loading means the local provider is unavailable, which means the memory pipeline fails on this machine.
- **Fix options:** (1) Reinstall OpenClaw to rebuild native module for Node 24, or (2) Wait for GPU to make Ollama embeddings fast enough
- **Pattern to watch:** Package installs but doesn't load → investigate WHY and report
- **Source:** 2026-04-10 session, user corrected at 01:39 UTC
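The "installs but doesn't load" pattern generalizes beyond Node. The actual failure above was in Node with node-llama-cpp; the sketch below is a Python analog of the same check, distinguishing "not installed", "installed but broken", and "loads fine", and surfacing WHY a load fails instead of swallowing it:

```python
import importlib
import importlib.util

def check_module(name: str) -> str:
    """Classify a module: absent, present-but-broken, or importable."""
    if importlib.util.find_spec(name) is None:  # locates without executing
        return "not installed"
    try:
        importlib.import_module(name)  # actually executes module code
        return "loads fine"
    except Exception as exc:
        # The part that must be reported, not hidden:
        return f"installed but fails to load: {type(exc).__name__}: {exc}"
```

The key point matches the correction above: presence on disk (`find_spec` / `npm install` success) and loadability (`import` / `require` success) are separate facts, and both belong in the report.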
 

Notes

  • No related pages yet.