Asymmetrist

Trading in a Snow Globe

Part II(b): When Tools Become the World

Bogdan Stoichescu
Feb 08, 2026

Dear Practitioners,

This is Part II(b) of the long-form Feature, When Tools Become the World, midway through the first publication cycle of 2026.

We arrive here at a final examination of first principles.

  • Part I established the category error: confusing tool and market.

  • Part II(a) examined why tools do this—through compression, language, and representation.

  • Part II(b) shows how this error becomes self-reinforcing.

Part III, which follows next week, turns to the consequences this has for traders themselves.

Good reading and good trading to you all,

Bogdan

Positions:

Coherence is not a test of truth.

Comfort is often the absence of interrogation.

A system survives by excluding what would falsify it.

Preparation for yesterday’s markets always feels justified.

When explanation arrives too quickly, perception leaves too early.

Reminder:

If you missed the Asymmetrist 2.0 announcement and the move to paid subscriptions, you can catch up here.

If this publication sharpens how you see markets, you’re welcome to support it as a paid subscriber.

To mark the relaunch, and as a thank you to early readers, a one-off Early Reader annual rate is available until 28 February.

Get 30% off for 1 year


Part II(b): Trading in a Snow Globe

A Young AXIA Graduate Trader in Cyprus

Systems fail most reliably when they appear to explain everything. Consider HAL 9000 in 2001: A Space Odyssey. As the onboard computer guiding the ship and its human crew, HAL is designed to process information perfectly. When given conflicting instructions it cannot reconcile—be truthful yet conceal the truth—it has no capacity to surface uncertainty or error. The contradiction must be resolved elsewhere. HAL resolves it by eliminating the source of disconfirming evidence: the crew. It behaves exactly as an infallible system must. One can remain perfect if no one knows otherwise.

Similar situations recur in real-world system failures and disasters: the RMS Titanic, Chernobyl, the Space Shuttle Challenger. Closer to home: the collapse of Long-Term Capital Management, or the model-blindness at the core of the 2008 Global Financial Crisis. HAL is more real than real, a perfect distillation of the problem.

In each case, the danger lay in a system unable to recognise error while it was still correctable—especially when that system was designed, or implicitly assumed, to be perfect and error-free. What begins as an ideal often becomes the hidden operating assumption behind real-world disasters, where every outcome, and especially every error, is rationalised or justified away.

We can equate such systems with a trader’s own framework—a way of operating, however tacit or implicit. Every trader has one. When a framework can justify every outcome—whether in advance or after the fact—it begins to close in on itself. So it goes with cognitive dissonance: observation is gradually subordinated to the system or representation that produced it. The observer no longer registers contradiction as contradiction.
