There's a question I keep circling back to in this newsletter: when does a heuristic stop being useful? In Issues 1 through 6, I've mostly framed this as a technical problem. CIOH breaks when CoinJoin mixes inputs. Taint analysis breaks when funds pass through exchanges. Change detection breaks when Taproot removes address type signals. Each failure is a technical countermeasure defeating a technical method.
But there's another way heuristics stop being useful that has nothing to do with cryptography or protocol design. Sometimes the analysis is technically possible but economically pointless. The cost of performing it exceeds the value of the information it produces.
I've started calling this the economic privacy floor, and I think it explains more about real-world Bitcoin privacy than most of the cryptographic techniques that get the attention.
The cost of a trace
When I trace a transaction in TrailBit, each hop takes work. Not just computation — analyst time. At each step I have to apply CIOH, check for change outputs, assess whether a CoinJoin occurred, cross-reference against known entity databases, and make judgment calls about which path to follow. A careful single-hop analysis might take 10 minutes. A 20-hop trace through branching paths can take hours.
Now scale this up. A forensic investigation into a real theft or laundering operation might involve thousands of addresses, hundreds of branching transaction chains, and weeks of analyst time. Blockchain forensics firms charge for this. Compliance teams have budgets. Courts require documentation of methodology at every step.
The economics are straightforward: if someone steals 100 BTC, the investigation budget can justify extensive analysis. If someone moves 0.01 BTC through a moderately complex set of transactions, nobody is going to spend $50,000 in analyst time tracing it.
This isn't a privacy technique. Nobody designed it. But it functions as privacy for the vast majority of Bitcoin transactions that simply aren't worth investigating.
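The break-even logic above can be sketched as a back-of-envelope model. Every number here is an illustrative assumption — the hourly rate, the BTC price, the recovery fraction — not TrailBit data or real investigation figures:

```python
# Toy model: is a trace worth its analyst cost?
# All constants are illustrative assumptions, not measured values.

def trace_cost_usd(hops: int, minutes_per_hop: float = 10.0,
                   analyst_rate_usd_per_hour: float = 150.0) -> float:
    """Analyst cost of a single linear trace of `hops` steps."""
    return hops * minutes_per_hop / 60.0 * analyst_rate_usd_per_hour

def worth_tracing(value_btc: float, hops: int,
                  btc_price_usd: float = 60_000,
                  recovery_fraction: float = 0.5) -> bool:
    """True if expected recoverable value exceeds the trace cost."""
    return value_btc * btc_price_usd * recovery_fraction > trace_cost_usd(hops)

print(worth_tracing(100, hops=500))   # large theft: True
print(worth_tracing(0.01, hops=20))   # small transfer: False
```

Even a 500-hop investigation into a 100 BTC theft pencils out; a 20-hop trace of 0.01 BTC costs more than it could ever recover. The floor falls directly out of the arithmetic.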
Fragmentation as a privacy strategy
Here's where it gets interesting. You don't need CoinJoin, PayJoin, or any privacy-enhancing technology to raise the cost of tracing your transactions. You just need to fragment your funds.
Take 1 BTC and split it, over a series of transactions, into 50 small UTXOs held at 50 different addresses in your own wallet. From a technical standpoint, CIOH will cluster most of these addresses together if you ever consolidate them. Change detection can follow the splits. Taint propagation can track each fragment. None of the heuristics are broken.
But an analyst now has 50 paths to follow instead of one. Each path needs the same per-hop analysis. The transaction graph has expanded from a simple chain into a tree, and the number of terminal nodes grows with each additional split. The analysis is still possible in theory. In practice, the budget runs out before the analyst reaches the end of every branch.
This is privacy through economics, not cryptography. The heuristics still work at each individual step. What breaks is the economic case for applying them comprehensively.
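The budget arithmetic above can be made concrete with a toy tree model. The branching factor and per-hop time are assumptions for illustration, but the shape of the growth is the point:

```python
# Toy model: fragmentation turns a chain into a tree, and the cost
# of exhaustive analysis scales with the number of nodes to visit.
# Branching factor and minutes-per-hop are illustrative assumptions.

def paths_to_follow(splits_per_hop: int, depth: int) -> int:
    """Terminal branches after `depth` hops, each splitting `splits_per_hop` ways."""
    return splits_per_hop ** depth

def full_analysis_hours(splits_per_hop: int, depth: int,
                        minutes_per_hop: float = 10.0) -> float:
    """Analyst hours to visit every node of a complete transaction tree."""
    nodes = sum(splits_per_hop ** level for level in range(1, depth + 1))
    return nodes * minutes_per_hop / 60.0

print(paths_to_follow(1, 20))             # simple 20-hop chain: 1 path
print(paths_to_follow(3, 10))             # 3-way splits, 10 hops: 59049 paths
print(full_analysis_hours(3, 10))         # ~14,762 analyst hours
```

A plain 20-hop chain is an afternoon of work. Ten hops of 3-way splitting produces nearly sixty thousand terminal paths and, at ten minutes a hop, several analyst-years of exhaustive work. No budget survives that.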
Why this matters more than people think
The Bitcoin privacy conversation is dominated by technical solutions. CoinJoin. Atomic swaps. Lightning Network. Confidential transactions on Liquid. Each of these has genuine cryptographic properties that defeat specific heuristics. They're interesting, they're well-researched, and they get all the conference talks.
But most Bitcoin users will never use any of them.
CoinJoin requires specific wallet software, incurs extra fees, and creates transactions that — despite improvements like WabiSabi — can still be fingerprinted. Lightning requires channel management and liquidity planning. Atomic swaps are complex and rarely used outside of technical communities. Confidential transactions exist on a sidechain with limited adoption.
Meanwhile, every Bitcoin user who makes ordinary transactions is already creating a natural level of fragmentation. Every time you receive Bitcoin to a new address, every time your wallet creates change, every time you receive payments from multiple sources — you're expanding the graph that an analyst would need to traverse.
The question isn't whether your transactions are technically traceable. They almost always are. The question is whether anyone will spend the money to trace them. For most people, the answer is no — and that's where their actual privacy comes from.
The gap nobody quantifies
I haven't seen much discussion of economic privacy floors in the forensic literature. That might be because the concept cuts against the implicit assumption that blockchain transactions are broadly traceable. It's a harder story to sell when you acknowledge that most traces are resource-constrained.
Any tool that operates at scale has to make trade-offs between depth and speed. Automated risk scoring needs to process millions of addresses, which means applying heuristics broadly rather than investigating each path manually. The result is approximations built on approximations. That works well enough for obvious cases — direct transfers from sanctioned addresses, large flows through known darknet markets. It degrades for anything that requires nuanced multi-hop analysis.
The gap between what automated scoring reports and what a thorough manual investigation would find is itself an economic privacy floor. An automated system might say "this address has 3% exposure to high-risk sources." What it doesn't say is whether that 3% figure would survive scrutiny if an analyst actually traced every contributing path. In most cases, nobody checks.
Implications for forensic research
If I'm right about this, it has consequences for how we think about blockchain forensics.
First, it means that forensic accuracy is not a fixed property of a method. It varies with the resources available. A well-funded investigation by the FBI into a $100 million hack will produce more accurate traces than an automated compliance scan on a $500 deposit. Same blockchain, same heuristics, different outcomes. When we publish error rates for forensic methods — like the 63% figure from Gong et al. that I cited in Issue 1 — we should specify the resource context. Error rates at scale, using automated tools, are probably worse than error rates for targeted manual investigations.
Second, it means that privacy is a spectrum defined partly by economics rather than a binary defined by technology. A user who fragments their transactions across 100 UTXOs hasn't achieved cryptographic privacy. They've raised the cost of analysis to a level where most adversaries won't bother. Whether that's "enough" privacy depends entirely on who might want to trace them and how much those people are willing to spend.
Third, it raises uncomfortable questions about who gets investigated. If forensic analysis is resource-constrained, enforcement follows the money in a literal sense: large amounts get traced, small amounts don't. This creates a system where small-value money laundering is effectively undetectable not because it's technically hidden but because it's economically invisible. Whether that's acceptable is a policy question, not a technical one.
The connection to Issue 6
This is why I ended the change address detection piece by noting that when heuristics degrade, the cost of accurate tracing increases. Every time a heuristic becomes less reliable — whether through Taproot adoption, CoinJoin usage, or just the natural entropy of a growing transaction graph — the analyst has to work harder per hop. More time spent means higher cost. Higher cost means fewer traces completed. Fewer traces means more de facto privacy.
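There's a second way degradation raises cost: confidence compounds across hops. If each hop's attribution is only mostly reliable, a long trace is probably wrong somewhere, and verifying it pushes the cost up further. A sketch, with assumed per-hop accuracy figures (the independence of errors is itself a simplification):

```python
# How per-hop heuristic reliability compounds over a trace.
# The 0.95 and 0.90 accuracy figures are assumptions for illustration,
# and hops are treated as independent, which real traces are not.

def end_to_end_confidence(per_hop_accuracy: float, hops: int) -> float:
    """Probability that every hop in a linear trace was resolved correctly."""
    return per_hop_accuracy ** hops

print(round(end_to_end_confidence(0.95, 20), 3))  # 0.358
print(round(end_to_end_confidence(0.90, 20), 3))  # 0.122
```

Dropping per-hop accuracy from 95% to 90% — the kind of shift Taproot or CoinJoin adoption might cause — cuts the odds of a correct 20-hop trace from roughly one in three to roughly one in eight. Restoring confidence means more manual verification per hop, which is exactly the cost increase described above.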
The privacy community focuses on making heuristics fail technically. That's valuable work. But heuristics don't need to fail completely to create privacy. They just need to become expensive enough to apply that most analysts give up.
Where I might be wrong
I should flag the obvious counterargument: automation is getting cheaper. Machine learning models can traverse graphs faster than human analysts. Computing costs decrease over time. What's economically impractical to trace today might be trivial tomorrow.
This is a real concern. If forensic tools improve to the point where fully automated analysis is both fast and accurate, the economic privacy floor drops. The fragmentation strategy that works today might not work in five years.
But there's a countervailing force: the Bitcoin transaction graph is also growing. More transactions, more addresses, more branching paths. The graph's complexity increases the cost of analysis at the same time that automation decreases it. Which force wins is an empirical question that depends on the rate of improvement in forensic tools versus the rate of growth in blockchain complexity.
My guess — and it's only a guess — is that the economic privacy floor will persist for small-value transactions even as tools improve. The economics of attention are hard to escape. Even if analysis is cheap per hop, the number of possible paths grows combinatorially. At some scale of fragmentation, exhaustive analysis becomes impractical regardless of automation.
But I'm genuinely uncertain about this. If you work on forensic tooling and think I'm wrong, I'd like to hear why.
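To put a number on the combinatorial claim: even if automation drives the marginal cost per hop down by orders of magnitude, exponential path growth can still swamp it. The constants below — branching factor, depth, a cent per automated hop — are assumptions chosen only to show the shape of the problem:

```python
# Toy model: cheap automation vs. combinatorial path growth.
# All constants are illustrative assumptions.

def exhaustive_cost_usd(splits_per_hop: int, depth: int,
                        cost_per_hop_usd: float) -> float:
    """Cost to visit every node of a complete transaction tree."""
    nodes = sum(splits_per_hop ** level for level in range(1, depth + 1))
    return nodes * cost_per_hop_usd

# 4-way splits, 15 hops deep, fully automated at $0.01 per hop:
print(f"${exhaustive_cost_usd(4, 15, 0.01):,.0f}")  # $14,316,558
```

At a cent per hop — a thousandfold improvement over manual analysis — exhaustively traversing a 15-hop, 4-way tree still costs eight figures. Automation shifts the floor; fragmentation at sufficient depth keeps one in place.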
The uncomfortable truth
The real takeaway from six issues of examining Bitcoin heuristics is this: the system's privacy properties are determined more by economic incentives than by cryptographic guarantees or forensic capabilities. Most Bitcoin transactions are private in practice because nobody cares enough to trace them. A small number of transactions are traced because the amounts justify the cost. An even smaller number involve sophisticated privacy techniques that defeat the heuristics outright.
If you're building privacy tools, this should inform your threat model. You don't need to defeat every forensic technique. You need to raise the cost of analysis above the budget of your likely adversary.
If you're building forensic tools, this should inform your honesty about what those tools actually do. Automated risk scores are economic shortcuts, not ground truth. They work within their resource constraints. Outside those constraints, they produce numbers that look precise but aren't.
And if you're a user who just wants to transact without your financial life being mapped, the boring truth is that you probably already have more privacy than you think. Not because Bitcoin is private — it isn't. But because the economics of tracing your specific transactions don't make sense for most adversaries.
That's not a satisfying answer. Privacy shouldn't depend on being too small to notice. But it's the honest one.
Geo Nicolaidis
Builder, TrailBit.io
If you found this useful, subscribe to get the next issue in your inbox. Each issue breaks down a different heuristic used in Bitcoin forensics — what it assumes, where it breaks, and why it matters.