
# Walking Back The Cat

14 Aug 2021

Suppose you are a spy. A foreign agent comes forward with some enticing intelligence and offers to defect. They're risking a lot: their job, their family and potentially their life. By contrast, what do they gain? The chance to expose some information about their country? Some vague sense of doing the right thing? The cost-benefit calculus is firmly weighted against defection.

So why would someone lie and make something up? It seems obvious that the people who come forward are more likely than not telling the truth, i.e. $P(\text{genuine defector} \; \vert \; \text{defector}) > 0.5$. But a heuristic of assuming defectors are telling the truth and using their intelligence without any vetting is a bad one. More specifically, it is not robust to the Lucas critique.

What does that mean? It means that the low reward and high risk of defecting are contingent on circumstances, i.e. on the fact that spies don't really believe defectors. If we lived in a world where the intelligence community followed this heuristic, that would no longer be the case. The cost of switching sides would be lower, since there would be less rigorous vetting, and the benefit would be greater, since one's intelligence would be assumed to be credible.

But that would result in lots of fake defectors who provide misinformation, knowing they won’t be hassled and their information will be assumed true. Consequently, it would now be the case that $P(\text{genuine defector} \; \vert \; \text{defector}) < 0.5$. This is why, in reality, such a heuristic is not compatible with the incentives of everyone who is involved, and crucially, it is not possible to exploit this historical relationship because people will react accordingly.
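The arithmetic behind this flip is just a base rate. A minimal sketch, with entirely hypothetical counts chosen for illustration, of how the proportion of genuine defectors depends on the vetting regime:

```python
# Illustrative only: hypothetical counts showing how
# P(genuine defector | defector) shifts with the vetting regime.

def p_genuine(n_genuine: float, n_fake: float) -> float:
    """Share of defectors who are genuine, given expected counts of each."""
    return n_genuine / (n_genuine + n_fake)

# Rigorous vetting: faking a defection is costly, so few fakes show up.
strict = p_genuine(n_genuine=10, n_fake=2)      # 10/12 ≈ 0.83 > 0.5

# Credulous regime: fakes face little cost and high payoff, so many appear.
credulous = p_genuine(n_genuine=10, n_fake=40)  # 10/50 = 0.20 < 0.5
```

The genuine defectors are the same in both regimes; only the inflow of fakes responds to the incentives, and that alone is enough to push the conditional probability below one half.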

Now what if you follow this rule up until the point where it breaks down and then adjust? That sounds great in theory: you get a simple rule that works, and when it stops working, you switch rules. The problem is that if you instill a culture in which defectors are not expected to be properly vetted, it becomes unsurprisingly difficult to maintain an accurate sense of what proportion of defectors are genuine. Those who do try to vet are ostracised as paranoid obstacles keeping valuable intelligence from being used. So it is almost guaranteed that by the time the breakdown becomes noticeable, it's far too late.

If you have not been convinced yet, feel free to replace the appropriate words with some combination of "inflation", "unemployment", "maximum employment", "inflation expectations" and "1970s". In any case, notice that this is why, in practice, intelligence agencies take a careful look at anyone who says they want to defect and walk back the cat, so to speak. And so, generally, it might be a good idea not to instill a culture of always believing a certain group, lest it eventually backfire.