Based On: The Logical Fallacy of Generalization from Fictional Evidence

Some fictional evidence is explicit – “you mean the future will be like Terminator?” But it can also be subtle. One can predict the future “in storytelling mode”, using the tropes and methods of fiction without referring to any specific fictional universe like the Matrix; Hugo de Garis is one obvious example. How can we tell when “predictions” are just sci-fi dressed up as nonfiction?

1. The author doesn’t use probability distributions.

Any interesting prediction is uncertain, to a greater or lesser extent. Even when we don’t have exact numbers, we still have degrees of confidence, ranging from “impossible under current physics” through to “extremely likely”. And many important predictions can be expressed as conditionals: we may have no idea how likely event B is on its own, but we might be able to say it’s almost certain to follow event A.
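Honest conditional knowledge can still be turned into an overall credence. A minimal sketch of the arithmetic, where events A and B and every number are invented placeholders, not real forecasts:

```python
# A forecast written as probabilities and conditionals, not flat assertions.
# All numbers below are made-up illustrative values.
p_a = 0.3               # P(A): credence that event A happens at all
p_b_given_a = 0.95      # P(B|A): B is almost certain to follow A
p_b_given_not_a = 0.05  # P(B|not A): B is unlikely otherwise

# The law of total probability yields an overall credence for B,
# even though our real knowledge was mostly conditional.
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)
print(round(p_b, 2))  # 0.95*0.3 + 0.05*0.7 = 0.32
```

The point is not the specific numbers but the shape: each claim carries its own degree of confidence, and the conditional structure is stated rather than smuggled in as narrative certainty.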

But stories aren’t like that. An author creates an “alternate world”, and any given fact (“Snape kills Dumbledore!”) is either true or false within that world. There’s no room for the shades of uncertainty one sees in technology forecasting, or for that matter in military planning; it would just leave readers confused. Hence, any author who presents “the future” as a single block of statements all treated as fact, rather than as a set of possibilities and conditionals of varying likelihood (see Bostrom’s Superintelligence for an excellent example of the latter), is probably in ‘storytelling mode’.

Sometimes, an author will realize this, and tack “but of course, this is uncertain” onto the end. The author can then deflect any questions about relative odds by mentioning this disclaimer, and immediately resume sounding very certain as soon as the questions are over. But, as Eliezer discusses in his original post, this biases the playing field. If X is very complex, asking “X: Yes Or No?” ignores the hundreds or thousands of questions about which parts of X are more likely relative to other parts. An honest analysis will have uncertainty woven through it, with each burdensome detail matched by a diminishment of certainty.
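The burdensome-detail point can be made numerically: a detailed scenario is a conjunction of claims, so its probability can be no higher than the product of each claim’s probability given the ones before it. A toy sketch with arbitrary invented numbers:

```python
# A detailed scenario is a conjunction: P(d1 and d2 and ... and dn)
# equals P(d1) * P(d2|d1) * ..., so each extra detail can only shrink it.
# The probabilities below are arbitrary illustrative values.
details = [0.8, 0.7, 0.9, 0.6, 0.8]  # P(each detail | the previous ones)

scenario_prob = 1.0
for p in details:
    scenario_prob *= p

print(round(scenario_prob, 3))  # 0.8*0.7*0.9*0.6*0.8 = 0.242
```

Five individually plausible details already drop the whole scenario below one in four; a richly specified future, asserted as a single block, should be far less probable still.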

2. The author doesn’t change their mind.

In fiction, inconsistency is bad. Every part of the “alternate world” must match every other part. Therefore, an author writing a sequel must carefully track the original, lest she introduce “plot holes”.

However, a realistic prediction must be continually updated in response to new information. From time to time, other people will give you ideas you hadn’t thought of yourself. And even if you were a supergenius who needed no advice, there is no way a single human mind can hold all the information which might help one make a prediction. For a general, for example, there are always new things to learn about the enemy’s forces.

Hence, if almost nothing has changed between someone’s old predictions and new predictions, they are probably being a ‘storyteller’. This is especially true for anyone predicting the next century, as the events of 2012 give us much more incremental evidence about 2032 than about 200 billion AD.

3. The author creates and describes ‘characters’.

Characters are central to storytelling. Almost all sci-fi stories, at least in part, use characters who the reader can empathize with – the passion of a lover, the struggle of a worker, the fury of a warrior. The reasons for this lie deep in human evolution and psychology.

However, when making predictions, ‘characters’ are of very little relevance. When describing the futures of billions, one must aggregate to make the problem remotely tractable; “military strength” rather than soldiers, “economic conditions” rather than rich and poor, “transportation demand” rather than kids going on vacation. And of course, while certain individuals can have great influence over society, except for the very near term we can have no clue who they will be.

Therefore, when an author describes in great detail the lives of individual people – their emotions, their personalities, their hardships, their relationships and wants and needs – we should get suspicious. This can be great fun, but it isn’t always good for you, like riding a motorcycle at 200 MPH.

4. The author focuses exclusively on a single dynamic or trend.

The world is very big, and many important trends all happen simultaneously. Predicting how they interact is extremely difficult, like solving a many-variable differential equation. By contrast, it’s easier for a story to focus on an overarching ‘principle’ or ‘theme’, which drives the main events and actions of the key characters. This theme can be very specific (“revenge”), but it can also be a complex of different memes, like Lewis and Tolkien’s literary explorations of Christianity.

An author in “storytelling” mode may observe trend X, and from X make predictions A, B, and C; and these predictions might be quite reasonable. However, it’s still fallacious not to account for the other things (at least the big ones) influencing A, B, and C. Y might cancel out X’s effect on A; Z might reverse X, and so cause B’s opposite; and Q might have the same effect on C, but a hundred times as strong, so that X’s contribution is negligible.
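The interaction point is just arithmetic: even if trend X’s effect is estimated correctly in isolation, the net outcome depends on every other contribution. A toy sketch, with the trend names and all numbers purely invented:

```python
# Net effect on an outcome is the sum of contributions from all trends,
# not just the one the forecaster noticed. Values are invented.
effect_on_a = {
    "X": +2.0,  # the trend the author focuses on
    "Y": -2.0,  # cancels X's effect on outcome A entirely
}
effect_on_c = {
    "X": +1.0,
    "Q": +100.0,  # same direction as X, but far stronger,
}                 # so X's contribution to C is negligible

net_a = sum(effect_on_a.values())
net_c = sum(effect_on_c.values())
print(net_a)                                  # 0.0: X's effect on A is cancelled
print(round(effect_on_c["X"] / net_c, 4))     # 0.0099: X explains ~1% of C
```

A forecast built on X alone would be exactly right about X’s contribution and badly wrong about both outcomes.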

Be extra suspicious if the chosen dynamic is one the author happens to be an expert in, and they don’t rely on experts in other fields to help fill in the blanks. Odds are, they’re missing something very important; in your own field you know when you’re lost, but in others there are many more unknown unknowns. And be extra extra suspicious if the chosen trend is a pet political cause (“Islam”, “taxes”, “global warming”, “government surveillance”, “inequality”… ). That subset is probably worth ignoring entirely.

5. The author predicts rapid change, but doesn’t discuss specific things changing.

Michael Vassar describes this as “everything should stay the same, including the derivatives”. In a story, whether it’s Star Wars or Game of Thrones, it’s usually good to fix a static “backdrop” of culture and technology and norms, as it’s less work for the audience to track unchanging scenery. But in real life, changing fundamental traits like military ability, economic ability, communication, intelligence or transportation has sweeping consequences through nearly every aspect of society. To name one example, the Chinese Empire was old as the hills, but the changes of the 20th century caused it to collapse, followed by a republic, a civil war, a military occupation by Japan, a brutal Stalinist regime, and finally the authoritarian capitalism of today. And needless to say, no historical example will capture the changes caused by going beyond normal biological humans.