I thought that most of the risks listed in David Robertson’s catalogue of woes, “What is Mankind’s greatest peril?” at The Moderate Voice were pretty far-fetched. Yes, I think that the death of the sun, the sun’s getting too hot or bright to sustain life on earth, and a life-obliterating asteroid strike are all inevitable, but they are of so low a probability within a human timeframe that they’re not worth worrying about. I think that other risks he identifies, e.g. running out of resources, are even less likely and are born of ignorance of history, economics, and technology.
However, it did make me start thinking. Are there some near-term higher probability risks about which we should be concerned? I immediately thought of at least two.
One risk is that presented by do-it-yourself biological weapons. This abstract from ScienceDirect should give you the general contours of the problem:
Biological weapons achieve their intended target effects through the infectivity of disease-causing infectious agents. The ability to use biological agents in warfare is prohibited by the Biological and Toxin Weapon Convention. Bioterrorism is defined as the deliberate release of viruses, bacteria or other agents used to cause illness or death in people, but also in animals or plants. It is aimed at creating casualties, terror, societal disruption, or economic loss, inspired by ideological, religious or political beliefs. The success of bioterroristic attempts is defined by the measure of societal disruption and panic, and not necessarily by the sheer number of casualties. Thus, making only a few individuals ill by the use of crude methods may be sufficient, as long as it creates the impact that is aimed for. The assessment of bioterrorism threats and motives have been described before. Biocrime implies the use of a biological agent to kill or make ill a single individual or small group of individuals, motivated by revenge or the desire for monetary gain by extortion, rather than by political, ideological, religious or other beliefs. The likelihood of a successful bioterrorist attack is not very large, given the technical difficulties and constraints. However, even if the number of casualties is likely to be limited, the impact of a bioterrorist attack can still be high. Measures aimed at enhancing diagnostic and therapeutic capabilities and capacities alongside training and education will improve the ability of society to combat ‘regular’ infectious diseases outbreaks, as well as mitigating the effects of bioterrorist attacks.
The technology to engage in bioterrorism or biocrime is already available and will only become smaller, cheaper, and easier to use. And there will always be people with grievances or other motives. This risk may be mitigated through the democratization of knowledge, decentralization, and debureaucratization. All three of those are deeply unpalatable to people who owe their livelihoods to the preservation of the status quo. Translation: we can’t rely on the CDC in Atlanta to protect us from this risk.
Another risk is authoritarianism, not just in faraway countries in Asia and Africa but in Europe or North America. Thirty years ago, when Francis Fukuyama wrote “The End of History”, that seemed pretty far-fetched, but it seems a lot less so now. It seemed then as though the tide of history were flowing towards liberal values, but now I’m not so sure. Should authoritarianism come to Europe (again) or North America, it will be clothed in liberal values and the good of the people. As H. L. Mencken put it:
The urge to save humanity is almost always only a false-face for the urge to rule it. Power is what all messiahs really seek: not the chance to serve.
I presume some readers will consider my failure to list climate change as a significant near-term risk a serious omission. I consider climate change a risk but not as near-term a risk as appeared to be the case a decade ago. Mitigating that risk would be easier if so many of those highly concerned about it weren’t Chicken Littles, Neo-Malthusians, hucksters, or profiteers, or if they were more interested in changing their own behavior than in changing the behavior of others.
Another risk that doesn’t seem as pressing as it did a couple of decades ago is overpopulation. Prosperity seems to be an adequate way of mitigating that particular risk.
Are there any other high or medium probability near-term risks to the human species about which we should be concerned?