Hopes and fears
To help us grasp the shape and scope of these challenges, the Millennium Project – an international think tank – releases an annual State of the Future report, which outlines the major hurdles facing humanity over the next 35 years. It illustrates our complicated relationship with science and technology. Just as the beginning of the industrial revolution influenced the underlying themes of Mary Shelley’s Frankenstein, we too are worried about the unforeseen complications that the latest developments could bring.
The report tells us of the great hopes that synthetic biology will let us write genetic code the way we write computer code; of the power of 3D printing to customise and construct smart houses; and of a future for artificial intelligence in which the human mind and the computer mind meet, rather than conflict.
But at the same time, the authors of the report – Jerome Glenn, Elizabeth Florescu and their team – express fears that there is a great chance we could be outstripped in pace by the evolution of scientific and technological development. The authors suggest that we seek out human-friendly control systems, since advances in these fields mean that lone individuals could make and deploy weapons of mass destruction.
There are two concerns here: one to do with agency, the other relating to structures. Individuals have the potential to use scientific and technological advances to cause harm. This is a growing problem, as science and technology continue to erode what Max Weber referred to as the state’s “monopoly on violence”.
To reduce the risks associated with agency, we will rely on structures that encourage good behaviour, such as systems for justice, education and the provision of basic necessities for life.
But it is not clear how we will arrive at such structures, or where the responsibility to develop them will fall: with regions, states or international organisations. This is especially pressing, as many states have either forgone a welfare system or are in the process of destroying it. It is also unclear where education and training come in, or how regulatory control is to work across so many local, national, societal and commercial boundaries.
An ethical approach?
Whether or not our global society is outstripped by science and technology largely depends on us. And this is part of the problem, as William Nordhaus warned us as early as 1982, in his work on the Global Commons. The report calls for an ethical approach to creating systems, forms of information, and models of control that would allow us to engage with science and technology as it develops.
This means embedding ethical considerations into the way we think about the future. The authors want a larger discussion of global ethics, such as the one rooted in the work of the International Organisation for Standardisation – the world’s largest developer of voluntary international standards.
Ultimately, where we end up in relation to science and technology is a matter of coming to terms with how we interact with these developments. Until we do so, a safe and prosperous world may elude us.