Last night, I re-watched the film Minority Report through my iTunes account, as a little exercise in exploring the real science and technological inspiration behind scifi films.
I was interested in watching the film again after speaking with David Kirby, a Senior Lecturer in Science Communication Studies at the University of Manchester and author of Lab Coats in Hollywood. Against the backdrop of a life-size cardboard cutout of a Frankenstein-like figure in his office, Kirby told me that the gesture-based computer interface technology in Minority Report was what he calls a ‘diegetic prototype,’ designed and built by John Underkoffler, a then PhD student at MIT Media Lab and current Chief Scientist for G-Speak, LLC, a firm that has brought that diegetic prototype to life.
“John, thank you for making science fiction real.”
Given the opportunity to design the technology for Minority Report as a Hollywood science consultant, Underkoffler ran with the prototype, approaching it as a real R&D project rather than a display trick for Hollywood. As a result, individuals, organizations and companies that later saw the film approached him to ask whether the technology was real, and if not, whether it could be made real.
“From Underkoffler’s perspective, his work as a science consultant on Minority Report was not simply a minor component in this story; his well-worked-out diegetic prototype was the crucial element in the development process,” Kirby wrote in Lab Coats in Hollywood.
I personally find it fascinating that vivid depictions of new technologies in science fiction can inspire and create desire for their real-life counterparts. I also wonder why we as viewers and consumers love the idea of operating new technologies that we first read about in science fiction novels or watched on the big screen. Is it because, in the contextualizing and humanizing of the technology through a story, we feel like we know the technology, and we want it more than a technology we don’t know?
As I watched Minority Report, paying special attention to the gestural interface scenes because of my talk with David Kirby, I was struck by other elements of the film that were prescient in their depiction of future scientific advancements and human use of such advancements.
For example, even the idea of “Precrime” is an instance of taking a scientific possibility – the prediction of future crime events – and showing its potential benefits and dangers. Although the idea of using visions seen by the children of drug-abusers, the “precogs,” is certainly more fantasy than science, using data mining and crime analytics to build patterns and predict future events hits closer to home.
“What is Big Data? A meme and a marketing term, for sure, but also shorthand for advancing trends in technology that open the door to a new approach to understanding the world and making decisions. There is a lot more data, all the time, growing at 50 percent a year, or more than doubling every two years, estimates IDC, a technology research firm. […] Google searches, Facebook posts and Twitter messages, for example, make it possible to measure behavior and sentiment in fine detail and as it happens.” – Steve Lohr
What if Big Data took the role of the “Precogs” in Minority Report, incriminating certain groups of people, or even individuals, based on crime patterns and trends that emerge from the analysis of massive amounts of data? Just as with the Precogs, the problem with using this resource for crime prediction is that while analysis of large amounts of data can identify trends and patterns, predicting crimes by specific individuals carries both logistical and ethical problems.
“Police forces in the UK have built up masses of crime data through years of police work, and now that computer power’s getting cheaper, it’s possible to look back at that data and analyze it, to try to build a picture of a possible future and predict crimes that might happen before they take place,” Susan Watts, science editor of the BBC's Newsnight programme, said in a video about big data and crime prediction published on the BBC technology news page on April 3, 2013.
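The trend-and-pattern analysis Watts describes can be sketched in a deliberately toy form: count historical incidents per city grid cell and flag cells whose counts cross a threshold. This is my own minimal illustration, not any real police system, and all the data in it is invented; it only shows why aggregate "hotspot" prediction is easy while individual-level prediction is not.

```python
# Toy "hotspot" sketch (hypothetical data, not a real predictive-policing
# system): count past incidents per grid cell and flag cells whose count
# meets a simple threshold.
from collections import Counter

# Invented historical incidents, each recorded as a (grid_x, grid_y) cell
incidents = [
    (1, 1), (1, 1), (1, 1), (1, 1),
    (2, 3), (2, 3),
    (4, 0),
]

def hotspots(incidents, threshold=3):
    """Return the set of grid cells with at least `threshold` past incidents."""
    counts = Counter(incidents)
    return {cell for cell, n in counts.items() if n >= threshold}

print(sorted(hotspots(incidents)))  # only the (1, 1) cell crosses the threshold
```

Note what the sketch can and cannot say: it points at a place with a history of incidents, but nothing in it identifies a person or a future event, which is exactly where the logistical and ethical problems of Precog-style prediction begin.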
Government collection of massive amounts of civilian data including phone call records, social media and online activity is also, unfortunately, not a foreign idea in our current world. Crime and terrorism prediction and prevention might just be possible with the massive amounts of data available to government agencies, as long as data measurement technology is up to snuff. The ethics of such data collection and analysis, however, is, frankly, a nightmare akin to Minority Report.
"The American public is fearful that in this massive amount of data you get that there's the ability of the federal government to synthesize that data and learn something more than maybe what was ever contemplated by the Patriot Act." – Sen. Mike Johanns of Nebraska
More than projecting and inspiring technologies of the future, science fiction puts these technologies in a cultural context through the use of story. Through narrative, science fiction can comment on how humans could and should interact with these technologies. Whether that produces public desire for a real-life form of a scifi technology, or whether it produces open concern about the possible uses of a nascent technology or scientific advancement, the power of science fiction as a narrative playground for science, as David Kirby puts it, should not be underestimated.
I'm on a science in science fiction kick right now, so if you have any feedback or examples of how scifi has inspired real-life science, please comment below or send me an e-mail!