
Nowhere to Hide – How Blade Runner Could Have Used AI to Find the Replicants

Barbara Murphy
October 10, 2017

All of us Blade Runner fans have been waiting in anticipation for Blade Runner 2049, so in preparation I decided to rewatch the original 1982 film, which came out long before the world went digital.

Set in the future (2019), the movie showed how humans had successfully created “replicants” to do their off-world dirty work. These fictional bio-engineered beings had taken artificial intelligence to its zenith and could only be distinguished from humans by the Voight-Kampff test, a machine that detects and registers the presence of empathy. Identifying the rogue replicants, however, required tracking them down, interrogating them in person, and hoping their responses would betray the lack of empathy that marks a non-human.

Fast-forwarding to 2017, it is both interesting and amusing to see what the 1982 movie got wrong and how we would go about solving the problem today using AI techniques.

But to begin, let’s review a short but notable list of things that are glaringly wrong when compared to 2017:

  1. We had figured out how to make superhuman copies of ourselves using AI, to the point where they could not be distinguished from humans
  2. But people were still reading newspapers
  3. People smoked in meetings
  4. Nobody carried a cell phone
  5. Computers were still green text based CRTs
  6. Cars could fly but required a driver


A recent edition of The Economist reveals how far machine learning has really taken us. Artificial intelligence (AI) systems are actually the perfect instruments to find replicants. They are superb at identifying patterns in large amounts of data: they can detect cancer, identify personal traits, recognize an individual face with over 90% accuracy, and even reconstruct facial images from DNA.

What AI has not yet been able to do is build systems that can replace human beings in most activities, and it has been hindered by the computational and data processing power required to train a machine to be “human-like.” The AI revolution is just starting, driven by the development of massively parallel compute and storage systems.

GPUs from companies like Nvidia can process thousands of threads simultaneously, enabling machines to perform facial recognition with far greater accuracy than humans can. But it takes a new class of storage system to keep these data-hungry GPUs fed: it has to be massively parallel, scalable, and high performance, and the data has to be sharable. A single GPU server can consume data at tens of gigabytes per second, and WekaIO Matrix has demonstrated the ability to deliver close to local-disk file system performance across a shared, distributed file system.
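To make the “keeping GPUs fed” point concrete, here is a minimal sketch of the pattern in a generic PyTorch training pipeline: many parallel reader processes pulling images off a shared file system mount so the GPU never stalls on I/O. The mount point, file layout, and parameters are hypothetical illustrations, not WekaIO’s actual API.

```python
# Sketch: parallel data loading from a shared, distributed file system.
# The mount path /mnt/shared-fs/training-images is a hypothetical example.
from pathlib import Path

import torch
from torch.utils.data import Dataset, DataLoader
from torchvision.io import read_image, ImageReadMode
from torchvision.transforms.functional import resize


class SharedFSImageDataset(Dataset):
    """Reads training images straight off a shared file system mount."""

    def __init__(self, root="/mnt/shared-fs/training-images"):
        self.paths = sorted(Path(root).glob("*.jpg"))

    def __len__(self):
        return len(self.paths)

    def __getitem__(self, idx):
        # Each DataLoader worker issues its own reads, so aggregate throughput
        # scales with num_workers as long as the storage can keep up.
        img = read_image(str(self.paths[idx]), mode=ImageReadMode.RGB)
        return resize(img.float() / 255.0, [224, 224])


loader = DataLoader(
    SharedFSImageDataset(),
    batch_size=256,
    num_workers=16,    # parallel readers hammering the shared file system
    pin_memory=True,   # faster host-to-GPU copies
)

device = "cuda" if torch.cuda.is_available() else "cpu"
for batch in loader:
    batch = batch.to(device, non_blocking=True)
    # ... training step would go here ...
    break
```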

It will take millions of compute hours and hundreds of petabytes of data just for the basic training of an autonomous vehicle. Imagine what it will take to create the autonomous human, aka the Blade Runner replicant. So with 2017 technology, Deckard could have saved himself the proverbial needle-in-a-haystack search by simply identifying the rogue replicants using facial recognition gleaned from millions of security cameras. Of course, with 2017 technology he could have tracked them down with something as simple as a cell phone signal or an embedded RFID chip, but then there wouldn't be a Blade Runner 2049 sequel.
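For the curious, here is a minimal sketch of the kind of search that paragraph imagines: comparing faces found in security-camera frames against a small “replicant watchlist.” It uses the open-source face_recognition library purely as my choice of illustration; the file paths and the watchlist are hypothetical, and a real deployment would run this across millions of frames on a GPU cluster rather than on one machine.

```python
# Sketch: match faces in a camera frame against a watchlist of known faces.
import face_recognition

# Known mugshots (hypothetical files) -> 128-d face encodings.
watchlist = {}
for name in ["roy_batty", "pris", "zhora", "leon"]:
    image = face_recognition.load_image_file(f"watchlist/{name}.jpg")
    encodings = face_recognition.face_encodings(image)
    if encodings:
        watchlist[name] = encodings[0]


def scan_frame(frame_path, tolerance=0.6):
    """Return the watchlist names whose faces appear in one camera frame."""
    frame = face_recognition.load_image_file(frame_path)
    hits = []
    for unknown in face_recognition.face_encodings(frame):
        distances = face_recognition.face_distance(list(watchlist.values()), unknown)
        for name, distance in zip(watchlist.keys(), distances):
            if distance <= tolerance:  # smaller distance = more similar face
                hits.append((name, float(distance)))
    return hits


print(scan_frame("cameras/sector_4/frame_000123.jpg"))
```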
