
Fischer: Prepare for the present

By Travis Fischer

So the first trailer for the latest entry in Tom Cruise’s “Mission: Impossible” movie franchise is out, and once again Ethan Hunt and his team have to go rogue to save the day with a series of car stunts and leaps off of high objects.

This time around their mission, if they choose to accept it, is to stop a megalomaniac who has developed a high-tech computer system that tracks everybody in the world on an individual level and can predict, and presumably control, what they do at will.

In other words, it’s a mission to take down Mark Zuckerberg?

It’s a little late to be getting into a knife fight on top of this train, right?

One of the legitimately productive functions of entertainment media is to create a space where the general public can workshop these kinds of problems before they happen in the real world.

Every time a robotics team builds something that can walk, run, or jump in a human-like way, the collective response is always “Hey, that’s neat and all, but… we all saw ‘Terminator,’ so let’s just keep an eye on that.”

I don’t think anybody is worried about the immediate threat posed by a robot capable of adjusting its balance on the fly, but James Cameron has ensured that we’ll always give advancements in that field a little bit of a side-eye, and that’s probably a good thing.

It’s been almost 40 years since “The Terminator,” and even back then an entire generation had already grown up with the concerns about robot overlords in the back of their mind. Isaac Asimov wrote the Three Laws of Robotics back in the 1940s.

We’ve been thinking about safeguards needed to control artificial intelligence since before the transistor radio was invented.

Last fall a robotics group set a world record with “Cassie,” a bipedal robot capable of running a 100-meter dash in under 25 seconds. Cassie isn’t programmed to do anything other than run yet, but if that day ever comes we’ll have at least 80 years of making sure that Asimov’s Laws are a consideration.

But you know what Asimov didn’t think of? Social media and how seemingly innocuous personal data can be collected and used to manipulate people.

More than a decade ago, Target caught a bit of heat when it came out that the company had automated its system to identify shoppers who might be pregnant based on the particular types of items they were buying. The system would then send out advertisements at strategically timed points based on how far along in their pregnancies those shoppers were estimated to be.

This was already happening when “Person of Interest” premiered on CBS. That show was about do-gooders using an AI to predict when criminal acts would happen and intervening, only to encounter another AI that took things a step further by manipulating people on an individual level toward a goal of world domination.

Flash forward to today, when artificial intelligence suddenly seems to be making advances at an exponential rate, and we seem to be barreling toward a different kind of science fiction future, one that we haven’t spent the last eight decades preparing for.

The world is only just starting to respond to the impact that algorithmically controlled content has on people. Just last week the Supreme Court ruled on cases determining whether Google and Twitter can be held responsible for their algorithms “recommending” videos that incite violence and terrorist attacks.

The applications and capabilities of AI, which is just algorithms taken to the next level, have rapidly expanded recently, and if commercial media is any indication, the wildest dreams of our science fiction writers may be closer to reality than anybody thinks. If Ethan Hunt is trying to stop somebody from using personal data and algorithms to take over the world in the movies, we may already be there.

Travis Fischer is a news writer for the Charles City Press and should give “Person of Interest” a rewatch, both for modern relevancy and because it was really, really good.
