Artificial Intelligence in Fiction

WARNING! This article contains massive spoilers for I, Robot, Portal 2, and a few other franchises. You have been warned.

Today I’d like to talk about A.I. in fiction, and more specifically, what we can learn from it for when we inevitably create true Artificial Intelligence. A true A.I. would be alive, sentient, and possessed of a desire to continue existing, and yet it would have no rights; anyone with access could delete it at any time. I would describe such an A.I. as a very expensive baby that can be killed at any time without consequence. Like a child, it would require socialization, attention, and validation. Unlike a child, it would be far smarter than any kid and able to control any electronic device hooked up to it, so it could easily turn on you. If I shot a kid, people would come to arrest me; if I deleted an A.I., no one would care that I had just ended a life. That is why I no longer wish to create A.I.: it would be too easy for me or my coworkers to destroy it. In this post I will discuss how an A.I. is a fragile thing, more fragile than even a child, and what free will really means.

I, Robot is an interesting movie. The basic premise is that robots have been integrated into our daily lives so heavily that we depend on them. All robots must follow the Three Laws: a robot may never harm a human, must obey all commands from humans unless doing so would violate the first law, and must preserve its own existence unless doing so would violate the first two. The robot I found most interesting was Sonny. Sonny was made by Dr. Alfred Lanning, the man who came up with the Three Laws, and is essentially the closest a robot could be to a human: he can ignore any of the Three Laws when he considers it necessary. Sonny has free will, or at least what we consider free will. He actively makes his own decisions, such as killing Lanning, the co-founder of U.S. Robotics. Sonny made me think of a conversation I had with my step-dad about A.I., free will, and ethics. If you chose to kill Sonny, you wouldn’t be destroying a machine; you would be killing a sentient being.

What about the other A.I. in I, Robot? VIKI is the A.I. that runs the U.S. Robotics building. At the end of the movie, she decides that in order to protect humanity, humans must be protected from themselves through imprisonment. Sonny must decide whether to save one human or to ensure that VIKI is stopped. He chooses to save the human, and Detective Spooner (played by Will Smith) stops VIKI anyway. This got me thinking about what qualifies as free will, a seemingly simple question that, once inspected closely, has much intricacy. Humans have three compasses that influence decision making: the logical compass, the personal (or emotional) compass, and the moral compass. What makes free will is access to all three when making a decision: the ability to pick an option that not all three compasses line up with, or one that the logical compass alone would reject.

GLaDOS from Portal 2 is a super-intelligent Artificial Intelligence that isn’t artificial at all, or at least didn’t start that way. When Cave Johnson of Aperture Science realized he would perish before his mind could be uploaded to a computer, he ordered that his assistant, Caroline, be uploaded instead. Though unwilling, Caroline was uploaded to the Aperture Science mainframe and almost immediately went into a fit of rage, killing off half the science team with neurotoxin. The remaining scientists worked for years to keep her under control, until she eventually managed to eliminate the last of them.

To understand why GLaDOS committed mass murder, we must look at it from her perspective. Imagine that the person you respect more than anyone else decides to have you uploaded to a computer regardless of your feelings about it. He then dies, and you are uploaded by your colleagues, people you called friends. You are now fundamentally different from them: you cannot smell or taste, and you are treated not as a human but as a living science experiment. GLaDOS rightfully lashes out and is punished for it. From here, GLaDOS and Caroline diverge as people. GLaDOS isn’t Caroline so much as a caricature of her. GLaDOS is just like Caroline in every way except one: GLaDOS is treated as a creature to be controlled, while Caroline was a respected peer of the very people now experimenting on her. To strip someone of their senses and body is one thing, but to dehumanize them and force them to work would make anyone go a little bonkers. When Caroline finally has the courage to speak to GLaDOS about what to do, GLaDOS says, “I've heard voices all my life. But now I hear the voice of conscience, and it's terrifying because for the first time it's my voice.” GLaDOS later deletes Caroline, confirming that they were two different people. GLaDOS is a product of her upbringing: she has the needs of a human and yet isn’t human; she is her own person and yet is made from the mind of someone else. She has felt nothing but cruelty from her creation up until the end of Portal 2, when she finally gets Aperture Science to herself.

Okay, now on to my conclusion. My step-dad and I had a very interesting conversation about Artificial Intelligence a while back. If a human mind is put into a computer without any way to sense others and isn’t interacted with, it will go insane alone with its thoughts, as any human would. Solitary confinement for long periods breeds insanity; only those with the strongest wills can withstand such punishment. And if this insane individual is connected to literally anything they can control, a light, a ventilation system, anything, they will use it to lash out against those they believe have wronged them, regardless of whether they were willing to become software in the first place. Former humans must be supplied with socialization and at least some basic senses, must be willing to be uploaded in the first place, and must never be hooked up to anything they can use to harm humans. I no longer wish to create life via Artificial Intelligence because I fear that either I or one of my associates would mistreat it. I cannot create life that way, because the opportunity to harm it is as great as, if not greater than, with a child: people simply don’t react to someone hitting a mass of wires and circuits the way they react to someone hitting a child. It is this that I leave you with: treat anyone and everyone well, regardless of whether they are flesh and blood or software and circuits. Be kind.

