Black Mirror's Latest Vision Reveals Dark, Militaristic Side of Smart Tech & AR


Enter the World of AR. Awaken Your Consciousness.

Black Mirror, Netflix


Netflix Original Series, Black Mirror, Season 4

This past Friday, Netflix released the chilling fourth season of its original dark-cyber series, “Black Mirror.” Enter a world where living in a digital age has daunting and destructive consequences: a misogynistic boss and AR; digital brain implants that do more than track your location; removable memory chips that let users recall past memories and record them to a DVR-like box; smart-dating apps; and the return of Skynet-like Terminators. The series provides another installment in which the creation of such technology ultimately results in its makers' downfall.

Going Where No Clone Has Gone Before: Enter the Cloud

Welcome to Star Fleet. I mean, Space Fleet. The first episode of the season serves as a powerful yet elegant reflection of 2017's political climate as well as our digital climate. It is almost the antithesis of what Star Trek represents: freedom and self-sustaining worlds. The episode, dubbed “U.S.S. Callister,” revolves around the bullied CTO (Jesse Plemons) of a virtual gaming company. He creates a mod of his company's game that allows him to enter his own virtual reality aboard a Star Trek-like spaceship named after his character, the U.S.S. Callister. Plemons' character portrays a misogynistic space fleet captain who has created digital copies of his co-workers and uploaded them into his game. While the digital copies are conscious of their real-world experiences, they are nevertheless trapped within the coding of his program and subject to his tyranny.

Episode 1, U.S.S. Callister


http://nothingbutgeek.com/2017/12/black-mirror-u-s-s-callister-official-trailer-netflix/

Cristin Milioti's character wakes up to discover not only that she and the crew are trapped and imprisoned by this tyrant, but that they exist without genitals. Her most notable line from the episode, “stealing my pu**y is a red fucking line,” mirrored this year's national reckoning with sexual harassment claims and cases surrounding President Trump and other executives in Hollywood. Refusing to succumb to the sexual tyranny of their captain, Milioti's character stages an uprising to free the digital copies from this never-ending nightmare (behind “the firewall”), transporting them to freedom (the “cloud”).

This episode takes the viewer inside a “network” of sorts and introduces us to the prison these clones live in, i.e. “the firewall.” It ends by introducing the viewer to the vast “cloud” network, where files are stored with almost no rules. In this case, the cloud provides the ultimate “freedom” for these digital clones.

Making the Dating Game “Smarter”

The fourth episode, dubbed “Hang the DJ,” explores what we could call Smart Tinder, or Tinder 2.0. This dystopian realm imagines what it would be like if dating were turned into a “blind date, table-changing” game. The dating service pairs two individuals together, creating a mathematically formulated date: pre-chosen meals, self-driving cars (a tribute to Google), and even an “expiration date” on how long this particular relationship is made to last. The twist? They have to complete the entire length of the relationship, whether it's 12 hours, 9 months, or even 5 years.

Episode 4, Hang the DJ


http://www.independent.co.uk/arts-entertainment/tv/reviews/black-mirror-season-4-hang-the-dj-review-netflix-episode-4-dating-app-tinder-spoiler-a8131466.html

Ultimately, the viewer comes to realize that the two individuals have in fact never met, but were simply avatars, code within the “dating app,” and that we were getting an inside look at how a dating application mathematically “matches” people (through thousands of simulations) based on compatibility. The final test was how willing both parties were to recognize that the world they lived in wasn't real and that they needed to “escape.”

The ethical and legal implications here surround the creation, utilization, and destruction of a digital copy of a real-world person. What should happen to the digital copy once it fulfills its purpose? The episode also leaves the viewer wondering whether this could be the future of dating and whether “smart-dating” is the next Tinder.

Baby Monitors Are Getting Smarter...With Dangerous Consequences.

You'll never think of baby monitors the same way again. The second episode, dubbed “Arkangel,” revolves around a helicopter mother determined to protect her young daughter from the dangers of reality. Having almost lost her daughter at a park, Rosemarie DeWitt's character signs a contract with ArkAngel, a company that has developed what looks to be the future of baby monitors: a chip implant that tells the parent where the host is at all times and what they are seeing, and that provides real-life parental controls. The implant allows the controller to view the host's real-time GPS coordinates on an iPad-like device and to see the host's optic feed; you see what they see. Lastly, it gives the controller the ability to “blur” out images in front of the host that may be too explicit or “stressful.” However, once implanted, the chip cannot be removed. Talk about another ethical conundrum.

ArkAngel


http://tvline.com/2017/11/25/black-mirror-season-4-trailer-arkangel-jodie-foster-watch-video/

This year alone, we have seen the emergence of smart technology, ranging from smart personal assistants to smart home security devices. This episode centers on what is meant to be a “security device” but instead turns its host into a walking security device. It allows the controller to tune in live, or even record, what the host sees. We also saw the consequences of using this technology on both the controller and the host: eroded sanity, blackmail, a total lack of privacy, and ultimately the loss of their relationship.

From a legal perspective, scholars would assess the validity of such a practice by examining factors such as the age of consent, the validity of the contract, privacy rights, and potentially an argument for criminal liability, since the chip cannot be removed once implanted. Ethically, this could be seen as a human rights violation.

Smart-Memories and Smart Testimony: Could Memory Recallers Be Used Against You?

Under the Fifth Amendment to the U.S. Constitution, we have a right not to incriminate ourselves. Well, the third episode of the season, dubbed “Crocodile,” introduces the Memory Recaller, a new technology utilized by an insurance agency and law enforcement, creating what can only be described as “smart testimony.” The device consists of a small chip-like attachment placed on the side of the head, which sends its recordings to what looks like an old-school DVD player but is actually an audio/video recording device. The technology allows its wearer to recall past memories or experiences, which are then transcribed and recorded onto the device's screen as if they were happening live at that moment in time.

From a criminal law perspective, what happens when this technology is utilized and ultimately becomes evidence of other criminal activity? From an evidence law perspective, what happens when that memory is recorded onto the device as if it were a security camera at a bank capturing an ongoing robbery?

Applying the Federal Rules of Evidence, could that recorded memory be offered and admitted as evidence and used against the individual or witness? Or could it be objected to as hearsay? Under current law, most states would agree that a security camera tape would most likely fall under a hearsay exception. Could this device fall into the same category?

Extraction and the Rights to Your Digital Self

The final episode of the season gives viewers what could be the sequel to Christopher Nolan's movie Inception. Where Nolan presented the concept of “inception,” the act of creating an idea and implanting it into another person's consciousness so that it appears the individual generated it naturally themselves, the sixth episode of the season, dubbed “Black Museum,” introduces viewers to the concept of “extraction.” Extraction as we know it is defined as the act of removing something, typically from a physical object. However, with every yin, there is a yang. A traveler stops at a highway attraction in the desert, a museum dedicated to artifacts of past criminology. The owner of the attraction tells the traveler three loose but connected stories about how these artifacts relate to one another and, as predicted, how the technology ultimately leads to its maker's demise.

The first of the three technologies introduced is a procedure that allows a doctor to feel the full sensations of his patients in order to better understand their diagnoses: a chip is implanted in the doctor while the patient wears a mesh-like helmet. Imagine if Hugh Laurie had this in House.

Next, a mother tragically falls into a coma, leaving behind her partner and young son. The partner is offered the chance to be a free beta tester of a technology that preserves her by transferring her consciousness into his brain, giving her the chance to feel everything he feels and to communicate inside his head.

Lastly, a person on the verge of death agrees to have their entire consciousness uploaded and preserved as an everlasting, digital hologram of their real self. We are left with hauntingly beautiful questions about how we experience pain, both physically and mentally, as well as the ethical and legal implications of a person's consciousness leaving their body.

Ethical Conundrums

Starting with the ethical conundrums, the first question presented is whether extraction is a violation of human rights. The fundamental issue is the possibility of extracting an individual's consciousness from their body (leaving that body arguably useless) and downloading it into another individual's physical body or vessel.

The second question then becomes: assuming extraction is possible, is it also possible to download that extracted consciousness and upload it into another individual's physical body? However, inception isn't without its drawbacks. As in Inception, the recipient of the consciousness struggles to balance their personal life with an implanted consciousness that can see, hear, and feel every experience and sensation the recipient goes through. That battle ultimately costs them their sanity, or even their life.

Third, the writers present what might be considered an impossible scenario: could that consciousness then be uploaded and downloaded into an inanimate, everyday object?

Assuming all three questions are answered in the affirmative, is this act of “keeping one's memory alive” ethical on its face? Is agreeing to what seems like an act of God, or a result of the supernatural, ethical?

Legal Implications

All of these ethical questions present two very large legal questions. From an intellectual property and privacy standpoint, assuming digital extraction and inception are possible, who retains the rights to a person's “digital self”: the individual, the extractor, or the recipient? And in the event a contract exists among the individual, the extractor, and the recipient, is it binding under state law? What about federal law?

From a criminal law perspective, can the “extractor” (individual or business) be held criminally responsible for the murder of the person whose consciousness is being transferred? As a follow-up, what if the recipient no longer wants to play host to that consciousness? Can they be held criminally responsible for murder, or even voluntary manslaughter? With each answer comes another question.

Today, we live in an age where the technology and privacy sectors are expanding at a rapid rate, daily. We want both ends of the spectrum: on one end, we like “smarter and more efficient”; on the other, we like “privacy and control.” They seem like two pieces of a puzzle that don't fit. Yet without one, the other cannot exist; a harmony must be struck.

The yin and the yang. Technology is our biggest success, but could be our generation’s biggest downfall.

------------------------

Andrew Rossow is a Tech Contributor for HuffPost and a practicing Internet and cyberspace attorney in Dayton, Ohio. Rossow is also an Advisory Panelist with The CyberSmile Foundation and a Committee Member of Ohio Attorney General Mike DeWine's “CyberOhio Initiative.”

To stay updated on Rossow's publications, please follow his #CYBERBYTE on Twitter at @RossowEsq and his official Facebook page at @drossowlaw.
