How our personal bubble is being disrupted


By Tobias Rordorf, St. Gallen Symposium

Fake news, echo chambers, filter bubbles, social bots and psychological targeting: not a day goes by without somebody coining a new buzzword to explain the allegedly unpredictable outcomes of Brexit or Trump's victory from a digital standpoint. The fact is, never before has social media played such a pivotal role in opinion-forming; in a way, it has become the surrogate for cracker-barrel talk. In contrast to traditional news outlets, people value the ostensibly unfiltered and unbiased nature of their social media streams.

The law of the jungle: If you're not paying for it, you're not the customer - you're the product


Facebook, Twitter and co., however, have long moved on from being mere social networks to dangerously easy-to-use advertising media that finance the free use of their services by selling user data. In 2015, social networks worldwide accounted for $25.1 billion in advertising revenue (for comparison, McDonald's revenue: $25.4 billion). The roots of this success are not hard to find. The networks' ability to micro-segment their users is far more valuable than their enormous audience, and it is exactly this ability that seems to prompt businesses and campaigns around the globe to abandon blanket advertising. In other words: they do not want to shout from the rooftops and see who listens, but rather micro-target potential customers or voters and provide them with individualized messages that resonate with their personalities.

With every Google search, online purchase, survey answer and like on Facebook, we leave digital traces. In isolation, these fragments of information are virtually worthless; their value arises when they are aggregated and synthesized in one place, sorted and structured. The more connected our lives become, the more data is available to those looking not only to aggregate and synthesize it, but to communicate and advertise with it. The scope of Big Data has broadened considerably.

The most recent and astonishing example of how our data can be used to individualize messages is the work of Cambridge Analytica (CA) in the U.S. primaries and during the presidential election. CA's analysis helped the Trump team appeal to potential voters with exactly the right messages at the right time. But how do they do it? They combine "our online personas with our offline selves". CA's approach is to model personalities by chaining together data points (information on demographics, consumer and lifestyle habits and political affiliation, as well as unique psychographic information on motivation and decision-making) that are either obtainable from social networks or can be bought from consumer-data giants. These digital profiles (aggregations of psychographic information) can then be used to create personalized messages that resonate with the targeted personalities, stimulating engagement and wielding influence. "It's personality that drives behavior, and behavior influences how you vote," says CA CEO Alexander Nix. The insight hardly seems groundbreaking, but it exemplifies an important point.
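To make the general idea of "chaining data points into a profile, then picking a message" concrete, here is a deliberately toy sketch in Python. It is not Cambridge Analytica's actual (proprietary) method; every attribute, trait weight and message variant below is invented for illustration only.

```python
# Illustrative sketch only: a toy version of psychographic micro-targeting.
# All attributes, trait weights and message variants are hypothetical.
from dataclasses import dataclass

@dataclass
class DataPoint:
    source: str      # e.g. "social_network" or "consumer_data_broker"
    attribute: str   # an observed behavior or purchase
    weight: float    # assumed reliability of the observation

# Hypothetical mapping from observed attributes to Big Five-style traits.
TRAIT_MAP = {
    "likes_outdoor_brands": ("openness", 0.4),
    "subscribes_news_magazine": ("conscientiousness", 0.3),
    "frequent_late_night_posts": ("extraversion", 0.2),
}

def build_profile(data_points):
    """Chain individual data points into a crude psychographic profile."""
    profile = {}
    for dp in data_points:
        trait, score = TRAIT_MAP.get(dp.attribute, (None, 0.0))
        if trait:
            profile[trait] = profile.get(trait, 0.0) + score * dp.weight
    return profile

def pick_message(profile, messages):
    """Select the message variant keyed to the user's strongest trait."""
    if not profile:
        return messages["default"]
    dominant = max(profile, key=profile.get)
    return messages.get(dominant, messages["default"])

messages = {
    "openness": "Imagine what change could look like.",
    "conscientiousness": "Here are the facts, point by point.",
    "default": "Make your voice heard.",
}

points = [
    DataPoint("social_network", "likes_outdoor_brands", 1.0),
    DataPoint("consumer_data_broker", "subscribes_news_magazine", 0.5),
]
print(pick_message(build_profile(points), messages))
```

Even this crude version shows the core move: once scattered traces are fused into a single profile, the choice of which message you see is no longer random but keyed to who the model thinks you are.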

There are, of course, many doubts about the accuracy and effectiveness of this methodology, and many critics use harsh words to dispute the potential of Cambridge Analytica's approach. However, since much of the criticism focuses on technical aspects (e.g. dark posts), the question arises whether it does justice to the scope of the method described above.

Personally, I do not feel much agreement with the critics - not because I approve of how our lives are being X-rayed by profiling, but because of the way the criticism is derived. In my view, the critics miss the bigger picture. By channeling their energy into the "how", they miss the essence: the reach of Cambridge Analytica's approach, and thereby its breach of moral ground, in influencing our ability to be self-conscious and self-aware.

The case of Cambridge Analytica bluntly illustrates that the stage of mere awareness that our data is being aggregated has passed. Without doubt, we are starting to learn what it means to sit in a glass box: our privacy is not only jeopardized by digital profiling (nurtured by Big Data), but that profiling in turn exerts influence on our digital behavior.

The disruption of our bubble: Learn to like what you dislike!

Futurist Khanna's idea of smart contact lenses that would blank out homeless people from view seems quite fatuous and morally questionable. But what if our social media streams become exactly those smart contact lenses and fabricate a personal ecosystem of information that leaves out anything that does not match our individual preferences? At first we would presumably enjoy the good side of our digital smart contact lenses, simply because we are presented with what corresponds to our interests and beliefs. We feel courted, maybe even slightly flattered, when addressed directly (since it satisfies the desire for tailored communication). Yet this is also the downside. Imagine a situation in which you are confronted solely with views, information and opinions that resonate with your personality and reality. Let's call this situation the personal bubble. Data-driven, micro-targeted communication thus disrupts the way we behave and engage digitally.

Bluntly put, one could contend that your own digital traces dictate the boundaries of your personal reality, your bubble - a self-inflicted, subconscious censorship, so to speak. This leaves our perception biased and vulnerable to manipulation as we become intellectually isolated.
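How a feed can narrow itself in this way is easy to sketch. The following minimal, assumption-laden example ranks content by how often its topic appears in a user's past clicks; the topic labels and scoring are invented and stand in for whatever signals a real platform might use.

```python
# Sketch of an engagement-driven feed narrowing into a "personal bubble":
# items similar to past clicks are ranked up, so the user's own traces
# increasingly bound what they get to see. Topics are purely illustrative.
from collections import Counter

def rank_feed(candidate_items, click_history):
    """Rank items by how often their topic appears in the click history."""
    topic_counts = Counter(item["topic"] for item in click_history)
    return sorted(
        candidate_items,
        key=lambda item: topic_counts.get(item["topic"], 0),
        reverse=True,
    )

history = [{"topic": "politics_left"}] * 5 + [{"topic": "sports"}]
candidates = [
    {"title": "Opinion piece A", "topic": "politics_left"},
    {"title": "Opinion piece B", "topic": "politics_right"},
    {"title": "Match report", "topic": "sports"},
]

for item in rank_feed(candidates, history):
    print(item["title"])
# Dissenting views ("politics_right") sink to the bottom of the feed,
# while familiar topics dominate: the bubble reinforces itself.
```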

Technological abilities do not necessarily lead to desirable and ethical outcomes (as in the case of smart contact lenses). The case of Cambridge Analytica impressively demonstrates the powerful consequences of Big Data and data science. But it also reminds us that the seemingly endless prospects of technology should not be credited with magical abilities, or they will eventually leave us in mindcuffs. Our digital behavior is being disrupted and influences the way we act on- and offline. It is about time we questioned the common moral ground and advocated transparency of algorithmic filters. But whose responsibility is it to set the boundaries? For now, it's you and me!

The dilemma of disruption will be debated at the 47th St. Gallen Symposium, held from 3-5 May 2017 in Switzerland.
