So Facebook wants to measure our moods now. In a lot of ways, that juicy tidbit of information shouldn't surprise anyone, not with Facebook's track record. Let's take a look. We already know Facebook filters and curates our feeds, reads our messages, looks at our pictures, and sells our personal information, even our faces. Facebook tracks where we physically go. Facebook listens to what we say through our smartphone microphones. Facebook monitors the sites we visit and uses facial recognition technology to identify us in photos. Facebook builds a portfolio on each of us, regardless of whether we even belong to Facebook. In the end, Facebook does literally everything except clone us, which, curiously, would likely fit within its general philosophy as a "revenue earning opportunity."
Sure, that last point was an exaggeration in fact, though perhaps not in Facebook theory. For Facebook is doing everything it can to build another you: faster, stronger, more perfect, and more capable of maximizing the profit it hopes to extract alongside its advertisers. Sound farfetched? Not when you throw in emotional manipulation on top of everything else, which appears to be the impetus behind England opening an investigation into whether Facebook violated data protection laws with its actions. This is the striking part: Facebook is now capable of affecting and directing our behavior. Uh oh.
Most telling here is not the general public outcry over another Facebook privacy invasion but rather the total acceptance as fact, an almost acknowledged expectation, that Facebook could stoop so low. That's what happens when you build a bond of mistrust. How big a bond? According to a Reason-Rupe April 2014 poll, people trust the IRS more than they trust Facebook with their personal information. That's pretty low. But wait, it gets lower. In the same poll, when asked to choose between Facebook and the NSA, more people chose to trust the NSA.
So how does Facebook get away with it this time? As always, the company will do what it can to sweep the whole unfortunate set of circumstances under the rug. First, it will "come clean" with the public and work with regulators during any investigation. Then Sheryl Sandberg says, "we never meant to upset you." That's really big of you; thanks, Sheryl, I feel so much more trusting. Actually, that's called PR. Next, they'll toy with the details, which in this case means pointing out that the experiment ran only for a short time and impacted only several hundred thousand innocent people, slim pickings compared to its purported billion-user base.
Not that anyone believes that, which is why one of the data scientists in charge of the study wrote a sort of apology over the weekend. I say "sort of" because rather than call the glass half empty, he decided to call this one overflowing: "The goal of all of our research at Facebook is to learn how to provide a better service." Perhaps that is true, but the real question here is: better for whom? Certainly such actions and dishonesty aren't better for users. That leaves the advertisers, data brokers, executive team, board of directors, and stockholders, apparently the only constituencies whose cries keep Facebook data scientists up at night.
Facebook has become that partner whose irreconcilable differences have been glossed over too many times, the one we should have broken up with months ago if not years earlier. As irony would have it, Facebook has even developed an algorithm, based on its invasive analysis of our communications, to predict when our personal relationships will fail and in what timeframe. How about right now? Check out this cheeky yet remarkably apt video by Matthias: DELETE YOUR FACEBOOK. Thanks for the memories, Facebook. You can keep mine; you own them anyhow. I'm just moving on. It's a new day, and time for a fresh start with fun and safe partners like Sgrouples and DuckDuckGo. Cheers!