It's free, accessible and user-friendly. It has many functionalities that schools and teachers love. But is it worth risking the privacy of students who use it as well as potentially that of their families?
Last November, Hart Research released a study stating that 76% of teens are concerned about maintaining their online privacy. It comes as no surprise that students are becoming savvier about identity theft: on average, 500,000 kids now have their identities stolen each year. Unfortunately, most don't find out their identity has been compromised until they apply for an educational loan or a credit card.
We have already seen privacy alerts in the healthcare field; now we are facing the same threats in our schools. Although HIPAA regulations are in place to protect consumer health records, big businesses, such as Big Pharma, have found ways to get around them and intrude on our privacy. Education has a similar law: the Family Educational Rights and Privacy Act (FERPA), which protects the privacy of student education records.
With this information at hand, it is imperative for parents and students to know what is at risk when their schools rush to use free apps offered by Google, the world's biggest advertising firm.
I recently had the opportunity to interview Jeff Gould, the President of SafeGov, about our concerns regarding Google Apps for Education (GAFE).
Q. Exactly what is data mining and why should parents and schools be concerned about it?
A. In the general sense, "data mining" means what the metaphor suggests -- it is the practice of using all sorts of smart software algorithms, ranging from the simple to the incredibly sophisticated, to sift through vast mountains of data in search of "nuggets" of hidden meaning or patterns that can be useful for scientific or business purposes.
Data mining can be a very good thing when it is used for legitimate purposes. For example, scientists use data mining to search through the medical records of thousands of cancer patients to identify unsuspected risk factors or responses to drug treatments. This kind of data mining can save lives. It becomes controversial when businesses use it to figure out what consumers might want to buy before they actually buy anything. A famous example came in a 2012 New York Times story recounting how the retail chain Target used data mining to infer from a teenage girl's buying habits that she was pregnant. The controversy came from the fact that Target figured this out before the girl's own family did.
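To make the idea concrete, here is a deliberately tiny sketch of the kind of pattern-finding described above, using entirely made-up shopping data (the groups, items, and numbers are hypothetical, not Target's actual method): it measures how much more common a purchase is in one group of shoppers than another, which is one simple way a retailer could flag "telltale" items.

```python
# Toy illustration of pattern-finding in purchase data (hypothetical data).
# Purchase histories for two made-up groups of shoppers.
group_a = [  # shoppers later known to be expecting a baby
    {"unscented lotion", "cotton balls", "zinc supplement"},
    {"unscented lotion", "large tote bag"},
    {"cotton balls", "zinc supplement"},
]
group_b = [  # other shoppers
    {"scented lotion", "soda"},
    {"cotton balls", "chips"},
    {"soda", "chips"},
]

def item_rate(baskets, item):
    """Fraction of baskets that contain the item."""
    return sum(item in b for b in baskets) / len(baskets)

def lift(item):
    """How many times more common the item is in group A than in group B."""
    a, b = item_rate(group_a, item), item_rate(group_b, item)
    return a / b if b else float("inf")

for item in ["unscented lotion", "cotton balls", "soda"]:
    print(item, lift(item))
```

An item that shows up far more often in one group than the other (a high "lift") becomes a predictive signal; real systems do this across millions of shoppers and thousands of items, but the underlying logic is the same.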
Q. How does Google use data mining and how does it relate to GAFE?
A. You may think of Google as a search engine. But they are first and foremost an advertising company -- in fact, the largest advertising company in the world. Google actually sells more ads in the United States than the entire U.S. newspaper industry, and that's only a fraction of their total business. The way they got to be so big is that they have developed the world's most powerful data mining techniques for understanding what consumers are likely to do. When you use a web site or service provided by Google, they are monitoring everything you do or say.
If you use Google search, they keep a record of everything you search for and which links you clicked on. If you use Gmail, they read all of your email (inbound and outbound) and use their software to figure out what you are interested in. They know what you watch on YouTube. They can even track most of the non-Google web pages you visit, because whenever you open a web page that contains an ad placed by Google's subsidiary DoubleClick or a Google +1 button (their equivalent of a Facebook "Like"), Google knows that you have seen that page. If you use WiFi at home, Google knows where you live, because their StreetView cars collect WiFi addresses in addition to taking pictures of your house.
So in short, Google knows a vast amount about you, probably far more than you realized. They combine all of this information to create an incredibly detailed -- and perhaps shockingly accurate -- picture of who you are. And they are doing this not just for you, but for hundreds of millions of other people too -- all day, every day.
Unfortunately, this kind of tracking and profiling is also being applied to GAFE. This is where we get onto a slippery slope about business models based on data mining and user profiling. I think most people agree that these business models are acceptable in the consumer market when people know what is going on and agree to be tracked in exchange for what are admittedly some really great free services. But when this super-intrusive and sometimes frankly rather creepy model gets introduced into schools, that is where I -- and I think most parents -- draw the line. After all, we accept that fast food joints and junk food brands are allowed to advertise their products on TV or on billboards. Big Macs, Twinkies and Coke are probably bad for you -- certainly if you eat them every day -- but most of us agree that they shouldn't be completely banned by law.
However, most of us also agree that these kinds of products should not be advertised or promoted in schools. And it's the same for targeted online advertising based on data mining and profiling. It has its place in our consumer society, but not in our schools!
Q. What does Google do with all this information and why should parents be concerned?
A. They use this information -- which of course they constantly update every time you touch the Internet -- to decide which ads to put in front of you when you visit a web page they control. This might be one of their own web pages or a page belonging to one of the thousands of publishers who use Google's services to sell ads. Google only gets paid when a user actually clicks on an ad, so they have every incentive to pick the ads that you are most likely to click on. And they have become extraordinarily good at making these predictions. This is the power of data mining applied to the task of profiling users and understanding what they are likely to do next.
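The incentive described here can be sketched in a few lines. This is a simplified, hypothetical model (the ads, bids, and probabilities are invented, and real ad auctions are far more complex): because a pay-per-click network earns nothing unless the user clicks, it ranks ads by predicted click probability times the advertiser's bid, not by the bid alone.

```python
# Hypothetical sketch of pay-per-click ad selection.
# Each ad: (name, advertiser's bid per click in dollars, predicted click probability).
ads = [
    ("luxury watch", 2.00, 0.01),
    ("running shoes", 0.50, 0.08),
    ("pizza coupon", 0.25, 0.12),
]

def pick_ad(ads):
    # Choose the ad with the highest expected revenue: bid * P(click).
    return max(ads, key=lambda ad: ad[1] * ad[2])

best = pick_ad(ads)
print(best[0])
```

Note that the highest bidder (the watch, at $2.00) loses to the shoes, because $0.50 × 0.08 = $0.04 of expected revenue beats $2.00 × 0.01 = $0.02. This is why accurately predicting what each individual user will click on -- the profiling described above -- is so valuable to the network.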