It’s time for another round of grumbling about Orwell’s “1984” becoming reality. The Cambridge Analytica scandal has added fuel to the privacy debate, but does the outcome even matter?
Cambridge Analytica’s Facebook-connected app gave the company access to far more data than it should have had. Now that everyone’s had time to wave handkerchiefs and faint, are we actually surprised? It’s no secret how much personal information is available to the government, and Facebook itself is a private organization built around collecting personal details and sharing them with other companies in the name of “being social.” This is the very website that needs your contact list and all your profile information to tell you what kind of fruit your spirit animal likes to eat. So, again, why is anyone surprised that there are leaks?
If you didn’t already know that your data on social media was unsafe, you should have. But I’d like to circle back to the original question: does the outcome of this matter, even if reaching a conclusion is possible?
The dilemma underlying the privacy debate centers on a threshold of reasonableness for the data we put into cyberspace. This isn’t a value that belongs to a single political camp. Trump’s campaign is being tied to the Cambridge Analytica mess, but Obama’s campaign employed similar data tactics. It’s refreshing to have a non-partisan issue, but the nebulous argument and its many questions make it difficult to form workable policy. Why should an Instagram account require the same data as a government form? Shouldn’t companies disclose with whom they share our information? Could we allocate money for an awareness campaign? Can we stop discussing the issue in legalese? Maybe we should just freak out and beg lawmakers to do something without any substantial facts … no, wait. We did that, and we got the PATRIOT Act, which set a precedent for how little control we can exert over our own data.
The point is, a conclusion with teeth would be hard to reach, and it likely would not matter. We would keep signing away our data, uploading GPS coordinates, and skimming past the terms and conditions whether we want to or not, because work and education so often require technological connection. Any law would need an impact minimal enough to still let society function. So again, if a solution does come, will it matter? Unless our culture is ready to change, it won’t. And for some, that unrelenting technological inertia feels more Orwellian than anything Facebook could ever do.