
December 28, 2020

Social Cooling: Living in a Big Data Society

Sebastian Schaub

(Aniwhite/Shutterstock)

“Data is not the new gold, it is the new oil, and it damages the social environment,” stated Tijmen Schep. Strong words, indeed. But if you’ve spent the last few years being inundated with articles espousing all the wonders that Big Data brings, then it’s high time for a wake-up call. To take Schep’s thought further: just as oil leads to global warming, so data leads to social cooling. So why should any of us be worried about social cooling?

Big Data (indeed, big companies and big industry) is largely getting out of control, with everything now being turned into data. Too much centralised power, and responsibility, rests in the hands of a few private companies, with little accountability for how the data they collect and hold is processed. The sense that Big Brother is watching you has never felt so absolute, and the fact that many of us are changing our behaviour because of this intense scrutiny is worrying, to say the least. Make no mistake, Big Data is supercharging this effect.

And we can’t talk about Big Data without discussing the part that algorithms play in all of this mayhem. Essentially, whenever your data is collected, so-called data brokers run algorithms over it to score you and to uncover all kinds of private details about you: friends and acquaintances, religious and political beliefs, even sexual orientation or economic stability.
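To make that concrete, here is a minimal, hypothetical sketch in Python of the kind of proxy-based scoring described above. The features, weights and threshold are all invented for illustration; real broker models are proprietary and far larger, but the principle is the same: a handful of innocuous signals is collapsed into a single judgement.

```python
# Hypothetical illustration of proxy-based scoring.
# Every feature, weight and threshold here is invented.
profile = {
    "follows_payday_loan_pages": 1,   # page-like data, silently collected
    "friends_with_defaulters": 3,     # social-graph data
    "late_night_activity_hours": 12,  # behavioural metadata
}

weights = {
    "follows_payday_loan_pages": 2.5,
    "friends_with_defaulters": 1.8,
    "late_night_activity_hours": 0.1,
}

# One weighted sum turns scattered, innocuous signals into a label.
score = sum(weights[k] * v for k, v in profile.items())
label = "high risk" if score > 5 else "low risk"
print(f"score={score:.1f} -> {label}")  # score=9.1 -> high risk
```

The point is not the arithmetic but the opacity: none of these inputs was knowingly handed over, and the person being scored never sees the weights.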

And this is a good time to introduce the notion of “mathwashing.” This is the belief that because algorithms are built from mathematics, they must be neutral, right? Wrong. People design algorithms (and computers) and, importantly, people decide which data to use and how to weight it. Still think algorithms are neutral? Algorithms can only work on the data that humans provide, and that data can be untidy, often incomplete, sometimes fake, and coloured by the full range of human meanings. Again, to quote Schep, “bias in, bias out.”

Algorithms are not neutral; they make mistakes because people make mistakes, and they introduce bias because people introduce bias. We have arrived in an era where computers do far more than expected, yet even those who created them aren’t fully aware of the consequences.
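Here is a minimal sketch of “bias in, bias out,” again in Python with invented data: a rule calibrated on historically skewed hiring decisions reproduces the skew, even though the code contains no explicit prejudice.

```python
# "Bias in, bias out" on invented data: equally qualified candidates
# from group B were historically rejected more often. A model trained
# on these labels never sees the word "bias"; it simply learns the pattern.
history = [
    # (years_experience, group, was_hired)
    (5, "A", 1), (5, "B", 0),
    (3, "A", 1), (3, "B", 0),
    (7, "A", 1), (7, "B", 1),
]

def hire_rate(group):
    outcomes = [hired for _, g, hired in history if g == group]
    return sum(outcomes) / len(outcomes)

# Any rule calibrated to reproduce this history re-encodes the
# group disparity as if it were a neutral fact about the world.
for group in ("A", "B"):
    print(f"group {group}: historical hire rate {hire_rate(group):.0%}")
# group A: historical hire rate 100%
# group B: historical hire rate 33%
```

The mathematics is blameless; the labels are not. That is all “bias in, bias out” means.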

(Sangoiri/Shutterstock)

So how does any of this affect you or me? Well, for one thing, your digital footprint is always there: all of your historical online behaviour. And this ‘digital reputation’ can affect you in myriad ways. For example, what you post on Facebook might influence your chances of landing that job you applied for, or even cause you to lose a job. You could find that the rate on that loan you applied for is adversely affected because you have ‘bad’ friends. And remember our earlier observations on the quality of the data driving these decisions: these effects occur whether the data is good or bad.

The growing awareness of being watched is already changing our online behaviour. Are you avoiding a particular link because you think the visit will be logged and could count against you? Are rating systems failing because people want to conform to an average? Digital reputation systems also hamper free speech: under an authoritarian regime, who dares to say what they really think about those in power when it could cause problems when applying for jobs, loans or even travel visas?

Hopefully the message that personal data is being monetised by big business is also spreading. But how many people realise that this usually happens without their knowledge or consent? Remember, everything shared online can be considered a public record: social media doesn’t offer any privacy. Perhaps this is why Facebook usage among young people in Germany is dropping rapidly?

Overall, though, public awareness of social cooling remains low. Air pollution and climate change were once relatively unknown too; now look at where we are, with the likes of David Attenborough and Greta Thunberg shouting from the rooftops. So we must raise awareness, although a big job remains in changing how data and privacy are perceived. Schep posits that when algorithms judge everything we do, we need to protect the right to make mistakes; when everything is remembered as big data, we need the right to have our mistakes forgotten.

Which leads us to the notion of privacy. Privacy is the right to be imperfect, even when judged by algorithms. We should be able to click on that link without fear, make comments without reprisal, and befriend whomever we want without affecting our ability to get a job or a loan. Privacy and anonymity are the cornerstones of any decent VPN, which lets you overcome blocks, geo-restrictions and ad tracking (a massive data compiler). I passionately believe in protecting those who wish to maintain their privacy and anonymity online, and I look forward to helping raise awareness as data mining, and the information it exposes about all of us, grows ever more aggressive.

About the author: Sebastian Schaub has been working in the internet security industry for over a decade. He started the hide.me VPN eight years ago to make internet security and privacy accessible to everybody.

Related Items:

Algorithm Gets the Blame for COVID-19 Vaccine Snafu

Anger Builds Over Big Tech’s Big Data Abuses

Big Data Backlash: A Rights Movement Gains Steam
