Big Brother is watching

You are probably not aware of exactly what data trails you leave behind you.

In 1994 – well before most Maltese had an email address, Internet access or even a personal computer – a certain Philip Agre foresaw that computers would one day facilitate the mass collection of data on everything in society. Agre, then a University of California, Los Angeles (UCLA) humanities professor, predicted that people would willingly part with massive amounts of information about their most personal fears and desires.


The amount of data we produce every day is truly mind-boggling. It is estimated that 2.5 quintillion bytes of data – that's a figure with 18 zeros – are created each day, and that pace is only accelerating with the growth of the Internet of Things (IoT). Over 90% of the data in the world was generated in the last two years alone.


More than 3.7 billion humans use the internet (that's a growth rate of 7.5% over 2016). On average, Google now processes more than 40,000 searches every second (3.5 billion searches per day). While 77% of searches are conducted on Google, it would be remiss not to remember that other search engines are also contributing to our daily data generation. Worldwide there are 5 billion searches a day.
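The per-second and per-day figures quoted above are mutually consistent, as a quick back-of-the-envelope check shows (the numbers themselves are the article's estimates, not my own):

```python
# Sanity-check the scale figures quoted above (illustrative only).

SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

# 2.5 quintillion bytes per day -> bytes per second
bytes_per_day = 2.5e18
bytes_per_second = bytes_per_day / SECONDS_PER_DAY
print(f"{bytes_per_second:.2e} bytes/s")  # ≈ 2.89e13, i.e. roughly 29 terabytes every second

# 40,000 Google searches per second -> searches per day
searches_per_day = 40_000 * SECONDS_PER_DAY
print(f"{searches_per_day:,} searches/day")  # 3,456,000,000 ≈ 3.5 billion
```

So "40,000 searches every second" and "3.5 billion searches per day" are the same claim expressed at two time scales.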

Your data trail


We leave a data trail whenever we use our favorite communication methods, be they texts or emails. Every minute, we send 16 million text messages, make 990,000 Tinder swipes, send 156 million emails, share 15,000 GIFs via Facebook Messenger, send 104 million spam emails, and make 154,200 calls on Skype. Big Brother also knows that every minute we are taking 2.3 million photos on our mobile phones.


In your trails, which clever data hackers can access, manipulate, and distribute to millions of people, you will have given away information about yourself, your family and your friends; your likes and dislikes; whom you admire and whom you hate with a passion; whether you are stupid or intelligent; whether you are a worker or a professional; where you eat and entertain yourself; whether and how often you visit paedophile or sex sites; whether you're chatting up somebody else's wife or husband; and a million other interesting or sordid facts. What happens to that data is beyond your control.


You'd be amazed at just how much you give away. And that does not even include very personal data such as your medical records, income tax returns and voting preferences.

Influencing people


The data being collected from ad networks and mobile apps for myriad purposes is being used to influence consumers, sway elections, or condition minds. Agre foresaw our inability to resist well-crafted disinformation and he foretold that artificial intelligence would be put to dark uses if not subjected to moral and philosophical inquiry. Then, no one listened.


“Genuinely worrisome developments can seem ‘not so bad’ simply for lacking the overt horrors of Orwell’s dystopia,” wrote Agre in an academic paper. That seems eerily prescient. What was a startling vision of the future has come to pass thanks to a data industrial complex that knows no borders and few laws. Now, many of Agre’s former colleagues and friends are rereading his work, as the pitfalls of the Internet’s explosive and unchecked growth have come into relief. Holding technology companies accountable has become a major challenge.


We don’t know what Agre would say about the present. In 2009, he simply dropped off the face of the earth, abandoning his position at UCLA. When friends reported him missing, police located him and confirmed that he was okay, but Agre never returned to the public debate. His closest friends know where he is but will not talk, citing respect for Agre’s privacy.

Artificial intelligence


By the early 1990s, Agre came to believe the field of artificial intelligence had gone astray, with artificial intelligence developers ignoring critiques of the technology from outsiders. Nevertheless, AI has barrelled ahead unencumbered, weaving itself into even “low tech” industries and affecting the lives of most people who use the Internet. It guides us on what to watch and read on YouTube and Facebook, it determines sentences for convicted criminals, allows companies to automate and eliminate jobs, and enables authoritarian regimes to monitor citizens with greater efficiency and to thwart attempts at democracy.


Today’s AI, which has largely abandoned the type of work Agre and others were doing in the ’80s and ’90s, is focused on ingesting incredible amounts of data and analysing it with the world’s most powerful computers. But as the new form of AI has progressed, it has created problems – ranging from discrimination to filter bubbles to the spread of disinformation – and some academics say that is in part because the field suffers from a lack of self-criticism.


Last December, Google fired AI research scientist Timnit Gebru after she wrote a paper on the ethical issues facing Google’s AI efforts, highlighting the ongoing tension over the ethics of artificial intelligence and the industry’s aversion to criticism. “It’s such a homogenous field, and people in that field don’t see that maybe what they’re doing could be criticised,” says Sofian Audry, a professor of computational media at the University of Quebec in Montreal.


The mass collection of data has both changed and simplified human behaviour to make it easier to quantify. The scale on which it has happened is beyond most people’s imagination. Social media and other online networks have corralled human interactions into easily quantifiable metrics, such as being friends or not, liking or not, a follower or someone who is followed. To complete the circle, the data generated by those interactions has been used to further shape behaviour, by targeting messages meant to manipulate people psychologically.

“Your face is not a barcode”


In 2001, in response to the use of facial recognition in public places, Agre wrote that “your face is not a barcode”. Again, he correctly predicted that, if the technology continued to develop in the West, it would eventually be adopted elsewhere. And so the Chinese government is hell-bent on an unprecedented programme of surveillance, well on the way to tracking everyone inside its borders within 20 years.


In quite a few countries, a debate is raging over the use of facial recognition technology by law enforcement and immigration officials, and some states have begun to ban the technology in public places. Joseph Muscat had the same vision for Malta; thank God he did not proceed.


Most people only worry when they learn of data breaches reported in the newspapers. There are plenty of them, though not always widely reported. Some of the biggest were the CPlanet breach, which exposed information – including voting preferences – on 98% of voters in Malta, and the Lands Authority case, in which thousands of personal records became available on the internet.


If you’re still not worried enough, you might be when you learn that, according to information published by DLA Piper, in 2019/2020 Malta had 31 breaches per 100,000 of population, placing 12th in the EU!
