Comment & Analysis
Mar 27, 2026

Information and Control: Are We Living in a Post-Privacy World?

As concerns over the use of algorithms to automate interpersonal processes rise, critics wonder whether privacy and the right to privacy still exist the way they used to

Irune Camps Sánchez, Contributing Writer
Photo from Integral Life

William O. Douglas once said that “the beginning of all freedom is the right to be left alone”; that is, the right to privacy. Privacy has historically been associated with a myriad of things, including privilege (private planes or private islands), secrecy, a lack of transparency, nonconformity, dissent, and the taboo. Nowadays, privacy is understood as an issue of power: the relationships of power between the individual, the state, and the market. It can be argued that we currently live in a post-privacy world. Privacy in the sense of individuals keeping information about themselves to themselves no longer exists. Instead, privacy is turning into a relationship between the individual and organisations: a question of which secrets those organisations hold about us, and what they do with them.

Concerns around digital privacy began to rise in the early 2000s, leading computer scientists to develop a privacy-protection framework called “differential privacy”. The technique is aimed at protecting an individual’s privacy while data about their online behaviour is continuously collected and analysed for patterns. Algorithms that use differential privacy add carefully calibrated randomness to the data, so that no individual’s contribution can be identified, without meaningfully changing the overall results. This has made users more willing to share their data, since their identities remain protected, and the mechanism has become increasingly common: it is currently used in iPhones and in some government censuses. Social media companies also use this technology to learn about user patterns and refine their algorithms to maximise the time users spend on their apps.
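The core idea can be illustrated with “randomized response”, one of the simplest differential-privacy mechanisms. This is a toy sketch, not the specific algorithm used by any company or census named above: each person’s report is flipped to a random answer with some probability, making any individual answer deniable, while the true population rate can still be estimated from the aggregate.

```python
import random

def randomized_response(true_answer: bool, p: float = 0.75) -> bool:
    """Report the true answer with probability p; otherwise report a coin flip.

    Any single report is plausibly deniable, yet aggregate statistics
    over many reports remain recoverable.
    """
    if random.random() < p:
        return true_answer
    return random.random() < 0.5

def estimate_true_rate(reports, p: float = 0.75) -> float:
    """Invert the noise: E[reported rate] = p*q + (1-p)*0.5, solved for q."""
    reported_rate = sum(reports) / len(reports)
    return (reported_rate - (1 - p) * 0.5) / p

# Simulate 100,000 users, 30% of whom truly answer "yes".
random.seed(0)
truths = [random.random() < 0.3 for _ in range(100_000)]
reports = [randomized_response(t) for t in truths]
print(round(estimate_true_rate(reports), 2))  # close to 0.30
```

The trade-off is exactly the one described above: the analyst learns the overall pattern (roughly 30% said “yes”) without being able to tell whether any particular person’s recorded answer was genuine.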

In addition, this type of technology has lately been adopted by banks, police departments, and employers. Banks can now use algorithms to decide who gets a loan, or to predict who will fail to repay one. Some police departments have fed years of criminal activity and arrest records into “predictive policing” algorithms that have sometimes sent officers back to patrol the same neighbourhoods. This has led to the over-policing of certain areas, which disproportionately affects racial and ethnic minorities, as well as working-class people. Furthermore, employers can now use AI algorithms to predict who would be the best-suited candidate for a given job. In his New York Times article, Maximilian Kasy said: “You might not know it, but an artificial intelligence algorithm used to screen applicants has decided you are too risky. Maybe it inferred you wouldn’t fit the company culture or you’re likely to behave in some way later on that might cause friction (such as joining a union or starting a family).” These algorithms automate the process of choosing the most profitable individual for a position. From the scant details it has about you (your CV, for example), such a system can predict how you will behave at work based on the patterns of people similar to you, extracted from data protected by differential privacy.


Therefore, privacy has grown to be a collective issue, rather than an individual one. Giving up your privacy means that organisations can use your data to automate different processes that aim at analysing people similar to you. This is a process that is inherently dehumanising, as it puts your qualities and achievements in the same pool as everyone else’s in order to monitor and measure productivity. Through these processes, each individual stops being themselves. Human beings become patterns and sets of behaviours; they become numbers.

Not only is it dehumanising, this practice is also rooted in aims to control and exploit. Adorno and Horkheimer, in their book Dialectic of Enlightenment, talked about the rise of science and knowledge as the “disenchantment of the world”. To do science and to acquire knowledge is to understand what is around us. Once you understand something, you can control it, and exploit it. The data collected by these organisations allows them to understand us, and consequently, control us and exploit us.

Perhaps we are not living in a completely post-privacy world yet, but we are close to it. Individual action is no longer enough. Privacy has become a collective problem, and the only way for society to fight back is to exert collective control over our data: to determine for what purpose, and for whose benefit, it should be used. It has to become a political issue. It is also crucial to understand how this technology works and how these organisations function. This means not only following the news on this technology and keeping up to date with its recent developments, but also being aware of how these companies use the revenue generated from our data and who directly benefits from it. After all, the more we understand them, the more we will be able to control them too.
