Unveiling the Shadows

There’s nothing new under the sun. Back in the late 1700s, philosopher and economist Jeremy Bentham came up with the idea of the “panopticon” prison. Put simply, this was a building in which all the cells had one open wall facing a central watch tower. While the prisoners couldn’t see into the tower, the guards could observe every inmate at any hour of the day or night.

Today, Bentham’s panopticon has become a metaphor for a new kind of ubiquitous surveillance based on a range of digital technologies. Most of us are aware that when we browse the web, shop online, or post on social media our actions are recorded and analysed. By and large, the data collected in this way are used for relatively benign purposes such as targeting ads appropriately, predicting demand or improving services.

But that’s just the tip of an ever-expanding iceberg. The data we generate unthinkingly every time we log on can affect how much we pay for goods and services. For instance, if an airline’s predictive analytics algorithm spots patterns in a customer’s activity suggesting they are likely to become a frequent flyer, the result might be special discounts. Equally, banks use a range of data to assess credit risk, with loans and mortgages priced accordingly.

But the panopticon was really about social control – or, to put it another way, observation as a way to shape behaviour. That principle is already being applied widely. In the workplace, activity is measured using digital tools. Most city centres are festooned with CCTV. In the future, facial recognition cameras may become a commonplace means of deterring crime or excluding potential perpetrators from certain spaces.

All of this raises some genuinely tricky ethical questions, particularly around how the utility of monitoring and tracking tools should be balanced against an individual’s right to privacy.

In the Workplace

In some circumstances, you could argue that privacy is not really an issue. For instance, in 2022, the New York Times reported that eight of the 10 largest private US companies tracked the productivity of their workers, and that 67 per cent of businesses with more than 500 people on the payroll were using monitoring software.

This kind of technology can take a number of forms. In warehouses, wearable technology enables employers to see how quickly and efficiently their staff are working and where they are in the building. Meanwhile, software measuring keystrokes can assess the productivity of staff using desktop PCs and laptops. This kind of keystroke technology was widely deployed during the pandemic when home working was the order of the day. 

Employers have always monitored productivity. As reported recently in the Washington Post, the idea of assigning tasks and creating metrics that could be measured goes back to Frederick Winslow Taylor, who published a book on “scientific management” in 1911. Today, employers are simply adapting his principles for the digital age.

So the potential problem here is not necessarily one of privacy. Rather, it’s a question of whether such close surveillance is effective in terms of motivating employees.

On The Street

Meanwhile, the streets of most cities are now monitored constantly by CCTV cameras. Essentially, the tower that guards a panopticon prison has been replaced by a control room where security personnel can monitor activity in a designated area. There are some privacy issues here, but for the most part the presence of CCTV is accepted as a means to cut crime. However, cutting-edge technology is changing the game. Facial recognition raises the prospect that individuals might be arrested or told to leave based on personalised data stored by the operators, regardless of whether that information is accurate.

Meanwhile, agencies such as GCHQ monitor internet and phone traffic, using algorithms to detect patterns of activity that could point to criminal or terrorist activity.

And In the Home

By now, most of us have probably made our peace with the fact that social media platforms crunch data to target ads and that Amazon knows everything there is to know about our shopping habits. We are also aware that our browsers collect cookies, which allow ads to be tailored to the individual.

But increasingly, data is being drawn from a much broader range of touchpoints: personal assistants such as Alexa, sensors in fridges, and systems in the cars we drive. It’s voluntary at the moment, but in the future, when all cars are computers on wheels, insurers may know pretty much everything about the way we drive and price their policies accordingly. Meanwhile, smartwatches can feed back information on everything from our location to our health. As the Guardian newspaper pointed out, the Internet of Things will change surveillance substantially, with most of the information flowing into “corporate reservoirs.”

The usage of this data can be controversial, to say the least.

All of which raises the question of whether the increased use of tracking and monitoring is a good thing or something to be feared.

For his part, Jeremy Bentham was a Utilitarian – a philosophy holding that actions are good insofar as they promote the greatest happiness for the greatest number. And tracking and monitoring certainly offer benefits. Smartwatch data can help make us healthier. Information on driving patterns can make roads safer. Facial recognition has probably prevented terrorist incidents. Equally, though, the information can be misused.

But it’s important that citizens have the right to know how, when, and for what purpose data is being collected. We should all know who the watchers are and have the right to opt out. Transparency is key.