In London’s Piccadilly Circus, an advertising screen the size of two basketball courts detects the ages, genders and moods of passers-by and responds by displaying targeted ads. The process uses facial-recognition cameras hidden behind the Piccadilly Lights billboards to pick out faces in the crowd and assess which adverts might be of interest. To some observers, it’s just a fun use of modern technology, but to a significant and growing number of people the facial-recognition cameras represent a chilling shift towards invasive advertising that damages privacy. Meanwhile, the iPhone X is using the same technology on a vastly smaller scale. The phone contains an unlocking function called Face ID, which is also able to track the owner’s attention and facial expressions. It could be used to study how faces respond to ads, or how long owners pay attention to social media feeds and YouTube videos.
Here, in macrocosm at Piccadilly Circus and in microcosm on the iPhone X, is the future of advertising: facial recognition enabling the targeting of individuals and social groups. There are a number of consequences. For one thing, everyone will be classified into narrow segments with their own adverts, so the shared experience of recognising the same adverts – whether from Nike, McDonald’s or Coca-Cola – will no longer happen. More seriously, the cameras threaten to undermine our privacy.
Aram Sinnreich is a US professor of communications who has been campaigning for more regulation of the technologies. He argues that there is no limit to the “potentially devastating” transformative impact of “ubiquitous” digital surveillance. “It alters the very fabric of our social and personal lives, and changes the processes by which we build identities, develop relationships, participate in the civic process, and express intimacy,” says Sinnreich, Chair of the School of Communication at Washington’s American University.
Sinnreich is uncomfortable with the Piccadilly Lights display despite the reassurances about privacy from screen owner Landsec. The property and investment company promises images are held for micro-seconds and never stored. Individuals cannot be identified and every face is analysed as if the camera has never seen it before. But these types of protestations do not convince Sinnreich. “I can’t comment on this individual example, but systems using the cameras are becoming more common and they are rarely open source and freely auditable, which means the claims to discard the data are ultimately a matter of trust rather than verification,” he says. “Even if the surveillance operator is acting in good faith, there’s no guarantee the system won’t be used by third parties – state, corporate or criminal – who wish to collect and use the data for their own purposes. Finally, there is a cultural dimension to this – the normalisation of ubiquitous surveillance – that has its own deleterious social effects independently of the use of the data.”
Some big brands have already used facial-recognition technology in their advertising, but the full potential has not been realised yet. In 2012, Nike’s “Free Face” campaign allowed people to control the movements of virtual shoes with facial expressions, demonstrating the flexibility of Nike footwear. The app captured users’ facial expressions via webcam. In 2013, Virgin launched a “Blinkwashing” campaign, allowing viewers to change storylines in a video with the blink of an eye. These fun and creative ad campaigns were hits with the public. But the technology has become much more accurate and these isolated examples are just the start of a revolution in advertising, Sinnreich believes.
One likely use for the cameras will be to enable retail stores to classify their customers into demographic segments. In a similar way to the Piccadilly Circus displays, cameras will assess the gender, age, race and mood of customers, then project targeted adverts on to nearby screens. Professor Kevin Bowyer, a biometrics expert at the University of Notre Dame, suspects many US stores are already using the technology in this fashion without telling customers. He compares it with the scene in Steven Spielberg’s futuristic film Minority Report, when John Anderton (Tom Cruise) strolls through a mall and is bombarded with ads mentioning his name. A camera scan of his eyes is enough to target Anderton with ads for Lexus, Guinness and American Express. “This is no longer the future. It’s happening right now,” Bowyer says.
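The targeting loop described above can be sketched as a simple rule: estimate a shopper's demographics, then map them to an ad. The segments, thresholds and ad names below are invented purely for illustration and are not drawn from any real deployment.

```python
# Illustrative only: a rule-based mapping from a camera's coarse
# demographic estimate to an ad slot. All segments, thresholds and
# ad names here are hypothetical.

def choose_ad(age: int, gender: str, mood: str) -> str:
    """Pick an ad for a detected face from coarse demographic estimates."""
    if mood == "happy" and age < 30:
        return "energy-drink"
    if gender == "female" and 25 <= age < 45:
        return "athleisure"
    if age >= 45:
        return "financial-services"
    return "generic-brand"

# A young, happy face and an older, neutral one get different ads.
print(choose_ad(22, "male", "happy"))
print(choose_ad(50, "female", "neutral"))
```

Real systems would replace these hand-written rules with learned models, but the flow – detect, classify, target – is the same.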
Bowyer says retail stores won’t have to keep quiet about the cameras for long as the public will quickly get used to the idea. He points out that people have largely accepted how similar technologies track them between websites and target them with ads. Although he anticipates a backlash to the ubiquity of cameras, he expects it will be fleeting. “Judging by the experiences of Facebook and social media there won’t be that much of a backlash. Everyone knows what Facebook is doing, but will Facebook fall off the map? No. Experience shows that people will trade off a lot of privacy for a little bit of convenience,” Bowyer says.
A survey from First Insight supports his view about the public’s willingness to accept the trade-off. The research found that, although 75% of consumers said they would not shop at a store using facial recognition technology for marketing purposes, more than half (55%) would be open to facial recognition if they knew there was a benefit involved, such as getting a discount.
Surveillance cameras, including the ones used for advertising, are rapidly becoming an intrinsic part of modern life. There are nearly 250 million surveillance cameras installed worldwide, according to IHS Insight, with revenues tripling from US$14 billion in 2013 to US$42 billion in 2020. Society also has to contend with surreptitious recording technologies, such as drone cameras, Google Glass and Snapchat video glasses. Sinnreich says our faces have become “bar codes”, with cameras able to identify facial emotions more accurately than humans. Disturbingly, they can discern deep information about personal lives without our consent, or even our awareness of what’s happening.
“Privacy seems to be dying like the proverbial frog who boils to death because the temperature changes so gradually,” says Sinnreich. “Even when the inevitable kickback comes, it will probably be short-lived, concentrated among elite and well-educated consumers, and ineffective. The combination of apparent convenience and the impossibility of resistance will most likely be enough to prevent large swaths of the population from engaging in organised and sustained resistance.”
Sinnreich believes it will soon be practically impossible to travel around a city without being observed and assessed by cameras. Airport security agents use facial recognition to identify fake passports and casinos use it to spot cheats. Even churches in the US are using facial-recognition cameras made by a company called Churchix to identify who has, or has not, attended church. In Hollywood, Disney experimented last year with using hidden infra-red cameras in movie theatres to assess cinema-goers’ emotional reactions. The Disney cameras relied on deep learning using neural networks to recognise facial expressions. After observing an audience member for a few minutes, the cameras reliably predicted their facial expressions for the rest of the movie. The data set ended up with 16 million facial readings from 3,179 viewers.
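Disney's actual system used deep neural networks, but the underlying idea – observe a viewer briefly, then predict their remaining reactions – can be illustrated with a much simpler, invented stand-in: take the audience-wide average reaction per minute, measure how far the viewer deviates from it during the observed minutes, and project that offset forward.

```python
# Toy stand-in for the observe-then-predict idea behind Disney's
# audience modelling. Disney's real system used deep neural networks;
# this mean-plus-offset model is invented here purely for illustration.

def predict_rest(audience_avg, viewer_first_k):
    """Predict a viewer's later per-minute expression scores (0..1).

    audience_avg   -- average 'smile' score per minute, whole audience
    viewer_first_k -- this viewer's observed scores, first k minutes
    """
    k = len(viewer_first_k)
    # How much more (or less) expressive this viewer is than average.
    offset = sum(v - a for v, a in zip(viewer_first_k, audience_avg[:k])) / k
    # Predict the remaining minutes as average + offset, clipped to [0, 1].
    return [min(1.0, max(0.0, a + offset)) for a in audience_avg[k:]]

audience = [0.2, 0.3, 0.5, 0.8, 0.4]   # audience-average smile per minute
viewer = [0.4, 0.5, 0.7]               # this viewer, first 3 minutes
print(predict_rest(audience, viewer))  # predictions for minutes 4 and 5
```

With millions of readings per screening, even crude models of this kind become statistically reliable, which is what made the Disney data set so predictive.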
Another American corporate giant, Amazon, has developed a technology called Rekognition, which can identify, track and analyse people’s faces in real time and recognise up to 100 people in a single image. Rekognition is able to refer to databases featuring tens of millions of faces. “Law enforcement can accurately flag matches and speed up their investigation,” Amazon claims. Orlando Police could not resist trialling the Rekognition programme, but in May this year they had to defend themselves against criticism from the non-profit American Civil Liberties Union. The ACLU said the technology “can be readily used to violate civil liberties and civil rights” and that Black Lives Matter activists, undocumented immigrants and political protesters would be seen as “fair game”. Orlando Police Chief John Mina denied the accusations, but the doubts persist.
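To give a sense of what such an API exposes, Rekognition's documented DetectFaces operation returns per-face attributes including an estimated age range, gender and emotions. The response below is a trimmed, hypothetical example of that documented shape – the numbers are invented, and a real call would go through the AWS SDK (e.g. boto3's `detect_faces`) rather than a hard-coded dictionary.

```python
# A trimmed, hypothetical example of the response shape documented for
# Amazon Rekognition's DetectFaces API. All numbers are invented; a
# real call would be e.g.
#   boto3.client("rekognition").detect_faces(Image=..., Attributes=["ALL"])
sample_response = {
    "FaceDetails": [
        {
            "AgeRange": {"Low": 24, "High": 32},
            "Gender": {"Value": "Female", "Confidence": 99.1},
            "Emotions": [
                {"Type": "HAPPY", "Confidence": 87.5},
                {"Type": "CALM", "Confidence": 9.2},
            ],
        }
    ]
}

def summarise_faces(response):
    """Reduce each detected face to (midpoint age, gender, top emotion)."""
    out = []
    for face in response["FaceDetails"]:
        age = (face["AgeRange"]["Low"] + face["AgeRange"]["High"]) // 2
        top = max(face["Emotions"], key=lambda e: e["Confidence"])
        out.append((age, face["Gender"]["Value"], top["Type"]))
    return out

print(summarise_faces(sample_response))
```

A few lines of post-processing like this are all it takes to turn raw API output into the demographic labels that drive targeted advertising – which is precisely what concerns the ACLU when the same output drives policing.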
In the US, the public has even less protection than in the EU, where GDPR rules came into effect at the end of May. “Because of the prohibitions on storing and sharing data in the GDPR, it will have a somewhat ameliorating effect on the privacy threats of facial recognition. But there’s far more that can be done to protect EU citizens and consumers. In the US, where ‘free speech’ is sometimes interpreted by the courts to mean unchecked power over society by corporate interests, I am even more concerned,” Sinnreich says.
In practice, applying GDPR’s rules to different technologies will not be straightforward. Sinnreich says there are so many examples of facial recognition that it becomes next to impossible to regulate every specific case. “Whether examples such as the Piccadilly Lights are GDPR compliant is a wholly open question and, ultimately, it will be up to the courts to decide,” he says.
Companies may also find sly ways to get consumers to opt in to the use of facial-recognition cameras. Because of GDPR, Facebook is seeking explicit consent for targeted advertising, storage of sensitive information and the use of facial-recognition cameras “to detect which pictures users are in and help protect them against strangers using their photos”. But although Facebook’s approach appears open and reasonable, it has been much criticised.
TechCrunch’s Josh Constine wrote that Facebook was complying with the letter of GDPR law, but not its spirit. “The subtly pushy designs seem intended to steer people away from changing their defaults in ways that could hamper Facebook’s mission and business,” he wrote. Critics say that Facebook has made it easy to simply click “no” to decline the new permissions, but far more onerous to manage data settings. A large majority of people will simply be too lazy to opt out of allowing facial-recognition, or will not realise the full implications. “The extent and power of facial recognition are drastically underappreciated by the general public,” says Sinnreich.