Last week’s news included some problematic stories for the media industry: ‘Advertisers use Google phone apps to eavesdrop on TV viewers’ (The Independent); ‘Hundreds of smartphone apps are monitoring users through their microphones’ (The Independent); and, perhaps most worryingly, ‘Some mobile games are listening to what children watch’ (Engadget).

The original source of this troubling news is a recent New York Times story concerning American software company Alphonso, whose technology is embedded in thousands of apps, mainly on Android but also in a handful of Apple iOS apps. The software reportedly listens to audio feeds (though, according to the company, specifically not to voices) and can record content, time and location data, all potentially available to advertisers buying programmatic mobile display media.

In their defence, Alphonso point out that they are opt-in compliant: the recording is spelled out in app descriptions and in the relevant privacy policies. But does the public truly understand what they’re agreeing to? When you accept the terms of a game you’ve just downloaded, do you realise that it will use your phone’s microphone to listen to the TV, or to whatever other media you’re consuming at the same time?

More than ever, properly targeted marketing is being fuelled by the data trail we all leave behind us. Understandably, this is encouraging media owners to seek out richer data about potential customers. But is the thirst for richer, more granular data about individuals and their media habits taking the industry to places it shouldn’t go? As consumers become more aware of the volume and value of the information they are giving away, data privacy is likely to be a major topic in media through 2018. A broader public discourse around the topic will generate greater awareness, provoke unease and ultimately engender wariness and anger as deeper tracking and not-so-transparent solutions come to light.

The controversy isn’t new, but we don’t seem to have learned our lesson. In 2016 the FTC issued warnings to app developers who were using the SilverPush software, which had similar monitoring capabilities.

The desire for more data is understandable. It allows us to create more effective media strategies, buy more accurately and then report with greater certainty on what has worked. But the consequences of data acquisition for those whose information is being used must also be understood, and it is imperative that the spirit as well as the letter of all relevant legislation is followed by everybody involved. Indeed, while legislation lags behind the rapid changes in technology, a rigorous code of conduct should be upheld. It is vital that the public feel they can trust brands, and headlines such as these can only be detrimental to that trust. As the public become more aware of their personal habits, behaviours and lives being recorded and used, often without their express consent, 2018 could very well be the year of a backlash, or at least the very start of one.

As the available pool of data grows, agencies need to ask deeper questions of the data offered by solution providers. Advertisers carrying out their own in-house programmatic trading must do the same. But even with all the right questions being asked and all the “correct” answers being provided in good faith, there remains the possibility of a third-party provider higher up the supply chain doing something which, although perfectly legal, makes many advertisers and their customers feel uncomfortable.

Eventually a refusal to target or track people in this way might become a USP in itself. We might see brands boasting that they don’t use in-depth or covert targeting data, vying with their rivals for the trust of the public. That indeed would be an interesting outcome.

While we’re all preparing for the arrival of the GDPR, perhaps now is an apposite moment to ensure that we’re morally unassailable in ways that go beyond merely adhering to the latest data legislation. As an industry we need to question just how far we are prepared to go before the public comes to realise the degree to which the technology in their homes, and the devices they carry with them everywhere, is vulnerable to abuse, and that the minutiae of their lives are being recorded and monetised by faceless bodies half a world away.

There is little doubt that 2018 will see greater emphasis on data, a growth in concerns over privacy and appropriate data management, and a legislative landscape that is slow to keep up with technological change. The Alphonso example shows that we’re moving beyond the usual tracking of location, searches, clicks, views and engagement, and leveraging wholly different types of data such as sound. As the hegemony of the screen in the digital space is eroded by the rise of “voice” and the growth of the internet of things, data capture is likely to remain at the forefront of most media owners’ agendas; after all, this is how most online platforms are funded. On this basis, the greater intimacy of voice in particular will likely bring about data capture opportunities that many find intrusive. This will form the battleground to be fought over during the next twelve months.

Orwellian? It doesn’t have to be. But the industry does need to police itself better in the coming months and years, and try to stop fires starting through the use of effective codes of conduct and better compliance. By constantly questioning methods of data collection and refusing to deal in anything that is less than wholly transparent, we can hope to hang on to our most important asset: trust.

Walt Disney’s maxim still holds true: “people spend money when and where they feel good”. It is for this reason that, when it comes to data protection, holding the moral high ground will be the most profitable position over the long term.

Jon Clarke
Director of Innovation