Data privacy has been a huge topic in the digital landscape in the past year, and it’s not going away anytime soon.
Users are becoming increasingly concerned with the amount of data collected and used by tech companies for ad targeting purposes, and are calling for better measures to protect privacy online.
The concerns here are very real — Facebook’s Cambridge Analytica scandal is proof of the damage that can be done when data is misused and trust gets broken.
However, at the end of the day, many of the websites we love exist due to ad revenue. Ad-supported content is what makes these websites free to use, and advanced targeting helps serve users more relevant ads, which keeps advertisers coming back — essentially keeping the entire ecosystem afloat.
Furthermore, the majority of user concerns surrounding data collection and ad targeting stem from a lack of visibility into what's going on behind the curtain: How did I get selected for this ad? Who's paying for this, and why do they want me to see it?
Beyond just visibility, users want to be able to control what data can be used for ad targeting. There are countless examples of targeted ads that have resulted in unforeseen consequences: members of the LGBT community have been outed at work by ads targeted to them on work computers, families have learned about unplanned pregnancies from an algorithm that predicts "pregnancy probability" based on browsing behavior, and many others.
Clearly, something has to change if we want to preserve free access to the internet and the ad-supported content we love while also finding a better way to control how and why our information is being used.
Well, Google thinks so too.
In a blog post published Thursday, Google's Senior Product Manager of User Trust and Privacy Chetna Bindra discussed the growing tension between personalized ads and the user data that powers them — and Google's mission to help remedy it.
In the post, Bindra notes that until privacy practices are standardized across all browsers, websites, and advertisers, the problem will never get better.
To kickstart action, the post links out to an in-depth proposal by Google that calls for publishers, advertisers, tech companies, and users to come together and reach a solution that works for all parties.
Google’s proposal document is not a set of new standards, but rather, an invitation for key players to come together and have further discussions on what can be done to improve the health of the digital ads industry.
Here's why standardization is so important: when every platform, browser, or website has its own system for allowing or disallowing cookie tracking and data sharing, it only further confuses users about what's happening with their data and how they can best stay safe online.
Furthermore, Google points out that when browsers choose to block cookie tracking entirely, that still hurts the overall ecosystem because it results in publishers not getting the ad revenue from their traffic, which can lead them to produce less content for the end user.
Additionally, without a standardized system in place, companies are finding ways around cookie tracking blocks, which, as Google states, causes more problems in the long run:
“Broad cookie restrictions have led some industry participants to use workarounds like fingerprinting, an opaque tracking technique that bypasses user choice and doesn’t allow reasonable transparency or control. Adoption of such workarounds represents a step back for user privacy, not a step forward.”
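To see why fingerprinting "bypasses user choice," it helps to understand the basic mechanism: stable browser attributes (user agent, screen resolution, timezone, language, and so on) are combined and hashed into an identifier that persists even after a user clears their cookies. Here's a minimal, hypothetical sketch in Python — real fingerprinting runs in the browser and draws on far more signals, and the attribute names below are illustrative, not from Google's proposal:

```python
import hashlib

def fingerprint(attributes: dict) -> str:
    """Hash a set of stable browser attributes into one identifier."""
    # Sort keys so the same attributes always produce the same string
    canonical = "|".join(f"{k}={attributes[k]}" for k in sorted(attributes))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]

# Two visits from the same browser configuration yield the same ID,
# even if the user deleted all cookies between visits.
browser = {
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64)",
    "screen": "1920x1080",
    "timezone": "UTC-5",
    "language": "en-US",
}
visit_a = fingerprint(browser)
visit_b = fingerprint(browser)  # same ID as visit_a — no cookie required
```

Because the identifier is derived from the device itself rather than stored on it, there's nothing for the user to clear or opt out of — which is exactly the "opaque tracking" Google's post objects to.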
Instead, Google invites others in the industry to come together and create an ad-based framework guided by three key principles:
First, users should have transparency. They should be able to easily see and understand how their data is being collected and used for ads.
Second, users should have choice. Their choices about how they experience the web should be respected and any attempts to bypass those choices should be prevented.
And third, users should have control. They should have the ability to adjust how their data is collected and used to tailor the ads they see, including whether those ads are personalized at all.
The proposal then goes on to outline how the ecosystem can change to ensure that users have transparency, choice and control for ads. The document breaks this down into three key components that Google hopes will serve as a “baseline” for a conversation with others in the industry.
Proposed industry standards for ad transparency, choice, and control
For all online activity, Google proposes that a user should easily be able to see the following information:
What data is being collected about them, by whom, and for what reason
Who is responsible for each ad a user sees
What caused an ad to appear for that specific user
Specifically, Google proposes that this information be available at specific levels of web activity including:
Broader ecosystem for research purposes
Additionally, Google proposes stricter standards against companies that use privacy-invasive targeting methods like fingerprinting, along with consequences for those that don't comply.
Ultimately, this proposal was created to initiate a larger conversation about the importance of user privacy in digital advertising and what all the key players in the system can do to make it a reality.
Of course, that conversation can’t happen without adequate feedback from everyone involved (which is a LOT of people). So, Google created a feedback document to make it easier to collect and compile all thoughts on the matter.
Based on feedback, Google plans to launch an early, open-source browser extension that will show more detailed information about ads to give users the transparency, choice, and control they need.