Ex-UM privacy chief lifts lid: Google has ‘captured’ trade associations and holdcos, personalisation-precision a ‘fallacy’ based on ‘garbage’ data reaching ‘fake people’
What you need to know:
- Arielle Garcia was the US privacy lead at IPG-owned agency UM. She “believed the industry narrative” that privacy and personalisation could be balanced – and brands could gain commercially from targeting responsibly with data amassed and processed by a sprawling supply chain.
- But that all came to a head last year when a report by Adalytics suggested Google was targeting kids. Google flatly rejected that allegation but for Garcia – with UM clients seemingly affected – it was time to get out.
- Now she’s urging marketers to wise up to the personalisation-precision-privacy “fallacy” that exists, in her view, to enrich big tech and the intermediaries increasingly complicit in that transfer of budgets.
- Using her own segmented data from adtechs and brokers she’s shining a light on just how bad the data being used to prop up a $700bn industry actually is.
- Garcia says the large agency networks have been shorn of objectivity due to perverse incentives, with the big industry associations likewise paid off.
- She thinks big tech is now manoeuvring to hedge against antitrust action by pushing deeper into black box AI-powered ad placement and search.
- That amounts to an extinction-level event for publishers, per Garcia. The “good news” is that they must now “re-orientate to quality” to survive.
- But that requires marketers to refuse black box, non-transparent products and services.
- Recent history suggests such a shift might prove to be a challenge.
- Get the unfiltered version via the podcast.
Google has captured the trade associations … They buy their way into every room … If you have marketers looking to major trade associations for guidance, and they're saying ‘nothing to see here’, of course that's going to lull marketers into a false sense of security.
Data dump
Just how accurate is the user data being traded by advertisers, agencies and data firms in the $700bn global digital advertising system? The former Chief Privacy Officer of UM in the US, Arielle Garcia, is exasperated – it’s junk, she says.
To prove it Garcia recently accessed her profile from an adtech vendor and found she was in “500 different audience segments across seven different data brokers, and what I saw was just a bunch of contradictory, useless, garbage data.” That is, she was simultaneously a man and a woman; worked in food service and agriculture, yet was also a defence contractor and an engineer; and was both below the poverty threshold and classified as high income. Trading these data junk bonds allows for a host of data “premiums” to be applied by various intermediaries in the process of executing digital advertising campaigns.
The problem is, marketers are paying for that “garbage”.
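To make the contradiction concrete, here is a minimal sketch of how mutually exclusive segments could be flagged in a profile export. The broker names, segment labels and portal format below are invented for illustration – modelled on the contradictions Garcia describes, not drawn from her actual data or any real vendor’s API.

```python
# Illustrative only: broker names and segment labels are invented, modelled on
# the contradictions Garcia describes – not real broker data or a vendor API.

# Pairs of segments that cannot sensibly describe the same person.
MUTUALLY_EXCLUSIVE = [
    ("gender:male", "gender:female"),
    ("income:below_poverty_line", "income:high_income"),
]

# One made-up profile as it might appear in a data-portal export:
# one person, several brokers, overlapping and contradictory segments.
profile = {
    "broker_a": ["gender:male", "occupation:food_service", "income:high_income"],
    "broker_b": ["gender:female", "occupation:defence_contractor"],
    "broker_c": ["income:below_poverty_line", "occupation:engineer"],
}

def find_contradictions(profile: dict[str, list[str]]) -> list[tuple[str, str]]:
    """Return segment pairs attached to the same person that contradict each other."""
    all_segments = {seg for segs in profile.values() for seg in segs}
    return [(a, b) for a, b in MUTUALLY_EXCLUSIVE
            if a in all_segments and b in all_segments]

for a, b in find_contradictions(profile):
    print(f"Contradiction: '{a}' vs '{b}' attached to the same profile")
```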
She says big agency groups have likewise lost their way. It’s not just principal media models (arbitrage) that are problematic, but the fact most of them are incentivised to hit targets by the likes of Google and it distorts the market – “it’s literally about their objectivity,” she says.
Agencies are not the only ones. “Google has captured the trade associations … they buy their way into every room,” she asserts, with marketers then lulled “into a false sense of security”. Garcia argues Google’s manoeuvres with PMax – its AI-powered “just trust us” media placement product for its owned media assets like YouTube – “gives Google the ability to opaquely use their black box algorithms to move money wherever they want”, which could prove handy ahead of impending antitrust trials.
Meanwhile, AI Overviews in search will “massacre traffic” for publishers who she says must stop forcing people to log in and refocus on quality.
For marketers, Garcia urges a “reorientation around people” and “prioritising quality over the illusion of precision.” How? “Demand transparency” and hold agencies accountable. “No one messes with the client that audits.”
For everybody else Garcia has one ask – work out if YouTube is “covertly tracking people” via a little-known identifier.
Marketers have been tricked into believing that precision and personalisation equals performance. There's so much that's wrong with that, and the inaccuracy of the data is only one piece of that fallacy.
Awkward customer
Prepping for a conference talk spurred Garcia’s dive into her own segmentation. Back when she was UM’s privacy chief, Garcia would usually use such talks to “plead” for marketers to use data responsibly. But now at a non-profit watchdog, her approach is somewhat pointier. “I can just say ‘the data is garbage’,” says Garcia. So she set out to prove it.
“Only a handful of ad tech platforms let you do this, but one of them had a portal online where you’re able to access your data, and in this case, that meant the audience segments you’re in.”
Finding herself in 500 segments – most of them contradictory and laughably wrong – was “absolutely wild”, per Garcia. “It shows the reality that data providers are incentivised to have people in as many segments as possible. The more segments [people] are in, the more likely they are to be able to monetise them … They’re selling this garbage data back and forth to one another. It’s actually pretty fascinating.”
The upshot is that vast ad budgets are often being traded on a mountain of crap.
“I call that the precision illusion. Marketers have been tricked into believing that precision and personalisation equals performance,” she says. “There’s so much that’s wrong with that, and the inaccuracy of the data is only one piece of that fallacy.”
The reason the precision illusion exists, “is because of how many commercial interests are tied up in it”, per Garcia.
“Programmatic was fundamentally supposed to be a way to more efficiently connect demand and supply. Publishers were supposed to be able to more easily and readily monetise more of their inventory. For marketers, ‘we can reach our audience wherever they are’ was the big promise. But in reality, it’s just emerged as a way to prop up this $700 billion industry of intermediaries that rely on that idea.”
Either way, the conference talk – at a publisher forum – caused a reaction. “People were openly laughing as I walked through the data … Someone came up to me and said, ‘I’m so glad that they allowed you to speak here’, which shows you the culture of silence that is the tenor of the industry.”
It's no longer about providing service and delivering value to clients. It's about playing pinball with their ad budgets.
Ethics gradient
In 2020, Garcia launched UM’s privacy and responsibility practice. Then last year, after a decade with the holdco, she left to head up intelligence at US non-profit watchdog Check My Ads Institute. Since then she’s been on a mission to shine a light on where the ad industry is being bent out of shape.
“What I believed was the industry narrative. I believed there was this balance to be struck between privacy and personalisation and that there were actual commercial imperatives for brands that [took] a thoughtful journey to being responsible with data.”
But the more she saw, the less that narrative rang true.
“I recognised somewhere along the way that none of this was about what was right for brands. This was all about propping up commercial models,” says Garcia.
Holdco approaches, she suggests, are a big part of the problem.
“It’s no longer about providing service and delivering value to clients. It’s about playing pinball with their ad budgets,” says Garcia, charting the evolution from trading desks to barter models to principal trading models, through to incentive deals with big platforms.
“With this shift in commercial model, what happens is, if you’re an account lead, and your client thinks you’re phenomenal, but you haven’t hit the targets you need to hit for all of the different offerings that exist within the agency’s walls … and those other offerings are where the margin’s at, then guess what? You’re not keeping your job. So that fundamentally changed the role of agencies,” she suggests.
Today, “for holding companies, agency is a misnomer, because they are certainly not prioritising the interests of clients”, adds Garcia.
The holdcos would undoubtedly disagree. But Garcia says misaligned objectives and incentives go beyond principal-based trading models (AKA arbitrage), which agencies use to make higher margins on media and so fund other services that a procurement-driven push for lowest-cost media can otherwise preclude. (Some analysts defend that approach, suggesting marketers “sometimes prefer arbitrage” if it means they can get the services they need and please procurement.)
But Garcia thinks competing objectives have gone too far – with platforms complicit.
“Sometimes it’s about just establishing certain spend commitments that they need to meet with vendors or driving adoption of particular products. Google has what they call advertising incentives, which they offer to agencies to promote adoption of certain products. So … that’s going to influence the agency’s willingness to be critical of those vendors where they have targets to hit, and that is an entirely different thing,” she says. “It’s not just about the inventory. It’s literally about their objectivity.”
I have yet to see any holding company executives speak ill of Performance Max – why is that? I always thought that we needed to balance personalisation and precision and privacy. But what I realised was … brands are not benefiting from this. This is all a complete farce, and no one is willing to say it out loud because they don't want to piss off Google.
Performance problems
Left unchecked, a lack of objectivity creates further problems. Garcia thinks Google’s Performance Max product is a standout example. PMax, in simple terms, does the planning and buying automatically using AI. There is little transparency on where the buys run, or the rationale for ending up there. “It’s their ‘just trust us’ product,” per Garcia.
She says Adalytics’ ‘Made for Kids’ report of last year – which ultimately played a role in her decision to exit UM after she spoke out – underlines why a lack of transparency is undermining the entire industry.
“What the Adalytics report found was that YouTube was serving adult-targeted ads on kids videos. Let’s say you’re a bank. You are targeting adults, and your ads are running on Blippi, a kids’ show. Now, particularly problematic is that when those ads were part of a Performance Max campaign … Brands have less control over where their ads run, and also get less transparency.
“For example, you can’t see granular reporting on Google’s owned and operated properties like YouTube, so you could not see what channels or videos your ads were being served on.
“So you’re that bank, you’re running a Performance Max campaign, you are targeting adults or you think you are [but it could be] your ads are being served on these kids videos – you don’t know. Toddlers tend to have sticky fingers, so they click the ads by accident, and now, congratulations, your campaign has learned to optimise to sticky fingered toddler clicks.”
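The failure mode Garcia describes is easy to reproduce in miniature. The sketch below is purely illustrative – the placement names and click rates are invented, and no real ad platform works this simply – but it shows how an optimiser that counts clicks without any quality or audience signal will shift impressions toward whichever placement produces the most clicks, accidental or otherwise.

```python
# Purely illustrative: placement names and click rates are invented, and no
# real ad platform works this simply. The point is the feedback loop – a
# click-counting optimiser cannot tell intentional clicks from accidental ones.

import random

random.seed(42)

# Hypothetical underlying click rates: the kids' channel "wins" only because
# toddlers tap ads by accident, not because the target audience is responding.
TRUE_CLICK_RATE = {
    "finance_news_site": 0.004,
    "kids_video_channel": 0.03,  # inflated by accidental taps
}

impressions = {p: 0 for p in TRUE_CLICK_RATE}
clicks = {p: 0 for p in TRUE_CLICK_RATE}

def serve(placement: str) -> None:
    """Serve one impression and record whether it produced a click."""
    impressions[placement] += 1
    if random.random() < TRUE_CLICK_RATE[placement]:
        clicks[placement] += 1

# Explore each placement briefly...
for placement in TRUE_CLICK_RATE:
    for _ in range(500):
        serve(placement)

# ...then greedily pour the remaining budget into the highest observed CTR.
best = max(TRUE_CLICK_RATE, key=lambda p: clicks[p] / impressions[p])
for _ in range(10_000):
    serve(best)

for p, n in impressions.items():
    print(f"{p}: {n} impressions, {clicks[p]} clicks")
```

Swap conversions for clicks and the dynamic is the same: optimise to a corrupted signal and the campaign follows the noise.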
But no holdco wanted to comment on those findings.
“I honestly have yet to see any holding company executives speak ill of Performance Max – why is that? So for me this was the wake up call. A bunch of things converged in my head, because I always thought that we needed to balance personalisation and precision and privacy. But what I realised was … brands are not benefiting from this. This is all a complete farce, and no one is willing to say it out loud because they don’t want to piss off Google.”
Meta has always been unapologetically bad … Anyone that goes on Facebook for five minutes can see that it's basically devolving into AI-generated content and bots that are just pinging each other back and forth … [But] people aren’t afraid to talk about Meta.
Fear factor
Hence joining Check My Ads: “We don’t take adtech money. We can say things out loud. We can start the conversations that need to be started.”
Garcia says that is particularly important since trade associations seem increasingly reticent to criticise big tech.
“Google has captured the trade associations,” she claims. “They buy their way into every room, into every closed door meeting. They make absolutely certain that they can control the narrative. So it’s a problem when the trade associations that are supposed to represent their members have been captured.
“If you have marketers looking to the major trade associations for guidance, and they’re saying ‘nothing to see here’, of course that’s going to lull marketers into a false sense of security.”
Asked whether the same could be said of Meta, Garcia counters that “Meta has always been unapologetically bad” and that its equivalent to Performance Max, called ‘Advantage+’, has “largely the same type of [transparency] issues … Anyone that goes on Facebook for five minutes can see that it’s basically devolving into AI-generated content and bots that are just pinging each other back and forth”. But she says the difference is “people aren’t afraid to talk about Meta”.
Publishers are facing an absolute existential crisis. On the one hand, you have Google's AI Overviews, which is going to massacre traffic. [On the other] you have Performance Max, which gives Google the ability to opaquely use their black box algorithms to move money wherever they want … But to force people to log-in is to cut off one's nose to spite their face; it is not the answer.
Quality or bust
The fundamental issue is that marketers, agencies and publishers “ended up completely taking our eye off the ball in terms of caring about quality media environments and tricked ourselves into thinking we can reach our audience wherever they are. When a lot of the time, it’s not actually what’s happening”, says Garcia. She thinks the solution for the buy-side, sell-side and the intermediaries in the middle, is to “re-orientate around quality”.
That kind of change is hard to manage while paying the bills. But Garcia thinks there will soon be little choice.
“The good news is that publishers are facing an absolute existential crisis. So I’m actually more encouraged than ever that things will need to change, because publishers do not really have much of a choice but to act. On the one hand, you have Google’s AI Overviews, which is going to massacre traffic,” she says.
On the other, “you have Performance Max, which gives Google the ability to opaquely use their black box algorithms to move money wherever they want.” (Garcia suggests that will see Google “doubling down on its owned and operated” properties ahead of potential regulatory challenges.)
“So publishers, on the one hand, are facing continually declining traffic, and they’re being given this other option of buying into alternative ID solutions – they’re able to claim higher CPMs where there is an identifier. But then you have falling traffic, and when you do get traffic, you’re going to force those people to log in so that you can assign an ID and maybe get some extra money. Now that cannot be the plan … To force people to log-in is to cut off one’s nose to spite their face. It is not the answer,” she says.
“The answer needs to be reorienting around quality … We have to reorient around people, and then we need to get to a place where we’re actually back to prioritising quality instead of this illusion of precision.”
So how do we get there?
There's absolutely no rationale that should prevent [platforms and agencies] from providing transparency. So advertisers need to kill the market for black box products … and hold agencies accountable … I will tell you firsthand, no one messes with the client that audits.
Kill black boxes
“Marketers have to demand transparency into their buys. There cannot be a market for black box products,” Garcia suggests. “There’s absolutely no rationale that should prevent [platforms and agencies] from providing transparency. So advertisers need to kill the market for black box products.”
Marketers, says Garcia, need to take the time to understand their contracts and the way their agencies are compensated.
“I don’t think that the principal trading model is good for anyone, but at least understand where your agency is making its money and actually hold them accountable. So if you set standards in your contracts, for example, ‘don’t run my ads on kids content’, then you need to go back and hold agencies accountable. And I will tell you firsthand, no one messes with the client that audits. They don’t. It’s just not worth it.”
One of the [Adalytics] findings was that YouTube creates a persistent, immutable identifier called the 'X-Goog-Visitor-ID'. There's nothing on the internet about this. There's nothing in their documentation … There's still no clarity from Google on this. So that’s my call to action for everyone: Let's find out what X-Goog-Visitor-ID is.
Covert tracking?
Google faces a major adtech antitrust lawsuit in the US in September. If it loses, a break-up of its adtech business may be the end result. Hence Garcia’s theory that Google is positioning to “double down” on its owned and operated business via PMax as a hedge.
“So the bigger thing to watch there is going to be how they [the US Justice Department] address Google’s access to data [which] has always been their power. How are they going to meaningfully prevent Google from using data from across its properties to bolster its ad business? How are we going to get meaningful transparency to enable enforcement of all of that?”
She cites one of the many assertions within Adalytics’ Made for Kids report by way of example. (Google dismissed the report as being “filled with basic errors”.)
“It’s 200 pages long, but one of the buried findings was that YouTube creates a persistent, immutable identifier called the ‘X-Goog-Visitor-ID’,” says Garcia.
(The Adalytics report states that X-Goog-Visitor-ID is specific to the YouTube app, but that there “does not appear to be anyway to delete, edit, or remove consent for this identifier. Google’s privacy policies and terms do not explicitly disclose this identifier.” The report adds: “When a consumer installs the YouTube iPhone iOS app, and begins using the apps to watch YouTube videos on their iPhone, it appears that a specific HTTP request header called “X-Goog-Visitor-ID” is continuously sent to YouTube’s servers from the user’s phone. This request header contains what appears to be a high entropy, unique identifier. Regardless of whether or not a user is signed into the YouTube app, or whether they reset their app analytics IDs, this identifier appears to be persistent with a constant value. The identifier does not appear to be disclosed in any privacy policy or technical documentation.”)
“There’s nothing on the internet about this. There’s nothing in their documentation,” continues Garcia, who questions whether Google may therefore be “covertly tracking people”.
“No one followed up on this, no one was asking the question … There’s still no clarity from Google on this. So that’s my call to action for everyone: Let’s find out what X-Goog-Visitor-ID is.”
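For readers tempted to take up that call, one possible approach is sketched below: route a phone or browser through an intercepting proxy such as mitmproxy and log any outgoing request that carries the header Adalytics flagged. This assumes the device is configured to trust the proxy’s certificate; certificate pinning in apps may block inspection in practice, and the script makes no claim about what the identifier is used for.

```python
# Sketch only. Run with: mitmdump -s find_visitor_id.py
# Assumes the inspected device is routed through mitmproxy and trusts its CA;
# app-level certificate pinning may prevent interception in practice.

from mitmproxy import http

HEADER = "X-Goog-Visitor-ID"

class FindVisitorId:
    def request(self, flow: http.HTTPFlow) -> None:
        # Log any outgoing request that carries the identifier Adalytics flagged.
        value = flow.request.headers.get(HEADER)
        if value:
            print(f"{flow.request.pretty_host}{flow.request.path} -> {HEADER}={value}")

addons = [FindVisitorId()]
```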
(Mi3 filed an overnight request to Google locally; the company’s response follows below.)
Google responds: A Google spokesperson said they are “looking deeper” into the X-Goog-Visitor-ID question. They rejected the broader comments made and said the Adalytics report had been “widely discredited”.
Per the spokesperson:
“We respectfully disagree with many of the assertions made. Google’s investments in advertising technology offer advertisers transparency and control over where their ads serve. Our AI-powered advertising products, including Performance Max, prioritise delivering business value to our customers, agencies and partners.
“The deeply flawed Adalytics report that is referred to has been widely debunked and discredited. The report makes completely false claims and draws uninformed conclusions based solely on the presence of cookies, which are widely used in these contexts for the purposes of fraud detection and frequency capping.”
Mi3 will update this article with any further detail on X-Goog-Visitor-ID.
There’s more in the podcast. Get the full download here.