Personalization, privacy and the mystery of the red hair dye

A retailer's failure to meaningfully use years of first-party data results in irrelevant personalization and missed opportunities.

Contributors:
Leah Parker
CIPM
Senior Manager, Data Protection
Air New Zealand
After a decade in the privacy profession, I have had my share of conversations that start with "we want to provide customers with more personalized offers" or, worse, "we want to increase our intimacy with our customers." I have a terrible poker face and can only imagine what my body language does when a conversation starts like this.
This morning, however, I found myself wishing I had a more "intimate" relationship with my online grocery provider. Hear me out.
Every week, like clockwork, I place my online grocery order. I have been doing this for seven years. Seven years of loyalty. Seven years of scanned barcodes, substitutions, forgotten avocados and emergency top-ups because I forgot the milk. And yet, when I open the app and scroll to "personalized offers," I am confidently presented with red hair dye and pet food.
I do not dye my hair red.
I do not own a pet.
I have never owned a pet.
I have never — explicitly or implicitly — indicated to this retailer that either of these things might be relevant to me.
And yet, there they are. Every week.
What makes this particularly odd is that, based on my order history alone, my grocery provider almost certainly knows a great deal about my household. They could reasonably infer how many people I shop for. They likely have a good sense of the age range of my children — hello school-lunch staples, farewell nappies. They know the types of meals we cook, how those meals change with the seasons and when life is busy versus when I optimistically plan to cook from scratch.
They know all of this because I've told them, quietly, consistently, through thousands of first-party transactions over many years.
And yet the best they can do is suggest hair dye and pet food.
This is where I start to feel frustrated by how privacy and personalization are often framed as opposing forces. As if organizations must choose between rich, personalized customer experiences that are inherently invasive and privacy-respectful approaches that result in generic, low-value interactions.
This framing is misleading.
I am not asking my grocery store to infer sensitive attributes about me, combine my data with third-party sources or surprise me with insights I did not knowingly provide. I am simply asking them to use the data they already hold, collected directly from me through ordinary transactions, to provide a more useful experience in return.
At the moment, the value exchange feels lopsided. The retailer benefits from my data through demand forecasting, supply chain optimization and customer retention. I benefit very little. The offered "personalization" feels disconnected from my actual behavior and, over time, undermines rather than builds trust.
And here is the part that often gets lost in privacy discussions: I would be willing to share more information if it genuinely improved my experience.
I would happily state my dietary preferences.
I would tell them what kinds of meals I enjoy cooking.
I would welcome the ability to say "please don't show me pet products," "I don't want alcohol promotions," or "I'd prefer not to see children's products."
None of this requires invasive profiling. It requires transparency, clear purpose and meaningful choice.
This could be accomplished through highly privacy-respectful methods: clear preference centers, explicit opt-in to personalization, the ability to turn personalization off entirely and controls over more sensitive product categories. Customers should be able to understand why they receive certain recommendations and adjust those signals over time.
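The controls described above can be modeled very simply: an explicit opt-in flag plus a customer-managed block-list of categories. A minimal sketch in Python follows; all the names here are hypothetical illustrations, not any retailer's actual API.

```python
from dataclasses import dataclass, field


@dataclass
class PersonalizationPrefs:
    """Hypothetical preference-center record.

    Personalization is off by default; the customer must explicitly
    opt in, and can exclude sensitive categories at any time.
    """
    personalization_enabled: bool = False
    excluded_categories: set = field(default_factory=set)  # e.g. {"pets", "alcohol"}


def filter_offers(offers, prefs):
    """Return only the offers the customer has consented to see."""
    if not prefs.personalization_enabled:
        return []  # no opt-in means no personalized offers at all
    return [o for o in offers if o["category"] not in prefs.excluded_categories]


# Example: a customer who opted in but excluded pet and haircare products
prefs = PersonalizationPrefs(
    personalization_enabled=True,
    excluded_categories={"pets", "haircare"},
)
offers = [
    {"name": "red hair dye", "category": "haircare"},
    {"name": "dog food", "category": "pets"},
    {"name": "school-lunch snacks", "category": "grocery"},
]
visible = filter_offers(offers, prefs)
```

The design choice worth noting is the default: personalization starts disabled, so the burden of action sits with the organization to earn the opt-in, not with the customer to find the opt-out.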
The irony, of course, is that the data already exists. It is already held, governed, retained and linked to my account. But rather than being used to create mutual value, it largely serves the organization's interests. When customers don't see tangible benefits, they disengage, opt out or stop trusting the system altogether.
Privacy should not be used as the explanation for poor customer experience. When done well, privacy is an enabler of better design. It should result in experiences built on trust, proportionality and genuine reciprocity.
I still wince when I hear talk of "increasing intimacy with customers." But if my grocery provider wants to start by acknowledging that seven years of weekly orders probably says more about me than an algorithm guessing I own a dog, I'm open to the conversation.
Just … please stop with the hair dye.

This content is eligible for Continuing Professional Education credits. Please self-submit according to CPE policy guidelines.