Personalisation and the ‘creepy factor’ of marketing



The EU General Data Protection Regulation (GDPR), the gold standard of data protection laws, came into effect in May this year with the aim of protecting the personal data of all EU citizens – much as a doctor protects patient information, on a need-to-know basis.

Ultimately, what GDPR does is protect information relating to an identifiable living person, including their name, photographs and IP addresses (European Data Protection Supervisor, 2016). The introduction of GDPR has raised awareness of how personal data is used and shared by businesses. This awareness has heightened since the Cambridge Analytica scandal, in which the firm grossly misused Facebook data to influence voter decisions in the Brexit referendum and the American presidential election (Granville, 2018).

So what can be deduced from the introduction of the most stringent data protection laws to date? Should we be worried by recent examples of personal data exploitation?

What we know is that personal data can be used to target and manipulate consumers into seeing and feeling what marketers choose, so we must keep ethical boundaries explicit. Laczniak and Udell (1979) predicted future trends in marketing, arguing that flexible marketing would come about through advancing technology, whereas ethical marketing would be driven by demands for social responsibility. Advancing technology has given marketers access to big data that was previously unavailable, and this, through platforms such as Facebook, enables the personalisation of adverts to a scarily accurate degree.

Personalised adverts can appear in many different forms, from using a consumer's first name in an email to using cookies so that clothes a consumer has browsed reappear in Facebook adverts. In some cases, personalised adverts can be an interesting way to connect with your audience, like Spotify's user-data adverts in 2016 (Nudd, 2016). However, it is also easy for marketers to get this very wrong and leave the consumer feeling violated – as was the case when Target sent marketing materials to women it predicted, from their purchasing choices, to be pregnant (Clark, 2014). While this is not illegal, since the data is collected with consent, it makes people uneasy to know that such information is kept and used without their knowledge, and often not to their benefit. Not only is this an unethical practice, it can have serious consequences by alienating your clients.

Big data is an emerging field that provides countless opportunities for marketers, and the technology for exploiting it has improved dramatically. However, the line (both legal and social) between gaining operational and strategic advantage and risking reputational damage is a fine one (Buytendijk and Heiser, 2013). Reputational damage can happen overnight, whereas regulation and ethical debate progress far more slowly than data is harvested. When Paperchase was targeted by the social media campaign Stop Funding Hate, a vast number of consumers expressed displeasure; within 48 hours Paperchase announced it was pulling advertising from the Daily Mail, the focus of Stop Funding Hate's campaign (Greenfield, 2017).

What can be done to ensure you stay on the right side of your customers? Companies looking to avoid overstepping the line should implement a clear strategy for big data:

Firstly, engage in ethical debate to determine what is and is not appropriate for the organisation. Relying on untailored regulatory compliance can still allow business missteps and provoke public backlash. Different target audiences will have different views on what is acceptable; part of consumer research must now gauge how much of a 'digital native' the audience is. Expecting consumers to read your terms and conditions as the sole safeguard will not be seen as adequate when things go wrong, and will certainly erode consumer trust.

Secondly, develop a code of conduct to ensure that data is used only for stated purposes, and include ethical checks and balances so that all legal implications – especially re-identification – are considered. The volume of available data will grow alongside the number of internet users, which is expected to reach 5bn by 2020 (Conick, no date). While this represents an opportunity, it also increases the potential threats. Those who are not digitally literate will struggle to understand the consequences of handing over their data and will rely heavily on businesses being trustworthy.

Finally, wherever possible, the 'shock' and 'creep' factors must be removed from personalised marketing efforts. If you create hyper-personalised adverts that make the consumer feel you know too much, they may start to worry about what else you actually know (Daykin, 2015). As ever with this topic, finding balance is key.

The 'limits of consensus' refer to the normatively prescribed limitations on behaviour in society (Ross, 1970). By adopting a narrower interpretation of those limits, you can ensure you remain on the right side of the consumer, making them feel safe and protected.


References

Buytendijk, F., & Heiser, J. (2013, September 24). Confronting the privacy and ethical risks of Big Data. Financial Times. Retrieved July 5, 2018, from https://www.ft.com/content/105e30a4-2549-11e3-b349-00144feab7de

Clark, N. (2014, August 19). Tailored experience or digital stalking? Has personalisation gone too far? Retrieved July 5, 2018, from https://www.theguardian.com/media/2014/aug/19/tailored-experience-or-digital-stalking-has-personalisation-gone-too-far

Conick, H. (n.d.). As Data Gets Bigger, So Do the Risks. Retrieved July 5, 2018, from https://www.ama.org/publications/MarketingInsights/Pages/bigger-data-bigger-risks.aspx

Daykin, J. (2015, March 19). Personalised marketing at scale is the next big thing in digital. Retrieved July 5, 2018, from https://www.theguardian.com/media-network/2015/mar/19/personalised-marketing-digital-future

Dubinsky, A., & Hensel, P. (1986). Ethical dilemmas in marketing: A rationale. Journal of Business Ethics, 5(1), 63.

European Data Protection Supervisor. (2016, November 11). Data Protection. Retrieved July 5, 2018, from https://edps.europa.eu/data-protection_en

Granville, K. (2018, March 19). Facebook and Cambridge Analytica: What You Need to Know as Fallout Widens. Retrieved July 5, 2018, from https://www.nytimes.com/2018/03/19/technology/facebook-cambridge-analytica-explained.html

Greenfield, P. (2017, November 20). Paperchase apologises for Daily Mail promotion after online backlash. Retrieved July 5, 2018, from https://www.theguardian.com/media/2017/nov/20/paperchase-apologises-for-daily-mail-promotion-after-online-backlash

Laczniak, G., & Udell, J. (1979). Dimensions of future marketing. MSU Business Topics, 27, 33–44.


Nudd, T. (2016, November 29). Spotify Crunches User Data in Fun Ways for This New Global Outdoor Ad Campaign. Retrieved July 5, 2018, from https://www.adweek.com/creativity/spotify-crunches-user-data-fun-ways-new-global-outdoor-ad-campaign-174826/

Comments

  1. Hi Georgia, I very much like your highlighting that companies need to open the ethical debate internally and develop a code of conduct. Because ethics are just not clear-cut and can't be framed only in terms of what is legal or not, the culture of the company – how it deals with consumers, how it perceives and values them – should be communicated to the teams. I think this can help to find a common understanding and approach in the digital marketing arena too. I am not going to elaborate more as I have posted already two comments ;) but I was curious to see your ideas on the digital privacy issue... Take care, Anja

  2. Hello Georgia

    This presentation is well outlined and flowing. It addresses the opportunities presented by big data while also pointing out the potential pitfalls. The importance of data in marketing is clearly outlined as enabling personalisation and effective targeting.
    I was impressed with the point about a code of conduct. This places responsibility on the organisation rather than relying on the legal aspect, which is constantly being made obsolete by technological development.
    The solutions you have suggested are clear and implementable. However, I feel that the inclusion of infographics could have increased reader engagement.

    Bright

