Deception by Design

I will be referring here to a study undertaken by the Norwegian Consumer Council (Forbrukerrådet).  At issue are the subtle and not-so-subtle techniques that the big gatekeepers (Google, Microsoft, Facebook) employ to “nudge” users into giving up more privacy than they otherwise might.  What I really want to point the reader towards, however, are the wider implications of this engineering, beyond privacy questions strictly speaking, into the Marcusean realms of compliant efficiency and technological rationality, which I have discussed in some depth in previous essays.  The study, titled “Deceived by Design”, carries the subheading “How tech companies use dark patterns to discourage us from exercising our rights to privacy”.  The neologism “dark patterns”, a term only eight years old, refers to user interfaces that have been carefully crafted to trick users into doing various things not necessarily in their best interest.  This employment of deception stems from the monetization of data.  From the report:

“Because many digital service providers make their money from the accumulation of data, they have a strong incentive to make users share as much information as possible. On the other hand, users may not have knowledge about how their personal data is used, and what consequences this data collection could have over time. If a service provider wants to collect as much personal data as possible, and the user cannot see the consequences of this data collection, there is an information asymmetry, in which the service provider holds considerable power over the user.”

One of the most salient techniques is “nudging”.  Again, from the report:

“The concept of nudging comes from the fields of behavioural economy and psychology, and describes how users can be led toward making certain choices by appealing to psychological biases. Rather than making decisions based on rationality, individuals have a tendency to be influenced by a variety of cognitive biases, often without being aware of it. For example, individuals have a tendency to choose smaller short-term rewards, rather than larger long-term gains (hyperbolic discounting), and prefer choices and information that confirm our pre-existing beliefs (confirmation bias). Interface designers who are aware of these biases can use this knowledge to effectively nudge users into making particular choices. In digital services, design of user interfaces is in many ways even more important than the words used. The psychology behind nudging can also be used exploitatively, to direct the user to actions that benefit the service provider, but which may not be in the user’s interests. This can be done in various ways, such as by obscuring the full price of a product, using confusing language, or by switching the placement of certain functions contrary to user expectations…[T]he information asymmetry in many digital services becomes particularly large because most users cannot accurately ascertain the risks of exposing their privacy.”
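
To see how little machinery this requires, here is a minimal sketch, in TypeScript, of a consent dialog tuned along the lines the report describes. Everything in it (the labels, the click counts, the notion of “prominence”) is invented for illustration; this is a reconstruction of the pattern, not any real service’s code.

```typescript
// Hypothetical sketch of interface nudging. All names and numbers here
// are invented for illustration only.

interface ConsentOption {
  label: string;
  prominent: boolean;     // a large, coloured button vs. a muted grey link
  clicksRequired: number; // steps between seeing the dialog and finishing
}

// The data-maximising path: one bright button, one click.
const accept: ConsentOption = {
  label: "Accept and continue",
  prominent: true,
  clicksRequired: 1,
};

// The privacy-protective path: a muted link into a nested menu of toggles.
const restrict: ConsentOption = {
  label: "Manage data settings",
  prominent: false,
  clicksRequired: 4,
};

// Hyperbolic discounting does the rest: the small immediate reward of
// getting on with one's task beats the larger long-term gain of privacy.
console.log(accept.clicksRequired < restrict.clicksRequired); // true
```

Nothing is hidden, strictly speaking; the asymmetry of effort does all the work.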

In the case of privacy settings, the basic fact is that most users never go into the menus of Amazon, Google, or Facebook to change what is set up by default.  This bespeaks an implicit trust in the service provider on the part of the user, a trust that is betrayed as a matter of course.  Time and again the service providers show their true colors by making it difficult for the user to maximize privacy.  The study offers the following example:

“The Facebook GDPR [General Data Protection Regulation] popup requires users to go into “Manage data settings” to turn off ads based on data from third parties. If the user simply clicks ‘Accept and continue’, the setting is automatically turned on. This is not privacy by default…Facebook and Google both have default settings preselected to the least privacy friendly options.”
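
Again purely by way of illustration, here is a rough TypeScript sketch of the default-settings pattern being described: every data-collecting option is preselected to “on”, the one-click button applies those defaults untouched, and only the buried menu lets anything be switched off. The setting names are hypothetical, not drawn from Facebook’s actual interface.

```typescript
// Hypothetical reconstruction of "least privacy-friendly by default".
// The setting names are invented; only the pattern follows the report.

interface DataSettings {
  adsFromThirdPartyData: boolean;
  faceRecognition: boolean;
}

// Privacy by default would initialise these to false.
// Here, every data-collecting option is preselected to true.
const DEFAULTS: DataSettings = {
  adsFromThirdPartyData: true,
  faceRecognition: true,
};

// The prominent one-click path applies the defaults unchanged.
function acceptAndContinue(): DataSettings {
  return { ...DEFAULTS };
}

// Only users who open "Manage data settings" ever see the toggles,
// and each one must be switched off individually.
function manageDataSettings(overrides: Partial<DataSettings>): DataSettings {
  return { ...DEFAULTS, ...overrides };
}

const typicalUser = acceptAndContinue(); // everything stays on
const diligentUser = manageDataSettings({ adsFromThirdPartyData: false });
console.log(typicalUser, diligentUser);
```

Privacy by default, in the GDPR’s sense, would amount to initialising those values to false; the dark pattern consists in inverting that and letting user inertia do the rest.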

I think that I have conveyed the general idea of the problem.  While this study reveals no techniques that were not already known, grasping how it all works in the macro sense is something that meets considerable resistance on the part of the average user.

Franklin Foer, in his review of the latest Jaron Lanier book, Ten Arguments for Deleting Your Social Media Accounts Right Now, makes reference to the “Facebook manipulation machine.”  The practices outlined in the Norwegian study make it clear that this manipulation machine is not limited to Facebook.  It is ingrained in the protocols that one encounters nearly every step of the way as one navigates the internet.  As such, it functions as a giant behavior modification apparatus, acclimating the user to a new cognitive paradigm, one in which user autonomy is progressively degraded, arguably on the way to its eventual elimination.  I ask the reader of this article to keep in mind that this manipulation machine is in its nascent stages.  Look how far it has come in just the eight years since the term “dark patterns” was coined.  What will the state of affairs be in ten, twenty, a hundred more years?

 

 
