The Dangers of Dark Patterns

There’s no doubt that SEO and Internet Marketing have come a long way. The past few years have been chockfull of substantial shifts in the ways that we do business – causing quite a few growing pains. But with an ongoing transition away from spammy tactics and a refocused spirit on elevating the needs of our customers, it seems that the industry is finally reforming its ways.

But then I see this…

Or this…

Or scammy stuff like this (actual monthly price 4.99)…

And I wonder: are we really changing that much? Or are we just shifting our crafty tactics to new areas?

Understanding Dark Patterns

These types of clever UI techniques are known as Dark Patterns. Based on the inverted principles of usability, they’re intended to fool users into opting in when they otherwise wouldn’t.

Where user interfaces exist to make things as clear and intuitive as possible, dark patterns utilize all the black magic of confusing structure, double negatives, and bait-and-switch techniques to drive users to convert. In essence, dark patterns boil down to an “easy-in” / “difficult-out” philosophy where, instead of giving users honest choices, companies assume their desired actions.

And they’re everywhere…

So What About Us?

Now, you may say, “This isn’t my doing,” and for the most part, you might be right. But here’s the deal: when it comes to A/B testing or CRO, most of us are convinced entirely by testing results. And guess what? These kinds of practices test really well.

For example, take a look at the options below. Which one do you think will do better?


It’s pretty safe to assume that Example 2 will provide a higher conversion rate… tempting, right?

In his presentation on Dark Patterns, Harry Brignull summarizes the problem perfectly. He describes how these patterns are usually found in “aggressive environments” where there’s a “huge emphasis on metrics.” When designs are strictly dictated by marketing, there’s usually an over-emphasis on conversions, which often opens the door to ethically ambiguous design decisions.

Does this sound familiar? Do you base the success of designs primarily on testing results? Now it may sting to admit it, but if you’re the driving force behind the analytics – you’re the cause…

Why This Matters

We’re at a crossroads in our industry. For years we spammed, stuffed and faked our way to rankings with little attention to our end user. We ultimately cared more about our metrics than we did about the people using our products. As Google has caught on and made painful adjustments, though, we’ve changed our ways and decided to flaunt our white hats.

But I can’t help but ask: have we really changed? In all the talk about doing “real company shit” and creating excellent content, do we really care more about our users? Or are we hiding our metrics obsession in craftier ways?

When it comes to SEO, we have a strong force of accountability: it’s called Google. There’s no doubt it will become more and more difficult to get away with black hat marketing techniques in the years to come – and so we’ll stop doing them. But, the reality is, there will always be opportunities to “black hat” in the work that we do. There will always be dark patterns we can use to manipulate and deceive for our own benefit.

When it comes to UI, there currently isn’t any kind of accountability beyond our users. There’s no way to stop or discourage this kind of behavior. But in the end, does it really even help? Does it really make our customers appreciate us more? Refer us more? Utilize us more? Or is it just another short-term way to make a few bucks?

As we continue to fly our white hats and talk about elevating our users, we have to ask ourselves: Do we really care about the needs and desires of our customers? Or do we only care when we’re being held accountable?

Luke Clum

Luke joined the Distilled team in 2012. With a background in design, Luke loves discovering and promoting beautiful content on the web. He’s fascinated by the internet’s potential to act as a catalyst for creative ideas and is excited to be a part...



  1. Awesome post, Luke!

    I don't know about other people, but you did manage to make me look at the problem from another perspective.

    When it comes to A/B testing, we all really think little of end users; we just choose the scenario that brings in more dollars and think we've done a great job. As a result, users can get fooled, distracted, confused or simply pushed into doing what they wouldn't want to do.

    The rest of the industry catches up, does the same and - voila! - we have a new penalty from Google! It's a vicious circle that only we have the power to break.

  2. Will Critchlow

    Thank you for writing this Luke. I think this is a really important thing to be talking about - as per our internal language, I think that moving from "data driven decisions" to "data aware decisions" is an important goal for this kind of testing.

    We should choose the things to test with our eyes open about the wider implications of our choices, and we should select the ultimate winner with the information that A outperformed B - but not necessarily with that as our only criterion.

  3. Erin Mullinax

    Chalk-full should be "chockfull" - sorry, it was bugging me :-) But anyhow, I think there will always be companies out there who are going to use some form of shady techniques and black hat marketing to quickly gain the metrics they covet. However, I believe that the trend of transparency, RCS, etc. is going to continue and is going to be the true test of success for companies in the years to come. As SEOs and internet marketers who believe in doing things the right way, we need to educate our clients and empower them to build their businesses and the communities around them by being open, honest and providing genuinely amazing customer service and products. The metrics and long-term successes will come as happy byproducts of these best practices. Thanks for the article!

  4. Luke, this is a great article - thanks for bringing it to the surface. Being deceptive is one thing, but testing to see what viewers would naturally reach for is OK, in my opinion. We want our customers to be able to find what they are looking for with little time and effort. If the byproduct of testing UI is more conversions, then great! It's a win-win for everybody!

    • Hey David - Absolutely. User Interfaces need to be stringently tested in order to improve the user's experience. Data is essential.

      However, we can't just make decisions purely off of what our analytics tell us. We also need to factor in the natural, immeasurable elements of our designs.

  5. Kent

    I love that this comes on the heels of Marketo talking about "implicit" consent in email marketing, and ways to get around the negative effects of it.

    These kinds of techniques will only hurt us in the long run. People don't maintain loyalty to companies which try to fool them.

  6. Angie

    Totally agree with this, Luke. I actually used to work for a company that would pre-tick boxes for extra services during the shopping cart phase, with many people not realising that the boxes were ticked. Needless to say, many people were shocked when they were overcharged and wondered why they had ended up purchasing extra items.

    I brought this up with the Sales and Customer Service Managers during a talk about how to improve the customer journey. They told me that they didn't care because it increased sales, and they would give people a credit in exchange for the 'unwanted items'... what a seriously unethical practice!

  7. Great post, Luke. We've been having brand vs. conversion talks internally as of late, so your article seemed especially relevant.

    We've recently adopted a net promoter score metric as a companion KPI to our CRO tests to make sure our loyalty stays in line with our conversion. We've integrated Optimizely with Qualaroo and pass through test parameters to the Qualaroo NPS question on our site-wide website survey. We can segment the data as our tests run to calculate variation-specific net promoter scores that can be compared with conversion rate performance.

    It's a great way to insulate against these dark patterns, as you call them. I've never thought of black hat UX principles in this way - great post!

