Freedom in Process / How Distilled Finds Great Tools

Working according to a process means having real freedom to be creative and productive. What an odd statement, that a constraint on our actions would give us flexibility!

But think about it—in considering SEO or marketing, do we act with disregard or disdain for others? Do we allow ourselves to do whatever we like? No, we consider the needs of our customer or client or audience. We constrain our actions in ways that ensure we act in consonance with those needs. And we are free to act creatively within that context.

I’d like to take some time to walk you through my own experience with process at Distilled. This isn’t a philosophical discussion; it’s the real deal. The concept of process is at the heart of effecting change, of the “ship it!” mentality, and of any number of other characteristics that make Distilled the way we are.

We’re proud of the way we work, and the way we never stop improving ourselves—and that’s something we want to share with everyone. Thinking about these things has an impact not only on your ability to improve your bottom line, but also on the quality of your work life generally: how confident you feel about what you’re doing and how good you feel about getting shit done.

The graduated steps, then, that we’ll work through:

  1. Having a process.
  2. Having a good process.
  3. Improving that process.

Not exactly granular, as you can see. But working with the knowledge of these three steps can enable us to accomplish so much more than we could without them. We’ll start by looking at the benefit of just having a process, setting aside for the moment the additional benefits of a well-designed one. Sometimes the mere existence of the process is an important benefit.

Make a List!

Wherefore Process?

Distilled has a procedure whereby potential tools for internal use are evaluated. Why? What was our motivation in implementing this? We’re not the types to make up arbitrary rules and processes where none are needed. Were we using poor tools? Were our tools making us inefficient? Were they holding us back?

The answer is a resounding no! We were doing a pretty swell job of consulting with the tools we were using, really. But if we weren’t primarily seeking to improve our tools, why did we need a process?

Well, we had some tools that were used fairly consistently. But our knowledge of which tools were being used was inconsistent—consultants were using different tools across offices. Folks were writing custom tools willy-nilly—often replicating work that, unbeknownst to them, had already been undertaken.

The problem, really, was communication. We didn’t know what our peers in other offices were using or creating on a day-to-day basis, or whether someone in the company had a license for tool X. Conversations were happening daily in which consultant Y had discovered tool Z and wanted to extol its virtues, only to find that consultants in another office were already using that tool every day.

Our answer was to develop a process whereby tools would be evaluated and reviewed. With such a process in place there is a node through which every tool passes, a central point for all tool discussion.

For nearly every Distilled employee the tool process looks incredibly simple:

  1. Become interested in a tool, or identify a gap in our tool set.
  2. Express said interest in an e-mail to Ben.

That’s it. You’ve completed the process. You trust that your concern will be addressed, that you’ll be contacted if and when more information becomes necessary, and that if we decide to recommend a new tool, we’ll let everyone know about it and how we think it can best be used.

The practical result of this is that you’ve delegated responsibility for answering any questions about tools to the tool auditing team. Regardless of the efficacy of the actual evaluation, the simple existence of a tool review process has gone a long way toward decreasing time spent on conversations about which tools are best, and it frees consultants from feeling that they have to run down every tool they find. We take the responsibility for finding and using the best tools off the shoulders of the consultants and allow them the freedom to focus on their clients.

A valid question remains: now that Distilled has a tool evaluation process, are we using better tools (or using tools better)? I think we are. Having consultants using a recommended tool set encourages consistent communication and allows for more flexibility of work distribution. But that’s almost incidental. The reality is a more consistent workplace and less consultant overhead, and that’s been provided by our process.

How Distilled Finds Awesome Tools

To the majority of Distilled employees, then, the tool audit procedure is the simple two-step affair described above. But to a few of us in the company, it’s much more than that. And there is something to be learned from our side of the process as well. I designed the tool auditing procedure we use at Distilled, and one of my responsibilities is to execute it. This is something that I touch on a more-or-less daily basis.

Having decided that a tool process was necessary, we then had to decide what type of process it was going to be. How much time and effort would be put into evaluating these tools? What level of detail would be necessary? It was apparent to me that this was the kind of thing that could make the difference between great and amazing. We needed to take it seriously.

Drawing on my experience with the tool review process, I’ll fill you in on the process we use and the ways it really works out for us.

In practical terms our process needed to:

  1. Determine whether a tool was adopted.
  2. Determine for what use it was to be adopted.
  3. Determine whether currently adopted tools should be retained, by applying the same evaluation to them as well.

In the end, inspired by some internal discussions that had been recorded in our London office about process design, I decided to go with a question tree (actually I think it’s more of a hypothesis tree in that it’s assertion driven, but I’m not committing myself to that). If you Google either of those terms, you’re gonna get a lot of scary looking visualizations of various decision making processes, but all I’m saying is that I wanted to start with a big abstract issue—should we use tool Y—and break it down into such small, granular chunks that each little bit could be answered very easily.

At the highest level of the tool review process, there are three questions being asked—and each of those questions is progressively broken down into smaller and smaller pieces until we have a list of bite-size, true-or-false questions. To give you the idea without working all the way down to the “bottom rung”, here is a brief outline which leaves lots out for the sake of brevity, but accurately describes the argument at the highest level (a minimal code sketch follows the outline):

  1. For task X, is tool Y a good fit?
    1. Are the types of inputs and outputs for Y appropriate to the problem?
    2. Is the output of Y accurate?
    3. ...and so forth.
  2. Is task X valuable for Distilled to execute?
    1. Does X directly provide revenue to Distilled?
    2. Does X help Distilled retain clients?
    3. ...and so forth.
  3. Given that 1 and 2 are true, will we adopt tool Y?
    1. Is Y a cost-effective means of accomplishing X?
    2. Is someone willing to own adoption of Y?
    3. ...and so forth.
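To make this concrete, here is a minimal sketch in Python of how a question tree like this might be modeled and evaluated. To be clear, this is my own illustration rather than the actual system we use; the Node class, the sample questions, and the answers are all invented:

    # A question (or hypothesis) tree: leaves are answered true/false
    # directly, and an internal node is true only if every sub-question
    # beneath it is true. One "no" anywhere sinks the whole tree.
    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class Node:
        question: str
        answer: Optional[bool] = None          # set directly on leaves
        children: List["Node"] = field(default_factory=list)

        def evaluate(self) -> bool:
            if not self.children:
                return bool(self.answer)
            return all(child.evaluate() for child in self.children)

    tree = Node("Should we adopt tool Y?", children=[
        Node("For task X, is tool Y a good fit?", children=[
            Node("Are Y's inputs and outputs appropriate?", answer=True),
            Node("Is the output of Y accurate?", answer=True),
        ]),
        Node("Is task X valuable for Distilled to execute?", children=[
            Node("Does X directly provide revenue?", answer=True),
            Node("Does X help retain clients?", answer=True),
        ]),
        Node("Will we adopt tool Y?", children=[
            Node("Is Y cost-effective?", answer=False),  # the "big fat no"
            Node("Is someone willing to own adoption?", answer=True),
        ]),
    ])

    print(tree.evaluate())  # False: the single "no" propagates to the top

The nice property is that all the judgment lives in the leaves: answering any one true-or-false question is easy, and the recommendation at the top falls out of the aggregation.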

The highly simplified design of questions has three important consequences. The first is that the types of questions we ask can be customized based on the tool in question—we can determine, for instance, the relevant questions to ask about any given task and its relationship to Distilled. Some tasks are highly connected to our ability to produce results for our clients; others have a more auxiliary role. We can easily swap in more relevant questions if needed.

The second benefit is that we can spend exactly as much time on this process as we need to. If a tool is small and needed for a trivial purpose, we can check all the required boxes and recommend it in five minutes. If a tool’s adoption has far-reaching consequences for the company, we can labor over every individual point as long as necessary, even to the point of providing supplementary documentation.

The third benefit of this design is that it becomes exceedingly easy to shit-can a tool we don’t need. We don’t have to work all the way up from the bottom level to the top if it isn’t necessary. Sometimes it’s ridiculously easy to observe that tool Y clearly doesn’t have a mapping of inputs to outputs that is valuable, or that the output of Y just isn’t presented in a format consonant with the way Distilled works. Just write a big fat no for that question and watch the dismissal work its way up the chain.

It’s not a perfect process—I’ll tell you it certainly hasn’t spared me the pain of numerous calls with competing sales reps around the world—but on a practical level, it’s working so far. I’ve got limited time to execute it, with the help of the trusty Jacob Klein, but it scales back when I don’t have time and scales up when I do. It’s also a process which, taken at an abstract level, could be applied to many other problems as well: do we want to work with client X? Should we hold event Y? It’s all about asking the right questions.

Trusting Constraints

There are a lot of tools available to Distilled, even within the narrow field of SEO. Many overlap in the scope of their output; it’s not uncommon for me to run across tools which are for all intents and purposes functionally equivalent. For any given problem, we only need one tool as our solution.

The result of this? I’ve got a long list of tools which I’ve declined to recommend.

It’s true that in order to save time I don’t complete a full workup on every tool that I decline—I think one of the benefits of the system I’ve devised is that I don’t have to. But there is at least one clear and explicitly described reason recorded for each tool. That’s part of the process.

It’s easy to feel confident about recommending a tool. You end up using it every day; its value is evident. But what about all of the poor, rejected tools? They sit in the “decline” pile, growing dusty, and eventually you forget them... Until they come back to life.

Maybe a consultant finds out about a tool at a conference. Maybe a new hire suggests a tool that was reviewed before their arrival. All of a sudden these zombie-tools have been reanimated. And I’ll find myself thinking, “Why did I decline this tool? It looks like so much fun!”

Then I look at my notes and see that there was some reason I declined the tool—a very good reason indeed.
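Writing that reason down is what makes the lookup possible. As a sketch of the idea (the log structure, tool names, and reasons here are invented for illustration, not our actual records):

    # A decline log: every rejected tool gets at least one recorded reason.
    decline_log = {
        "ShinyRankTracker": "Duplicates the output of our current rank tracker.",
        "LinkWizard": "Per-seat pricing isn't cost-effective at our team size.",
    }

    def why_declined(tool: str) -> str:
        # Looking this up beats re-evaluating a zombie tool from scratch.
        return decline_log.get(tool, "Never reviewed; e-mail Ben.")

    print(why_declined("ShinyRankTracker"))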

In order to have a good process—a process you can trust and live by—you have to trust not only the actions it recommends, but also the inactions. As long as you are only taking action, you can pretty effectively convince yourself that you are always acting of your own volition, and that the process is just something that has to be done because someone at some point said “let there be process”. Processes serve to constrain actions as much as they promote them. A good process takes this into consideration and builds in the means to encourage that trust.

The Change Effect

When we’re talking about Getting Shit Done with clients, we’re talking about Effecting Change. When we’re talking about Effecting Change, we’re implicitly saying that we want to change the way your company works. An infographic is not change. An (implemented) content strategy isn’t necessarily change. An improved capacity to develop and implement future content strategy internally? That’s some solid change, right there.

If you don’t have the right process in place then, barring some sort of thousand-monkeys-infinite-time scenario, you’re never going to get the results you want. If you do, it will have been through pure luck, not through any skill of planning or execution. If all we did was create an infographic for a client and get them some links, we would be doing them a disservice and selling short our own capabilities. Doing something isn’t changing something. Changing the way things are done is where it’s at.

We want to change how our clients work. We want to improve their process.

The fact that this philosophy carries over into our content strategy has been huge for Distilled, in my opinion. So many of our most popular blog posts are all about the publication or promotion of a process which can be adapted to the needs of our audience.

Clearly there is something to this process thing.

And bringing it back to the tool review process, well—nobody’s really asked me to improve anything yet. It works pretty smoothly in the eyes of the people who are making use of it. But I see the deficiencies; I work with them every day. So, like any good Distiller, I iterate, I hack, and I see what happens. At the end of the day, though, I’ve got a process to improve. If no one had solidified the process in the first place, I wouldn’t even have one to work with.

Of course, in some cases, improving a process might in fact mean getting a process in place—but fortunately, as I’ve suggested, just getting to that point can be valuable in itself!

Shipping the Shipping

Back in January Distilled had a week-long, all-hands meeting in London. It was great—everyone in one big room, everyone getting some face time, and shit gettin’ done left-right-and-center. One day that week was our Shipathon, which is exactly what it sounds like: one day where everyone in the entire company shipped something, and some people managed to ship several things.

Ship It! Tattoos

The “Ship It” culture at Distilled was definitely amplified by that one day, but it had been implicitly pervasive even before that meeting. When it comes to the output from our work, we’ve always been about pushing to get things done faster—to fail faster, to iterate, to learn from experience. In fact, I wouldn’t even characterize it as a push, but as a drive and a desire for this accelerated turnaround.

But do we intuitively associate these words equally with our processes? Having a process suggests forethought, deliberation—and, I can’t even pretend to deny, a bit of slowness.

So how does a focus on processes fit with a “shipping” culture? Some reconciliation is in order.

Processes facilitate shipping. In fact, the optimization of underlying processes is crucial, in my opinion, to the ability of individuals in an organization to effectively “ship it”. Take the work we do at Distilled. Say we want to deliver some amazing content to help our client out—let’s say our consultant has decided an infographic would be perfect. First off, we’ve got a process to come up with the concept for that content: we brainstorm, we consolidate, we iterate, we research, we iterate, we pitch, we iterate. When we’ve got something solidified, we get that content created. To do that, we reserve time with the creative team, we deliver a brief to the creative team, we meet with the creative team (and the client)... you get the picture. And you know what? Execution’s been taken care of because we have a process for that too and we reserved time with our outreach team way back when we first decided we were doing an infographic. All of these things we do on a daily basis allow us to consult better because we know we can ensure solid delivery.

If you want to get ahead, don’t ship just anything. Ship a process. Let the freedom flow.

And that’s all the wisdom I’ve got for you today. In fact, you’ve got your very own meta-process to run with—the “process process”. Have a process, have a good process, and improve your process. So, does anyone out there have any processes they want to share? Anyone got a wild Visio flowchart they use? I’d love to know how people are thinking about optimizing their own work!

Benjamin Estes

Benjamin is a senior consultant who joined Distilled in 2010. Having earned a BA in Mass Media, his intention is to continue studying the ways in which people interact with media and apply those lessons to his consulting.


6 Comments

  1. Sorry, but I find this post useless.
    You'd better blog about real tools with real examples.

  2. Ryan Berry

    Great article, Benjamin. I personally keep a spreadsheet (as a quick database, effectively) with different sheets for types of tools (e.g. keyword research, server performance, etc.), but this simply has their names and then comments.

    This week I decided we really needed to have a PPC keyword research tool to monitor competitors, but the problem is that I've been through this many times before. I'd look at SEMRush, Spyfu et al and decide on one but not actually draw up the pros and cons (or a question tree, if you like). I'd decide on a tool but then a few months down the line I'd see, "Oh, X is using this other keyword tool" and suddenly I'd look at switching. I'd likely researched it before and simply forgotten, thus wasting time evaluating yet another tool. All changed this week though! Now, for every type of tool researched, we have a project in Basecamp with pros/cons for each and the reason we eventually decided on a particular tool or tools. Any time we get into the "X is using something else" mentality, we simply refer to the project and see the pros and cons that were listed previously.

    • Benjamin Estes

      Awesome, so glad to hear someone went through a really similar situation. I find it so helpful to be able to look up why I'm using the things I'm using!

  3. I like the 'process'-led approach... and yes, it's OK to have it at a top level.

    But another approach would be to share the negative/declined tools via a blog... or at least those that are marginal declines. A good reason for this is that it helps readers to understand the attributes/tasks you are measuring on... but also, it may suit the needs of some users.

    Bear in mind, SEO bloggers do this kind of process by discussing the tools openly and reviewing their feedback.

    I am working on a process regarding link building and online relationship management but have not found the perfect tools to perfect the process. Hence this discussion about process is particularly interesting.

  4. This was an excellent post, and something that I wish I'd implemented in my previous organization. There was no tools process there, but I think they would have benefited greatly from an org-wide process, and from the sharing of those results. Asif is right that open communication is hugely helpful. Thanks for pointing to the Agile Google Docs post by Tom, too. That was another fascinating read. Though he didn't actually show me any cool things to do in Seattle. Sad! He should fix that ;) And then write about it!

