It’s time to find your voice

I’ve been following the recent back-and-forth about blog comments closely, since I have the same question about this blog: should comments be turned on or off? I even mentioned recently that I’m going to turn comments off for a while and see how it goes.

Matt Gemmell is driving/documenting the debate in the most articulate way, and his recent post on pseudonyms is another example of that. Even though the conversation now mostly appears to have run its course, it occurred to me that the root of this debate is related to what Paul Ford calls the fundamental question of the web:

“Why wasn’t I consulted,” which I abbreviate as WWIC, is the fundamental question of the web. It is the rule from which other rules are derived. Humans have a fundamental need to be consulted, engaged, to exercise their knowledge (and thus power), and no other medium that came before has been able to tap into that as effectively.

The Internet gives people this idea that if they can’t respond directly to something someone else said on a web site, their fundamental right to be consulted is violated. And that’s just not true – we don’t have a right to be consulted on everything that happens around us. What is true, however, is that we all have a voice, and that finding that voice is extremely important for our own development.

So the thing is, we’re having the wrong discussion. We shouldn’t be arguing about whether comments should be turned on or off on a blog. What we should be talking about is how all of us can spend more time finding our own obsession and voice, and how we can share that with the world. Tom Standage argues that writing is the greatest invention:

It is not just one of the foundations of civilisation: it underpins the steady accumulation of intellectual achievement. By capturing ideas in physical form, it allows them to travel across space and time without distortion, and thus slip the bonds of human memory and oral transmission, not to mention the whims of tyrants and the vicissitudes of history.

So forget about comments – it doesn’t matter whether you have them turned on or not. The real question is which one of the many available options you’re going to choose to start writing and owning your voice.


Update: Reader Greg Mathes asks in an email, “What’s so important about finding our own voice?” To answer, I’d like to quote Clive Thompson in The Art of Public Thinking:

The process of writing exposes your own ignorance and half-baked assumptions: When I’m writing a Wired article, I often don’t realize what I don’t know until I’ve started writing, at which point my unanswered questions and lazy, autofill thinking becomes obvious. Then I freak out and panic and push myself way harder, because the article is soon going before two publics: First my editors, then eventually my readers. Blogging (or tumbling or posterousing or even, in a smaller way, tweeting) forces a similar clarity of mental purpose for me. As with Wired, I’m going before a public. I’m no longer just muttering to myself in a quiet room. It scarcely matters whether two or ten or a thousand people are going to read the blog post; the transition from nonpublic and public is nonlinear and powerful.

World IA Day: A lack of UX purpose (and what we can do about it)

I flew up to Joburg this weekend to speak at one of the World IA Day events that were happening in 14 cities around the world. The bulk of the talk was about Customer Journey Maps, and specifically how we used the technique to help us prioritize our roadmap. In this summary post I want to focus primarily on the topic I started the talk with: a particular gap I see in current UX work, and how Information Architecture is uniquely positioned to bridge it.

In 1955 David Ogilvy wrote a letter about his copywriting habits, and, among other things, said the following about campaign work:

I write out a definition of the problem and a statement of the purpose which I wish the campaign to achieve. Then I go no further until the statement and its principles have been accepted by the client.

It seems that there’s unfortunately plenty of UX work out there that jumps straight into wireframes without first understanding the design problem or the purpose of the solution. Purpose – the reason for which something is done or created – often appears to be missing (*cough* *cough*). And this is where I believe Information Architecture can come to the rescue.

There are plenty of definitions of IA to choose from, but I like this one in particular by Peter Morville:

I like it because it brings into focus the idea that at its core, Information Architecture is about a unique way of seeing the world. A way that is essential to build successful user experiences.

I love the example Dan Klyn uses in Information Architecture is a Way of Seeing. You have to read the whole thing to appreciate it fully, but in short, he tells the story of having to deal with some pretty severe back pain recently. After visiting an MD who only gave him a prescription for Vicodin and some exercises that didn’t help at all, he ended up at a chiropractor who was able to sort out the problem in just a few days (after taking an X-ray to help diagnose it). When asked why the MD didn’t originally take an X-ray to get to the root of the problem, the chiropractor replied that it wouldn’t have mattered if she had:

Even if the MD had taken an X-Ray, she would not have seen what I saw. Show us each the same image and we see different stuff.

It’s this different way of seeing that makes the IA profession so crucial right now. IAs specialize in looking at a vast amount of information and making sense of it in a way that is credible, consumable, and relevant to users (and the business). Where most of us only see navigation, they know that part is just the tip of the iceberg. Underneath it lie activities like information organization, information relationships, and IA research that all work together to give IAs their unique view of the world.

Within this large toolset that IAs have to choose from to do their work, Customer Journey Maps stand out as the one technique that can be most effective at bringing purpose back to our UX work. As UX Matters defines it:

Customer journey maps are documents that visually illustrate an individual customer’s needs, the series of interactions that are necessary to fulfill those needs, and the resulting emotional states a customer experiences throughout the process.

These maps are an important way to find UX purpose because they accomplish the following goals:

  • They provide a common understanding within an organization about customer needs, product strategy, and business goals – i.e., the product’s reason to exist.
  • They’re an excellent product prioritization tool.
  • They’re a guiding light for design, always bringing the project and the process back to the customer journey and the purpose of the product.

There are many different ways to approach these maps, but I find the Adaptive Path way the most effective. It places a strong focus on user research, and forces you to think about the implications of the journey map, and how it can integrate with and guide the design process.

So, that was my story at the conference. Thanks to everyone who came out! Here are the slides from my talk:


The exhaust of our digital lives

Frank Chimero provides another eloquent take on frictionless sharing (automated posts in news feeds, like what song you’re listening to on Spotify):

The less engaged I become with social media, the more it begins to feel like huffing the exhaust of other people’s digital lives. It’s a bit of a weird situation: all that’s needed is a simple filter to prioritize manually posted content over automated messages.

He doesn’t explicitly say this, but the point of his post is clear: automated content shows up in your stream not because it adds value to the network, but because it’s good marketing ROI for brands.
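The "simple filter" Chimero alludes to could be as small as a sort over the feed. The sketch below is hypothetical – the item fields and the `automated` flag are assumptions for illustration, since real social networks don’t expose their feed data this way:

```python
# A minimal sketch of a feed filter that surfaces manually posted content
# ahead of automated messages. The item structure ('automated', 'timestamp',
# 'text') is hypothetical, purely for illustration.

def prioritize_feed(items):
    """Return feed items with manual posts first, automated posts last.

    Within each group, newer items (higher timestamp) come first.
    Sorting on (automated, -timestamp) works because False < True.
    """
    return sorted(items, key=lambda item: (item["automated"], -item["timestamp"]))

feed = [
    {"text": "Listening to a song on Spotify", "automated": True, "timestamp": 3},
    {"text": "Some actual thoughts I wrote down", "automated": False, "timestamp": 2},
    {"text": "Just read an article via an app", "automated": True, "timestamp": 1},
]

for item in prioritize_feed(feed):
    print(item["text"])
```

Note that the automated posts aren’t deleted, just demoted – which is arguably all the prioritization a reader would need.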

You cannot design without content (structure)

Mark Boulton wrote a fantastic piece called Structure First. Content Always. He makes a strong case that it’s unrealistic (and just plain wrong) to require content to be written before design happens:

Let’s be really clear about this. It is unrealistic to write your content – or ask your client to write the content – before you design it. Most of the time. Content needs to be structured and structuring alters your content, designing alters content. It’s not ‘content then design’, or ‘content or design’. It’s ‘content and design’.

I often utter the phrase “you cannot design without content”, but in practice I still fall back on old habits when push comes to shove and content simply isn’t available. Mark provides a great solution in his post as he explains what content structure is all about, and how it fits into the design process.

An avalanche of information and entertainment

How mankind will cope with the avalanche of information and entertainment about to descend upon it from the skies, only the future can show. Once again science, with its usual cheerful irresponsibility, has left another squalling infant on civilization’s doorstep.

Arthur C. Clarke in 1962

Should designers learn to code? Who cares, as long as they always remain curious.

Tucked away among the usual arguments for and against designers being able to code, Mandy Brown makes an interesting observation in Specialist or Generalist?, a roundtable discussion on the issue:

You do not need to be proficient in practices other than your own; but you ought to be curious. Curious enough to ask questions, to read about things, to get your hands dirty. It’s lack of curiosity about other disciplines that is deadly, not lack of skill.

This is a statement worth digging into, because curiosity is one of the most important characteristics of a good designer (well, of anyone, really). Sara Wachter-Boettcher explains the reason really well in her piece On Content and Curiosity:

Curiosity keeps us hungry. It leads us to tackle new challenges when the easy questions have all been answered. It makes us wonder how things could be better – even when they are, if we’d just pause to admit it, pretty damn good already.

There is a very legitimate counter-argument to being incurably curious, though. We might gain such shallow knowledge about so many different things that we end up unable to form or articulate opinions on anything because we just don’t know enough about a specific topic.

This is the core of the “specialist” argument, and it’s articulated very well in Stop Trying To Be Diverse, an interesting post about photographers on the Musea blog. The author tells the story of Richard Avedon, a photographer who spent his entire life shooting black-and-white portraits of people against a white or grey background. Boring, right? Well…

We don’t want to restrict ourselves to something like that because we feel that we will get bored. However, boredom is a great thing! What actually occurs is, boredom forces us to be more creative if we can push through it. Our work will improve if we find different ways to solve the same problem over and over, instead of switching between 10 independent problems. Avedon forced himself to come up with something new every time his subject stepped in front of his white seamless background. He had to find something unique about each individual, otherwise he would fail. The difference in his work came from what his subjects brought to the image, not by some new Photoshop filter or fancy off-camera lighting technique he used.

I’ll illustrate this with a story about my own online behavior. The other night I went online to do something (who knows what it was), and an hour later I found myself reading an article about a guy who feels that Louis C.K. was stupid because he made “only” $1 million from his standup comedy experiment. The person claimed that he could have helped Louis C.K. make $5 MILLION!!! I got to the end of the article and all I could think was, “Why do I read this crap?” Well, I read it because my curiosity sometimes overcomes my importance filter. And getting that balance right is what we all need to make this curiosity thing work for good, not evil.

So, how do we arrive at (and maintain) this balance? How do we remain curious, and still manage to temper our gluttonous, Internet-fed thirst for All Things, All The Time? The core of the solution lies in learning what to cull from our lives, and what to surrender instead. In The Sad, Beautiful Fact That We’re All Going To Miss Almost Everything, Linda Holmes describes the two concepts as follows:

Culling is the choosing you do for yourself. It’s the sorting of what’s worth your time and what’s not worth your time. It’s saying, “I deem Keeping Up With The Kardashians a poor use of my time, and therefore, I choose not to watch it.” It’s saying, “I read the last Jonathan Franzen book and fell asleep six times, so I’m not going to read this one.”

Surrender, on the other hand, is the realization that you do not have time for everything that would be worth the time you invested in it if you had the time. Surrender is the moment when you say, “I bet every single one of those 1,000 books I’m supposed to read before I die is very, very good, but I cannot read them all, and they will have to go on the list of things I didn’t get to.”

We constantly need to learn how to make better decisions about culling and surrendering. For example, I should have culled that Louis C.K. article. And at some point I need to choose to surrender all the UX articles in my Instapaper queue and just freakin’ fire up Balsamiq.

Should designers learn to code? It depends entirely on each designer’s ability to decide if it’s a skill that should be learned or surrendered in the bigger picture of meeting his or her ultimate life/career goals. Put another way, there is no right answer, as long as the designer is at the very least curious enough to know how development works, and at best has made a conscious decision either to surrender the skill or dive in and learn it.

Think Different (as long as enough people will like it or retweet it)

In Facebook’s Philosophy, Kyle Baxter makes a good point about what happens when sharing something becomes part of doing it:

Once the sharing is a part of the doing, you no longer consider whether to do something in the isolation of whether you want to do it. When sharing is a part of the package, you also consider how whatever it is you’re doing will reflect on you. You’ll consider what the general public’s, or your network’s, standards are for it.

Nick Bradbury makes a similar point in The Friction in Frictionless Sharing:

In the past the user only had to decide whether to share something they just read, but now they have to think about every single article before they even read it. If I read this article, then everyone will know I read it, and do I really want people to know I read it?

When you think this all the way through the implications are quite bleak. The theory is that the more we share about our lives, the more we tend to take into consideration what people might think of us before we do something. But it’s not just a passive “I wonder what they’ll think of me”. Figuring out what to do next becomes an obsession, a constant search to answer the same question over and over: what can I do that will get me the most likes or retweets?

It’s a dangerous game – one where we’re not just trying to hang on to our reputations, but actively using our knowledge of what our network “likes” to guide our lives. “Think different” becomes “Think different in a way that will generate the most engagement with my personal brand.” Maybe the value of Allen Salkin’s philosophy that “there is something magical about a life less posted” is that it frees us to live our own lives again.

The value of Very Small Data

Alan Mitchell on the problems with Big Data and the value of what he calls Very Small Data:

So there are two classes of data which help solve different types of problem. Big Data is statistical and deals with general trends and patterns; Very Small Data is specific and deals with getting things done: gathering the information needed to make a decision, to make an arrangement, or to get some administrative chore done. Because it’s Very Small and rather mundane and specific, it doesn’t seem as glamorous and important as Big Data. But it is.

He goes on to discuss four major problems with Big Data, and the enormous opportunities that exist in the area of Very Small Data. It’s an essay well worth your time.

