If you listened to Monday’s podcast, you know that I’m on a bit of a kick about entropy. It has to do with this book I’m reading called The Information, by James Gleick. The book is a history of information and the rise of information theory. Really good stuff. And he spends a good bit of time going on about entropy.
Now, entropy is a bit of a scary word. It has a kind of intrinsic feel to it, like we have some intuitive sense of what it means, but when it comes to spitting out a specific definition, we all turn into karaoke performers trying to sight-read Snow’s song “Informer”.
Hello? 1992 called. They want their metaphor back.
Entropy, simply, is a measure of the unavailability of work inside a closed system.
At least, that was its original definition. It was first proposed by Rudolf Clausius to describe a specific property of thermodynamic systems.
Energy as Information
James Clerk Maxwell was the first to link this quality to the idea of information. See, he looked at it as order and disorder. Order and disorder imply knowledge: to create order you must know something about the thing you are ordering. In his thought experiment, a little demon controlled a very tiny door between two rooms. In one room were fast-moving molecules. In the other room were slow-moving molecules. And the demon decided which molecules got through the door. While he was sitting at the door he could choose to mix the molecules or keep them separate. But, because of the laws of thermodynamics, if he were to just open the door, after a period of time of fast molecules bumping into slow molecules, every molecule would more or less be moving at the same speed*.
With the help of Maxwell’s Demon, entropy was now linked to information.
The second law of thermodynamics says that entropy is always increasing. This means that without intervention, everything moves from order to disorder. Or to put that another way, from specific to general.
If you’ve ever been to a business meeting, you’ve seen this before: Interesting, dynamic ideas often get presented at the start of a meeting and boring, mediocre ones often end them. Sweet, sweet car designs are presented at automobile shows and then the same boring sedans are cranked out year after year. Windows XP was supplanted by Windows Vista.
Entropy is everywhere.
A case can be made that what made Steve Jobs great was his ability to fight entropy in the extreme. Before him, computers were for governments, science and business. Because of him we can talk to our phone using natural language and it can respond to our information needs. For sure, he didn’t do this alone and in a vacuum. But it’s hard to deny that he brought information to the masses in a way that had never been seen or experienced before.
He gave people the tools to be able to manipulate information – to create order out of disorder. He created the technological environment that we are now living in. Would there be an Android without an iPhone? Would Windows 7 be half the OS it is now without having to compete with OS X?
Dig a little deeper into social behavior and two themes for how people deal with entropy begin to emerge.
People’s Relationship to Entropy
- They want to create order from disorder
- Not all the time
Now let’s run those two rules through a “customer” filter and see what happens.
Customer’s Relationship to Entropy
- People have a finite amount of energy to spend in a day
- As a result, people want to conserve their energy
- People want to expend energy on activities of their choice
- People do not want to spend their energy unnecessarily
And like that we’re out of thermodynamics and into the world of web design.
Common Sense Stuff
What a customer is saying is: I’ll buy your product or service if I like it and the price, but I’m not going to spend a lot of energy to do it.
Now we can state the goal of web design in scientific terms:
The goal of web design is to produce a website with low entropy.
That is to say, a web design is successful when it makes it easy for a person to do what they want to with as little effort as possible.
And this can be measured. Right now. In fact, you may already be measuring it.
Entropy and Efficiency
Look at the number of visitors to your site. Look at the number of sales. Now, do that for the past six months, or year, or two years and get the average ratio of sales to visits. Whatever that percentage is, that’s how well your website has worked over that period of time.
Let’s say, for the sake of argument that you have 100 visitors to your website each month, on average, for the past year. And in that time, on average, you had 2 sales each month. Simple math will tell you that your website has a 2% conversion rate. That is to say, it is two-percent efficient.
The goal of user testing is to discover changes that can be made that will increase the number of sales on your website, given the same number of visitors.
If user testing is conducted and changes are made to the website in the above example and over the next several months the website averaged 4 sales per month, the website would have doubled its efficiency from 2% to 4%.
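The arithmetic in the example above can be sketched in a few lines of Python. The numbers (100 visitors, 2 sales before user testing, 4 after) are the hypothetical ones from the example, not data from a real site:

```python
def conversion_rate(visitors, sales):
    """Conversion rate (the site's 'efficiency') as a percentage."""
    return 100.0 * sales / visitors

# Before user testing: 100 visitors/month, 2 sales/month on average.
before = conversion_rate(100, 2)   # 2.0 percent
# After user-testing-driven changes: same traffic, 4 sales/month.
after = conversion_rate(100, 4)    # 4.0 percent

print(f"before: {before}%  after: {after}%  efficiency x{after / before:.0f}")
```

The point of keeping traffic constant in the comparison is that the change in efficiency is then attributable to the design changes, not to a change in who is visiting.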
Efficiency is directly related to entropy. Entropy, remember, is about order and disorder. The more order we bring to the website, the less energy the visitor has to expend to buy the product or service and the more efficient it is.
The reason we should talk as designers about entropy rather than efficiency is that efficiency is a by-product of entropy, not the other way around. Entropy is by its nature probabilistic. The more knowledge you have about the pages of your website, the more effect you can have on reducing entropy – you become like Maxwell’s demon, deciding which molecules to let through the door.
Every page on your website that somebody can find via search or a link is a potential entry point. Likewise, every page is a click away from being an exit point. It’s all very messy and random.
The job of web designers, programmers, interface designers, and SEO people is to give shape to those pages.
SEO is responsible for managing the website’s relationship with search engines. Another way to think about it is that SEO is responsible for getting traffic to the website. In a closed system, an SEO person wouldn’t be necessary. But your website exists in the larger ecosystem of the Internet, so messaging extends beyond the website itself. Because of that connection to traffic, the SEO person is the first to set expectations for your website’s visitors.
Designers and programmers work to bring shape to the website. E-commerce sites have catalog pages, product description pages, a cart, and a checkout process – and they show up in that order.
Information sites like Google, YouTube, and Wikipedia are designed so that information can be easily found and accessed.
From disorder emerges order.
On this website we’ve spent a good bit of time talking about defining a website’s critical path. We believe that user testing should revolve around improving the efficiency of that path. It’s important to remember that it’s a literal path. It is about energy flow.
Entropy, for a website, can be defined as the likelihood that a visitor to a website will NOT complete the critical path.
Fighting entropy on a website means giving form to and then reducing the resistance of the critical path.
This is why a conversion funnel is such a valuable web analytics tool. It shows entry and exit points with respect to the critical path. It points out to you places where user testing could reduce entropy.
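A funnel like that can be sketched with a few lines of Python. The step names follow the e-commerce critical path described above; the visitor counts at each step are made-up numbers for illustration, and “entropy” here is computed as defined above – the likelihood that a visitor does not complete the path:

```python
# Hypothetical visitor counts at each step of an e-commerce critical path.
funnel = [
    ("catalog", 1000),
    ("product page", 400),
    ("cart", 120),
    ("checkout", 60),
    ("purchase", 45),
]

# Entropy: the likelihood that a visitor will NOT complete the critical path.
completed = funnel[-1][1] / funnel[0][1]
entropy = 1 - completed

# Step-by-step drop-off shows where user testing could reduce entropy most.
for (step, n), (next_step, next_n) in zip(funnel, funnel[1:]):
    drop = 1 - next_n / n
    print(f"{step} -> {next_step}: {drop:.0%} drop-off")

print(f"site entropy: {entropy:.1%}")
```

With these numbers the worst leak is between the product page and the cart (70% drop-off), which is where a conversion funnel report would tell you to focus your user testing.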
On Friday, we’re going to take a look at the third-rail of web design: pricing.
Nothing introduces entropy into a website quite like pricing. Money is really a physical manifestation of a person’s energy. They know that they have to expend a certain amount of energy to accumulate money. Money, like energy, is also finite for most people. Thus, pricing is directly related to energy, and thus, to entropy.
We’re going to take a look at some pricing strategies that can reduce entropy and increase the odds that your site’s visitors will respond positively to your price point.
* I say more or less because it’s not practical for the average person to know the behavior of every molecule. So what has risen in its place are laws of probability. That is to say, while a closed system tends toward maximum entropy, at the molecular level there will be exceptions to this rule. Extremely unlikely events still happen. But at the macro level, these probabilities are so low as to be practically non-existent.