The Inmates Are Running the Asylum: Why High-Tech Products Drive Us Crazy

The recurring metaphor in The Inmates Are Running the Asylum is that of the dancing bear: the circus bear that shuffles clumsily for the amusement of the crowd. In the book, Alan Cooper calls for a revolution: we need technology to work in the same way average people think.





One year ago, my second-generation digital camera, a Panasonic PalmCam, had an even smarter computer chip inside it than its first-generation predecessor. It had modes: I had to put it into Rec mode to take pictures and Play mode to view them on its small video display. If I forgot to turn it off, it automatically shut down after one minute of inactivity. My newest camera goes further: it has a full-blown computer that displays a Windows-like hourglass while it "boots up."

There is no "On" setting, and none of my friends can figure out how to turn it on without a lengthy explanation. The new camera is very power-hungry, and its engineers thoughtfully provided it with a sophisticated computer program that manages the consumption of battery power. A typical scenario goes like this: I aim the camera and zoom in to properly frame the image. Just as I'm about to press the shutter button, the camera suddenly realizes that simultaneously running the zoom, charging the flash, and energizing the display has caused it to run out of power.

In self-defense, it suspends its capability to actually take pictures. But I don't know that because I'm looking through the viewfinder, waving my arms, saying "smile," and pressing the shutter button.

The computer detects the button press, but it simply cannot obey. In a misguided effort to help out, the power-management program instantly takes over and makes an executive decision: Shed load. It shuts down the power-greedy LCD video display. I look at the camera quizzically, wondering why it didn't take the picture, shrug my shoulders, and let my arm holding the camera drop to my side.

But as soon as the LCD is turned off, more battery power is available for other systems. The power-management program senses this increase and realizes that it now has enough electricity to take pictures.

It returns control to the camera program, which is waiting patiently to process the command it received when I pressed the shutter button, and it takes a nicely auto-focused, well-exposed, high-resolution digital picture of my kneecap. My old mechanical Pentax had manual focusing, manual exposure, and manual shutter speed, yet it was far less frustrating to use than the fully computerized, modern Nikon COOLPIX, which has automatic focusing, exposure, and shutter speed.
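The failure mode is easy to state in code. Here is a minimal sketch (invented for illustration, not Panasonic's or Nikon's actual firmware) of the load-shedding logic the story describes: the shutter press is queued, the hungriest subsystem is silently shut down, and the stale command finally executes after the moment has passed. Every name and power figure below is made up.

    # Hypothetical power-management loop; all subsystems and numbers invented.
    BATTERY_BUDGET = 10                      # units of power available right now

    loads = {"zoom": 3, "LCD": 5, "flash charger": 4}   # current draws
    pending = ["take picture"]               # the shutter press, already queued

    def power_needed():
        # Total draw plus the (invented) extra cost of actually taking the shot.
        return sum(loads.values()) + 2

    while pending:
        if power_needed() > BATTERY_BUDGET:
            # "Shed load": silently kill the hungriest subsystem. The user,
            # eye to the viewfinder, gets no notification at all.
            victim = max(loads, key=loads.get)
            print("power manager sheds", victim)
            del loads[victim]
        else:
            # Power is back; obey the seconds-old command. Photo of a kneecap.
            print(pending.pop(0))

Nothing in that loop ever tells the user why the shutter did not fire; the silence, not the power management itself, is the design failure.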

The camera may still take pictures, but it behaves like a computer instead of a camera. A frog that's slipped into a pot of cold water never recognizes the deadly rising temperature as the stove heats the pot. Instead, the heat anesthetizes the frog's senses. I was unaware, like the frog, of my cameras' slow march from easy to hard to use as they slowly became computerized.

We are all experiencing this same, slow, anesthetizing encroachment of computer behavior in our everyday lives. Take my new clock radio: It has a very sophisticated computer brain and offers high fidelity, digital sound, and lots of features. It wakes me up at a preset time by playing a CD, and it has the delicacy and intelligence to slowly fade up the volume when it begins to play at 6:00 a.m. This feature is really pleasant and quite unique, and it almost compensates for the fact that I want to hurl the infuriating machine out the window.

It's very hard to tell when the alarm is armed, so it occasionally fails to wake me up on a Monday and rousts me out of bed early on a Saturday. Sure, it has an indicator to show the alarm is set, but that doesn't mean it's useful. The clock has a sophisticated alphanumeric LCD that displays all of its many functions. The presence of a small clock symbol in the upper-left corner of the LCD indicates the alarm is armed, but in a dimly lit bedroom the clock symbol cannot be seen.

The LCD has a built-in backlight that makes the clock symbol visible, but the backlight only comes on when the CD or radio is explicitly turned on. There's a gotcha, however: The alarm simply won't ever sound while the CD is explicitly left on, regardless of the setting of the alarm. It is this paradoxical operation that frequently catches me unawares. It is simple to disarm the alarm: Simply press the "Alarm" button once, and the clock symbol disappears from the display.

However, to arm it, I must press the "Alarm" button exactly five times. The first time I press it, the display shows me the time of the alarm. On press two, it shows the time when it will turn the sound off. On press three, it shows me whether it will play the radio or the CD. On press four, it shows me the preset volume. On press five, it returns to the normal view, but with the alarm now armed. But with just one additional press, it disarms the alarm. Sleepy, in a dark bedroom, I find it difficult to perform this little digital ballet correctly.
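Written out as a sketch (my reconstruction from the description above, not the clock's actual firmware), the button's modal logic looks like this:

    # Reconstruction of the modal "Alarm" button described above (hypothetical).
    class ClockRadio:
        ARM_CYCLE = ["alarm time", "sound-off time", "radio or CD",
                     "preset volume", "normal display, alarm ARMED"]

        def __init__(self):
            self.armed = False
            self.step = 0                    # progress through the arming cycle

        def press_alarm(self):
            if self.armed:
                self.armed = False           # one press disarms...
                self.step = 0
                return "normal display, alarm off"
            shown = self.ARM_CYCLE[self.step]
            self.step += 1
            if self.step == len(self.ARM_CYCLE):
                self.armed = True            # ...but five presses arm
                self.step = 0
            return shown

    radio = ClockRadio()
    for press in range(1, 7):
        print(press, radio.press_alarm())    # presses 1-5 arm; press 6 disarms

One physical button arms and disarms depending on hidden state, which is exactly why a sixth sleepy press undoes the work of the fifth.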

Being a nerdy gizmologist, I continue to fiddle with the device in the hope that I will master it. My wife, however, long ago gave up on the diabolical machine.

She loves the look of the sleek, modern design and the fidelity of the sound it produces, but it failed to pass the alarm-clock test weeks ago because it is simply too hard to make work.


The alarm clock may still wake me up, but it behaves like a computer. My old, noncomputerized alarm clock was far simpler: When it was armed, a single red light glowed. When it was not armed, the red light was dark.

I didn't like this old alarm clock for many reasons, but at least I could tell when it was going to wake me up.

Because it is far cheaper for manufacturers to use computers to control the internal functioning of devices than it is to use older, mechanical methods, it is economically inevitable that computers will insinuate themselves into every product and service in our lives.

This means that all of our products will soon behave like the most obnoxious computers, unless we try something different. This phenomenon is not restricted to consumer products. Just about every computerized device or service has more features and options than its manual counterpart.

Yet, in practice, we often wield the manual devices with more flexibility, subtlety, and awareness than we do the modern versions driven by silicon- chip technology. High-tech companies—in an effort to improve their products—are merely adding complicating and unwanted features to them. Because the broken process cannot solve the problem of bad products, but can only add new functions, that is what vendors do. Later in this book I'll show how a better development process makes users happier without the extra work of adding unwanted features.

Porsche's beautiful high-tech sports car, the Boxster, has seven computers in it to help manage its complex systems. One of them is dedicated to managing the engine. It has special procedures built into it to deal with abnormal situations.

Unfortunately, these sometimes backfire. In some early models, if the fuel level in the gas tank got very low—only a gallon or so remaining—the centrifugal force of a sharp turn could cause the fuel to collect in the side of the tank, allowing air to enter the fuel lines. The computer sensed this as a dramatic change in the incoming fuel mixture and interpreted it as a catastrophic failure of the injection system.

To prevent damage, the computer would shut down the ignition and stop the car. Also to prevent damage, the computer wouldn't let the driver restart the engine until the car had been towed to a shop and serviced.

When owners of early Boxsters first discovered this problem, the only solution Porsche could devise was to tell them to open the engine compartment and disconnect the battery for at least five minutes, giving the computer time to forget all knowledge of the hiccup.

The sports car may still speed down those two-lane blacktop roads, but now, in those tight turns, it behaves like a computer. In a laudable effort to protect Boxster owners, the programmers turned them into humiliated victims.

Every performance-car aficionado knows that the Porsche company is dedicated to lavishing respect and privilege on its clientele.

That something like this slipped through shows that the software inside the car is not coming from the same Porsche that makes the rest of the car. It comes from a company within a company. Somehow, the introduction of a new technology surprised an older, well-established company into letting some of its core values slip away.

Acceptable levels of quality for software engineers are far lower than those for more traditional engineering disciplines.

Whenever I withdraw cash from an automatic teller machine (ATM), I encounter the same sullen and difficult behavior so universal with computers. If I make the slightest mistake, it rejects the entire transaction and kicks me out of the process.

I have to pull my card out, reinsert it, reenter my PIN code, and then reassert my request. Typically, it wasn't my mistake, either, but the ATM computer finesses me into a misstep. It always asks me whether I want to withdraw money from my checking, savings, or money-market account, even though I have only a checking account.

Consequently, I always forget which type it is, and the question confuses me. About once a month I inadvertently select "savings," and the infernal machine summarily boots me out of the entire transaction to start over from the beginning.

To reject "savings," the machine has to know that I don't have a savings account, yet it still offers it to me as a choice.


The only difference between me selecting "savings" and the pilot of Flight 965 selecting "ROMEO" is the magnitude of the penalty. The machine also enforces a daily withdrawal limit; ask for more than that, and it refuses. It doesn't tell me what that amount is, inform me how much money is in my account, or give me the opportunity to key in a new, lower amount.

Instead, it spits out my card and leaves me to try the whole process again from scratch, no wiser than I was a moment ago, as the line of people growing behind me shifts, shuffles, and sighs. The ATM is correct and factual, but it is no help whatsoever.

The ATM has rules that must be followed, and I am quite willing to follow them, but it is unreasonably computer-like to fail to inform me of them, give me contradictory indications, and then summarily punish me for innocently transgressing them. This behavior—so typical of computers—is not intrinsic to them. Actually, nothing is intrinsic to computers: They merely act on behalf of their software, the program.

And programs are as malleable as human speech. A person can speak rudely or politely, helpfully or sullenly. It is as simple for a computer to behave with respect and courtesy as it is for a human to speak that way. All it takes is for someone to describe how. Unfortunately, programmers aren't very good at teaching that to computers.

Computers Make It Easy to Get into Trouble

Computers that sit on a desk simply behave in the same, irritating way computers always have, and they don't have to be crossed with anything.

My friend Jane used to work in public relations as an account coordinator. Her job had her writing memos and contracts in Microsoft Word on Windows 95. The core of Windows 95 is the hierarchical file system. All of Jane's documents were stored in little folders, which were stored in other little folders.

Jane didn't understand this or see the advantage to storing things that way.


Actually, Jane didn't give it a lot of thought but merely took the path of least resistance. Jane had just finished drafting the new PR contract for a Silicon Valley startup company. She selected Close from the File menu.

Instead of simply doing as she directed and closing the document, Word popped up a dialog box asking whether she wanted to save her changes. She responded—as always—by pressing the Enter key. She responded this way so consistently and often that she no longer even looked at the dialog box.


The first dialog box was followed immediately by another one, the equally familiar Save As box. It presented Jane with lots of confusing buttons, icons, and text fields.

The only one that Jane understood and used was the text-entry field for File Name. She typed in a likely name and then clicked the Save button. The program then saved the PR contract in the My Documents folder. Jane was so used to this unnecessary drill that she gave it no thought.

At lunchtime, while Jane was out of her office, Sunil, the company's computer tech, installed a new version of VirusKiller on her computer, and in the process used Word to view a Readme file. After viewing the file, Sunil closed it and returned Jane's computer to exactly the way it was before lunch. At least, he thought he did. After lunch, Jane needed to reopen the PR contract and get a printout to show to her boss. Jane selected Open from the File menu, and the Open dialog box appeared. Jane expected the Open dialog box to show her, in neat alphabetic order, all of her contracts and documents.

Instead, it showed her a bunch of filenames that she had never seen before and didn't recognize. One of them was named Readme. Of course, when Sunil used Word to view the Readme file, he instructed Jane's copy of Word to look in an obscure folder six levels deep and inadvertently steered it away from Jane's normal setting of My Documents. Jane was now quite bewildered. Her first, unavoidable thought was that all of her hard work had somehow been erased, and she got very worried.

Finally, in a state approaching panic, Jane telephoned Sunil to ask for his help. Sunil was not at his desk, and it wasn't until Monday morning that he had a chance to stop by and set things right.

Although computer operating systems need hierarchical file systems, the people who use them don't. It's not surprising that computer programmers like to see the underlying hierarchical file systems, but it is equally unremarkable that normal users like Jane don't. Unremarkable to everyone, that is, except the programmers who create the software that we all use. Jane's frustration and inefficiency are blamed on Jane, and not on the programmers who torpedoed her. At least Jane has a job.

Many people are considered insufficiently "computer literate" and are thus not employable. As more and more jobs demand interaction with computers, the rift between the employable and the unemployable becomes wider and more difficult to cross.

Politicians may demand jobs for the underprivileged, but if the underprivileged don't know how to use computers, no company can afford to let them put their untrained hands on the company's computers. There is too much training involved, and too much exposure to the destruction of data and the bollixing up of priceless databases.

The obnoxious behavior and obscure interaction that software-based products exhibit is institutionalizing what I call "software apartheid": Otherwise-normal people are forbidden from entering the job market and participating in society because they cannot use computers effectively.

In our enlightened society, social activists are working hard to break down race and class barriers while technologists are hard at work inadvertently erecting new, bigger ones.

By purposefully designing our software-based products to be more human and forgiving, we can automatically make them more inclusive, more class- and color-blind.

Commercial Software Suffers, Too

Not only are computers taking over the cockpits of jet airliners, but they are also taking over the passenger cabin, behaving in that same obstinate, perverse way that is so easy to recognize and so hard to use.

Modern jet planes have in-flight entertainment (IFE) systems that deliver movies and music to passengers. Advanced IFE systems are generally installed only on larger airplanes flying transoceanic routes.

One airline's IFE system was so frustrating for the flight attendants to use that many of them were bidding to fly shorter, local routes to avoid having to learn and use the difficult systems. This is remarkable, considering that the time-honored airline route-bidding process is based on seniority, and that those same long-distance routes have always been considered the most desirable plums because of their lengthy layovers in exotic locales such as Singapore or Paris.

For flight attendants to bid for unglamorous, unromantic yo-yo flights from Denver to Dallas or from Los Angeles to San Francisco just to avoid the IFE system indicated a serious morale problem. Any airline that inflicted bad tools on its most prized employees—the ones who spent the most time with the customer—was making a foolish decision and profligately discarding money, customer loyalty, and staff loyalty.

The computerized IFE system that another large airline created was even worse. It linked movie delivery with the cash-collection function. In a sealed jet airplane flying at 37,000 feet, cash-collection procedures had typically been quite laissez-faire; after all, nobody was going to sneak out the back door.

Flight attendants delivered goods and services when it was convenient and collected later when their hands weren't full and other passengers weren't waiting for something.

This kept them from running unnecessarily up and down the narrow aisles. Sure, there were occasional errors, but never more than a few dollars were involved, and the system was quite human and forgiving; everyone was happy and the work was not oppressive. With cash collection connected to content delivery by computer, the flight attendant had to first get the cash from the passenger, then walk all the way to the head end of the cabin, where the attendant's console was, enter an attendant password, then perform a cash-register-like transaction.

Only when that transaction was completed could the passenger actually view a movie or listen to music. This inane product design forced the flight attendants to walk up and down those narrow aisles hundreds of extra times during a typical trip. Out of sheer frustration, the flight attendants would trip the circuit breaker on the IFE system at the beginning of each long flight, shortly after departure.

The airline had spent millions of dollars constructing a system so obnoxious that its users deliberately turned it off to avoid interacting with it. The thousands of bored passengers were merely innocent victims. And this happened on long, overseas trips typically packed with much-sought-after frequent flyers. I cannot put a dollar figure on the expense this caused the airline, but I can say with conviction that it was catastrophically expensive.

The software inside the IFE systems worked with flawless precision but was a resounding failure because it misbehaved with its human keepers.


How could a company fail to predict this sad result? How could it fail to see the connection? The goal of this book is to answer these questions and to show you how to avoid such high-tech debacles.

In September 1997, while conducting fleet maneuvers in the Atlantic, the USS Yorktown, one of the Navy's new Aegis guided-missile cruisers, stopped dead in the water. A Navy technician, while calibrating an on-board fuel valve, entered a zero into one of the shipboard management computers, a Pentium Pro running Windows NT.

The program attempted to divide another number by that zero—a mathematically undefined operation—which resulted in a complete crash of the entire shipboard control system.

Without the computers, the engine halted and the ship sat wallowing in the swells for two hours and 45 minutes until it could be towed into port.

Good thing it wasn't in a war zone. What do you get when you cross a computer with a warship? Admiral Nimitz is rolling in his grave! Despite this setback, the Navy is committed to computerizing all of its ships because of the manpower cost savings. To deflect criticism of this plan, it blamed the "incident" on human error. Because the software-creation process is out of control, the high-tech industry must bring its process to heel, or else it will continue to put the blame on ordinary users while ever-bigger machines sit dead in the water.
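The Yorktown's actual code has never been published, but the class of bug is well known, and so is the guard against it. A minimal sketch, with invented names:

    # Illustrative only; the function names and fields are invented.
    def fuel_flow(total_fuel, valve_reading):
        return total_fuel / valve_reading    # ZeroDivisionError on a keyed-in 0

    def fuel_flow_guarded(total_fuel, valve_reading):
        if valve_reading == 0:
            # Reject bad data at the boundary instead of crashing downstream.
            raise ValueError("valve reading must be nonzero; re-enter calibration")
        return total_fuel / valve_reading

    print(fuel_flow_guarded(1200.0, 3.5))    # normal operation
    # fuel_flow(1200.0, 0.0) would crash; the guarded version raises a clear error

A rejected entry costs the technician a retyped number; an unguarded one, on this account, cost the Navy a ship wallowing in the swells for nearly three hours.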

Techno-Rage

An article in the Wall Street Journal once described an anonymous video clip circulated widely by email that showed a "[m]ustachioed Everyman in a short-sleeved shirt hunched over a computer terminal, looking puzzled. Suddenly, he strikes the side of his monitor in frustration.

As a curious co-worker peers over his cubicle, the man slams the keyboard into the monitor, knocking it to the floor. Rising from his chair, he goes after the fallen monitor with a final, ferocious kick." The man in the video may well be an actor, but he touches a widespread, sympathetic chord in our business world. The frustration that difficult and unpleasant software-based products are bringing to our lives is rising rapidly.

Joke emails circulate on private email lists about "Computer Tourette's." The joke is that you can walk down the halls of most modern office buildings and hear otherwise-normal people sitting in front of their monitors, jaws clenched, swearing repeatedly in a rictus of tense fury. Who knows what triggered such an outburst? Maybe the program just blandly erased the user's only copy of a long manuscript because he responded with a Yes to a confirmation dialog box, assuming that it had asked if he wanted to "save your changes" when it actually asked him if he wanted to "discard your work."

Cognitive Friction

It's one thing to see that a problem exists, but it's quite another to devise a solution. One key part of problem solving is the language we use. Over the years, I've developed many useful terms and mental models. They have proven vital to framing the problem presented by hard-to-use software-based products.

In this chapter I will introduce those terms and ideas, showing how they can help bring the benefits of interaction design to our troubled process.

Behavior Unconnected to Physical Forces

Having just left the industrial age behind, we are standing at the threshold of the information age with an obsolete set of tools.

In the industrial age, engineers were able to solve each new problem placed before them. Working in steel and concrete, they made bridges, cars, skyscrapers, and moon rockets that worked well and satisfied their human users.

As we tiptoe into the information age, we are working increasingly in software, and we have once again brought our best engineers to the task. But unlike in the past, things haven't turned out so well. The computer boxes are fast and powerful, and the programs are generally reliable, but we have encountered a previously unseen dimension of frustrated, dissatisfied, unhappy, and unproductive users. Today's engineers are no less capable than ever, so I must deduce from this that, for the first time, they have encountered a problem qualitatively different from any they confronted in the industrial age.

Otherwise, their old tools would work as well as they ever did. For lack of a better term, I have labeled this new problem substance cognitive friction. It is the resistance encountered by a human intellect when it engages with a complex system of rules that change as the problem changes. Software interaction is very high in cognitive friction.

Interaction with physical devices, however complex, tends to be low in cognitive friction because mechanical devices tend to stay in a narrow range of states comparable to their inputs. Playing a violin is extremely difficult but low in cognitive friction because—although a violinist manipulates it in very complex and sophisticated ways—the violin never enters a "meta" state in which various inputs make it sound like a tuba or a bell.

The violin's behavior is always predictable—though complex—and obeys physical laws, even while being quite difficult to control. In contrast, a microwave oven has a lot of cognitive friction, because the 10 number keys on the control panel can be put into one of two contexts, or modes. In one mode they control the intensity of the radiation, and in the other they control the duration. This dramatic change, along with the lack of sensory feedback about the oven's changed state, results in high cognitive friction.
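A sketch of that modal keypad (hypothetical, not any real oven's firmware) makes the problem concrete: the same keystroke means two different things, and nothing the user can see or feel indicates which.

    # Hypothetical modal keypad; modes and behavior as described above.
    class Microwave:
        def __init__(self):
            self.mode = "duration"           # hidden state the panel never shows

        def set_mode(self, mode):
            self.mode = mode                 # "duration" or "power"

        def press_digit(self, d):
            # The same physical keystroke, interpreted by invisible context.
            print("digit", d, "taken as", self.mode)

    oven = Microwave()
    oven.press_digit(5)                      # taken as duration: five minutes
    oven.set_mode("power")
    oven.press_digit(5)                      # same keystroke: power level 5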

When you press the E key on a typewriter, the letter E appears on the page. On a computer—depending on the context—you may also get a metafunction. The behavior of the machine no longer has a one-to-one correspondence to your manipulation.

Cognitive friction—like friction in the physical world—is not necessarily a bad thing in small quantities, but as it builds up, its negative effects grow exponentially. Of course, friction is a physical force and can be detected and measured, whereas cognitive friction is a forensic tool and cannot be taken literally. Don't forget, though, that such things as love, ambition, courage, fear, and truth—though real—cannot be detected and measured.

They can't be addressed by engineering methods, either. The skilled engineers who manufacture microwave ovens typically consult with human-factors experts to design the buttons so they are easy to see and press. But the human-factors experts are merely adapting the buttons to the user's eyes and fingers, not to their minds.

Consequently, microwave ovens don't have much "friction" but have a lot of cognitive friction. It is easy to open and close the door and physically press the buttons but, compared to the simplicity of the task, setting the controls to achieve your goals is very difficult. Getting the microwave to perform the work you intend for it is quite difficult, though our general familiarity with it makes us forget how hard it really is.

How many of us have cooked something for one second or one hour instead of for one minute? How many of us have cooked something at a strength of 5 for 10 minutes instead of a strength of 10 for 5 minutes?

On the computer screen, everything is filled with cognitive friction. Even an interface as simple as the World Wide Web presents the user with a more intense mental engagement than any physical machine.

This happens because each blue hyperlink is a doorway to some other place on the Web. All you can do is click on a hyperlink, but what the link points to can change independently of the pointer without any outward indication.

Its sole function is pure metafunction. The very "hyper"ness is what gives it cognitive friction.

How We React to Cognitive Friction

Most people, even apologists, react to cognitive friction in the same way. They take the minimum they need from it and ignore the rest. Each user learns the smallest set of features that he needs to get his work done, and he abandons the rest.

The apologists proudly point out that their wristwatches can synchronize with their desktop calendar systems, but they conveniently neglect to mention that it has been six months since they used that feature. They will get defensive about it if you press them on the point, but that is what makes them apologists. My home-entertainment system has literally thousands of features. I'm not an apologist, but I certainly qualify as a gadget freak.

I have learned how to use some of its gratuitous features, but they are too hard to use effectively. For example, my television has a feature called "picture-in-picture" (PIP). It superimposes a second, smaller screen showing another channel in the lower-right corner of the main screen.

It is all done in software and can be completely controlled by buttons on the remote control. In theory, it is useful for such circumstances as keeping an eye on the football game in the PIP screen while I'm watching a movie on the main screen.

When the salesperson demonstrated it to me in the electronics showroom, it seemed quite useful. The problem is that it is just too difficult to control; there is too much cognitive friction involved in using it, and I cannot master it sufficiently well to make it worth the effort. It's just more enjoyable to watch one channel, as in the old days when one channel was all that the technology could deliver. Nobody else in my family has bothered to use the PIP facility even once, except by accident, and I occasionally come home to find someone watching TV with a PIP screen up.

As soon as I walk in the room, he or she asks me to turn it off. My TV has a 55'' screen and a Dolby sound system, and it receives a digital signal from an orbiting satellite, but otherwise my family members and I use it in exactly the same way we used our snowy, tinny, 19'' Motorola decades ago. All of those features go unused.

You can predict which features in any new technology will be used and which won't. The use of a feature is inversely proportional to the amount of interaction needed to control it. The satellite system is a very desirable dancing bear of a feature, so I put up with the complexity of source-signal switching to watch the satellite broadcast once a week or so.

Nobody else in my family was able to figure out how to view the satellite until I created a plastic-laminated cheat sheet that sits on the coffee table with a checklist of switches, buttons, and settings that must be made to connect up.

The PIP system not only uses a complex system of over a dozen buttons, but its interaction is very obscure and its behavior is unpleasant. After the first couple of tries, I abandoned it completely, as has everyone else. This pattern of cognitive friction abandonment can be found in every office or household with every software-based product.

The Democratization of Consumer Power

Traditionally, the more complex a mechanical device was, the more highly trained its operators were.

Big machinery was always isolated from the public and was operated by trained professionals in uniform. The information age changed everything, and we now expect amateurs to manage technology far more complex than our parents ever faced.

As more and more of our tools and systems get silicon brains, they are placed into the hands of untrained amateurs. Twenty-five years ago, trained human operators handled long-distance phone calls at our verbal request. Today, the most complex international calls are handled directly by any untrained amateur pushing buttons. Just a couple of decades ago, even gas pumps were operated only by trained service-station attendants.

Today, every individual is expected to be able to perform the gas-pumping transaction, as well as the associated financial transaction, using a credit or debit card. Twenty years ago, only trained tellers operated banks. Today, you operate your bank by using a gas pump or ATM.

The engineering process doesn't distinguish between the creation of a complex system that will be operated by a trained, paid professional and the creation of one that is to be operated by an indifferent amateur. The process of engineering doesn't have concepts to deal with that human stuff. It concentrates on the implementation issues: What is it made of? How will it be constructed? What controls will be needed to give input to all possible variables?

Blaming the User

Most software is used in a business context, so most victims of bad interaction are paid for their suffering. Their job forces them to use software, so they cannot choose not to use it—they can only tolerate it as well as they can.

They are forced to submerge their frustration and to ignore the embarrassment they feel when the software makes them feel stupid.

For years, I've watched as dozens of software-industry executives have drawn on their whiteboards for me essentially the same diagram showing their view of the high-tech marketplace. It shows a pyramid—some draw it inverted—that is divided into three horizontal layers, each with an innocent-sounding phrase as a label.

Each executive superimposes an amorphous blob on it showing the portion of the market they are aiming to conquer. But each label is a euphemism—really a veiled slur, like a code phrase you'd hear a bigot use to keep someone out of the country club.

It is the bad design of the interaction that is at fault. Why would a vendor write off the lion's share of the market? Because it removes the blame for failure from the executives and software engineers and places it squarely onto the shoulders of the innocent users. The phrase "computer-literate user" really means the person has been hurt so many times that the scar tissue is thick enough that he no longer feels the pain.

Computer literacy means that when your program loses your document, you have learned enough not to panic like Jane in Chapter 1, "Riddles for the Information Age," but to begin the slow, manual, utterly unnecessary search for it in the hierarchical file system without complaint. One characteristic of computer literacy is that it is like anesthesia: The patient drifts slowly and gently into unconsciousness.

There is little point in constantly whining and complaining about a piece of software that is a fixed and permanent part of your job.

Most people don't even realize how hard they are working to compensate for the shortcomings of a software-based tool. Most apologists consider computer literacy to be a badge of accomplishment, like a Sharpshooter's Medal. Actually, it is more akin to a Purple Heart, an official recognition of having suffered a wound in battle. Power users are simply apologists.

They are techno-enthusiasts who have sufficiently overcome their better instincts to be useful consumers of high-cognitive-friction products. They take pride in the challenge, as they might in the challenge of scaling a rock wall in Yosemite.

Software Apartheid

There's an old joke in Hollywood that you can bump into a stranger in the grocery store and ask how his screenplay is doing. The stranger—without hesitation—will reply, "Great!

I've just restructured the second act to tighten up the action!" You can buttonhole a stranger in line at Starbucks and ask how her Web site is doing. The stranger—without skipping a beat—will reply, "Great! I've just restructured the frames to tighten up the navigation!"

The average person who uses a software-based product around here isn't really very average. Programmers generally work in high-tech environments, surrounded by their technical peers in enclaves such as Silicon Valley; Route 128 outside Boston; Research Triangle in North Carolina; Redmond, Washington; and Austin, Texas. Software engineers constantly encounter their peers when they shop, dine out, take their kids to school, and relax, and their contact with frustrated computer users is limited.

What's more, the occasional unfocused gripes of the users are offset by the frequent enthusiasm of the knowledgeable elite. We forget how far removed we and our peers are from the inability of the rest of the country (not to mention the world) to use interactive tools without frustration.

We industry insiders toss around the term "computer literacy," assuming that in order to use computers, people must acquire some fundamental level of training. We see this as a simple demand that isn't hard and is only right and proper. We imagine that it isn't much to ask of users that they grasp the rudiments of how the machines work in order to enjoy their benefits. But it is too much to ask.

Having a computer- literate customer base makes the development process much easier—of that there can be no doubt—but it hampers the growth and success of the industry and of society. If cars weren't so deadly, people would train themselves to drive the same way they learn Excel.

The concept of computer literacy has another, more insidious, effect. It creates a demarcation line between the haves and have-nots in society. If you must master a computer in order to succeed in America's job market beyond a burger-flipper's career, then mastering the interactive system's difficulty prevents many people from moving into more productive, respected, and better-paying jobs.

Users should not have to acquire computer literacy to use computers for common, rudimentary tasks in everyday life. Users should not have to possess a digital sensitivity to work their VCR or microwave oven, or to get email. What's more, users should not have to acquire computer literacy to use computers for enterprise applications, when the user is already trained in the application domain.

An accountant, for example, who is trained in the general principles of accounting, shouldn't have to be computer literate to use a computer in her accounting practice. Her domain knowledge should be enough to see her through. As our economy shifts more and more onto an information basis, we are inadvertently creating a divided society. The upper class is composed of those who have mastered the nuances of differentiating between "RAM" and "hard disk. The irony is that the difference really is inconsequential to anyone except a few hard-core engineers.

Yet virtually all contemporary software forces its users to confront a file system, where your success is fully dependent on knowing the difference between RAM and disk. Thus the term "computer literacy" becomes a euphemism for social and economic apartheid. Computer literacy is a key phrase that brutally bifurcates our society.

But what about people who are not inclined to pander to technocrats and who cannot or will not become computer literate? These people, many by choice, but most by circumstance, are falling behind in the information revolution. Many high-tech companies, for example, won't even consider for employment any applicant who does not have an email address or whose resume isn't online.

I'm sure that there are many otherwise-qualified candidates out there who can't get hired because they are not yet wired. Despite the claims of the apologists, using email effectively is difficult and involves a significant level of computer literacy.

Therefore, it artificially segregates the workforce. It is the moral equivalent of the banking technique of "redlining," in which a bank draws red lines on a map around neighborhoods where it will not lend. Although the red lines are ostensibly drawn around economic contours, they tend to follow racial lines all too closely.

Bankers protest that they are not racists, but the effect is the same. When programmers speak of "computer literacy," they are drawing red lines around ethnic groups, too, yet few have pointed this out. It is too hard to see what is really happening because the issue is obscured by technical mythology.

It is easy to see—regardless of how true—that a banker can make a loan on one house as easily as on another. However, it is not easy to see that a programmer can make interactive products easy enough for people from lower socioeconomic backgrounds to use. As an industry, we are largely in denial about the problem of usable interactive products. There are too many apologists shouting about dancing bears.

Their histrionics drown out our doubts about the efficacy of our software-based products. Before we begin to look for solutions, we must collectively come to our senses about the scope and severity of the problem. This is the goal of the next section.

Design Is a Big Word

The theme of this book is that interactive products need to be designed by interaction designers instead of by software engineers.

This assertion often generates instant antagonism from programmers who have been doing design all along. Furthermore, these programmers fear that by taking design away from them, I'm taking away the best and most creative aspect of their work, leaving them condemned to coding drudgery unleavened with fun.

This is absolutely untrue. Their worry stems only from the imprecise nature of the term design. The entire software-creation process includes design, all the way from selecting the programming language to choosing the color of the delivery truck. No aspect of this lengthy and involved process is more design-filled than the programming itself.

Programmers make design decisions at every step of their process. The programmer must decide how each procedure will call each other procedure, how information and status will be shared, stored, and changed, and how the code's validity will be guaranteed. All of these decisions—and the millions more like them—are design decisions, and the success of each one depends on the programmer's ability to bring her experience and judgment to bear.

I draw a simple dividing line through this sea of design. I put the part of the design that will directly affect the ultimate end user of the product on one side.

On the other side is all other design. In this book, when I speak of "interaction design," I am referring only to the former. I call the remaining design that doesn't affect the end user program design. It is not possible to base the dividing line on purely technical criteria. It cannot be expressed in terms that are familiar to engineers because the differentiating factor is human, not technical, and engineering rules aren't applicable to people.

For example, the interaction designer typically is agnostic about issues such as which programming language is to be used.

However, occasionally the choice of language affects response time, which most assuredly is an interaction issue, and the designer will have something to say.

Almost all interaction design refers to the selection of behavior, function, and information and their presentation to users. End-product interaction design is the only part of the design that I want to take away from programmers and put into the hands of dedicated interaction designers.

The Relationship Between Programmers and Designers

In a technical world dominated by engineers, internal program design has held sway, and interaction design for the end user's benefit has been incorporated only on an after-the-fact, spare-time basis. One of the goals of this book is to reveal the benefits of inverting this priority and making interaction design the first consideration in the creation of software-based products.

Most Software Is Designed by Accident

Mud huts and subterranean burrows are designed—albeit without much conscious thought—by the demands of rock and thatch.

Similarly, all software is designed by the arcane demands of programming languages and databases. Tradition is the strongest influence in the design of all of these media. The biggest difference is that the builder-designer of the hut will also be its primary occupant, whereas programmers typically don't use the software they design. What really happens in most programming shops is that there is no one on staff who has a clue about designing for end users.

However, these same clueless people are far from clueless about program design, and they have strong opinions about what they like, personally. So they do what they do, designing the interaction for themselves, subject to what is easiest and most enjoyable to code, and imagine that they are actually designing for users.

While it seems to the programmer that lots of design is getting done, it is only lots of program design, and very little end-user design. Because the lack of design is a form of design, whenever anyone makes decisions about program behavior, he is assuming the role of interaction designer.

When a marketing executive insists that a favorite feature be included in the product, she is designing. When a programmer implements a pet behavior in the product, he is designing.

The difference between good design and this kind of inadvertent, mud-hut design isn't so much the tools used or the type of gizmos, but the motivation. The real interaction designer's decisions are based on what the user is trying to achieve.

Ersatz designers' decisions are based on any number of other random rationales. Personal preferences, familiarity, fear of the unknown, directives from Microsoft, and miscues from colleagues all play a surprisingly large role.

Most often, though, their decisions are based on what is easiest for them to create.

The term "interface design" is itself part of the problem: It implies that only the interface is answerable to the users' needs.

The consequence of isolating design at the interface level is that it licenses programmers to believe that behavior can be built first and prettied up afterward. Like putting an Armani suit on Attila the Hun, interface design only tells how to dress up an existing behavior. For example, in a data-reporting tool, interface design would eliminate unnecessary borders and other visual clutter from a table of figures, color code important points, provide rich visual feedback when the user clicks on data elements, and so on.

This is better than nothing, but far from sufficient. Microsoft invests many millions of dollars in interface design, but its products remain universally unloved.

Behavioral design tells how the elements of the software should act and communicate. In our example, behavioral design tells us what tools you could apply to that table of figures, and how you might include averages or totals.

Interaction designers also work from the outside in, starting from the goals the user is trying to achieve, with an eye toward the broader goals of the business, the capabilities of the technology, and the component tasks. You can go still deeper to what we call conceptual design, which considers what is valuable for the users in the first place.

In our example, conceptual design might tell you that examining a table of figures is only an incidental task; the users' real goal is spotting trends, which means that you don't want to create a reporting tool at all, but a trend-spotting tool. To deliver both power and pleasure to users, interaction designers think first conceptually, then in terms of behavior, and last in terms of interface.

Why Software-Based Products Are Different

Cognitive friction creeps into all software-based products, regardless of their simplicity, and cognitive friction makes them much more difficult to use than equivalent mechanical-age products.

As an example, consider two objects from my pants pocket: my Swiss Army knife and the remote keyless entry device for my car. The knife is pure industrial age: You can see how it is built, how it works, and how to work it just by a cursory inspection—by manipulation.

When you flip open the knife blade, you can see that it is sharp, and you can imagine the power it has for cutting. The knife has a grand total of six blades, plus a toothpick and tweezers. The use of all of them is readily apparent. I can easily and intuitively discern how to manipulate the knife because of the way it fits my hand and fingers.

The knife is a pleasure to use. The keyless entry system accompanying my car keys is a different beast altogether. It only has two push buttons on it, so—from a manipulation point of view—it is much simpler than the knife. As soon as my hand grips the smooth, black-plastic case, my fingers naturally and intuitively discover the two push buttons, and their use is obvious: Press to activate.

Ah, but there is silicon, not steel, behind those buttons, and they are far harder to work than they seem.

The large button locks the car and simultaneously arms the alarm. There is also a second, smaller button labeled Panic. When you press it, the car emits a quiet warble for a few seconds. If you hold it down longer, the quiet warble is replaced by the full decibel blasting of the car alarm, whooping, tweeting, yowling, and declaring to everyone within a half-mile that some dolt—me—has just done something execrably stupid.

What's worse, after the alarm has been triggered, the little plastic device becomes functionally inert, and further pressing of either button does nothing. The only way to stop that honking announcement of my palpable stupidity is to walk to my frighteningly loud car, enduring withering stares from passersby, unlock the driver's door with the key, then insert the key into the ignition and twist it. It really makes me feel like an idiot. If my car merely got robbed it would make me feel violated and sad, but it wouldn't make me feel stupid.

In my previous book, I stated that the number-one goal of all computer users is to not feel stupid. I further asserted that good interfaces should avoid presenting users with ejection-seat levers intermingled with the controls for common, everyday functions.

Here is a classic example of a device that really makes users feel stupid by putting an ejector-seat lever right up front. Accidentally setting off the ejector-seat lever initiates a personally embarrassing episode tantamount to showing up at the office having forgotten your pants.

My Swiss Army knife just doesn't have the capability of doing that. Not only can I not imagine a reason why any person would want to use either of the functions on the second button, but I question why the makers of the control didn't take advantage of the golden opportunity to provide me with functions that are desirable and useful. Much to my surprise, I recently read in the Wall Street Journal about a bona fide use for the Panic button.

A family was camping in Yosemite National Park, and a wild bear began trashing their car in an attempt to get at the food locked within. The mother pressed the Panic button, and the alarm eventually discouraged the bear. Maybe that little button should be labeled "Bear Repellent."

When I pop into the local Starbucks for some coffee, I don't need the level of protection that I need at, say, the airport.

I would really like to have the ability to lock and unlock my car from the remote without involving the alarm system. This would be quite useful when I'm just driving to local shops or dropping my kids off at school.

Another quite useful and desirable feature would be an option to support an even more secure locking system. Occasionally, when I return to my previously locked car, I find that it has become unlocked in my absence. This happens when someone with a similar car made by the same manufacturer parks near my car. When that person presses the button to lock his car, it also gives the signal to unlock mine, disarming the alarm, and opening up my car to the depredations of any passing sociopath.

This scenario is most disturbing because it arises in precisely the situation where it is most likely to occur: wherever similar cars from the same manufacturer park side by side. It sure would be a useful application of the technology if I could lock and arm my car in such a way that I could unlock and disarm it only by personal application of the metal key in the door. Obviously, I know that the technology exists to do this, because that is how the alarm itself is turned off after it is triggered.
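Why can a stranger's button open my car at all? Early fixed-code remotes drew each car's code from a small code space, so collisions between cars were inevitable. The sketch below is hypothetical and deliberately simplified (rolling-code remotes were later invented precisely to close this hole), but it shows the mechanism:

    # Hypothetical fixed-code remote; the tiny code space forces collisions.
    import random

    CODE_SPACE = 2 ** 8                      # absurdly small, for illustration

    class Car:
        def __init__(self):
            self.code = random.randrange(CODE_SPACE)   # fixed at the factory
            self.locked = True

        def hear(self, broadcast):
            if broadcast == self.code:
                self.locked = not self.locked          # toggle lock and alarm

    my_car, neighbors_car = Car(), Car()
    neighbors_car.code = my_car.code         # the unlucky collision
    for car in (my_car, neighbors_car):
        car.hear(neighbors_car.code)         # neighbor presses his remote
    print(my_car.locked)                     # False: my car just unlocked itself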

Unfortunately, the designers of the system made certain that regardless of how I lock the car, anyone's big button can unlock it.

The Swiss Army knife is complex and packed with features, some hidden quite cleverly, yet learning and using it is simple, predictable, and intuitive. Using the keyless entry system is difficult, problematic, and capable of instantly embarrassing me. In short, the interaction with the system sucks.

It is plain old bad, and I hate it.

The Dancing Bear

On the other hand, if you made me choose between my knife and my keyless system, I'd toss away the knife in a New York minute. Immediately after first using my keyless entry system, I couldn't imagine ever not owning one. It is the single most convenient feature of my car, and I use it more often than any other one. I use it 10 times to every 1 time I use the knife.
