“Explanation, when regarded as the only goal of inquiry, becomes a substitute for understanding. Imperceptibly it becomes the beginning rather than the end of perception.” — Abraham Joshua Heschel
Here is a list of random thoughts that have occurred to me while reading Inventing the Individual:
- When you realize how deep the connection was between family and land, and what it meant in the ancient world to lose these fundamental connections, the history of the Jewish people becomes both more familiar and more miraculous.
- The history of humankind could be told as a story of evolving relationships between immediate and transcendent realities expressed in terms of interpretations given to these relationships.
- I’ve been drawing my asterisk symbol for close to a decade now. I think I have been wrong about the nature of the subject present at the nexus. The subject is rarely singular.
- Slavery in pre-individual times might have a meaning inaccessible to the imagination of an individual.
- “Now is a small town.” Yes, it is. These days historical cosmopolitanism is damn near nonexistent.
This old post warrants an edited re-post:
A person’s attitude toward science tells us much more about his attitudes toward his fellow human beings than it does about his attitudes toward nature.
Science is a multi-generational collaborative unfinished accomplishment of the most intelligent, inventive, scrupulous and industrious people humankind has ever produced.
To place one’s own gut feelings about truth at the same level as the knowledge produced by science, or to refuse to understand and contend with science’s accounts when they conflict with one’s own sense of reality — this violates two of the highest laws of reason, which might as well be one and the same: 1) respect reality with all your mind, heart and effort, and 2) respect your neighbor’s truth as you respect your own.
From Adam Gopnik’s “The Porcupine: A pilgrimage to Popper”:
In the real world, as Popper knew perfectly well, the response of the scientist who has proposed that all swans are white when a black swan appears is not to say, cheerfully, “Wrong again!” It is to say, “You call that a swan?” The principle of falsification would begin an argument rather than prove a point. But the argument was the point. The argument that the black swan would produce—an argument about what evidence was crucial, and why—was different from all other kinds of argument. Science wasn’t a form of proof. It was a style of quarreling. The reason science gave you sure knowledge you could count on was that it wasn’t sure and you couldn’t count on it. Science wasn’t the name for knowledge that had been proved true; it was the name for guesses that could be proved false.
I am not one of those people who see service design as the grand catch-all for multi-touchpoint, multi-/omni-channel experiences.
I feel the same way about “service” as I did in the early aughts about the term “user”. These words imply relationships between what is designed and the person for whom it is designed. Designing for the wrong relationship means misframing the design problem. “User” implies a tool relationship. Users use things as a means to accomplish something. Of course we can apply the word ‘use’ broadly and see a movie as something an audience uses for entertainment or a concierge as something a visitor uses to get local information, but this breadth is purchased at the cost of consequential subtleties. What we need and expect from a word processor is different from what we expect from a concert or a bank. Discovering exactly what those needs and expectations are, and developing satisfactory resolutions of them, calls for different methods. The mistakes UX designers have historically made were often tied up with insufficient sensitivity to these distinctions. The same is true of “services”. We can reduce a drill to one component in a hole-making service that spans a journey from discovering a need all the way to resolving it, and, yes, much is gained from seeing it this way, but if we are not careful, important distinctions can be lost.
And in fact I do believe certain things are currently being lost by this framing. Software as a service (aka cloud computing) has changed norms around how software is supposed to behave. We are now accustomed to thinking of web-based software as something that belongs to someone else that we are licensed to make use of. A decade ago, users were more likely to perceive software as tools to own, learn and eventually master. Upgrading was a purchase decision resembling the decision to replace a pen or a hammer with an improved model — not a periodic change that just happens and requires us to adapt.
This seems mostly OK in many cases, especially where tools serve as front ends to services, for instance banking and accounting, or databases. But for software tools used for making things — word processing, image editing, ideating, music creation, even blogging — changes, especially subtle ones, distract from the tool’s purpose, which is to be an invisible extension of a user’s abilities. It is important that such tools be utterly predictable, controllable and unobtrusive so the user can exercise mastery over the tool and keep complete focus on what is being produced. I am concerned that software designers have lost all awareness of this goal. They are focused on different problems.
Years ago I was struck by the elegance of James Spradley’s research method typology, which defines methods not by technique but by the role played by the research informant. Surveys are performed with respondents, tests with subjects and ethnography with informants. I think a similar approach could be helpful for classifying design methods. Perhaps we could gain clarity by paying less attention to medium or channel of delivery and more attention to the kind of relationship we are trying to develop through our design between the designed thing and the people for whom it is intended.
A design trend that disturbs me intensely: obtrusive conveniences.
What makes these conveniences obtrusive is that they make it incredibly inconvenient to refuse what they offer, and you end up fighting for control over what you are attempting to do.
An example that is driving me away from iOS is text selection. Instead of giving the user direct character-level control over selection, iOS tries to divine the user’s intention. Is the user selecting a single character, a word, or a text block? It never gets it right, and the effect is one of fighting for control.
Autocorrect also blows it constantly. If you use unusual words, it changes them to common ones for you. It is like one of those idiots who insist on finishing your sentences, never suspecting that you might be saying something they don’t already know. I can’t believe Jony Ive hasn’t done something about how much effort it takes to type his name against the digital will of the devices he’s made.
And these behaviors are not even bad in a consistent way across apps. Now a new breed of “creative” coder has entered the scene who feels he can improve “the experience” by adding his own innovative flourishes to text editing. Now every editor you use has different behaviors around selection, spell checking, formatting, etc. Sadly, the more powerful HTML becomes, and the more empowered designers and developers are, the more inconsistent the overall OS platform user experience becomes. “Learn once, use often” has been replaced with the utter chaos of second-rate ingenuity. The very editor I am using now (WordPress) is one of the worst offenders.
And don’t even get me started on autocomplete. When everything is optimal — the device is running smoothly, the internet connection is fast, and the user is typing accurately — autocomplete is great. But things are rarely optimal, so what actually happens is painful delays between keystroke and result, leading to mistyping, leading to attempts to delete and correct, with missed keystrokes and that same desire to escape being helped so ineptly.
Behind it all are philosophical principles which I can feel palpably in these interactions. Most fundamentally, there is no awareness that this product is one element of much larger experiences. First, there is the experience of using the device, something few developers consider anymore. Then there is the experience of trying to get something done. And of course, there is the experience with organizations over time. Human-centered designers think about these overlapping contexts and design with them in mind, but in recent years companies have come to the opinion that iterative trial and error with ludicrously short development cycles that leave little or no time for testing will get them to a great product faster than being thoughtful or thorough. In all of this I detect a relapse, away from empathic discipline (thinking subjectively in terms of experiences) back into obsessive making of objects (which are still called “experiences” by people who like the idealistic tone of the term and the mouthfeel of X). But what bothers me more is a sense that these corralling conveniences are fine for most people, who don’t really need control, and who are happy to say and do what is easily expected. In these near-irresistible conveniences I feel a sludgy flow toward a brave new world of lethargic uniformity where everything is dittoing, me-tooing, LOLing, emoticoning from a shrinking repertoire of publicly recognized standardized experiences.
If any individuals are still out there, consider this a liberal beacon. Hello? Hello?
Apple used to innovate by asking “Wouldn’t it be great if people could ____?” This was what made them uniquely great.
Now Apple does what every other banal tech company does and asks “Wouldn’t it be great if we could make a thing that could ____?” Or even worse “Wouldn’t it be great if we made a thing that has ____ characteristics/features/specs?”
This is why Apple keeps coming up with the same ideas as everyone else in the industry and why none of what they do matters one bit, however much their gadgets get hyped by gadget enthusiasts.
This hyping is part of our problem: great designs are better to use than to obsess over and talk about. Most of what is best in great design is hard to talk about and boring to read about. Great design tends to disappear. But cool features, record-setting specs and thrilling visuals generate buzz and drive short-term sales.
I think our culture’s gadget porn problem might be destructive in ways that parallel our culture’s sexual porn problem.
Just as pornography confuses and misleads youth about healthy relationships between partners, gadget porn confuses consumers about healthy relationships between people and things. In both cases, what is most healthy is quiet and not much to talk about but makes life much better. Addiction to lust drives people into cycles of craving, temporary satiety and empty boredom.
When design isn’t rewarded in the market, companies stop taking it seriously. They don’t invest in making products that are great to use; they make sexy-looking gimmicks that open wallets. Our tools start out as pleasant diversions and end up as perpetually irritating distractions.