Category Archives: Design Instrumentalism

One way to see design

Design is materialized philosophy.

When designing something — which always and necessarily means designing something for someone — the central question is always: what is the right philosophy for this context?

The purpose of design research is to get to the heart of this central question, and then to pose the design problem in such a way that designers think about it in the right way, from the philosophical perspective suited to the problem.

Design briefs are tiny philosophical primers.

A good design brief will effect a perspectival shift in the reader (the designer) that brings new possibilities into view, possibilities that were inconceivable prior to the shift. This phenomenon is what is commonly called inspiration.

It is the job of design researchers to produce precision inspiration.

Designers develop hybrid systems

Reading Verbeek’s What Things Do, I’m reminded of Latour’s handy term “hybrid”, an entity that is neither purely subjective nor purely objective, but a fusion of both.

In Latour’s eyes, the distinction between nature and society, or subject and object, which has seemed so self-evident since the Enlightenment, needs to be seen as a product of modernity that has far exceeded its expiration date. No other society makes this distinction in such a radical manner, and in ours it is more and more painfully obvious how poorly it allows us to comprehend what is happening in the world. The project of modernity, according to Latour, consists of the attempt to purify objects and subjects — we set objects on one side, subjects on the other, and draw a line between them. What is on the one side of the line is then material for scientists to investigate, with what is on the other side for the social scientists. … This purification and separation of subjects and objects, according to Latour, is coming to be less and less believable. Ever more entities arise that cannot be comfortably placed in this dichotomy. Latour calls these entities “hybrids.” The irony is that these hybrids thrive thanks to the modern purification: precisely because they don’t fit within the subject-object schema, we cannot recognize them and therefore they can proliferate at an astounding rate without anyone trying to stop or change them. But now, as their numbers become ever greater, it becomes more and more difficult to deny their existence. We are flooded with entities that straddle the boundary between humans and nonhumans…

Humans and nonhumans are just as bound up together in our culture as they are in others; therefore, Latour concludes, we need to study our technological culture similarly to the ways that anthropologists study other cultures. This means studying how the networks of relations between humans and nonhumans develop and unravel. In order to understand our culture, we must trace out both the process of purification and that of hybridization; we must understand how hybrids arise and why they are not seen as hybrids. In order to understand phenomena, they should be approached as black boxes that, when opened, will appear to contain myriad relations and activity.

If we grasp and internalize this understanding of hybrids, it becomes possible to compactly differentiate how designers approach their problems versus how engineers approach theirs — and why they so often marginalize designers and accidentally prevent designers from working in the way designers believe is best. Here it goes:

Designers develop hybrid systems. Engineers develop objective systems.

*

I’ve written two elaborations of this idea, material to supply the understanding that makes grokking the compact definition above possible. I’ll post both, because there’s no time this morning to combine them.

Version 1

An engineering perspective treats design as a sub-discipline of engineering. Design adds an aesthetic (and, among more enlightened engineers, usable) “presentation layer” to a functional objective system.

A design perspective ought to treat engineering as a sub-discipline of design. Once a hybrid subjective-objective system is developed through a design approach, objective sub-systems can be defined within the larger context of the hybrid system and built according to engineering methods. According to a design mindset, an engineered system is always and necessarily a subsystem belonging to a larger hybrid system that gives it its purpose and value.

The reason so few people see the obvious truth of the latter design perspective is that their vision is obstructed by a philosophical blockage. The hybrid system concept does not play nice with the modern subject-object schema. Designers learn, through the practical activity of design, to view problems in a new way that is incommensurable with modernity’s default philosophy (as described by Latour).

But designers are rarely philosophical, so the methods rarely progress to the point of praxis. Design language is all bound up with humans and the trappings of subjectivity (emotions, opinions, habits, etc.) on the side of who the design is for and the trappings of romanticism on the side of who does the designing (insight, inspiration, creativity, passion, etc.). Design practice is a jumble of “recipes” — procedures, jargon, styles and theater — a subterfuge to make design fit the preconceptions of folks who don’t quite get what designers are really up to. So design submits to modernist schema and goes to modernism’s special territory for people people, romanticism.

Version 2

Designers consciously work on developing hybrid systems where subjective and objective elements relate and interact. In design, people and things are thought of together as a single system. And things are not only material objects; they can be ideas, habits, vocabularies, etc. Whatever makes a design work or not work is part of the design problem, including the engineered elements as well as all business, cultural and environmental considerations. When a designer opens a black box of their own making, they will see subjects interacting with objects and subjects, and objects interacting with other objects and subjects.

Engineering, on the other hand, works inside the subject-object dichotomy, and works on problems of objective systems. As a matter of method, engineering purifies objects and arranges them in systems. When an engineer opens a black box of their own making they see objective components interacting and working as a system.

From the view of engineers, the interior of the black boxes they make is their concern, and the surface layer of the box — the point where subjectivity encounters engineering — is the concern of designers. Engineers build the black box; designers paint and sculpt it to make it appealing to people.

Philosophy fails

This morning I was forced to use the redesigned Freshbooks. They’d totally revamped the user interface. It was very slick and visual. And it was six times harder to use.

This is a depressingly typical experience in these times. I’m constantly suffering the consequences of wrongheaded, well-meaning changes made to things I rely on to not change.

I was lamenting to a friend how much work, time and money Freshbooks (and companies like them) sink into solving the wrong problem and making new problems.

I complained: “Not only did they think about it wrong — they didn’t even think about how they were thinking. This was a philosophy fail.”

And these are the mistakes that offend me: philosophy fails. People failing to reflect when reflection is most needed. They need to think about how they are thinking about what they are doing… and they instead choose to be all startuppity, and just do. Just doing is valorized, and it shouldn’t be, any more than just thinking. We need thoughtful practice matched with practical thinking: praxis.

Let’s stop engineering philosophies

My pet theory is that philosophies have been developed in an engineerly mode of making, with emphasis on the thought system, meant to be evaluated primarily epistemically: “is it true?” The Pragmatists improved on this by asking, “does it work?”

But I believe the pluralistic insight requires us to take a designerly approach to philosophy by expanding the questions we ask of philosophies to those of design (as originally posed by Liz Sanders): “is it useful; is it usable; is it desirable?” And human-centered design has taught us always to dimensionalize this triad with “…for whom, in what contexts?”

*

Very few people grasp what philosophies are, and how and why they are so important. They use philosophies that lead them to believe their philosophy is their belief system — the things they believe to be true and the means by which they evaluate these truths. They think they know what their philosophy is, because their philosophy points them only to their explicit assertions and arguments, and this sets sharp limits to what they can think and what they can do with their thoughts.

Philosophies are the tacit thinking that give us our explicit truths and our sense of reality. We had better design them well! But we keep engineering them… for other engineers.

Maybe philosophy is waiting for its own Steve Jobs.

Maturing

Reading Appendix A of Rorty’s Achieving Our Country, “Campaigns and Movements” I came upon this bit: “Most of us, when young, hope for purity of heart. The easiest way to assure oneself of this purity is to will one thing—but this requires seeing everything as part of a pattern whose center is that single thing. Movements offer such a pattern, and thus offer such assurance of purity. [Irving] Howe’s ability, in his later decades, to retain both critical consciousness and political conscience while not attempting to fuse the two into something larger than either, showed his admirers how to forgo such purity, and such a pattern.”

That brought to mind another passage from the introduction of Nicolai Berdyaev’s Slavery and Freedom: “My thought has always belonged to the existential type of philosophy. The inconsistencies and contradictions which are to be found in my thought are expressions of spiritual conflict, of contradictions which lie at the very heart of existence itself, and are not to be disguised by a facade of logical unity.”

For me, this immediately connects up with three themes from Nietzsche’s thought: youth, wholesale thinking, and the compulsion to systematize. (To poke around in my glorious wiki — and you really should — use the password “generalad”). Rather than explicitly draw every connection, I will juxtapose some passages and make a concept chord meant to convey an ideal of maturity I learned from Nietzsche.

*

Rough consistency. — It is considered a mark of great distinction when people say ‘he is a character!’ — which means no more than that he exhibits a rough consistency, a consistency apparent even to the dullest eye! But when a subtler and profounder spirit reigns and is consistent in its more elevated manner, the spectators deny the existence of character. That is why statesmen with cunning usually act out their comedy beneath a cloak of rough consistency.

*

Beware of systematisers! — Systematisers practise a kind of play-acting: in as much as they want to fill out a system and round off its horizon, they have to try to present their weaker qualities in the same style as their stronger — they try to impersonate whole and uniformly strong natures.

*

I mistrust all systematizers and I avoid them. The will to a system is a lack of integrity.

*

Youth and criticism. — To criticize a book means to a young person no more than to repulse every single productive idea it contains and to defend oneself against it tooth and claw. A youth lives in a condition of perpetual self-defence against everything new that he cannot love wholesale, and in this condition perpetrates a superfluous crime against it as often as ever he can.

*

Consciousness. — Consciousness is the latest development of the organic, and hence also its most unfinished and unrobust feature. Consciousness gives rise to countless mistakes that lead an animal or human being to perish sooner than necessary, ‘beyond destiny’, as Homer puts it. If the preserving alliance of the instincts were not so much more powerful, if it did not serve on the whole as a regulator, humanity would have to perish with open eyes of its misjudging and its fantasizing, of its lack of thoroughness and its incredulity — in short, of its consciousness; or rather, without the instincts, humanity would long have ceased to exist! Before a function is fully developed and mature, it constitutes a danger to the organism; it is a good thing for it to be properly tyrannized in the meantime! Thus, consciousness is properly tyrannized — and not least by one’s pride in it! One thinks it constitutes the kernel of man, what is abiding, eternal, ultimate, most original in him! One takes consciousness to be a given determinate magnitude! One denies its growth and intermittences! Sees it as ‘the unity of the organism’! This ridiculous overestimation and misapprehension of consciousness has the very useful consequence that an all-too-rapid development of consciousness was prevented. Since they thought they already possessed it, human beings did not take much trouble to acquire it, and things are no different today! The task of assimilating knowledge and making it instinctive is still quite new; it is only beginning to dawn on the human eye and is yet barely discernible — it is a task seen only by those who have understood that so far we have incorporated only our errors and that all of our consciousness refers to errors!

*

When one is young, one venerates and despises without that art of nuance which constitutes life’s greatest prize, and it is only fair that one has to pay dearly for having assaulted men and things in this manner with Yes and No. Everything is arranged so that the worst of tastes, the taste for the unconditional, should be cruelly fooled and abused until a man learns to put a little art into his feelings and rather to risk trying even what is artificial: as the real artists of life do. The wrathful and reverent attitudes characteristic of youth do not seem to permit themselves any rest until they have forged men and things in such a way that these attitudes may be vented on them: — after all, youth in itself has something of forgery and deception. Later, when the young soul, tortured by all kinds of disappointments, finally turns suspiciously against itself, still hot and wild, even in its suspicion and pangs of conscience: how angry it is with itself now, how it tears itself to pieces, impatiently, how it takes revenge for its long self-delusion, just as if it had been a deliberate blindness! In this transition one punishes oneself with mistrust against one’s own feelings; one tortures one’s own enthusiasm with doubts, indeed, one experiences even a good conscience as a danger, as if it were a way of wrapping oneself in veils and the exhaustion of subtler honesty; and above all one takes sides, takes sides on principle, against ‘youth.’– A decade later: one comprehends that all this, too–was youth!

*

The so-called soul. — The sum of the inner movements which a man finds easy, and as a consequence performs gracefully and with pleasure, one calls his soul; — if these inner movements are plainly difficult and an effort for him, he is considered soulless.

*

The serious workman. — Do not talk about giftedness, inborn talents! One can name great men of all kinds who were very little gifted. They acquired greatness, became “geniuses” (as we put it), through qualities the lack of which no one who knew what they were would boast of: they all possessed that seriousness of the efficient workman which first learns to construct the parts properly before it ventures to fashion a great whole; they allowed themselves time for it, because they took more pleasure in making the little, secondary things well than in the effect of a dazzling whole. The recipe for becoming a good novelist, for example, is easy to give, but to carry it out presupposes qualities one is accustomed to overlook when one says “I do not have enough talent.” One has only to make a hundred or so sketches for novels, none longer than two pages but of such distinctness that every word in them is necessary; one should write down anecdotes each day until one has learned how to give them the most pregnant and effective form; one should be tireless in collecting and describing human types and characters; one should above all relate things to others and listen to others relate, keeping one’s eyes and ears open for the effect produced on those present, one should travel like a landscape painter or costume designer; one should excerpt for oneself out of the individual sciences everything that will produce an artistic effect when it is well described, one should, finally, reflect on the motives of human actions, disdain no signpost to instruction about them and be a collector of these things by day and night. One should continue in this many-sided exercise some ten years: what is then created in the workshop, however, will be fit to go out into the world. — What, however, do most people do? They begin, not with the parts, but with the whole. Perhaps they chance to strike a right note, excite attention and from then on strike worse and worse notes, for good, natural reasons. — Sometimes, when the character and intellect needed to formulate such a life-plan are lacking, fate and need take their place and lead the future master step by step through all the stipulations of his trade.

*

Learning. — Michelangelo saw in Raphael study, in himself nature: there learning, here talent. This, with all deference to the great pedant, is pedantic. For what is talent but a name for an older piece of learning, experience, practice, appropriation, incorporation, whether at the stage of our fathers or an even earlier stage! And again: he who learns bestows talent upon himself — only it is not so easy to learn, and not only a matter of having the will to do so; one has to be able to learn. In the case of an artist learning is often prevented by envy, or by that pride which puts forth its sting as soon as it senses the presence of something strange and involuntarily assumes a defensive instead of a receptive posture. Raphael, like Goethe, was without pride or envy, and that is why both were great learners and not merely exploiters of those veins of ore washed clean from the siftings of the history of their forefathers. Raphael vanishes as a learner in the midst of appropriating that which his great competitor designated as his ‘nature’: he took away a piece of it every day, this noblest of thieves; but before he had taken over the whole of Michelangelo into himself, he died — and his last series of works is, as the beginning of a new plan of study, less perfect and absolutely good precisely because the great learner was interrupted in his hardest curriculum and took away with him the justificatory ultimate goal towards which he looked.

*

A man’s maturity — consists in having found again the seriousness one had as a child, at play.

*

Human beings are naturally artificial.

It is not our nature that is most precious; it is our hard-won second-nature, that set of artifices that are so well-designed that they disappear into our being and into the world we perceive around us. They become so natural to us that we can no longer experience them as man-made, and we begin to see them as God-given if we see them at all. And they are God-given, if we understand our real relationship with God.

Amen?

Eroding to wisdom

The best quotes are the misattributed ones — overused maxims that tumble from paraphrase to paraphrase until they are worn smooth like river stones.

Whenever I track one of these retroactively adopted orphans back to its birthplace, I discover that almost always its character has been improved by the traumas of public life.

Take for instance the famous quote that Yogi Berra should have said, but actually never did say: “In theory there is no difference between theory and practice, but in practice there is.” The original quote appeared in flabbier form in a Usenet proto-meme: “In theory, there is no difference between theory and practice, but in practice there is a great deal of difference.” Incidentally, one Berra quote Berra really did say is “I never said most of the things I said.”

Mark Twain is a popular misattributed source of collaboratively improved quotes, probably because Twain is the only writer of pithy sayings most people know, so if they hear a pithy saying they assume Twain must have said it. A great example of a Twain saying that Twain never said is “If your only tool is a hammer, everything looks like a nail.” Quote Investigator found the earliest example of this quote to be “Give a boy a hammer and chisel; show him how to use them; at once he begins to hack the doorposts, to take off the corners of shutter and window frames, until you teach him a better use for them, and how to keep his activity within bounds.”

Another fake Twain quote: “If I had more time, I would have written a shorter letter.” Quote Investigator explains the earliest English expression of this thought is a translation of a Pascal quote, “My Letters were not wont to come so close one in the neck of another, nor yet to be so large. The short time I have had hath been the cause of both. I had not made this longer then the rest, but that I had not the leisure to make it shorter then it is.” It took 300 years to shorten this quote to its current svelteness.

I even prefer the bastardized versions of properly attributed quotes. William James comes to mind:

When a thing is new, people say: “It is not true.”

Later, when its truth becomes obvious, they say: “It’s not important.”

Finally, when its importance cannot be denied, they say “Anyway, it’s not new.”

Who could possibly prefer the original? “First, you know, a new theory is attacked as absurd; then it is admitted to be true, but obvious and insignificant; finally it is seen to be so important that its adversaries claim that they themselves discovered it.”

This meditation on misattributed quotes hints at something important: The lessons of the “gossip game” might need some qualifications. It is undeniably true that factual information passed from person to person does degrade over the course of minutes, hours, days and months. But is this true of wisdom passed from generation to generation over the course of decades or centuries? Perhaps not. Maybe wisdom seeks its perfect form through wear.

The designer in me wants to include physical objects in the set of examples of “wisdom seeking form”. I have always loved the perfection of tradition-worn objects like houses, tables, chairs, knives, pens, teapots, clothes and bicycles. My love of erosive essentializing could make me look like some sort of conservative Platonist type, except for one subtle but crucial difference: the Platonist ideal lives above humanity in a heavenly realm of preexisting perfect archetypes, whereas my ideal lives among us in an eternal democratic project of iterative design, a trans-generational collaboration to make things better and better, approaching but never quite reaching perfection.

*

A friend tells me I buried the lede on this piece, and that this gives the piece a frivolous effect. One thing I have learned reflecting on philosophical communication and my own characteristic miscommunications is that philosophy tends to reverse normal patterns of explanation. Things don’t progress in the normal subject-to-predicate order. Instead, it goes predicate-predicate-predicate-subject. You don’t exactly know what the work is about until the about finishes abouting about and finally resolves into the “what”. A capacity to enjoy philosophy is tied to an ability to endure whatlessness for long anxious stretches, until the whole mess finally coalesces and crystallizes into a clear conception that makes simple sense of what preceded it.

So there’s just no way I am going to put that lede out in front where it belongs. But, being a good Liberal, I do believe in compromise, so here is what I can do: I will exhume the lede, and append it to the end, so anyone who wants to can re-read the original with this explication in mind.

What I wanted to do was to demonstrate a progressive traditionalist attitude.

Progressive traditionalism might seem like a contradiction in terms, but this is a side-effect of unexamined views of tradition that produce two mutually reinforcing oppositions: 1) progressive anti-traditionalism that wants to ignore or trash an unacceptable past in order to clear the way for a better future, and 2) traditional traditionalism that sees the past as better and the present as unacceptable, and therefore wants a future that looks more like the past than the present.

Progressive traditionalism sees tradition as a long process of collaborative improvement. The past is a swirl of good and bad. Humanity’s genius is mixed with ignorance and atrocities, and our ability to discern the good and bad is a direct result of the tradition’s progress. We wouldn’t know how appalling our past is if we hadn’t lived through it, learned from it and been changed by it. Further, this work is nowhere close to finished. We are making mistakes this very moment that will be obviously stupid and wicked within a decade. I believe one of those mistakes is thinking we must choose between wholesale condemnation and wholesale worship of the past instead of treating it with the critical respect it deserves.

I wanted to demonstrate this attitude simply, and I believed a good way to do this was to show that old famous sayings can actually improve over time through being worked on by innumerable unfamous people. And I wanted to make fun of our compulsion to project this simplicity back into the past by placing the perfected words into the mouths of acclaimed geniuses. Why would we want to do that? What is the source of this need? The hammer I carry is philosophy, and the nail I see here is the unconscious impulse to preserve the current popular philosophy (also known as “common sense”) at all costs. This current philosophy, by the way, is also producing our political crisis.

There is a lot to say on this subject and it connects with some of the things in my life I value most, including my adopted Jewish religion. But I’ll leave it here for now.

Usefulness, Usability and Desirability of philosophies

Tim Morton explains Speculative realism:

Speculative realism is the umbrella term for a movement that comprises such scholars as Graham Harman, Jane Bennett, Quentin Meillassoux, Patricia Clough, Iain Hamilton Grant, Levi Bryant, Ian Bogost, Steven Shaviro, Reza Negarestani, Ray Brassier, and an emerging host of others such as Ben Woodard and Paul Ennis. All are determined to break the spell that descended on philosophy since the Romantic period. The spell is known as correlationism, the notion that philosophy can only talk within a narrow bandwidth, restricted to the human-world correlate: meaning is only possible between a human mind and what it thinks, its “objects,” flimsy and tenuous as they are. The problem as correlationism sees it is, is the light on in the fridge when you close the door?

So far, 65 pages in, I am seeing absolutely no progress toward transcending the human-world correlate. I am seeing attempts at using measurements and mathematical models as a substitute for intuition, but what could possibly be more human than that, even when, or especially when, such substituting of ratiocination for instinct makes our minds feel abstracted from our animal bodies? I am also seeing speculations about real objects and what they might be like substituted for access to the first-person being of objects (first-object being?). When you insist the light is on in the fridge when you close the door because it is the nature of light to withdraw when doors are shut, you’ve posed a possibility for humans to consider or for human scientists to investigate, and that should not be confused with seeing the inner light of the fridge with superhuman refrigerated eyes. And even if you place sensors inside the fridge, or discover ways to detect or deduce light inside a closed fridge, or account for your inability to sense, detect or deduce with ontological maxims of withdrawal, you may be “seeing” through long networks of instruments (both physical and mental), but it all converges and terminates in an all-too-human “eye”. And this is true whether that eye is manifested from some archetypal realm, or the eye is imagined in a man’s or god’s mind (along with what is seen), or if the eye is an organ emerging from the interplay of matter and energy situated in a space-time container, or if the eye is an object within hyperobjects.

For us, once we know a thing it all becomes something for-us, including our conviction that it is not only what is known, and that it is for-itself. As Nietzsche said, “We cannot look around our own corner: it is a hopeless curiosity that wants to know what other kinds of intellects and perspectives there might be; for example, whether some beings might be able to experience time backward, or alternately forward and backward (which would involve another direction of life and another concept of cause and effect). But I should think that today we are at least far from the ridiculous immodesty that would be involved in decreeing from our corner that perspectives are permitted only from this corner. Rather has the world become “infinite” for us all over again: inasmuch as we cannot reject the possibility that it may include infinite interpretations. Once more we are seized by a great shudder — but who would feel inclined immediately to deify again after the old manner this monster of an unknown world? And to worship the unknown henceforth as “the Unknown One”? Alas, too many ungodly possibilities of interpretation are included in the unknown, too much devilry, stupidity, and foolishness of interpretation — even our own human, all too human folly itself, which we know…”

So good for the speculative realists that they have uncovered another human perspective for thinking in a less human-intuitive way. If learning to think that way delivers on the promise to make an ecological ethic more accessible, I’m all for it.

However, I am beginning to worry that this access is most likely to occur through the thin conduit of argument, which rarely fully engages human intuition or taps into moral impulses which “know” by caring or neglecting.

And as a designer-philosopher, I know hitting all three is paramount. For in design these are the holy trinity of experience, the necessary conditions of adoption: Useful, Usable, and Desirable.

The vice of utilitarian, functionalist folks who fancy themselves objective is they find far too much desirability in mere usefulness, and that desirability motivates them to surmount difficulties in comprehension — and then they find yet more desirability in the accomplishment of having surmounted the difficulties. This is why engineers, left on their own, engineer systems that only other engineers can use, much less love. Design is changing all that (at least for things made for non-engineers) and Human-Centered Design is accelerating that change.

Philosophy has been and is in the same stage as pre-design engineering. Because it requires motivated philosophical investigation to even grasp what philosophy does and is, most people can’t even see what it can be used for, or even detect the symptoms of an obsolete or corrupted philosophy (or, as today, the clashing of multiple corrupted, obsolete philosophies). Philosophers engineer philosophies for other philosophers.

When philosophies are popularized, all that changes is the Usability. Now an ordinary above-average-smart person can get a sense of what philosophers are making for each other. They probably can’t get the same jolt of pleasure out of it, since most philosophy exists for academic philosophers’ purposes and tastes, but they can get a bit of that surmounting-difficulties pleasure and they can plume their social personas with the book-learning.

What most needs changing is Usefulness and Desirability.

By usefulness, I mean recognizing that every philosophy enables us to think certain kinds of thoughts. The live problems that orient and motivate philosophical effort tend to produce philosophies well-suited to think similar problem-types. The philosophy will instantly become difficult to distinguish from the reality it understands, so there’s a bit of a trap-like character to it. Philosophies are not tools we hold, look at and manipulate. They are tools we climb inside, see from and act from.

By desirability, I mean that you are moved by it. You don’t force yourself to care, or work yourself up and amp-up what little caring you feel. You don’t get argued out of apathy. The philosophy simply makes the importance of whatever it does self-evident. You just do care, and you will not even be able to account for why. Philosophies produce their own motivation, and are actually the only thing capable of producing motivation out of thin air, apart from simple health.

Actually, in writing this, I changed my mind. Philosophy needs to give far more attention to Usability. Popularization of philosophy might help people absorb the content of a philosophy, but that’s the most superficial aspect. Philosophies are not for knowing, they are for doing, for application to real-world situations. Way too many people, even philosophers, think a philosophy is a thing that is known, an object of knowledge. This is not true. Philosophies are that by which everything else is understood and known. And Usability in philosophy is the degree of ease with which things outside the philosophy itself can be understood. To do that, we must tap into the power of the tacit layer of understanding, intuition. Philosophies ought to be designed for intuitiveness — not a preexisting natural intuition, but an acquired second-natural intuition that operates without conscious effort.

Here, the Usefulness of the philosophy becomes important: useful for what purpose? Because the purpose of the philosophy determines its Usability trade-offs. Scissors make cutting easy and propping open a door or chilling perishable food not-very-easy. A philosophy engineered to make it easier to integrate the latest findings of physics and to overcome the human tendency to think in such a human-centric, human-scale way might be super-useful for writing provocative books and heavily-cited scholarly papers and building a reputation in an emerging school of philosophy, but it might not help many non-academics make sense of the things they encounter all day, reach understandings with various people of divergent perspectives, respond effectively to events in their lives, or feel the importance of all this understanding, responding, communicating, doing and being.

This brings me to that passage from Morton’s book that inspired this post, and which brought to mind another passage, which I will quote after this one:

The undulating fronds of space and time float in front of objects. Some speculative realism maintains there is an abyss, an Ungrund (un-ground) deeper than thought, deeper than matter, a surging vortex of dynamism. To understand hyperobjects, however, is to think the abyss in front of things. When I reach out to mop my sweating brow on a hot day, the California sun beating down through the magnifying lens of global warming, I plunge my hand into a fathomless abyss. When I pick a blackberry from a bush, I fall into an abyss of phenotypes, my very act of reaching being an expression of the genome that is not exclusively mine, or exclusively human, or even exclusively alive.

This passage summoned to mind a quote by A. S. Eddington:

I am standing on the threshold about to enter a room. It is a complicated business. In the first place I must shove against an atmosphere pressing with a force of fourteen pounds on every square inch of my body. I must make sure of landing on a plank travelling at twenty miles a second round the sun — a fraction of a second too early or too late, the plank would be miles away. I must do this whilst hanging from a round planet head outward into space, and with a wind of aether blowing at no one knows how many miles a second through every interstice of my body. The plank has no solidity of substance. To step on it is like stepping on a swarm of flies. Shall I not slip through? No, if I make the venture one of the flies hits me and gives a boost up again; I fall again and am knocked upwards by another fly; and so on. I may hope that the net result will be that I remain about steady; but if unfortunately I should slip through the floor or be boosted too violently up to the ceiling, the occurrence would be, not a violation of the laws of Nature, but a rare coincidence. Verily, it is easier for a camel to pass through the eye of a needle than for a scientific man to pass through a door. And whether the door be barn door or church door it might be wiser that he should consent to be an ordinary man and walk in rather than wait till all the difficulties involved in a really scientific ingress are resolved.

Well-designed philosophies open doors, and let our human, all-too-human, irreducibly-human eyes see what is in there so we can understand it and respond as humans and follow our human purposes.

An autobibliobiography

Well, I tried to write about my books and how I want to prune my library, and ended up writing a history of my interests. I know there are loose ends, but I am tired of writing, so blat, here it is:

I used to have strict criteria for book purchases. To earn a place on my shelf (singular) a book had to be either a reference or a landmark. In other words, I had to see it as persistently valuable in my future, or it had to be valuable in my past as something that influenced me. My library was personal.

Somewhere along the way my library became more general. References grew to include whatever I imagined to be the basic texts of whatever subject I cared about. Landmarks expanded to include any book that housed some striking quote that I wanted to bottle up and keep. How did this happen?

When Susan met me, I owned one book, Chaos, by James Gleick. This book is the landmark of landmarks. Reading it was a major life event for me. It introduced me to two of the most crucial concepts in my repertoire: 1) nonlinear processes, and 2) Kuhn’s theory of scientific revolutions. I loved the philosophical fairytale of Benoit Mandelbrot discovering a radical new way of thinking, and then skipping from discipline to discipline, tossing out elegantly simple solutions to their thorniest, nastiest, most intractable problems, simply by glancing at them through his magic intellectual lens. He’d give them the spoiler (“look at it like this, and you’ll probably discover this…”) and then leave the experts to do the tedious work of figuring out that he was exactly right. And I loved it that the simplest algorithmic processes can, if ouroborosed into a feedback loop, produce utterly unpredictable outcomes. We can know the dynamic perfectly, and we can know the inputs feeding into the dynamic perfectly — but we are locked out of the outputs until the process is complete. And then factor in the truth that numbers, however precise, are only approximate templates overlaid upon phenomena! Nothing outside of a mathematician’s imagination is a rational quantity. And in nonlinear systems, every approximation, however minute, rapidly amplifies into total difference. I’d go into ecstasies intuiting a world of irrational quantities interacting in the most rational, orderly ways, producing infinite overlapping interfering butterfly effects, intimating a simultaneously knowable-in-principle, pristinely inaccessible-in-fact reality separated by a sheer membrane of truth-reality noncorrespondence. I used to sit with girls and spin out this vision of truth for them, serene in the belief I was seducing them. Because if this can’t make a girl fall in love, what can? I still hold it against womankind that so few girls ever lost their minds over one of my rhapsodies. They were into other stuff, like being mistaken for a person capable of losing her mind over the beauty of a thought, or being someone who enchants nerds and compels them to rhapsodize seductively. There’s a reason for all of this, and it might be the most important reason in the world, though I must admit, it remains pristinely inaccessible to me and an inexhaustible source of dread-saturated fascination. (If you think this is misogyny, you don’t understand my religion. “Supposing truth is a woman — what then…?”)
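
That locked-out-of-the-outputs quality is easy to make concrete. Here is a minimal sketch of my own (not anything from Gleick’s book or from the post above) using the logistic map, a standard toy example: the rule and both starting inputs are known exactly, yet an input approximated to one part in ten billion soon produces a trajectory that bears no resemblance to the “true” one.

```python
# A toy illustration of sensitive dependence: the logistic map x -> r*x*(1-x).
# The rule and both inputs are known exactly, yet two inputs differing by one
# part in ten billion soon disagree completely.

def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map from x0 and return the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.2)          # the "exact" input
b = logistic_trajectory(0.2 + 1e-10)  # an almost-perfect approximation of it

for n in (0, 10, 20, 30, 40, 50):
    print(f"step {n:2d}: {a[n]:.6f} vs {b[n]:.6f}   gap {abs(a[n] - b[n]):.2e}")
```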

After I got married, my book collection expanded, reflecting some new interests and enthusiasms: Buddhism, Borges, and stuff related to personality theory, which became my central obsession. Somewhere around 2001 or 2002 I also became a fan of Christopher Alexander’s psychology of architecture, and I had my first inklings of the importance of design. (Incidentally, one of the books I acquired in this period was a bio of Alexander, characterizing his approach to architecture as a paradigm shift. This was my second brush with Kuhn.) Until 2003 my book collection still fit on a single shelf.

In the winter of 2003 in Toronto, Nietzsche happened to me. Reading him, fighting with him, and being destroyed by him, I experienced intellectual events that had properties of thought, but which could not be spoken about directly. It wasn’t like an ineffable emotion or something that couldn’t quite be captured in words. These were huge, simple but entirely unsayable truths. I needed concrete anchors — concepts, language, parables, myths, images, exemplars — anything that could collect, formalize, stabilize, contain or convey what I “knew”. This is when books became life-and-death emergencies for me, and sources of extreme pleasure. I couldn’t believe you could buy a copy of Chuang Tzu’s sayings for less than the cost of a new car. From 2003 to 2006 my shelf grew into a library. I accumulated any book that helped reinforce my intense but disturbingly incommunicable sense of truth — what I eventually realized was a faith.

But then the question of this inexplicable state of mind and its contents became a problem to me. What exactly is known? How is it known? Why think of it in terms of knowledge? If it cannot even be said, then how can it be called knowledge? And the isolation was unbearable. I was in a state I called “solitary confinement in plain sight”, with an overwhelming feeling of having something of infinite importance to get across, but I couldn’t get anyone to understand what was going on or to consider it important enough to look into. I got lots of excuses, arguments, rebuffs, cuttings-down-to-size, ridicule and promises to listen in some infinitely receding later, but I could not find any real company at all, anywhere. This was a problem I desperately needed to solve.

Richard J. Bernstein’s hermeneutic Pragmatism is what hoisted me out of this void and gave me back a habitable inhabited world, with his lauded but still-underrated classic Beyond Objectivism and Relativism. Equipped with the language of pragmatism, hermeneutics, phenomenology and post-empiricism (Kuhn, again) I could account for my own experiences and link them to other people’s analogous experiences. Not only that — he began my reconnection with design, which had become a meaningless but necessary source of rent, food and book money. I was able to reengage practical life. But Bernstein’s method was intensely interpersonal, an almost talmudic commentary on commentaries ringing a missing central common text.

Richard J. Bernstein’s bibliography, however, was the flashpoint for my out-of-control library. Each author became a new collection. Kuhn, Feyerabend, Lakatos, and then eventually Latour, and then Harman and now Morton… etc. Geertz seeded an anthropology and sociology shelf, which is now a near-bursting book case. Hannah Arendt is a whole shelf, and spawned my collection of political books and my “CDC vault” of toxic ideologies. Gadamer and Heidegger were another space-consuming branch. Dewey, James and Peirce fill about three shelves. And Bernstein’s line of thinking led me directly to Buber, who also breathed fire into my interest in the research side of Human Centered Design (another half a case of books) and sparked a long process of conversion to Judaism (yet another half-case, and growing).

A bunch of these threads, or maybe all of them together, drove me into Bruno Latour’s philosophy. Latour inflicted upon me a painful (and expensive) insight: Everything Is Important. Statistics, accounting, technologies, laws, bacteria, materials, roads. Therefore I must get books on everything, apparently. With this we finally ran out of room in my bookcases, then my library room, then our house. We had to get a storage space to cycle my out-of-season books into and out of again when I realize I must read that book right now. Susan just got a second space. I have books stacked up everywhere. I am a hoarder.

I am considering putting all these books back under review, and keeping only the books that fit those two original criteria. Is it a landmark for me? Is it a reference that I know I will use?

I cannot be everything, and I need to stop trying. I need things that help me stay me, and I need to shed the rest. Good design demands economy, tradeoffs, clarity of intent. I have a bad case of intellectual scope-creep. It is time to decide what is essential, and to prune away nonessentials so the rest can grow in a fuller way.

I have another half-written post I think I’ll finish now.

Hyperobjective spew

I’ve gotten sucked into Tim Morton’s Hyperobjects. I was reading Kaufmann’s book on Hegel, but after sampling a few pages on the recommendation of a friend, Morton’s book felt “next”.

A few random notes:

This territory, settled first by Actor-Network Theory (ANT) and developed further by Speculative Realism, truly feels like where the philosophical action is. It is pro-science but anti-scientism, which matters quite a lot, given the left’s metastasis into an aggressively intensifying and spreading scientistic fundamentalism. It is built on the Pragmatist platform, as all good contemporary thinking is. It addresses our basic moral impulses along with our conceptions, and who cares about whatever doesn’t? This movement is for thinking folks beyond the academy. I have come to loathe the odor of papers meant to goose an academic’s scorecard. Back in the day I designed the interface for a system for capturing academic accomplishments for evaluation, so I know what drives ambitious edu professionals. Whoever let the MBAs into the dean’s office deserves to be shot.

This book definitely fits in the Object-Oriented Ontology (OOO) genre. As a genre, OOO seems not only influenced by, but highly derivative of ANT, and especially Latour, in its delight in dizzyingly heterogeneous lists designed to inflict ontological whiplash, and its ironic oscillations between light whimsy and the heaviest dread. I am writing this post from Paris, and I have to wonder if this literary texture doesn’t have something to do with Latour’s Frenchness. If there is one thing the French are not, it is streamlined. OOO is an unstreamlined genre. OOO profuses.

I’m struggling for a style for my 4-page pamphlet, so I’m a little genre-sensitized right now. I crave severe streamlining, to the point of geometry. The reason is that I want to provide a minimal skeleton or scaffolding for thoughts, not the thoughts themselves. Now that I’m writing this, maybe my genre is the genre of design brief. This is consistent with one of my core themes, that philosophy is a species of design. If this is true, and I am no longer inclined to doubt this background faith or its implications, wouldn’t this kind of design, like all others, benefit from a design brief? Design is directed by an intuited problem. Normally a problem is implicitly and instinctually felt by isolated individuals (as inspiration), or no problem is felt (as feeling uninspired). If framed explicitly as a brief, inspiration is socialized and made available to groups of collaborators. Briefs themselves are designed things, and my favorite kind of design is brief design. (By the way, a couple of months ago I developed a simple method for co-designing briefs that feels extremely promising, and I need to write about that. Note to self.) I think this pamphlet might be a universal design brief for designing design briefs. Yeah, you know I’ll stack me some metas. This insight may be a breakthrough, or a yerba mate overdose, or both.

Another thing I’m noticing that I like about OOO is that their metaphysical surveying work seems right on. The property lines they’ve drawn between being and alterity, knowledge and reality are very close to my own. The only conception of religion that has ever made sense to me is the cultivation of relationship between knowing self and the barely-known reality of which self is part. Speculative Realism seems built on this well-surveyed property, each herm in its proper place.

And if I am not mistaken, according to this survey, transcendental and transcendent are diametric opposites. In understanding, the transcendental is what we bring to the table of knowledge, and the transcendent is what not-we brings.

Slurpy, mergy, touchy-feely notions of interpersonal being

Wow, this post really sprawled out. It hits a lot of my enduring interests. I’m not sure it is suitable for reading. It might just be a personal journal entry written to myself. Feel free to eavesdrop if you wish, but I cannot promise it will make sense or yield any value.

*

I listened to a fascinating Radio Open Source podcast on Hannah Arendt’s conception of evil, which ended with a wonderful discussion on empathy.

Jerome Kohn: Empathy is a fancy word or fancy theory that she argued passionately against. First of all she thought it was an impossible notion in the sense that it really means feeling what someone else feels. Sympathy, fellow feeling, is another thing. But empathy is the claim that you can actually feel what someone else is feeling. And for that Arendt found no evidence whatsoever. One could say it’s even the opposite of her notion of thinking from another person’s point of view. What you have to be able to do is to see a given issue from different points of view, to make it real. And then through those different points of view, with your own eyes, you don’t feel what the other person is feeling, you see what he is seeing through your own eyes, and then you can make a judgement. The more people you can take into consideration in this enlarged mentality, that actually is the foundation of reality for Arendt, the more valid your judgement will be.

Elisabeth Young-Bruehl: Jerry’s exactly right. Hannah Arendt was always opposed to these slurpy, mergy, touchy-feely notions about what binds people to each other. And she felt very keenly that what really binds one person to another is a commitment to try to see the world from that person’s point of view with your own eyes. Not to subscribe to their point of view or to merge with their point of view, but to be able to walk around and see what the world looks like from where they’re standing. But looking at it with your own eyes, so that you can then, as it were, discuss it with them. Not merge with them in some way, but discuss it with them. She was all about discussion. Not empathy in that sentimental way.

Christopher Lydon (host): And yet, well, there are distinctions without huge differences in some way. To put oneself in another’s mind is the beginning of something important.

EYB: To think that you can put yourself in another’s mind is the beginning of a terrible arrogance which has tremendous consequences. It’s a difference with great consequences. People who think that they can know what another person thinks or feel what another person feels are narcissistic.

CL: Well, ok, I don’t want to make a philosophical or an endless argument about it. Isn’t it the incapacity and the lack of interest in that perspective precisely what she found at the core of Eichmann’s banality and Eichmann’s evil, really?

JK: Well, no, it was his thoughtlessness, his inability to think from any other point of view but his own.

EYB: Exactly. And these are very important distinctions.

This exchange is especially interesting to me for three reasons.

First: as a Human Centered Design researcher/strategist/designer, I am constantly telling people that I am in the “empathy business.” However, I have long been uncomfortable with the characterization of what I do as “empathy”. To characterize understanding another person subjectively as primarily a matter of experiencing how they feel misses the mark in a very Modernist way. (em- ‘in’ + pathos ‘feeling’). While feelings are important to what I do, they are not the primary focus. I would prefer to characterize my work as concrete hermeneutics, but words like that do not fly in the flatlands of business where thinking lags a minimum of three philosophical generations behind. So, I’ve adopted “empathy” and accepted the inevitable misconceptions that attend it, because that’s what it takes to be understood at all by most people.

It is hardly surprising that I see things similarly to Young-Bruehl and Kohn, because I belong to their tradition. Heidegger taught Arendt and Gadamer, who both taught my favorite thinker Richard J. Bernstein. A Clifford Geertz quote from Bernstein’s Beyond Objectivism and Relativism has stayed with me as an anchor for my understanding of what a good human centered designer does.

Second, I think that when we see things this way, we tend to treat emotionally-oriented people who are very sensitive and sentimentally responsive to people around them as having some kind of monopoly on human understanding. In my experience, there are multiple stages of coming to understanding of another person, and a talent for sensing and responding does not always correspond with a talent for grokking the “logic” of other people’s worldviews, nor an ability to think, speak and create from another worldview. It takes a fairly vast range of talents to function pluralistically.

I think a lot of the political problems we are experiencing today result from shoddy and retrogressive philosophical conceptions of alterity (“otherness”), which still see understanding of other people as very literally empathic. To know what is going on with another person, we must ourselves have had the experiences and emotions that other person has had. In an effort to understand and to demonstrate our understanding we must induce emotions similar to theirs. Two consequences follow: 1) The one who understands must try to produce the right emotions, and this production of emotion is the demonstration of understanding, which leads to some fairly repulsive public displays of political sentimentality. 2) The one who is understood is put in a position of judging the authenticity of those emotional displays, which is more or less being given the role of arbitrary judge. And if the feelings of the understood are viewed as the central datum or a special kind of insight (being “woke”) into a political situation (typically gauging the degree of prejudicial unfairness, its impact on those victimized by that prejudice and what is required to rectify that unfairness), this amounts to extreme epistemological privilege. Only the victim of prejudice has access to the reality of the situation, and those who are not the victims are incapable of perceiving how they participate in the perpetration, so to use the charming formulation of today’s hyper-just youngsters, it is their job to STFU and to accept the truth dictated to them. It never occurs to anyone within the power hierarchy of wokeness that there’s anything superior to all this illiberal mess to awaken to. There are philosophical worldviews that are more thorough, more comprehensive and more expansive than the dwarfish ideology of the popular left, but for all the reasons they are eager to point out to anyone who defies them, they are entirely incapable of seeing beyond the motivated reasoning of their own class interests. (This does not mean I think the popular right is any better. It is not. We are in a Weimaresque situation of resentful evil left idiocy vs paranoid evil right idiocy, with the reasonable voices shoved to the margins.)

Third, I’ve found myself misunderstood by many close friends on how I view relationships, and Elisabeth Young-Bruehl did a great job of capturing how people think I see them: a “slurpy, mergy, touchy-feely notion about what binds people to each other.” I think the misunderstanding is rooted in this same conception of human understanding as primarily an emotional phenomenon. When my own ideal of marriage or of friendship is strained through the filter of today’s left worldview, it looks like a mystical merging of souls that arouses (and should arouse!) suspicions of domination and anxieties about loss of self. But any attempt I make to explain how what I have in mind differs from that picture looks like, well, an attempt at philosophical domination and a threat to the selfhood of whoever is foolish enough to take it seriously. Who am I to tell someone something they don’t already know? And anyway, it smells very cultish to listen to someone claiming to know better than the public what is true and right. So, by the circular logic of the popular worldview of the left, it is superior to form one’s own individual opinion (never mind that this opinion on opinions is the product of an unexamined and manifestly broken worldview).

Obviously, this means extreme alienation for anyone who adopts a sharply differing worldview, one that affirms the importance of collaboratively developing shared understandings with those around them. In an environment of extreme ideological conformity (with brutal social consequences for infractions) that exalts above all the importance of intellectual independence — but strictly within its own confined philosophical horizon — a philosophy of interdependence, of collaborative development of the very concepts one uses to form one’s opinions, and of togetherness in a shared worldview is marked for expulsion.

Anyway, what I really have in mind when I imagine ideal personal connections is, once again, that ideal sketched out by Bernstein, captured so well by Geertz, which I will now go ahead and re-re-quote.

…Accounts of other peoples’ subjectivities can be built up without recourse to pretensions to more-than-normal capacities for ego effacement and fellow feeling. Normal capacities in these respects are, of course, essential, as is their cultivation, if we expect people to tolerate our intrusions into their lives at all and accept us as persons worth talking to. I am certainly not arguing for insensitivity here, and hope I have not demonstrated it. But whatever accurate or half-accurate sense one gets of what one’s informants are, as the phrase goes, really like does not come from the experience of that acceptance as such, which is part of one’s own biography, not of theirs. It comes from the ability to construe their modes of expression, what I would call their symbol systems, which such an acceptance allows one to work toward developing. Understanding the form and pressure of, to use the dangerous word one more time, natives’ inner lives is more like grasping a proverb, catching an allusion, seeing a joke — or, as I have suggested, reading a poem — than it is like achieving communion.

And now I will quote myself:

“Understanding the form and pressure of, to use the dangerous word one more time, natives’ inner lives is more like grasping a proverb, catching an allusion, seeing a joke — or, as I have suggested, reading a poem…” or knowing how to design for them.

A design that makes sense, that is easy to interact with, and that is a valuable and welcome addition to a person’s life is proof that this person is understood, that the designer cared enough to develop an understanding and to apply that understanding to that person’s benefit.

A good design shares the essential qualities of a good gift.

The kind of merging I have in mind is just sharing a worldview and using it together to live together, what Husserl (Heidegger’s teacher) called a “lifeworld”. I’ve called the process “enworldment”.

The merging aspect of this ideal enters the stage through my belief (shared, I believe, by Process Theology) that souls are universe-sized. The pragmatic consequence of what one means when one says “everything” is the scope and density of one’s soul. To enworld* with another is to bring two “everythings” into harmonious relationship, and to begin to function more like a culture than like two isolated individuals within this isolating milieu that so many of us, without ever choosing, without even knowing we had a choice, inhabit as prisoners of our own destitute freedom.

(Note: that “enworld” link above is a pretty old post, and I’m not sure right now how much of it I still agree with. It makes me want to engage my old self in dialogue and try to discover how much common ground we have. How enworlded am I with my 9-years-ago self?)

Ancestors and siblings of process thought

While I’m scanning passages from C. Robert Mesle’s Process-Relational Philosophy, here are two more that inspired me.

The first passage appeals to my designer consciousness:

Descartes was wrong in his basic dualism. The world is not composed of substances or of two kinds of substances. There is, however, what David Ray Griffin calls an “organizational duality.” Descartes was correct that rocks and chairs and other large physical objects do not have minds, while humans do. In Whiteheadian terms, rocks are simply not organized to produce any level of experience above that of the molecules that form them. In living organisms, however, there can be varying degrees to which the organism is structured to give rise to a single series of feelings that can function to direct the organism as a whole. We can see fairly clearly that at least higher animals like chimps and dogs have a psyche (mind or soul) that is in many ways like our own. This psyche draws experience from the whole body (with varying degrees of directness and clarity), often crossing a threshold into some degree of consciousness, and is able in turn to use that awareness to direct the organism toward actions that help it to survive and achieve some enjoyment of life. The self, or soul, then is not something separate from the body. It arises out of the life of the body, especially the brain.

The mind/soul/psyche is the flow of the body’s experience. Yet your body produces a unique mind that is also able to have experiences reaching beyond those derived directly from the body. We can think about philosophy, love, mathematics, or death in abstract conceptual ways that are not merely physical perceptions. Without the body, there would be no such flow of experience, but with a properly organized body, there can be a flow of experience that moves beyond purely bodily sensation. Furthermore, your mind can clearly interact with your body so that you can move, play, eat, hug, and work. There is a kind of dualism here in that the mind is not only the body but it is, in Griffin’s phrase, a hierarchical dualism rather than a metaphysical one. There are not two kinds of substances — minds and bodies. There is one kind of reality — experience. But experience has both its physical and mental aspects.

To my ears, this is a beautiful dovetail joint waiting to be fitted to extended cognition. “Rocks are simply not organized to produce any level of experience above that of the molecules that form them,” but if a human organizes those rocks in particular ways, for instance drilling and shaping them into abacus beads, or melting them down to manufacture silicon chips, those rocks can be channeled into extended cognitive systems which in a very real way become extensions of our individual and collective minds. It is ironic to me that even at this exact instant, in typing out this sentence, a thought is forming before my eyes with the help of rocks reorganized as silicon chips, which are participating in the “having” of this very thought. And if anyone is reading this and understanding it, then my thought, multi-encoded, transmitted, decoded and interpreted by your own intelligence, has also been organized by rocks: rocks have helped organize this event of understanding! Humans help organize more and more of the “inanimate” world into participants in experience.

And now we are wading out into the territory developed by Actor-Network Theory, which asks, expecting intricately branching, detailed answers: How do humans and non-humans assemble themselves into societies? I think the commonality within these harmoniously similar thought programs is their common rootedness in Pragmatism. It is no accident that Richard J. Bernstein saw pragmatism as a constructive way out of the unbridled skeptical deconstruction of post-modernism, and that Whitehead, who acknowledged a debt to Pragmatism, is said to offer a constructive postmodernism.

The second passage appeals to my newly Jewish hermeneutic consciousness. This is a quote by Whitehead:

The true method of discovery is like the flight of an aeroplane. It starts from the ground of particular observation; it makes a flight in the thin air of imaginative generalization; and it again lands for renewed observation rendered acute by rational interpretation.

This, of course, is a description of the hermeneutic circle: the idea that we understand parts in terms of the concepts we bring to them, but that those concepts are often modified (or replaced) in the effort to subsume recalcitrant parts. We tack between focusing on the details and (to the degree we are reflective) revisiting how we are conceptualizing those details. These are the two altitudes Whitehead mentions: an on-the-ground investigation of detail and a sky-view survey of how all those details fit together.

This is an ancient analogy. The Egyptians made the ibis, an animal with a head like a snake (the lowest animal) and the body of a bird (the highest animal), the animal of Thoth, their god of writing and the Egyptian analogue to Hermes. Nietzsche also used this image in Thus Spoke Zarathustra, and that is where I first encountered it.

An eagle soared through the sky in wide circles, and on him there hung a serpent, not like prey but like a friend: for she kept herself wound around his neck. “These are my animals,” said Zarathustra and was happy in his heart. “The proudest animal under the sun and the wisest animal under the sun — they have gone out on a search. They want to determine whether Zarathustra is still alive. Verily, do I still live? I found life more dangerous among men than among animals; on dangerous paths walks Zarathustra. May my animals lead me!” When Zarathustra had said this he recalled the words of the saint in the forest, sighed, and spoke thus to his heart: “That I might be wiser! That I might be wise through and through like my serpent! But there I ask the impossible: so I ask my pride that it always go along with my wisdom. And when my wisdom leaves me one day — alas, it loves to fly away — let my pride then fly with my folly.”

And I have seen the Star of David as an image of the synthesis of atomistic ground-up and holistic sky-down understandings. And this is one reason I chose Nachshon (“snakebird”) as my Hebrew name when I converted to Judaism.

*

(Eventually, I’ll have to try to connect process thought with my extremely simplistic and possibly distorted understanding of chaos theory. Eventually.)

Gorging ouroboros

Every philosophy is a philosophy of some kind of life.

For too many generations philosophers have philosophized about philosophizing to philosophers philosophizing about philosophizing.

This has turned philosophy into something exasperatingly inapplicable to anything important to anyone except a professional academic philosopher.

My belief (or self-interested prejudice) is that being a philosopher who philosophizes a life of human-centered design is a great privilege at this time in our culture.

Human-centered design lives at the intersection of many of our most problematic oppositions: theory-vs-practice, objective-vs-subjective, intuitive-vs-methodical, individual-vs-collective, revolution-vs-evolution, symbolic-vs-real, narrative-vs-fact, qualitative-vs-quantitative, holism-vs-atomism, coercion-vs-persuasion, technology-vs-humanities, natural-vs-artificial, and so on.

My philosophy feeds on the live problems and anxious perplexities that seize groups of diverse people when they collaborate to improve the lives of other people by changing social situations — physically, practically, symbolically and emotionally — and in this effort become so desperate to succeed that they are willing to stake or sacrifice their own cozy worldviews for the sake of sharing understandings with others.

I am convinced that philosophy can (and will soon) regain its relevance. It just needs a diet of something other than its own self-gorged self.

Overcoming empathy

A disempathic worldview: “We may be accused of lacking empathy, but this supposed deficiency is actually an efficiency, not only because there are convenient statistical workarounds, but because the very object of empathy is entirely useless. People can and should be understood in terms of observable behaviors and attributes. Any invisible ‘agent’ slipped under these observable realities is at best too vague or messy to manage, and in all likelihood superfluous or nonexistent.”

You can’t argumentatively disprove a philosophy of this kind — certainly not in its own terms. With respect to mere argumentation, it is not a matter for disproof; it is a matter for disapproval. But disapproval is not objective. It is subjective, and therefore not admissible as a valid argument to a mind that excludes all but objective criteria. Arguments about arguments will ensue, but objective minds are unable to grasp how this kind of argument is even possible, and therefore it also does not exist. So let’s not.

Luckily, we are not limited to mere argumentation. We are not Medieval Scholastics who must gather around the council table to establish theological truth through logical connections of doctrinal assertions.

We are children of the Enlightenment, and we know that we are not chained to the council table and its books and figures and dogmas and arguments. We are able — and obligated! — to stand up, exit the room with all its shadowy abstract depictions of Truth, and walk out into the sunlight of reality to see how our truths perform when we test their fitness in helping us live effectively.

This is where design thinking and social scientific method become gloriously useful. Both take subjectivity as real and testable. This sounds abstract until you realize that the fates of businesses and organizations of all kinds hang on subjectivity.

Philosophy design

For the last several weeks I have been trying very hard to care about Anglo-American analytic philosophy. In general, though (with some exceptions), I have found its problems and its approaches to resolving those problems too tedious, too inapplicable and too dry to keep me engaged. It is cognitively, practically and aesthetically irrelevant to me.

Or to put it in UX language, for me, the experience is not useful, usable or desirable. I am not the user of this stuff.

I suspect the users of analytic philosophy are other professional philosophers who want to philosophize to other professional philosophers.

*

Anglo-American analytic philosophy is the UNIX of philosophies.

My project is to design a Macintosh philosophy. (A well-designed thing to be used by people who don’t want to be forced to tinker with technicalities, unless they want to. And perhaps a thing that appeals especially to designers looking for tools to help them design better.)

*

Philosophy is a kind of design. It is a mind-reality interface.

Every philosophy permits us to render some aspects of reality intelligible while confusing or obscuring others; supports us in some practical activities while muddling others; helps us intensify the feeling of the value of some things while devaluing others. In other words, a philosophy makes our life experience as a whole useful, usable and desirable. But as with every design, tradeoffs are necessary, and where to make these tradeoffs is a function of the user and the use context. We can be conscious about it and make these tradeoffs intentionally — or we can be like bad clients and persist in trying to have it all.

And as with all good designs, philosophies disappear.

*

Even bad interfaces disappear, leaving only frustration, alienation, friction, dissipation, confusion.

*

We would laugh at an argument over whether iOS or Android is truer. Maybe it is time we laughed at philosophical arguments the same way. Let other people sit around and debate whose philosophy does the best job of representing the truth. I will do an experience assessment.

 

Scientific Method vs Lean Startup

In his instant classic The Lean Startup, Eric Ries restores to innovation processes some crucial components of the Scientific Method that have long been neglected by “scientific” management. Among his most important restorations are the experimental practices that are the heart of scientific discovery. This is enormously important: without experiment, the creative dimension of science is lost, and the “scientific rigor” of quantification becomes an expensive, time-consuming and intrinsically conservative hindrance to doing anything unprecedented.

However, I do not believe that Ries has restored the entirety of the Scientific Method; for the sake of setting up an unimpeded, engineering-dominated process, he has omitted or de-emphasized key non-engineering components that improve outcomes and shorten timelines. Here is a partial list of omissions:

  1. Hypothesis formation. Hypotheses are not just guesses that can be tested experimentally. Hypotheses are informed guesses, and it is on-the-ground discovery that informs mere guesses and transforms them into hypotheses. Starting with a hypothesis rather than some dude’s random notion can reduce development cycles. Also, some ideas are so weak that no amount of pivoting will tweak them to awesomeness.
  2. Theory. Theory in science is what directs experimentation and lends knowledge a progressive thrust. Without an appropriate theory, experiment devolves into aimless and fragmentary trial-and-error. This kind of aimlessness and fragmentation in a business context translates into confusing and disjointed products. It is not that Lean Startup does not accumulate knowledge, but that its “validated learning” is too product-centric and not nearly user-centric enough. Lean Startups know everything there is to know about their own product, the possible permutations of their product, and customer behaviors and reported opinions about the product, but insights into the user’s inner life and outer context — the things that inspire the best design ideas — will not readily surface using Lean Startup methods.
  3. Crisis. Without the rigor of theory and the discipline of reflection, the kinds of problems that produce revolutionary solutions cannot come into view. Teams will hack their way right past the crises and miss the chance to find simple, radical product insights. This is the precise point where philosophy can become a competitive secret weapon. According to Wittgenstein, “A philosophical problem has the form: ‘I don’t know my way about.’” Isn’t innovation all about finding, posing and solving such problems?

I’m going to read as much as I can about Scientific Method and develop this thought further and support it with some research. But I’ve been sitting on this idea too long, and I wanted to at least sketch it out.

 

Techne, phronesis, design and innovation

A passage from Richard J. Bernstein’s Beyond Objectivism and Relativism illuminates a problem I have encountered innumerable times working as a user experience consultant: the need for predictability in innately unpredictable situations.

Before I quote the passage, I should provide some background, which involves the role of process in the practice of design, and how the need for predictability and preconceptions about process play into it.

What clients want is an established, proven process which can be applied to their business problems in order to lead them step-by-predictable-step to a predictable outcome. The ideal is maximum predictability throughout the process.

Predictability, though, can apply to many different aspects of a process. For instance, predictability can be applied to the specific form a solution will take, or it can apply to the general effectiveness of a solution to solve defined business problems. It can apply to the specific functions a solution must perform or it can aim at achieving more general goals (and leave open the question of what specific functions are needed to accomplish those goals). It can apply to varying granularities of time, ranging from the time it will take to complete the whole process, to the time it will take to complete each particular step within the process, all the way down to the number of minutes it will take to complete each sub-task in a project plan.

The question of which particular things must be predicted is very important because predictability comes at a cost. Every point of predictability necessitates a trade-off of some kind.

For instance, predictability in regard to the form a solution will take limits innovation: it means the form is pre-defined. The kind of solution available to this kind of pre-definition is most often an assemblage of “best practices”, which is a euphemism for “imitation”. An assemblage of existing elements is easily pre-visualized and implemented methodically and predictably with easily predicted results: a competently executed best-practices frankenstein will perform well enough to earn an employee a shiny new resume bullet and maybe a year’s job security. When a client comes in white-knuckling a feature-aggregate “vision”, nine times out of ten what looks like fixation on an idea is in truth only a side-effect of severe risk aversion.

Genuine innovation requires a different and slightly more harrowing approach. It requires a higher tolerance for open-endedness. Innovation entails, by definition, the discovery of something significantly new: a possibility nobody has yet envisioned and considered. Until it is discovered, the innovation cannot be shown or described to anyone. (Innovation: ORIGIN Latin innovat– ‘renewed, altered,’ from the verb innovare, from in– ‘into’ + novare ‘make new’, from novus ‘new’).

Innovation does not necessitate radical unpredictability, though, and it does not entail an undisciplined or purely intuitive approach. The locus of unpredictability is in the particular points within the process where discovery and the need to innovate are concentrated. At the micro-level, a solid innovation process is still mostly constituted of predictable activities, but wherever open-endedness is needed, the demand for predictability is relaxed or suspended. At the macro-level, in terms of the overall success of the solution, a solid, user-informed innovation process is predictably effective in its results, even if it is unpredictable in matters of form.

Most companies fail to innovate, not because they lack ingenious, inventive, creative people capable of innovation, and not because innovation is unavoidably risky, but rather because the thoughtless demand for predictability at all points precludes innovation.

A big contributing part of this problem is that, for many people, practice means predictability. It means pursuing closed-ended goals and evaluating ideas with pre-defined criteria. The notion of an open-ended process, where evaluation involves human deliberation and multiple satisfactory outcomes are possible, seems antithetical to “best practice”.

Here is where Bernstein becomes useful. It turns out that the Greeks were aware of this distinction and had names for the types of reasoning involved in each kind of process. According to Bernstein, one of the most fundamental and damaging philosophical blindnesses of our time is the identification of techne (technical know-how) with method. We tend to impose our conception of techne on understanding and practice in general, and in the process we lose something very important and central to humanity: a type of reasoning Aristotle called “phronesis”, generally translated as prudence or “practical wisdom”.

 The chapter from which this passage is taken is excellent from beginning to end, but here is the most directly relevant part:

…Phronesis is a form of reasoning and knowledge that involves a distinctive mediation between the universal and the particular. This mediation is not accomplished by any appeal to technical rules or Method (in the Cartesian sense) or by the subsumption of a pregiven determinate universal to a particular case. The “intellectual virtue” of phronesis is a form of reasoning, yielding a type of ethical know-how in which what is universal and what is particular are codetermined. Furthermore, phronesis involves a “peculiar interlacing of being and knowledge”… Understanding, for Gadamer, is a form of phronesis.

We can comprehend what this means by noting the contrasts that Gadamer emphasizes when he examines the distinctions that Aristotle makes between phronesis and the other “intellectual virtues,” especially episteme and techne. Aristotle characterizes all of these virtues (and not just episteme) as being related to “truth” (aletheia). Episteme, scientific knowledge, is knowledge of what is universal, of what exists invariably, and takes the form of scientific demonstration. The subject matter, the form, the telos, and the way in which episteme is learned and taught differ from phronesis, the form of reasoning appropriate to praxis, which deals with what is variable and always involves a mediation between the universal and the particular that requires deliberation and choice.

For Gadamer, however, the contrast between episteme and phronesis is not as important for hermeneutics as the distinctions between techne (technical know-how) and phronesis (ethical know-how). Gadamer stresses three contrasts.

1. Techne, or a technique,

is learned and can be forgotten; we can “lose” a skill. But ethical “reason” can neither be learned nor forgotten…. Man always finds himself in an “acting situation” and he is always obliged to use ethical knowledge and apply it according to the exigencies of his concrete situation.

2. There is a different conceptual relation between means and ends in techne than in phronesis. The end of ethical know-how, unlike that of a technique, is not a “particular thing” or product but rather the “complete ethical rectitude of a lifetime.” Even more important, while technical activity does not require that the means that allow it to arrive at an end be weighed anew on each occasion, this is precisely what is required in ethical know-how. In ethical know-how there can be no prior knowledge of the right means by which we realize the end in a particular situation. For the end itself is only concretely specified in deliberating about the means appropriate to a particular situation.

3. Phronesis, unlike techne, requires an understanding of other human beings. This is indicated when Aristotle considers the variants of phronesis, especially synesis (understanding).

It appears in the fact of concern, not about myself, but about the other person. Thus it is a mode of moral judgment…. The question here, then, is not of a general kind of knowledge, but of its specification at a particular moment. This knowledge also is not in any sense technical knowledge…. The person with understanding does not know and judge as one who stands apart and unaffected; but rather, as one united by a specific bond with the other, he thinks with the other and undergoes the situation with him. (TM, p. 288; WM, p. 306)

For Gadamer, this variation of phronesis provides the clue for grasping the centrality of friendship in Aristotle’s Ethics.

 …

…for Gadamer the “chief task” of philosophic hermeneutics is to “correct the peculiar falsehood of modern consciousness” and “to defend practical and political reason against the domination of technology based on science.” It is the scientism of our age and the false idolatry of the expert that pose the threat to practical and political reason. The task of philosophy today is to elicit in us the type of questioning that can become a counterforce against the contemporary deformation of praxis. It is in this sense that “hermeneutic philosophy is the heir of the older tradition of practical philosophy.”

 *

To put it in Bernstein’s and Gadamer’s language: a solid, innovative design methodology requires an intelligently coordinated blend of techne and phronesis, guided by phronesis itself. It is an eminently reasonable process – meaning that the participants in the process make rational appeals to one another in order to come to decisions – but what is being arrived at is not predetermined, and the decision-making process itself is not determinate. Many good outcomes are acknowledged as possible. The innovators are not looking for a single right solution, but rather for a solution that is among the best possibilities.

 *

Incidentally, innovation is not needed always and everywhere (any more than predictability is). Unrestrained innovation is not a desirable goal, as fun as it may sound.