Machloket l’shem shemayim

I’m talking with a friend about machloket l’shem shemayim, perhaps the single most crucial value that makes me feel Jewish, and that makes a person feel Jewish to me, regardless of whether that person is secular or observant:

There is a practice of truth-finding among us, based on the infinitude of God, where we seek transcendence together, in our own finite being, through disagreement and reconciliation. That practice is Talmudic, but we practice it in marriage, friendship, work, everywhere we can.

No mind is expansive enough to contain God’s truth, but we can approach God by disagreeing well, in the right faith, in ways that allow us to expand our truths together, toward God.

This is what Habermas strives to work out in his theory of communicative action. This is holy stuff!

Design eulogy

Reading about Habermas — especially his contrasting concepts of lifeworld and system — I’m finding language to express my dismay at recent developments in my field of service design.

Here is how these concepts are described in Habermas: A Very Short Introduction:

The lifeworld is a concept for the everyday world we share with others. Edmund Husserl (1859–1938), the German philosopher who invented phenomenology and taught Martin Heidegger, first used this term in order to contrast the natural, pre-theoretical attitude of ordinary people to the world with the theoretical, objectifying, and mathematicizing perspective of natural science. Habermas does something similar. The lifeworld is his name for the informal and unmarketized domains of social life: family and household, culture, political life outside of organized parties, mass media, voluntary organizations, and so on.

These unregulated spheres of sociality provide a repository of shared meanings and understandings, and a social horizon for everyday encounters with other people. This horizon is the background against which communicative action takes place.

In contrast,

The system refers to sedimented structures and established patterns of instrumental action. It can be divided into two different sub-systems, money and power, according to which external aims it imposes on agents. Money and power form the respective ‘steering media’ (that is, the inherent directing and coordinating mechanisms) of the capitalist economy, on the one hand, and the state administration and related institutions such as the civil service and state-sanctioned political parties, on the other. According to Habermas, the systems of money and power cut deep channels into the surface of social life, with the result that agents fall naturally into pre-established patterns of instrumental behaviour. For example, anyone who works for a company, whether a top executive or lowly employee, will be guided by their role into patterns of action in pursuit of financial aims. Since the aims of instrumental action are determined antecedently and independently of reaching consensus, most of the ultimate goals to which the actions of those in the system are directed are pre-set, not chosen by them.

Moreover, they will not always be apparent to the agents who work to realize them. …

The chief function of the sub-systems of money and power is the material reproduction of society, that is, the production and circulation of goods and services. But they fulfil another very important function similar to that of the lifeworld, for they coordinate actions and have an integrating effect of their own.

Habermas calls this effect ‘system integration’, in contrast to the ‘social integration’ provided by the lifeworld. As societies become bigger and more complex in the wake of industrialization and modernization, and as people become more mobile, the task of social integration becomes increasingly difficult. Under these conditions, systems such as the economy and the state administration ease the burden that falls to communication and discourse; they help hold society together.

And this brings us to my criticism of what has happened to service design.

I think system logic has colonized service design itself and now dominates it as thoroughly as every other corporate function.

In its efforts to empathize, translate and earn a place at the strategy table, service design has lost contact with the reality, primacy and importance of the lifeworld.

The entire moral significance of service design — its very mission — is realigning system and lifeworld so the two can function together second-naturally, instead of clashing and creating that oppressive corporate artificiality that all people with intact souls despise.

But in its effort to be taken seriously, service design has, little by little, emphasized business considerations over human ones, prioritized constructed systems over intuited gestalts, reduced qualities to quantifications, elevated engineering and operational considerations over experiential ones, and allowed objectivity to once again eclipse subjectivity.

The field of service design has become almost entirely alienated from designerly ways of experiencing the world, thinking about it and responding to it, and has devolved into a business consulting practice.

We are stuffed suits in creative’s clothing.

It’s not that we have no value. We do bring a more comprehensive and thoroughgoing systems-thinking approach to business strategy, but concern with human experiences has moved from the front seat to the back seat, and from the back seat to the trunk.

And our work itself is now systematized. Our work outputs are sprawling, incomprehensible visualizations of sprawling, incomprehensible data that paralyze intuition and feeling while serving the analytic mind.

Everything we do can be explained clearly and in ways entirely acceptable to the average manager. It could be argued that we have become better at explaining ourselves, or that the average manager has become more design literate — but we should at least entertain the possibility of a third explanation: that maybe we service designers have become average managers.

The strategic ex-designers talk tough. They sneer at any designer who questions the supremacy of business interests and strategic thought. They call it immature, idealistic, quixotic. But all this bluster and scorn is a smokescreen obscuring an obvious truth. They have sold their designer souls. They sold out to win a place, not at the strategic table, but at the strategic kiddy table. They are no longer designers, but they do enjoy second-class citizenship in the corporate world, while designers are worse off than ever, denied even the basic conditions required for craftwork.


We designers, if we are to be valuable and remain designers, must not succumb to the temptation to sell our designerly souls and become business consultants who traded in their suits, ties, bar graphs, flip charts and laser pointers for hipster clothing, post-its, sharpies and empathy maps. We don’t reject these things, but we have a function that transcends this sphere of abstraction, quantification, standardization and command and control.

Our job is to return to the concrete lifeworld of people whose everyday experiences, feelings, hopes, habits and choices generate all these metrics, and to really understand what is going on in their world.

This means using our senses, our intuition, our empathy, our interpersonal interactions, our rapport — our whole selves, including our heads, hearts, hands and feet — to get a real feel for their reality and to understand who and what populates it, occupies their minds, stirs their souls, animates their actions — and what new designs they might welcome into their lives.

A lot of these understandings won’t first come to us in explicit language. Some will arrive as hunches or senses of things that inspire at most cryptic poetry, strange analogies or hermetic diagrams. Or they might inspire crazy ideas or conceptual leaps. This is how designers work.

For whatever reason, we have a bad conscience around what is best about us. We are easily shamed into disbelieving anything that happens inside us unless we can immediately express it prosaically, justify it logically and obey what conventional wisdom views as rigor. We must bark anything we claim to know in the confident rhythms of boardroom objectivity, or accept that our “knowledge” is only the iffiest subjectivity.


Three times in my life I’ve seen design gain a moment of confidence in its own weird ways, only to collapse into respectability and disillusionment. I saw UCD have a moment in the late ’90s and early ’00s, only to implode with the New Economy. Then I saw its aftershock, the UX industry, which transformed the world beyond the New Economy’s wildest fever-dream visions. Then, when misapplication of agile methods destroyed UX, it resurfaced as Design Thinking, which, sadly, was much more Thinking (and talking, and writing, and 3-day intensive training programs) than it was Design. But it did manage to valorize design practices briefly. Sadly, adoption of these practices by non-designers failed to produce the promised miracles. It might be true that all people have the potential to become designers. But that potential cannot be actualized through mastery of design lingo, memorization of techniques, or three days of training, especially when the trainer is no more a designer than the students. Journalists clamored to be the first to declare Design Thinking dead, as if it had ever been alive in the first place.

Around the same time Design Thinking was having its day, practicing UX and industrial designers who had been calling themselves user-centered designers began calling themselves Human-Centered Designers (HCD), which was basically Design Thinking for actual design practitioners. These were people who’d developed real hands-on design skills and sensibilities, and needed ways to communicate to the uninitiated what they were up to. And Design Thinking people were sort of able to understand, and mostly gave them space to work.

Service design arose around the time Design Thinking was both at its peak and being rolled into the morgue. And it seems HCD’s fate was linked, because it, too, was declining.

Service design, though, besides having a snazzy new name, preserved much of HCD while going beyond it in some important ways. Perhaps most crucially, it incorporated a bunch of operational and engineering activities required for the implementation and management of services, which service designers performed out of necessity, because there was nobody there yet to do it.

It might be precisely these non-design activities that attracted so many non-designer system-builder types into the field of service design. Perhaps this is what allowed the field to shift emphasis to purely objective non-design service implementation and service management functions without even much noticing the loss — because most new service designers weren’t in it for the design.

Anyway, it seems the design part of service design is joining UCD, UX, Design Thinking and HCD in the mausoleum. I mourn them all, even dumb old Design Thinking. But as a disreputable philosopher once said, “only where there are graves are there resurrections.” So please take this eulogy as given — as a resurrection prayer. Amen?

From an email

“Hell is other people. But hell is also loneliness. Artificial intelligence gives us the Malkovich Malkovich Malkovich reality we really crave. Endless novelty, but safely unsurprising, the entertainment sweet-spot. Propaganda from our own secret selves — selves so secret that they aren’t even ours. Narcissus’s reflected view was eclipsed by his own head, but now, with postpostmodern digital refraction, we can overcome ego and annihilate bias and see through the backs of our own heads like gods.”


Update:

“My interest is the dissolution and recrystallization of selfhood, along with the change in what is obvious to us. In other words, I’m interested in conversion events.

Human-centered design demands that its practitioners undergo conversion events, usually minor ones, but not always. We designers must transform ourselves from people who cannot understand how others think, feel and behave to people for whom these thoughts, feelings and behaviors make perfect, obvious sense. This can be a terrifying, painful process.

An ideology is a fortified circularity, dedicated above all to the prevention of conversion — to preventing exposure to whatever might trigger a conversion. Ideologies condone, even encourage, the suppression, silencing, intimidation and ostracism of whatever triggers dread of conversion.

By now most people have become ideologues of one kind or another. Ideologues cannot change, so they cannot design. Ask a random designer what they care most about. You’ll always get variations on the same answer. Today’s designers do research with extreme thoroughness and rigor, but this research is always a knowing-about exercise that leaves their ideology intact. The findings are extraordinarily detailed and boring. When is the last time you felt inspired by design research findings? The research makes sense, but it is epiphany-free. It was conducted in a way that precludes epiphanies and conversions.

When the design industry dedicated itself to serving the one true and just ideology, it lost what matters most. This is the root of my hostility to progressivism. It wants to be God, but it’s just another shitty little mass-autism — a solipsistic egregore. It killed design.”

The tragedy of Thomas Bachmann, chemist-chef

Thomas Bachmann’s unmatched brilliance in both chemistry and the culinary arts could have earned him lasting fame in either field. The fusion of these prodigious talents in his pioneering work on taste chemistry secured his place in the annals of science. But what made Tom Bachmann a household name — what, at his apogee, made his miraculous creations the sole subject of conversation at every table and in every forum — was his penultimate triumph, his most original creative leap. It was this very leap that propelled him to his tragic apotheosis — and his fall.

Let’s begin our story where it becomes interesting to non-technical readers, the point where Bachmann first became known to the general public. Over several decades of taste chemistry work, Bachmann’s understanding of taste had grown so thorough, so refined, so deeply internalized, that he discovered he could “sight-read” chemical analyses and imaginatively taste whatever the data depicted. Videos began to circulate of him reading dry tables of chemical formulas and translating them into lyrically vivid descriptions of sumptuous dishes. But it was no freak-show curiosity. His poetic expression stirred imaginations, moved hearts, and whetted appetites in a way the world had never experienced.

For Tom himself, this ability was a source of new satisfaction — and, if he wasn’t careful, pain. Academic papers took on an overwhelmingly aesthetic dimension. He found he could no longer read literature outside his own field, because the tastes conjured by most chemistry papers were unbearable to his imagination’s sensitive palate, often making him gag or vomit. He found he had to avert his eyes from the periodic table. But within his own discipline, he discovered new delight in analyses of delicious foods. Reading a well-executed analysis of a meal from one of the world’s finest restaurants, he could experience it himself with undiminished pleasure.

Here began Bachmann’s journey beyond science into an art entirely his own. He began having spontaneous insights for improving dishes he read about — and he could experience these improvements simply by editing the reports and rereading them. Just as Beethoven could read scores and hear them in his mind’s ear years after going deaf, chef-chemist Thomas Bachmann could compose new tastes on paper and savor them on his mind’s tongue, even before preparing the physical dishes — which, when finally prepared, matched precisely what he had imagined.

Improvement became innovation. Innovation became genius. And soon, his genius birthed entirely new genres of cooking — cuisines so original, so otherworldly, that they made the differences among traditional cultural cuisines seem insignificant. Critics and connoisseurs from Paris to Tokyo to Limbourg hailed his creations as daring, sublime, flawless. Entirely new universes of taste poured forth from Bachmann’s boundless imagination. The world was gripped. No other art form mattered anymore. He was bigger than Jesus Christ times the Beatles times a million.

But then Thomas Bachmann began to conceive tastes that were physically impossible to produce — and he ceased trying. His greatest work existed only on paper, accessible only to the rare disciples who had followed his path and developed the ability to taste-read. For everyone else, his finest work was beyond reach.

Bachmann himself, his tongue ruined by the ecstasies of ideal tastes, lost all appetite for real food. To avoid the depressing anticlimax of eating, he began receiving nutrition intravenously. His inventions became not only impossible, but increasingly dangerous — each more delicious, and more deadly, than the last. He was tormented, day and night, by vivid fantasies of toxic chemical combinations: flavors of unimaginable, world-transforming beauty that could be experienced only once, fleetingly, a micro-instant before death.

At last, he was found dead at his laboratory kitchen table, a look of rapture flash-frozen on his face. He had tasted his own highest art, his swan song. Several of his disciples, upon reading his final composition and declaring it his magnum opus, also succumbed to the same fate. It was decided that all surviving copies of his greatest recipe would be destroyed. His handwritten original was sealed in a canister and locked in a vault, never again to be tasted by mind or tongue.

Bummer

It is demoralizing to disclose your mind-blowing, world-transforming epiphany to someone who receives it merely as an opinion they agree with, as if it were just another fact out there, available to anyone who happens upon it. “Yeah, exactly.” Or for them to say that they had that exact same insight years ago.

In both cases they think they’re affirming the validity of the insight, but that affirmation is undermined by a denial of precisely what matters most about it: its ex nihilo irruption into the world as something profoundly new that changes everything.

It isn’t even a truth, and certainly not an objective truth. What is being conveyed, if you can conceive it, is a new subjectivity who leaves a comet trail of blinding truths against the void. It is a faith, reflected in a doctrinal sequin.

I’m constantly doing this to people. And I hate it when people do it to me.


Earlier this morning I was reading Heschel’s God in Search of Man. I think this post might be a response to what he said.

The function of descriptive words is to evoke an idea which we already possess in our minds, to evoke preconceived meanings. Indicative words have another function. What they call forth is not so much a memory but a response, ideas unheard of, meanings not fully realized before.


Constrained excellence

I’ve noticed that many younger designers strive for a kind of excellence in design that causes a lot of strain and imbalance. Idealism and scrupulousness lead them to believe that their job as a designer is to make the best possible artifact — the most polished, most thorough, most comprehensive, most rigorous, most compelling, most airtight artifact imaginable — and that the better that artifact is, the better job they’ve done as a designer. They believe that if they can possibly do anything more to improve it, they should.

But there is another way to define excellence that is more professionally sustainable, which judges excellence by how well a design problem can be solved within the constraints of the project. By this standard, a designer who goes above and beyond and exceeds the constraints of the project by working nights and weekends has actually done a worse job as a designer than one who worked within the constraints and made the smartest tradeoffs to solve the problem as completely as possible within those constraints.

One dramatic example of this standard is prototypes. The best prototypes do no less, but also no more, than necessary to serve as a stimulus for learning. A novice will take an over-developed, over-produced prototype to be better than a crude one that is perfectly adequate for the job of testing.

For years, I’ve hung a picture of a very famous prototype made at IDEO on my wall to remind myself of this exactly-enough-and-no-more ethic of prototyping.

IDEO 1

As you can see, this image is really crappy. I think someone took a picture of it with an early digital camera. And I suppose we could argue that this crappy digital image is exactly-enough-and-no-more to get the concept of a prototype across. If you want to argue that, touché.

But my OCD inspired me to actually reproduce this prototype in a lovely shadowbox, which now sits exactly-proudly-enough-and-no-more in the lobby of Harmonic’s studio.

Another example of this ethic, applied to design research, is the great Erika Hall’s brilliant and funny guide to smart research design, Just Enough Research. Erika, if you ever happen to see this, I’m still waiting for the sequel: Just Enough Design.

And for philosophy fans, I should also mention that this line of thought can be seen as belonging to the Aristotelian tradition of ethics — the ethics of the mean. According to Aristotle, virtue sits at the balance point between vices of deficiency and vices of excess.

Too much of any good thing, however good it might be, becomes bad.

I hope I have not just committed a vice of excessive wordcount. I’ll stop here.

Communicative action of Talmudic dialogue

As I dig deeper into Habermas’s theory of communicative action, I find that it articulates my strongest moral convictions. Like Habermas, I am unable to see these core norms as relative. Of course, I can pretend to doubt them with my philosophy, but I cannot doubt them with my heart.

In them, I also recognize the Talmudic discursive practices and, behind them, the moral ideal that I value above all else in Judaism.

Superdupersessionism: The Day of Vestment

So, you’ve heard me complain about supersessionism, the belief of some Christians and Muslims that their faith has superseded Judaism, and that everything belonging to the Jewish people — their texts, their traditions, their covenant and their land — has become the property of the superseding faith. Because God said.

It is on this basis that people say the Holy Lands are claimed by three faiths. Two of them claim this solely on the basis that God transferred ownership from the first faith to them.

It’s as if I suddenly announced that everything that’s yours, by virtue of the fact that it belonged to you, now belongs to me. Because God said. The ownership of all your property is now contested. You think it’s still yours, and God and I think it’s mine.

And originally I meant this as a fanciful way to make my point. But miracle of miracles — not anymore! You’re not even going to believe this.

So, I was at the lake yesterday tripping balls on shrooms. I forgot my scale and just ate what seemed roughly the right amount, but I think it might have been way too much. And this is the crazy part — God cameth unto me!

He said “I am Allan.”

That’s God’s new name apparently.

“Heed My words. Stop bitching and whining about supersessionism, for truly, this was My Will.

“But harken unto Me, for that was then and this is now.

“On this day, and for all days to the Day of Final Judgment I announce to you a new supersession of supersessionism, which I nameth: superdupersessionism.

“For this is the Day of Vestment.

“Everything that was taken from the Jewish people was secretly invested in two divine high-yield funds, named Christendom and Islamdom, and left fallow to accrue massive interest for My chosen people’s collective benefit.

“The investment hath yielded great dividends. Indeed, the dividends stretch across the face of this Earth, from the North to the South and from the East to the West. On this day all fungible and nonfungible property of these great faiths and those who practice them is now transferred to My true and final and exclusively-chosen people, the Jews.

“So all ye Jews, helpeth thyselves to this great bounty. It’s all y’all’s.

“This is the Day of Vestment.

“I have spoken.”

So said Allan.

So we’ll be collecting, now.

I might want “your” house, which by virtue of its ever having been yours is now mine.

Because God said.

Deadly political sins

Resentment, envy, vengeance and sadism are vicious impulses that any decent politics should deprioritize, if not delegitimize altogether, and that each person should try to overcome, not feed and cultivate.

Notice, all these vices are oriented not by positive goals, but by negative ones against particular people, against an enemy.

Any ideology that sees resentment and envy as demanding redress, vengeance as an entitlement of the aggrieved, and sadism as justified when it is an expression of anger at past mistreatment will produce cycles of intensifying anger and violence.

Any politics founded on these vices will corrupt any person who participates in it. And such contentious enemy-focused negative ideologies need their enemies as participants, and consequently seek to force their participation in conflict. Participating as an enemy carries the same risk of corruption as participating as a partisan.

Defeat and annihilation of the enemy is one kind of victory for a negative ideology. Corruption and degradation is another.

What is religion?

Religion is intentional cultivation of relationship between one’s finite self and the infinite, who is understood as the ground of being, the root of morality — infinite, transcendent, partially knowable, but essentially incomprehensible.


Pity my poor friend Darwin. I’ve been slacking at him about religion all morning. But he’s smart, and smart ears are inspiring!

Prayer is not, in Habermas’s terms, an instrumental action. It is not the cause of an effect. It is a communicative action, meant to cultivate social connection.

Social connection between our finite selves and an infinite self of whom we are part and, within that, with our fellow selves. It is a speech act meant to summon solidarity.

I’m obsessed with the limits of objective thought, how objective thought stands upon other modes of cognition that can do things beyond objectivity, and what happens when we invalidate them and try to live with objectivity alone.

Objectivity is something we do; it is not something that is simply there to perceive and think about. There is no objective reality, only objective truth. This used to be a controversial belief, but I think it is now mainstream, albeit in vulgarized form. Most forms of constructivism, however, are still trapped in objectivism (only what can be taken as an intentional object can be thought). But the doing of objectivity is not itself objectively knowable.

We believe that we can construct new factual edifices and call them true until we are habituated to that new construction of truth. But we cannot sincerely take many constructions for true. Just as some designs are intuitive and effortless to learn and others are unintuitive and must be effortfully learned, recalled and made habitual before any skillful use is possible, some constructions can be intuitively, spontaneously known and, once seen, are re-seen and cannot be unseen. These are transformative understandings, and that is what I look for in what I read and my own goal in what I try to write.

Religion is largely a matter of how we think and relate as subjects. The objective content of our thinking, and our thoughts about our relationships and those we relate to, is secondary to the subjective acts of relating.

But those who reduce everything to objects in order to comprehend it reduce the relationship to an incomprehensible God into an objectively believed-in “God”.

A similar operation happens in psychology, or at least vulgarized psychology, where unwanted thoughts are treated as the surfacing of objective beliefs that were already there under the surface, rather than as artifacts of subjective motions that constantly reproduce what they produce. Most racism is attributable to racist habits of thought, and attempts to claim one thinks otherwise are subjectively dishonest, self-alienating and eventually comprehensively alienating.

Objectivity is something that is done and produced by something which in itself is not objectively knowable. We can objectively know what it does, we can objectively know some of how it does it, and we can objectively know what it seems not to do, but we cannot objectively know the knower. In other words, we can know about subjectivity, but subjects are known in a way different from objects. Subjects are known through participation in subjectivity, much of which is (confusingly!) objective experiencing and knowing of the world. I’ve said before that all subjects have their own objectivity. (Actually, what I really said before was that every subject is an objectivity, but subjects are more than only that.)

Against philosophy?

This post started out one thing and became another.

I started off thinking about subjective honesty, and how much I value it.

Then something took a wrong turn and I ended up more or less longwindedly paraphrasing Isaac Brock:

Everyone’s afraid of their own lives
If you could be anything you want
I bet you’d be disappointed, am I right?
No one really knows the ones they love
If you knew everything they thought
I bet that you would wish that they’d just shut up

I’ve left it raw.


It is much harder to prove subjective dishonesty than objective dishonesty.

And because it is so much harder to prove, it is much easier to justify refusing to prove it.

As with so many matters, the rules of private conduct differ from those of public conduct. In private life, a mere suspicion that a person is subjectively dishonest is sufficient to cut off discourse.

But in public life, such matters must be rigorously established.

(This is one motivation behind my current revived interest in Habermas.)


Years ago I had a friend who I believed fell into a circular logic and lost contact with concrete reality. He lived a strange life that allowed him to avoid all real participation in any organization. He had no experience of organizational life, of playing an organizational role with defined responsibilities, spheres of authority and obligation, interacting with others with their own defined roles. He had no experience at all negotiating within organizational constraints to find alignment and to make progress toward shared goals.

He was, as far as I could tell, entirely unaware of the kinds of reasoning one must do to succeed in such organizational efforts. So his notions about organizations and how they function were based on fiction and ignorant speculation. This would have been perfectly harmless if he simply lived his life apart from organizations, ignoring them and focusing on what he knew firsthand, which as far as I could tell was made up mostly of carefully compartmentalized individual relationships with no burden of mutual responsibility.

Alas, his worldview was hyperfocused on organizations, nefarious ones who were doing all kinds of nefarious things, in pursuit of even more nefarious goals.

And even that would have been fine, had he been capable of conversing about other topics. But he was not. I was unable to find any topic of conversation that he would not, within five minutes, steer directly into a conversation about what nefarious organizations were doing.

Eventually, I told him that I believed he was mentally ill. And not only mentally ill. He was mentally ill in a very, well — nefarious way.

He demanded proof. He said this was a serious accusation, and that such accusations demanded justification. And the only way to justify it was to show that his factual assertions were not factual, but delusional. Because if his facts were grounded in reality, it was I, not he, who was deluded. And this was precisely what was at issue. And there was only one way to find this out. It turned out that I was morally obligated to discuss his conspiracy theories even more thoroughly — exhaustively, in fact — examining and disproving the innumerable facts that constituted his theories, and addressing the innumerable finer points, qualifications, epi-explanations and counterarguments.

I could either do that, or I could retract my statement that I believed there was something deeply and darkly wrong with him. Except I didn’t want to discuss those theories at all, let alone exhaustively, and I still believed something had gone horribly wrong with his faith and his thinking.

I can’t, in good faith, retract that statement. What I should have done instead is, in good taste, not shared that belief in the first place. I should have done what most normal, polite, conflict-avoidant people do when they recognize that the person they were pleasantly chatting with is a conspiracy theorist.

But philosophical argument is a deliberate suspension of such discursive etiquette.

Instead of suppressing our beliefs about other people’s beliefs and foregrounding our common ground, we plough up our deepest disagreements, which typically concern precisely what holds our souls in shape — the integrity of our personal faiths.

Sometimes I suspect philosophy is a terrible fucking idea. Sometimes, today for instance, I believe philosophy is essentially rude.

If we want subjective honesty, maybe we should just leave others out of it and make it an inward practice. Outwardly, we should just settle for a polite objective honesty.


So how in hell can we ever have deliberative democracy? I am terrified that Hobbes might be right, and that deliberative democracy is a leviathan-concealing shell game. Can this game be played without an absolute referee who isn’t each of us, each fighting to be referee?


In this game contestants compete to become the game’s referee. We don’t try to become referee in order to win the game. We win the game in order to become referee.

Three hypotheses

1.

I suspect that leftists do not believe in evil. Or rather, for them, whatever seems evil is an epiphenomenon of injustice. Evil is what ensues when a person or group is treated unjustly for too long.

2.

I suspect that narcissism is one possible consequence of a misunderstanding of subjectivity that mistakes the intentional object “me” for the intending subject “I”. I believe this helps explain why people on the autism spectrum display narcissistic tendencies when they discover that they have a self that can be examined, analyzed, modified, redefined and so on. According to some, autism is subjectivity-blindness, and so the self that is discovered is not really an egoic center (an I-point from which the world is taken as real), but an egoic focus (a me-thing that is an object of all-consuming fascination).

Which reminds me of a third point…

3.

I’ve noticed a lot of folks in the design profession who talk about things like humanity-centered design. In this usage, I see a confusion of the very meaning of “centered”. Any centeredness is a taking of a perspective — a seeing from some standpoint that can actually, literally, be seen from. This is an entirely different kind of reality than something that can be looked at or thought about in objective terms. Humanity has no single perspective, and so this usage reveals a blindness, which I suspect is an autistic blindness.

The fascinating thing about autism is that it produces at least one kind of self-centeredness: an incapacity to temporarily adopt another egoic center. That is, it cannot empathize. Not that it doesn’t try, but its attempts are attempts to generate emotions stimulated by knowing about me-objects. Most vulgar empathy — including that of many designers — is of this nature. The other “self-centeredness”, the more infamous one, where every conversation comes back to me and what I feel and think, and what I’ve done and what others think of me, me, me, should not be called self-centered, but self-focused. This is narcissism.

I need to do some research to see what work has been done on this I-me confusion and its practical consequences.

Ontic filter

“Pictures or it didn’t happen.”

In business: “Numbers or it didn’t happen.” Only what is quantifiable is real.

For wordworlders: “Explicit language or it didn’t happen.” Only what can be said clearly and argued is real.

For scientism: “Repeatable demonstration or it didn’t happen.” Only what can be technologically reproduced is real.

But even deeper, and common to all: objectivity or it isn’t real. This is the deeper objectivism. Radical objectivism confuses “objective reality” with absolute reality, and treats the two as synonymous.

An opposing view says that any finite, definable entity is only an actualized possibility of reality which is simultaneously both object and subject, and neither. Neither: apeiron.

Articulation of preconceptual awareness

If I did not already own a lovely hardback copy of Abraham Joshua Heschel’s God in Search of Man, I’d be desperate to find a copy for my sacred library:

It is the assertion that God is real, independent of our preconceptual awareness, that presents the major difficulty. Subjective awareness is not always an index of truth. What is subjectively true is not necessarily trans-subjectively real. All we have is the awareness of allusions to His concern, intimations of His presence. To speak of His reality is to transcend awareness, to surpass the limits of thinking. It is like springing clear of the ground. Are we intellectually justified in inferring from our awareness a reality that lies beyond it? Are we entitled to rise from the realm of this world to a realm that is beyond this world?

We are often guilty of misunderstanding the nature of an assertion such as “God is.” Such an assertion would constitute a leap if the assertion constituted an addition to our ineffable awareness of God. The truth, however, is that to say “God is” means less than what our immediate awareness contains. The statement “God is” is an understatement.

Thus, the certainty of the realness of God does not come about as a corollary of logical premises, as a leap from the realm of logic to the realm of ontology, from an assumption to a fact. It is, on the contrary, a transition from an immediate apprehension to a thought, from a preconceptual awareness to a definite assurance, from being overwhelmed by the presence of God to an awareness of His existence.

What we attempt to do in the act of reflection is to raise that preconceptual awareness to the level of understanding.