
Essay: Plants and Dilettantes (written while annoyed)

[The reader may remember this post, in which I went on a screed against writing like a jackass. This post takes a similar tone initially, but I hesitate to call it a “screed” because I start out being annoyed, but everything turns out ok in the end.]

One of the most frustrating things that I encounter in reading modern philosophy is statements of this type:

Philosophy since X has only thought of A, B, and C, and it has (illegitimately) only taken such-and-such form. Really, philosophy should be Y, etc. This is why everything is bad today, and no one wants to read philosophy.

Just about any contemporary philosophy/critical theory text (which also unhelpfully and maybe even patronizingly assumes the reader has read everything in the “Philosophy” section of the library since the Pre-Socratics, has an extensive knowledge of the punk-rock scene in New York, San Francisco, London, etc., or knows what “musique concrète” is.)

I’m being hyperbolic, of course. I’m also being uncharitable, but that is my privilege since this is a personal blog. Not all philosophers write like this (I have enough logic to know this, at least), and in many cases the people who do make such claims have good points. I do get bored and annoyed with the sometimes excessively “poetic” style of some contemporary philosophy, but an even bigger gripe is the sense that philosophers don’t read (or at least don’t write about) anything outside of their discipline.

However, I’ve just recently read a counter-example to (some of) these claims: The Life of Plants: A Metaphysics of Mixture by Emanuele Coccia, translated by Dylan J. Montanari. Coccia’s volume is fascinating for a few reasons, some more abstruse than others. I have a couple of intellectual habits that I’ve often found difficult to fit into philosophy: for one, I want to apply it. Yes, yes, metaphysics. What is it for, then? What does it do? Or convince people to do? How does it fit with the rest of the world? And, more specifically, the world of stuff? Another of my “bad” habits has to do with my dilettantism. I once used quotations from a detective novel set in North Korea to make a point in a paper on Derrida. The prof loved it, but he’s a Heidegger guy and willing to experiment. Of course, with Derrida you can get away with quite a lot, but I don’t often read much philosophy that makes use of sources too far from the 100s of the Dewey decimal system. I think that’s a shame.

Coccia does a few things that I find immensely refreshing. First of all, his analysis is grounded in actually knowing botany. He writes:

From the age of fourteen to the age of nineteen, I was a student in an agricultural high school in a small isolated town in the farmland of central Italy. I was there to learn a “real job”…Plants, with their needs and illnesses, were the privileged objects of all study that took place in this school. This daily and prolonged exposure to beings that were initially so far away from me left a permanent mark on my perspective on the world.

Coccia, The Life of Plants, xi.

Coccia’s studies eventually diverged from a purely vegetarian (haha) diet. But this deep, specific education in a discipline involved with stuff, one that revolves around living, physical beings that (as Coccia makes clear) present significant challenges to the way human beings often think of themselves and their world, nonetheless informs a book of philosophy that doesn’t just address itself to a lifeless ivory echo chamber.

Probably my favorite facet of Coccia’s writing is not in the body of the text, which is nonetheless quite interesting, but in the notes. The book doesn’t have a bibliography or works cited page, which annoys me, but it does include extensive endnotes. And they’re a gold mine for a dilettante like me.

In several notes Coccia offers readers suggestions for popular treatments of topics in botany, cosmology, and evolution, among other sciences. The notes are also replete with technical and specialized sources, of course, but the inclusion of less specialized materials demonstrates not only respect for the reader but also a refreshing sense that one can (and should) look to sources outside specialized writing in philosophy proper for material to incorporate into writings philosophical.

Coccia is clearly no dilettante, given his training in botany, but his inclusion of popular works in the sciences demonstrates, at least to my mind, an acknowledgment of the importance of the kind of edifying dilettantism one cultivates by reading works of popular science. I’ll explain.


Today, people go to universities to get degrees that will get them jobs. To be clear, this is not a bad thing in and of itself (nor is it new) – many lines of work require specialized and technical knowledge that one can much more easily gain in a formal setting than by going it alone. Universities have specialized equipment, libraries, and other resources that private individuals typically don’t have unless they have Jeff Bezos money. Western universities have their roots in the Catholic church (and, if Christopher I. Beckwith is right, ultimately in the vihara of the Buddhist world via Muslim madrasas). Clergy, lawyers, and doctors made up the entirety of university student bodies until fairly recently, historically speaking, and their courses of study were intended to prepare them for careers in these fields and in diplomacy, etc. However, the focus on utility in education tends to dissolve the more humanistic elements of education understood as a means of improving oneself. As universities become more and more like corporations, the sense that one is doing something more than jumping through a hoop on the way to a job fades into the background.

Even in historic situations where one went to university to, for example, become a priest, the actual knowledge acquisition was supplemented by a sense that one was becoming a kind of person. A newly-minted Anglican priest with bad personal habits (or heterodox positions on the Trinity, like Isaac Newton) would not be likely to go very far in the institution, regardless of their mastery of the material taught.

Like capitalism, which dissolved feudal bonds (a good thing) but then set up new problems, the modern corporate university has largely dissolved the sense of molding or shaping particular kinds of people, all the “educating the whole student” stuff you see in their fliers notwithstanding. Universities no longer act in loco parentis, which is good, and in most cases public universities don’t make weird requirements of their students for purposes of moral control. On the other hand, this means that universities are slowly becoming further and further integrated into the general webwork of hyper-industrial capitalism, creating students who may know how to do a certain job (when they even know that), but who are otherwise uninterested in the world or in learning more about it. Learning, which capital understands purely in terms of “efficient” utility, becomes something one invests in, but under the aegis of all capitalist investment: ROI. Without a strong value proposition and a good possibility of return on one’s investment, learning becomes, at best, a kind of “hobby” – or at least something one does not pursue with the intensity with which an iron-worker with a fourth-grade education in the 1930s consumed offerings from the Everyman’s Library or Penguin. Since from within the mind of capital there is no possible incentive aside from capital accumulation, whatever kind of person is produced by universities must, first and foremost, be more or less completely “mapped” and set up for integration into capital’s net. Of course, being heavily indebted with neither real estate nor financial instruments to show for it contributes to disciplining those whose mapping doesn’t stick.

Coccia’s book, for all its merits, falls victim (a bit) to the blindness to work outside of philosophy that I’ve been describing. He offers a variety of introductory texts on topics in botany, but part of the book’s argument is that philosophy has largely ignored plants, to its own detriment. I’m not in a position to adjudicate this claim, although Coccia makes good arguments. But here’s the thing. There are people considering and thinking about plants and the world. They’ve been doing it for years, but they haven’t been doing it in philosophy departments.

Examples off the top of my head: the works of Loren Eiseley, Michael Pollan, Merlin Sheldrake, Robin Wall Kimmerer and others (without mentioning similar work in fiction, documentary films, etc). Kimmerer works directly in botany, Sheldrake is a scholar of fungus, Eiseley was an anthropologist, and Pollan has written several best-selling books on human interactions with plants and food.

Now, the cynic might object: under capitalism, the only books that get picked up and published by prominent presses are books that fundamentally do not challenge the social order. While these books may be interesting, they can’t actually offer any meaningful change because they are so popular. I have two points in response to this.

First, making this claim does capitalism’s job for it. Like all other forms of social organization, capitalism presents itself as natural. Financial “survival of the fittest” and unethical dealing suddenly become acceptable when, before, usury, simony, and other practices that are now just rules of the game under capitalism were not merely crimes but sins, transgressions against moral law. The stakes were much higher than a fine from the SEC. Again, the only incentive capital can see is maximizing profits and accumulating more capital – if you have to behave unethically or immorally to do that, then you can just go to a tent revival or Pentecostalist service, have a blissed-out ecstatic experience that you take to mean assurance of your salvation, and then get right back to “the grind.” Hey, you gotta do what you gotta do, and you have to think that it is natural and normal that this be the case. But here’s the thing: capital is myopic in this way. You, the person living in a hyper-industrial capitalist society, do not have to be. Capital is hegemonic and creeps its way into every nook and cranny of the world, but it doesn’t go all the way down.

Maybe it is the case that Michael Pollan’s books simply serve to reinforce and reproduce capitalist forms of life. But how can you know that if you don’t read them? How can you know whether, buried in the garbage, there are valuable bits that could be used, repurposed, remixed, or argued against? For all you know, Pollan may be keenly aware of the limitations placed on him by the vicissitudes of the book marketplace. Maybe there was a truly trenchant critique of monocropping in one of his books that an editor ordered cut. Besides, since we all live under capitalism, Pollan has to make money somehow. He could do it in ways far more compromising than writing books about fruit.

Second: If anyone hopes to find a way beyond capitalism and its depredations, they should celebrate the fact that anti-capitalist sentiment and critiques of capitalism – some of which do in fact get published by large presses – are becoming popular and, in the process, moving out of niche subcultures and into the suburbs. It is entirely possible that a book one could buy at a ridiculous markup in an airport bookstore, along with Dramamine and some gum, might articulate critiques of capitalism or offer alternatives or food for thought. But one might never know, because the title sounds like something one’s dyed-in-the-wool Hillary voter parents would like. Surely a book available in such a place couldn’t have anything to say to philosophy, Regina Philosophiae Gratia Dei.

I will admit that a book called something like The Subtle Art of Not Giving a Fuck (a real title), or The Flower Child Within: Psychodynamic Gardening against Attention Difficulties (fake, but plausible), does not appeal to me. I certainly wouldn’t pay for either title. But! I would consider checking out a copy from the library or borrowing one. (But definitely not going to certain websites in search of a pdf…) I would read it not to gulp it down uncritically, but to actually engage with the world and with what all the people who will also have to be on board with The Revolution are thinking.


And so, after rambling in the brambles, we return to Coccia and to the possibilities in popular science books. If there’s a point to all this, it’s that insofar as philosophy understands (or understood) itself as a universal discipline, a discipline for which no part of the world is completely foreign or inaccessible, one of the philosopher’s first jobs should be to learn as much as they can about that world, and actually try to do something with that knowledge. Even if that means being a dilettante. Some degree of specialization may not only be unavoidable but necessary in a world of incredible technical complexity. But it doesn’t mean one should pass up anything on the other shelves.


Essay: Some general principles, part II

[I’ve written this post to stand on its own, but readers curious about why the numbers start with “3” will want to take a look at the first post in this series.]

Make things explicit

Make your commitments explicit – at least to the greatest extent possible. That’s where I start these considerations. Most of us, most of the time, run on a kind of autopilot, not really thinking about what we are doing or why, and letting habituation run things for us. This phenomenon should not come as a surprise to anyone, really, and I don’t intend to argue that one should work to remain fully aware of everything all the time. I would guess that just about everyone (at least everyone who lives in the modern world of jobs and commutes) has gotten in the car or on the bus and found themselves halfway to work before remembering that they meant to go to the drug store.

Injuries throw this phenomenon into sharp relief. My knee has improved substantially since I injured it in March, but I still have to be careful how I move. I was shelving books in the archives on Monday and needed to get up to the top shelf. I stepped onto the little rolling stool (you know, the vaguely cylindrical one all libraries have), and stepped up. In so doing I put just a bit more pressure on my injured knee than was wise, while also twisting a bit. Nothing popped, and the pain went away as soon as I straightened my leg out, but the experience told me that I still need to exercise caution. I can’t let my knee go on autopilot. Yet. Injuries, then, also show that the ability to do things without having to actively concentrate on them is also crucial to normal functioning.

Another reason for making one’s commitments explicit has to do with the therapeutic effects of writing these thoughts out. I don’t intend to suggest that writing about your problems will solve them (not least because I have no medical training), but at least in my own experience, getting these things out does often help one identify places where one might have some agency, an opening for something new or different. It also helps one find the knots and holes in one’s world, the places that still need elaboration – or even to be exhumed.

On that note, I’ll continue with my general principles. Again, I take these as kind of “rules of thumb” that tend to structure my life, but that don’t go “all the way down.” Hopefully they make some kind of sense.


3. You don’t just see the world – the Earth worlds

I don’t think it constitutes going out on a limb to suggest that a real, mind-independent world exists outside of us. However, the suggestion that any individual could have complete, “objective” access to this world – to “see the world as it is” – strikes me as totally bananas. And also harmful.

For Heidegger, the “earth worlds” (where “worlds” is a verb). This means that the Earth, or parts of it, coheres into a particular “world” for a particular person based on that person’s existential projects. That is, how the world appears has to do with who a person is and what that person, therefore, does. Once you start actively trying to notice birds, you will see them everywhere, all the time. Try it. Once you learn how to surf, the ocean looks different than it did before. Now you see that the calm waves that appealed to you for swimming or floating aren’t any good for surfing. What appears, and how it appears to you as the Earth worlds, will differ for everyone. This is not to say that each person sees a different and mutually incompatible world – the point here is rather that the valence, trajectory, or flavor of each person’s world differs from that of anyone else’s.

Timothy Leary (I think) called this a “reality tunnel,” making use of the emic/etic distinction from anthropology to make this point. For those unfamiliar, an “emic” perspective is “within” the object of study. A historian studying the social effects of Sufi lodges in Late-Ottoman Turkey who is also herself a Muslim, approaches the topic, at least partially, from an emic perspective. Another historian studying the same topic and period who is not a Muslim, on the other hand, would be approaching his research from an “etic” position, “outside” the object of study. Now, problems exist with this distinction, not least because any observer affects their object of study somehow, and often in ways unpredictable ahead of time. So, really, no one ever has a “pure” etic perspective because one must have some connection to the object/topic/person/etc. at hand in order to make any sense at all of it. But though “purely” etic perspectives remain beyond reach, one can, crucially, remember that one has a particular perspective.

Making one’s principles and beliefs explicit serves a useful purpose here, as well. Like most of these principles, I question (when not flatly denying) the possibility of transparency rather than translucence. I can know that I have a particular position, and bear this in mind when talking with other people – especially people I disagree with – but I can’t once and for all, thoroughly and systematically lay it all out. Why not?

For one, where is the “I” that would do this? Are these not “my” opinions and commitments? If “I” can see through myself, I have to see myself seeing myself, then myself seeing myself seeing myself, ad nauseam. Whatever “I” sees through me has to already be behind me, but by then I have two “I”s! So no. On pain of infinite regress, I cannot know my own mind transparently. But I can, by making my thoughts explicit, gain some translucence. How?

Let’s say I spend an hour or so writing out some thoughts. Then I get up from the table, make some tea, eat dinner, watch TV, whatever it is people do. During most of these activities I return to autopilot mode. [Just to clarify, this is not bad. It is necessary for life and not (completely) avoidable.] The next day, still in autopilot mode, I get up, go to work, do whatever it is I do for a job, come home, and find my notebook still sitting on the table. I idly flip through it while waiting for some water to boil and turn to the page where I made my thoughts explicit. Suddenly, I find myself face to face with myself but outside of myself. I see the words I wrote and find myself “snapped out” of the autopilot, if only temporarily. I have more thoughts on the importance of what French philosopher Bernard Stiegler calls “tertiary retentions” or “epiphylogenetic retentions,” which lead to the next principle.

4. Cognitive structures don’t stop at the inside of your skull.

I consider it a significant limitation that people seem to think that their brains are where they do all their thinking. Even more limiting is the notion that one’s “self” is something external, transcendent, and unconditioned – a Cogito from…somewhere driving one’s body around like a car. Maybe the most limiting idea, actually, is the accompanying notion that thoughts are not “things” that motivate action or that one has some degree of control over. Thoughts aren’t “real,” since they happen “inside.” My principles on this require careful elucidation, and I’m not even sure I’m getting what I want across. I’ve given a preliminary and clumsy presentation of some of these ideas in my most recent Report from the Workshop, so hopefully readers won’t lack all familiarity. Consider principle #4 the most “work in progress” of the principles so far.

Consider: When you say what you think, you use words that are publicly available. Wittgenstein’s “beetle in a box” thought experiment has convinced me, at least, that we don’t have access to a “private” language – to use language, we must be integrated into a previously existing symbolic structure and adopt its use and conventions. When one speaks their native language, it feels “natural.” One might occasionally struggle to find the mot juste or to push a phrase off the tip of one’s tongue, but in general one’s native language doesn’t feel like speaking a language at all – it just feels like speaking. Contrast this with learning a foreign language (especially as an adult) – even speakers with a high degree of proficiency might still struggle sometimes and make mistakes. It takes a long time of dedicated practice and use for a language other than the language one grew up speaking to just feel like speaking.

Further, consider the ways that we use language. Speech comes first chronologically, although one listens long before one can speak with ease. After speech come reading and writing, complicated skills that sometimes present distinct challenges, but that, in most cases, any child can learn. The means by which we read and write – clay tablets, papyrus, bamboo slips, palm-leaf manuscripts, rag paper, digital screens – all exist outside of us (in the sense of not being part of our bodies). But today, at least in “developed” countries, it is nearly unthinkable for an adult never to have learned any reading and writing. “Functional” illiteracy, or never reading a book again after high school, we can understand, but not someone not knowing what a book is at all.

Reading, then, stands as a kind of language use that requires things “external” to us – objects that do not have vocal cords, mouths, and lungs. And these objects, once we adopt them, cannot subsequently be separated from us. No matter how hard I try, I cannot “un-learn” how to read the languages that I read. I might be able to train myself to focus on the letters I read rather than their significance as words, but even if I study typefaces for their aesthetics, I read the words the type spells out.

A “reading” mind is, then, different from a “non-reading” mind. And the differences don’t stop there. Consider reading a physical codex versus reading a digital version of the same text. While both count as “reading,” in the sense that one visually decodes arrangements of symbols, these readings nonetheless differ substantially in a variety of ways too detailed to go into here. Suffice it to say that writing a to-do list on a piece of paper, or in an application on a smartphone, is a good example of the extent to which one’s mind does not stop at the inside of one’s skull, or even at the inside of one’s body generally.


I hope the reader will forgive my clumsy writing in this post (and the previous one in the series). Part of working to make one’s principles explicit involves making false starts and persisting at the edge of one’s conscious experience. I’ve insisted that one doesn’t have fully transparent access to the contents of one’s mind, but one can gain a certain degree of translucence. I’m making these posts public because it strikes me that writing so that others might read and understand what I’m talking about forces me to take an even further step outside my own head. I can’t rely on shorthands and assumptions, since I remain aware that others might not share them.

Essay: A Belated Birthday Address

On April 26th I turned 34, an age I like because 3 is a prime number and 4 is the square of a prime number. Both digits add up to 7, which is also prime (hell yes), as well as an auspicious number in several schools of esoteric and hermetic thought. Of course, next year is 35, in which both digits are primes (sick) but their sum is not (bummer). However! 8 is two cubed, and that’s still pretty good. Cubes strike me as even more esoteric.
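For fellow number nerds, the arithmetic above does check out; a few illustrative lines of Python (a toy `is_prime` helper of my own, nothing from the post itself) confirm it:

```python
# Quick sanity check of the birthday numerology: 34 and 35, digit by digit.
def is_prime(n):
    """Trial-division primality test; fine for single-digit fun."""
    return n > 1 and all(n % d for d in range(2, int(n ** 0.5) + 1))

def digits(n):
    return [int(c) for c in str(n)]

# 34: 3 is prime, 4 is the square of a prime, and 3 + 4 = 7 is prime.
assert is_prime(3) and 4 == 2 ** 2 and is_prime(sum(digits(34)))

# 35: both digits prime, but 3 + 5 = 8 is not -- though 8 = 2 cubed.
assert all(is_prime(d) for d in digits(35))
assert not is_prime(sum(digits(35))) and 8 == 2 ** 3
```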

And so, I’m firmly in my mid-thirties. That doesn’t really mean a whole lot to me except that now I have to care about the lumbar support on my desk chair in a way I didn’t before. Turning 34 has, however, gotten me thinking: how long before I can justify becoming a model train guy? I used to think I would wait until about 40 for that, but with the turning of another year plus having a nephew that will be ambulatory in short order, I’m wondering if I shouldn’t speed up the timeline some. I don’t wear white New Balance sneakers and I have little interest in either the Civil War or World War II, so I need something for some uncle cred. I already like birdwatching and, now, stamps, so maybe model trains are a logical next step?


For the last five years or so I’ve been participating in Postcrossing, an online platform that allows members to send and receive postcards from all over the world (maybe a great-uncle type thing to do). I love digging through my postcard collection to find the perfect one for each new address I get, and it’s always fun to get a card from someplace far away in the mail. But it isn’t just about the cards. I also love to see interesting stamps from all over and use nice stamps on my own cards. For my birthday this year, my lovely wife, M, got me some vintage US stamps. She also gave me some old canceled stamps from different countries around the world. One was a set with a dinosaur theme that included a stamp with a pterosaur on it from, wait for it, North Korea. Badass.

When I opened the little packet and realized what was inside I spent about a half hour ooohing and ahhhing over the postage. I would never have thought to buy the foreign stamps for myself, which made the gift all the more special. After arranging them all on the table I cracked up because of how silly I felt. Apparently I’m a character from a Charles Dickens novel: morally complicated, kind of stuffy, living in a hell-world of hideous economic inequality, and pumped about old stamps. I also wanted a pudding (flan) for a birthday treat, which made me feel like Oliver Twist or something. M was relieved when she saw my reaction to the stamps – she thought I would think they were silly, but they turned out to be some of my new favorite things.

I haven’t arranged the stamps in an album yet, but it’s on my list of things to do. Again, as a Victorian child, I love an album. I have binders of postcards from my several years of Postcrossing and an album of ticket stubs, money, and other paper ephemera from my various trips. Almost all of my class notes from undergrad and grad school are squirreled away somewhere in plastic bins or folders, as are old journals. My photographic experiments with a Fujifilm Instax camera that M gave me a few years ago (she’s a talented present-giver!) now nearly fill three of their own albums as well. But for all that, I don’t think I could ever be a serious stamp collector, even though it would give me a reason to buy yet more albums. Not least because I don’t have the money to “invest” in stamps – or anything else for that matter, although I will say that anyone who wants to give me a gold Krugerrand would get Christmas cards from me for the rest of my/their life. (To be clear, not because Krugerrands are made of gold but because I think the name is funny, the guy on the face side has the beard-but-no-mustache look of the seafaring people from The Wheel of Time, and the other side has a springbok in the process of “pronking.” Plus, it’s legal tender in South Africa but doesn’t have a value stamped on it, which I like for some reason.)

While I’m kidding about the Krugerrand (mostly), I do often find that the things that most interest me aren’t the things that command the greatest price or airtime. I don’t mean the platitude that “the best things in life are free” or some variation on that theme. That old chestnut is simply not true – a postcard from a botanical garden, pistachio ice cream, a nickel flattened by a train, or a movie ticket stub that you find inside a used book aren’t free, even though they’re some of the best non-people things I can think of off the top of my head. Rather, I mean that collecting things for the sake of completing a set, or because of some externally defined standard of valuation, strikes me as odd. What if the other members of the set are ugly, or expensive? Or, the worst possible thing, not interesting? If I’m going to collect anything, it’s going to be because I like the particular items in question, because I find them interesting – I don’t really care if they’re valuable.


In my office at home, on a high shelf in the closet, I have a box of treasures. One of these is a McDonald’s toy shaped like a Big Mac that “transforms” into a dinosaur. I have had this toy since I was in the single digits in age, and it fascinates me. The box also contains cards, bits and bobs, and various other trinkets that I have accumulated over my life and that have either sentimental value or interest for me. None of these things is valuable in the monetary sense (I think), but I nonetheless like to think of this box (which started its life as one of those picnic boxes for a wine bottle and later carried, hilariously, a small hookah) as a kind of mobile Wunderkammer [literally “room of wonders”], a nomadic Cabinet of Curiosities that I’ve carried from one residence to the next over the years.

I don’t open the box to look at the things inside very often. It’s actually kind of an intense experience, bordering on the unpleasant. There is lots of Time in there, and Memory. In a way, I’m not sure I even keep it for myself. Maybe this is a bit Romantic of me, but I think I actually keep the things inside the box for posterity. I think that’s also why I’ve never been happy with ereaders and prefer physical codices, why I keep ticket stubs, why I take notes by hand, and why I’ve kept journals and notebooks – sometimes obsessively – for most of my life.

Last night I was working on my card catalogue (about which more in an upcoming series) and came across this quotation from Byung-Chul Han’s The Burnout Society:

The imperative of expansion, transformation, and self-reinvention – of which depression is the flipside – presumes an array of products tied to identity. The more often one changes one’s identity, the more production is dynamized. Industrial disciplinary society relied on unchanging identity, whereas postindustrial achievement society requires a flexible person to heighten production.

The Burnout Society, 44.

I would substitute “hyperindustrial” for “postindustrial” here, but the point remains the same: personal identity has been well and truly hooked to market circuits of consumption, and the more identities change, the more there is to consume. To “Become a Stamp Collector,” I would have to buy: albums, stamps, other people’s collections, books or memberships to websites, flights to conventions, etc. A pretty penny for somebody, and more or less trackable and calculable. If Amazon sees me order a stamp album, you can bet your britches that my recommended items – now populated by books on Hegel, knee compression sleeves, and gardening stuff – will soon include reference books on stamp values, and maybe even some numismatic stuff (why not? I collect stamps, don’t I?).

I don’t think Han’s point is to argue that we should work to have a permanent, unchanging core Self that stays consistent across time. At least, I would hope not if he considers himself a faithful reader of Heidegger (which appears to be the case). Rather, the point is that hyperindustrial society incentivizes changing one’s “identity” via consumption, on a whim, to drive consumption, which then dynamizes production. With a weekend, a new Twitter account, and a few hundred dollars, I could become a sneakerhead, an “energetic healer,” or (God forbid) a crypto guy. The danger, then, is that I would mistake my consumption habits for truths about myself, rather than see my “self,” such as it is, as a process of projecting into a future that is not yet known and remains unforeclosed. The self, on Han’s account, is plastic. I would argue that this is not new to hyperindustrial or postindustrial society, but rather has always been the case. Capitalism has just established a parasitic relationship with this pre-existing aspect of human Being.

So what does this have to do with stamps, exactly? Or that weird treasure box I’ve described? If human Being is plastic enough that it can be harnessed and channeled via patterns of consumption as a vehicle for the flow of capital, this Being can also be modulated, molded, formed, and shaped in other ways, including through one’s immediate surroundings: one’s objects and the significance they have. While I’m sure that the sneaker guys who buy and sell and collect shoes take some sense of abiding satisfaction from that pursuit, it still strikes me as having capitulated to the insistence of hyperindustrial capitalism that its cellular components (us) exist the way it wants us to. It would strike me as a bit sad, although maybe this is unfair, for a sneakerhead’s grandchild to find grandpa’s shoe collection in a storage locker and spend time poring over it, reliving and widening their experience of their grandparent.

The things I keep, that I find valuable and wonderful, help to bolster and reproduce my own sense of self and the kind of Being I want to inhabit. But I also think of these things as affecting the Being of those who come after me. What will my grandchildren think, for example, of the reading log I’ve kept on index cards for years? For one thing, they’ll probably find it a pain in the ass that I insisted on keeping these records on paper, rather than in an Excel file. But, hopefully, they’ll also see the way my handwriting changed over time, growing spidery as I aged until eventually someone else had to write the cards for me. Maybe they’ll think of the dusty shelf where I kept the box as it filled up, and then think of the shelves full of books that I insisted on long after physical books on paper stopped being “practical.” What will they think of the odd assortment of bits and bobs in my carry-on Wunderkammer? The funny rocks, the stamps and bookmarks and ticket stubs and boxes of notebooks they’ll find when I’m gone? Will they be able to simply throw these away?


I’ve given the ideas above a fair amount of thought over the years. One of the handful of things I’ve published is a story called “Going Home” that touches on the sense of taking things with one, especially inconvenient physical objects. You can read it at The Ekphrastic Review here.

I didn’t set out to write something “serious” when I started drafting this essay. I guess things just turn out that way sometimes. One last point before I leave off philosophizing, though. If you weren’t convinced by my meditations above, think of it this way: M gave me some old stamps for my birthday, and from one stamp, for reasons singular to my own strange reality tunnel, I wrote an entire essay. How’s that for the importance of physical stuff?

Essay: Some general principles part I

Course adjustments

The other day I spent a bit of downtime writing out an attempt at formalizing (or at least making explicit) the general principles from which I tend to operate. I don’t pretend to have listed them all here, not least because I make clear in several of the principles that one doesn’t and can’t have “pure” or “transparent” access to one’s mind (not least because one’s “mind” doesn’t exist the way most people think it does). Asymptotic access, maybe, although even there I have questions.

I like to try and make these things explicit when I experience significant shifts in the demands on my time and effort, shifts which demand not only that I change what I spend my time doing, but also, in some sense, who I “am.” Semesters always end like blast doors crashing down. I go from more or less constant activity – grading papers, checking email, reading for class, writing for class, worrying about whether I’ve forgotten to do any of these things, wondering whether I have any drafts I could punch up and publish, etc. – to…nothing. I have a job this summer and other obligations, but these don’t keep me quite as busy as the semester does. Taking time to make my rules of thumb explicit means I get a chance to tap the brakes and avoid spending the summer spinning my wheels.

Speaking of rules of thumb: the principles I list below don’t work (for me) like hard-and-fast commands, or articles of faith. In fact, most of the time I don’t even realize that they structure my behavior, hence the value of making them explicit. If I know something about myself more or less clearly and explicitly, then I can make the conscious choice to continue operating along those lines, or try something else.

In the list below I have tried to articulate my principles (as of May 2022, anyway) as clearly as I can. [As I wrote and elaborated these principles, it grew clear to me that listing them all in a single post would prove too long, so I’ve just included the first two here and will address the others in further posts.] I have also tried to include relevant citations and sources of inspiration for these principles. [Keep an eye out for a post dedicated to this – it’ll take some time for me to put together.] I should emphasize, again, that these principles don’t exhaust my commitments. Nor do I argue that these principles hold in all places and times or that every person should adopt them. I can say from experience that explicit and consistent application of these principles has improved my experience of life in significant ways, but that doesn’t mean everyone else will benefit the same way.


1. You are what you do, and vice versa

“Being” and “doing” operate in a recursive relationship that one can symbolize as a kind of “advancing” spiral (scare quotes because the advance does not approach a predetermined goal, but nonetheless moves in a general direction at any given moment).

No clearly definable difference exists between thinking (including “staring off into space,” “getting lost in thought,” writing down ideas, manipulating models, etc.) and doing things like mowing some grass or eating an apple. Thoughts affect the body/mind (on which more below) in a way similar to how physical activities do, but one mustn’t confuse the levels (see point X).

Despite there being no final teleology knowable ahead of time (at least none that one can know for certain rather than believe in or hope for), a target exists at any given time under any given arrangement and schema of mind/body. That is, if the mind/body remain in their current relationship, one will tend to approach a certain point. And since one will eventually die, the point at which one dies can define the target retroactively. [For my own purposes, as an atheist, this makes sense to me. I recognize that those coming from theistic positions will differ on this point, but I nonetheless would argue that belief in a final target, in a universal teleology, doesn’t mean the same thing as knowledge of that final target. Belief in a final moral purpose to the universe does not preclude acting as though the future remains open and, to some extent, malleable. Knowing that the final teleology exists, and acting on this knowledge the same way one acts on the knowledge of where the nearest Walgreens is, brings problems. But that’s a topic for another post.]

Think of a ship underway. The ship could go to any number of ports, although not every single one of them. If the captain points the ship in broadly the right direction and just guns the throttle, the ship will probably never arrive at its destination, because wind, currents, collisions, and all manner of other things will affect its trajectory. Even a thickish mat of barnacles on one side might cause a list that takes the ship far off course without regular examination and correction (e.g. what I intend with this post). One should also note that a ship’s captain not only makes these regular course measurements and adjustments, but also logs them externally to him/herself in a publicly accessible form.

2. Not transparency, but translucence

One “never” (or as close to never as makes little difference) has total, transparent access to one’s mind/body. In support of this claim, consider the experience of being “drawn up short.” For example, consider two good friends having an argument. Things get heated – maybe they’ve had a couple drinks too many – and one says something absolutely unforgivable to the other. In the moment following this traumatic (in the sense of “resisting symbolization”) irruption, both friends just stare at each other – neither knows the other, now. The friend who made the outburst apologizes, but the die is cast. One cannot un-cross the Rubicon.

Experiencing this feeling of getting “drawn up short” doesn’t have to come from a situation like the one described above, but I trust that the reader will understand what I mean. While I gave a negative and interpersonal example, a variety of things can “draw one up short.” Catching a glimpse of a particularly spectacular mountain vista from the corner of one’s eye, narrowly missing stepping on a dog turd on the sidewalk, acute chest pains that turn out, after an EKG, to have come from bad gas. In Being and Time, Heidegger describes the experience of using a hammer, only for the head of the hammer to go flying when one tries to fasten a nail.

In The Myth of Sisyphus, Albert Camus writes that the person (he uses “man” throughout, but means “person”) who recognizes the absurdity of life gains what he calls “lucidity,” a new sense of the world and one’s possible places in it. The experience of being “drawn up short” offers an opportunity for a kind of lucidity in that it throws into stark relief the assumptions one makes about the world and its consistency.

To return to the metaphor of the ship, think about what would happen if the captain of a large ship, say a container ship of some kind, put the vessel underway and then violently cut the steering to the left or right. While here I probably demonstrate my ignorance of how modern ship steering works, for the purposes of the metaphor, we can imagine two salient possibilities. First, as the rudder slams into the suddenly adamantine water, it twists its internal mechanisms and snaps. Now no steering is possible. Insisting on constant, “pure” transparency of the mind/body to oneself leads, at least potentially, to total stasis and movement only at the whim of sea and sky. [I would argue that being “blissed out” and getting “beyond thought” in, for example, many forms of contemporary Buddhist practice or Pentecostal ecstasy, can lead to this possibility.] The second possibility involves the ship capsizing. The rudder and steering mechanisms hold, but the vessel lists too much, starts taking on water, and begins to sink. Here again, no steering is possible and one loses the control one had, however modest. Scylla and Charybdis, without even any monsters. This second possibility represents the fate of the Romantic, obsessed with some kind of “truth” to themselves that sits beyond their daily mind/body lives and efforts. As the ship sinks, it hopes to leave a beautiful corpse.

Against these possibilities stands the acceptance of translucence, the idea that one can and does have some access to the “internal” working of one’s body/mind, but that one cannot (and should not) attempt to gain “complete” access. I call this state “translucence” because something always gets in the way, but one can nevertheless see clearly enough to steer and make modest course corrections. Think of the difference between a naked light bulb and a lamp with a shade on it.


It would seem that I have now committed myself to a series of posts on this topic. Hopefully others find something of use in these principles. Keep an eye out for probably another two posts outlining the rest of my general principles (again, as of May 2022), as well as a final post consisting of a bibliography of works that I’ve often found useful or edifying. I still have some last assignments to complete before the end of the semester on Sunday, so I will probably post these latter entries this coming weekend or next week.

[For anyone interested, it turns out that I adopted this idea from a series of podcast episodes put out by Hilaritas Press, the literary executors of American writer Robert Anton Wilson. Each episode outlines the basic work and ideas of some of the thinkers and writers that influenced Wilson. RAW has remained a strong influence on me since I first encountered his work in high school, so I suppose this laying out of principles also serves as a kind of tribute to him. In any case, you can listen to the podcast here.]

Essay: Care and the Green Thumb

WARNING: If you have no patience for elliptical style, riffs and digressions, or etymological wordplay, best skip this post.


Problematic: What does it mean to have a “green thumb?”

For Heidegger, one properly acts through the hand. (Do note the singular.) Insofar as humans (which are not all Dasein, and, at least for Dreyfus, vice versa) have hands, we properly act. The hand distinguishes the human from the non-human in acting.

Of course, an immediate objection arises: what about the great apes? Or Old and New World monkeys? What about elephants, whose trunks are at least as capable of handling finicky bits as a human’s fingers? As Derrida argues pretty convincingly in The Animal that Therefore I Am, Heidegger’s thinking privileges humans over other species, thus inadvertently continuing a tradition that places humans, if not at center stage, then at least at the top of the playbill. Any attempt to identify and designate a specific difference between human and any given animal fails, on Derrida’s account, not least because one could always find examples of individuals that are not human doing things that, supposedly, only humans can do. Of course, DNA sequencing makes this trick even easier. I have a lot more in common with a pumpkin than one might initially suppose. (A fact which I rather like. Pumpkins, when planted as part of a Three Sisters bed, provide shade and keep the soil cool and moist for the beans and corn. I’ve always felt more comfortable with support/maintenance roles – a point I will return to below. Besides, pumpkins are kinda round and squat, much like myself.)

For the moment, I want to bracket concern with differentiating humans from animals. While I find Derrida’s contributions useful and important, it nonetheless remains obvious to me that, even if one cannot clearly and permanently distinguish humans from species that are not human (and that this lack of distinction bears ethical ramifications), differences nevertheless persist.

Rather than the hand, then, I would look to the thumb, the means by which one (a human and a Dasein, for the time being) grips, encircles, takes hold of. In German, a concept is a „Begriff,“ reminiscent of “gripped.” One encircles with a concept, creates a barrier or boundary (or perhaps a membrane), a place to hold on – a grip. In Heidegger’s “A Triadic Conversation,” the character of the Scholar most clearly represents the power of the „Begriff,“ of the concept as boundary.


[A brief riff, if the reader will indulge me. Humans act through the hand, but this does not apply to all humans. Even bracketing for the moment individuals with impairments or motor difficulties, at a much more basic level the hand does not represent our originary means of “handling” things in the world. How does a baby interact with the world? By putting things in her mouth. One often reads “human” to mean “adult human” (historically also “white,” “male,” and “free” or “property owner”). But how did those adults get to the point of using only their hands to interact, with the mouth relegated to food, drink, medicine, stimulants, and (sometimes) the mouths and genitals of others? The mouth takes in, and indiscriminately, until the hand mediates the encounter.]


The longest of Heidegger’s “conversations” (collected in Country Path Conversations edited and with an excellent introduction by Brett W. Davis) takes place on, you guessed it, a country path. Three conversants, a Guide, a Scholar, and a Scientist, take up again a conversation they had left off a year earlier. As the conversation carries on, the Guide seeks to convince the Scientist that, contrary to popular belief, one can describe science as an applied technology, rather than the other way around. The Scientist, a physicist and positivist, resists these ideas, remarking that the Guide’s words make him feel “groundless” or dizzy. For the Scientist, the Guide is LSD in the water. But not so with the Scholar.

As the conversation ambles on, the Scholar tries to find ways to identify and encircle the Guide’s words. Some statement reminds him of Leibniz, or Spinoza. Unlike the Scientist, whose disciplinary specificity and (necessary!) rigidity make him an easy window to smash, the Scholar has a much more flexible immune response. He enlarges the circle of a concept, broader and broader, until it can, potentially, fill all of space. The Scholar, one could say, has a much firmer “grip.”

The range of the Scholar’s ability to “grip” novelty into his existing handhold makes him (an assumption – we don’t actually know from the text) a tougher nut to crack for the Guide (whom I think one can safely say represents Heidegger more or less in earnest). To the Scholar, anything the Guide says can be identified with an existing concept and fit into an existing schema. Resemblance oozes subtly into identity.

I have, of course, a literary analogy for this phenomenon. In William Gibson’s Pattern Recognition (probably his most interesting novel, in my opinion), the protagonist Cayce Pollard (about whom more in this post) travels from New York to London to Tokyo to Moscow, and each time finds herself playing a kind of game where, when faced with difference, she tries to fit it into an existing schema. Parts of London (which she calls the “mirror world”) are “really” like New York. Parts of Tokyo are “really” like London. Anyone who has traveled extensively, especially to big cities, will recognize this pattern of behavior, a pattern made increasingly understandable (if no more laudable) by the homogenization and leveling of global culture. For me, Shanghai “really” was just like Paris until I turned off the main thoroughfares and found myself firmly back in China again. But then I passed a Burger King, entered a Starbucks, and placed an order in English, at which point I could have found myself pretty much anywhere.


[I beg the reader’s indulgence for another riff. Starbucks, it seems to me, best represents the homogenized no-place subsuming cities large and small. I have visited Starbucks locations in several countries on three and a half continents, and each only stands out as a separate place in my mind because of its differential surrounding context. For example, I visited one in Shanghai located inside a huge multi-layer mall that I found garish and too bright. It looked just like all the “nice” malls I have ever visited, but something felt a bit “off,” like how UHT milk from a box doesn’t taste like fresh milk. Another Starbucks, in Mexico, I remember because the inside of the shop was too intensely air-conditioned, leaving the glass door to the outdoor seating area covered in a thick layer of condensation. It gets hot on the Yucatan Peninsula.

One might respond that McDonalds would serve as a better example of homogenization. I would not disagree. Initially I would say that McDonalds has more of a functional or even “low class” set of associations and homogenizes “from the ground up,” but that doesn’t exactly work since, for example in China, one can buy fast food from street vendors for much cheaper. McDonalds isn’t haute cuisine there, but neither is it a cheap source of fast and convenient calories. Again like Cayce Pollard, whose usual “allergy” to haute couture brands bothers her less in Tokyo than it does in London, context matters. Nonetheless, I think that Starbucks, which I associate with people tap-tapping away on MacBooks, better represents the digital and aesthetic homogenization of culture. Maybe a homogenization from the inside out, from the aspirational and downwardly mobile middle- and consuming classes that serve as insurance against overproduction. A smoothing of culture, as Byung-chul Han puts it in Saving Beauty. To put it a bit vaguely, a McDonalds anywhere feels like more of a “real place” to me than a Starbucks anywhere.]

Now, I don’t mean to suggest that making comparisons or finding similarities is some kind of problem in and of itself. You need some existing schema to apprehend a new idea, at least initially. Learning the grammar of your own native language makes learning a foreign one easier (or at least less totally baffling). The problem arises when all novelty is “fittable” into one’s schema ahead of time. We don’t live in a modular world, where pieces can go together in various ways, but are nonetheless standardized. This isn’t Legos. Heidegger’s Scientist needed his rigid positivism not only to actually conduct scientific research, but also to allow for the possibility of going beyond his scientism. Byung-chul Han writes (somewhere, I don’t have the citation right now) that knowledge differs from data in that knowledge is gained against resistance. The Scientist’s rigidity creates precisely such resistance. The Scholar’s erudition, on the other hand, more amorphous and loose than the Scientist’s, runs the risk of souring and overrunning the entire world. Like a gas, there’s nothing to push back against. Every Starbucks looks like all the other Starbucks, even if the layout and specifics differ slightly. If you’ve seen one Starbucks, you can probably literally imagine them all.


Speaking of Starbucks, where they wear green aprons, I now sense the approach of the point of this excursion, like a change in the wind. To return to the green thumb.

The thumb serves to grip, to encircle, to make concepted – „zu ‚Begriffte‘ machen.“ As we saw with Heidegger’s Scholar, this gripping broaches the possibility that, as Ecclesiastes would put it, “there is nothing new under the sun.” Everything strikes one simply as “like” something else. One cannot any longer imagine novelty so new that it passes through to trauma.

The green thumb, then, a subspecies of thumb as it were, “grips” and encircles. But now, we must ask: what does it encircle? How hard does it grip? Does the wrist remain loose and flexible, or taut, tight, under pressure? Do the muscles of the forearm suffice to accomplish the hand’s goal, or do you have to put your back into it and slip a disc? Does the grip involve all five fingers? Both hands? (Heidegger, to the best of my knowledge, does not ask or answer these questions. Part of his problem with typewriters has to do with one properly acting “through the hand.” Of course, as Don Ihde points out, this is a clear indication that Heidegger never learned to type with any proficiency.)

A green thumb means its holder (its haver? its bethumbéd?) can keep plants growing and alive. Many people described as having “green thumbs” can, of course, tell others in explicit terms how to care for plants, but their ability nonetheless continues to strike others as peculiar and impressive. And even they themselves cannot exhaustively describe their own capability. Why? Because “having a green thumb” does not mean “knowing all about plants and being able to express that knowledge systematically and precisely in symbolic form.” To those poor souls who always kill their succulents, the “green thumb” is magic, something almost preternatural that they despair of ever learning. But this is a mistake.

The meaning of a “green thumb” really comes down to this: a particular way in which the green thumb “grips” the world. It is not a way of knowing in the sense of exhaustively and systematically articulating symbols through recall, but rather a way of comportment, a mode or key of being.

Consider an analogy with your native language. We say that one “knows” one’s native language, but we really mean something more like one lives one’s native language. (To put it in Heidegger’s terms, “language speaks us.”) Aside from sometimes struggling to find the right word, or occasional stumbles, one does not need to remember anything to speak one’s native language. Don’t believe me? Spend six months working diligently but not too intensely on Duolingo (any totally unfamiliar language will do), then take a trip to a place where that language is the native language of most of the population. If possible, try to avoid big cities where you are more likely to encounter others who can translate for you.

What will happen? Well, Duolingo works pretty well, so you’ll get up to speed on basic terms and meeting basic needs quickly enough. But beyond that, you will find yourself thrown for a loop. You will find, in your stumbling attempts to navigate the world and interact with others, that how you communicate with others plays a significant role in forming both who you are to others and to yourself. The most difficult (and intimidating) part of learning a new language is the plummeting feeling of having to learn how to be yourself again.

A green thumb – or an eye for photographic composition, or an ear for musical composition, or a good arm in baseball – works the same way. One doesn’t “have” a green thumb or “know” a green thumb. One is a green thumb. That is, the green thumb serves as a descriptor of a mode of being in the world, one that cannot be exhaustively expressed because it does not come after the one doing the being – it is the being.

Another analogy might help. I do not know how to surf. If I accompany a surfer to the beach and we both look out onto the ocean, she and I will see different things. Not “literally” (at least assuming we have similar levels of visual acuity, etc.), but rather in the sense that the surfer will be able to tell if it’s a good day for surfing, and I won’t. She might be able to explain some of how she knows this, but not all of it. And, unless my being already exists in some sense “adjacently” to the being of a surfer, I may not even understand the things she is able to explain. However, if I begin learning to surf, if I practice surfing, if I become a surfer, then maybe someday she and I will be able to once again walk onto the beach and both see whether the waves are good that day or not.

The green thumb works the same way. One has to learn how to be such that one has a green thumb. While this learning must incorporate explicit symbolic knowledge to some degree, the real work, the real learning, and the real change in being comes from the doing, and from the becoming.

The green thumb, as a thumb, grips, it creates and holds concepts of the world. But the green thumb differs from, for example, the Scholar’s pre-configured means of expanding his grip, precisely because plants are not symbols. The mimosa tree in my front yard is, if the conditions are within a certain range, gonna mimosa. Period. I can help it along, shelter it, take care of it, feed it and water it, but fundamentally, the plant is doing its own thing. The green thumb “grips” the plant, but it can never do so completely, simply because the plant does not allow itself to be fully symbolized. It is outside of the human in a significant sense, and even an exhaustive knowledge of horticulture does not preclude the possibility of plants dying for what appears to be no reason. For all that one’s symbolic knowledge of plants can expand and expand, it eventually founders on the brute reality that the plant is not up to you.

And here we see the most salient facet of the green thumb. Insofar as it does “grip,” conceptualize, and encircle, it does so in the knowledge that this is only ever a kind of loose grip, a conceptualization that may prove useful in some cases, but ultimately fails to fully encircle its charge. It is a grip of care, the careful grip with which one holds a child’s hand while crossing the street. This is not a grip one can learn except existentially. By doing. And in so doing, by changing not just what one knows, but who one is.

Essay: That’s exactly what they want you to think.

(Formerly posted as “Report from the Workshop: 04/29/2022,” but I decided it’s much too long for a report and should stand on its own.)

Semesters are like volcanoes: they simmer and simmer for a long time without anyone thinking much of it, and then they decide one morning to violently explode.


Yesterday I submitted an outline that resulted from a semester-long independent study on Heidegger’s thinking of ontological “death” and its ramifications for education. I started with education, but somehow ended up creating a pretty ambitious research project involving existential death, conspiracy theories, and the epistemological necessity of vulnerability.

I say I started with education because, to be honest, I’ve grown tired of thinking about education. Problems in education increasingly strike me as consequences of more general (and therefore more invisible) social, technological, and epistemic limitations. Talking about “education” on its own seems more and more like missing the forest for the trees.

Of course, as a (sometime, amateur) Marxist, I shouldn’t find this surprising. The depredations of capital flow affect all aspects of social life, although differentially in different domains. I’m finding myself gravitating more and more toward what Heidegger calls Gelassenheit, “releasement,” a term he cribs from the German mystic Meister Eckhart and reappropriates for his own use. Where Eckhart would advocate “releasement toward God,” Heidegger would advocate “releasement to the things (in the world).” This allows one to “return to oneself” and see one’s existential situation anew and (potentially) with greater clarity. It’s also the exact opposite of the way that networked digital media platforms want people to behave. Thoughtful, meditative behavior doesn’t play well on platforms that run on a fuel of “engagement.”

[Upon reflection, it strikes me that one could read Gelassenheit as a kind of “blissed out” disconnection from the world. I don’t think Heidegger intends this reading, although it’s not difficult to see how one could make this mistake. Rather, and I think this is important, “releasement” for Heidegger is releasement to the world and how it shows itself to us. That is, one’s usual and unthinking apprehension of the world is “broken” and then “reset.” I hurt my knee about a month ago, and though it’s much better now, it still feels different than it did before – I actually have to think about walking and pay attention to where I set my foot. I’m imagining releasement as occasioning something similar.]

For a while now I’ve been mulling over the idea that we humans, especially but not exclusively those of us in the Global North, have been “domesticated” by a product of our own invention. Networked digital technologies are “pharmacological” in that, on their own, they don’t have a positive or negative valence. Two aspirin help a headache. A bottle of aspirin, however, will kill you. It isn’t exactly a question of quantity, but rather of distribution and following impulses. Every time you get mad at something you see on Twitter and “clap back,” Twitter is literally (and I mean this in the dictionary sense, not for emphasis) making ad revenue from the reflexive operation of your neural pathways and fight-or-flight reflex because the more you stay online, the more angry and invested you get, the more fucking ads you are exposed to. It’s like if you “worked” in a factory where every time the doctor made your knee twitch with that weird hammer the hospital administrator got money from the hammer manufacturer. (Maybe that is how doctors work, I don’t know.) But that’s an essay for another time. Right now I want to talk about serendipity.


As I was typing up my outline to turn in I realized that several of the books I’ve read “for fun” this semester have borne direct relevance to the social epistemological questions I’m beginning to pose. This happens to me pretty regularly, actually, and it’s probably just a case of apophenia, seeing patterns where there aren’t any. Of course, if the universe is one unified thing and any individual and their sensory apparatus is a distinguishable part of it that, nonetheless, follows similar rules as elements of the universe at much larger and much smaller scales, then who is anyone to say that there aren’t patterns? Maybe we just need a different point of view.

{The sentence starting with “of course” in the above paragraph is dangerous. The astute reader will understand why. If you don’t understand why, just recognize that I was, and again literally, fucking around up there.}

Let’s talk about books. First, I started reading a collection of Philip K. Dick’s short stories in January. I keep the volume by my bed for nighttime reading, so I haven’t made a ton of progress through it. But even Dick’s weaker offerings bear the distinctly clammy and metallic odor of paranoia. His VALIS trilogy, written after a kind of mystical experience he underwent and then tried to work through in his Exegesis, features a millennia-long conspiracy in which the Roman empire never died and continues to enslave humanity. Wild. In Dick’s fiction, nothing is as it seems, and there is often no way out. (Incidentally, I appreciated the most recent Matrix movie for driving this point home. I’m a congenital contrarian, so I love that film because everyone else seems to hate it, but I also love it because Lana Wachowski strikes me as dedicated to not infantilizing her audience with a clearly spelled out “message.” Just like the previous installments in the series, the “moral” of the story is: “take a minute to think, you philistines!”)

I also began Robert Shea’s and Robert Anton Wilson’s Illuminatus! trilogy, a send-up of acid-trip political paranoia from the 60s and 70s. The narrative structure is experi-mental (see what I did there?) with point of view changes galore and makes reference to a wide variety of very specific conspiratorial schemas. The intention is clearly to satirize paranoia, but the novel does so in a way that leaves the reader unsure of just what the “real story” might be. My opinion, for what it’s worth, is that this uncertainty regarding the “real story” is good. Since Descartes, philosophers have looked for “absolute knowledge,” knowledge we could know without a shadow of a doubt that we knew. Personally, having read the bit of Descartes’ Meditations where he gets to his famous cogito, I think he may have been trolling. In any case, the spectre of “absolute knowledge” looms large and nastily. For a Biblical literalist, any challenge to a truth claim made by the Bible potentially throws the whole thing in question. Hence the literalist’s jumping through ever-more-spurious hoops to save the phenomenon. But here’s the problem: this kind of face- and phenomenon-saving behavior is now characteristic of everyone. Why can’t things be “true enough?” Or, saints preserve us, fucking metaphors?

Umberto Eco’s Foucault’s Pendulum, which I just finished the other day, actually makes that last point explicit. It’s the story of three editors at a publishing house who basically use a computer program (named after a medieval Kabbalist) to invent a global Knights Templar-themed conspiracy after encountering a strange Colonel with what he claims is a decoded Templar message. At first it’s a joke, designed to poke fun at the spurious dot-connecting done by the “Diabolicals,” enthusiasts of the esoteric who constantly submit manuscripts “revealing” hermetic and conspiratorial secrets. The editors are skeptics, with a hard-headedness Eco describes as apparently congenital to the Piedmont region they come from. Over time, however, that all starts to change. As the Plan becomes more and more real to them, and the stakes start getting higher, the narrator Casaubon reflects that he and the others have, precisely, lost the ability to think metaphorically or figuratively. The novel is deeply tragic, even though it is, like Illuminatus, intended as satire.

I’ve often thought that fiction is a better vehicle for some ideas than non-fiction (especially in philosophy). Genre fiction like sci-fi or thrillers seems especially useful to me, partially because it isn’t (or hasn’t historically been) taken seriously. Crichton’s Jurassic Park, for example, makes what seems like a pretty persuasive argument for at least some amount of caution in biological engineering, but when Jeff Goldblum’s lines get turned into memes, the thrust of the argument gets obfuscated.

Foucault’s Pendulum has been described as “the thinking [person’s] Da Vinci Code,” and I think that’s right. The point of the novel is to show that the logic of conspiracism leads to an abyss. When everything can in principle be connected but there is no nuance, no sense of when and which kinds of connections are appropriate, one falls into the trap of having no choice but to try and become omniscient. This is impossible (for a human being, anyway), and so omniscience comes to mean imprisonment in a miasma of one’s own epistemological overindulgences. It doesn’t even make sense to call it a “web” of connections anymore because a web has a particular valence – it isn’t arbitrary. While Eco could have probably made this point quite clearly in an essay (or, haha, a blog post), the novel’s form, that of an upper-level airport thriller, gets the reader in the guts in a way that making claims and articulating arguments does not.


“Interesting,” the reader has by now mumbled to themself a few times. “So you just happened to read several books that all had to do with paranoia and conspiracism, and then decided to do more research on this phenomenon? Seems pretty straightforward to me.”

I agree, actually. I’m not trying to argue otherwise. Rather, I’m trying to demonstrate that there doesn’t need to be a straight line from point A to point B in all cases, and even where such a line does in fact exist, one might not be able to perceive it until after the fact because, wait for it, the line itself might not exist until after the fact. (Hegel, whom I haven’t read, calls this Nachträglichkeit, “retroactivity.”) That is, there’s a difference between conspiracy and serendipity, but sometimes this difference is hard to perceive. Either way, one should wonder, “does there need to be a reason?”

The final book I want to talk about, William Gibson’s Pattern Recognition, deals with a kind of serendipity of perception and offers a potential corrective for the pathological drive to omniscience. Probably best known for his earlier Neuromancer, Gibson basically invented the cyberpunk genre. Pattern Recognition, however, doesn’t exactly fit that mold. There are computers, of course. The plot actually comes to revolve around a series of film fragments of unknown provenance unearthed on the (2002) internet, but the digital technologies and the world of the setting are all “real” and recognizable. The novel also has to do with, as the title suggests, pattern recognition, and seeing patterns where there aren’t any. But over the course of the novel the reader watches protagonists who don’t gain victory over the world of networked technologies and final, full understanding, but rather find a kind of catharsis in not knowing for sure.

The protagonist, Cayce Pollard (pronounced “case,” though she was named after the American mystic Edgar Cayce (pronounced “casey”)), works as a freelance “cool-hunter,” roaming urban streets on the lookout for the Next Big Thing in fashion. She has a strange and somewhat uncomfortable ability to “sense” whether a logo will “work” on the market or not, as well as a complete intolerance for brand names and logos which she describes as a kind of “allergy.” Gibson makes a fair bit of hay over, for example, Cayce’s clothing – tags carefully cut out, the pseudo-brand of a “casio clone” watch sanded off. (Many of these descriptions read like museum copy twenty years on, which I think adds to the novel’s interest.) Cayce doesn’t know how she does what she does, only that it works. When she is hired by Hubertus Bigend, a Belgian businessman in the mold of a proto-Elon Musk, Cayce finds herself connecting her business of evaluating logos with her passion for finding whoever is making the mysterious online footage. Think Indiana Jones but it’s a black-clad woman from New York who does Pilates in the early 2000s. (Just to be clear, this description is intended as a positive appraisal of the book.)

While parts of the novel now feel dated (no smartphones, people communicate by calling and emailing rather than DM’ing, etc.), it nonetheless remains eerily resonant. The reader learns, about halfway through the novel, that Cayce’s father, Wingrove Pollard, worked as a security contractor for American intelligence services securing embassies. Win disappeared on the morning of 9/11/2001, but there is no proof positive of whether he is dead or not. The novel takes place soon after 9/11, and the trouble with Win’s undeath has to do with his estate – Cayce and her estranged mother, who lives in a kind of hippy commune dedicated to scanning rolls of tape for so-called “electric voice phenomena” (EVP), cannot claim Win’s inheritance until he can be proven to be dead. But that isn’t really concerning to Cayce. Rather, the really concerning thing is not knowing.

There’s a lot of not knowing in this novel, and I would argue that the catharsis Cayce eventually reaches (which I won’t spoil) serves as a useful model for how we ought to live now. 9/11 has faded into the background of the American psyche over the last twenty-plus years (although not from American politics, unfortunately), but we still find ourselves living in a world beset by bad things happening for reasons opaque to us. The rush to claim that covid-19 was a Chinese-developed viral weapon, for example, tries to find an “explanation” for something that, insofar as it posed a threat to global health, at least initially simply had to be dealt with. I think it likely that scientists will know for certain where and how covid came from in my lifetime, but I don’t think we know now. That doesn’t stop speculation, though, driven by the pain of not knowing, of feeling the rope slip through our fingers as we hang over the abyss, unsure whether anyone will come and save us.


Pattern Recognition presents the reader with two questions that eventually merge into one for the protagonist: “who makes the mysterious videos?” And, “what happened to my father?” One of these questions is, eventually, answered. The other, however, is not. Or not completely. Not beyond a shadow of a doubt. But even with this possibility of doubt, Cayce finds a way to live. To “pollard,” in horticultural terminology, means to cut down a tree, leaving a stump from which new, straight branches will sprout. It’s a means of sustainable forestry because a few pollarded trees can produce lots of wood for quite a long time, rendering it unnecessary to cut down other mature trees. One could read Cayce’s last name as reflective of the myriad possible coulds she encounters. There isn’t a main trunk to speak of – the postmodern “proliferation” has replaced the late-modern “grand narrative.” Coming from the position of Descartes, or later of Kant’s sapere aude!, “dare to know!,” the only choice in a world of massive complexity and scale seems to many of us to be trying, like the editors in Foucault’s Pendulum, to make sense of it all. The desire to become omniscient, to become God, to become identical to the universe itself, is a desire not for immortality and certainty, but for un-death and the constantly grinding need to continue suspecting. Either it all makes sense, or none of it does, says the inheritor of an Enlightenment grown malignant, and the abyss calls louder, louder.

What saves us from the abyss? Well, at least from my perspective, certainly not God. Neither will History, Justice, The Next Generation. The arc of history only bends toward justice if it is made to bend. The universe on its own seems supremely unconcerned with the whole thing, like a dandelion blowing in the breeze. We’re on our own and, like Cayce Pollard, unsure of what’s what. But also like Cayce Pollard, we’re not each of us all alone. Pollards produce myriad new growths from a single stump. We can still help each other, even if no one person finally “knows the score.” And we can also keep each other honest. Not necessarily by arguing, but simply by wryly asking, like the skeptical Piedmontese editors in Foucault’s Pendulum before they succumb to their own game, “you really think so?”

It would be all too easy for me to look at my reading this semester and think, “oh wow, I guess it’s my destiny to write about conspiracy theories since I read these books without realizing it!” But then, when I hear myself say this out loud, the other me, identical to me but from further along the timeline, grins and says, “you really think so?”

Essay: Don’t Write Down Your Nose At Others (A Screed)

[A note to the reader: “screed” seems an accurate descriptor for this essay after my having written it. But a screed is not necessarily incorrect, just impolite. Since this is a personal blog, I make no apologies. Nor do I give specific examples.]

[Another note to the reader: I wrote this essay several weeks ago and have sat on it for a while because I don’t quite know how I feel about it after getting it all out. I still think I make good points here, but the essay is a bit repetitive. I’m posting it anyway because I haven’t posted in a while. Maybe I’ll come back to it later. -jk ]

Writing as a philosopher, “theorist,” “thinker,” etc. does not give one license to write like a jackass. I find myself increasingly irritated and impatient with “thinkers” who write from on top of the mountain of “theory,” where all the smart (read: “good,” “informed,” etc.) people live. These writers take the tonal equivalent of people from New York City or San Francisco who assume that others know all about the geography and administrative subdivisions of their city. No, I have no idea where “Queens” is, nor do I know what, if anything, being from there means. “The Bay Area” is another one. Which Bay? The Chesapeake?

Don’t worry: I’m not pulling a JD Vance and trying to pivot from college-educated cosmopolitan to straight-talkin’ yokel, although Vance’s cynicism in his own recent politically-motivated pivot is so astounding as to almost be impressive. I don’t have a problem with dense, abstruse, technical language. (Someone claiming to be “telling it like it is” can be guaranteed a skeptical eyebrow-raise from me. Thanks, Derrida.) In fact, I don’t even really have a problem with the claim that some ideas are so complex or counterintuitive or whatever that the text explicating them needs to be difficult. While I would argue that many conceptual difficulties can be more or less cleared up by trying to explain one’s ideas to a bright middle schooler, in principle I don’t have a problem with some texts simply being difficult. Anyone who has read and enjoyed one of Stephen King’s novels featuring Maine accents so thick you can hardly understand them has encountered a phenomenon analogous to some “difficult” theory. Readers of pulpy sci-fi or multiple plot-line “high fantasy” are in a similar boat.

So, what’s my problem, then? My problem is “theorists” or, even better, “thinkers” (ughhh) that don’t write difficult prose, but rather knowing prose, prose that will be read and appreciated (only) by those whose noses are attuned to the subtle aroma of rare discursive ambergris. And not only will this prose be read and appreciated, but part of the frisson of its appreciation is the disavowed knowledge that other people aren’t getting it because they aren’t as well-read as me and that I am, therefore, in some vague sense superior to them.

“Difficulty” is not the issue, nor is technical language or expecting a reader to do their share of interpretive work. The issue is the sly wink, the little nod of recognition that the reader and writer are, already, in the same club. Even more fundamentally, the members of that club refuse any attempt at trying to open membership to others not already a part of it. It’s “not their job to educate you.” (Yes, in fact, it is.) These writers make little attempt to explain their positions and give context to help bring their readers more fully into their discursive complex. They don’t seem to either be struggling to present the material or have struggled to think about it. When it comes to those not “in the know” – even before reading the book! – they simply shake their heads or shrug. Hélas, they say. What’re you gonna do?

In sombunal cases, “knowing” writing bears a resemblance to a bad habit I often see among highly-educated liberals: using “ignorant” as a slur rather than as a neutral descriptor. For these well-intended people, others who are not like them (i.e. anti-racist, anti-sexist, “woke,” cosmopolitan, desiring of adherence to politeness and “sensitivity”) are not like them, ostensibly, because they are ignorant. They don’t know enough. If they only went to grad school or read a damn book, they’d see the truth, just like the “right-thinking” liberals! While I share many of the positions these liberals espouse, at least the social ones if not their milquetoast economic stuff, I part ways with them over their refusal to admit the creeping condemnation that rides along like an invasive species with their noting that others don’t know fact X.

For the “knowing” writer, there are certain home truths (even when that writer is denying the existence of capital-T “Truth”). These truths are not up for question because in most cases they are not even made explicit. And, more importantly, one should demonstrate a certain affect about these unacknowledged truths. Those in the know are the “good” people, predestined by God in a latter-day literary Calvinism to paradise, while those unfortunate enough not to count themselves among the elect have no hope of escaping Hell. The reader not in the know, for the “knowing” writer, is a benighted rube and will, hélas, just have to stay that way, I guess. What’re you gonna do?

In many cases “knowing prose” isn’t marked by anything direct or explicit in the text. Rather, the “knowing” haunts it. There’s something in the tone, or the little parenthetical jabs, or the diction. To put it simply, “knowing” prose gives off a “vibe.” Talking about things in terms of “vibes” strikes me as a phenomenon worth considering. Complain all you want that this is an imprecise Zoomer re-appropriation of hippie slang, it still seems quite useful to me. “I just get a bad vibe.” You feel it in some peripheral part of your perception, like the little nudge you get to grab an umbrella before you leave for work, just in case. I wouldn’t argue that one can live on vibes alone – you need an argument, too – but vibes nonetheless serve as a useful starting point. And attention to the “vibe” of a text is precisely what leads me to frustration with such “knowing” writers. They have no sense of the nasty “vibe” they give off.

My internal Freudian speaks: “yes, but could your frustration not really be a projection of your own habits and tendencies onto a text?” Of course it could, and it probably is to some degree. I live in the same world as these “thinkers,” or at least in an adjacent zip code. I am definitely guilty of looking down my nose at others, and of doing so because they aren’t in the know. And yet. Two further points come to mind:

  • Does projecting onto a text necessarily disconfirm the observations in that projection? That is, does the possibility of my irritation stemming from projection mean, by itself, that I am therefore wrong in my observations? Could it not be that my observations are both born of projection and accurate, at least in some cases?
  • Does the fact that I have no doubt both looked down my nose at others and projected my own bad habits onto a text mean that I must do these things, or that I will always do these things? One would think that people might grow and change – otherwise no one raised in a racist society could become anti-racist. Despite the hemorrhaging of church membership and attendance, the Anglosphere sure seems to still pump out a fair number of Calvinists.

The “knowing” writer commits what is for me a cardinal sin in exposition: discounting entire groups of readers from the get-go as a way of further defining their own sense of worth and sufficiency, and of doing so at the expense of everyone not in the club.

I want to make something perfectly clear: I do not intend to argue that malignant, willful ignorance does not exist, or that non-college-educated people have some kind of “authenticity” which the college-educated have lost. I likewise do not want to argue that ignorance of particular facts makes one see more clearly. Learning about biology or ecology, for example, will (often) change one’s mind about how things are, hopefully in good ways. Rather, I want to point out that what I’ve called “knowing” prose does both the writer and the reader a disservice by alienating them both even further than they already were from others they assume not to be “in the know,” and does so without any basis in facts on the ground. They aren’t alienated from readers who will react antagonistically to their writing, or people who have no interest in it, but from intelligent, sympathetic readers who are simply not (yet) playing the same “language game” as that of the “knowing” writer. Writers should write for a specific audience. But to structure that audience on the basis of a prelapsarian predestination to benightedness and the hellfire of “ignorance” hurts the writer in the end, and not least because it shrinks their potential market share.

Consider: let’s say you know something. Something important and useful. You want to write about it. Writers, as far as I know, want to write to explain their ideas to others, to engage with others and convince them of something or show them something they hadn’t seen before. The “knowing” writer does all this, at least to some degree. However, the “knowing” writer is not, deep down, actually upset that others don’t know or care about what they know and care about. If everyone read their book, gave the ideas some thought, and then adopted them, the “knowing” writer would no longer be special! To actually communicate their ideas and write to others effectively, the “knowing” writer must give up that extra little spurt of dopamine they get every time a benighted rube gives them a blank stare or asks a too-basic question. And in this day and age, who will willingly give up free dopamine?

That the people who write in this “knowing” way often identify or are identified as “radical” thinkers is especially egregious. I won’t deny that I have probably read more books than many people who didn’t go to college, but that just means that I have more work to do in writing for others. (Note even here one of the assumptions that inform “knowing” writing. For all I know, my neighbor who worked as a house painter for decades has read substantially more, and better, than I have.) What good is my erudition and knowledge if I don’t use it to the benefit of others who lack these things, or who have similar levels of erudition but outside of my field or area of interest? Why would I want to have a sense of self so deeply dependent upon there being others than whom I am “better” in some vague sense? Isn’t such superiority the logic of the “white man’s burden?” Of course, the “burden” borne by the white man is a scam – there was/is no intention of making good on any promises to improve the lives of those counted by the white man among the burdensome. Even leaving aside for the moment the question of what the white men think counts as “improvement,” the fact remains that being the one to heroically bear the burden feels much better than working toward solving the problem that led to one’s shouldering the burden in the first place. Having one’s cake and eating it, too. Martyrdom without all the nasty dying bits.

Last week [a month ago] I started (and stopped) reading Günther Anders’ Philosophy of Technology by Babette Babich. I was excited to hear about the book and actually requested that the UNM library buy a copy when it became available. Günther Anders is one of the overlooked thinkers of technology in the 20th century whose works, as far as I know, are still not translated into English. Since I don’t read German yet, I was excited to see a philosophical biography and contextualization of Anders’s work that, I hoped, would make his work easier to read once my German is up to snuff. While I’m sure reading the book would help me approach Anders’s works with fewer unnecessary hurdles, I don’t think I’ll finish it. Not because of problems I have with Babich’s project in general, but because of her writing. She writes as someone “in the know,” someone willing to take on the hard work of thinking about the things that “really matter” and that are vital to our time “now more than ever,” and to do so from a position of barely-concealed scorn for anyone not likewise bearing this romantic burden.

Her introduction starts with a meditation on Anders’s habit of working from home (he never held an academic appointment), comparing it to the social changes due to the COVID-19 pandemic with deep-sounding musings on “home-work,” etc. While there’s a version of this idea that makes an interesting point, her way of expressing this meditation positions her as someone “in the know,” someone who understands the “real stakes” of social distancing, wearing a face mask in public, and working from home. As though the difficulties, frustration, and confusion of the pandemic were not by now felt bone-deep by everyone. She writes like an unselfconscious parody of a university professor, with diction that would read as a bit stilted and too-flowery if it weren’t so ridiculous. Even more than Heidegger, who, I would argue, is the locus classicus for “knowing” writing, Babich is clearly “in the know,” and wants to make sure you know it too, maybe even more than she wants you to understand Günther Anders’s work. Even Nietzsche had some tact and decency. For all his claims that the readers who would be able to read his books had not yet come, he at least clearly suffered from his writing.

I won’t give examples of her “knowing” writing here because I don’t want to read any more of her book (and, on a personal blog, I don’t have to!). I don’t mean to pick on Babich in particular; her book just had the misfortune of serving as a nucleation point for subterranean grumblings I’ve registered pretty much since starting grad school several years ago. She is definitely not the only “knowing” writer I have encountered.

To conclude my screed, one more differentiation. I do not think the “general reader” exists. And, if they do, they are probably not particularly quick on the uptake. One cannot and should not write for “everyone.” This, in fact, does “everyone” a disservice. Anyone making a good argument will have a specific audience, including detractors and antagonists. If you don’t seem to have any enemies, double check your argument because you didn’t make it well enough. In contrast to the “knowing” writer, the honest writer is aware of their antagonists and takes them seriously if and only if those antagonists return the favor. Those unwilling to take your ideas seriously, even if only to argue against them, don’t deserve your time. But to then take the “knowing” stance and look down on them makes you even less worthy of being taken seriously. The “knowing” stance demonstrates nothing more clearly than one’s own weakness. Iron sharpens iron. To paraphrase Nietzsche, there’s nothing like a good enemy, but to the “knowing” writer one is about as desirable as a hole in the head.

On stopped clocks (Stopped Clocks series #1)

Is that clock right?

(Note: I wrote this before the events at the US Capitol on January 06. After some time to gain more clarity on what happened there, I will probably consider those events in a similar vein to the essay below.)

They say that stopped clocks are right twice a day, but we need to ask three questions about this saying:

1. how is a stopped clock right (if it is)?

2. what keeps the stopped clock from being “more” right?

3. what should we, who are (presumably) more right, do differently given that this stopped clock is sometimes right?

Several years ago I went to a meditation group meeting at a local library. I was a bit at loose ends socially and thought that meeting some new people or trying something new might be good for me. I was expecting the standard breath meditation, maybe a bit of “Om” recitation, a bell, maybe some candles. What ended up happening, however, was something much more interesting.

I don’t remember the name of this group and can’t seem to find them online. Long story short, they played their hand pretty quickly, and that hand was aliens. The woman leading the session was very nice and didn’t seem “woo-ey” at all, but as we settled into the session, she described how the world is in spiritual peril but our friends from outer space are here to help. The goal of the evening’s meditation was to visualize a hole in the top of the head to receive the transmissions of extraterrestrial healing, and then to visualize that healing as a beam of light emanating out into the world through the forehead. The leader of the session reminded us gently that when we lost concentration on this healing energy, we were to silently think “Om” and return to the transmission.

We did this for about 45 minutes. I went along with it and found it relaxing, if not world-altering (for me, anyway). After the session, the leader mentioned that the group meets at a house twice a week to meditate together and invited all those at the meeting to join. That the group met regularly wasn’t strange, but she followed up her invitation with something very interesting. Another of the participants, who had been all in on the alien stuff from the start, asked if people attending these meetings should bring snacks or anything to share. The leader responded that no, these meetings were strictly business – obviously long-time members knew each other and sometimes socialized outside of the sessions, but the sessions themselves were totally purpose-driven. The point was to heal the world, and that’s what the sessions were for, period. She mentioned that a guy had been coming to the sessions religiously, never missing a meeting and seeming to take it very seriously, and none of the other participants even knew his name. He was just there to do the work. He would show up, say hello, sit down and get to meditating, then say goodbye and leave. Performing this meditation in a group, according to the leader, made it more effective. If I remember correctly, it had to do with the signal coming through more strongly when meditating as a group.

So what’s the point? While this experience was one of the stranger ones I can remember happening to me in a library, it is an excellent example of a situation that calls for interrogating a stopped clock. Let’s go through each question individually.

1: How is the stopped clock right (if it is)?

This example presents some challenges because it relies upon belief not only in the existence of extraterrestrials, but also in their benevolence and superior wisdom. Modern UFO stuff has a lot in common with the theosophy of the 19th century, including the concept that there are beings superior to humans in wisdom and/or understanding of the world who want to impart spiritual wisdom to those with ears to hear. This is not the part that is sort of right. What is sort of right, instead, is that the world is facing tremendous conflict and that this conflict both calls for and potentially responds to active effort on the part of people working as a group. For the stopped clock, there is a problem: spiritual poverty leading to violence, destruction, etc. This problem has a potential solution: the superior spiritual wisdom of the aliens. This solution can be applied: group meditation channeling the “good vibes” in a workmanlike way, which will help enlighten the world, leading to an end to strife.

Of particular interest here is the last element: the “workmanlike” effort at effecting this change. While the leader of the session was clear that at least some of the people in the group did know each other and socialized, she was equally clear that this was not expected of anyone participating. Like the nameless regular, there was no requirement that anyone make this practice a part of their social life or commit more time than that required for the group sessions. It was work, not a “hobby.” One could show up, do the work, and leave. I will return to this point, which I believe to be crucial, in a moment. For now, let’s turn to the next question.

2: What keeps this stopped clock from being “more” right?

This one appears, on the surface, to be easy. The problem is, again, obviously The Aliens. I am personally agnostic on the existence of extraterrestrials, although I assign it a fairly high probability. That being said, without proof positive of their existence (and benevolence, superior wisdom, etc.) this is an obvious candidate for this particular example’s impediment to being “more right.” The real impediment, however, is not just the aliens, but rather the practice itself. I don’t doubt that systematically trying to exude “good vibes” has a positive effect. So does being considerate, polite, charitable, etc. But, and this is important, these positive things don’t necessarily have material effects on a large, persistent scale. Channeling spiritual wisdom may make others feel good, but it won’t, by itself, feed them, or clothe them, or take care of them when they’re sick. “Thoughts and prayers” only go so far.

The clock stoppage here is that this practice doesn’t do anything in material terms. More specifically, it doesn’t operate on terms that are agnostic on or even opposed to belief in the aliens and their wisdom. A no-strings gift of food or medicine works regardless of who makes it, or under which auspices. The recipient doesn’t have to share the donor’s beliefs for the donor’s work to have a positive effect. My channeling good alien vibes, on the other hand, can only have a positive effect if other people know I’m doing it (barring for the moment the possibility of some kind of “spooky” effects we can’t identify, test empirically, or scale up systematically). I have a hypothesis about the popularity of this kind of “good vibes” practice as opposed to more materially effective practices that I will tackle more fully in another essay. For now, let’s turn to the last question.

3: What should we, who are (presumably) more right, do differently given the way that the stopped clock is right?

Now that we’ve established that the stopped clock is right twice a day, and how it is right, we need to think about how this might affect our own efforts toward our own, more right, goals.

As we saw in question one, this meditation group takes a systematic approach: 1) something is wrong, 2) there is a way to alleviate or fix what is wrong, and 3) that way is practice X, performed in a disciplined, group setting that is not necessarily a significant part of the participants’ private lives. The only thing missing from this formulation is a step between 1 and 2 detailing why the problem exists. This lacuna matters for two reasons: first, filling it is difficult and requires rigorous, dispassionate analysis, analysis that will have to be ongoing as situations change. Thinking clearly and systematically requires distance and time along with effort. It also requires tolerance for error, failure, and the wholesale abandonment of reasoning and practices that can be shown through experience and reason to be ineffective. Channeling good alien vibes, because it can’t be measured, can’t be shown to fail or succeed. And the absence of proof positive of alien involvement means that practitioners of this meditation must take it on faith that these wise extraterrestrials can help by mysterious means. They take a passive approach, literally acting as “channels” for extraterrestrial vibes, rather than collectively deciding on an approach. Identifying this passivity leads to the second reason for the lacuna’s importance: the lacuna can give the game away. I’m still working on this idea, but my preliminary thought is that this kind of lacuna is probably one of the keys to distinguishing between clocks that are only right twice a day and those that are more reliable.

Conclusion: Starting the Clock Again

So what does this mean for those who are “more right”? What would a materially effective version of this “woo-ey” practice look like? All we have to do (but this is not so easy, as it turns out) is take the structure of this meditation/channeling practice, bring it “down to earth,” and slip in the missing second step. At the abstract level, this might look something like the following:

1. A problem X exists.

2. This problem is caused by Y (with the proviso that the causes can and will change).

3. There is a way to alleviate X (again, with the proviso that methods may need to change as well).

4. The alleviation of problem X can be effected by practice Z.

For those keeping score at home, this is immediately recognizable as the structure of the Four Noble Truths of Buddhist thought. Much of what is sometimes called “Western Buddhism” (though it is not limited geographically to the “West”) actually betrays itself by sneakily “stopping the clock.” That is, it abstracts the problem it claims to be in a position to solve from empirically verifiable reality, and then removes the second step in the systematic approach, closing the loop and leaving the otherwise effective method eating its own tail.

As an example of a “more right” version of this practice, maybe something like this:

  • A group that meets regularly, not individuals
    • Individual action is clearly important, but the group itself should act as a group. This takes on a particular salience now in what I like to think of as the digital “culture of exhibitionism.” For now, it might be best to just say that many hands make light work and institutions can outlast their founders.
  • This group is purpose-driven, that is, it tries to avoid turning the work into a “hobby”
    • If there is a social element to the practice, it takes a back seat and is not a requirement for membership, which is defined by the work itself.
    • The group’s work will feel like work. Sometimes it will feel like pleasant, purposeful work, the kind of work that leaves one feeling tired but happy. Other times it will feel like a burden, but a burden worth bearing. In Ursula K. Le Guin’s The Dispossessed, the language of the anarchist planet Anarres has a word that means both “work” and “play” at the same time. There is another word, kleggich, that translates to “toil,” something that is universally (and correctly) avoided. A significant problem here is that most of the pleasant work we now do is directed toward “hobbies.” The kind of work imagined in this “more right” practice reads to us, under current circumstances, as kleggich, toil. This will need to change.
    • Another significant problem here, one that I will address in another essay along with the group question, is that of social media. All I will say here is that it is effectively a Faustian bargain: yes, these networks help spread the word, but they also create incentives that detract from the actual work itself. My inclination, for reasons I will elaborate in the same further essay, is to avoid them.
  • This group knows why it is doing what it is doing, that is, it maintains a collectively articulated theoretical framework that answers the question of why the problem exists in grounded terms
    • It’s important to note that a working abstract theory is necessary to get the stopped clock moving again. Without active theorizing, however abstract it may be, the material work will tend to “dissolve” into vague platitudes and other bullshit. The map is not the territory, but an accurate map is still necessary when exploring. Continually ensuring that the map is as accurate as possible is crucial.
    • Also of note: any theorization must remain aware of the fact that others, even those whom the group may be trying to help, will not or maybe even cannot buy into their theories. The important thing is the work, not that everyone join the group (again, more on this in the next essay). A good litmus test is to ask whether the results of a particular theoretical position will speak for themselves in terms of action, regardless of whether or not outside observers buy into the theory. If the results won’t speak for themselves, the theory isn’t good enough.
  • This group is actively trying to obviate itself.
    • The alien meditation in the example cannot be obviated because it cannot be demonstrated to have failed or succeeded. Its practitioners can only lose interest and move on to the next thing. This proposed group, on the other hand, would know when it had achieved its goal, and disband at that point. People could stay in touch, obviously. It’s difficult to work together with others for any significant length of time without forming connections, but these connections are ancillary to the work itself. Once it’s done, the group no longer exists. The horizon for the group’s ultimate dissolution may be infinitely distant, but it should be kept in mind.

This has been an initial stab at a spiderweb of thought I’ve been batting around for a while. I think I’m on the right track here, but I still need to do some thinking. I’m envisioning a series of further essays on this theme addressing some of the impediments to putting this kind of approach into practice, including:

  • the popularity of “good vibes” practices
  • the lacuna of how/why giving the game away
  • the problem of work, toil, and “hobbies”
  • the problem of social media platforms

I don’t pretend to have the last word on this, and would welcome any comments or suggestions.

Masturbatory Fictions, Masturbatory Reading

Précis: some thoughts on masturbatory v. sexual reading; the problem of genre fiction being taken “seriously;” an elaboration on Byung-chul Han’s Saving Beauty in a literary context. 


Insomnia is not uncommon for me these days, and its silver lining is that I often come up with interesting thoughts and questions while tossing and turning. A few nights ago I was tossing and turning and generally not having a great time when my thoughts turned to reading. In my sleepless lucidity, I came up with a term that I’m going to explore here because I think it offers something useful: masturbatory fiction and masturbatory reading. 

I’ve been thinking about reading a fair amount recently since being quarantined. Now that the semester is well and truly over and I have summertime freedom ahead of me, I’ve been taking more time to get to some of the backlog in my “to read” list. One of the items on my list is Stephen King’s The Dark Tower series. My wife gave me a copy of the entire series for our wedding anniversary, and I’ve decided to attempt a yearly re-reading. (See this post for more thoughts on this.) I sometimes resist reading fiction because I get “sucked in.” Francis Spufford describes himself as a “fiction addict” in The Child that Books Built, and I think I share that sentiment. As I was tossing and turning, I stumbled on a question: what kind of pleasure does reading fiction bring? People talk about “important books” or “books that changed their lives,” but I’m sometimes skeptical that this actually results in real, i.e. physical, change. I suspect, and maybe this concept of “masturbatory” reading can shed some light here, that most often novels and the way that people read them just make one feel good, rather than lead one to do anything differently. The only thing that changes is the readers’ feeling of themselves, not the physical speech and action among others that really constitute their lives.

I started re-reading Byung-chul Han’s Saving Beauty and I find his critique of “smooth” aesthetics relevant here. Han’s critique is that what he claims is the dominant aesthetic today, the reflective and smooth, has separated beauty and the sublime. Beauty, de-natured of its potentially disturbing, even destructive, sublimity, becomes something that “feels good,” that “goes down smooth.” He gives the art of Jeff Koons as an example of art that conforms to what he calls the “society of positivity,” in which any alterity or otherness is removed. Followers of Han’s work will no doubt recognize this concern from several of Han’s other works, including The Burnout Society, The Transparency Society, and The Expulsion of the Other. “Smooth” beauty, of which smartphone touchscreens are another example, for Han:

only conveys an agreeable feeling, which cannot be connected with any meaning or profound sense. It exhausts itself in a ‘Wow.’

Han, Saving Beauty, 3.

Han uses Koons’s work and touchscreens as examples for another reason: they are reflective. The viewer sees herself in the screen or the piece, being reassured of her own existence by herself. There is no other person or people to whom the viewer appears and through which she might be assured of her existence as a potential actor. (The second chapter of Hannah Arendt’s The Human Condition is relevant here.) Rather than shock or disturb the viewer into re-evaluating her self and what she does by creating a dissonance between the viewer’s sense of herself and how she appears to others, under the dominance of smooth aesthetics, 

Art opens up an echo chamber, in which I assure myself of my own existence. The alterity or negativity of the other and the alien is eliminated altogether.

Han, Saving Beauty, 5.

This has concerning consequences for the possibility of moral judgment as Hannah Arendt would see it, but that’s a post for another time. My concern here is that if visual art creates this closed echo chamber of the self reassuring itself of its existence in an infinitely autistic loop, does contemporary literature do so as well? If so, how? Does it operate differently than visual art? 

The short answer to this question is yes. I think much contemporary literature works in a way similar to Han’s description of the aesthetics of the smooth to close the reader off from genuine alterity and the possibility of new, better ways of thinking and living. My goal in this essay is to try and figure out how the concept of masturbatory fiction might be useful in thinking about the version of “smooth” aesthetics in a literary medium. I’m thinking of the problem in two parts. I’ll outline them here, then discuss them further individually:

  1. “Masturbatory fiction” (also “Fleshlight fiction”)
    1. This is fiction that is not written with any intention of challenging the reader. It is fiction that “goes down smooth.” My initial thought was that this category could consist of genre fiction read for pleasure (especially series of novels like Jim Butcher’s The Dresden Files), but I’ve since thought about it some more and see titles like these as less of a problem. The masturbatory fiction that presents a real problem is fiction considered “serious” or “literary” that is written with “pure aesthetics” and “beauty” in mind, but that nonetheless fails to break the loop of the reader’s reflection in the text. The social and cultural cachet of texts like these makes their smooth aesthetics a particular problem.
  2. “Masturbatory” reading as opposed to “sexual” reading
    1. Here I mean a reading practice in which the reader uses a text to feel good in themselves, that is, to masturbate, even if the text does not lend itself to such usage. There is no actual engagement, negotiation, or communication with the text on the reader’s part – the text is approached as a passive reflection of the reader herself, thus creating the loop of the reader reassuring herself of her existence by seeing herself in the text and vice versa. I’m beginning to think that this reading practice is actually a bigger issue than the masturbatory texts themselves.

One qualification before I describe these problems in further detail is that I don’t think I’m arguing that one should not engage with masturbatory fiction. In fact, I’m thinking that the only way out of the impasse of the smooth is precisely to force these texts, which are written to be used/consumed for pleasure, to “stick in one’s throat” through critique and a more “sexual” reading. The idea that one can remain completely free from these problems is, I think, its own kind of masturbatory position, reinforcing the illusion of the Romantic “real self” in a way analogous to the aesthetics of the smooth, but oriented toward a more “critical” subject that, at least in her head, “knows better” than that. Insisting on only reading “The Classics” because modern literature is largely masturbatory doesn’t keep one from practicing masturbatory reading without realizing it.

Masturbatory fiction

Literature that lends itself to masturbatory reading (think “popular” fiction) uses language in ways the reader is likely already primed for or expecting. It uses language smoothly to reinforce what the reader already “knows” but doesn’t speak. The reader will recognize herself (through her assumptions) in the text and thereby take pleasure in herself as she takes pleasure in the text, feeling “satisfied” at its conclusion. There aren’t any bumps in the road or sticky-outy bits; the plot flows without raising any questions about what the characters are doing or why.

One way that masturbatory fiction reflects the reader in itself is through the use of genre tropes that the reader can recognize in her identity as a “sci-fi reader,” or a “western reader,” or a “fantasy reader.” For example, potboiler detective novels have identifiable generic tropes that might include characterizations, plotlines, and assumptions. Even exemplars of this genre that do something new with it retain these generic elements, which can be used to re-affirm the reader as a “reader of detective novels.” For example, Jim Butcher’s The Dresden Files features a protagonist, Harry Dresden, who is a wizard. As the only wizard in Chicago, Harry is part of a special group that knows the “real” truth of the world: that magic and a whole host of spectral beings exist and operate just outside the ken of the merely human. He works with the police as a consultant, and there is the familiar detective novel trope of “good” and “bad” (or incompetent, anyway) police officers who by turns ask for the detective’s help when they run into bureaucratic inertia or try to get him to scram so the real law can deal with the problem. Harry is a gruff, rather socially traditionalist figure who “knows right from wrong” and lives without the trappings of modern convenience. As the series continues, it is also revealed that he is an extremely powerful wizard who needed substantial training to use his powers effectively and responsibly, contributing to the affirmation of the reader seeing herself in the text as not only morally upstanding in a world of confusion, but also unique and special for something she simply is. She, too, is powerful but misunderstood! Harry Dresden acts as a way for Butcher to live out a fantasy that, if taken seriously, would likely seem uncomfortably retrograde and oppressive to many of his readers. By identifying with Harry uncritically, readers internalize their own sense of “specialness” and moral rectitude along the limited and traditionalist lines set by Butcher’s characterization.

The Dresden Files is clearly masturbatory literature. It makes use of genre tropes from fantasy and detective novels, “subverting” them in some sense but with a result that has an identifiable parentage in the genres that inform it. The novels present a problem, elevate tension, and resolve the problem in ways that do not challenge the reader or her assumptions about the world. For a “fantasy reader,” these novels reflect the self that is taken to already be there “inside.” The pleasure the reader takes from this masturbatory novel, then, is satisfaction in herself, reaffirming herself in an autistic loop that fails to consider what construal of the world the text itself offers and whether she thinks this is a construal worth having. Harry’s old-school gentlemanliness is naively charming in the novel, but it also represents a host of repressive assumptions that the text discourages the reader from questioning. There’s nothing dirty about the sex that he and his love interest eventually have. Even that is pure as the driven snow.

I don’t want it to seem like I’m just picking on Butcher and Harry Dresden. I quite enjoyed the volumes of the Dresden Files that I read, and found them quite useful to think with. I’ll return to this question more thoroughly in the conclusion, because I think it’s an important corrective to the idea that one should only read “serious” fiction. That notion is mistaken, for reasons I will turn to now. 

Masturbatory fiction that is clearly identifiable as such presents less of a problem when it occupies a position that is separate and distinguishable from “literature,” especially “serious” literature. Reading Crime and Punishment, for example, is not likely to make one feel “good” in the same way that reading Crazy Rich Asians is. That genre fiction is being taken more seriously by the academy is indicative of the fact that, as Han argues with visual art, this “serious” literature is being (further) dragooned into an ideological project to reinforce the reader’s pleasure in the self, as opposed to positioning the reader to find satisfaction in engagement with the world around her (which is in fact the condition of her being). Basically, it’s a way to keep things from changing. Literature claimed to be “beautiful,” especially anything described as “timeless” (more on this concept in another post), encourages masturbatory reading, even of texts that might actually, on a more critical reading, leave the reader not “feeling good.” The “serious” veneer of so-called “literary” fiction means that its masturbatory characteristics are more difficult to point out and potentially work against. Knausgaard’s My Struggle is not embarrassing to read (to some) in the way that Twilight is, which simply means that those who might tend to read more critically, because of a theoretically higher level of education or cultural cachet, are given pleasure in the same way that the “masses” are in their stories. They are also thereby stymied in whatever attempts they might make at engagement with the world. Since they benefit more from the world as it is, or at least don’t suffer as much from it, masturbatory reading of “serious” fiction acts as further insurance that these people continue to avoid engaging with and changing the world. That they are in positions of relative power, where they might actually be able to make changes more easily than those at the very bottom, at least individually, makes the dominance of masturbatory reading especially tragic.

It’s not the case that even novels that lend themselves to masturbation do so completely or necessarily. These novels tend to encourage a masturbatory reading, but there is another variable at play: the reader herself. The reader, by forcing herself to become attentive to the words of the text as a construal of the world, can change the reading of any masturbatory text from an act of self-love to an act of making love, which requires a partner and results in the challenge and potential that a partner brings.

Masturbatory reading

What I mean by masturbatory reading is basically a failure to read critically, but it’s not just uncritical reading. Taking a text at face value is actually less masturbatory because it might result in shocks (if the text itself is not masturbatory). Masturbatory reading also consists of assuming that the “pleasure” of a text is in its formal beauty, determined by those in positions of social authority. One uses the text to satisfy oneself by oneself, rather than challenging the text and drawing satisfaction from the novel possibilities that this challenge might bring into the open, even if the text itself was not written to “go down smooth.”

Masturbatory reading is a kind of corollary to masturbatory fiction, but I think tends to occur more frequently when one reads “serious” or “literary” fiction because of the cultural cachet that these texts have. Since they’re “real literature,” the idea that one such as my own lowly self could challenge them is anathema, arguments over canons notwithstanding. It’s like the only choices are completely uncritical reading that makes one feel good in oneself, or a disenchantment from the text so complete that it removes any possibility of pleasure or satisfaction from the reading exercise, thus leading to another kind of masturbatory pleasure: the pleasure of the ascetic. 

Genre fiction, again, like the Dresden Files, is fairly well culturally coded as “for fun.” Literature departments have insisted in recent years, however, that things like YA novels, science fiction, and comics are all topics that fall under the purview of academic literary studies, i.e. that can be “serious” and written about in a dissertation. On the one hand, they aren’t wrong. If we take literature or literary studies to mean “the study of texts,” then it seems obvious that comics, for example, would count as literature. On the other hand, however, the seriousness with which things like comics can now be taken, and the cultural cachet that surrounds them as “literature” exacerbates the problem of the smooth by removing the “guilt” one historically felt when reading masturbatory fiction. These formerly “guilty pleasures” now become something one can talk about openly and in public without shame. Han is relevant here as well.

For Han, a work of art that retains the sublime character of beauty offends its viewer:

A push comes from the work of art. It pushes the observer down. The smooth has an altogether different intentional nature. It adapts to the observer, elicits a Like from him or her. All it wants is to please, and not knock over.

Han, Saving Beauty, 6.

The desire to please, to not make any waves or stick in anyone’s craw eliminates the possibility of negativity, of the self being challenged. The possibility of experience, gained against negativity, “withers” (7). This has implications for masturbatory reading because the desire to please and its elimination of negativity removes the transgressive and excessive qualities from sex, leading to a situation in which

Dirty eroticism gives way to clean pornography.

Han, Saving Beauty, 9.

Eroticism relies on secrets, on transgression and excess, on not simply or only desiring to please. The sexual metaphor applies well here, I think. Consider the difference between a satisfying masturbation session and really satisfying sex. Sex is dirty (both literally and metaphorically), excessive, even dangerous. The possibility of being physically or emotionally hurt, even quite severely, exists much more strongly in a sexual encounter than in masturbation. One emerges from a satisfying sexual encounter physically tired or sore, with love-bites or even bruises to show for it. Taking masturbatory pleasure in self-inflicted pain is not the same because one inflicts the pain on oneself – it isn’t the result of a challenge posed by an Other who represents difference and negativity in the sense of “not-me.” It is not possible for one to “go too far” except by accident during masturbation – you only need a safe word with a partner. A partner may go too far, but unintentionally, unaware that their action crosses a boundary. Such a challenge shocks the other partner, forcing communication and negotiation in which both partners gain experience of the other’s sexual proclivities and limits. Both partners change as a result of a sexual encounter.

One reason that masturbatory fiction is less of a problem than masturbatory reading is that masturbatory fiction is, or was, marked as kind of embarrassing. It shared with sex the “dirty” or “guilty” element of pleasure. Sure, read Twilight in bed before going to sleep, but maybe not in a trendy coffeeshop, especially if you’re the kind of person who pretends to like French art films better than rom-coms. Pulpy sci-fi or detective novels, even when they are not themselves masturbatory in form, are culturally coded as “guilty.” They are, or were, to be greedily consumed in private. Paradoxically, this allowed for pulp sci-fi to express wide-ranging challenges to prevailing social norms and ways of thinking and acting. Social norms that took such texts as “less serious” or masturbatory paradoxically allowed sci-fi to encourage or even demand more “sexual” readings. 

The problem now, however, is that even works of “serious” or “literary” merit tend to be written and read in a masturbatory way. The bestselling books one “has to read” are increasingly, it seems to me, intended to “go down smooth.” They aren’t all like this, but as the quality of life and the possibility of the future continue to decline for the class of people who read New York Times bestsellers, it should come as no surprise that “beauty” should be a primary criterion of literary (and cultural) merit.

That’s all on this for now. I’m still not sure whether I have these concepts muddled. I’ll keep thinking about it.


Works cited and mentioned:

Arendt, Hannah, The Human Condition (I have the 2nd edition)

Butcher, Jim, The Dresden Files series

Han, Byung-chul, Saving Beauty

Spufford, Francis, The Child that Books Built

Meditations on Re-Reading

Précis: Meditations on re-reading The Dark Tower series. Thoughts on the practice of re-reading, especially at regular intervals; considerations of the temporal experience of re-reading and what it can tell us about making a better world.


NOTE: SPOILERS AHEAD

My wife gave me a complete set of Stephen King’s The Dark Tower for our wedding anniversary, and I’ve decided to attempt a yearly re-reading of the series. I read the series for the first time about two years ago on the recommendation of my father-in-law and loved it. Now that we’re all quarantined and the sense of time is slipping and getting fuzzy (at least for me), I found myself gravitating back to The Dark Tower for reasons that I’ll make clear later. I’ve just finished volume two of the series, The Drawing of the Three. I have final papers to write and final grades to assign in the next week and a half or so, so I probably won’t get started on volume three, The Waste Lands, for a couple weeks.

NOTE, AGAIN: SERIES SPOILERS AHEAD

The Dark Tower series is cyclical. The first and last lines of the series, which King has described as one long novel in several volumes, are the same: “The man in black fled across the desert, and the gunslinger followed.” Roland’s quest for the tower is an eternal cycle, although one that might eventually end. He and his ka-tet, those bound by destiny, travel through worlds that have “moved on,” with the trappings of technology and civilization degrading and degenerating into unusability. Through a series of trials, the ka-tet travels to the Dark Tower, the nexus that holds together all of the worlds, to face the Crimson King, who is intent on destroying the multiverse held together by the great Beams that intersect at the Tower. I’ll leave it to the reader to find out for themselves whether Roland is successful.

The cyclical nature of The Dark Tower saga makes it an interesting point of departure for some meditations on re-reading. I will mostly focus here on re-reading novels, but will address re-reading non-fiction (especially philosophy) toward the end of this essay. I’ll be making quick and dirty use of some ideas from the work of French philosopher Gilbert Simondon (filtered through that of another French philosopher, Bernard Stiegler). This isn’t a formal paper, so I’m dispensing with footnotes, etc. Besides, I’m really just using one concept as a starting point.

In the first reading of a novel, everything is new and surprising. The reader is pulled along through the narrative both by its novelty and by the impulse of the plot. The plot basically implies or poses questions – what happens next? how does this end? Even in formulaically written “genre fiction” like detective novels, techno-thrillers, or supernatural romances, part of the pleasure of a well-written novel is the way it manipulates the reader’s expectations and provides novelty. For example, Jim Butcher’s Dresden Files plays with the hard-boiled genre of noir detective fiction: long-time readers will recognize common tropes from other exemplars of the genre, like the protagonist’s hard-boiled but ultimately moral and heroic nature and the incompetence of most of the police, and the fun twist is that Harry Dresden, the protagonist, is an honest-to-goodness wizard (really – and it’s not YA fiction like Harry Potter). The novelty of the plot in this example, then, is not the form of the plot itself, which follows an established genre, but the twists and idiosyncrasies Butcher incorporates into it.

For Simondon, and Husserl before him, perception is not a passive act. All objects of perception are understood through the mind of the perceiver, and this mind is not just a receptacle for perception. The mind actively reaches out, “protends” toward new perceptions based to some degree on previous perceptions it has retained. As an illustration, think of a person standing on a beach watching the waves break. A person who knows how to surf sees the waves differently than one who does not, or one who is more interested in fishing than surfing. It’s not that the surfer sees “more” than the non-surfer, just that the surfer perceives different aspects of the same object that she knows how to look for and considers important.

After the initial perception, which again is not “pure” – there is no “pure” or non-judgmental perception – elements of the perception which were “protended” for, toward which the perceiver’s attention stretched, organize how those perceptions are retained in memory. The non-surfer may go home happy and feeling calm and peaceful because the waves only lapped gently at the shore that day, while the surfer may go home frustrated for the same reason.

Yes, but what does this have to do with (re)reading? As I said above, the first reading of a novel is new in the sense that, even given previous experience of the genre of novel one reads – or of reading novels at all – this particular novel is new to the reader. We don’t know what happens yet, so we read on (or don’t). It is easy to forget that even this first reading is not “pure” in the sense of non-judgmental perception. To continue with the example of genre fiction, The Dresden Files is obviously and immediately a detective novel, albeit an idiosyncratic one, so noir-junkies will “protend” expectations into the text that more casual readers of the genre might not, even if they recognize the presence of generic tropes. Even more basic, however, are the protentions inculcated in readers by our social frameworks that help us make sense of novels at all. The novel is a relatively recent form of literary creation. For Homer, for example, novels would probably not have made much sense, even if he could have read one, because they differ dramatically from the forms of literary production common to Homer’s time and cultural background.

On a second or subsequent reading of a text, the protentions one brings to a novel might become clearer. For example, maybe you read a book and told a friend about it, who then told you that every time a character stands up to do something in the novel he is described as “stretching his legs.” You didn’t notice this, and so you re-read the book with this claim in mind. Sure enough, you find to your dismay that this character does indeed do a lot of leg stretching. This example is somewhat prosaic, but it points to two important aspects of re-reading I think are worth lingering over.

First, perception is not passive and never “pure.” It can be “primed” to look for certain things and mark them where it may not have otherwise. I’ll talk later about how this can be used for making the world better, but it’s worth stopping a moment to consider the negative version of this kind of “priming”: conspiracy theory.

Again, this is a blog post, so I won’t go deep into the psychology or subjectivity of conspiracy theorists, but will only pause to point out that conspiracy theory is protention gone awry. The conspiracy theorist sees the object of their obsession everywhere, and any piece of information can be made to fit their understanding structured by this mis-protention. “Of course they’d say that, they’re X kind of person, in the pay of the Deep State, etc.” The problem here is not that the conspiracy theorist doesn’t see things “objectively.” Again, no one ever does. Rather, the problem is that they have hyper-extended their protention so they can never be wrong. At no point can they be brought up short and be required to rethink their claims or incorporate new evidence into a revised and necessarily partial (both in the sense of “incomplete” and in the sense of “interested,” as in “I’m partial to”) understanding. For all they like to claim to be thinking, this is in fact exactly the opposite of thinking.

Second, perception can be trained and altered in line with one’s goals. An aspiring novelist, for example, might approach a novel she has enjoyed in the past with the intention to look for – that is, to protend her perception into – the stylistic and formal qualities of the novel rather than simply its plot and dialogue. She isn’t seeing anything that wasn’t there on her first reading, only actively looking for data in the same text that mean something different to her in line with her new goals. This is obvious to anyone who majored in English because they liked to read. Reading Frankenstein for pleasure is very different from reading it for your final paper.

It’s worth pausing here again to make an important point. We protend into new perceptions constantly, whether consciously or not. We cannot “suspend judgment” completely, and have to be trained to do so even to a modest degree. If we could all magically see things “as they really are,” there would be no need for lawyers or negotiators. One of the possibilities re-reading allows is the opportunity to carefully consider and examine the protentions we bring to the object of our attention, and whether we want to continue using those protentions. This requires us to think carefully about what we are looking for, and, even more importantly, about why.

“Why” is the most difficult question, but in a sense also the most natural. We don’t do things for no reason. Humans are capable of intention and of making choices in the world, a world which is of our own design. Death and taxes may be certain, as the saying goes, but these are not really the same kind of thing. Death is the great unifier. Everyone dies, and has always done so, regardless of where, when, or how they lived. Taxes, on the other hand, require a whole host of other things to exist in order to make sense at all: money, the state, a sense of “civic duty” or responsibility, accounting, and so on. All of these things are produced and reproduced by humans and, because they are produced by humans, could be reproduced in other ways or ended entirely. This might seem obvious to some, but for others the idea that death and taxes have the same kind of certainty is an article of faith. Like conspiracy theories, claiming that the human-created world, as it is, is somehow “natural” inhibits thought rather than stimulating it. For an example, consider the time and energy spent by Southern writers and politicians in trying to convince people that slavery was “natural.” A practice that we now perceive with disgust was not only accepted but claimed to be natural not even two centuries ago.

This example should prick us to reflection then. What do we think is “natural” that is in fact part of the human-constructed world that could operate differently? And how could we make it that way?

Re-reading, then, is a useful way to illustrate a capacity humans have that goes far beyond just looking for hints at how to be a good novelist in a book one enjoys. By attending to our protentions and considering what we bring to a text and why, we can gain experience in performing similar acts of attentive consideration toward the broader human-constructed world we live in. This is especially important in a time when media are reduced to “content” made to be “consumed.” To re-read a book, especially to re-read it with a particular goal in mind, for a particular purpose, is a weird atavism now. Sure, re-read it if you like it, but what are you looking for? Why make the effort? Just enjoy it!

(Note for another time: one consideration we might attend to is why the work I’m describing here, of reading and thinking critically, is not considered “fun.” Or why “fun” things seem to be the only things many people consider worth doing.)

Re-reading is an essential practice, especially in a world dominated by the drive of consumption. Many novels, television shows, movies, video games, and other media aren’t worth re-visiting, but those that are ought to be. The critical faculties developed through the practice of re-reading may be all that stands between the hope of human lives worth living and the possibility of precarity, penury, and nastiness, of lives of pure and thoughtless consumption, of lives without even a bad “why,” where our protending is simply done for us.

I may seem to be overstating the power of re-reading. It’s true, I probably am. But we are (always) living in the Kali Yuga, the time before the end of the world, and it’s worth starting somewhere.