
Report from the Workshop: 06/06/22

[This was intended to be the report published in 06/06/22, but then Things Happened and I didn’t get it done. I’m publishing it now because I also haven’t had time to write a Report this week. Things Have Continued to Happen. If this post stops abruptly it’s because I’ve plum run out of bandwidth.]

Currently reading:

Yates, Frances: Giordano Bruno and the Hermetic Tradition

  • Quite long and very detailed, but still interesting. Scholars have improved upon Yates’ work since she wrote, but it remains a good introduction to the Early Modern milieu, before science and magic as we now understand them had been separated. Some of the arguments are directed at other specialists, but the book is still quite readable for someone outside the field.

Wulf, Andrea: The Brother Gardeners: Botany, Empire, and the Birth of an Obsession [NB: finished in a couple days – worthwhile reading.]

  • Started over the weekend. A fascinating glimpse at the mid-eighteenth century and the birth of modern botany. I read Wulf’s book on Alexander von Humboldt, The Invention of Nature, and I appreciate her style and the way she contextualizes her subjects. Probably most interesting to me so far has been reading about the lengths to which English noblemen went to acquire plants from North America since their native gardens were dead and colorless in ye olde English winter.

I’ve recently had avocations on the brain. A little while ago I started (but gave up on – just too dense and long for me right now) Priest of Nature by Rob Iliffe, a biography of Isaac Newton. I’ve also been reading about the hermetic revival in the European Early Modern period, and now I’m turning my attention toward botany and plants in general. I wrote about the edifying potential of popular science books in an essay post last week, and I wanted to continue thinking about the importance of avocational work.

Modern professionalism is, well, modern. “Profession,” in the Medieval and Early Modern periods, meant something a bit more specific than what we think of with the word today. A doctor, lawyer, or churchman had a “profession,” while any other kind of worker, even a skilled mason or merchant, had a trade. “Professionalizing,” then, meant a more extensive or “deep” education, as well as integration into a social hierarchy and institution. To study mathematics at Trinity in Newton’s time, for example, one typically also had to take holy orders in the Anglican church. Professions, then, also required a certain degree of (at least public) profession of orthodoxy. Newton got special dispensation not to take holy orders, but not because of his heterodox positions on the Trinity. Rather, Newton was making such important contributions to mathematics (then still part of “natural philosophy”) that pastoral and clerical duties would have gotten in the way of his progress.

Wulf’s The Brother Gardeners gets into the question of avocations as well. The main characters of her account are Peter Collinson, an English cloth merchant, and John Bartram, a farmer in the North American colonies near Philadelphia. In the early eighteenth century gardeners in England practically devoured botanical specimens from the American colonies, and John Bartram provided many of their seeds, cuttings, and specimens with Collinson as an intermediary. Both men were Quakers, which meant that neither could have attended university, but Bartram in particular is interesting because despite his lack of Latin and his comparatively humble status as a colonial farmer, he developed a deep and extensive knowledge of botany and came to play an important role in the spread of botanical science and North American plants to the Old World.

One might think that botany as a sideline would be natural for a farmer, but it’s worth considering that farmers grow crops to support themselves and their families – while some knowledge of botany in a scientific sense would no doubt serve them well, a Philadelphia farmer who grows corn or potatoes might not care too much about the specific number of pistils and stamens that a flower from Jamaica has, and what this means for botanical classification, pace Linnaeus. These were the kinds of things that did fascinate Bartram, despite his botanical collecting initially making him no money and even resulting in injury.

I’ve spent most of my adult life pursuing an education in fields abstract and humanistic. First history, when I realized geology required calculus; then an interesting mishmash of humanistic subjects under the heading of “Liberal Studies”; and now philosophy. I’ve had jobs during the entire time I’ve been in undergrad and graduate school, many outside of or adjacent to academia proper: lifeguard and swim instructor, grocery bagger and cashier, adult ESL teacher, high school English teacher, adjunct faculty at a community college, substitute teacher at an elementary school, science camp counselor, teaching assistant, and now library assistant at UNM.

Since starting on a more focused academic path I’ve come to realize that I don’t feel at ease with the majority of my classmates. Most of them are at least ten years younger than me and fresh out of undergrad, sometimes with little more work experience than some summer lifeguarding or grocery bagging. Their lives and priorities are totally different from mine, and it’s been difficult bridging the gap. I also find myself balking at the pressure to professionalize. Over the last year I’ve come to realize that academia has historically been my avocation, the thing I do that isn’t the thing that actually supports me materially but that I nonetheless spend a lot of time doing. I’m using vocation and avocation here in a modern sense. I don’t feel “called,” the root of the word “vocation,” to academia. If there’s anything I feel “called” to do, it’s the avocational projects I work at, like writing professionally and gardening. These aren’t “hobbies,” at least not in the derogatory sense of that word, and it’s interesting to note that “hobbies” are now increasingly becoming integrated into the “hustling” one does on the internet. No place is safe.

John Bartram, who spent years traipsing about the American colonies to gather seeds, cones, and cuttings for his European clients, started doing so because of his fascination with botany, as a favor for Collinson, and to be connected to the wider world of botanists – not, at least initially, for personal gain.

The point? My original goal was a PhD in philosophy. I think that has changed. I may need to run my farm like Bartram, but I’m still planning some trips to collect new and interesting specimens.


Essay: Plants and Dilettantes (written while annoyed)

[The reader may remember this post, in which I went on a screed against writing like a jackass. This post takes a similar tone initially, but I hesitate to call it a “screed” because I start out being annoyed, but everything turns out ok in the end.]

One of the most frustrating things that I encounter in reading modern philosophy is statements of this type:

Philosophy since X has only thought of A, B, and C, and it has (illegitimately) only taken such-and-such form. Really, philosophy should be Y, etc. This is why everything is bad today, and no one wants to read philosophy.

Just about any contemporary philosophy/critical theory text (which also unhelpfully, and maybe even patronizingly, assumes the reader has read everything in the “Philosophy” section of the library since the Pre-Socratics, has an extensive knowledge of the punk-rock scene in New York, San Francisco, London, etc., or knows what “musique concrète” is).

I’m being hyperbolic, of course. I’m also being uncharitable, but that is my privilege since this is a personal blog. Not all philosophers write like this (I have enough logic to know this, at least), and in many cases the people who do make such claims have good points. I do get bored and annoyed with the sometimes excessively “poetic” style of some contemporary philosophy, but an even bigger gripe is the sense that philosophers don’t read (or at least don’t write about) anything outside of their discipline.

However, I’ve just recently read a counter-example to (some of) these claims: The Life of Plants: A Metaphysics of Mixture by Emanuele Coccia, translated by Dylan J. Montanari. Coccia’s volume is fascinating for a few reasons, some more abstruse than others. I have a couple of intellectual habits that I’ve often found difficult to fit into philosophy: for one, I want to apply it. Yes, yes, metaphysics. What is it for, then? What does it do? Or convince people to do? How does it fit with the rest of the world? And, more specifically, the world of stuff? Another of my “bad” habits has to do with my dilettantism. I once used quotations from a detective novel set in North Korea to make a point in a paper on Derrida. The prof loved it, but he’s a Heidegger guy and willing to experiment. Of course, with Derrida you can get away with quite a lot, but I don’t often read much philosophy that makes use of sources too far from the 100s of the Dewey decimal system. I think that’s a shame.

Coccia does a few things that I find immensely refreshing. First of all, his analysis is grounded in actually knowing botany. He writes:

From the age of fourteen to the age of nineteen, I was a student in an agricultural high school in a small isolated town in the farmland of central Italy. I was there to learn a “real job”…Plants, with their needs and illnesses, were the privileged objects of all study that took place in this school. This daily and prolonged exposure to beings that were initially so far away from me left a permanent mark on my perspective on the world.

Coccia, The Life of Plants, xi.

Coccia’s studies obviously eventually diverged from a purely vegetarian (haha) diet. But this deep, specific education in a discipline involved with stuff – a discipline that revolves around living, physical beings which, as Coccia makes clear, present some significant challenges to the way human beings often think of themselves and their world – nonetheless informs a book of philosophy that doesn’t just address itself to a lifeless ivory echo chamber.

Probably my favorite facet of Coccia’s writing is not in the body of the text, which is nonetheless quite interesting, but in the notes. The book doesn’t have a bibliography or works cited page, which annoys me, but it does include extensive endnotes. And they’re a gold mine for a dilettante like me.

In several notes Coccia offers readers suggestions for popular treatments of topics in botany, cosmology, and evolution, among other sciences. The notes are also replete with technical and specialized sources, of course, but the inclusion of less specialized materials demonstrates not only respect for the reader but also a refreshing sense that one can (and should) look to sources outside specialized writing in philosophy proper for material to incorporate into writings philosophical.

Coccia is clearly no dilettante, given his training in botany, but his inclusion of popular works in the sciences demonstrates, at least to my mind, an acknowledgment of the importance of the kind of edifying dilettantism one cultivates by reading works of popular science. I’ll explain.


Today, people go to universities to get degrees that will get them jobs. To be clear, this is not a bad thing in and of itself (nor is it new) – many lines of work require specialized and technical knowledge that one can much more easily gain in a formal setting than by going it alone. Universities have specialized equipment, libraries, and other resources that private individuals typically don’t have unless they have Jeff Bezos money. Western universities have their roots in the Catholic church (and, if Christopher I. Beckwith is right, ultimately in the vihara of the Buddhist world via Muslim madrasas). Clergy, lawyers, and doctors made up the entirety of university student bodies until fairly recently, historically speaking, and their courses of study were intended to prepare them for careers in these fields and in diplomacy, etc. However, the focus on utility in education tends to dissolve the more humanistic elements of education understood as a means of improving oneself. As universities become more and more like corporations, the sense that one is doing something more than jumping through a hoop on the way to a job fades into the background.

Even in historic situations where one went to university to, for example, become a priest, the actual knowledge acquisition was supplemented by a sense that one was becoming a kind of person. A newly-minted Anglican priest with bad personal habits (or heterodox positions on the Trinity, like Isaac Newton) would not be likely to go very far in the institution, regardless of their mastery of the material taught.

Like capitalism, which dissolved feudal bonds (a good thing) but then set up new problems, the modern corporate university has largely dissolved the sense of molding or shaping particular kinds of people, all the “educating the whole student” stuff you see in their fliers notwithstanding. Universities no longer act in loco parentis, which is good, and in most cases public universities don’t make weird requirements of their students for purposes of moral control. On the other hand, this means that universities are slowly becoming further and further integrated into the general webwork of hyper-industrial capitalism, creating students who may know how to do a certain job (when they even know that) but who are otherwise uninterested in the world or in learning more about it. Learning, which capital understands purely in terms of “efficient” utility, becomes something one invests in, but under the aegis of all capitalist investment: ROI. Without a strong value proposition and a good possibility of return on one’s investment, learning becomes, at best, a kind of “hobby” – or at least something one does not pursue with the kind of intensity with which an iron-worker with a fourth-grade education in the 1930s consumed offerings from the Everyman’s Library or Penguin. Since from within the mind of capital there is no possible incentive aside from capital accumulation, whatever kind of person is produced by universities must, first and foremost, be more or less completely “mapped” and set up for integration into capital’s net. Of course, being heavily indebted with neither real estate nor financial instruments to show for it contributes to disciplining those whose mapping doesn’t stick.

Coccia’s book, for all its merits, falls victim (a bit) to the blindness to work outside of philosophy that I’ve been describing. He offers a variety of introductory texts on topics in botany, but part of the book’s argument is that philosophy has largely ignored plants, to its own detriment. I’m not in a position to adjudicate this claim, although Coccia makes good arguments. But here’s the thing. There are people considering and thinking about plants and the world. They’ve been doing it for years, but they haven’t been doing it in philosophy departments.

Examples off the top of my head: the works of Loren Eiseley, Michael Pollan, Merlin Sheldrake, Robin Wall Kimmerer and others (without mentioning similar work in fiction, documentary films, etc). Kimmerer works directly in botany, Sheldrake is a scholar of fungus, Eiseley was an anthropologist, and Pollan has written several best-selling books on human interactions with plants and food.

Now, the cynic might object: under capitalism, the only books that get picked up and published by prominent presses are books that fundamentally do not challenge the social order. While these books may be interesting, they can’t actually offer any meaningful change because they are so popular. I have two points in response to this.

First, making this claim does capitalism’s job for it. Like all other forms of social organization, capitalism presents itself as natural. Financial “survival of the fittest” and unethical dealing suddenly become acceptable, when before, practices that are now just rules of the game under capitalism – usury, simony, and the like – were not just crimes but sins, transgressions against moral law. The stakes were much higher than a fine from the SEC. Again, the only incentive capital can see is maximizing profits and accumulating more capital – if you have to behave unethically or immorally to do that, then you can just go to a tent revival or Pentecostalist service, have a blissed-out ecstatic experience that you take to mean assurance of your salvation, and then get right back to “the grind.” Hey, you gotta do what you gotta do, and you have to think that it is natural and normal that this be the case. But here’s the thing: capital is myopic in this way. You, the person living in a hyper-industrial capitalist society, do not have to be. Capital is hegemonic and creeps its way into every nook and cranny of the world, but it doesn’t go all the way down.

Maybe it is the case that Michael Pollan’s books simply serve to reinforce and reproduce capitalist forms of life. But how can you know that if you don’t read them? How can you know whether, buried in the garbage, there are valuable bits that could be used, repurposed, remixed, or argued against? For all you know, Pollan may be keenly aware of the limitations placed on him by the vicissitudes of the book marketplace. Maybe there was a truly trenchant critique of monocropping in one of his books that an editor ordered cut. Besides, since we all live under capitalism, Pollan has to make money somehow. He could do it in ways far more compromising than writing books about fruit.

Second: If anyone hopes to find a way beyond capitalism and its depredations, they should celebrate the fact that anti-capitalist sentiment and critiques of capitalism – some of which do in fact get published by large presses – are becoming popular and, in the process, moving out of niche subcultures and into the suburbs. It is entirely possible that a book one could buy at a ridiculous markup in an airport bookstore, along with Dramamine and some gum, might articulate critiques of capitalism or offer alternatives or food for thought. But one might never know, because the title sounds like something one’s dyed-in-the-wool Hillary voter parents would like. Surely a book available in such a place couldn’t have anything to say to philosophy, Regina Philosophiae Gratia Deo.

I will admit that a book called something like The Subtle Art of Not Giving a Fuck (a real title), or The Flower Child Within: Psychodynamic Gardening against Attention Difficulties (fake, but plausible), does not appeal to me. I certainly wouldn’t pay for either title. But! I would consider checking out a copy from the library or borrowing one. (But definitely not going to certain websites in search of a pdf…) I would read it not to gulp it down uncritically, but to actually engage with the world and with what all the people who will also have to be on board with The Revolution are thinking.


And so, after rambling in the brambles, we return to Coccia and to the possibilities in popular science books. If there’s a point to all this, it’s that insofar as philosophy understands (or understood) itself as a universal discipline, a discipline for which no part of the world is completely foreign or inaccessible, one of the philosopher’s first jobs should be to learn as much as they can about that world, and actually try to do something with that knowledge. Even if that means being a dilettante. Some degree of specialization may not only be unavoidable but necessary in a world of incredible technical complexity. But it doesn’t mean one should pass up anything on the other shelves.

Report from the Workshop: 05/29/2022

This week’s reading:

  • Ada Palmer’s Too Like the Lightning (finished)
  • Philip K. Dick’s Time Out of Joint (finished)
  • Frances Yates’ Giordano Bruno and the Hermetic Tradition (in progress)
  • Philip K. Dick’s The Man Who Japed (in progress)

Reading up next:

  • Jennifer Price’s Flight Maps: Adventures with Nature in Modern America
  • ???? (Possibly something by Samuel R. Delany)

For a while now I’ve been working on a post (I’m thinking of it as an “exhibit”) about my index card files and how I use them in my writing. I was talking with M about it last night, and she gave me some good ideas to make the system not only more useful, but also more interesting for others to look at and use. (She always gives me good ideas!)

I’ll save the bulk of my thoughts on index cards for the essay series after I’ve gotten it going. For now, suffice to say that I have a pretty elaborate (but really not all that complicated) note-taking and filing system. 2022 is also year six of keeping track of (almost) all my reading on index cards in a file box. It might seem odd to read this on a blog, but I find that I do the bulk of my better work with paper and pen/cil. These Reports are some of the only things I plan, compose, edit, and publish entirely digitally.

Without going into too much detail, I was explaining to M how the system works and my recent attempt at overhauling it. It started life as note cards for my first master’s thesis, but I’ve since worked on expanding it. Anyway, some of the cards were labeled with topics that didn’t really work with the system as an open and expandable project. They were too project-specific. So I used white-out tape to cover the old topics, wrote new ones, and refiled those cards. M suggested that I keep the “metadata” of the change on the back of the card, which I hadn’t thought of and now think is a great idea! (I’m also considering including more metadata, like date filed, etc., but that might make adding to the system too unwieldy.)
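For the programmers in the audience, the retagging idea can be sketched in a few lines of Python. This is just an illustration (all the names and the example card are hypothetical, not part of my actual system): the point is that a re-label appends to a history rather than overwriting, exactly like keeping the old topic on the back of the card instead of throwing that information away.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional, Tuple

@dataclass
class Card:
    """One 3x5 card: the text on the front plus filing metadata kept on the back."""
    text: str
    topic: str
    date_filed: date = field(default_factory=date.today)
    # Prior (topic, date_retagged) pairs -- the "metadata" written on the back.
    history: List[Tuple[str, date]] = field(default_factory=list)

def retag(card: Card, new_topic: str, when: Optional[date] = None) -> None:
    """Re-label a card under a new topic, recording the old topic instead of discarding it."""
    card.history.append((card.topic, when or date.today()))
    card.topic = new_topic

# A project-specific thesis card gets folded into the open-ended system:
card = Card("Bruno on memory wheels", topic="Thesis: ch. 2")
retag(card, "Hermeticism")
print(card.topic)           # Hermeticism
print(card.history[0][0])   # Thesis: ch. 2
```

The design choice mirrors the white-out tape plus back-of-card note: the current topic stays easy to read at a glance, while the card’s filing past remains recoverable if you ever need to reconstruct where it used to live.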

During our discussion I mentioned how the system has holes and ambiguities. I’ve had people tell me that Evernote, for example, does all the things I want my card catalogue to do, digitally, without the need to write stuff out by hand or take up space. I could have the entire catalogue in my pocket, but I have a variety of reasons for preferring the analog method (which I’ll go into in the essays). For now, I just want to make some observations:

1. No filing system or note-taking system is perfect. Things will inevitably go missing, get misfiled, get destroyed, or something else. That’s the world, baby. And realizing this makes actually putting together a filing system much, much easier. (I work in a university archive and can confirm this. Stuff gets misfiled once, not called for again potentially for decades, then no one can find the thing until someone happens upon it looking for something else. Either that or one part of a collection is filed in a different place from the rest of the collection, but the finding guide doesn’t say that. You know, that sort of stuff.) If you know this will happen regardless of how hard you work, it takes the pressure off to get everything perfectly correct every time. Good enough is, as is often the case, good enough.

2. The less work it takes to keep a note-taking system organized and running, a. the less likely it is to be useful, while b. the more bloated it will become.

To point 1: While I try to keep a central repository for quotations and another for ideas, sketches, and snippets, I know I don’t have a complete record. I can’t have a complete record. The napkin I wrote something down on got washed. I wrote the quotation down right, but forgot to write down the page number of the book. It was, obviously, a library book, and that book is now, equally obviously, not available. Guess I’ll need to file that in the “problem” section and find another quotation to support my argument. If I had it in my head that I needed to keep track of every single idea or every single quotation I marked, I’d not only never get them all organized, but never have the time to use them. While accuracy and thoroughness are necessary to make the system work, keeping in mind that I shouldn’t even try to make the system exhaustive or “complete” means that I actually end up using it. It also has the benefit of spurring continued work on the system, which brings me to point 2.

To point 2: I write out ideas, lines of description, and quotations on lined 3×5 inch notecards. I do this all by hand, typically using a ballpoint pen. (I mark passages in library books in pencil, obviously.) Then I label them, alphabetize them, and file them. All manually. Typing might take less time, but the time (and effort) is part of the point. When I flip back through the book looking for my marks around useful passages, I have to keep in mind that I’m going to have to write out whatever I end up keeping by hand. Since I want to be able to use the notes later, they have to be neat, which requires care and time. I can only go so long before my hand gets tired and my handwriting starts suffering. Each card, then, already represents a decision: it costs time and effort (and a card and some ink) to make each card, and that means that I don’t copy down everything I mark. Just particularly useful or well-put lines that will fit on one side (sometimes a bit on the back) of one index card.

This doesn’t include the labeling, keeping track of topics on the backs of bibliography cards, filing, and refiling after using the cards. I’ve put physical effort into the system and it takes up physical space as a collection of discrete objects, meaning that using it feels much more “real” than reading highlighted text from a pdf. It also makes the cards easier to manipulate, and since the quotations are already the result of some level of discernment (whether they were worth the effort or not), I feel confident that I won’t be wading through repetition or stuff that doesn’t really matter. Plus rearranging index cards makes much more tactile sense to me. Plus, they never need to be charged.


That’s all for now. I’m working on getting into a regular posting schedule, at least over the semester. I expect the index card series to take me a fair bit of time to plan, draft, and prepare (including pictures!), but I’ll shoot for having part 1 up later this coming week.

Spandrel: Plants want to grow

[spandrel]

…I discovered some volunteers in the compost pile! They appear to be tomatoes, surely sprouted from some Roma or cherry tomatoes that I fed the pile some time ago.

volunteers in the compost pile (sometime at the beginning of May, 2022)

Volunteers interest me because they demonstrate something important: plants want to grow. Given even somewhat right conditions, plants will sprout and get to creating their own environment, reaching up to the sun and down into the earth at the same time.

In a way, the position of plants resembles the way Buddhist traditions understand humanity. Unlike the Abrahamic traditions in which humans represent a particular and special part of God’s creation, in Buddhist thought humans don’t differ ontologically or teleologically from other beings. Human life remains valuable, but, at least in the Mahayana tradition, differs only in degree from other forms of life rather than in kind. Hence the prevalence of vegetarianism in Mahayana schools. Humans, (in)famously, make their own environments. But only after plants have made one for us first.

But though humans don’t differ in kind from other beings, human life nonetheless remains precious because we are ideally suited to awakening to what the Buddha taught: that life is characterized by suffering. Humans are conscious and, therefore, aware of our suffering as such, but we are also relatively limited in our abilities to remove the possibility of suffering from our lives. By contrast, gods and other celestial beings “higher” than humans enjoy long lives devoid of suffering, but because their lives are completely carefree, it never (or rarely) occurs to them to try and figure out the problem of birth, death, and rebirth. In traditional accounts, this means that after death gods and celestial beings, having done nothing to expiate their bad karma or gain merit and insight, tend to be reborn “down,” as animals or humans.

Many cultures use trees as symbols connecting the worlds above and below, and for good reason: terrestrial plants are the reason any terrestrial animals exist, and, therefore, any intelligent life (on this planet). The bumper sticker that reads “Trees are the Answer” isn’t incorrect, just a bit limited in scope. Plants convert solar energy into sugars and turn solid rock into soil, making life possible for others – in a way, we humans (and other heterotrophs) have a parasitic relationship with the autotrophs that built the possibility of our world from bare rock and CO2. [Cue Agent Smith in the scene where Morpheus is chained up in The Matrix.]

The same volunteers, the end of May 2022

But parasitism isn’t exactly right. Humans are responsible, along with other factors, for the extinction of a great many plant species, but we also have something approaching a commensal or mutualist relationship with a significant number of others. But regardless of our relationships with them, humans can learn something from the plants around us: plants make worlds that suit them without thinking about it. We humans, parasites par excellence, can’t not think (I think?). But if we look to plants, who patiently make worlds day in and day out, sometimes suffering, often dying, we might catch a glimpse of a positive way of being, a way of being that makes worlds constantly, because it wants to…

[/spandrel]

Report from the Workshop: 05/22/22

Report: plants, writing, and knowing what happened to you.

[Note: this post will consist of slightly more “confessional” material than I would usually publicize. Not baring my soul type stuff, but maybe “meditations” like Descartes (only without my converting to total rationalism).]

Earlier this week I received confirmation that when I injured my knee in March I not only completely tore my ACL, but also sprained my MCL and damaged two meniscuses. Cautionary tale not to get dancing-drunk at weddings, I guess. I appreciate now actually knowing the score rather than being caught between optimism and frustration, but knowing has also brought new problems.

M has several medical professionals in her family, and has passed my information on to them. One says that I might not need surgical repair, another seems to suggest that I definitely should have surgical repair. I await an appointment with an orthopedic surgeon to get more insight.

Here’s the confessional part: I hate doctors. Not the people themselves, or Medicine in general (this isn’t some anti-vaxx bullshit), but going to the office itself, sitting and waiting, getting weighed and measured and blood pressured, waiting again, getting poked, prodded, told to lose weight, asked questions I feel like I’m supposed to know the answer to but don’t, being expected to advocate for myself when all I want to do is get the fuck away from the linoleum and scrubs and standard-issue old magazines and bad landscapes on the wall. I’ve been working with a physical therapist for a few weeks now to get my knee stronger and I not only like her, but actively enjoy the sessions. And yet, the part of the building outside her office is The Doctor’s, and my heart is always nervously pounding when she comes out to greet me. Even my old therapist, whose practice was in an office building and looked distinctly unlike a medical clinic, made me nervous. (She was surprised to learn this since she wasn’t a psychiatrist. I said I couldn’t imagine anyone not being terrified.) About the only people I have a more severe allergy to than doctors are sports coaches, “motivational” people, and obnoxious businessman types.

Needless to say, the possibility of surgery, of entering the very Belly of the Beast (the hospital) does not have me feeling great. Knowing that needles will likely be involved makes things worse.

Why do I share this? For one thing, I have tried to see this as a way to do some desensitizing training. If I go to the doctor enough times without anything bad happening to me or anyone making me feel bad, maybe I’ll start feeling better about it. I don’t think anyone enjoys going to the doctor, but my aversion is so severe that, before covid, I hadn’t had a flu shot in nearly a decade. I recognize that, for some, this will seem horribly unethical, but it’s hard to express how much the thought of needles makes me afraid. My blood pressure skyrockets, I start shaking, I turn pale, and, in extremis, I start saying really, really nasty things.

I will probably be posting more about this topic as a way to try and deal with it. Maybe.


On a different, and more pleasant, note, M and I went to the botanical gardens this morning. I wasn’t able to walk all that much, but we enjoyed the flowers and the nice weather. I’ve mostly been planning my own garden since I can’t really do much physical work in the yard because of my knee. As we walked I got to thinking about taking cuttings and the ethics of taking cuttings.

I recently read an interesting book by Emanuele Coccia called The Life of Plants. I’m still processing my notes from that reading, and will eventually post a full-length review here, but I wanted to mention a point that Coccia discusses: plants make their own environments.

Right now, our backyard is mostly sand and gravel. This is not uncommon in Albuquerque, where lawns are an expensive (and wasteful, I would argue) use of water. There are, of course, all kinds of plants that will grow here just fine without supplemental irrigation after establishing themselves, but figuring that out takes more effort than I think many people are used to. I’ve done a bit of planting and some puttering about trying to shovel up gravel, but the bulk of the work will have to wait until my knee is better.

I got to thinking about taking cuttings because plants can be fickle. They don’t always do what their planters want them to do, regardless of what it says on the label. A good bet, then, is finding established plants around you and trying to propagate them individually, letting them make an environment that humans and other people can enjoy as well. Taking cuttings also makes me think of my reading and writing process. I read a book, mark good lines, and then “take cuttings” by writing the lines out on index cards and filing them. (I promised a fuller treatment of this practice in a previous post and, now, renew my promise to get it written. At some point.)


Aside from posts to this blog, I haven’t done much writing lately. With the semester now truly over and some more daylight hours available to me, I hope to get into a more consistent writing habit. I started working out a short story earlier this week that I think has legs. Of course, one benefit of surgery would be an excuse to sit and write all day as I recover. We’ll see.

Spandrel: Teaching God to Behave Itself

[/spandrel]

…Christian Gnosticism is interesting to me because it posits that the world was not created by a perfect being with perfect omniscience and omnibenevolence, but rather (in some versions of the story) by Yaldabaoth, the misbegotten creation of Sophia, one of the Emanations of God furthest from the perfect Pleroma where God him(it?)self dwells. Yaldabaoth, also called the Demiurge, is the first mis-creation in this world, and therefore fancies himself a God over all the parts of creation that come after him. [From a Buddhist perspective, this makes him the most deluded being in the universe given that he not only fails to realize that he too is marked by impermanence, but in fact claims to be the only being not subject to birth and death.]

Since, from the Gnostic perspective, the world contains evil because it was created by an imperfect being, the answer to “why does God let bad things happen to good people” is (partially!) resolved: God lets bad things happen to good people because 1. the being we’re calling God isn’t really God, and 2. that being is partially himself evil. Or, at least, deluded or ignorant and so not omnibenevolent since not omniscient.

This also means that in some instances Gnostic thought posited that spirit, since it ultimately still does come from God (the real one, not Yaldabaoth), is good, and the material world created by the Demiurge is therefore entirely bad. Here I part ways with the Gnostics.

Some positions in Gnosticism also posit that the Demiurge (Yaldabaoth) is actively evil or antagonistic. I think this is too simple. It’s a much more interesting situation if the Demiurge isn’t good or evil but maybe blind, ignorant, impaired or deluded somehow. Then, I think, we can do something about it by engaging with the world.

Nietzsche argued that in his attempt to “make room for faith,” Kant assassinated God without having the guts to declare himself the killer. (See also Dostoevsky’s Crime and Punishment – the “pale criminal,” indeed). When Nietzsche’s Zarathustra comes down to the Motley Cow and wonders that the townsfolk fail to realize that God is dead, the ramifications of Kant’s wordy stiletto-thrust have yet to make themselves fully felt. No one seems to get it, no one seems to care that something momentous has happened. Instead, the last men “blink.”

What if God is not dead – if he was ever alive at all – but rather blind, groping, maybe crippled or confused? What if Nietzsche mistook the death of an image of God – the late-Medieval perfect and omniscient God of a rationally ordered cosmos with a hierarchical chain of being built into it – for the end of the possibility of God as a kind of structural point, like the “it” in “it is raining”; an asymptote, or a singularity that allows for the possibility of relations with it, but is not itself a real object?

The job, if God is not dead but ignorant or deluded, would be to teach him how to behave through our own actions and example. If humans have free will despite the cosmos having been mis-made, then we can freely choose to look back at God and say, “pay attention. I’ll show you how it’s done.”

[\spandrel]

Spandrel: Strategic Belief

[Note to the reader: a “spandrel,” in architecture, is the roughly triangular space between one side of an arch and the ceiling above it. In medieval churches, these would often be filled in as decorative elements. In biology, a “spandrel” is a phenotypic trait resulting from some other trait, rather than as a direct product of natural selection. The human chin, for example, is proposed as a biological spandrel since it apparently doesn’t do much. In both cases, the term seems to mean something like a “byproduct,” one unintended or incidental, maybe, but that might nonetheless find some use. I’m using the term for some incidental (or “occasional”) thoughts that don’t quite have the legs to become essays yet. Because I think of these posts as fragments, I’ve included ellipses on either end, connectors to a context not yet defined.

Housekeeping note: for now I will probably file these posts under “essays,” but I may create a dedicated page in the menu if I find them proliferating.]


[/spandrel]

…once I had a professor tell me that he was a “strategic Freudian.” We were reading Words With Power: Being a Second Study of ‘The Bible and Literature’ by Northrop Frye, and he was responding to the question of whether Frye’s analysis of myth as the disavowed basis for social order was similar to Freud’s ideas on repression.

I don’t remember what he actually ended up saying after declaring his strategic Freudianism, but the idea has stuck with me for some time not only for its rhetorical possibilities, but also for its potential therapeutics. Rather than insisting on belief as an all-or-nothing proposition, strategic belief would seem to allow for greater flexibility and more fruitful generating of thought experiments or hypotheticals.

Of course, one could argue that “strategic” belief is not belief at all, but a kind of cynical relativism. I would respond, at least initially, that here the emphasis should be on the strategic element. Strategic belief tries to do something with the idea in question. Of course, that might still lead to unintended negative consequences, but it still strikes me as potentially quite useful.

On top of that, cracking the rigidity of one’s beliefs by intentionally “inhabiting” an idea one might not entirely agree with seems potentially useful to me as a way to put some hairline cracks into the calcified edifice belief becomes when it has no challenge or nuance. I’ve talked about the idea of “translucence” of one’s mind to oneself (see here and here). Strategic belief seems like it might prove helpful in opening those shades a bit more, letting more light in…

[\spandrel]


Essay: Some general principles, part II

[I’ve written this post to stand on its own, but readers curious about why the numbers start with “3” will want to take a look at the first post in this series.]

Make things explicit

Make your commitments explicit – at least to the greatest extent possible. That’s where I start these considerations. Most of us, most of the time, run on a kind of autopilot, not really thinking about what we are doing or why, and letting habituation run things for us. This phenomenon should not come as a surprise to anyone, really, and I don’t intend to argue that one should work to remain fully aware of everything all the time. I would guess that just about everyone (at least everyone who lives in the modern world of jobs and commutes) has gotten in the car or on the bus and found themselves halfway to work before remembering that they meant to go to the drug store.

Injuries throw this phenomenon into sharp relief. My knee has improved substantially since I injured it in March, but I still have to be careful how I move. I was shelving books in the archives on Monday and needed to get up to the top shelf. I stepped onto the little rolling stool (you know, the vaguely cylindrical one all libraries have), and stepped up. In so doing I put just a bit more pressure on my injured knee than was wise, while also twisting a bit. Nothing popped, and the pain went away as soon as I straightened my leg out, but the experience told me that I still need to exercise caution. I can’t let my knee go on autopilot. Yet. Injuries, then, also show that the ability to do things without having to actively concentrate on them is also crucial to normal functioning.

Another reason for making one’s commitments explicit has to do with the therapeutic effects of writing these thoughts out. I don’t intend to suggest that writing about your problems will solve them (not least because I have no medical training), but at least in my own experience, getting these things out does often help one identify places where one might have some agency, an opening for something new or different. It also helps one find the knots and holes in one’s world, the places that still need elaboration – or even to be exhumed.

On that note, I’ll continue with my general principles. Again, I take these as kind of “rules of thumb” that tend to structure my life, but that don’t go “all the way down.” Hopefully they make some kind of sense.


3. You don’t just see the world – the Earth worlds

I don’t think it constitutes going out on a limb to suggest that a real, mind-independent world exists outside of us. However, the suggestion that any individual could have complete, “objective” access to this world – to “see the world as it is” – strikes me as totally bananas. And also harmful.

For Heidegger, the Earth “worlds” (where “worlds” is a verb). This means that the Earth, or parts of it, coheres into a particular “world” for a particular person based on that person’s existential projects. That is, how the world appears has to do with who a person is and what that person, therefore, does. Once you start actively trying to notice birds, you will see them everywhere, all the time. Try it. Once you learn how to surf, the ocean looks different than it did before. Now you see that the calm waves that appealed to you for swimming or floating aren’t any good for surfing. What appears and how it appears to you as the Earth worlds will differ for everyone. This is not to say that each person sees a different and mutually incompatible world – the point is rather that the valence, trajectory, or flavor of each person’s world differs from that of anyone else’s.

Timothy Leary (I think) called this a “reality tunnel,” making use of the emic/etic distinction from anthropology to make this point. For those unfamiliar, an “emic” perspective is “within” the object of study. A historian studying the social effects of Sufi lodges in Late-Ottoman Turkey who is also herself a Muslim, approaches the topic, at least partially, from an emic perspective. Another historian studying the same topic and period who is not a Muslim, on the other hand, would be approaching his research from an “etic” position, “outside” the object of study. Now, problems exist with this distinction, not least because any observer affects their object of study somehow, and often in ways unpredictable ahead of time. So, really, no one ever has a “pure” etic perspective because one must have some connection to the object/topic/person/etc. at hand in order to make any sense at all of it. But though “purely” etic perspectives remain beyond reach, one can, crucially, remember that one has a particular perspective.

Making one’s principles and beliefs explicit serves a useful purpose here, as well. Like most of these principles, I question (when not flatly denying) the possibility of transparency rather than translucence. I can know that I have a particular position, and bear this in mind when talking with other people – especially people I disagree with – but I can’t once and for all, thoroughly and systematically lay it all out. Why not?

For one, where is the “I” that would do this? Are these not “my” opinions and commitments? If “I” can see through myself, I have to see myself seeing myself, then myself seeing myself seeing myself, ad nauseam. Whatever “I” sees through me has to already be behind me, but by then I have two “I”s! So no. On pain of infinite regress, I cannot know my own mind transparently. But I can, by making my thoughts explicit, gain some translucence. How?

Let’s say I spend an hour or so writing out some thoughts. Then I get up from the table, make some tea, eat dinner, watch TV, whatever it is people do. During most of these activities I return to autopilot mode. [Just to clarify, this is not bad. It is necessary for life and not (completely) avoidable.] The next day, still in autopilot mode, I get up, go to work, do whatever it is I do for a job, come home, and find my notebook still sitting on the table. I idly flip through it while waiting for some water to boil and turn to the page where I made my thoughts explicit. Suddenly, I find myself face to face with myself but outside of myself. I see the words I wrote and find myself “snapped out” of the autopilot, if only temporarily. I have more thoughts on the importance of what French philosopher Bernard Stiegler calls “tertiary retentions” or “epiphylogenetic retentions,” which lead to the next principle.

4. Cognitive structures don’t stop at the inside of your skull.

I consider it a significant limitation that people seem to think that their brains are where they do all their thinking. Even more limiting is the notion that one’s “self” is something external, transcendent, and unconditioned – a Cogito from…somewhere driving one’s body around like a car. Maybe the most limiting idea, actually, is the accompanying notion that thoughts are not “things” that motivate action or that one has some degree of control over. Thoughts aren’t “real,” since they happen “inside.” My principles on this require careful elucidating, and I’m not even sure I’m getting what I want across. I’ve given a preliminary and clumsy presentation of some of these ideas in my most recent Report from the Workshop, so hopefully readers won’t lack all familiarity. Consider principle #4 the most “work in progress” of the principles so far.

Consider: When you say what you think, you use words that are publicly available. Wittgenstein’s “beetle in a box” thought experiment has convinced me, at least, that we don’t have access to a “private” language – to use language, we must be integrated into a previously existing symbolic structure and adopt its use and conventions. When one speaks one’s native language, it feels “natural.” One might occasionally struggle to find the mot juste or to push a phrase off the tip of one’s tongue, but in general one’s native language doesn’t feel like speaking a language at all – it just feels like speaking. Contrast this with learning a foreign language (especially as an adult) – even speakers with a high degree of proficiency might still struggle sometimes and make mistakes. It takes a long time of dedicated practice and use for a language other than the language one grew up speaking to just feel like speaking.

Further, consider the ways that we use language. Speech comes first chronologically, although one listens long before one can speak with ease. After speech come reading and writing, complicated skills that sometimes present distinct challenges, but that, in most cases most of the time, any child can learn. The means by which we read and write – clay tablets, papyrus, bamboo slips, palm-leaf manuscripts, rag paper, digital screens – all exist outside of us (in the sense of not being part of our bodies). But today, at least in “developed” countries, it is nearly unthinkable for an adult to have never learned any reading and writing. “Functional” illiteracy, or not having read a book ever again after high school, we can understand, but not not knowing what a book is at all.

Reading, then, stands as a kind of language use that requires things “external” to us – objects that do not have vocal cords, mouths, and lungs. And these objects, once we adopt them, cannot subsequently be separated from us. No matter how hard I try, I cannot “un-learn” how to read the languages that I read. I might be able to train myself to focus on the letters I read rather than their significance as words, but even if I study typefaces for their aesthetics, I read the words the type spells out.

A “reading” mind is, then, different from a “non-reading” mind. And the differences don’t stop there. Consider reading a physical codex versus reading a digital version of the same text. While both count as “reading,” in the sense that one visually decodes arrangements of symbols, these readings nonetheless differ substantially in a variety of ways too detailed to go into here. Suffice it to say that writing a to-do list on a piece of paper, or in an application on a smartphone, is a good example of the extent to which one’s mind does not stop at the inside of one’s skull, or even at the inside of one’s body generally.


I hope the reader will forgive my clumsy writing in this post (and the previous one in the series). Part of working to make one’s principles explicit involves making false starts and persisting at the edge of one’s conscious experience. I’ve insisted that one doesn’t have fully transparent access to the contents of one’s mind, but one can gain a certain degree of translucence. I’m making these posts public because it strikes me that writing so that others might read and understand what I’m talking about forces me to take an even further step outside my own head. I can’t rely on shorthands and assumptions, since I remain aware that others might not share them.

Report from the Workshop: 05/15/2022

Earlier this week I wrote this post about my birthday last month, describing how M got me some cool stamps and offering some ruminations on my mobile Wunderkammer full of personal trinkets and their memories.

Also for my birthday, I treated myself to a book I had anxiously awaited for some time: On Quality: An Inquiry Into Excellence, edited by Wendy K. Pirsig. This slim volume collects some important excerpts from Robert M. Pirsig’s two groundbreaking novels, Zen and the Art of Motorcycle Maintenance and Lila, as well as previously unpublished letters and transcripts of a few talks the famously reclusive Pirsig gave over the course of his life.

Robert Pirsig’s work has influenced me deeply. I’ve probably re-read Lila about five times, although I’ve only read Zen once I think. I’ll have a full review of On Quality up probably by the end of next week, but until then I wanted to talk a bit about tools.


In addition to excerpts and other occasional writings clarifying Pirsig’s understanding of Quality, the book includes photographs of some of Pirsig’s tools taken by his nephew, David Lindberg. The photos are quite stark, in black and white, and depict wrenches, planers, sockets, and other implements that one might find in any well-appointed garage. The tools don’t appear to consist of anything really all that special – Pirsig seems to have preferred the Craftsman brand – but the photos demonstrate something that resonated quite strongly with me as I considered the treasures in my Wunderkammer: these tools were clearly used.

I have a long-time interest in tools and their use. I’ve been re-reading Samuel R. Delany’s wonderful Nova this week, which also has some relevance here. Taking place in the third millennium of the Common Era, Nova equips its characters with sockets at their wrists, ankles, the base of the spine, and the nape of the neck that allow them to “plug in” to machinery ranging from P.A. systems to starships, controlling the machines with their own neural impulses.

Nova was written long before The Matrix saw the silver screen in 1999, but the idea of plug-in interfaces between humans and machines is probably about as old as machines that could be plugged in at all. [This would be an interesting research project. I’ll add it to the Compost Heap, about which more in an upcoming series on my index card catalogue, but will probably never get around to following it up.] What interests me about the plugs in Nova, versus the single “jack” at the back of the neck in The Matrix, is that the characters of Nova have plugs on their extremities as well – there’s something bodily, something somatic about their “plugging in” to the machines, and these allow one to “plug in” to the real world. While The Matrix deals with philosophical questions of the “brain in a vat” type, Nova addresses the idea that the brain by itself is not what makes humans capable of using tools. In a real sense, the brain does not stop at the inside of one’s skull. Nor does it stop at the tips of one’s fingers, or the palm of one’s hand. The brain is not separable from the body, and the body is not separable from the tools it uses.

To return to the photos of Pirsig’s tools: apart from photos of what appears to be some kind of garage-grade hair dryer, a drill bit, and a set of router attachments, Lindberg’s photos depict hand tools, and these tools have clearly been used. The shining steel wrenches are blemished and worn. Some are probably quite old – Pirsig lived into his eighties and died in 2017 – but their wear and tear is clearly the result of use as well as age. There is also a photo of Pirsig’s 1966 Honda Super Hawk motorcycle, now housed in the National Museum of American History. One can easily imagine Pirsig using his tools to keep the machine in good repair.

From the photos in On Quality one gets the impression that Pirsig kept his tools organized and neat – but not too neat. The photo on the book jacket depicts a drawer of wrenches and sockets pulled out from a cabinet. They are laid neatly in the drawer, but not, it appears, in any particular order. While the viewer cannot know whether the objects in the photo were staged this way or not, I like to think that this depiction is an honest representation of how Pirsig arranged them. The photo gives, for me anyway, the sense that though these tools were not organized by size or type, Pirsig would have immediately known where they were when he needed them, probably without even having to think about it. Like the objects in my Wunderkammer, or the files and stacks of papers in my office, the tools and their arrangement represent the life of their user.

I have some wrenches and sockets, but they have yet to see anything like the use Pirsig clearly put his own tools to. Rather, I have pens, pencils, index cards, notebooks, and a laptop. How could these things be similar?

Nova depicts humans interacting “directly” with machines, making the machines parts of themselves through neural connections that obscure the separation between human and machine. The self, then, and I think Pirsig would probably find this thought amenable, does not exist “inside” one, separate from the world “outside,” but rather takes its existence in part through the tools one uses, the “external” aspects of the body that one takes up. Sometimes, when I’m writing something with a pencil, or typing on a laptop keyboard, I lose all sense that the computer, the pencil, the paper, is “outside” of me. I don’t have to think about what I’m doing, because part of what it means for me to do anything at all means taking up things outside of my body (which, as Emanuele Coccia points out in The Life of Plants, is both inside and outside, container and contained, of the atmosphere responsible for the possibility of my life). The computer is made of metal and silicon rather than flesh, the pencil is made of cedar instead of bone and sinew, but at some point I don’t remember this at all, and it becomes one thing.

When I write, it feels like I’m working on something, bringing something out, tuning something up. It’s like working on a motorcycle without knowing whether the thing in question actually is a motorcycle rather than an outboard boat engine, or even a clockwork windup toy. I arrange my tools, get the material in front of me, and get to work, staying out of my own way. We don’t need to “plug in” to machines and technological devices external to us – that tools are not foreign to us, that they are, in a sense, not even really “outside” of us, is probably the single most natural thing to human Being.

Essay: A Belated Birthday Address

On April 26th I turned 34, an age I like because 3 is a prime number and 4 is the square of a prime number. Both digits add up to 7, which is also prime (hell yes), as well as an auspicious number in several schools of esoteric and hermetic thought. Of course, next year is 35, in which both digits are primes (sick) but their sum is not (bummer). However! 8 is two cubed, and that’s still pretty good. Cubes strike me as even more esoteric.
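A playful aside for fellow digit-obsessives: the little prime claims above are easy to check mechanically. Here is a toy Python sketch (the function names are my own invention, not part of any serious numerological canon):

```python
def is_prime(n: int) -> bool:
    """Trial-division primality test; plenty fast for single digits."""
    if n < 2:
        return False
    return all(n % d != 0 for d in range(2, int(n ** 0.5) + 1))

def birthday_report(age: int) -> dict:
    """Report the prime-ness of an age's digits and of their sum."""
    digits = [int(c) for c in str(age)]
    return {
        "digits": digits,
        "digits_prime": [is_prime(d) for d in digits],
        "digit_sum": sum(digits),
        "sum_prime": is_prime(sum(digits)),
    }

# 34: digits 3 (prime) and 4 (the square of a prime); digit sum 7, prime
print(birthday_report(34))
# 35: both digits prime, but the sum (8, i.e. 2 cubed) is not
print(birthday_report(35))
```

Running it confirms the scorekeeping: 34 loses a point on the 4 but wins on the sum, while 35 sweeps the digits and whiffs on the sum.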

And so, I’m firmly in my mid-thirties. That doesn’t really mean a whole lot to me except that now I have to care about the lumbar support on my desk chair in a way I didn’t before. Turning 34 has, however, gotten me thinking: how long before I can justify becoming a model train guy? I used to think I would wait until about 40 for that, but with the turning of another year plus having a nephew that will be ambulatory in short order, I’m wondering if I shouldn’t speed up the timeline some. I don’t wear white New Balance sneakers and I have little interest in either the Civil War or World War II, so I need something for some uncle cred. I already like birdwatching and, now, stamps, so maybe model trains are a logical next step?


For the last five years or so I’ve been participating in Postcrossing, an online platform that allows members to send and receive postcards from all over the world (maybe a great-uncle type thing to do). I love digging through my postcard collection to find the perfect one for each new address I get, and it’s always fun to get a card from someplace far away in the mail. But it isn’t just about the cards. I also love to see interesting stamps from all over and use nice stamps on my own cards. For my birthday this year, my lovely wife, M, got me some vintage US stamps. She also gave me some old canceled stamps from different countries around the world. One was a set with a dinosaur theme that included a stamp with a pterosaur on it from, wait for it, North Korea. Badass.

When I opened the little packet and realized what was inside I spent about a half hour ooohing and ahhhing over the postage. I would never have thought to buy the foreign stamps for myself, which made the gift all the more special. After arranging them all on the table I cracked up because of how silly I felt. Apparently I’m a character from a Charles Dickens novel: morally complicated, kind of stuffy, living in a hell-world of hideous economic inequality, and pumped about old stamps. I also wanted a pudding (flan) for a birthday treat, which made me feel like Oliver Twist or something. M was relieved when she saw my reaction to the stamps – she thought I would think they were silly, but they turned out to be some of my new favorite things.

I haven’t arranged the stamps in an album yet, but it’s on my list of things to do. Again, as a Victorian child, I love an album. I have binders of postcards from my several years of Postcrossing and an album of ticket stubs, money, and other paper ephemera from my various trips. Almost all of my class notes from undergrad and grad school are squirreled away somewhere in plastic bins or folders, as are old journals. My photographic experiments with a Fujifilm Instax camera that M gave me a few years ago (she’s a talented present-giver!) now nearly fill three of their own albums as well. But for all that, I don’t think I could ever be a serious stamp collector, even though it would give me a reason to buy yet more albums. Not least because I don’t have the money to “invest” in stamps – or anything else for that matter, although I will say that anyone who wants to give me a gold Krugerrand would get Christmas cards from me for the rest of my/their life. (To be clear, not because Krugerrands are made of gold but because I think the name is funny, the guy on the face side has the beard-but-no-mustache look of the seafaring people from The Wheel of Time, and the other side has a Springbok in the process of “pronking.” Plus, it’s legal tender in South Africa but doesn’t have a value stamped on it, which I like for some reason.)

While I’m kidding about the Krugerrand (mostly), I do often find that the things that most interest me aren’t the things that command the greatest price or airtime. I don’t mean the platitude that “the best things in life are free” or some variation on that theme. That old chestnut is simply not true – a postcard from a botanical garden, pistachio ice cream, a nickel flattened by a train, or a movie ticket stub that you find inside a used book aren’t free, even though they’re some of the best non-people things I can think of off the top of my head. Rather, I mean that collecting things for the sake of completing a set, or because of some externally defined standard of valuation, strikes me as odd. What if the other members of the set are ugly, or expensive? Or, the worst possible thing, not interesting? If I’m going to collect anything, it’s going to be because I like the particular items in question, because I find them interesting – I don’t really care if they’re valuable.


In my office at home, on a high shelf in the closet, I have a box of treasures. One of these is a McDonald’s toy shaped like a Big Mac that “transforms” into a dinosaur. I have had this toy since I was in the single digits in age, and it fascinates me. The box also contains cards, bits and bobs, and various other trinkets that I have accumulated over my life and that have either sentimental value or interest for me. None of these things is valuable in the monetary sense (I think), but I nonetheless like to think of this box (which started its life as one of those picnic boxes for a wine bottle and later carried, hilariously, a small hookah) as a kind of mobile Wunderkammer [literally “room of wonders”], a nomadic Cabinet of Curiosities that I’ve carried from one residence to the next over the years.

I don’t open the box to look at the things inside very often. It’s actually kind of an intense experience, bordering on the unpleasant. There is lots of Time in there, and Memory. In a way, I’m not sure I even keep it for myself. Maybe this is a bit Romantic of me, but I think I actually keep the things inside the box for posterity. I think that’s also why I’ve never been happy with ereaders and prefer physical codices, why I keep ticket stubs, why I take notes by hand, and why I’ve kept journals and notebooks – sometimes obsessively – for most of my life.

Last night I was working on my card catalogue (about which more in an upcoming series) and came across this quotation from Byung-Chul Han’s The Burnout Society:

The imperative of expansion, transformation, and self-reinvention – of which depression is the flipside – presumes an array of products tied to identity. The more often one changes one’s identity, the more production is dynamized. Industrial disciplinary society relied on unchanging identity, whereas postindustrial achievement society requires a flexible person to heighten production.

The Burnout Society, 44.

I would substitute “hyperindustrial” for “postindustrial” here, but the point remains the same: personal identity has been well and truly hooked to market circuits of consumption, and the more identities change, the more there is to consume. To “Become a Stamp Collector,” I would have to buy: albums, stamps, other people’s collections, books or memberships to websites, flights to go to conventions, etc. A pretty penny for somebody, and more or less trackable and calculable. If Amazon sees me order a stamp album, you can bet your britches that my recommended items – now populated by books on Hegel, knee compression sleeves, and gardening stuff – will soon include reference books on stamp values, and maybe even some numismatic stuff (why not? I collect stamps, don’t I?).

I don’t think Han’s point is to argue that we should work to have a permanent, unchanging core Self that stays consistent across time. At least, I would hope not if he considers himself a faithful reader of Heidegger (which appears to be the case). Rather, the point is that hyperindustrial society incentivizes changing one’s “identity” on a whim, via consumption, which then dynamizes production. With a weekend, a new Twitter account, and a few hundred dollars, I could become a sneakerhead, an “energetic healer,” or (God forbid) a crypto guy. The danger, then, is that I would mistake my consumption habits for truths about myself, rather than see my “self,” such as it is, as a process of projecting into a future that is not yet known and remains unforeclosed. The self, on Han’s account, is plastic. I would argue that this is not new to hyperindustrial or postindustrial society, but rather has always been the case. Capitalism has just established a parasitic relationship with this pre-existing aspect of human Being.

So what does this have to do with stamps, exactly? Or that weird treasure box I’ve described? If human Being is plastic enough that it can be harnessed and channeled via patterns of consumption as a vehicle for the flow of capital, this Being can also be modulated, molded, formed, and shaped in other ways, including through one’s immediate surroundings: one’s objects and the significance they have. While I’m sure that the sneaker guys who buy and sell and collect shoes take some sense of abiding satisfaction from that pursuit, it still strikes me as having capitulated to the insistence of hyperindustrial capitalism that its cellular components (us) exist the way it wants us to. It would strike me as a bit sad, although maybe this is unfair, to imagine a sneakerhead’s grandchild finding grandpa’s shoe collection in a storage locker. Would they spend time poring over it, reliving and widening their experience of their grandparent?

The things I keep, that I find valuable and wonderful, help to bolster and reproduce my own sense of self and the kind of Being I want to inhabit. But I also think of these things as affecting the Being of those who come after me. What will my grandchildren think, for example, of the reading log I’ve kept on index cards for years? For one thing, they’ll probably find it a pain in the ass that I insisted on keeping these records on paper, rather than in an Excel file. But, hopefully, they’ll also see the way my handwriting changes over time, growing spidery as I age until eventually someone else has to write the cards for me. Maybe they’ll think of the dusty shelf where I kept the box as it filled up, and then think of the shelves full of books that I insisted on long after physical books on paper stopped being “practical.” What will they think of the odd assortment of bits and bobs in my carry-on Wunderkammer? The funny rocks, the stamps and bookmarks and ticket stubs and boxes of notebooks they’ll find when I’m gone? Will they be able to simply throw these away?


I’ve given the ideas above a fair amount of thought over the years. One of the handful of things I’ve published is a story called “Going Home” that touches on this sense of taking things with one, especially inconvenient physical objects. You can read it at The Ekphrastic Review here.

I didn’t set out to write something “serious” when I started drafting this essay. I guess things just turn out that way sometimes. One last point before I leave off philosophizing, though. If you weren’t convinced by my meditations above, think of it this way: M gave me some old stamps for my birthday, and from one stamp, for reasons singular to my own strange reality tunnel, I wrote an entire essay. How’s that for the importance of physical stuff?