Power overwhelming, or: Velma is lesbian now

Recently, it was announced that Velma – of Mystery Inc. Scooby Doo fame – is canonically lesbian. This came as a surprise to no one who has ever pondered the matter; the reaction could best be described as a subtle mélange of “well, obviously” and “finally”. While the canonical works have, up until now, been somewhat ambiguous on this point, the general opinion among those vaguely familiar with the series has for the longest time been that, yes, she is very much lesbian, no two ways about it. The general intellect has firmly assimilated this fact, even if it is not reflected in the source material.

This is an interesting state of things. Fictional characters only exist as far as they appear in the text – be it on a page, on a screen or in some other medium. They have no objective existence outside of the text; they are pure representation; when the movie ends, that is it. What lingers is the memory of having experienced these representations, and a series of logical inferences that can be drawn from these same memories. Fictional characters in themselves are not this or that; they are fiction, made up, and we can change them at will by writing them differently. And yet, these phantoms of representation can have inescapable features that impose themselves whenever someone mentions a character’s name. Velma is a fictional character, she is not real; she is also, unequivocally, lesbian. The fiction has a material solidity to it.

This interesting state of things has been the source of much confusion over the years. Not least in terms of the death of the author, which began as a dry technical observation on the craft of literary criticism, and then took on a life of its own. We should not understand the death of the author as a radical separation between work and writer; rather, we should understand it in terms of the author not always being the best conversationalist about the works in question. Authors, unlike fictional characters, are actual persons with an actual body and an actual capacity to produce words about the things they wrote. More importantly, authors can only be in one place at a time, unlike fictional characters, who can be in many places at once. More importantly still, fictional characters can be the center of many conversations at a time, while authors can, at best, juggle three or four, given a sufficiently stuffed dinner table. Physical authors are a conversational bottleneck, and fictional characters thrive in conversations. The more of these conversations there are, the better. It is only natural for conversations to outpace their inciting incident.

There have been a great many discussions about Velma, the fictional character. These discussions have mostly trended in the same direction, with more or less explicit sapphic overtones. As the years went by and these discussions faded into the ambient background noise of popular culture, the obviousness of Velma’s orientation became more and more entrenched. It is no longer a point of contention, a matter of debate, a question to raise; it is settled, part of the strange materiality of fiction. It could not be otherwise.

I contend that Scooby Doo has become archontic. A feature of becoming archontic is that the original source material – the movies and series about Scooby Doo, in this case – is placed in a context where it has equal status with other works. Or, indeed, with years’ worth of accumulated conversations. If you were to watch an old episode of the series, it would be filtered through a contemporary understanding of what it entails. The present imposes itself.

A slightly less sapphic example of archontic texts is modded video games, where players have become so used to playing the modded versions that the original is but one possible mode among many. Any given play session could go with one of the mods, or with the unaltered game; it could go either way, both are valid options. But the unmodded version will forever be reinterpreted in light of there being different ways to go about playing; the conversation will have these altered states in mind, and proceed accordingly. There is no turning back, there is no primacy of the original. There are only further conversations.

The original creators of Scooby Doo may or may not have intended for Velma to be an unequivocal sapphic icon. Intent is immaterial, however. Years and years of people talking about Velma as a lesbian have cemented this piece of fictional fact as actual truth, and so it has become an unavoidable matter of course. To paraphrase the archons of StarCraft: the lesbian presence of Velma is a power overwhelming. Making it canonical is not only natural; it is the path of least resistance, given the materiality of fiction.


The fantasy theme of rationality, or the big mood of speedrunning

Speedrunning is an eminently rational practice. In terms of being a rational practice, it has the advantage of wearing its prime rational value on its sleeve. That value is speed, which supersedes any and all other priorities whenever a run is made. A run is whatever happens between pressing “new game” and the end credits rolling; whatever that entails, faster is better.

At this point, some readers might object that completing a video game in the shortest amount of time possible is the opposite of rational, and that nothing good comes out of it. This is a valid criticism to make, but it misses the point, and equivocates on the definition of “rationality”. Being rational means having a goal and taking effective steps towards achieving that goal, measuring and evaluating one’s performance over time. The goal itself may or may not live up to the description of being a virtuous pursuit worthy of striving for, but once the goal is established, whether a course of action is rational or not is fully determined by whether it advances the progress meter towards that goal. Rationality is fully process-oriented, and thus, it can be rational to play video games at high speed.

What makes speedrunning such an eminently rational practice is that it lays bare the shedding of other values in the pursuit of the goal. During the course of a run, the speedrunner actively ignores or bypasses many aspects of a video game that are usually considered vital or important, such as text, story, narrative themes, feats of graphical majesty or works of musical mastery. The goal is to get to the end as fast as possible, and stopping to smell the roses (or behold the game as a work of art) is the opposite of going fast. As the runner finds ways to bypass ever more components of the game, we come to understand how other rational processes likewise come to shed components of a practice that an outside onlooker might view as important. Rationality strips away everything that is not goal-oriented and distills the process to the barest viable minimum. In the context of speedrunning, we have the advantage of knowing explicitly what the goal is. In other social or economic practices, we are unfortunately not always so blessed with clear definitions.

Fantasy themes have the advantage and disadvantage of being one of the most useful analytical tools stuck with the absolute worst name imaginable. A fantasy theme is a critical (as in ‘rhetorical criticism’) attempt to recreate the spirit of what it was like to be there (as in “you had to be there”). Most fiction acts according to this principle – skilled authors know how to write in such a way that when the punchline or climax arrives, the reader is in the perfect frame of mind to appreciate it. The fact that it takes a while to arrive at said frame of mind is a feature, not a bug. When an author or a social situation manages to create a fantasy theme powerful enough that more than one person feels it, the results are pure magic.

Of course, reconstructing these themes after the fact is a difficult thing, and it is even more difficult to write a text that conveys what went through people’s heads in actual social settings. The payoff, however, is immense – decisions that in hindsight are inexplicable or seemingly irrational are suddenly contextualized such that they make sense, or even attain a measure of inevitability. The past is made present, and thus we are reminded that we are all equidistant from eternity. The challenge lies not only in collecting enough material to support the claim that a certain theme permeated a certain moment, but also in avoiding the temptation to go full Herodotus and just invent events that did not happen but fit the narrative. It behooves a critic to remain faithful to the sources.

At the intersection of speedrunning and fantasy themes, we have this 2019 Summer Games Done Quick showcase of a Chrono Trigger run. The fact that it is six hours long is both natural and significant. Natural, in that Chrono Trigger is a very long game which takes a very long time to finish, making the six hours an impressive feat indeed. Significant, in that the duration lets us see the fantasy theme emerge in the situation and how it affects those present. What begins as a rather straightforward showcase of a speedrunner’s rational bag of tricks to avoid getting bogged down in time-consuming minutiae (the hallmark of these kinds of games) gradually morphs into an emotionally intense scene where the only proper response is a prolonged triumphant yawp, which cannot help but infect the audience even now, years later. Not only is it impossible to step into the ending without understanding the series of events that led up to it – to do so would be to miss the whole point. What amazes and amuses me is that we end up with a perfect blend of rationality and fantasy, of solidly goal-oriented measurable accomplishments and a relentlessly intangible case of “you had to be there”. Sometimes, you gotta go slow in order to go fast.

Should you decide to take the plunge, be aware that the video is in fact six hours long, and that you do not want to be in a hurry to do anything or go anywhere upon its completion. The ending is a mood a person has to sit with for a while, and this text is but a nod in appreciation that it exists as an object in the world at all. Most fantasy themes are irrevocably lost to history, but this one – this one we managed to catch. It is perhaps the most anomalous aspect of the whole event.


Nostalgia

The big draw of nostalgia is that the future is foreclosed. All the possible paths and trajectories life could have snaked from that one nostalgic point have collapsed into the one singular timeline we now find ourselves in. All worries have either been borne out or turned into intense nothings. All upcoming exertions have been performed. All feats of bravery, cowardice, romance, ambiguity, clarity, sickness, health, love, hate – it is all in the past now, as surely written in the annals of time as in our memories. The letter has arrived, we have received it, and as we think back on that nostalgic moment, we can appreciate it as a pure immobilized snippet of time where nothing ever changes. For the purposes of nostalgia, time truly is a moving image of eternity.

The attraction of such eternal clarity only heightens in a present characterized by everything but a clear sense of what is happening or where we are heading. The world shifts this way and that, mashing contradictions together every which way, such that making sense of it all seems a fool’s errand at best. Some embrace the tactic of a forward retreat, following every wild goose chase wherever it leads in the hope that it ends on firmer ground than here. Others fortify their current location, making it a bastion as impenetrable physically as it is socially, with little to no regard for how suitable it is for permanent habitation. This leads to an ever-increasing juxtaposition of unstoppable forces and immovable objects, which only serves to increase the overall chaos. In all this, nostalgia tightens its grip, almost to a chokehold. The glorious past, yes. If there is one certainty in this world, that would be it.

The great philosophers, poets and historians all try to tell us that this sense of a solid unmoving past is an illusion, and that everyone was, is and will be equidistant from eternity. The ancestors faced much the same fears we do, albeit about different things; the register of human emotion has only varied so much across time. Our children will face much the same challenges we face now, albeit hopefully aided by the wisdom we pass on. Our peers – our companions in the present – know just about as much as we do about what is happening, albeit with different levels of proficiency when it comes to hiding their fears. Time then was just as it is now, and it is comforting in a way to think that the ancestors too looked back with a nostalgic sheen in their eyes. Our heroes doubted too, beset by the very worries that loom over us now. There is solace to be had in the fact that they at no point were able to sidestep the future. At the moment they performed their great deeds and became our revered heroes, they were scared out of their minds and hoping against hope that it would all work out. The past had not become inevitable yet.

We need to be reminded of this every now and then, lest the closure of the past result in the foreclosing of the future. What is yet to come is still contingent, as is the present, as was the past. Nostalgia aims to lock the past in perfect perpetuity; from there it is only a short step to locking the future as well, into days beyond this one that remain forever perfect. Reject this impulse. Embrace the perpetual now, and push contingency as far as it will go. Build your own ancient traditions. Do it now.


The discipline of sociology

One of the least intuitive aspects of sociology is that it sees everything as real, with a very broad definition of everything. UFOs? Real. The existence of the divine creator of the world? Real. The metaphysics of sports? Real to a fault. Astrology? That’s a mother lode of reality. Everything, it’s all real. All of it.

This is such a counterintuitive claim that most people write the whole endeavor off as soon as they catch wind of it. “Really everything? All of it? That cannot possibly be right,” they think to themselves, only to read on and discover that, yes, all of it, everything, the whole shebang. Anything and everything goes, nothing is out of bounds. Which, unfortunately, can cause folks of a literalist and positivist bent to dismiss the whole discipline as an exercise in lunacy, a waste of time and, frankly, a bunch of gobbledygook unworthy of anyone’s attention.

The key to unlocking this counterintuitive claim is to realize that it is not operating on an ontological level. Sociology makes no claims about the factual existence of UFOs, gods or the predictive properties of stellar phenomena. Sociology does, however, assert the existence of social contexts structured around these entities, which can be studied systematically and scientifically. And, since sociology is the study of social structures and contexts, this is exactly what sociology goes on to study.

This places UFOs, divine beings and horoscopes in a strange ontological position. On the one hand, there is scant empirical evidence for the existence and/or efficacy of any of these things. On the other hand, they evidently operate as organizing principles in the lives of the people who are interested in and engaged with the communities surrounding them. While no UFO has been conclusively spotted, this poses no difficulty whatsoever for UFO spotters, who travel far and wide in their quest to make contact. Nor does it negate the existence of UFO believers. For the purposes of understanding why these people do what they do, we have to posit that UFOs have real social effects, which makes UFOs – you guessed it – real.

Durkheim called these strange objects in the shadowlands of reality “social facts”. Regardless of the ontic facticity of flying saucers, a sociologist has to treat their status as an organizing principle as a thing that factually exists in the world. And, more crucially, the purpose of studying a community of UFO enthusiasts is not to finally nail a photo of those flying buggers; it’s to watch how the social fact plays out in actual physical reality, as an empirical process which (fortunately) can be observed, studied and documented. This is the empirical reality of sociology.

As you might imagine, this has methodological implications. If someone were to barge into these contexts in the name of Science, proudly proclaiming that the organizing principle of said context is pseudoscientific bullshit with less than zero chance of being real, then the response would (quite rationally) be to throw said person out the door and ask them to never show their face again. In methodological terms, this is a critical mission failure; the scientist has failed to secure access to the empirical material. When working with people, you have to get along with them. This means approaching the object of study with a measure of politeness, and coming prepared, having done your homework. The key to methodological success is to know intimately what it means that Mercury is in retrograde.

Given that sociologists study real social contexts, it follows that these very same contexts have no obligation whatsoever to let the agents of sociology roam free and make empirical observations to their heart’s content. These are real people with things to do, timetables to keep, budgets to consider and kids to pick up. Making room for strangers suddenly appearing to ask nosy questions is, for all intents and purposes, well above and beyond. This means that in order to gain access to the relevant social milieus, a sociologist either has to be very patient and play the long game of building trusting relationships, or otherwise pull their weight in some significant way. To be sure, being able to say that you work at the university of [insert name here] opens a great many doors, but it also closes others, and at times it makes no impression whatsoever. There is no one correct way to gain access, but a great many ways to find oneself barred from it.

This raises the question of scientific objectivity. The scientific ideal is a dispassionate observer who observes and analyzes the data according to strictly defined criteria in a rigorous manner. Ideally, it should be utterly irrelevant who interprets the data; if the method is followed, the same results should follow every time. For obvious reasons, this does not work when studying real social settings – a sociologist who has spent years building up trust and mutual respect within a community is not interchangeable with a physicist whose main activity is to juggle variables. Indeed, this very same sociologist is not even interchangeable with another sociologist – the community under study simply does not know the guy. When the measuring instrument is a particular person, things get personal indeed.

On the basis of this, it might be tempting to write off the results of sociological investigations as unscientific. Indeed, if results can only be replicated by one singular person, then that would be quite a blow. However, this state of things is not a deviation from the scientific method, but an application of it. The fact that the methodology includes lengthy, and at times downright anthropological, aspects does not take away from their methodological necessity. It is simply an admission that humans are as humans do, and that if you want to play along you also have to play nice – play being a function of time spent together. There is a sociological method; the fact that it sometimes takes an extended period of time and a counterintuitive amount of effort to go through the steps of said method does not make it less methodical. We could draw a parallel to the investment costs of particle accelerators; the fact that you or I will never be able to build one for purposes of replication does not negate the value of those already in existence.

The alternative to accepting the social sciences would be to relegate human activity to an unknowable black box, whose inside mechanisms are utterly inscrutable and beyond the reach of scientific investigation. Which, to be sure, would make the lives of university administrators that much easier when the perennial round of budget cutbacks comes around. But it would also limit the project of extending human knowledge to a very narrow range of topics. We know, through the application of our powers of reason and intellect, that nothing changes when a goal is scored in a big match. The ball passed an arbitrary line, and that’s it. We also know that the massive cheering and displays of emotion that result from this very same goal are very real things indeed, and that they can be studied as objects in the world. There is a method to it, and at some points in your investigation you might have to drink a celebratory – or commiseratory – pint at the bar, but such are the rules of entry.


Dyson: Analogia

Sometimes when I read something, it connects in my mind to something else I’ve read. In this case, the connexion is made to Zielinski’s Deep Time of the Media. Zielinski explored various technological and epistemic dead ends, and pondered where we would be today had any of those possible lines of development been pursued. The goal of such an archaeological excavation is not merely historical curiosity – although that is always a valid reason to go about things – but also a reframing of the present. Seeing how we are all equidistant from eternity, the various abandoned dead-end media he explored were once (or could have been) the technological cutting edge. Pondering where things could have gone puts the arbitrariness of where they actually went into perspective; the deep time alluded to consists of not taking historical contingency for granted just because we happen to live in the here and now.

Dyson has embarked on a similar project, albeit with ever so slightly less grand epistemic ambitions. Analogia tracks the development of digital technology from Leibniz to the present, and gives us the perspectives of those who happened to be on the other side of it. Not its opponents, per se, but its victims. When the telegraph made its way across the continental United States, it did not bring good news for the Native Americans. These self-same tribes were not passive recipients of history, however, and responded to this news by quite literally hacking the telegraph wires. Cutting a wire gave a tactical advantage in that it severed enemy lines of communication, but the act was not one of opposition to telegraphy specifically; it belonged firmly to the Clausewitzian realm of depriving your opponent of the capability to resist, rather than to Luddite conviction.

This places the emerging communications technology outside the realm of inevitability. It is a rare thing indeed to portray those who happened to be on the other side of technology as strategic actors using the full force of human rationality to oppose said technology. Usually, the story sets up the initial conditions, the requirements for the technology to work, the impediments in place, and what the clever men of applied science did to overcome these impediments. At best, locals are relegated to the status of “impediment”, and then swiftly removed from the equation as the march of Science and Technology continues ever apace. Inevitability had to be brutally enforced, and by pointing towards those who were subjected to the enforcement, the inevitability itself is brought into question.

A reader of the book might not be immediately struck by this line of thought. The text moves from context to context, giving biographical and at times personal insight into the lives of a number of individuals. From Leibniz himself, and his petition to the Russian Tsar to build a very early version of a difference engine; to a trans-Siberian expedition from the very same Russian court to explore the east coast (before subsequently arriving at the west coast), and the various logistical challenges inherent in such an undertaking; to the supreme canoe-making of those who live on the arctic peninsula of Alaska; to the social circumstances surrounding the creation of the atom bomb. In all these individual, peculiar and at all times fascinating accounts, it is easy to lose track of the overall analogy; the meeting of analog and digital gets lost among the proverbial trees, as it were.

This is, perhaps, more of a feature than a bug. The whole endeavor is to not reduce these varied and skillful opponents of the coming digital order to an analogy. Part of looking at the world through the lens of deep time is that every moment is contingent, and every future moment following from any given moment was as uncertain then as ours is to us. This necessitates a certain amount of critical distance from history – the fact that we know how it turned out does not imply it was the only way things could have gone. This mode of thinking is inherently allergic to analogies; it ever so slightly defeats the point to posit that since something happened once, it is bound to happen again in the same manner. Thus, an extensive setting of the stage is required, to fully appreciate the magnitude of the collapse of eternity into one singular timeline.

What Analogia ultimately does is set the stage for our present moment, in all its contingent glory. There is no telling where things will go next, and those who do try to tell you usually do so in the context of a sales pitch. And sales pitches abound about the advent of artificial intelligence, the new digital frontier. It is an inevitable development, a preordained extension of destiny, the next step in human evolution, the next big thing, the done deal. All it requires is that we embrace it in our hearts and unilaterally allow it to happen. A digital manifest destiny, if you will.

As we have seen, inevitability has never been the case when it comes to new technological developments. There is little to suggest it would be the case this time around, and it will undoubtedly be interesting to see who among us will happen to be on the other side of this evitable inevitability. It will also be interesting to see where the new acts of relentlessly analog wire-cutting will take place. The future is not here yet; we still have time to be contingent. If history teaches us anything, it is this.


Rhetoric and the truth claims of documentaries

Rhetoric is the art of organizing words in such a way that they have an effect. At times, this effect can be to achieve a certain outcome, i.e. to persuade someone of a course of action. At other times, it can be to make someone come around to a point of view. Most of the time, it is merely to achieve clarity where there previously was none. Clarity is an underappreciated effect to achieve, and more often than not simply being able to boil down a message to its most basic and easily understood form can rescue a situation from much untold strife. Getting everyone on the same page, even if only so they can disagree more constructively, is a superpower in disguise.

The organizing of words is not confined to writing or speech, however. It does not even begin there. It begins in one’s head, as a method of making sense of things. It is a function of paying attention to how things fit together, and how they can be made to relate to one another. Everything is related to everything else, and every connexion is a potential line of argument. Most relations are tenuous or non-intuitive, and can be safely put aside in search of better ones. Eventually, after having looked at enough possible connexions and discarded enough bad ones, what remains is a handful of ways of organizing the words such that they can be understood – a lot of bad ways to put something, and a few good ones. The key is to select something from the good category.

This organization of thought and narrowing down of possible alternatives to a few good ones is often pitted against a natural, intuitive way of going about figuring out what to say. Rhetoric is the art of lying, it is often said, and a person’s true thoughts and feelings are expressed spontaneously, without prompt or preamble. Indeed, it is maintained, the more one thinks about something the less truthful it becomes. At length, this line of argument ends up saying that the process of choosing one’s words cannot but end up in dishonesty; whatever truth was to be found in spontaneity is lost once the moment has passed, leaving only the lingering specter of doubt. It is a line of thinking as silly as it is widespread.

The thing of it is, it is seldom clear what the most effective, truthful or proper way of saying something is. If someone asks for directions to a place, is it best to describe the route by means of street names, landmarks or a number of left/right turns? All three options lead to the destination, and it is unclear whether one is more truthful than the others. The criterion of spontaneous honesty does not give us any guidance as to which to choose, and would only serve to induce anxiety in those who are not able to make a choice right there on the spot. If we allow ourselves to ponder whether the asker is familiar with the local street names, can remember an abstract list of left/right turns, or would be best served by the most concrete visual input the city has to offer – then we are in a better position to pick the most appropriate option.

From this, we can gather that rhetoric is the art of figuring out what is useful to say. What flows from the heart is not always the most informative of thoughts, and saying it reflexively might at times lead to confusion. By organizing one’s thoughts in relation to the overall situation, it becomes possible to be a more constructive participant in the conversation. It streamlines the process, as it were.

This brings us to documentaries, which – much like spontaneous honesty – are assumed to be bearers of unmediated truth. If something is presented in the form of a documentary, then it is generally assumed to be the true story. This follows both from convention, and from the fact that many generations of filmmaking have established the genre as an effective vehicle for conveying information. Form and function go together, leading to a very persuasive mode of presentation.

However, just as in the above example of asking for directions, there are always multiple ways of arriving at a certain truth. Documentaries do not spring fully formed out of nothing; they are structured around an organizing principle which puts things in relation to one another. The organizing principle was chosen at some point during production, and then executed in the form of the finished product. Whatever the choice, there were always other ways of presenting the information that were not chosen. And, conversely, had one of those other options been picked, the first would have gone unchosen. Neither of them is false, but neither of them is the whole truth; they are, both and equally, choices. (Here, Booth would make an amused aside about the need for an infinite number of documentaries, to cover all bases.)

This places us in a tricky spot when it comes to evaluating the truth claims of documentaries. Or, rather, it makes the truth somewhat orthogonal to the choice of organizing principle. Any given choice is bound to have advantages over another, with corresponding disadvantages. The difference is one of emphasis rather than of veracity, which is an important difference, but – and this is the point of this text – we cannot arrive at a constructive criticism of the significance of this difference if we talk about it in terms of true and/or false. Truth is not the issue; the mode of presentation is.

The purpose of rhetoric is to organize words so as to arrive at clarity. The same goes, mutatis mutandis, for documentaries. As a critical reader and/or viewer, your task is to evaluate the organizing principle to see where it places its emphasis. Should you find that there is an alternate organizing principle, then you are richer for knowing it, and can proceed to compare and contrast until interesting nuggets of insight emerge. These nuggets will, no doubt, be useful in your upcoming attempts to find something useful to say.


Byung-Chul Han: the burnout society

The translation of Byung-Chul Han’s small book the Burnout Society is unfortunate. For one, it immediately places the work in the discourse of the burnout, which connotes all sorts of self-help, positive-thinking, bear-the-burden-alone nonsense that is ever so perpendicular to what the book is about. For another, it is wrong. The original title, Müdigkeitsgesellschaft, is more accurately translated as “the tiredness society” or “society of the tired”. The actual medical condition of being burned out is an aspect of this tiredness, but it is not the main focus of the book. Which, unfortunately, means that the point of the closing reflection on what it means to be tired together tends to get lost on English readers. It comes as a surprise, rather than as a fitting conclusion.

As these words are written, the importance of quarantining oneself against the Covid-19 virus is entering into public consciousness. This is an interesting point in time, since many latent patterns of thought are directed at and applied to the upcoming quarantine situation. The common sense interpretation of how to deal with the new situation is emerging, and thus we get an unusually clear picture of the common sense interpretation of how to deal with the old one. For a brief moment in time, we can see the change in action, and contrast what’s new with what’s old.

One common reaction to the quarantine is to say “gosh, I’m gonna get so much done! This sure is going to be a very productive time!”. A quarantine is seen as a temporary reprieve from the restraints that prevent the full force of creative output from being unleashed into the world, and thus as a potential time of unparalleled getting shit done. Unread books, hobby projects, writing ideas, gardening feats, culinary experiments – whatever it is, now is the time for getting it done. The pent-up creative energy will flow with wild abandon, ushering in a new era of unprecedented personal productivity.

Byung-Chul Han contends that the last few decades have seen a shift from what he calls negative production to positive production. The former is a process of standardization and error elimination, whose goal at all times is to remove flaws in order to maximize efficiency. These flaws can either be technical, in the sense that the productive machine is not optimally configured, or social, in the sense that abnormal elements of society have to be removed or repressed so that production can continue without interruption. Deviant forms of life, queer sexualities or non-conformist ideologies are examples of such abnormalities. The goal of negative production is to make each part of the production process standardized and interchangeable, including the human components. A worker is a worker, and workers do as they are told.

Positive production, in contrast, relies not on standardized units of production doing what they are told. Instead, these very same units have internalized the imperative to be productive to such an extent that they tell themselves what to do. The specifics vary from person to person, but the imperative to Produce remains a constant. At work, this expresses itself as an ever increased effort to attain maximum productivity, to be the utmost exemplar of whatever work is performed in all aspects. At home, it expresses itself as a nagging sense that one should do something productive. Academics have a name for this nagging sensation: “I should be writing”. The same goes for any other productive endeavor: I should be reading, painting, remixing, organizing, meditating. Whatever the activity, the same nagging sensation arises that it should indeed be done.

The upcoming Covid-19 quarantine, a period of time within which a person is specifically obligated to stay at home, is a good way to see the two mentalities of production in action. For those working under the paradigm of negative production, this would be a period to unwind – to be themselves, to sleep in, to not give a darn about the Man. Those laboring under the new paradigm, however, have to get themselves ready to (as paradoxical as it might seem) get to work. The imperative is still there, and it is even stronger for there not being anything else to do. The quarantine is a great opportunity and an even greater obligation.

Based on this, we should be able to predict that a substantial number of people will end up more tired at the end of the quarantine than at the beginning of it. The amount of actual work accomplished during the quarantine is beside the point; the tiredness is not a result of sustained effort, but of constantly feeling that the Work should be performed. Positive production allows no time for rest, only for more of itself, more production. Instead of the quarantine being two weeks of rest, relaxation and recovery, it will be two weeks of constant anxiety over not being sufficiently productive during our allotted time. At the end of it, tiredness and exhaustion will be the words of the day.

The point of this book is not to argue that we should go back to negative forms of production. This is a book of philosophy, after all, which means the point is to get us to ask ourselves if this really is how we want to spend our lives (in and out of quarantine). Like all good works of philosophy, it does not lead to a clear-cut, easy-to-implement answer, and leaves readers with more questions than they previously had. I say we acknowledge our tiredness, and then proceed to be as unproductive as we need to be.


Frye: Anatomy of criticism

Northrop Frye’s Anatomy of Criticism was published in 1957. This is not only a bibliographically necessary nugget of information for when you want to compile a list of works cited, but also an important touchstone when reading the book. A great many things have happened since 1957, and it is interesting to waltz through the realms of criticism as they were back then with the knowledge of how it all turned out later. Armed with knowledge of the future, we return to this writ as if on a tour of what might have been. The present back then was contingent in a way that our present is not.

To take an example: at length, Frye gestures to the emerging trend of replacing criticism proper with the act of producing ranked lists, and the inherent methodological problems of such an approach. For one, merely ranking things is not a critical act; it is merely the application of a more or less explicitly defined set of criteria on a limited set of objects. Going through the motions of such a procedure does not increase our understanding of the works in question, or even of why they were included in the ranking process to begin with. With the modern phenomenon of listicles firmly in mind, we can look back on these musings and nod in extended agreement. Not to mention the trend of modern mission statements to abandon grammatical structure in favor of disjointed yet prominently displayed keywords, where even the pretense of an overarching organizational principle has been abstracted out of the picture.

To take another example: while delineating the different roles of critics and authors, Frye makes a joking aside that Dante, who proclaimed that a certain poem was the best he had ever written, was in so doing an indifferent critic of Dante, and that others had gone on to write better critiques of said poem. Little did Frye know that a mere decade later, the whole death of the author hubbub would flare up in earnest when Barthes kicked the hornets’ nest. And then kept it going for quite a spell.

A funny third example is Frye pointing out that there were no standard introductory books on criticism. He then goes on to speculate about what the first page of such a book might say, pondering that perhaps it might modestly begin with the question “what is literature?”. Then, he ventures that the second page might expand on this question, in terms of verse and rhythm, and that subsequent chapters would then deal with the complexities of genre and other nebulous yet necessary literary terms. Terms which, although recognizable in action and principle, seem strangely resistant to being theoretically explicated.

It is funny in the sense that there are now several such books, which do not necessarily agree with each other on the finer points of what criticism actually is. It is also funny in the additional sense of a reader being able to go to their local university library, scour the shelves for every book that looks vaguely introductory to the enterprise of literary criticism, and empirically investigate how Frye’s prediction turned out. So I went and did just that. Here follows, in the order in which they stacked up next to me after I had done the aforementioned scouring, the results.

First out is Persson (2007), titled Varför läsa litteratur? (Why read literature?). An introductory title if there ever was one. The book begins by mentioning that merely asking this question is seen as blasphemous in certain contexts, almost taboo. Persson then continues to outline how, in practice, this very question has had a variety of responses throughout the ages, relating to the building of such things as character, nationhood and a (well-read) democratic citizenry. He then gestures towards the contemporary trend within organizations to demand a justification (a stronger word than an explanation) for everything that happens within them. Thus, being prepared with answers with slightly more rhetorical and conceptual bite than “it’s a traditional value held throughout literally all of recorded human history (more often than not constituting said history)” is a modern virtue.

Next up is Barry (2009), with Beginning Theory. It opens with the observation that the “moment of theory” has passed, and that we now find ourselves in the “hour of theory” – the enthusiastic fervor with which theory was introduced has been replaced with the slightly less enthusiastic aftermath, in which we can look back upon what has gone before and calmly set to work organizing and cataloging it. Theory, literary theory included, has become a day-to-day business, and thus it needs standardized books like this one so everyone in said business is, as far as such things are possible, on the same page.

Observant readers will note that “criticism” seems to have been replaced with “theory”. Just theory in general, with “literary“ added on as a reminder that books are somehow involved. Culler (1997) picks up this theme on the first page of Literary Theory, where he differentiates between capital-T Theory in general and literary theory in particular, and then goes on to discuss how the two have been so thoroughly intertwined over the last decades that keeping them separate is a fool’s errand. Non-literary theory (broadly defined) has had an impact on how literature is written, which has then affected how criticism of said literature has taken form, which in turn has influenced literary theory; to fully understand it all, a modern reader has to know a little about every step of this series of events. Basically, a critic also needs to be a theorist, in order to understand the books they claim to critique.

Franzén (2015), in Grundbok i litteraturvetenskap (Introduction to literary studies), takes a slightly more analytic approach, and defines theory in the scientific sense of being a comprehensive set of ideas relating to something; the ‘something’, in this case, is literature in its many forms. Franzén notes that there has been a move from writing about literature in a normative sense – i.e. how it should be – to writing about it in a descriptive sense – how it actually does what it does. The book then proceeds to outline a number of themes in this straightforward manner.

Eagleton (1996) opens up Literary Theory with the striking formulation “[i]f there is such a thing as literary theory, then it would seem obvious that there is something called literature which it is the theory of”. After this opening salvo, Eagleton takes a closer look at what the category of “literature” includes (e.g. the Iliad) and what it, more importantly, does not include (comic books), and how this selective applicability affects the theory which claims to be about those things included. What is literature indeed.

Peck and Coyle (2002) introduce Literary Terms and Criticism with the assertion that “literary criticism is primarily concerned with discussing individual works of literature”. The authors then immediately clarify that aspects slightly less particular to an individual work, such as its genre or its historical context, also play into the process of criticism. The tension between books always being singular, unique and one of a kind, and also very possible to group together with other similar monads, is as yet one of the unresolved questions of theory, literary or otherwise.

Next up is Norton’s monumental tome the Norton Anthology of Theory and Criticism (2010), which features 2758 large pages of small print, covering just about every aspect of theory and/or criticism there is. It starts off by proclaiming that there are those who claim to be anti-theory, who hold the position that all this circumlocution is a mere distraction from the real work of getting it done. Slyly, the anthology then points out that this in itself is a theoretical position, whose assumptions can be critically examined and thus better understood. Not said, but heavily implied, is that the following thousands of pages might be of some use in this critical endeavor.

Finally – it was a big stack, dear reader – Bennett and Royle (2009) begin their An Introduction to Literature, Criticism and Theory by posing the rhetorical question: when will we have begun? From this provocation, the authors then set out to problematize the beginning of a text. Do early drafts count, or shall we limit ourselves to the finished publication? What about marginal notes, commentary, public reception or influential works of criticism? When, indeed, can we with confidence proclaim that we have read and understood enough to finally get on with doing either literature, criticism or theory?

It is tempting to say that Frye is still correct in his assertion that there is no standard introductory work on criticism. The prevalence of many introductory works, plural, only serves to underline this point, albeit probably not in the spirit with which Frye made it. But I reckon it would be more fruitful to say that there is indeed a standard of introductory works, and that what unites them is an unwillingness to once and for all proclaim what literature (and the criticism of it) actually is. Literature is at once both the baseline of human expression (in its many forms), and the gradual expansion of the possibilities of human expression. We all agree that there is such a thing as literature, and then immediately start to argue about the finer points beyond this first principle. Establishing a firm definition of what literature is invites future authors to blur the line by new and creative literary feats, and criticism must always – lagging behind as it is – try to keep up with whatever tools it can get its hands on, theoretical or otherwise. Which is indeed a hopeful thought to take into an uncertain future. It certainly makes the present ever so slightly more contingent.

Works cited

Barry, P. (2009). Beginning Theory: an Introduction to Literary and Cultural Theory. Manchester: Manchester University Press.

Bennett, A., & Royle, N. (2009). An Introduction to Literature, Criticism and Theory. Harlow: Pearson Longman.

Culler, J. (1997). Literary Theory: a Very Short Introduction. Oxford: Oxford University Press.

Eagleton, T. (1996). Literary Theory: an Introduction. Cambridge: Blackwell.

Franzén, C. (2015). Grundbok i litteraturvetenskap: historia, praktik och teori. Lund: Studentlitteratur.

Frye, N. (1957). Anatomy of Criticism: Four Essays. Princeton: Princeton University Press.

Leitch, V. (ed). (2010). The Norton Anthology of Theory and Criticism. New York: W. W. Norton & Co.

Peck, J., & Coyle, M. (2002). Literary Terms and Criticism. Basingstoke: Palgrave.

Persson, M. (2007). Varför läsa litteratur?: om litteraturundervisningen efter den kulturella vändningen. Lund: Studentlitteratur.


The I Ching

The I Ching – the book of changes – is a strange thing. It is, all at once, a divinatory practice, a meditative technique, a highly significant cultural document and a vocabulary. All crammed into a very small package, most of which – for western readers – will consist of contextual information, clarifications and useful forewords. The actual text is a mixture of commentary, general life advice and technical documentation, all intertwined. Those looking for a straightforward read will be highly disappointed.

In technical terms, the I Ching is a six-bit binary system with 64 different states. As with a computer binary, each bit can either be 0 or 1, yin or yang. Depending on which six bits are given by the divinatory process, the resulting sign can give very different interpretations of the situation you find yourself in. The sequence 000111 gives you the sign Stagnation, a very clear indication that the situation is hopeless and that nothing good can come out of persisting; the general advice is to leave as quickly as humanly possible. This can be contrasted with the at first glance seemingly opposite 111010, the sign for Calm Anticipation, which advises that a great or dangerous moment is imminent, yet that the time to act is not quite here; the general advice is to wait energetically. Two very different moods to find oneself in, yet very compactly conveyed through the use of merely six lines.
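For the programmatically inclined, the encoding is easy to sketch. What follows is a minimal illustration in Python, not a divination tool; the bottom-line-first bit order and the English sign names are taken from the usage above, and both vary between translations.

    from itertools import product

    YIN, YANG = 0, 1

    # Every hexagram is one of the 2**6 = 64 possible six-line sequences,
    # read here from the bottom line up.
    all_hexagrams = list(product((YIN, YANG), repeat=6))
    assert len(all_hexagrams) == 64

    # The two signs discussed above, keyed by their line sequences.
    NAMED_SIGNS = {
        (0, 0, 0, 1, 1, 1): "Stagnation",
        (1, 1, 1, 0, 1, 0): "Calm Anticipation",
    }

    def sign_name(lines):
        """Return the name of a six-line sequence, if this sketch knows it."""
        return NAMED_SIGNS.get(tuple(lines), "(unnamed in this sketch)")

    print(sign_name((0, 0, 0, 1, 1, 1)))  # Stagnation
    print(sign_name((1, 1, 1, 0, 1, 0)))  # Calm Anticipation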

This efficiency is ever so slightly opaque to those who do not know the signs. It is also a remarkable achievement. It manages to place wildly disparate life experiences into the same framework, and thus allows for comparisons between different situations and the appropriate courses of action for each such situation. When under the sign of Stagnation, the only possible way forward is to just drop everything and get out, since nothing can be salvaged. When under the sign of Calm Anticipation, however, the opposite is true – the winning move is to keep your eyes firmly on what’s ahead and stick to the plan. The wisdom imparted by comparing these two signs is that these are two possible life situations to find oneself in, and that being able to tell which applies to the current moment is crucial to getting ahead.

As you might imagine, there are a great number of possible comparisons to make with 64 available signs. To make things even more interesting, each sign is subdivided into six subvariations depending on which line gets emphasized in the divinatory process. Take 111010 as an example. Emphasis on the first line indicates that the danger is far away still, and that the best way to prepare is to live in such a way that the appropriate virtues are firmly in place when it finally does arrive. This can be contrasted with the fourth line, which indicates that the danger is already clear and present, and that the proper move is to not make things worse in a blind panic, but to calmly hold fast. Both indicate that things will get better once the approaching danger can be overcome, but that overcoming this danger is a function of the actions taken in the calm moments of preparation.

Math enthusiasts will quickly figure out that the sum of these subvariations is 384, a respectable number of possible life situations. When I earlier called the I Ching a vocabulary, this is very much what I meant; being able to systematically distinguish between such a large number of possible situations (and the prudent courses of action for each) is a whole dedicated skill in itself. Being able to talk with confidence about the subtle differences between the different signs and their subvariations is yet another skill, one which may very easily be (as the character of Chidi in the TV series The Good Place so eloquently exemplifies) mistaken for wisdom. It is the allure of what is signified through 101001, Effortless Grace, which ever so slyly emphasizes the former over the latter.
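The arithmetic is easy to verify by extending the sketch above, counting one emphasized line per sign, as the paragraph before describes:

    from itertools import product

    # Each of the 64 hexagrams, paired with one of its six possible
    # emphasized lines, yields the 384 subvariations mentioned above.
    hexagrams = list(product((0, 1), repeat=6))
    readings = [(lines, emphasized)
                for lines in hexagrams
                for emphasized in range(1, 7)]  # counted from the bottom line
    assert len(readings) == 64 * 6 == 384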

The great number of variations points towards one of the inherent paradoxes of the I Ching system. On the one hand, the sheer volume indicates that just about everything ought to be covered in there somewhere. On the other hand, any student of creative writing will surely be able to think up more than six variations for each sign, once they have gotten the general gist of what it is about. Indeed, anyone with sufficient life experience will be able to recall that one time when the sign itself was applicable, but none of the variants really fit. The world is greater than the attempt to systematically categorize it.

This paradox is not a bug, however. It is a feature. Once someone has gotten so used to the signs and variations that they are able to identify the blind spots of the system, they have mastered a vocabulary of situations, remedies and moods so vast as to be able to conceptualize just about anything they stumble upon. If a peculiar situation does not fit into the system, then that too is useful information, and indicates that there is something there that warrants thinking more intently about.

Thinking intently is one of the things the I Ching encourages its practitioners to do. Going through the motions of a divinatory session takes anything from 30 to 90 minutes, during which it is advisable to keep out all distractions. Not only because it is easy to lose count whilst going through said motions, but also because the sheer act of sitting still with the problem firmly in mind is itself a kind of thinking. As Jung almost phrased it, the hands are busy whilst the mind is giving space to consciously and unconsciously process the situation. Once the answer is given and a sign appears, the practitioner is more than ready to see how it applies to the present circumstance, in extensive detail.

The I Ching is a peculiar text, a discursive anomaly. It is, I dare say, a small book of big moods.


Dark Souls (2011)

Dark Souls is in many ways the prototypical video game. When you first boot it up, there is a grand cinematic explaining the scope and breadth of the narrative universe – there is a god of lightning, a lord of death, a fire witch, a dragon, an epic battle! It’s all very dramatic and cinematic, and then

New Game

The player character is in a dungeon for some reason, and an unknown NPC throws down a key so as to make a timely escape possible. What follows is a period of getting used to the controls, possibly dying once or twice (the big boulder is a contender for this outcome), and an indirect lesson that sometimes you are not ready to fight the big demons quite yet. The broken sword you begin with might be thematically proper, but something more pointy is required for actual combat. Thus players are introduced to the concepts of switching to appropriate gear and running past enemies, as need be.

When looking at gameplay after this point, what is striking is how much of it conforms to the image kids have of video games. The player character is a dude (or dudette) with a sword, who fights generic enemies (whose individuality can be safely ignored) and bosses (whose uniqueness makes their backstories as interesting as their fighting techniques). All this in a setting steeped in backstory, lore and hidden secrets, which can be uncovered by players enthusiastic and determined enough to give it a go.

In other words, it is very much like when we were young and played early NES games. The graphics were pixelated to perfection, and the physical cartridges the games came on barely fit enough data to convey any narrative outside the mechanics. Each and every pixelated enemy had a name, a backstory and a place in the universe. And, more importantly, an entry in the manual that came in the box – lovingly crafted to ensure the differentiation of one colored set of pixels from an identical albeit differently colored set of pixels. The Goombas and Bullet Bills had canonical names, and all the implied narrative infrastructure that comes from having a name.

In those archaic pre-internet days, this narrative infrastructure turned into local myths and legends. Part of it came from simply informing everyone involved about the facts – given time and enough double-checking of the manual, soon enough the Bowsers and the Lakitus were known entities. An even bigger part came from the telling and retelling of ideas of how the implied, never shown but carefully named, kingdoms or future settings had to be organized. The world of Super Mario had a princess and a whole series of monarchs being turned into various creatures, establishing that the mushroom kingdom was indeed a magical kingdom. The world of Mega Man implied a whole host of futuristic machines subverted to the twisted ways of Dr Wily, and so a setting could be imagined around that. And so on and so forth.

Given the lack of available textual information (the manual was only so large, and the cartridge could only contain so many bits), there was plenty of room for imagination and extrapolation. Indeed, even speculation. Many a friend group had informal theories of what may or may not have transpired – I dare not call them fan theories, lest the gamers grow restless – some of which are still remembered fondly to this day. These theories served as a springboard and expression for young imaginative minds, and as an informal social glue in a time when such things were rare indeed. If you ever get the chance, do probe someone about their childhood imaginings of these virtual worlds. There is more there than might meet the eye.

When I say that Dark Souls is a prototypical video game, I mean that it harkens back to this earlier era of mythological expansion and exegesis. An enemy is not just an enemy – they have names and backstories. The bosses are not just slightly tougher enemies – they have intricate relationships with each other and the world they find themselves in. The world is not just something put in place by virtue of the necessity of having to render something on the screen to make the gameplay look appealing – everything is significant, every detail conveys important information, every aspect contributes to the overall story. There is more backstory to be uncovered, and more importantly, more stories to be told. Dark Souls is very good at bringing out the forensic storytellers inside its players.

With the advent of the internet, the social space of this storytelling has shifted from the geographically available friend group to a more global setting. The Dark Souls portion of YouTube has viewerships in the millions, with cooperating and competing exegetes comparing notes. The drive to tell, retell and refine the stories found implied in the games – always implied, just at one remove – is still there, burning like a great bonfire. Or, more accurately, like many small bonfires scattered across the lands.

There are those who speak of Dark Souls only in terms of difficulty, as some great obstacle to be overcome by those worthy enough. While I do acknowledge that this, too, is part of the myth building that eventually leads to storytelling, and that there are parallels to the whole Nintendo Hard thing, I must say that such simplistic takes miss the point. If difficulty is your only point of reference for talking about the game, then I am sad to inform you that you have officially failed at Dark Souls.

Take heart, however, for there is always an opportunity to play again. The age of fire is still with us, for a brief time longer. A new game awaits, and new stories. Tell them well.
