The presidency of Donald Trump

Paul Virilio sees accidents as things created at the same moment as the inventions they happen to. The invention of the train was also the invention of the train accident – the one cannot exist without the other. The only way to completely eliminate the risk of accidents is to stop using the inventions that give rise to them; as long as the trains keep rolling, the accident looms as an eternal possibility, just one routine mishap away.

Of course, accidents for Virilio are not only spectacular local events that happen once in a while (albeit with oh so many photo opportunities). They are also slow, gradual events that take place over long periods of time – and, in the case of railways, over large distances. As the number of trains and railways expanded historically, so did the number of local accidents. But the expansion also brought with it more subtle systemic accidents, which even to this day are so subtle as to go unnoticed unless someone points them out.

Railway systems need stations in order to work. Passengers need to be able to board the trains at some point, and they also need someplace to disembark at journey’s end. This fact is trivial in and of itself, but it needs to be mentioned in order to draw a distinction: there are places with train stations, and places without. The social, economic and geographic implications of this distinction are the slow accidents of the railway system.

That a certain place has a train station is no insignificant fact. It means that this place is connected to other places with train stations, and that these places are thus linked more closely than they otherwise would be. Being a node on the railway network brings with it all the advantages of being connected to the other nodes – people, goods and other things of importance can traverse the distance between here and there with relative ease. This means that there will be more of these things moving about, by sheer virtue of access – especially in economic matters, where the profits of setting things in motion perpetuate that motion for as long as they can.

Conversely – accidentally – places that are off the railway grid suffer, as social and economic activity gathers around the network nodes. This is especially true in more rural areas, where whole regions are depopulated as the process of urbanization keeps moving forward. To be sure, this was not intended by the inventors of the train (or of the various machines, large and small, that make up the railway network), but nevertheless it is the accident produced by the invention working as promised. Even when there are no train crashes, the accidents of the railway system take place on a routine basis.

Accidents, then, are the unintended consequences of things that work just as they are intended to do. Accidents do not happen despite efforts to prevent them, but as a side effect of business as usual. They are, to use a common expression, the cost of doing business.

This metaphorical use of the notion of accidents can of course be extended to things other than trains. Virilio, for his part, applies the metaphor to most parts of society as we know it (such as war, cinema and aesthetics). But for the purposes of this post, we are going to apply it to one particular discursive anomaly:

The President of the United States, Donald Trump.

It is tempting to view it as an anomaly proper. Something that according to all known rules and predictive methods should not happen, yet which happened anyway. Something so out of left field that it leaves scientists baffled and pundits grasping at straws in order to fill the airtime they are paid to fill (if ever so vacuously). It is tempting, but such an approach would not lead us forward. Especially not if we, after having had months to digest the news, still manage to return to bewilderment time and again. The paradigm of the anomaly simply will not cut it.

If we view it as an accident, however, a different picture emerges. Even more so if we view it as an aggregate of accidents, where all the many moving parts are doing more or less what they are supposed to be doing, but the net result is the state of things as we know them. We did not end up with the status quo despite the best efforts of all involved to avoid it, but because of an overall institutional configuration that made such an accident a very distinct possibility. Trump was not a result of everything spectacularly backfiring all at once, but rather an unintended result of everything doing exactly what it was meant to do – the accident inherent in the normal operations of business as usual.


Dead Can Dance

Dead Can Dance lyrics have an underestimated use case. Whenever someone asks you a mundane question about your everyday life – out of curiosity, or out of some ambition to commit small talk – it is very worthwhile to replace whatever answer you might have with a line from a Dead Can Dance song. Whatever line of idle inquiry your interlocutor might have attempted, the conversation will from then on be brutally reframed into a very different topic indeed. For instance, the question “how was your day?” is favorably answered with the lines:

For time has imprisoned us
In the order of our years
In the discipline of our ways
And in the passing of momentary stillness

These words express, semantically speaking, the same meaning as the phrase “nothing much”. Nothing in particular happened, same old same old, things set in motion long ago are still predictably in motion. However, the act of circumscribing the status quo with more words than are strictly necessary alters the social situation you are in. No longer are you in a ritualistic and predictable situation of talking small – you have elevated the situation to a full-blown rhetorical situation.

A rhetorical situation is a situation where the outcome is not defined beforehand. Most situations in life are for all intents and purposes predetermined. There are rules and rituals that can be followed and performed in order to move things along. When ordering food at a restaurant, paying for groceries at the store or performing routine errands of an everyday nature, you do not really have to put very much of yourself into it. The question of whether or not you want fries with that is at once both empty words and sacred ritual – the fact that it has been asked means that everything is proceeding along predictable lines. The situation is resolved, no further discourse is required. Everyone knows what to do.

You do not have to think of something to say.

Of course, throwing a random Dead Can Dance stanza into a situation makes everything uncertain. It defies all expectations, and thus interpretation must be brought to bear. Who is this fool who speaks with too many words and too little straightforward clarity? How to respond to them? What even is going on?

As a rhetorical strategy, this has the distinct disadvantage of a high likelihood of backfiring. Being thrown into uncertainty is not a pleasant experience, and might provoke anger from whoever is at the receiving end. Caution and discretion are advised – there is a time and a place for all things, and fortunate are those who know where these might be.

But as a discursive anomaly, it tells us something about the limitations of interpersonal communication. One can only stray so far from the expected before things start to break down into uncertainty. There are rules, genres and traditions which must be respected, above and beyond any purely utilitarian aspects of a situation. The range of useful things to say is far narrower than that of things possible to say, and the response to straying outside of discursive usefulness often comes in the form of punishment.

Like Prometheus we are bound
Chained to this rock
Of a brave new world
Our god forsaken lot

A common theme among those I have spoken to about Dead Can Dance is an appreciation of being reminded of an older world. A world that used to be around, but whose remnants are hard to come by. The frequent allusions to various names – Prometheus, for one – harken back to a time when you were expected to simply know these things. You were supposed to know of the gods, the mysteries and the possibilities of eventually encountering them. It was a different time, of prophets, alchemists, seers, secret societies, inspired poets and broken souls. Ancient names, ancient sins, ancient memories – it is rare to be reminded of such things.

Things didn’t always use to be the way they are. Things could be different.

To be sure, most of the things alluded to are myths, and most likely weren’t around in the past thus mythologized. But that is beside the point. Being reminded of Xavier’s sins is not a matter of recalling the facts of something that happened in historical time, but a reminder of the possibility of acting in a world with a sense of purpose far beyond the ambition of scoring an extra percentage point on the quarterly report. The madman thought he could cure humanity, and was struck down for his sins. Oh, to be moved by such ambitions!

In a world where language is ever more seen as a purely utilitarian tool – few are those who suffer poets – there is ever a need for reminders that the gap between useful and possible things to say is larger than it ought to be. Not every utterance needs to be useful. And, ironically, sometimes the most useful statement is also the least utilitarian. Stating something that ever so indirectly reminds those present that the present is not all that can be – now there is a useful statement indeed. Even if it is about the sagacious Solomon.

The question of whether or not you want fries with that may be the prevailing sacred chant of the day, but there used to be others. Better ones. We can remember them, if encouraged. Better yet – with sufficient audacity, we can write new ones.

Now there is an ambition to be moved by.


Lipsitz: Popular culture

As a great many great persons have said, reading the news will not inform you about what goes on in the world. Indeed, you become better informed about the world by not reading the news. Not because the news does not report on recent happenings and goings-on, but because the news in and of itself does not contain the keys needed to decode the significance encoded within it. It is a paradoxical but true thing: by turning away from the news and engaging with the world, the news suddenly becomes that much more significant.

Careful readers will note the use of the word “read”, rather than “watch”. Reading the news and watching the news are two radically different things, especially these days. But they both contain the same element of internal incomprehensibility – no matter how much you read or watch, the tools for comprehending what is going on must be brought in from somewhere else. To be an efficient consumer of news, you must first consume books and other pieces of media. If content is king, context is emperor.

Careful watchers will note that this is something of a fractal pattern. Popular culture (of which the news is ever more becoming a part) does not in and of itself contain the keys and tools needed to decipher the significance in and of popular culture. Taken on its own terms, popular culture is an autonomous, isolated sphere of knowledge whose capacity to engage and encourage emotional responses is just that – emotional responses. Unless outside knowledge is brought to bear, a splatter movie is just a splatter movie.

Lipsitz describes this dialectic between popular culture and external knowledge as an endemic background noise of modernity. On the one hand, most people only ever encounter important topics of history through popular culture. On the other hand, popular culture is only ever meaningful (outside the thrill of special effects) through the application of external knowledge. The relevant question is not whether the one or the other is better or more important – the relevant question is what to make of this dialectic.

The fact that popular culture is consumed means we have to look at the one doing the consuming – the consumer. The context of consumer culture and the capitalist systems that make consumers possible is also the context we find ourselves in. In a truly fractal fashion, we find ourselves yet again looking at the same dynamic. Albeit with the additional question of just who the “we” is in this context.

A consumer approaches popular culture the same way a cinema-goer approaches a movie. Alone, in the darkness, in possession of a ticket that allows entrance to a particular viewing, perhaps also with popcorn and soda bought on the premises. The conditions for entry are determined by economic circumstances, and barring subversive acts of access, only paying customers are allowed in.

Conversely, popular culture is also produced with the paying audience in mind. Made to appeal to the lone consumer subject looking on from the anonymous darkness of the cinema, popular culture talks to individuals, one at a time. The fact that these individuals are part of target demographics does not automatically mean that they also talk to each other or share an understanding of what they consume. Shared appreciation is not the same thing as a shared frame of reference. Individual consumption is always just that – individual.

The challenge and possibility inherent in this state of things is the formulation of a common “we”. A subject position able to bring the needed external knowledge to bear, giving life and meaning to popular culture – and, indeed, to the news of the day. Whether it be in the form of fandoms (a very distinct approach to popular culture), social movements (black lives do indeed matter) or academic disciplines, there is space aplenty for creating and organizing the bodies of knowledge required to make sense of things.

Lipsitz warns us of the temptation to treat popular culture as a distraction. The title – this ain’t no sideshow – indicates what is at stake. While it is tempting to scoff at popular culture and its commercial, flawed shallowness (especially those aspects that are watched rather than read), it is also the only pool of shared symbols that can reliably be drawn upon. When speaking to those close to you, you can draw upon some pre-existing shared body of knowledge in your discussions and deliberations. When speaking to those you do not know, some common ground must be sought out. This common ground, despite being produced for a mass market of individual consumers, is more often than not popular culture.

Again, the relevant question is what to make of the dialectic between actually existing popular culture and any given body of external knowledge. Making sure that you as an individual are acting on the basis of the best possible knowledge is a good place to start. Making efforts to share this knowledge with others in an interesting and pedagogical way is a good way to continue. But the efforts of an individual can only accomplish so much; there comes a time for organizing into something greater. Individuals are fragile things, who can disappear faster than they ought to, but points of view have staying power across generations.

Watching the news will not further your understanding of what happens in the world. Gorging yourself on popular culture will only take you so far in improving your understanding of the world and your place in it. Hitting the books will be far more efficient in this regard. Returning to the news or the newest movie after having done your homework will reward you by reflecting the insights you already possess. Your mind can only ever reap what it has sown, and a well-prepared mind will find itself in perpetual abundance.

The thing to do, then, might be to talk to those who watch the same things you do. Impart to them some frame of reference which will enable them to make sense of what you both see, and then discuss your findings afterwards. Transcend individual consumption of ever so commercial culture, and make truth of these radical words of resistance:

You are not alone.


Hašek: the good soldier Švejk

Any piece of writing about The Good Soldier Švejk is bound to discuss one particular question: is Švejk a sly fox who is able to manipulate social circumstances to his benefit, or is he an utter idiot whose lack of grasp on basic reality causes the powers that be to malfunction in spectacular ways?

The beauty of Hašek’s writing is that it keeps both interpretations open, and never collapses into either one of them. Despite the multitude of situations Švejk stumbles into during his travels, it is never clear whether one interpretation or the other provides a better fit for our protagonist. Or, indeed, if either interpretation can be said to apply to these very situations – neither Švejk nor the institutions of First World War Austria-Hungary are unambiguous on this point.

The fastest way to summarize the storyline of The Good Soldier Švejk is to say that the aforementioned Švejk stumbles from one situation to another, and at every stumble along the way he causes the routine ordinariness of the war effort to collapse upon its own logic. While the war in and of itself was every manner of madness and chaos, the institutions set up to manage its logistics followed certain rules and routines, and the appearance of a certain good soldier threw these institutions out of joint. His very presence caused the wartime routines to misfire, and the only possible response the institutions could muster was to send him along to the next situation.

Every modern institution works by its own routines, logics and expectations. This is not unique to First World War Austria-Hungary, although war puts a certain pressure upon the institutions involved. It follows from the ever more specialized areas of knowledge and expertise that we entrust with the running of our institutions. Hospitals have certain ways of doing things, as have prisons, universities, municipal administrations and so on. These ways are, for the most part, contingent upon particular traditions within particular discourses, which means the only way to know them is to be a part of the institutions in question. They do not follow from necessity or reason, but from the gradual accumulation of knowledge through repeated actions over time, and the only way to understand them is to have seen them in action. Learning by doing, as it were.

The fact that every modern institution has its own rules, rituals and informal ways to go about things, means that no one particular individual can be reasonably expected to know them all. Just as no one is expected to be a doctor, plumber, constitutional scholar, gourmet chef and rocket engineer all at once. By virtue of the sheer accumulated knowledge in each field, no one can know the internal logics of every field. It takes years of education and dedication to master even a single field, let alone a multitude of them. Knowledge is fragmented, just like the modern world.

However, it is possible to know just enough about the autonomous practices of each particular institution to know which kinds of signals will cause which kind of response. While the underlying logic might be obscure or unknown, the knowledge that certain acts engender certain responses is enough for those who want to communicate with the institutions in question. A trivial example of this can be as simple as knowing that turning in a certain form will cause a certain thing to happen. A more engaged understanding is that the workaday routines of any given institution can be disturbed by engaging in certain kinds of behaviors, which prompt extraordinary responses.

Most modern institutions force their logic upon their subjects. If you go to a hospital and are not literally on fire, you are most likely asked to sit down and fill out a form. Nothing will be done until you have filled out that form, and depending on how you fill it out, you will be treated differently – in both senses of the word. Veterans of the field – i.e. fellow or past patients – will tell you to fill in x rather than y, and the particular logic of the institution is ever so indirectly foisted upon you. There is a shared expectation that you act in certain ways while you attend modern institutions, and these expectations are constructed jointly by the subjects in the know and common subjects like yourself.

Švejk, our hero, manages to throw a wrench into the internal machinations of every institution he visits by not acting in accordance with these expectations. Early on in his adventures, he is thrown into a garrison prison, due to some minor mishap. A field priest delivers a fiery sermon about repentance and forgiveness, and Švejk does the least expected thing of all. Rather than taking the sermon as the empty verbiage everyone else (the priest included) understands it to be, he rises and weeps with utter conviction at the sentiment expressed. Such honesty had never before been seen within the prison walls, and no one is quite sure what to do about it. Švejk is sent to the priest’s office and ends up in his employ – and thus escapes.

No indication is ever given as to whether the sincere idiot Švejk was genuinely moved by the insincere sermon, or whether the sly fox Švejk knew that this was the only thing he could possibly do to escape his confinement. Both interpretations are valid, and during your reading of the books, I would suggest keeping both possibilities in mind as the events unfold. Not least due to the inherent possibility that it does not matter whether Švejk is the one or the other, and that modern institutions respond the same way in either case.

I would also suggest that you read The Good Soldier Švejk as a manual in modern resistance and sabotage tactics. You are not limited to simply doing what the authorities tell you to do – you always have the option to comply with the appropriate levels of idiot sincerity or sly insincerity needed to undermine whatever overall ambition the powers that be might have. You, too, can grind modern institutions to a halt by throwing a wrench into their internal workings.

Bring a friend.


The death of god

An immediate association with the phrase “the death of god” is the decline of religiosity in recent history. Young people who visit churches these days do so mostly for aesthetic reasons (especially if there is an adjoining graveyard), or – most commonly – as a side effect of weddings and similar happenings. The notion of visiting the house of God and communing with the spirit (as it were) is present only in its absence. It used to happen, but it used to happen in the sense that history happened: to other people.

Nietzsche took a literal approach to the phrase in his parable of the madman. In it, the titular madman enters a marketplace in search of those who, like him, saw and felt the death of god. He asks, he shouts, he makes all sorts of noise in order to ensure that none present missed what had happened. Those present, as you might imagine, were none too impressed by this sudden outburst of religiosity in their daily shopping and socializing. They had other things on their minds, other things to do, and if god had died that was neither here nor there. And even if what the madman said was true, it was not their job to sort out what it meant.

The death of god is not a phrase to read literally, or even as a theological statement. Rather, it is to be read in the expressions of those who heard it spoken and were indifferent to it. Gods do not die, but being forgotten is the next best thing to it. The fact that Nietzsche chose to locate the parable in a marketplace is telling, as the market (literal or figurative) is where the things that matter take place. The fact that the Market has undertones of religious veneration only emphasizes this point – the old order has made room for something else, something not overly concerned with the metaphysical bothers of old madmen.

The death of god is a sort of shorthand for a more comprehensive change in the ways humans and societies go about things. On a large scale, the notions of grand accomplishments and venerable institutions that have stood the test of time as monuments to human will have been replaced by quarterly reports and cynical calculations of who will be the next corrupt leader of our defunct systems. On a small scale, individuals engage in activities they only peripherally understand and do not have any emotional investment in whatsoever. Being amazed by the grandness of the world is replaced by fulfilling the daily quota of productivity. Gone are glory and virtue – what we have left is an imperative to increase sales performance by another percentage point.

Nowhere is this as visible as when the humanities are asked to “prove” their worth according to some arbitrary measure of profitability or utility. The belief in having people around who are able to read, write, paint, appreciate works of art or in other ways convey the experience of being human has somehow been replaced by something more banal. What is the ability to navigate humanity compared to the ability to navigate the flows of market demand in order to supply just the right content?

You probably remember this from your school days. How you became enthused about something, only to be told that you had to do something else. And, more importantly, that this something else was more important than the frivolities that had you so enthused. Not just once, or twice, but throughout the whole duration, until you became so used to it that it became second nature. Work now, play later; seldom the twain shall meet.

Of course, there are reasons for this way of going about education: it aims to prepare kids for the world they live in, and habits developed early are more firmly rooted than those adopted later. There are whole systems of thinking about this, and they each strive to prove that they are more efficient than their competitors at creating market-ready pupils. The preparation for life as a human being is not only shaped by the necessity to perform in some sort of market setting – it is designed for and by these very settings, with explicit goals and timetables.

The notion that you have to get a degree to get a job is still prevalent, and thus there is a certain ambient pressure to get a degree. Education is an investment, and you have to discipline yourself to make it worth the money. Not because of an innate curiosity or propensity for certain kinds of knowledge, but to make the cost/benefit analysis turn out positive. Finding out that something is not for you is a costly discovery, and it is not unheard of to complete an utterly uninteresting degree by sheer force of previous investments. When it is too expensive to back down, the only option is to keep at it.

Weber described this situation as the iron cage of modernity. Your actions are not based on what you want (or even need), but on the impersonal workings of bureaucratic systems and invisible market forces. Your life trajectory is not defined by traditional human qualities, such as love, family and friendships, but by ever more abstract relations to bureaucratic institutions. Education is one such institution, and the ways in which a relation to it can go wrong are numerous and well documented. Indeed, averting being documented in a certain way (i.e. getting a bad grade) can become a more important focus of your lived experience than the human beings in your vicinity – be they family, friends or loved ones. These bureaucratic institutions have it within their power to radically redefine our lives with a single form letter.

Of course, there are reasons to invest time and resources to keep our relationships to these abstract systems positive. Making it through an education with a degree opens many doors; giving a correct account of your taxes saves a lot of trouble; knowing the intricacies of bureaucratic systems lets you tap into their resources and opportunities. The iron cage rewards those who allow themselves to be locked into it; even more so those who learn to play by its arbitrary rules.

But the sneaky corpse of god always lurks in the background. The sneaking suspicion that the meaning of life is not to get that degree, exploit that loophole or get that grant. Or the gut feeling that the telos of your being is not to maximize the profit margin for a company that does not care if you live or die. Or that there might be something more to being alive than being a mere cog in a societal machine. Or that digging the pit of Babel might not be the best of ideas.

There used to be monuments dedicated to the awe of being alive. Most of them are still around, albeit not recognized as such. They are silent discursive anomalies from the beforetimes, and it behooves us to find them and revive them. Not because it will increase our marketability, but because life is – in the old religious sense of the word – awesome.


Pecorari: Teaching to avoid plagiarism

Students read things. Students also write things. At the best of times, these activities are combined in the form of essays or theses, where the students draw upon a large number of sources (that they have read) and forge the ideas put forth by these sources into a coherent whole (that they have written). By doing so, the students demonstrate that they can navigate the world of the written word. Moreover, they demonstrate that they can use the vast riches of accumulated human knowledge in order to structure a well-reasoned account of something they have a non-trivial grasp of. By mobilizing the available knowledge in a given field, they become – metaphorically and literally – masters of a subject.

Of course, at the worst of times, they just copy a text that looks fancy and hand it in. Sometimes without even reading the text in question, thus performing neither the reading nor the writing aspects of student life.

There is a word for this worst case scenario, and that word is “plagiarism”.

Whether you are in academia or not, you are likely to have a vague notion that plagiarism is bad. Some instances of plagiarism need little explanation as to why they are wrong, such as the clear-cut example of turning in something that someone else wrote. Taking someone else’s idea and passing it off as your own – even if it is rewritten in your own words – is also understood to be on the wrong side of the fence. It is generally understood that you ought to do the work and give credit where it is due.

Given this, it might be surprising to find that Diane Pecorari’s book Teaching to avoid plagiarism (2013) has more than a hundred pages and touches on a wide range of topics, from literacy, to university management, to the sociology of knowledge. If the issue of plagiarism is as clear-cut as I made it out to be above, why would there be a need for any kind of lengthy discourse on the topic?

The reason for this goes back to the very first paragraph of this text, and the complex relation between reading and writing. Students are expected to read and write, and to write in such a way that it is clear to anyone who reads where their ideas and influences come from. This is a very specific way of writing, with a very specific set of rules and traditions, and it is not always clear from the students’ perspective how these rules and traditions work. The role of the university (and its teachers) is to introduce students to these rules and traditions. Hence, teaching to avoid plagiarism.

The requirement to produce a text on a deadline creates all manner of incentives for students to engage in a weaker form of plagiarism – patchwriting. That is to say, to read the words in a given text, rephrase these words into slightly different words, and call it a day (with or without referencing a source). The students have fulfilled the goal the particular assignment set out for them – to produce a text – but not the overall goal of engaging critically with the required reading and the ideas contained within. While patchwriting is not as severe as stealing a text outright, it does point to the heart of the matter: critical literacy and the ability to perform academic writing.

The goal of academic writing is to be as precise, clear and transparent as possible. A good academic text draws upon available sources and describes what these sources have to say accurately and succinctly. The purpose of this is to give a fair account of what others have to say on the topic, and to give readers the chance to navigate their way to the ideas they need to understand in order to follow the line of reasoning from start to finish. (Whilst, hopefully, also avoiding the mistakes previous thinkers made while thinking on the same topic.) When readers are finished reading an academic text, they will have all the tools they need to understand what they just read. And where to look for further reading on the topic.

In order to produce such a text, students must have read the texts they refer to. Moreover, they must have understood them sufficiently to give an account of what was said, and in which context, and from which perspective. In short, they must do a not insignificant amount of deep reading, and be able to show that they have done it. The means of showing that the work has been done is proper sourcing. That is to say, stating clearly and transparently which ideas come from which texts, and how these ideas are used in the context of the student’s own writing.

A common misconception is that you are not allowed to contribute anything of your own. Paradoxically, the point of building your own text through the texts of others is to give you the means to contribute something that is truly your own. When you have done the work of reading what others had to say, you can then compare and contrast your own thoughts – and contribute your own perspective to the discussion. When you have shown that you have thought through what others have to say, you have also shown that you have thought through what you have to say. You know who is who and what is what, and thus you have earned a seat at the table.

(The inverse of this is, of course, people who say things without doing any research whatsoever, and find their cases undermined by a simple Google search. While they might be ever so passionate about a given subject, their contribution is less than constructive and insightful.)

As you might imagine, this is a lot of work. It requires the application of many skills and competencies that are not obvious at first glance. Understanding the mechanical details of how to source ideas properly is hard in and of itself – especially since there are several ways of going about it (MLA, APA, etc.). While these details can be learnt by spending enough time in the presence of these writing conventions, knowing them is not sufficient. Effort is required to attain a deeper understanding of how to use texts as (re)sources and tools for specific purposes. Even more so when it comes to understanding that different texts are appropriate for different tasks, and that there is a whole worldview associated with knowing what to invoke where. It begins with knowing that Wikipedia is of relative value as a source, and escalates from there.

When seen this way, proper source use is less about getting the formalities right and more about relating yourself, as an independent subject, to the world (of texts and in general). Teaching to avoid plagiarism thus translates into easing students into such ways of thinking, and into assisting them in finding their own ways of being in relation. This is not an easy task, and Pecorari (2013) makes a point of repeating that it is the responsibility of the whole university to help students along in this regard. Not just once or twice, but through a sustained effort throughout the entire course of education.

This is slightly more involved than simply finding those students who turn in papers they neither wrote nor read. To be sure, there will always be a few who simply steal a text wholesale and turn it in as their own accomplishment, and those need to be dealt with somehow. But on a more general level, the purpose of higher education is to produce human beings capable of critical literacy, and this end requires more sophisticated means than mere punishment.

It never hurts to have a book on the subject.


Stephenson: Anathem

When does knowledge become dangerous?

It is a question every loving parent faces as their kids grow ever more capable of interacting with the world. Not without a palpable sense of worry. On the one hand, it is their duty to nurture and encourage such personal growth that the child can stand on their own two feet and participate as a full member of this world. On the other hand, it is also their duty to watch over these same children so that they do not injure themselves in the process of growing and learning.

Some things are inherently more dangerous than others. The expression to play with fire comes to mind. While it is useful to be able to properly set things aflame in certain situations, the path to knowing what constitutes these proper situations is less than self-evident, and knowing how without knowing when is a recipe for potential disaster. It is a question of judgment, for all parties involved, and many manufacturers play it safe by using phrases proclaiming their items not suitable for children.

In Anathem, Neal Stephenson asks this question through a series of iterations. Institutions of learning are set up across the world, and subsequently isolated from it, giving their members free rein to experiment and advance their pursuit of knowledge. From this isolation springs any number of useful innovations, which are then distributed to the outside world through some unspecified means. And, every once in a while, these institutions manage to invent something so effective and destructive that the outside world feels it necessary to crack down on them and ban any further developments in that direction.

Except, of course, for some useful tidbits that require specialized training in order to maintain. Fire, while dangerous, is still useful. Every iteration retains some tidbit of useful knowledge, restricting it to a set of practical real-world applications. While never going so far as to close down the institutions altogether, each iteration imposes ever stricter limitations on what knowledge can be pursued without outside intervention.

These limitations follow a certain direction, from the physical through the digital to information in and of itself. Blowing things up in a physical sense is, as you might imagine, a useful capability for political entities to have, but its usefulness becomes a clear and present danger when possessed by everyone. Blowing things up in a similar, digital way is also useful for these same entities, and equally dangerous when the hoi polloi get their hands on it. The comparison holds for the information of life itself; given the fears of natural mutation of already existing viruses, giving everyone a basement gene lab is a fiery proposition indeed.

Stephenson pushes the question further in the same direction by, ever so indirectly, asking when simply thinking about thinking poses a similar threat. When does it become an unequivocal threat to everyone involved to have disciplined and dedicated metathinkers roaming around with their uncontained and mostly unknown cognitive superpowers?

And what useful tidbits would be left behind after an intervention to contain these roaming brainiacs?

This question is, naturally, not new, as any student of philosophy is wont to tell you. The enthusiasts will quote any number of passages praising the usefulness of philosophy in all manner of things, while their more jaded peers will simply let you glance at the required reading lists. Justifying the raw potential power of philosophy is very much a part of philosophy as a discipline, and depending on who you ask, it is either a footnote or the raison d’être of the whole enterprise.

At first glance, the notion of explosive philosophy might cause a giggle. Then, upon further reflection, it might cause a frown, followed by an unarticulated suspicion that there might be more to it than blowing up cars in a cinematic fashion with your mind. As the Philosopher said, you are what you do; as later philosophers have commented, you do what you think, and what you think is, if you really think about it, some sort of philosophical reasoning. And if you can change what you do by simply thinking about thinking – well, that is explosive stuff, if you keep at it.

Anathem, being a work of fiction, gives Stephenson the discursive excuse he needs to keep pushing this idea in any number of directions. Should you attempt to put it into action, however, you would most likely find yourself straining at the limits of your social surroundings. You are, after all, not an isolated self-sufficient enclave free to pursue whatever strikes your fancy, but an integrated part of a social context, with obligations, duties and communicative hurdles confining you to a limited range of possible options. Thinking is all well and good, but saying the wrong thing at the wrong time can scar you for life.

Which, to be sure, brings us right back to where we started. When does knowledge become dangerous? How does a loving parent go about teaching their young ones the useful things they need to know, without inadvertently causing harm further down the line? Are there best practices for how to play with fire, physical as well as metaphysical? Who gets to decide these things?

Ask not for whom the bell tolls, lest you become anathem yourself.
