Trapped in Negation
A couple of years ago, I was working as a sessional teacher at a university when management handed down a new decree: from now on all employees must undergo a ‘working with children’ check. This struck me as odd. While I don’t discount the possibility that an institution of higher learning might have to accommodate the occasional underage prodigy, university students tend to be high-school graduates, which means that even the freshest fresher will usually be at least seventeen. True, this does not technically qualify someone as an adult, but a seventeen-year-old is not exactly a child. More to the point, I had been hired at short notice to teach a course for second- and third-year students, none of whom could be mistaken for children.
The university management, in other words, wanted me to jump through a clearly inapplicable bureaucratic hoop. They wanted me to prove I was a respectable person with no criminal past that might make it risky for them to place children under my care, so I could walk into a classroom full of adults the following week and discuss an Angela Carter essay about the notorious French libertine and pornographer the Marquis de Sade, having spent the previous few weeks with those same adults considering Primo Levi’s account of conditions in Auschwitz and Hannah Arendt’s concept of the ‘banality of evil’. And the expectation appeared to be that I would rush out in the middle of a busy semester to fulfil this suddenly mandatory requirement, even though I was, like a good proportion of the teaching staff at universities these days, not a permanent employee, but a casual with no leave entitlements or job security, who was being paid an hourly rate that did not include preparation or consultation time, and whose services would be dispensed with the moment the final essay was marked at the semester’s end.
Christ on a bike, I thought. What will they want next? A heavy vehicle licence? A responsible service of alcohol certificate? An Estonian tourist visa? So I did what a reasonable person would do when faced with such a reasonable demand: I placed it in the far queue and busied myself with more pressing matters. I would deal with that particular edict, I told myself, if and when someone decided to follow it up — guessing (correctly, as it turned out) that no one was likely to be doing so any time soon.
That same semester, I was asked, again at short notice, to teach another course at a different university. Such is the nature of casual employment in the tertiary sector. This university did not require employees to submit to irrelevant police checks. In order to establish that I was a virtuous person, I was merely required to complete an online training module about ethical behaviour. This consisted of a series of hypothetical scenarios, which I was to grade on a scale from one to five. Some of these scenarios were straightforward (sleeping with students: totally not ethical — I got that one right); others occupied more of a ‘grey area’, as the online module helpfully explained when I assigned a three to a particularly dicey ethical conundrum that should have been scored a two.
Anyway, after I failed, I was given the option to complete the module again. I confess that I had not been taking the exercise all that seriously up to that point, as you were allowed multiple attempts to get it right. My initial run-through had merely been to gain a sense of what I was being asked to do. And it was impossible to fail, really, if you put your mind to it, because the university supplied a glossy brochure that explained the finer points of each ethically challenging situation and set out clear behavioural expectations. All you had to do was scan the PDF and find the answer they wanted. The possibility that this might allow an unethical person to slip through their vetting process had apparently not occurred to them.
But I decided not to persist, again guessing correctly that the labyrinthine administrative superstructure of the modern corporatised university was such that there was unlikely to be anyone rushing to reprimand me for neglecting this bureaucratic obligation. What bugged me about it was not just the pointless box-ticking and the minor irritation of being told how to behave, even though I had relevant qualifications and experience, and despite the fact that I was (I liked to think) an adequately socialised human being — no, what really irked me was the apparent correlation between the version of ‘ethical’ behaviour this slick training module was seeking to impose and the imperative not to act in any way that might besmirch the good name of the university. It was all about protecting the brand. The assumption seemed to be that behaving ethically and maintaining high academic standards were important, not because we have personal and professional obligations toward other people, or because the very purpose of the university was to nurture the life of the mind and encourage scholarly endeavours as noble ends in themselves, but because these things were excellent for public relations and likely to generate high levels of customer satisfaction.
Now, I realise this is all pretty trivial. Like the demand that someone who does not work with children undergo a ‘working with children’ check, the online module was a mere formality, little more than a symbolic gesture, the kind of tick-and-flick exercise that bureaucracies like to impose every once in a while to remind everyone just who is working for whom. It probably warranted no greater show of resistance than the theatrical eye-roll performed by the young academic who told me about this particular requirement.
Unfortunately, however, I have a background in literary studies. As a result, I tend to treat questions of language and symbolism as if they are important. It is literally the only thing I am qualified to do. I have been trained to regard even small symbolic gestures as significant precisely because they encapsulate larger structures of meaning. This made it hard to overlook the arrogation of virtue. The university was, in essence, rigging up its probity by demanding I prove my own. It was seeking to indemnify itself. Yet it seemed to me that the convenient conflation of virtue and self-interest in this instance, when set against the actual conduct of the university, could be readily interpreted as evidence of systemic hypocrisy.
Here was an educational institution making a show of its commitment to the highest standards of professional behaviour and the welfare of its students, at the same time as it was funnelling as many students as possible through outrageously overcrowded and understaffed courses, and outsourcing much of the substance of the education they were there to receive to sessional teachers, like me, who were not being provided with the kinds of working conditions that might allow them to do their jobs to the best of their ability, or being remunerated at levels that even began to approach adequacy. Here, in short, was an institution demanding a ritual show of obeisance without burdening itself with any reciprocal sense of obligation, an institution that clearly had no qualms about treating casual employees and students alike as exploitable and disposable.
Did I mention that the ‘ethics’ of this particular university do not preclude hiring someone for two hours a week, then expecting that person to take sole responsibility for a cohort of almost eighty students?
I raise all of this for a number of reasons. The first is that, if we are going to talk about scholarship, it is necessary to get the admin out of the way. The second is to declare that I cannot claim to be a scholar in any professional sense, as no university has ever troubled itself to employ me in an ongoing capacity. Most of my work over the past twenty years has taken the contingent form of public criticism, which I have pursued in the faint hope that it might be possible to carve out a little space to read and think and write about literature beyond the walls of the academy. Anything I have to say on the subject of scholarship thus comes from a marginal and, as it were, undomesticated perspective.
And from this vantage point things would appear to be less than optimal. It seems to me that the starting point for any discussion of the meaning and value of scholarship — and I am assuming we are talking predominantly about scholarship within the discipline of literary studies, though I take this to be a reflection of the situation with regard to the humanities more broadly — must be how scholars themselves are treated by the institutions that notionally exist to foster such scholarship. This would suggest that we live in a society that is devaluing scholarship in a quite literal sense. In May 2019, The Age reported that around two-thirds of the academic staff at Victorian universities had no secure or ongoing employment. The ratio at the state’s two wealthiest and most prestigious universities, Melbourne and Monash — universities that in 2018 boasted a combined revenue of more than $3.5 billion — was over seventy percent.
Those figures, which reflect national and international trends toward the casualisation of academic labour, are frankly disgusting. The official line that sessional teachers are paid for all the hours they work is plainly a lie. Systemic exploitation on such a scale cannot be regarded as a minor or incidental feature of the contemporary university. As one frustrated casual academic was moved to observe recently, we have to confront the fact that ripping off sessional teachers while grinding them into the dirt is now part of the business model. The academic quislings whose response to this development is to ponder its potential ‘benefits’, while blandly predicting that rates of casualisation will only increase, or who greet the prospect of a generation of emerging scholars being forced into an unconscionable state of precarity with the inane observation that ‘we will all need to be custodians of our own Brand Me’, should be ashamed of themselves. I do, however, salute whoever wrote the headline for the first of the articles I am alluding to, which perfectly captures the Helleresque absurdity of this academic version of Stockholm Syndrome: ‘Casual academics aren’t going anywhere, so what can universities do to ensure learning isn’t affected?’
It is not just the proposition that educational standards can somehow be separated from the disgraceful material conditions under which casual academics are expected to work that makes you want to claw your own eyes out. What is truly dispiriting is the sense of resignation, the assumption that the obvious answer to the question is unthinkable. Apparently, all those overworked and underpaid casuals should stop hitting themselves. The passive acceptance of the idea that (to quote Margaret Thatcher) ‘there is no alternative’ represents a clear victory for the prevailing corporate ideology, which recognises no principle beyond its expansionist logic of ever-increasing turnover and maximised revenue. The unambiguous implication of such thinking is that academics exist to support the system, rather than the other way round. So much for the idea of a self-governing community of scholars.
Again, I stress that I am viewing this from an external perspective, and I am well aware that academics have been complaining about managerialism and lamenting the fate of the humanities from time immemorial. But I can’t recall a time when the discipline of literary studies, in particular, has seemed as besieged and vulnerable as it does at present. Literature departments are shadows of their former selves. The attrition rate among young literary scholars in the United States has been described as an ‘extinction event’ and has even spawned its own sub-genre of essay known as ‘quit-lit’, in which former academics recount their tales of being burnt out and walking away in disgust.
The fact that for several decades the study of literature has been the focus of fierce ideological dispute has also taken its toll. We have reached the point where prominent right-wing voices in Britain and the United States are openly canvassing the possibility of defunding or abolishing the humanities altogether. Relentless culture warring has hollowed out the notion that studying literature is an intellectually valid occupation in itself. To the extent that it retains a degree of cultural significance, it has become a symbolic chattel to be squabbled over. Public discussions about its potential value as art and its inherent complexities have been subsumed by demarcation disputes and simple-minded political arguments, in which the substance of specific works of literature is essentially moot. There could hardly be a more brazen insult to the innumerable scholars who have devoted themselves to the study of literature than the spectacle of a well-funded coterie of ideological cranks, self-appointed defenders of ‘Western Civilisation’, who plainly couldn’t give a toss about anything Milton or Flaubert might have had to say, walking into a Vice-Chancellor’s office with a cheque for $50 million and walking out with their very own custom-designed vanity degree.
The traditional defence of the humanities has its intellectual roots in late eighteenth-century romanticism, from which emerged the notion of ‘aesthetic education’: the idea that exposure to great art and literature will have a civilising or refining effect, that it will open a path to the holistic process of personal growth and self-realisation encapsulated by the German concept Bildung. This idea has, for a host of reasons, come to be viewed with justified scepticism. The late George Steiner once pointed out — conclusively, one would think — that there were officers at Auschwitz who liked to relax after a hard day’s genocide with a volume of Goethe’s poetry and a Beethoven record playing on the gramophone (and fair play to whoever noted that one need not reach for such an extreme example, since the proposition that literature has a civilising effect is disproved by a quick glance at any university English department).
But in the stubborn notion, to which I retain a sometimes uneasy allegiance, that serious literature has things to teach us, that it represents something more substantial and important than a mere idle diversion, resides a basic ambiguity that defines the humanities — namely, that they have a foundational commitment to an ideal of intellectual inquiry that is autotelic. The value of their scholarship cannot ultimately be quantified, precisely because they recognise that man does not live by bread alone. The humanities are premised on the idea that to be ‘human’ means having a natural interest in the true, the beautiful and the good.
That such sentiments, with their suspicious whiff of the ineffable, have come to seem platitudinous is an indication of how naturalised they once were, and how fundamental they once were to the very conception of the university. Even a confirmed utilitarian like John Stuart Mill could acknowledge, as something of a commonplace, the intrinsic value of an open-ended, anti-instrumentalist view of education. The university, he observed, ‘is not a place of professional education. Universities are not intended to teach the knowledge required to fit men for some special mode of gaining their livelihood. Their object is not to make skilful lawyers, or physicians, or engineers, but capable and cultivated human beings.’
The problem we now face — and by ‘we’ I mean those of us who still believe that literature constitutes a substantive body of cultural knowledge worth preserving and studying — is not simply the traditional scepticism about the value of an education in the humanities, but a pervasive set of cultural assumptions that have made it difficult to speak of literature in those terms at all.
In his recent book Nervous States, the British sociologist William Davies argues that the ‘post-truth’ society we now inhabit, with its overt scorn for the very notion of expertise, is the logical outcome of the ideas of Friedrich Hayek, who is universally recognised as the intellectual godfather of the neoliberalism that has been restructuring our lives for the better part of four decades. Hayek wrote in opposition to any and all forms of socialistic thought, basing his philosophy on the contrary notion of a radically atomised version of homo economicus. In doing so, he not only repudiated communism and even the mildest forms of democratic socialism, but essential components of traditional liberalism as well. One can take as an indication of just how comprehensively Hayek has eclipsed his nearest ideological rivals the fact that Davies’ wide-ranging intellectual history, which reaches back to the seventeenth century in its attempt to identify the philosophical foundations of our current predicament, charts the rise of so-called ‘neoliberalism’ without feeling the need to refer to the likes of Locke, Mill and Rawls even in passing.
The assumed obsolescence of these canonical liberal philosophers underscores the essential point, since it is precisely what is ‘liberal’ about them that neoliberalism vanquishes. Mill’s famous defence of free speech proposed that the truth can only emerge from a process of unfettered debate. In the present climate, that might seem optimistic, but Mill at least assumes that there is such a thing as truth and that it might be discoverable. At the heart of Rawls’ thought is the concept of justice: the difficult problem of how to balance the liberal commitment to the rights of the individual against the need for social cohesion and fairness. Mill and Rawls, in other words, predicate their individualism on a negotiated relationship with a social realm and with metaphysical concepts that are assumed to be held in common — which is to say, there is a humanistic core to their thinking, an assumption that while we might never agree what constitutes truth and justice, we can at least agree that truth and justice are the subjects of dispute, and that they are ends worth pursuing.
This is, more or less, the basic paradigm of scholarship. As Davies points out, the starchy formalities of scholarly discourse arose in order to avoid personalising the business of intellectual disputation: you can attack someone’s ideas as vigorously as you like, but you must respect your colleagues. This convention, imperfectly practised though it often is, recognises that scholarship entails a process of negotiation between individual expertise and an accepted body of knowledge; it is a way of negotiating the inevitable friction between one’s own experiences and convictions and those of other people, from which emerges the possibility of an expanded and enriched understanding. It assumes that genuine knowledge is grounded in evidence and reason, that to be credited as such it must be able to withstand scrutiny, demonstrate that it is based on something more substantial than a personal opinion. Thus individual scholars, however singular their views, both draw upon and contribute to a collective endeavour.
The defining characteristic of neoliberalism — and, Davies suggests, its most diabolical innovation — is that it does not even try to negotiate these kinds of tensions. It simply dismisses them. It is a philosophy that has more in common with Sade than Rawls. It recognises no principle beyond individual desires, accepts the validity of no collective measure of those desires beyond the hard data of the marketplace. This means, in effect, that it makes no distinction between a baseless conviction and an expert opinion. As Davies observes, its attitude to truth can be summed up in a remark attributed to Napoleon: ‘It is not what is true that counts, but what people think is true.’ It is a view of the world that is not merely philistine; it has been conceived in such a way that it is incapable of registering the value of any kind of meaningful abstract concept that speaks to our inner being or to our collective existence — not only truth and justice, but beauty, morality, society, and of course the proposition that an education in the humanities might have a value that has no valid metric, that its greatest value may well be that it refuses to submit to narrow instrumentalist thinking.
On a purely institutional level, this inevitably ends up validating the tyranny of the majority that Mill cautioned against, bending the scholarly function of the university to meet the demands of the marketplace. The calculus is simple. Popular courses thrive; unpopular courses die. Research that attracts external funding is important; research that does not is unimportant. Utility is all. I am grateful to the federal education minister Dan Tehan for stepping forward, at the very moment I was sitting down to write this paper, to demonstrate the point. In the course of announcing that the funding goalposts for universities were to be moved once again, he declared that literary scholarship was otiose. Soon it will disappear once and for all; it will wither and die, and good riddance to all that useless cultural knowledge. I am paraphrasing, of course. What he actually said was that our universities must be ‘producing job-ready graduates with the right skills for the modern economy’, which amounts to much the same thing.
To this end, Tehan announced that funding was to be linked to a new set of metrics that would not only quantify internal practices, but take account of graduate employment outcomes. Now, I realise that the world is facing far greater problems than arcane funding arrangements for beleaguered humanities departments, and we are all numbed to this kind of thing, but I am quietly astonished that this announcement didn’t seem to generate much of a ripple. If humanities departments are, on top of everything else, going to be held responsible for what happens to students after they graduate, then we may as well admit it’s all over — though it must be conceded that the system will be approaching something close to perfection when it becomes possible to accumulate an enormous student debt completing a university degree in literary studies, taught almost exclusively by academics without secure jobs, only to become an exploited casual who is paid for a fraction of the hours you work, at which point the cash-strapped department that lacked the resources to employ you properly in the first place will face additional funding cuts because your quantifiable ‘graduate outcome’ indicates that you didn’t earn enough money.
The simple point I am trying to make is that if you set out to justify literary scholarship in utilitarian terms, you have lost the argument before you start. To my mind, scholarship is a word with overwhelmingly positive connotations. I regard it as distinct from the related concepts of research and theorisation, which are the necessary activities one undertakes in order to arrive at some kind of genuine understanding. Scholarship represents the knowledge itself. It is the ideal. It evokes the positive principles of intellectual rigour and disinterestedness, in the true meaning of that word. It speaks of a love of knowledge for its own sake, a desire to understand things in all their complexity. It rejects impulsive or emotive interpretations, preferring the kind of grounded and measured view that can only be arrived at through a consideration of all the available facts. It is not divorced from political concerns, but it is fundamentally non-ideological and anti-instrumentalist in its orientation. Its allegiance is always to that which is true, that which is verifiable. The true scholar respects the evidence, respects precedent and expertise, and argues in good faith, even when taking a dissenting line. Scholarship is thus communal; it opens up the shared realms of culture and knowledge, demands a consideration of abstract values and concerns larger than our own.
By this definition, scholarship is implacably opposed, in every important respect, to the political and institutional culture we now face. This is precisely why the ideal is worth upholding, why it urgently needs to be defended, but also why defending it can seem like such a thankless task. I am not in a position to offer any concrete solutions to this dilemma, but I will make a general observation. Over the past four decades or so, much of the theoretical work within the humanities has flattered itself with the idea that it is intrinsically radical. Literary scholars have been busy dismantling canons, subverting dominant paradigms, practising the hermeneutics of suspicion, insisting on the radical indeterminacy of language, using words like ‘essentialist’ and ‘normative’ as insults, and so forth. I am far from the first person to note that there is a certain historical irony in the fact that humanities scholars have been busy reading their Foucault while the universities have been morphing into panopticons, or that the kinds of sociological approaches and abstruse philosophising that have characterised these theoretical developments could be interpreted as convenient means of adapting to institutional demands. And my intention here is not to dismiss or even denigrate the intellectual substance of these phenomena, which are far too complex and diffuse to admit any easy judgement. I would merely observe that one of the consequences of all this theorising for literary studies has been an evident loss of faith in itself, an inability, and perhaps even a squeamishness, when it comes to articulating a positive argument in its own defence.
There has been something of a move away from this tendency of late. As Rita Felski has observed, the habit of treating literary works as symptomatic, rather than substantive in themselves, leaves unanswered the question of why one might choose to read them in the first place. This sentiment has been echoed by Toril Moi, who argues in her latest book Revolution of the Ordinary that the anti-essentialist thinking that has come to predominate within the humanities remains ‘trapped in negation’ — which is to say, it defines itself in the act of rejection, leaving itself unable to articulate any valid principle of its own. Moi’s specific ambition in Revolution of the Ordinary is to propose a version of Wittgenstein’s ordinary language philosophy as an alternative to what she sees as the dead end of post-Saussurean thought. But in the process she makes a straightforward point. ‘Much of the humanities deals in particulars,’ she writes. ‘The critic’s love for the particular case — the specific poem, novel, or film, the specific artist, painting, composition — fuels her work.’ This is a truth that literary scholars are apt to downplay, if not conceal. If the challenge before us is to find a way to argue for the value of literary studies that resonates beyond a shrinking community of beleaguered scholars, then I am inclined to agree with Moi that the way forward lies with dropping certain pretences to theoretical sophistication, speaking of the substance of literature in meaningful ways that emphasise its specificity and its relevance, developing a plain-spoken poetics that is grounded in appreciation and attentiveness. I don’t necessarily think this will fix anything, but it couldn’t hurt.
This is an edited version of a paper delivered at Provocations #2: Scholarship is the New Conservative at the J.M. Coetzee Centre for Creative Practice on 6 September 2019.