One Year of Freelancing, part 1: The Mistakes I Have Made

[Image: champagne]
To my first year of freelancing… a huge success!

I have been freelancing and writing now for about a year. In that time my skill set has made me wealthy and famous, my name has become a brand, I have no idea what the word failure means… oh, wait, I’m confusing reality with my application to go on The Apprentice. No, I’ve been freelancing for a year, my income has not yet matched the rent I’m paying (I try not to sweat the small stuff such as affording food and paying my bills), all my savings are gone, and I’m currently begging and borrowing (but as yet only contemplating stealing) in order to survive. So it’s been going splendidly.

Even if I didn’t actually want to freelance, what choice do I have (I ask myself rhetorically)? I have applied for several jobs over the past eighteen months, ranging from those where my experience and qualifications were completely aligned with the position, to the most basic jobs that I could have done with my eyes closed when I was 18 (and my eyes would have been closed, so unimaginably tedious were the jobs I was applying for). But I haven’t got close to landing any of them. There’s simply not much place in today’s job market for a man in his 40s, armed only with a PhD, a varied and successful background in academia, a capacity for independent thought and a sense of humour. Perhaps the problem is that I simply don’t get the soul-destroying values of most ‘work’ (which increasingly includes the work of academia and its brave new corporate mentality), the obsession with making profit out of producing meaningless shit, the ‘performance targets’, the valuing of numbers over people and, well, values, the human resources personnel (i.e. the corporate drones whose main purpose is to make employees redundant), the idea that productivity demands putting on a suit and tie and turning up to a dreary office at nine in the morning when, quite honestly, what would truly be productive (i.e. of benefit to the world) would be for me to lie in bed all morning thinking about Descartes and then sharing my thoughts in a blog article… I could rant on, but I won’t.

Fortunately no ‘sane’ employer wants me near a desk at nine in the morning, so I do get to sleep late and then pen thousands of words—the equivalent, in fact, of a couple of books over the past year, all of which have earned me the princely sum of £0.00 (exclusive of my expenses). So this is the life for me, and despite all evidence I retain an almost lunatic belief that it will work. Peer extremely hard and, as I will insist on telling the psychiatrists when they cart me away, grounds for optimism can be detected. One reason for optimism is in fact the mistakes I have made—or rather that I can recognize most of them. So here are the mistakes that I may or may not be repeating as I venture boldly into year two of freelancing/crippling poverty.

  1. Not having a good plan

Although I had long entertained the idea of going freelance, my actual step into it was the result of a neat government trick. There I was, signing on as unemployed—for the government, one more irritating confirmation of how badly they preside over society and the economy—so they gave me a little incentive to go self-employed, thereby taking me out of the unemployment statistics and saving on the welfare bill. It was dressed up as a scheme in which ‘employment experts’ would help participants to draw up a coherent business plan and get this plan up and running. But I think ‘scheme’ is too grand a word for it. ‘Crap’ would be better, and also a suitable description of my business plan, which the ‘experts’ cheerfully waved through.

But I should be fair: I take full responsibility for my poor planning. I had some ideas—a mixture of tutoring, editing and writing—but with no clear sense of how I was going to make them work. In part that’s the way of things: we learn through doing. But it helps if there is at least the semblance of a plan to give structure to the doing. As someone who had spent most of his life in a refuge from the ‘real world’ (i.e. in academia), I had little idea of how to formulate a coherent strategy that might actually turn my vague ideas into a viable living. But worse was that the ideas were indeed vague, a sort of ‘I’ll do a bit of this, and a bit of that’ without really knowing much about what the doing and the bits and the ‘this’ and the ‘that’ were. In so far as I had a document headed ‘Business plan’, it should be filed under ‘creative writing’; since it was hard to identify in it either a business or a plan, it was more an invitation to drift aimlessly than a blueprint for success.

  2. Lack of confidence and belief

A bad plan is arguably better than having no plan at all, but not having confidence or belief in it or my ability to carry it out simply compounded its badness. Initially all was well: despite having had the recent experience of being made redundant from a job that I was good at, followed by several months of trips to Walthamstow (an experience grim enough in itself) to sign on at the Job Centre (a whole new level of grimness which made me ponder on the happier times of my job cleaning toilets—yes, I have done that as well as given lectures on topics such as the philosophy of Spinoza), I began my life as a freelancer full of hope and joy. I held onto this unfamiliar state of mind for a surprisingly long time (it was a good two or three hours) before reverting to my more typical mentality of doubts and pessimism. And, frankly, one cannot afford doubts and pessimism as a freelancer (actually I could, because I had some savings, but you know what I mean).

Freelancing requires almost single-minded commitment, confidence and belief. I don’t think it is necessary to be quite as deluded in this respect as an Apprentice candidate, but it probably wouldn’t do any harm: being convinced not only that the path will end in success, but also that you are utterly brilliant at doing what you do—yes, you end up looking like a dick, but who cares about that when the money is rolling in? (Well, I tend to, actually, which is part of the problem.)

What happened to me was this. I started questioning whether I was really any good at all the things I wanted to do. I worried about the lack of money coming in. I wasted time and energy agonizing over whether to swap self-employment for regular employment, more time and energy applying for jobs I didn’t much want (and yet more time and energy feeling miserable about how my soul would be imperilled if I actually got one of these jobs). I started taking on far too much private tutoring, at the expense of those things, such as writing, that I really wanted to do and which have more long-term potential, simply because tutoring brought some money in (but, as I hope to write in a future article, private tutoring is a dispiriting sector, abysmally paid and often involving demoralizing work and conditions). As my already bad plan started to turn into an even more directionless mess, my belief in my potential as a freelancer plummeted further. Desperately I would come up with ideas, only to panic and abandon them in a fog of uncertainty and despair. Feel free to follow the recipe in this paragraph: if failure is what you want to cook up, then I can guarantee that it works.

Above all, confidence and belief are about holding one’s nerve (for example, not letting panic compromise long-term aims by focusing on short-term work that ultimately yields little of enduring benefit) and being prepared to blunder, to make mistakes and to make a fool of oneself. The latter is especially important: among the wisest advice I have received from a close friend and artist is the necessity of taking risks, of diving in, of not letting the many mistakes and failures along the way distract from the path. All that involves almost total belief both in what one is doing and in one’s ability to do it—and total belief means not questioning whether any of this is deluded.

  3. Lack of marketing strategy and efforts

As an academic I preferred thinking about things such as whether early modern witchcraft can be explained sociologically or anthropologically rather than things such as marketing. But that mentality is disastrous for the freelancer. As I’ve learnt over the past year, marketing is, if not quite everything, then not far off it. You can offer amazing services, have loads of great content, be creating an amazing product, but unless it is marketed properly then little will come of it. Similarly, great marketing will make all sorts of crap successful (and explains why most of the stuff we own is really rather shit). All of that is, of course, obvious—or you’d have thought so.

In so far as I had any marketing strategy it was this. Set up Facebook, Twitter and LinkedIn accounts. Sign into them every now and then. Make contact on social media with people I know. Set up a website. Offer services and produce content that are of outstanding quality. Give away most of this for free or at ridiculously reduced rates in order to expand my potential customer base. (Yep, I’m using the lingo…) Tweet and post updates about it occasionally. Watch as the world quickly catches on to my brilliance.

It was the marketing strategy of someone who was both clueless and overly concerned about the possibility of annoying people.

What I’ve learnt—the insight I’ve gained from the mistake I have made—is that marketing needs to be done continuously, and that it needs to be fairly aggressive (and risk annoying people). Every day I should have been setting aside some time to promote myself, which would have meant spending more time on social media than I generally care to. I should not have been shy of things that I normally consider ridiculous: amassing numerous followers and contacts; ‘liking’ tweets and updates about cute kittens; telling the world that I have just made myself a coffee as a prelude to writing a new article; sharing, liking and retweeting over and over again; repeatedly updating and tweeting about my own writing until the only way to shut me up is for sufficient numbers of people to read/download/pay for it. For someone whose preferred mid-morning activity is to lie in bed with a cigarette while reflecting on the beauties of anarchism or the merits of existentialism, spending that time instead in the less edifying milieu of social media does not come naturally. But it is essential.


So those are three mistakes. I have given this article the hopeful designation of ‘part one’. That means it may be followed by a ‘part two’—in which the intrepid and now penniless freelancer explains why, all these mistakes in year one notwithstanding, year two will be a story of success and glory. And that depends on whether, by the time I write the second part, I have got a plan, one in which I believe and have confidence, and one for which I have a marketing strategy.

It all comes down to one of the questions that historians occasionally ask: do we learn from the mistakes of the past, or are we doomed to repeat them?


In defence of school absenteeism

Prior to 2013, schools had discretion to allow term-time absences for pupils. Regulations introduced in 2013 put a stop to this, allowing absences only in exceptional circumstances (such as to attend a funeral), and stipulating fines for families who did not comply. This system of penalties is now likely to be thrown into confusion by the decision of a court to overturn a £120 fine imposed on a father for taking his daughter on a week’s holiday during school term. The appeal against the penalty hinged on the question of what constitutes ‘regular’ attendance. The law simply states that a child must attend school ‘regularly’ without offering a precise definition of the term. In this case, the father pointed out that his daughter’s attendance had been 94% the previous year and was 100% at the point of going on holiday. Clearly—and the court felt so too—such numbers might be thought to fall within the definition of ‘regularly’; equally clearly, ‘regularly’ is a sufficiently vague term to make most fines for non-attendance unreasonable.

But the issues go deeper than legal argument about the definition of a word. In particular, it is worth considering who is responsible for a child’s education. Many people suppose that the state assumes responsibility. Legally, however, this is not the case. Education is compulsory, but the primary responsibility for ensuring that a child receives an education falls to parents. Section 7 of the 1996 Education Act states that it is the ‘duty of parents to secure education of children of compulsory school age’, that this duty involves arranging education suitable to the ‘age, ability and aptitude’ and ‘to any special educational needs’ of the child, and that this may be achieved ‘either by regular attendance at school or otherwise’.

The legal situation can be summarized in this way: parents have the responsibility for ensuring the education of their child, and they can decide to fulfil this, but do not have to do so, by sending their child to school. The phrase ‘or otherwise’ has generally been understood to cover home education—and, indeed, it is the basis for the excellent home education resources and campaigning group, Education Otherwise (whose slogan ‘Education is compulsory, school is optional’ summarizes the legal situation more succinctly than I have done).

The law here seems eminently reasonable: parents may well choose to delegate their child’s education to a school, but to compel them to do so would be an unwarrantable intrusion by the state on the freedom of parents to decide on the best interests of their child. Most parents are better placed than the impersonal state to understand the needs of their child, and there may be excellent reasons why they deem formal schooling unsuitable to those needs. Not all parents believe that corralling children into a classroom to be drilled for hours in preparation for an endless round of tests and exams comes close to representing an ‘education’.*

Both legally and morally, therefore, there are question marks over the power of the state to fine parents for pupil absences. As long as the parents are ensuring that education is being provided, then it would seem that there is compliance with the law. Even if we accept that delegating education to the state involves entering some form of agreement to ensure school attendance, it is not clear that occasional absences are in breach of this agreement. As the recent appeal case confirms, occasional absences do not breach the requirement for ‘regular’ attendance.

Of course, it may be objected that such absences do in fact breach the duty of parents to provide an education for their child. Many suppose that unless a child is in school, that child is not receiving an education—and they will particularly complain about a child being on holiday instead of in school. But such objections stem from an excessively narrow idea of what education is or should be about. Education is not solely about passing exams, nor should it be exclusively concerned with preparation for the workplace. These may constitute important parts of it, but above all education is about preparing children for life. A child will learn far more, and far more important things, from a week on holiday (about, for example, a different culture, a different place, different activities, family relations, and so on) than from a week in the classroom. The notion that a few days of lessons are a more important life experience than a holiday is evidence of the skewed values of society today. (Indeed, a child will almost certainly learn more about life from occasional truancy than from the lessons that he or she has skipped.)

Consider the case of the couple who have been taken to court because they took their two young children to India to visit their extremely sick grandfather. There were compelling personal reasons why such an absence might reasonably have been granted, none of which seem to have withstood the bludgeoning approach of the state. But beyond that, it could be argued that the educational value of the trip far surpassed anything that might have been learnt in a week or two of school. Not only would it have entailed the experience of a different culture, but it would have involved (unfortunately, of course) an engagement with such important parts of life as illness, ageing and mortality. Only the most blinkered educationalist would maintain that a few literacy and numeracy classes have more educational value than that.

[Image: Calvin and Hobbes cartoon]

The Department for Education claims that even one week of missed classes each year will damage a child’s chances at GCSE, with a lasting effect on life chances. But this is simple scaremongering; reassuringly for those children who are occasionally ill (i.e. all of them), there is nothing daunting about catching up with a week or so of lessons here and there. I have no doubt that there are often serious problems arising from absenteeism, but trying to solve those problems by clamping down on all absences really does introduce the proverbial sledgehammer and nut into the debate.

Michael Gove, the former Education Secretary, proposed docking child benefit from parents who allow unauthorized absences; Michael Wilshaw, the chief inspector of Ofsted, would like to fine parents whose children don’t do their homework and has urged schools to ‘adopt a hard line and not allow the youngsters out’. In the punitive and narrow vision of Gove and Wilshaw, education amounts only to teachers in classrooms preparing pupils for exams, and the duty of parents ceases to be truly educational, becoming instead something akin to the task of prison officers transporting their charges to the institution and locking them down. I don’t doubt that classroom education has many benefits, nor do I reject the value of exams, but to equate education exclusively with school lessons and testing is not only life-denyingly dreary but also hugely mistaken.


*I declare an interest here: as a former home-educating parent myself, I understand the reservations that many parents feel about school. I intend to write more on this in a future article.

Who dey? The Bengals!

Most sport-loving British men (and it does tend to be men) of a certain age have a favourite American football team. Back in the 1980s Channel 4 began covering the NFL with a highlights programme every Sunday and a live broadcast of the Super Bowl. Come Monday, padded out in our school blazers and ties and armed with a tennis ball, we would try to recreate what we had seen. Like most of the boys, I fancied myself as a quarterback, even though I could never see a pass or read the situation. But in my mind there was something spectacular and exhilarating about commanding a long drive down the field (or the concrete playground in our case).

[Image: Bengals tiger stripes]
It was the tiger stripes that won my heart…

The most popular teams in Britain in the 80s were the Dolphins (because of Dan Marino), the 49ers (because of Joe Montana), the Bears (because of ‘the Fridge’) and the Raiders (because of… actually I have no idea). Contrarian that I am, I refused to let popularity be my guide, instead opting for the team with what I considered to be the coolest name and strip: the Cincinnati Bengals. I soon discovered that they were widely regarded as a pretty terrible team (even if they had appeared in the 1982 Super Bowl, losing it to the 49ers—but that was before Channel 4 began its coverage). It mattered little to me: the Bengals might have lost most of the games they played, but they did so while adorned in tiger stripes.

And then in 1988 they briefly became good (thanks in large part to their fine quarterback, ‘Boomer’ Esiason) and reached the Super Bowl to face, once again, the San Francisco 49ers and Joe Montana. I still remember much about the game. It was a close-fought, low-scoring affair (3-3 at half-time), and a third-quarter touchdown gave the Bengals the lead going into the final quarter. Thanks to a field goal, they still retained the lead with just over three minutes left, and with the 49ers pinned back near their own goal line. But then, over eleven plays, Montana drove his offense 92 yards up the field to deliver the winning touchdown with 34 seconds left to play. It was an extraordinary drive—but for a Bengals fan, it was about as torturous as sport gets.

That was the closest the Bengals have ever come to winning the Super Bowl. I left school and began university, and, lacking a television, I rapidly lost interest in American football. This spared me some fallow times for the Bengals: for about twenty years they were probably the worst franchise in the NFL, never coming remotely close to success, and usually failing in ways that invited ridicule rather than sympathy. And then a few years ago my interest in the sport was rekindled, and I discovered, happily, that the Bengals had surprisingly become quite good. Okay, they haven’t won a playoff game since George Bush senior was president, but the fact that they have consistently reached the playoffs over the past few years is something—these days they actually win more games than they lose. Indeed, as I write they have made a perfect start (4-0) to the current season, so optimism reigns once again.

My better self has long tried to dampen my enthusiasm for American football. There is, after all, much to dislike about it. The sport is intensely violent and aggressive and has real problems with long-term injuries; the culture surrounding it is dispiritingly meat-headed, money-oriented, and garishly showy (the Super Bowl is in many ways a hideous spectacle); it has a history of racism (for a long time black players were not allowed in the NFL) and homophobia; the NFL, by all accounts, is an appalling organization. On the other hand, the draft system which gives weaker teams first pick, the scheduling which ensures stronger teams have tougher fixtures, and the salary cap are all features which ought to give the English Premier League pause for thought.

Above all, it is a great game. Although it can never rival for me the flow, beauty and thrill of football (i.e. soccer), American football is an extraordinarily complex, intricate sport, capable of sustaining tension throughout. It can appeal on so many levels: it is fast, powerful, tough and highly physical; but there is also a beauty to its tactical sophistication, its clever plays, its use of bluff, deception, variety and skill in working out the best defensive or offensive strategies, all of which are attractive on a deeply intellectual level. I find it vastly superior to rugby. Whereas I struggle to find any intellectual or aesthetic appeal in rugby (most rugby matches look to me like an unthinking, primordial mess), American football mixes artistry and ideas in with the big hits.

Even so, I am not enthusiastic about the prospect of a London-based NFL team. I prefer a safe distance from the culture of the NFL. Anyway, although I know next to nothing about Cincinnati and will probably never visit the city in my life, I’m happy to keep on following the Bengals—and, given that they seem to be good whenever I show an interest, perhaps the Bengals are happy about that too.

Tory neoliberalism

There is a very good chance that by the time the Tories are eventually voted out of office (2020 at the earliest, but more likely 2025 or even later) most existing public institutions will have been largely or fully dismantled, either through cuts or through privatization. The welfare state is rapidly being eroded; privatization of the NHS is likely to be extended; education policy will, by design, result in more schools taken out of local authority control and into the hands of private companies; central funding of higher education and the arts, already decimated by cuts, will almost certainly have been slashed further; the BBC is facing a relentless attack on its role as a publicly funded broadcaster; despite denials from the government, plans have been drawn up to privatize Channel 4. Apart from parliament itself, the royal family and the armed services, we may find that within a decade almost nothing will be left of publicly funded institutions.

The budget deficit is the government’s convenient but dishonest rationale behind this programme. The reality, of course, is that the cuts are driven by an ideological commitment to neoliberal economic principles. It is interesting to consider why the government will not admit this, instead choosing to depict itself as having to make unfortunate but necessary decisions to (as the myth goes) clean up the mess created by the previous Labour government. There are understandable presentational reasons for this: in general, most people are suspicious of ideology but fond of practicality (which helps explain why the Chancellor of the Exchequer preferred to be seen wearing a hard hat rather than reading economic theory for his election campaign photo opportunities); and it is a lot easier to communicate via social media and tabloid newspapers an idea as simple as ‘sorting out a mess’ than complex economic theories.

But perhaps the primary reason for playing down the ideological nature of the policies is that the government knows full well that at best the policies are highly risky, and at worst they simply do not work (or at least do not work in the way many of their most enthusiastic proponents would hope). It is worth acknowledging, even for those of us on the Left who oppose the government’s policies, that neoliberalism makes a serious claim to be the best and most rational way of organizing society. There is no lack of weight or sophistication to neoliberalism as a set of ideas, and to reduce it simply to an ideology of greed or hatred of the poor is an intellectual dead end. Certainly neoliberals maintain that profit, self-interest, competition and, if you like, greed are key motivating factors behind human action, and they positively encourage inequality not only as the most effective way to ensure that merit is justly rewarded but also as a dynamic that stimulates productivity and economic growth. Whatever one thinks of the neoliberal analysis, there is no doubt that it presents a body of psychological, philosophical, political, social and economic theory that demands serious engagement.

Like many ideologies, however, neoliberalism works better in theory than in practice. At the heart of neoliberal thought is a belief in laissez-faire capitalism, free markets, deregulation and the excision of the ‘dead hand’ of the state. It is supposed that markets, unlike governments or the state, know best; therefore, markets that are fully free (i.e. that are subject to no government intervention through red tape, regulation, or state ownership) will, it is argued, work efficiently and in the interests of everyone. The economic laws of supply and demand, so long as they are left to operate freely, will benefit society as a whole.

As a theory neoliberalism is seductively persuasive—as long as one accepts that there are economic laws as true as scientific laws. But there is little hard evidence that neoliberalism works well in practice. As its name indicates, neoliberalism is a modern reworking of classical economic liberalism. Associated particularly with the politics and ideas of the nineteenth century, above all in Britain, classical liberalism has impressive intellectual credentials stemming from the thought of, among others, Adam Smith and John Stuart Mill. The history of nineteenth-century Britain is instructive for some of the strengths and weaknesses of economic liberalism. On the one hand, liberal policies (e.g. lack of regulation) helped the phenomenal economic growth set off by industrialization; on the other hand, they also generated some truly dreadful social conditions. In theory, by maximizing profit and increasing wealth, everyone will benefit (or, rather, all those who deserve to—the lazy and the workshy, for example, will deservedly not benefit); but this is not how it turned out in practice. Although profits were increased by, for example, the employment of cheap child labour, what resulted was not only suffering children but also depressed adult wages and increased adult unemployment. Such was the moral outrage at the social effects of economic liberalism that the only remedy was for the state increasingly to intervene in the social and economic sphere, by regulating working conditions and alleviating poverty and its results.

The neoliberal Tory vision of society has clear echoes of liberal nineteenth-century society. David Cameron’s idea of the ‘Big Society’ (which still occasionally sees the light of day in his pronouncements) is in essence a modern take on the nineteenth century: promote laissez-faire capitalism, rein in government and state intervention in social and economic regulation, and encourage philanthropy, charity and human kindness to fill in the gaps. It’s an optimistic vision, but one for which the evidence is hardly encouraging. There were certainly philanthropists in the nineteenth century, but philanthropy and charity on their own had a minimal impact on the alleviation of social problems. In the US, the most neoliberal of all western countries, there is a long tradition of philanthropy, but American society is nevertheless beset by levels of poverty and inequality—and increasing ones at that—that dwarf those of western Europe. When, during the election campaign, the Tories unveiled a policy that would, through statutory paid leave, compel employers to enable workers to undertake voluntary and charitable work, it was perhaps an unspoken acknowledgement that neoliberalism may not work.

Austerity and the ‘shock doctrine’ so beloved of the IMF (and US) have similarly provided little real evidence that they work. As has been well documented (by, for example, Joseph Stiglitz and Naomi Klein), despite repeatedly failing to achieve the desired results, the IMF has for decades nevertheless relentlessly pushed on with its blinkered neoliberal policies, convinced that the rarefied theories of the Chicago School economists trump the actual evidence before its eyes. In so far as problems were acknowledged, these were usually attributed to the failure of countries to move to a fully free market quickly enough, or to the continuing intervention of governments in economic policy. Even in cases where the IMF policies had some success, it was only a small part of society who benefited—no surprise to critics of the IMF and neoliberalism, but not the outcome anticipated by the IMF and neoliberals themselves.

Neoliberal ideologists (among whom much of the current government can be included) tend to be dogmatic in their approach. And there is nothing wrong with that, provided the evidence supports the dogma. But the evidence invariably points the other way. In a recent essay on the life and work of the social scientist Richard Titmuss (a strong believer in the importance of public institutions), Stefan Collini discusses Titmuss’s final book, The Gift Relationship: From Human Blood to Social Policy (1970), in which Titmuss considered the provision of human blood to meet medical and scientific needs.* In the US, a neoliberal approach was adopted as the best response: market principles should apply, according to which the demand for blood will be met by supply, and a price mechanism will develop which will satisfy all concerned. In short, leave it to the market. Consequently a commercial trading system developed: for donors there were financial incentives to give blood, and there were profit incentives to encourage companies to trade in and supply blood. In the UK, however, the publicly funded National Blood Service (part of the NHS) was established, which relied on voluntary donations of blood. A neoliberal would assert that the US model should prove to be the more effective. But as Collini describes, Titmuss’s work demonstrated ‘the greater efficiency of the [UK system] as measured by all the relevant criteria: purity of blood, availability and reliability of supply, cost and administration.’ (As subsequent confirmation of this, one may note that it was blood sourced largely from the US and infected with Hepatitis C and HIV that was at the centre of the ‘tainted blood scandal’.)

It is not simply its invariable failure to work in practice and its often calamitous social effects that make neoliberalism such a disastrous ideology. Just as corrosive is its moral bankruptcy. One of the more interesting moments of the election campaign was when Cameron was confronted by an audience member on Question Time about the fact that he spends all his time talking about the economy but never about morality. The Prime Minister gave a good answer in the context of the debate: he argued that getting people into work and improving their economic security and prospects were above all moral issues, since it was only by doing this that poverty could be reduced. Indeed, his answer probably sums up neoliberal moral theory: assuming laissez-faire capitalism works, the society that emerges will be morally good. But, as I’ve suggested, laissez-faire capitalism does not work. The dogmatism of neoliberals makes them blind to the bankruptcy of their morality.

The effects of this dogmatism and moral bankruptcy are hugely damaging, for they result in an impoverished moral culture. Even if we accept that public institutions such as the NHS, or the welfare state, or the BBC, could be run more efficiently according to market principles (and according to narrow economic criteria they almost certainly could), this would nevertheless miss an important point about their existence. Throughout Titmuss’s work there was, as Collini explains, ‘a governing preoccupation…: the need for societies to give effective institutional expression to non-economic values in the face of the tiresomely corrosive power of the profit motive.’ It is this ‘institutional expression to non-economic values’ that matters so much—and it is this which will be lost as the Tories dismantle the welfare state and drastically cut back on public funding. The NHS may well be a bloated, inefficient organization, but its importance lies not only in the healthcare it provides but also in the values it represents: the belief that society should look after all its members, the idea that we are collectively responsible for looking after one another, the valuing of care over profit. When a museum or gallery allows free access, this is an important statement about the value of making heritage, art and education available to all. Even if charges did not lead to declining museum attendance (as the evidence indicates they do), by implementing those entrance charges we would nevertheless be replacing a non-economic value with an economic value. Profit and efficiency may be increased by such charging, but at the cost of abandoning the shared moral value that education and heritage should be universally accessible.

When the Tories talk about values, usually specifically ‘British’ values, conspicuously absent are such things as compassion, caring, and sharing. This is hardly surprising: neoliberals tend to be sceptical about these types of virtue, not because they don’t admire them but because they do not fit well with their narrow economic theories. For neoliberals, first and foremost should be a focus on free market principles; in so far as compassion, caring and sharing have a role, these will emerge naturally from the wonderful society that laissez-faire capitalism will create. But it may well be wondered what happens when a government consistently emphasizes economic virtues over moral virtues. If, for example, the message is consistently sent out that economic inefficiency is a more pressing concern than the damage caused by poverty, then society is likely to internalize this message—to value wealth and profit, and to devalue welfare and caring. A cynic might of course argue that such internalization of the profit motive at the expense of welfare is precisely the intention of the Tory neoliberals.


* Stefan Collini, London Review of Books, vol. 37, no. 19, 8 October 2015, pp. 29-33.