Below is an email that I sent to the Guardian regarding their obituary of Eva Figes. It's very long and perhaps of no interest to the vast majority of people to whom the name Eva Figes is unknown. I happen to love Eva Figes' books and I'm not too keen on the Israeli occupation of Palestine, so it upset me to see the death of one of my favourite authors used as an excuse to engage in a little re-writing of history in the standard Zionist style. In spite of this I've hopefully retained a certain level of objectivity in my criticism (that's the whole "academic standards" thing I mention in a moment of snobbishness). The article is available here - http://www.guardian.co.uk/books/2012/sep/07/eva-figes
To the editor of the Guardian,
I'm not usually one to complain of newspaper articles - it seems somewhat redundant considering no-one's forcing me to read your paper - but I felt compelled to share my thoughts on your obituary of Eva Figes, written by Eva Tucker. I'm currently writing a PhD thesis on Figes and the group of writers surrounding her, so it makes me a touch defensive when I see inaccuracies printed. It is my understanding from the article that the author knew the late novelist in person far better than I, yet I don't think that should render my input invalid. As this obit may be the only contact that most people have with Eva Figes now that she's passed, it's a shame to leave them with a bad impression.
Anyway, here are a couple of inaccuracies that I noted:
The biographical information in paragraph four (which I assume stems from her work Little Eden, or a review of that work, or perhaps a personal conversation) talks about her being called a "Jerry". There are a couple of other minor niggles here that can be excused for purposes of concision, but I think this one is particularly notable considering what comes later. The bullying that Figes writes about happening when she moved to England was actually due to her being Jewish, and she notes in Journey to Nowhere that she escaped the "collective guilt" of her German nationality by referring to her Judaism. Perhaps she was called "Jerry" on occasion, but it would not be the most obvious thing to take from her memoirs, as singling it out makes her experience of anti-semitism conspicuous by its absence.
The reading of Journey to Nowhere is so biased that I can only assume the "acme of fury" described was an act of projection on Ms. Tucker's part. The considerable research that went into that book is dismissed in favour of ad hominem attacks on Ms. Figes and "Edith" - the starting point, not end point, of the book's argument. I am aware that journalism cannot be judged by the same set of standards as academic criticism, but any balanced reading of the text would consider the quotation about Israel's right to exist as an aberration from the overall tone, introduced at a climactic moment for effect, rather than symptomatic of the entire work.
In terms of hard factual inaccuracies, the phrase 'she called German Jews who went to Palestine in the 1930s "Hitler Zionists"' is simply wrong, and cannot be excused as subjective interpretation. "Hitler Zionists", "yekkes" and "soap" are listed by Figes as terms used by Palestinian Jews to refer to the German Jews forced to Palestine by Hitler: they were uncomfortable with the fact that they did not share a language, and that the more cosmopolitan Germans were not part of the settler movement out of choice. Eva Figes is writing against these terms, which she had encountered as part of her research - using them as an admittedly crude counter-argument to Israel's current ideological interpretation of the holocaust and the political power which that lends them internationally.
Apologies for the lengthy email. I don't expect anything to be rectified as I realise that most of my points could be considered "my point of view". It would be incredibly appreciated, however, if you at least did something about the glaring error regarding "Hitler Zionists". You are essentially calling a recently deceased woman an anti-semite by attributing to her words which she was herself condemning. Ironically, it is this kind of irresponsible conflation of criticism of Israel with anti-semitism that Eva Figes was warning about.
As a final note I think I should mention my favourite of Figes' novels, 1969's Konek Landing - an experimental work that very subtly investigates the post-war European landscape from the perspective of a stateless Jewish boy. It is a work of incredible beauty and compassion that has never properly been identified as part of the holocaust writing canon, although Figes mentioned in numerous places that she considered it her most important novel. You are of course free to publish whatever obituary you wish, but I would suggest that the inclusion of a novelist's most important novel would not only be of worth from the standpoint of the general reader but may also help to counterbalance any nasty insinuations made by a journalist who does not seem familiar enough with her work to offer criticisms in a responsible manner.
Thanks for taking the time to read this if you have reached this point. I wouldn't blame you if you gave up after the bullet points! I really appreciate the fact that the Guardian published an obit as I believe Eva Figes to be unfairly neglected as far as modern British writers go, and anything that may inspire someone to pick up one of her books is great by me. I am also sure that Eva Tucker believed wholeheartedly that her apology for Journey to Nowhere was the right thing to do when defending Eva Figes as a writer. I only wished to have it noted that I respectfully disagree.
(Quick comment prior to this blog post - I wrote this during the heat of the riots and the vicious backlash. I did not place it upon my blog at that time due to a genuine fear that the publication of a dissenting voice may lead to state persecution, if not arrest. These fears remain, yet my cowardice doesn't. I apologise for such gross hypocrisy and also for the hyperbole here, but these are times wherein such things are rife - this is, however, no excuse.)
I am deeply shocked and morally sickened by this country following yesterday’s riots. Yet it is not the criminals that elicit this response in me but you, the British people. A more bloodthirsty gnashing of hysterical teeth I could not imagine from this nation that so often praises its own liberalism. Previously sane people are now calling for martial law, suspension of communications, rubber bullets and water-cannons, curfews, an end to human rights, a restoration of the death penalty, bringing back the cane, an outright genocide aimed at wiping out the entire working class. Yes, some of these are meant more seriously than others, but not one of these things should ever be meant any way but ironically by anyone with an ounce of sense and humanity. We should look on these things as the backward, totalitarian methods that they are. We should consider them on a par with thumbscrews – which, judging by the internet and the radio, some of you are probably in favour of, you Nazi bastards.
Wow, that’s a lot of hate in that paragraph. Well let me tell you that it’s a tense time for the person seeking freedom right now. Here’s a handy set of reasons why it’s not a good idea to demand the state conduct violence upon your behalf:
The moment the state operates it sets a precedent. Today it may be rioters who are shot with rubber bullets and hosed down with water-cannons (both weapons that can kill and often maim very seriously) but tomorrow it will be protesters. They will be trying to stop a war, or protect equal rights, or perhaps even complain about police shooting someone dead, and they will be treated with as much violence as you would wish upon these rioters.
Once a liberty goes, it goes forever. It will not come back for many, many years and most likely it will take numerous deaths to get it back. Our freedoms are considerable (not perfect, however) in this country, yet each has been won through generations of suffering and struggle. What may seem a justifiable action now in emergency circumstances will erode a right that it will be incredibly difficult to win back. The basic rights to communication, fair trials, free association, free speech, free movement – these will go the moment a curfew or some other draconian measure is enforced. If you wish these stripped from other people, you do not deserve them yourself.
If the police are not accountable to law then they are no better than the criminals they arrest. We, as a society, employ police so that they can exert the amount of force our laws have seen fit to grant them in order to maintain those same laws. A member of the police does not act as a free individual but as an organ of the state. To respond to extreme violence with extreme violence delegitimises power – it is a declaration of a fascist state, and fighting against such a state is in fact the only moral choice. According to many comments I’ve seen, a lot of people would welcome this state of lawless oppression. Just for this instance, you say? Oh, my wonderful imaginary reader, once you grant the police power, they will use it to its fullest extent – it is inherent in their nature.
But what good are these liberties? They don’t defend us from crime! No. They don’t. Yet they determine what crime is, and how we deal with it. In doing so, they determine what is not a crime – and what is not a crime is what makes up society. Like the prime minister of Norway, we must now ask for more democracy and more freedom, not less. We must refuse to move an inch when it comes to freedom – and we must demand more of it. Otherwise, the next person to be “mown down” or “strung up” will be you.
You don’t have to be a Marxist to see that poverty caused these riots. Why would anyone do this unless they felt as if they had no place within society? How could they? It is inconceivable. That doesn’t make them right – it makes them very wrong in fact – but it does prove what decreasing our freedom leads to. These people are crushed beneath the weight of poverty and can only see a way of getting free through operating outside society, as criminals. They have as little respect for their own lives as they do for others’. How could someone consider themselves so worthless? Only if all potential is lost, when they are a slave to conditions set entirely against them.
This is why we must demand more freedoms – freedom from poverty and disease, the rights to work, housing, food, healthcare, education, direct democracy and democracy within the workplace. Only by fighting for these freedoms can we bring an end to oppression, both physical and mental, and finally live in peace together. For until we have overcome jealousy and anger then we shall never conquer hate. Until we are all free to live in society then there shall always be those outside it, attacking it, and making life worse for all of us.
(Both things I’ve seen seriously posted online by serious people I used to respect.)
So it comes around to that time of year again. Time of our own traditions, those inherited, and those mashed together over countless generations down shifting cycles of the human unconscious. Yes, the mass of Christ – the weight of his sufferings and the density of his flesh. Nom-noms.
In my nightly wanderings around the works of Frazer I’ve come across many an interesting scene that will perhaps add weight to last year’s Jungian ramblings. May the myth of Persephone serve as an example. Ovid’s version, if failing memory serves me, depicts the abduction of a daughter from a mother and the ensuing custody battle. Frazer expands on this by drawing on the “eastern” version of the myth in which two lovers are separated. Being a season-based myth, the emphasis is thus moved from the nurturing vision of mother-earth to the impregnating actions of the seed-sower. What interested me, however, was that both versions utilised the pomegranate as a more-or-less arbitrary symbol: it both grew in the underworld and represented through seeds the many hungry days of winter. The version taught to me both at school and, more memorably, by my grandmother instead stated that it was Persephone’s eating of the pomegranate that doomed her to life in the underworld. In the land of Milton then, it would be fair to say that Eve’s original sin has jumped on an otherwise Hellenic bandwagon. Or sleigh, as they’d have had back then…
Forgive me, my comely and insatiable imaginary reader, for the roundabout way I’m taking – I only wish to demonstrate how imprecise and over-presumptive my method is here. You see, Christmas is a time of baffling complexity in terms of its symbolism (if it can even be called that). The Christian nativity is second only to the book “Revelation”, or maybe “Ezekiel”, as far as Biblical weirdness goes. The three (important kabbalistic/Pythagorean number) magi standing beside shepherds has some Christian merit I guess; “tending of the flock” being a central doctrine, as well as its sense of egalitarianism. That hardly provides an overt explanation for them though.
Once I return to Frazer and read of the Thracian celebrations of Dionysus – almost identical to the nativity in terms of setting, time of year, and dramatis personae – a consistent theme seems to emerge. The shepherds tend their flocks that they may be harvested. One magus brings myrrh; an embalming resin. The stable is depicted traditionally as filled with animals, many of which serve only as food, with the babe placed in the manger, or feeding trough. In essence, all the signs point to the inevitable death and consumption of the newborn. The Theban Dionysian ritual sees an old man become a cow that it might be torn apart to reveal an infant. The Christian ritual of eating body and drinking blood is perhaps tamer, but only as it happens so regularly and, for Protestants at least, is reduced to symbol.
Both myrrh and frankincense, as resins, are also the blood of trees. Yet the tree is traditionally an element of springtime symbolism, e.g. the maypole. Here the Thracian myth goes whole-hog and employs a giant wicker phallus, but let’s not get dragged into Freud. I think the key issue here is one aptly demonstrated by the Tate’s new tree: the tree is always an evergreen. Where spring is happily to be spotted in the greening of an oak, so is winter to be bravely weathered by the pine. Although usually depicted as an ash, the life-tree Yggdrasil of Norse mythology would be well suited to this role. Standing as the eternal symbol of constant being, Yggdrasil stands for a gleaming permanence amongst the anthropophagic flux.
I’m getting pushed for time so may as well try and eke out a few more observances about the remaining elements. Now, I’ve said I wouldn’t get into Freud, but it’s in Totem and Taboo that some of the implications of Frazer are drawn out on the subject of parenthood. Why is he “Father” Christmas? It could be argued that the term is historically a term of respect (perhaps like “grandpa” would be now, though it’d be hard not to make it sound sarcastic). If we’re talking historically, however, St. Nicholas has pretty much nothing to do with this odd character – albeit his namesake. Rather than Freud, then, let’s turn to Levi-Strauss and, less so, Frazer again. The notion of fatherhood is often ignored and sometimes not even understood by the tribespeople they studied. Women simply became pregnant and families extended through the maternal line. The father is negligible – a belief shared by the Jews and, to drag us back, perhaps then an influence on the creation of the Virgin Mary?
It’s here that I must stop myself. I fear a bottle of wine and increasing liberties regarding synchronic notions has left me pondering whether Father Christmas is actually Jesus’ dad. Perhaps a better path would be to follow the line of presents and dinner (offerings/sacrifices), or maybe some “Santa Claus” Germanic track? Maybe something to do with Coca-Cola or the fact that I’ve never really liked Christmas that much anyway…
Anyway, anyway… this has been a long one for you, my intrepid imaginary reader. You must be tired. Do come join me in bed. Or, failing that, consider that there is a strange sublimity in all man’s works, even at their most ridiculous. Surely that’s a better reason for the season? Well screw it then, I’ll return to Bakhtin…
It strikes me that people don’t grasp post-modernism. The term itself is overly laden with values, many of which conflict and most of which, without sympathetic reception, appear superficial, pompous and banal. Yet at the core of the truthless meta there is a distinct set of quasi-truths that are central to the future of social understanding. I don’t necessarily argue for them – people have been doing that since the 70s and have gotten nowhere – I simply illustrate them, highlight, foreground, and unambiguously narrativise them.
Post-modernism is perhaps best summarised by its own hyperbolic suicide note, Fukuyama’s “End of History”. Once we realise that the process of compiling history is purely speculative self-justification, then surely it must “end”? But no, I’d suggest that the more mature attitude would be one that continues regardless, a stiff-upper-lipped historical detachment that is finally prepared to approach foreign narratives; be they class-based, geographical, gendered or any of a million others. The post-modern attitude is sceptical through its fragmentation (for every Reason there are a thousand more passing as Coincidence) and believability is reduced to an aesthetic quality.
To return to my favourite novelistic example, Earthly Powers, the process of 20th-century history is monitored through a set of fictional characters. These characters remain reasonably marginal, leaving room for the introduction of real historical characters such as James Joyce, until the narrative reaches a point of relative contemporary nearness, lending a “modern” relatability, at which point a fictional character is made Pope. In this capacity, the novel seems to span both the traditional historical novel and post-modern “historical metafiction” genres; characters fit seamlessly into the gaps of history before overwhelming that “objective” history with their personal narratives.
The conflict between the presentation of history and meta-history can be seen in comparing a couple of war films; a genre almost a priori laden with bullshit. Saving Private Ryan, aiming at realism, will stand for history – Inglourious Basterds, patchwork pastiche, will stand for meta-history. In Saving Private Ryan, the rest of the Allies are famously ignored in favour of a “heroic” storyline. The heroes follow a singular line of identity-thought that posits the Self as Good, the Other as Bad, Nation as Self, Foreign as Other. Compare this to the identity relations of Inglourious Basterds: the Americans are Jewish, they play the role of the Indian (traditional enemy of Aryan American Cowboys), and fundamentally, their narratives rely upon those of Others within the film’s progression. The ending of both films is the death of Hitler – but only Tarantino’s meta-history is free to make this visible.

Perhaps the key to understanding post-modernism, then, lies in a mix of irony and obviousness. An awareness of narrative-as-narrative does not undermine its content – in the right hands it can even improve it. The South Park episodes 200 and, more obviously, 201 operate only on the meta-level. When Buddha is told to “stop doing coke in front of kids”, the joke exists only in the context; no Buddhist would complain of this joke, nor would any of the million other real people that South Park has offended during its time on air, yet the imaginary threat of Muslim “fundamentalism” is enough to get the show banned.
The conclusion of episode 201 is intentionally edited out. The traditionalist mode of exposition that South Park does so well with its “today I learned a valuable lesson…” speeches is reduced to its announcement, the rest removed under the guise of “censorship”. The incomplete content remains a completed narrative; the joke being its very incompleteness. It’s in this capacity that post-modernism completes itself. The other underlying joke of the episode – that South Park had already shown Mohammed in an earlier episode – shows how societal reception is as essential as content. The artwork exists only in the process of its reading, or watching, as an Art Event. Our Aion-Chronos constant present is the variable scope of the post-modern; post-modernism is its utilisation in the fullest sense through Events that function, rather than simply happen. Event-functions expand beyond their borders and reterritorialise other narratives, something traditional forms can only express within their own happenings whilst remaining fixed, territorialised themselves.
In the spirit of the disconnected ramble that was the previous rant, I write now on a certain synchronicity of mind. An acausal connectivity principle that flirts with Hume’s associationism before swiftly dropping back into the philosophical equivalent of Bosch’s hell. Yes, thought is an odd thing that one rarely contemplates in a spirit of problem solving; more likely our pronouncements on thought are closer to platitudes than real beliefs. Without doing a Woody Allen and “looking inside the soul of the guy sat next to me”, I can only base my thoughts on contemplation in personal experience, or more rightly experiences; my personal idea associations bearing no resemblance to linearity in either their content, my consciousness of them, or the time of their occurrence.
Contemplation itself can hardly be called a singular event – cut to a cartoon image of chin-rubbing, brow-furrowing, straining to turn on a floating lightbulb. No, much of our thought is unconscious; this is why it’s best to take breaks during periods of complex mental work. The action of “sleeping on it” is often when the great connection is made and the bridge into comprehension and understanding is quickly burned behind you, making you feel as if the problem was simple all along.
In laying our solutions over problems, in the manner of early Deleuze, we constitute the problems themselves via their solutions. Causality is projected backwards and “thought patterns” are formed. An expert on trees wouldn’t have become an expert on roots and then worked his way up to leaves. As more knowledge is added, it lays itself over previous knowledge, obscuring our memory of the chance accumulations that are inherent in learning and replacing them with a causal, linear narrative.
Yet these “patterns”, as selections made through judgements of meaning-value, must surely occur throughout all thought then, not simply education? In Vonnegut’s masterpiece Slaughterhouse-Five, a scene in which three Germans stare, slack-jawed, out across the annihilated city of Dresden gives Billy Pilgrim, the main character, the uncanny sensation of there being something missing. Many years later, on the day of his marriage, he hears a trio of barbershop singers whose song is retroactively placed in the mouths of the three Germans, completing the image. The acausal connectivity of thought is here traced via meaning-value again, yet arbitrarily, between decades. The fragments of the past that unite with the future are the essence of these patterns (be they conscious or not), and their meaning-value is embedded in the sensation of recurrence.
Marcus Aurelius describes the existence of all times in one moment, Nietzsche the finite universe that must “eternally return” in its vastness, Deleuze’s “Aion” places all pasts and futures in the present – all these theorists effectively do the same thing; posit meaning as the circular time of recurrence. I suffer from a similar mental tic, although rather than philosophise, I tend to distrust it. You see, when I remember past events, regardless of their importance or when they occurred, I keep experiencing the sense of having done them before I did them. Something I experienced for the first time (something I could never have experienced before) is remembered as being a repetition. The feeling is that of déjà vu – except it isn’t so at the time, only afterwards in memory.
As I said, I refuse to be Woody Allen, so here I will intrepidly utilise my subjective mental obscurities to suggest to you, my lovely invisible reader, that the sensation of recurrence is the Deleuzean double present (Aion-Chronos), hotwired by chance. Our thought becomes like our face when we see it from an odd angle in a photograph; the mirror-face that we carry as the image of ourselves is presented from a new angle, a photo-face that occurs simultaneously with the mirror-face without changing or destroying it, despite its difference (its difference being that it is exactly the same). The seamless process of problem-solution “patterns” is confused, split and doubled, revealing products of the mind’s fractal workings in a self-contained recurrence. Recurrence is not repetition, but a meaning-connection/selection essentially acausal.
This is the reason that thought, in itself, doesn’t make “thinkers”. “E=mc²” could have been daydreamed by a shoeshine boy in 1840s London, “to be or not to be” may be a throwaway comment made in 700 years’ time when all fiction has been long eradicated; none of it matters. Thought itself doesn’t matter, neither its content nor its action; it only functions as true meaning on a cultural level, on a level of fame, hegemony, threats and bargaining. In such a way does Bernard K. Smith (“that’s Smith with an ‘I’”) lament his greatest ideas being stolen, taken into the past and made by others, in the excellent YouTube series/channel Church of Blow.
Continuation, then, is more of a habit in life; a tendency rather than its true being. The narratives that we construct, verifiable or not, are only what I previously described as islands of “Laputan meaning”, for the reason that such is the way the human mind works. Synchronicity of mind, its endless collisions of thought – this is the nature of possible worlds now that Godfrey has popped his clogs.
Perhaps it would be fitting to end with a note on the Lamed Wufniks. These are the 36 great individuals whose influence on the human race is paramount at this time. They tend to have little power, they never know of each other, and we will never hear of them, not until the far future, if ever at all. In all likelihood, they don’t know that they are one themselves. I doubt that I’m one of them, but perhaps you are, my beautiful imaginary follower? If so, could you put in a good word for me to the future? Much obliged.
March is the cruellest month, breeding Essay deadlines, soggy weather and now some sort of not-quite cold that means I’m not ill enough to stop working, just to wish that I was. Still, I’m in credit for one blog so this one can happily constitute a quick half.
A while ago I realised that my short story “The Coming of the One Great Vision” is oddly of-its-time. The reason for this is in the opening metaphor in which I compare a “badly tuned television” to a street view on a rainy day. My initial intention was to refer to the “white noise” effect that analogue sets give out when reception is poor (the rain would distort the image and make it fuzzy) but since that time the entire country has “gone digital”. In this new context, the same metaphor now refers to the “jumps” that sometimes occur when something goes wrong with the digital input (the rain is now so distorting that what appears through it seems to jump between places in sticking, flashing images). It is still an operable metaphor – in fact it’s probably a little less clichéd than before – yet this initial revelation was nevertheless disturbing. I felt like a modernist poet writing about the wonders of aviation, only to find that it’s become passé in the time it took to publish.
There is a similar event pondered in Barnes’ Flaubert’s Parrot where a simile describes a colour as that of a certain jam. The main character goes to find this regional jam to discover what exact colour Flaubert was referencing. When he finds that it isn’t quite what he imagines it to be, he then tries to find out whether the jam has changed colours since the time when Flaubert would have eaten it. The recipe, he is told, hasn’t changed; but is that a true guarantee of its verisimilitude?
With culture being in the fluxuous state it’s in, this seems to happen all the time. Nietzsche references the “unconscious” a good decade or so before Freud and the psychoanalysts gave it its common definition; Nietzsche’s use then has nothing to do with “repressions” or “desires” but is simply a not-but-almost-conscious that the word “unconscious” would have summarised quite well before Freud. The same happens with Descartes’ philosophical “substance”, used long before Spinoza or Leibniz try to pin it down.
I also, somewhat strangely, found a case of it in a “modernised” translation of Locke’s collected writings on Politics. Where all the “thithers” had been replaced by “theres” and the “s” vs. “f” dilemmas resolved (the standard “modernising” translation from the C18th to the C21st), there remained the “divers” spelling of “diverse”. I can only assume that the translator had decided Locke was talking not of the “many and diverse beliefs of man”, but of the “jumping into water head first beliefs of man”. Perhaps he was; I certainly haven’t any evidence to the contrary…
Essentially this process of historical reinterpretation is a natural one; projecting ourselves backwards in time imaginatively is perhaps the true essence of history, even if it does lead us astray every once in a while. It’s all a matter of common sense. Much safer to assume that Homer’s “wine-dark seas” describe the opaque qualities of both wine and seas rather than, as some have suggested, that the ancient Greeks couldn’t see as many colours as we can and so thought the sea was red.
The same process goes on when Germaine Greer claims Mary Wollstonecraft as a fellow feminist, when Stalinism claimed Marx as its founder, or when “truthers” see CIA files on Bin Laden as evidence of US government complicity in 9/11. All of them look backwards after a period of immense change and forget that things were different before that change occurred.
But then history is just another form of storytelling after all, and in lining up Germaine Greer next to mass murderers and conspiracy theorists I only aim to show that there are benefits to this process as well. This morning there was a joke on TV that the “women’s section” of Leeds UFC was a kettle stood next to some pots of tea and coffee – I had to have this joke explained as without the in-built historical chauvinism necessary to unite the images, it simply appeared as fractal nonsense to me. Now that’s what humanism should call progress; progress that shouldn’t have been necessary in the first place, but still progress all the same.
An apple fell on Newton’s head, now we have a theory of gravity. Franklin flew a kite in a storm, now we have the light bulb. Darwin went to the Galapagos Islands and now we understand evolution. These sentences say a lot about how we view scientific progress. You’ll notice that in none of these sentences does the first clause claim to cause the second (through use of “therefore”, for example), yet immediately the two things are linked. The fact that the Newton-apple story was a myth made famous through Voltaire, that electricity was well known long before people like Edison utilised it, and that Darwin’s books came out many years after his Galapagos trip doesn’t really matter; what’s important is how we automatically search for an individualistic narrative to explain scientific discoveries.
But then who would want to be a scientist if they realised that the myth of the great inventor was just that, a myth? The recent film about Darwin wouldn’t have been half as interesting if it had shown him cutting up worms for twenty years with all the methodical precision of true experiment. No, the film makes him into some renegade scientist with 24 hours to save the world and, most importantly, use his super-evolutionary-theorising-powers to combat racism. It’s a lovely twist, especially for anyone that’s read The Descent of Man.
Problem is - people need reasons for things. Newton was recognised as a genius and so was his good friend Locke, but the real hero of eighteenth-century development was the Royal Society and the growing ideology of liberalism that permitted it. Yet even that reading plays along with the humanistic idea of “progress” that meets almost universal approval these days. Indeed, the idea that “more science = more good, therefore history = progression” enjoys an unquestioned status nowadays similar to the existence of God in the Middle Ages. Ironic really, since ‘humanism’ has somehow become synonymous with atheism…
Although, as I’ve written about before (maybe last week even), people need to believe in something – especially if we take “belief” in Hume’s sense as meaning the concept of “meaning”. Humanistic progression is a handy narrative for injecting meaning into all things material. Reading this blog right now you’re probably thinking that the blog form itself is a totally new medium (the public diary) dependent upon new technologies (the internet) and is thus a result of progression. But it’s not a product of progression; it’s a product of development. ‘Progression’ suggests a forward movement towards meaning and, dialectically, a phenomenology of increasing “goodness”, in the most theologically loaded way possible. ‘Development’ is self-contained and therefore references each separate ‘development’ as an individual completion of a self-imposed task. You develop a new type of sandal; if you succeed, sandal design has become ‘more developed’ through diversification, but sandals have not become ‘more progressed’, for that would insinuate a perfect sandal.
The narratives that are drawn from materiality (the sort that call themselves ‘science’ and by doing so give it a bad name) extend even to the most subjective of enigmas – the mind. Of course, it’s easy to dismiss the weekly Telegraph article that tells all the middle-class parents how their children are ‘genetically’ superior to the poor kids and how greed is good according to ‘evolution’, but it goes even further than that. The IQ test, for example, is the equivalent of judging someone’s overall athletic ability by measuring every aspect of how they play table tennis, and then inferring that if they aren’t very good at table tennis then they should go work at McDonald’s like the rest of the wrist-thickies. I’ve never taken an IQ test and never will… the project defeats itself. The greatest irony being that Einstein, someone whom all high-IQ kids are taught to worship as part of the individual worship I mentioned previously, was, yes, indeed, rather poor at academia early in life.
Basically, at heart, the entire premise is designed to return privilege to the social hierarchy in an age where the traditional aristocratic bollocks has gone from tragedy to farce. Browsing the television the other night I found a programme about a “boy genius” with a billion-point IQ who had read War and Peace at the age of ten. The genius deigned to read us some of a novel he himself was working on… obviously I was on tenterhooks to find out what this superior being had to offer a mere mortal like me. So he read it, and guess what, yes, it was shit. The kid couldn’t write for toffee and the way he spoke showed he was obviously thick too. But then he was ten, and that’s how ten-year-olds are supposed to be. He had clearly only read War and Peace because it sounded like something that a ‘smart’ person would do and his parents had been told he was ‘smart’. Interestingly though, it did almost prove a theory that William Godwin advanced in his revolutionary eighteenth-century tract An Enquiry Concerning Political Justice when he was talking of aristocratic privilege – he asked: if I see in a child the ability to become an epic poet, do I teach him how to write, or do I award him all the prizes he may one day attain and then await the masterpiece? Indeed, if a child is told that they are a genius, will they be likely to continue as they did before they knew, or, more likely, will they become a mix of pompous arrogance and overwhelmed terror at their new-found responsibility?
In conclusion then, life is meaningless, and if you are ever under any impression at all that you are special then not only are you certainly not, but you’re also a dick. Of course, if it turned out I actually had a high IQ then I’d be trapped inside my own theorising and this would have been a hopeless attempt at educating the brainless cattle that make up mankind. Within weeks I’d no doubt be part of a Plato-esque meritocracy in which the high-IQers write endless awful poetry whilst the low-IQers slave away in the acid mines until they grow plump enough to be processed into soylent green… Good thing I’m thick then.
I’ve recently been re-reading Thomson’s The Seasons, a book-length poem from the eighteenth century that does pretty much what it says on the tin. You begin in “Spring” and work your way through to “Winter” via a rambling, digressive style that replicates a pleasant meander across the countryside. As a huge fan of Wordsworth’s The Prelude, I was surprised to find that this poem was far more lyrical, even perhaps more philosophical, in its adaptation of nature and science to art.
Far stranger, though, was the fact that this poem existed long before the first “Romantic” poets came along and, from Lyrical Ballads onwards, claimed to “invent” the mode of the Egotistical Sublime. There’s even a story of Wordsworth and Coleridge, on a pub crawl, picking up a copy of The Seasons that was almost completely worn out from constant fireside reading and saying to each other, “Now there’s real fame!”
The first thing that this really demonstrated to me was the arbitrariness of literary “periods” or “schools of thought”. The cliché of viewing the Romantics as a rebellion of passion against the rationalised Enlightenment ceases to operate past anything but the surface level and does little justice to either side. It also warps our view of history; a view formed by modern assumptions that demand things be put into neat boxes so we can make some sense out of the chaos. Although of course all this labelling is inescapable… in the words of another Romantic, William Blake, “Without contraries there is no progression”.
The real issue I wanted to write about isn’t so much the labelling of periods but what that labelling implies. As I think Voltaire pointed out more eloquently than I could, when one classifies something, one inherently opposes it to everything else… “If it isn’t X, then it must be Y”. For this reason, The Seasons can’t be a “Romantic” poem, because it’s too early chronologically, but it isn’t as rigidly “Augustan” as an Alexander Pope poem either, so it ends up an anomaly that the labels can’t accommodate. The last edition was published in 1975, and it cost me £25 to buy a copy online - evidence of how being left out of the periodised canon of literature has relegated this brilliant poem to something now read only by “specialists”.
I’m not recommending here that everybody who reads this should rush out and buy a copy of The Seasons, I’m just highlighting one of the sad facts about literary canons in general. In its day The Seasons went through literally hundreds of editions, carrying on well into the Victorian era… The Prelude struggled to sell 500 copies on first publication, yet which is now available in Penguin Classics? It’s not just Thomson’s poem either; there are hundreds of these works, hugely popular in their era and near-forgotten today, and mostly it’s down to the very academics studying and preserving literature that this happens!
I’m afraid this may have sounded like a rant but I assure you it’s not. If anything it’s rather funny. I don’t presume to condemn the entirety of literature studies off the back of a few books. It is all rather interesting though…
…oh, and speaking of Thomson, who do you think inspired Vivaldi’s Four Seasons?