On my occasional return trips to Cambridge, I am struck by the air of doubt and decline. Oxbridge has certainly not resisted the demagogic vogue: what began as ironic self-mockery in the 1970s (“Here at King’s we have five hundred years of rules and traditions but we don’t take them very seriously, Ha! Ha!”) has become genuine confusion. The earnest self-interrogatory concern with egalitarianism that we encountered in 1966 appears to have descended into an unhealthy obsession with maintaining appearances as the sort of place that would never engage in elitist selection criteria or socially distinctive practices of any kind.
I’m not sure that there is anything to be done about this. King’s, like much else in contemporary Britain, has become a heritage site. It celebrates an inheritance of dissidence, unconvention, and unconcern for hierarchy: look at us—aren’t we different. But you cannot celebrate your qualities of uniqueness unless you have a well-grounded appreciation of what it was that gave them distinction and value. Institutions need substantive traditions and I fear that King’s—like Oxbridge at large—has lost touch with its own.
I suspect that all this began precisely in those transitional years of the mid-1960s. We, of course, understood nothing of that. We got both the traditions and the transgressions; the continuities and the change. But what we bequeathed to our successors was something far less substantial than what we ourselves had inherited (a general truth about the baby-boom generation). Liberalism and tolerance, indifference to external opinion, a prideful sense of distinction accompanying progressive political allegiances: these are manageable contradictions, but only in an institution unafraid to assert its particular form of elitism.
Universities are elitist: they are about selecting the most able cohort of a generation and educating them to their ability—breaking open the elite and making it consistently anew. Equality of opportunity and equality of outcome are not the same thing. A society divided by wealth and inheritance cannot redress this injustice by camouflaging it in educational institutions—by denying distinctions of ability or by restricting selective opportunity—while favoring a steadily widening income gap in the name of the free market. This is mere cant and hypocrisy.
In my generation we thought of ourselves as both radical and members of an elite. If this sounds incoherent, it is the incoherence of a certain liberal descent that we intuitively imbibed over the course of our college years. It is the incoherence of the patrician Keynes establishing the Royal Ballet and the Arts Council for the greater good of everyone, but ensuring that they were run by the cognoscenti. It is the incoherence of meritocracy: giving everyone a chance and then privileging the talented. It was the incoherence of my King’s and I was fortunate to have experienced it.
The recently introduced nonselective secondary schools that were soon to become universal and were intended by the Labour government of the time to replace all selective state education.
A reform school for criminal adolescents.
See Noel Annan, Our Age: English Intellectuals Between the World Wars—A Group Portrait (Random House, 1990), an uncommonly self-confident account of a generation not yet stricken by self-questioning.
Anthony Grafton, “Britain: The Disgrace of the Universities,” The New York Review, April 8, 2010.
XVII
Words
I was raised on words. They tumbled off the kitchen table onto the floor where I sat: grandfather, uncles, and refugees flung Russian, Polish, Yiddish, French, and what passed for English at one another in a competitive cascade of assertion and interrogation. Sententious flotsam from the Edwardian-era Socialist Party of Great Britain hung around our kitchen promoting the True Cause. I spent long, happy hours listening to Central European autodidacts arguing deep into the night: Marxismus, Zionismus, Socialismus. Talking, it seemed to me, was the point of adult existence. I have never lost that sense.
In my turn—and to find my place—I too talked. For party pieces I would remember words, perform them, translate them. “Ooh, he’ll be a lawyer,” they’d say. “He’ll charm the birds off the trees”: something I attempted fruitlessly in parks for a while before applying the admonition in its Cockney usage to no greater effect during my adolescent years. By then I had graduated from the intensity of polyglot exchanges to the cooler elegance of BBC English.
The 1950s—when I attended elementary school—were a rule-bound age in the teaching and use of the English language. We were instructed in the unacceptability of even the most minor syntactical transgression. “Good” English was at its peak. Thanks to BBC radio and cinema newsreels, there were nationally accepted norms for proper speech; the authority of class and region determined not just how you said things but the kind of things it was appropriate to say. “Accents” abounded (my own included), but were ranked according to respectability: typically a function of social standing and geographical distance from London.
I was seduced by the sheen of English prose at its evanescent apogee. This was the age of mass literacy whose decline Richard Hoggart anticipated in his elegiac essay The Uses of Literacy (1957). A literature of protest and revolt was rising through the culture. From Lucky Jim through Look Back in Anger, and on to the “kitchen sink” dramas of the end of the decade, the class-bound frontiers of suffocating respectability and “proper” speech were under attack. But the barbarians themselves, in their assaults on the heritage, resorted to the perfected cadences of received English: it never occurred to me, reading them, that in order to rebel one must dispense with good form.
By the time I reached college, words were my “thing.” As one teacher equivocally observed, I had the talents of a “silver-tongued orator”—combining (as I fondly assured myself) the inherited confidence of the milieu with the critical edge of the outsider. Oxbridge tutorials reward the verbally felicitous student: the neo-Socratic style (“why did you write this?” “what did you mean by it?”) invites the solitary recipient to explain himself at length, while implicitly disadvantaging the shy, reflective undergraduate who would prefer to retreat to the back of a seminar. My self-serving faith in articulacy was reinforced: it seemed not merely evidence of intelligence but intelligence itself.
Did it occur to me that the silence of the teacher in this pedagogical setting was crucial? Certainly silence was something at which I was never adept, whether as student or teacher. Some of my most impressive colleagues over the years have been withdrawn to the point of inarticulacy in debates and even conversation, thinking with deliberation before committing themselves. I have envied them this self-restraint.
Articulacy is typically regarded as an aggressive talent. But for me its functions were substantively defensive: rhetorical flexibility allows for a certain feigned closeness—conveying proximity while maintaining distance. That is what actors do—but the world is not really a stage and there is something artificial in the exercise: one sees it in the current US president. I too have marshaled language to fend off intimacy—which perhaps explains a romantic penchant for Protestants and Native Americans, reticent cultures both.
In matters of language, of course, outsiders are frequently deceived: I recall a senior American partner at the consulting firm McKinsey once explaining to me that in the early days of their recruitment in England he found it nearly impossible to choose young associates—everyone seemed so articulate, the analyses tripping off their pens. How could you tell who was smart and who was merely polished?
Words may deceive—mischievous and untrustworthy. I remember being spellbound by the fantasy history of the Soviet Union woven in his Trevelyan Lectures at Cambridge by the elderly Trotskyist Isaac Deutscher (published in 1967 under the title The Unfinished Revolution: Russia 1917-1967). The form so elegantly transcended the content that we accepted the latter on trust: detoxification took a while. Sheer rhetorical facility, whatever its appeal, need not denote originality and depth of content.
All the same, inarticulacy surely suggests a shortcoming of thought. This idea will sound odd to a generation praised for what they are trying to say rather than the thing said. Articulacy itself became an object of suspicion in the 1970s: the retreat from “form” favored uncritical approbation of mere “self-expression,” above all in the classroom. But it is one thing to encourage students to express their opinions freely and to take care not to crush these under the weight of prematurely imposed authority. It is quite another for teachers to retreat from formal criticism in the hope that the freedom thereby accorded will favor independent thought: “Don’t worry how you say it, it’s the ideas that count.”
Forty years on from the 1960s, there are not many instructors left with the self-confidence (or the training) to pounce on infelicitous expression and explain clearly just why it inhibits intelligent reflection. The revolution of my generation played an important role in this unraveling: the priority accorded the autonomous individual in every sphere of life should not be underestimated—“doing your own thing” took protean form.
Today “natural” expression—in language as in art—is preferred to artifice. We unreflectively suppose that truth no less than beauty is conveyed more effectively thereby. Alexander Pope knew better.
For many centuries in the Western tradition, how well you expressed a position corresponded closely to the credibility of your argument. Rhetorical styles might vary from the spartan to the baroque, but style itself was never a matter of indifference. And “style” was not just a well-turned sentence: poor expression belied poor thought. Confused words suggested confused ideas at best, dissimulation at worst.
The “professionalization” of academic writing—and the self-conscious grasping of humanists for the security of “theory” and “methodology”—favors obscurantism. This has encouraged the rise of a counterfeit currency of glib “popular” articulacy: in the discipline of history this is exemplified by the ascent of the “television don,” whose appeal lies precisely in his claim to attract a mass audience in an age when fellow scholars have lost interest in communication. But whereas an earlier generation of popular scholarship distilled authorial authority into plain text, today’s “accessible” writers protrude uncomfortably into the audience’s consciousness. It is the performer, rather than the subject, to whom the audience’s attention is drawn.
Cultural insecurity begets its linguistic doppelgänger. The same is true of technical advance. In a world of Facebook, MySpace, and Twitter (not to mention texting), pithy allusion substitutes for exposition. Where once the Internet seemed an opportunity for unrestricted communication, the increasingly commercial bias of the medium—“I am what I buy”—brings impoverishment of its own. My children observe of their own generation that the communicative shorthand of their hardware has begun to seep into communication itself: “people talk like texts.”
This ought to worry us. When words lose their integrity so do the ideas they express. If we privilege personal expression over formal convention, then we are privatizing language no less than we have privatized so much else. “When I use a word,” Humpty Dumpty said, in rather a scornful tone, “it means just what I choose it to mean—neither more nor less.” “The question is,” said Alice, “whether you can make words mean so many different things.” Alice was right: the outcome is anarchy.
In “Politics and the English Language,” Orwell castigated contemporaries for using language to mystify rather than inform. His critique was directed at bad faith: people wrote poorly because they were trying to say something unclear or else deliberately prevaricating. Our problem, it seems to me, is different. Shoddy prose today bespeaks intellectual insecurity: we speak and write badly because we don’t feel confident in what we think and are reluctant to assert it unambiguously (“It’s only my opinion . . . ”). Rather than suffering from the onset of “newspeak,” we risk the rise of “nospeak.”
I am more conscious of these considerations now than at any time in the past. In the grip of a neurological disorder, I am fast losing control of words even as my relationship with the world has been reduced to them. They still form with impeccable discipline and unreduced range in the silence of my thoughts—the view from inside is as rich as ever—but I can no longer convey them with ease. Vowel sounds and sibilant consonants slide out of my mouth, shapeless and inchoate even to my close collaborator. The vocal muscle, for sixty years my reliable alter ego, is failing. Communication, performance, assertion: these are now my weakest assets. Translating being into thought, thought into words, and words into communication will soon be beyond me and I shall be confined to the rhetorical landscape of my interior reflections.
Though I am now more sympathetic to those constrained to silence, I remain contemptuous of garbled language. No longer free to exercise it myself, I appreciate more than ever how vital communication is to the republic: not just the means by which we live together but part of what living together means. The wealth of words in which I was raised was a public space in its own right—and properly preserved public spaces are what we so lack today. If words fall into disrepair, what will substitute? They are all we have.