Manifestations of Modernism
by Peter Mirus, March 1, 2005
With the recent scandals in the Church, groups have arisen to try
to address the need for Church reform. However, some proponents of
“reform” are using the sex scandals to champion Modernist causes,
demanding that the Church change both its divinely-instituted structure
and its morality to adapt to the demands of the present generation.
What Is Modernism?
Many of the proponents of “reform” in the Church are following the
Modernist blueprint. Essentially, a Modernist is one who believes that
the Church should adapt itself to the changing mindset of each
generation; that the evolution of society requires an equally evolving
system of faith and morality. But to argue that truth must remodel
itself according to the demands of society is a fallacy.
To the Modernist, revealed truth is at best a distant reality
which cannot be fully grasped by the human consciousness. As such,
Modernism is the complete antithesis of Catholicism’s primary
doctrines. If, as Modernist “wisdom” suggests, truth takes its form
only as the collective consciousness of the current era, then truth is
relative. Moreover, since truth can be grasped only in this community
form, the papacy is irrelevant. If there can be no such thing as a
known absolute truth by a single individual, and a collection of
individuals can only approximate it by common consensus, then the
concept of Magisterium, including papal infallibility, is not even a
matter of discussion.
To the modern world, and to the Christian/Catholic Modernist, a
relationship with God is something vague at best and something
completely unattainable at worst. And since moral truth is represented
best by a sort of cultural consensus, the personal pursuit of a
relationship with God is sheer hubris, and nothing more.
Modernists and Social Activism
The defining note of Modernist morality is generally social
activism in accordance with the prevalent fashionable opinions of the
current culture. This is because it is only by participation in the
community zeitgeist that the Modernist can reach a higher level of
moral consciousness.
Unfortunately, and contradictorily, in many groups committed to
social activism the participation quickly changes to dictation. By
virtue of his alliance with the prevailing cultural opinions, the
Modernist becomes convinced of certain moral certainties. Those who
don’t accept the prevailing views need to be informed of these
quasi-truths for their own benefit. Moral certitude becomes moral
superiority, and Modernists, who demand relative moral standards from
the Vatican, do not hesitate to impose definite moral (or amoral, or
immoral) standards on others.
Many mainstream Christian social activist organizations espouse a
vague worldwide common system of human values. I say “vague” because
there is little legitimate empirical evidence to support many of their
social agendas, and even less that makes philosophical sense. The means
are often specific (but misguided); the goals sound glowing but are
generally utopian and unrealizable in actual practice. Given the roots
of Modernism in the larger culture, it is no surprise that such
organizations mirror the agendas of their larger secular counterparts.
The Example of Suffering
Perhaps the mainspring of the Modernist error is a fundamental
misunderstanding of human suffering. Whether or not suffering has any
transcendent value is of paramount importance to many aspects of our lives.
The ultimate goal of all Modernist social agendas is to eliminate
suffering. After all, suffering is something we all have (some more
than others) and something none of us wants. Therefore, any moral stricture
that stands in the way of this goal must be changed. If a couple is
suffering from not being able to have unrestricted sex, the moral law
regarding the purposes of the sexual act must be changed. If a man is
physically attracted to another man, the law prohibiting sodomy must be
changed. Again, it is not surprising that Modernists so often focus on
socio-sexual issues, given the surrounding culture’s obsession with
sex. In any case, to one of the Modernist persuasion, the sufferings
caused by the clergy sex scandals provide opportunities neither for
holiness via suffering nor for a return to Truth (Traditional faith and
morals). Instead, we hear promotion of doctrinal changes that have
their own selfish motivations. For example, we should eliminate
clerical celibacy and the restriction of priesthood to males, accept
and encourage homosexuality, and so on.
Essentially, the proposed way to end sex abuse by the clergy is to
provide the clergy with sexual freedom – the same kind of complete
sexual freedom that the Modernist laity demands for itself. The
unacknowledged implication is that the Church is turning its
clergy into pederasts; the fact that pederasty reflects a lack of
adherence to Church teachings is not considered.
Suffering “My Way”
It is not true that Modernists want to eliminate all suffering.
Most Modernists will agree with the axiom “anything worth having is
worth suffering for.” They will suffer, and suffer gladly, and praise
suffering for the sake of any number of things: wealth, power, sex,
“freedom/empowerment”, career, sport, etc.
Hence, the theology of suffering with Christ for the
transformation of self is replaced by a willingness to suffer only to
more effectively indulge one’s passions in the long run. The
Modernist’s ultimate goal for the self is to reach a point where one
can maintain one’s own selfishness without having to suffer for it. Hence the modern
obsession with frantic wealth acquisition (during which you suffer) so
you can build a high state of earthly existence, retire early and
maintain that elevated state until your euthanasia.
The ultimate social goal is to teach others how to attain freedom,
wealth, power, and sex in roughly equal measure to your own – and hence
end suffering in the world. In the process of doing this, all people
will be welcomed into the world social community and adopt a common
cultural morality that will lead to world prosperity and peace.
The point is that to the Modernist there is no suffering that is
worth undergoing without getting some physical reward in the present.
In the realm of Modernist goods, that which is physical is most easily
believed, that which is moral is what makes everybody “feel good”, and
that which is not physical has no moral value and therefore no relevance.
Irrelevant Faith, Irrelevant Church
It is fairly apparent that to call oneself “Catholic” but to
adhere to Modernist principles is an untenable position. The Modernist
rejects several essential principles of the Catholic Faith – most
importantly the dogmatic principle, that God’s revelation is both
exceedingly specific and safeguarded by earthly successors to Christ
who, by virtue of the Holy Spirit, are able to “bind” and “loose” with
admirable precision. Thus, whenever popes teach something contrary to
the Modernist creed, they are dismissed as outdated and, therefore, irrelevant.
As with the relevance of the Church, so too with the relevance of
God Himself. God can hardly be relevant without a clearly recognizable
plan for humanity. Unless enduring, consistent truths are derived from
Revelation, there is no relevance. Yet the Modernist believes in a God
so contrary to the attribute of immutability that the Modernist faith
borders on atheism.
In fact, perhaps this best sums up the case: Whether they realize
it or not, Modernist Catholics are atheists hedging their bets.
Under the Ban: Modernism, Then and Now
by Russell Shaw
On July 3, 1907, in a decree bearing the lachrymose Latin title
Lamentabili, the Vatican's Holy Office, predecessor of today's
Congregation for the Doctrine of the Faith, condemned 65 propositions
that it had found contrary to Catholic orthodoxy. Pope Pius X followed
up two months later, on September 8, with an encyclical named Pascendi
Dominici Gregis (Feeding the Lord's Flock), in which he linked the
condemned propositions to a heresy called Modernism and went on to
identify its philosophical and theological roots. In conclusion, the
encyclical specified stern disciplinary measures for stamping out the heresy.
From the pope's point of view, he could hardly have done less. For
according to St. Pius X, who was to be
canonized in 1954, Modernism was the very "synthesis of all heresies."
Its condemnation and eradication were essential to protecting the Faith.
Still, a century later three questions about Modernism do need
answering: Did it actually exist? What was it all about? What
difference does it make?
Whether there really was something corresponding to what Pius X
called Modernism isn't so easy to say. After all, it was the pope
himself who gave Modernism its name and provided theoretical coherence
to what up till then had been a gaggle of ideas identified with a
loosely linked group of Catholic intellectuals in France, Italy, and
England. Modernism's leading figure, the Scripture scholar Alfred
Loisy, was not entirely wide of the mark when he complained that Pius X
not only had condemned Modernism but "invent[ed] the system" he condemned.
Even so, it would be foolish to dismiss Modernism as a figment of
the papal imagination. Pius X's account, as Loisy conceded, was drawn
from actual sources. These he identified as "[Maurice] Blondel and
[Lucien] Laberthonniere's philosophy of immanence . . . intimate
religious experience and moral dogmatism, into which had penetrated a
certain Kantian element . . . [George] Tyrrell's mystical theology,
which exhibited a certain Protestant individualism and illuminism," and
especially evolutionism, as it was reflected in Loisy's own
"evolutionary history of the Hebrew religion and Christianity, of
Catholic dogma, cult and constitution."
Regarding the threat that all this posed, it is necessary to begin
with some historical background.
Descartes, Kant, and Darwin
Modernism's remote origins are usually traced to Descartes and
Kant and their theories of knowledge, especially knowledge of a
religious sort. In holding that we cannot know things directly but only
as they are presented to us by our minds, Descartes and Kant inserted
radical subjectivism into the heart of the epistemological question,
including the question of what can be known about God and spiritual
realities. Shortly after Modernism's condemnation, Rev. Arthur
Vermeersch, S.J., an important Roman theologian of the day, remarked
that while in Kant "dogmas and the whole positive framework of religion
are necessary only for the childhood of humanity," Modernists went
further still and took faith to be "a matter of sentiment, a flinging
of oneself towards the Unknowable."
Beginning in the late 18th century and continuing through the
century that followed, the epistemological revolution launched by
Descartes and Kant merged with developments in archaeology, history,
and biblical study to produce a radical shift in the thinking of some
theologians. Among other things, the historical value of the Gospels
and the first five books of the Old Testament (the Pentateuch) came
under attack while in his influential book The Christian Faith,
Friedrich Schleiermacher (1768-1834) sought a foothold for belief in
the notion that the essence of religion is piety, and piety is feeling.
This is to say that to be genuinely religious it is sufficient to feel
dependent on God.
In 1859, Charles Darwin's The Origin of Species erupted on an
already shaky religious scene, spewing the ideological equivalent of
lava and ashes far and wide. T. H. Huxley, an aggressive publicist for
Darwinism, was soon exulting that one of the best things about it was
its "complete and irreconcilable antagonism to that vigorous and
consistent enemy of the highest intellectual, moral, and social life of
mankind — the Catholic Church." But strange to say, it was
Protestantism, not Catholicism, that suffered first and worst.
Attempts to salvage something from the collapse of faith reflected
in literary works like Tennyson's In Memoriam and Arnold's Dover Beach
ranged from biblical fundamentalism to liberal Protestantism. "The
Jesus of History and the Christ of Faith" became a favorite slogan of
the liberal Protestants. The catchy expression is shorthand for a
supposedly unbridgeable split between the human Jesus — a historical
figure said to have been concealed from sight by the early Christian
community's practice of shrouding Him in pious fictions — and a divine
being upon whom believers project their subjective religious impulses.
(In his new book, Jesus of Nazareth, Pope Benedict XVI calls this
history-faith dichotomy "tragic" for Christian belief. The Gospels
present "the real, 'historical' Jesus in the strict sense of the word," he writes.)
Although comparatively slow to infiltrate Catholic circles, the
new thinking began to appear there in the latter years of the 19th
century. Men like the historian Louis Duchesne, the novelist Antonio
Fogazzaro, the philosopher Blondel, and the pietistic German-British
intellectual gadfly Baron Friedrich von Hugel led the way. Not all of
them were full-fledged Modernists, and some, like Blondel, later were
genuinely horrified at being thought at odds with the Church.
Nevertheless, they exchanged ideas, encouraged one another in their
work, and formed a network that over time came to have an influence
beyond its numbers in seminary and clerical circles. Several Italian
pastoral letters warning against Modernism by name appeared in 1905 and 1906.
Loisy, Tyrrell, and Friends
The two most visible members of the group were the Frenchman Loisy
and the Irish Jesuit Tyrrell.
Alfred Loisy was born in Ambrieres, France, in 1857; studied at
the diocesan seminary of Châlons-sur-Marne; and was ordained in
1879. A gifted linguist, he taught at the Institut Catholique in Paris
until his criticism of traditional views led to his removal as
professor of Sacred Scripture; thereafter he taught at the École
Pratique des Hautes Études. His best-known work,
L'Évangile et l'Église (The Gospel and the Church),
published in 1902, was a response to the historian Adolf von Harnack,
whose liberal Protestant understanding of Scripture reduced
Christianity to a handful of basic principles. But although he wrote to
refute von Harnack, Loisy simultaneously denied that Christ meant to
establish a Church or teach a body of lasting religious truth, and
argued instead for "the incessant evolution of doctrine."
L'Évangile and his other works were placed on the Index of
Forbidden Books. Loisy refused to submit to Pascendi and was
excommunicated in 1908. He taught at the Collège de France from 1909 to
1930, and continued writing in defense of Modernism until his death in 1940.
Born in Dublin in 1861, George Tyrrell grew up an Evangelical
Christian, studied at Trinity College, and entered the Catholic Church
in 1879. He was accepted into the Jesuits the following year, ordained
in 1891, and in 1896 transferred to the Jesuits' Farm Street church in
London. There he launched his career as a writer and was introduced by
his great friend von Hugel to the work of men like Loisy and Blondel.
Tyrrell's increasingly radical critique of orthodox theology led to his
expulsion from the Jesuits in 1906. His rejection of Pascendi was
public and violently worded. Excommunicated in 1907, he received
absolution before his death in 1909, but was refused Catholic burial on
the grounds of not having retracted his heretical views. In the
posthumously published Christianity at the Cross-Roads, Tyrrell
expressed his hope that Christianity would not be the last stage of
humanity's religious quest but an intermediate stage on the way to a
new universal religion.
In fairness to Loisy, Tyrrell, and the rest, it is important to
bear in mind that they had an unobjectionable, even commendable, aim in
mind at first. In a 1908 book, Simple Reflections (i.e., on the Holy
Office decree and the papal encyclical), Loisy described himself and
the rest of "the avowed modernists" as "a fairly definite group of
thinking men united in the common desire to adapt Catholicism to the
intellectual, moral and social needs of today." Had there been no more
to it than that, Modernism might have been more deserving of praise
than condemnation. Unfortunately, there was quite a bit more.
In his carefully documented and nuanced study of Modernism,
Critics on Trial (Catholic University of America Press, 1994), Msgr.
Marvin R. O'Connell calls an anonymous document titled Il Programma di
Modernisti "perhaps the most succinct and coherent statement of the
Modernist position ever attempted." Published "virtually before the ink
on Pascendi was dry" in a forceful translation by Tyrrell, it almost
certainly was written by an Italian philosopher and editor named
Ernesto Buonaiuti. Boldly flaunting its immanentism, it contains such
declarations as: "Religious knowledge . . . is our actual experience of
the divine which works in ourselves and in the whole world," and,
"Religion is . . . the spontaneous result of irrepressible needs of
man's spirit, which find satisfaction in the inward and emotional
experience of the presence of God within us."
Of Pascendi and Lamentabili, this manifesto says: "The Church and
Society can never meet on the basis of those ideas which prevailed at
the Council of Trent . . . The Church cannot, and ought not to, pretend
that the Summa of Aquinas answers to the exigencies of religious
thought in the twentieth century." Only evolution provides a basis for
believing in "the permanence of something divine in the life of the Church."
The Condemnations of 1907
Drawing on the works of Modernists, the 65 condemned propositions
in Lamentabili include: The Church's Magisterium "cannot determine the
genuine sense" of Scripture even by dogmatic definitions (no. 4); the
Gospels contain "only a slight and uncertain trace" of Christ's
teaching (no. 15); divine revelation is only "the consciousness
acquired by man of his relation to God" (no. 20); Christ's resurrection
is "not properly a fact of the historical order, but a fact of the
purely supernatural order, neither demonstrated nor demonstrable" (no.
36); the sacraments originate not in Christ but in "the apostles and
their successors" engaged in interpreting "some idea and intention of
Christ" (no. 40); the only purpose of the sacraments is to be reminders
of "the ever beneficent presence of the Creator" (no. 41); "it was
foreign to the mind of Christ to establish a Church" (no. 52); "truth
is no more immutable than man himself" (no. 58); Catholicism "cannot be
reconciled with true science, unless it be transformed into a kind of
non-dogmatic Christianity, that is, into a broad and liberal
Protestantism" (no. 65).
Then Lamentabili states the pope's judgment: "condemned and proscribed."
Then-Rev. Joseph Ratzinger, in an essay published in 1966,
criticized some of the formulations used by the Holy Office — for
example, that found in proposition 22, which states that so-called
revealed dogmas aren't "truths fallen from heaven" but interpretations
of religious facts worked out by human effort. "It is certainly very
difficult to determine the meaning and binding force of such a
condemnation," the youngish German theologian, today known to the world
as Benedict XVI, remarked. Indeed, he went on, some of these individual
propositions, "in themselves, can have an altogether acceptable
meaning." The real significance of Lamentabili, he then declared, can
be found in "its meaning as a whole, insofar as it condemns a radically
evolutionistic and historicist tendency." This remains a useful
interpretive principle for reading the decree today.
Appearing two months later, Pius X's Pascendi Dominici Gregis is
remarkable in several ways. An extremely long document — nearly twice
the length of Pope Leo XIII's 1891 social encyclical Rerum Novarum — it
is bluntly, even harshly worded, as if its author had taken personal
offense at the ideas it condemns. But it is important to grasp that
Pascendi really is about ideas, not personalities. The Modernists
viewed Pius as an ignorant bumpkin who as pope found himself in over
his head ("a peasant of simple seminary training," von Hugel remarked
condescendingly), but his encyclical is a sophisticated analysis of Modernist thought.
A few highlights: Saying he aims to give a systematic account of
Modernism — something the Modernists themselves deliberately failed to
do — the pope situates its philosophical basis in "agnosticism." By
this, he makes it clear, he means the idea that the fundamental source
of religion is "vital immanence." In other words: "It is . . . to be
sought within man himself . . . in a need for the divine." Here, too,
in "religious consciousness," is where Modernists locate revelation. It
follows from such principles, Pius says, that dogma "not only can but
ought to be evolved and changed," and that "all religions are true" to
the extent they reflect the human psyche.
Moving to theology, Pius pinpoints key Modernist ideas. Christ did
not establish the Church and institute sacraments directly, but only in
the sense that His example inspired later Christians to do so. The
Church is "the fruit of the collective conscience" of the Christian
community. The idea that the Church's authority comes from God is
"obsolete." In order to avoid "internecine war," the Church must adopt
"democratic forms" of government. In sum: "In a religion which is
living nothing is without change, and so there must be change."
From here [the Modernists] make a step to what
is essentially the chief point in their doctrines, namely, evolution.
Dogma, then, Church, worship, the books that we revere as sacred, even
faith itself . . . must be bound by the laws of evolution . . .
The Modernists' reading of the gospels flows
naturally from their agnosticism . . .
Any divine intervention in human affairs must
be relegated to faith, as belonging to it alone. Thus, if anything
occurs consisting of a double element, divine and human, such as are
Christ, the Church, the sacraments . . . there will have to be a
division and separation, so that what was human may be assigned to
history, and what divine to faith. Thus, the distinction common among
the Modernists between the Christ of history and the Christ of faith;
the Church of history and the Church of faith; the sacraments of
history and the sacraments of faith.
Pascendi Dominici Gregis concludes by spelling out practical steps
to take: removing Modernists from seminary faculties, new censorship
norms, and the creation in every diocese of a clergy council
responsible for ferreting out nascent signs of Modernism. Three years
later the pope published the text of an oath against Modernism that
priests were required to take; it remained on the books until Pope Paul
VI did away with it in 1967.
Reaction to Pascendi and Lamentabili was mixed. Bishops generally
accepted Pius X's judgment and prepared to carry it out. Publicly at
least, the strongest words of defiance were probably those of Tyrrell
in the Times of London: "Neither the engineered enthusiasm of la bonne
presse, nor the extorted acquiescence and unanimity of a helplessly
subjugated episcopate, nor the passive submission of uncomprehending
sheeplike lay multitudes will deceive [the Modernists] into thinking
that this Encyclical comes from, or speaks to, the living heart of the
Church — the intelligent, religious-minded, truth-loving minority." The
writer André Bremond, an ex-Jesuit like Tyrrell, called this
outburst an example of his friend's "Irish frenzy."
Then something shameful happened — something that even
conservatives today regard with abhorrence. Instigated by a Vatican
official, Msgr. Umberto Benigni, a campaign began to neutralize anyone
suspected of Modernist tendencies. It was carried out by a secretive
group created by Monsignor Benigni called the Sodality of St. Pius V. A
network of spies and informers went into action in dioceses searching
out supposed Modernists and Modernist sympathizers — bishops, pastors,
professors, editors — and reporting them to the authorities.
Reputations were blackened, careers damaged, innocent people hurt. The
intellectual pogrom continued until Pope Benedict XV finally put an end
to it. But the bitter taste lingered in many mouths.
So much for Modernism and its condemnation. The question remains:
A century later, what difference does it all make?
The Modernist Legacy
In attempting to answer that, we need to begin by recognizing that
some ideas associated with Modernism are part of today's orthodox
Catholic consensus. Certainly this is true in the area of Scripture
study, where much that looked like avant-garde thinking around the turn
of the last century has come to be taken for granted. Today, for
instance, hardly any knowledgeable person would argue that Moses was
the immediate human author of the Pentateuch. Nor do Catholics have any
difficulty accepting the idea that the Bible embraces a variety of
literary forms and that the four Gospels reflect the early Christian
community's theological understanding of Christ as well as the events
of His life. In Jesus of Nazareth, Benedict XVI calls the
critical-historical method an "indispensable" tool for understanding
Scripture, though certainly not the only one.
Today, too, it is clear that although the condemnations of 1907
and the events that followed drove Modernism out of sight, Modernism
and its offshoots survived. For one thing, the Modernists and their
sympathizers kept on writing. Moreover, as Rev. — now Cardinal — Avery
Dulles, S.J., pointed out in 1971 in The Survival of Dogma: "Driven
underground but not solved by the condemnation of Modernism, the
problem of dogmatic change surfaced again in the nouvelle theologie of
the 1940s." (As the Second Vatican Council demonstrated, Newman's
principle of doctrinal "development" offers an approach to this
question whose usefulness has not been exhausted even now.)
Eventually, too, changing times and circumstances allowed
Modernism and its cousins to emerge from hiding. Modernism persisted,
says historian Philip Trower, "because the causes which had originally
brought it into existence persisted: the increasingly secularized
culture in which the bulk of Western Catholics now lived, and the
complexity of many of the questions raised by 'modern thought.'"
The case of Rev. Pierre Teilhard de Chardin, S.J., is instructive.
Forbidden to publish by his superiors, under pressure from the Holy
See, the Jesuit paleontologist continued writing, with his works
circulating in mimeographed form for years. When finally published
after his death, The Phenomenon of Man (French edition 1955,
English edition 1959) and his other books enjoyed cult success among
readers who relished their quasi-poetic, quasi-mystical,
quasi-scientific synthesis of evolutionism and Christianity. In the
1960s and 1970s, Teilhardian evolutionism occupied an important place
in the stew of ideas and sentiments that shaped the troubled reception
of Vatican Council II.
And now? Major elements of today's progressive Catholicism bear
more than a small family resemblance to things condemned by Pius X in
1907. Consider Pascendi on the Modernists' program of Church reform:
They . . . demand that history be written and
taught according to their method and modern prescriptions. Dogmas and
the evolution of the same . . . must be brought into harmony with
science and history. As regards catechesis, they demand that only those
dogmas be noted . . . which have been reformed . . . As for worship,
they say that external devotions must be reduced in number and that
steps must be taken to prevent their increase . . . They cry out that
the government of the Church must be reformed in every respect . . .
Both within and without it is to be brought in harmony with the modern
conscience . . . which tends entirely towards democracy . . . The Roman
congregations they likewise wish to be modified in the performance of
their holy duties, but especially . . . the Holy Office . . . Finally,
there are some who . . . desire the removal of holy celibacy itself
from the priesthood.
Leaving aside the pros and cons of such proposals, it's a fact
that they have been items on the progressive Catholic agenda for years.
But the Modernists of a century ago aren't the only ones with
counterparts now. The integralists who launched a witch hunt after the
publication of Pascendi and Lamentabili have successors, too. These
include people who believe — or at least strongly suspect — that the
Church hasn't had a real pope since 1958, when Pius XII died
(sedevacantists, they're called), who hold that Vatican II wasn't a
valid council of the Church, and who at this very moment are very
likely somewhere on the Internet trying to make the case that Benedict
XVI is a Modernist. Just like Loisy and Tyrrell, Benigni also has his heirs.
In summing up Modernism and its legacy, however, it's only fair to
give the last word to Loisy: "The Catholicism of the pope being neither
reformable nor acceptable," the excommunicated ex-priest and Modernist
luminary wrote in 1931, "another Catholicism will have to come into
being, a humane Catholicism, in no way conditioned by the pontifical
institution or the traditional forms of Roman Catholicism." A century
after Modernism was condemned, Loisy's successors are still working
hard to bring that about.
Russell Shaw, a Crisis contributing editor, is a writer and
journalist based in Washington, D.C. His most recent book, written with
Rev. C. John McCloskey III, is Good News, Bad News: Evangelization,
Conversion, and the Crisis of Faith (Ignatius Press).
Harry Potter: Death stalks the halls of Hogwarts
Written by Joe Woodard
Friday, 13 April 2007
The tragedy to be unveiled in the last Harry Potter is a mirror
for our age.
Harry Potter and the Deathly Hallows is scheduled for release
July 21. And barring possible plot surprises, heroic Harry is doomed to
die in this seventh and last book of J.K. Rowling’s hugely popular teen
sorcerer series. He will follow wise and self-sacrificing Hogwarts
School of Witchcraft and Wizardry headmaster Albus Dumbledore and a
half-dozen fellow students into some vague though presumably
comfortable afterlife, apparently as a disembodied spirit.
Given that the Potter books now rank second only to the Bible in
their popularity, what are we to make of Harry’s pending death?
Boasting solid five-star Amazon ratings and over 300 million
sales, Potter is a clear symptom of Western civilisation's slow slide
back into naturalistic mythic paganism. Despite our electronic heart
monitors and computerised intravenous drips, modern technological
optimism is finally colliding with the unavoidable reality of death. In
a banal mockery of Nietzsche’s "Eternal Recurrence," Western
civilisation is reverting to an epoch of tragedy, a worldview that
virtually defined the Ancient Greeks and Romans -- and which they then
rejected some 1,500 years ago, voting with their feet in favour of the Christian faith.
The Potter books encapsulate three cultural temptations that have
undercut the once Christian West ever since the philosophers of the
17th century Enlightenment launched their insurgency against
Christendom. In historical order, those trends are: first, the
reduction of human reason to mere practical technique or
"problem-solving"; second, the rejection of rational metaphysics or
theology in favour of self-conscious myth-making (now glorified as
post-modernism); and now, last and most clearly with Harry’s death, the
slowly-dawning realisation that human mortality still punctures all of
our idiosyncratic "realities" and renders human technology (even
genetic engineering and sorcery) mere distraction and vanity.
Harry’s education at Hogwarts rivals modern medical schools in its
philistine pragmatism. Whether studying spells and potions, dark arts
or magical beasts, the sorcery students learn only how to "do" things,
like flying on brooms, de-gnoming gardens or creating gluttonous
feasts. Magic is just another craft. What they should "be", what sort
of character they should cultivate, never becomes a topic of
instruction or conversation. Harry is encouraged only to be true to
himself. And one of the four school "houses," Slytherin, is explicitly
dedicated to the nasty kids, presumably because that’s just the way
they are, and they have a right to an education sharpening their nasty talents.
It’s unclear whether Rowling is deliberately parodying modern
"self-affirming" schooling here. But the pedigree of her stunted
understanding of education and human reason includes the likes of
Enlightenment philosophers Spinoza, Descartes, Bacon and Locke. In
their quarrel with Ancient metaphysics and Christian theology, early
modern philosophers sought to harness reason to the "relief of the
estate of man" and the creation of a "heaven on earth" through
technology. So they rejected any sort of metaphysical speculation, and with it the contemplative intellect, as essentially useless, asserting (in Thomas Hobbes’s words), "We know only what we make."
Whatever the differences among the Enlightenment savants, they
agreed that reason is not a mirror of an independent reality, mundane
and divine, to which human beings must conform themselves. Rather, they
redefined reason as a human construct, obedient to human purposes. Yet
any definition of those purposes, beyond the endless increase in human
powers, has remained up for grabs.
The result of this philosophic lobotomy we see today in a medical
profession fully committed to expanding its techniques, but oblivious
to any distinction between its legitimate and illegitimate purposes. We
see it in accountants and engineers who work themselves to death,
because doing is all they know, because no one has taught them that
happiness is found in contemplation and worship. And we see it in the
Hogwarts (and Springfield Elementary) school faculties, dedicated to
empowering students, but deliberately recusing themselves from training
characters in righteousness and nobility.
The modern technological ambition to reconstruct both material and
human nature has naturally culminated in the post-modern presumption
that we can all construct our own personal, virtual realities. In
contrast, the claims of Christendom stood or fell on issues of
historical fact, like whether that tomb was really empty. But these
days, we’ll deliberately commit to any likely story that will
temporarily make us feel good.
In this context, author Rowling is symptomatically post-modern,
not in the obvious fact that she is creating a new myth (as did
Tolkien), but in her blithe assumption that whatever reality lurks
behind the mythic is basically benign. For all the murder and
soul-sucking in the Potter books, Rowling hardly pokes at all into
questions of what lies beyond the veil. Spirits haunting Hogwarts, like
Nearly Headless Nick and the Fat Friar, provide reassurance of some
sort of commodious afterlife, despite the cutthroat will to power in
this life, so it really doesn’t matter who’s won when the whistle blows.
Modernity’s Achilles’ heel
And yet… and yet, death remains a problem -- a serpent Rowling has
not avoided but rather tried to domesticate. And the viper cannot long
imitate the garter snake. The culture of ancient Greece and Rome, the
world of Homer, Sophocles and Virgil (and most of the world besides),
was virtually defined by its awareness that human beings would always
strive for a nobility rendered ephemeral and pointless by their
mortality, and the more noble the human, the more tragic the death.
Life itself is the undeserved misfortune suffered by noble characters
-- the classic definition of tragedy.
For this tragic epoch, the Good News of the Christian Gospel (as
pundit Chesterton said) was original sin, the revelation that life
wasn’t pointless cruelty, that the universe wasn’t stacked against man,
but rather that man was simply his own worst enemy. Conjoined with the
promise of the "resurrection of the flesh" and eternal life, this meant
that life was basically the undeserved good fortune enjoyed by ignoble
characters -- the very definition of comedy. So Christendom was
expressed in the farces of Dante, Chaucer and Cervantes. And the joys
of contemplation were opened to the meanest intellects in the church’s
endless parade of feastdays.
After a thousand years of Christendom, however, the insurgents of
the Enlightenment found the idleness of worship and the reign of
clerics an affront to human pride. They believed that unleashing all
the potential of human technology alone would render mankind healthy,
wealthy and wise. Some thought, with Hobbes, that life made commodious
and safe would become reconciled to quiet death in old age. Others,
with Descartes, believed that the development of medical technology
would bring practical physical immortality. Either way, man the worker
would emerge as the happy master of his own house.
It hasn’t turned out that way, of course. First, the modern
obsession with conquering human suffering has made Western man
pathologically soft and sensitive, discombobulated by daily irritants
our grandfathers would have simply ignored. Second -- confirming the
Christian hypothesis of original sin -- the expansion of man’s power
over nature has meant (as others observed) the expansion of some men’s
power over other men. Given today’s malignant public administration,
economic interdependency and mass media, almost no one now pretends to
be the master of his own house.
And third, technology itself has developed a credibility bubble;
its promises of happiness have outstripped its delivery, and with every
further development of medicine, death looms larger as the final
frontier -- unknowable, implacable and unavoidable. So the last man’s
ideal life has become perfect fitness until 75 or 85, then a little
poison for a comfortable death. And to this he dedicates life-coaching,
organic cooking and treadmilling.
Colliding with the inevitable
This is where Harry’s death comes in, as yet another symptom (like
Columbine High) of where we’re heading. It took 400 years for the
Enlightenment buzzards to roost. For four centuries, Western pragmatism
has coasted on its reserves of Christian optimism. But the tipping
point was reached when the sexual revolution threw off the last of
Christian "oppression", and then raised a generation with few illusions left.
Kids today have far fewer self-serving illusions than their baby
boomer parents. Death has always been the staple of adolescent
literature; but today the hero dies. So they can again understand
Achilles’s complaint, "Do you not see what a man I am? How huge? How
splendid?… Yet even I also have my death and strong destiny; there
shall be a dawn or afternoon or noontime when some man in the fight
will take the life from me also."
So there is a silver lining to the pagan cloud descending over the
land. Modernity was a kind of naïve vanity, predicated on an
immature bracketing of the big questions of life -- like the
businessman who resolves to spend time with his family once his bundle
is made. But kids now are realising that, even if you’re a
technological wizard, you still die in the end. Culturally they feel
the heart flutter, the shooting pain down the left arm, the memento
mori. The now-manifest spiritual vacuity of the pragmatic epoch means
they’re now open to something, almost anything.
Joe Woodard is former editor of the Canadian conservative magazine
Western Standard, now teaching in Calgary.
Unprotected: students exposed to disease and heartache
By Jennifer Roback Morse
Wednesday, 10 January 2007
A campus psychiatrist is driven to write about the way students'
bodies and souls are sacrificed to the sexual ideology reigning on campus.
It is a continuing mystery how advanced Western societies
can, with a straight face, declare that trans fats should be banned (as
in New York City) but at the same time, ignore the health risks
associated with non-monogamous sexual activity. Finally, someone with
authority dares to speak out. Her name is Dr Miriam Grossman but she
called herself Dr Anonymous when she wrote Unprotected: A Campus
Psychiatrist Reveals How Political Correctness in her Profession
Endangers Every Student. As a psychiatrist at the University of
California, Los Angeles (UCLA), she has treated thousands of college
students over the past ten years. If you have a loved one in college,
you owe it to them to read Unprotected to find out what is really going on.
Adults think they are teaching the young to be non-judgmental, but
this translates into the young having no basis for making judgements
about what is good for them. Although there is plenty of evidence that
sex without commitment is emotionally and physically harmful, this
evidence is carefully concealed from the young. So even while they are
told to make their own decisions, the adults around them systematically
understate the harms of non-marital sex. The author is especially
effective because she dramatizes general points with the stories of
particular individual students who typify a problem.
She tells of Brian, a gay student who came to her because he
wanted medication to help him stop smoking. During the course of the
session it transpired that he and his boyfriend often pick up other
men. “It’s hard to be monogamous,” he explained. Neither Brian nor his
boyfriend uses condoms for protection. Neither has ever been tested for HIV.
The author reviews her responsibilities toward patients suspected
of having tuberculosis. The law expects the doctor to test students
at-risk of TB. If the skin test is positive, she is required to give
him a chest X-ray. If the combination of skin test and chest X-ray
point to TB, the doctor is required to report him to the Department of
Health within a day. Yet for students at-risk for HIV, she can only
recommend testing and discourage unsafe activities. A man from Mars
would conclude that we are more concerned about the health of TB
patients than of HIV patients.
A student named Heather is referred for unexplained depression.
After discarding numerous possible explanations, including academic
pressure, poor health, death of a pet, the doctor asks Heather whether
she has had any changes in her relationships. Heather thinks it over,
“Well, I can think of one thing: since Thanksgiving, I’ve had a ‘friend
with benefits.’ And actually I’m kind of confused about that... I want
to spend more time with him, and do stuff like go shopping or see a
movie. That would make it a friendship for me. But he says no, because
if we do those things, then in his opinion we’d have a relationship–
and that’s more than he wants. And I’m confused because it seems like I
don’t get the ‘friend’ part, but he still gets the ‘benefits'.”
The author recounts the evidence that sexually active teenage
girls are about three times more likely to be depressed and to have
attempted suicide than girls who were not sexually active. She also
recounts the evidence that women’s physiology creates this
vulnerability. Women secrete a hormone called oxytocin during sexual
activity, and while nursing a baby. Oxytocin promotes bonding, trust
and relaxation. Mother Nature evidently is trying to get us to connect
with our babies, and with our sex partners, who after all, might become
the father to our children.
Oxytocin recently made an appearance in American politics. George
Bush’s appointment to the Office of Population Affairs actually
believes in abstinence. The Life-Style Left discovered that Dr Eric
Keroack had once given a lecture in which he informed people about the
bonding power of oxytocin. They went apoplectic, rather than confront
the evidence on its own terms.
This refusal to face inconvenient facts cries out for explanation.
One of the author's patients asked her, “Why, Doctor, do they tell you
how to protect your body from herpes and pregnancy, but they don’t tell
you what it does to your heart?”
I have my own theory about this, which is entirely complementary to
the author's experience. Far from being sexually neutral, tolerant
and non-judgmental, the Life-Style Left subscribes to a covert
ideology. I call it Condomism. Its chief tenets are that sex is a
private recreational activity with no moral or social significance.
Unlimited sexual activity is an entitlement. There are no harms
associated with sex that cannot be controlled by condoms or other forms of contraception.
And if anyone complains about anything that can’t be controlled by
condoms, well, those complaints are not worth taking seriously. Getting
attached to inappropriate sex partners? Never happens. Women’s
depression associated with uncommitted sex? Must be bad data. Post
Traumatic Stress Disorder associated with abortion? A mere blip in the
data, even though the author's back-of-the-envelope calculations show
that if a mere 1 per cent of post-abortive women develop PTSD symptoms,
that amounts to 420,000 traumatized women. That’s a lot of women to ignore.
Unprotected is a bold and important book. Buy it. Read it. Pass it
around. You may just save someone you love a lot of heartache.
Jennifer Roback Morse, PhD, is a Senior Research Fellow in
Economics at the Acton Institute, and the author of Smart Sex: Finding
Life-long Love in a Hook-up World.
Anthropologist René Girard Foresees a Christian Renaissance
"Ideologies Are Virtually Deceased," Says René Girard
ROME, DEC. 17, 2006 (Zenit.org).- French anthropologist René
Girard, one of the most influential intellectuals of contemporary
culture, thinks that a Christian Renaissance lies ahead.
In a book published recently in Italian, "Verità o fede debole.
Dialogo su cristianesimo e relativismo" (Truth or Weak Faith: Dialogue
on Christianity and Relativism), the anthropologist states that "we
will live in a world that will seem and be as Christian as today it seems scientific."
Girard, recently elected to be one of the 40 "immortals" of the French
Academy, said: "I believe we are on the eve of a revolution in our
culture that will go beyond any expectation, and that the world is
heading toward a change in respect of which the Renaissance will seem like nothing."
The text, published by Transeuropa, is the result of 10 years of
meetings between the
French thinker and Italian professor Gianni Vattimo, theorist of
so-called weak thought, on topics such as faith, secularism, Christian
roots, the role of the Gospel message in the history of humanity,
relativism, the problem of violence, and the challenge of reason.
The book presents to the general public the texts of three previously
unpublished conferences in which the two authors challenge
each other on the most radical points of their thought.
In the book, the French professor states that "religion conquers
philosophy and surpasses it. Philosophies in fact are almost dead.
Ideologies are virtually deceased; political theories are almost
altogether spent. Confidence in the fact that science can replace
religion has already been surmounted. There is in the world a new need for religion."
In regard to moral relativism, defended by Vattimo, René Girard
writes: "I cannot be a relativist" because "I think the relativism of
our time is the product of the failure of modern anthropology, of the
attempt to resolve problems linked to the diversity of human cultures.
"Anthropology has failed because it has not succeeded in understanding
different human cultures as a unitary phenomenon, and that is why we
are bogged down in relativism.
"In my opinion, Christianity proposes a solution to these problems
precisely because it demonstrates that the obstacles, the limits that
individuals put on one another serve to avoid a certain type of conflict."
The French academic continues: "If it was really understood that Christ
is the universal victim who came precisely to surmount these conflicts,
the problem would be solved."
According to the anthropologist, "Christianity is a revelation of love"
but also "a revelation of truth" because "in Christianity, truth and
love coincide and are one and the same."
The "concept of love," which in Christianity is "the defense of
the unjustly accused victim, is truth itself; it is the anthropological
truth and the Christian truth," explains Girard.
In the face of Vattimo's appeals to justify abortion and euthanasia as
well as homosexual relations, the French professor stresses that "there
is a realm of human conduct that Vattimo has not mentioned: morality."
Girard goes on to explain that "understood in the Ten Commandments is a
notion of morality," in which the notion of charity is implicit.
Girard then answers Vattimo, who suggests a "hedonist" Christianity:
"If we let ourselves go, abandoning all scruples, the possibility
exists that each one will end up doing what he wants," writes Girard.
The French anthropologist criticizes the "politically correct world,"
which considers "the Judeo-Christian tradition as the only impure
tradition, whereas all the others are exempt from any possible criticism."
Girard reminds the defenders of the politically correct that "the
Christian religion cannot even be mentioned in certain environments, or
one can speak of it only to keep it under control, to confine it,
making one believe that it is the first and only factor responsible for
the horror the present world is going through."
As regards moral nihilism, which seems to permeate modern society,
Girard concludes that "instead of approaching any form of nihilism,
stating that no truth exists as certain philosophers do," we must
"return to anthropology, to psychology and study human relations better
than we have done up to now."
Tolerance in the Rainbow Nation
By Martyn Drakard
Thursday, 07 December 2006
South Africa has become the fifth country in the world to legalise gay marriage.
On the last day of November, South Africa, the economic powerhouse
of Africa, became the fifth country in the world to legalise gay
marriage. It was a strange move for a country which, in many ways, is
intensely conservative socially.
The path to same-sex marriage began in 1994 with the raising of the
rainbow flag. The collapse of apartheid was the climax that millions of
South Africans had hardly dared dream of. But it brought about not just
a political revolution, but a social and moral one as well. The new
South Africa needed a new identity and a new style. The teaching of the
Dutch Reformed Church, with its strict insistence on predestination and
its equally strict moral code, had been discredited. Anything
"conservative" meant returning to the years of nightmare.
The 1994 constitution represented therefore an extreme reaction against
years of oppression. It banned discrimination not only on grounds of
race, but also religion, sex, ethnicity, and sexual orientation -- the
first in the world to do so. It did not take long for the issue of gay
marriage to emerge.
In July 2002, the High Court ruled that denying same sex couples the
right to marry was unconstitutional. In November 2004, the Supreme
Court of Appeal declared that marriage laws must include partners of
the same gender. Finally, the Constitutional Court ruled in December
2005 that the exclusion of same-sex marriage "represents a harsh if
oblique statement by the law that same-sex couples are outsiders and
that their need for affirmation and protection of their intimate
relations as human beings is somehow less than that of heterosexual
couples." It set down that the law had to be changed within a year.
A first draft of the new law proposed a civil union, just short of
marriage. This satisfied no one. Supporters of traditional marriage
values held massive demonstrations, while gay rights groups criticised
it for creating a kind of sexual apartheid. A second draft followed the
Constitutional Court's instructions more closely. This was passed in
the National Assembly on November 14, with 230 voting for and 41
against. Deputy President Phumzile Mlambo-Ngcuka signed it into law on
November 30. South Africa has joined the exclusive club of nations
where same-sex marriage is legal, along with the Netherlands, Belgium,
Spain, and Canada.
The rest of Africa was full of horror, not admiration, at the Rainbow
Nation's progressive stance. In Nigeria, Africa's most populous
country, the parliament is even considering measures to send anyone
connected with a same-sex marriage to jail for five years, in addition
to banning gay clubs, publicity, shows, organisation and so on. But
South Africa is quite different from the rest of Black Africa. Its
culture and ethos are oriented towards the West and Europe. Although
the blacks form the majority of the population, because of the years of
apartheid they have been sidelined economically, politically and
culturally. As a result, many Africans have interpreted the new law as
an extension of Western influence. A leading Zimbabwean historian,
Phathisa Nyathi, questioned whether it represented progress for Africa.
"The mere fact that South Africa is more gay than
other African countries, shows just how much white influence is in that
country. South Africa is certainly leading the way in Africa. But
leading where to? It is clear whose values are being promoted here. In
the First World, there is nothing wrong with being gay. It is just part
of what they call freedoms."
Other African countries are not likely to follow South Africa's lead.
African Anglicans, for instance, are so opposed to the ordination of an
American homosexual as a bishop that they may secede from the Anglican
communion. Robert Mugabe, Zimbabwe's president, caused a furore in the
West when he described gays as "worse than pigs". In most of Africa he
was applauded. Didymus Mutasa, Zimbabwe's Minister of State Security,
took time out of a meeting of defence ministers to lecture his South
African counterpart on the issue:
"Recently you passed legislation to allow men to
marry other men and women other women... I find that very difficult...
because, how can I be attracted to another man sexually? How will women
view me afterwards? Our president does not like it and when he spoke
against it, he was not speaking for himself alone but for all of us."
Amongst Muslims, attitudes are not much different. An Islamist leader
in Mogadishu, Sheikh Sharif Ahmed, described the new law as "a foreign
action imposed on Africa". A Tanzanian taxi-driver in Dar es Salaam said
said it was so immoral that it meant the world was coming to an end,
because man was going against God’s teaching.
The Bantu and Nilotic languages of East Africa do not have a word for
"homosexual". In Kiswahili (a blend of Bantu and Arabic) the word "shoga"
is the closest. But this means a catamite, and is used as a serious
insult and not in its strict sense. Until recently most Africans had
not heard or seen the word "homosexual". Although homosexuality is
being described as a taboo, taboos have a purpose, many Africans feel.
They were introduced to preserve the cohesion of a community and
prevent its moral and social disintegration; they were not meant to
prevent people from using their freedom responsibly, but rather to help
them do so.
Nonetheless, the ripples of South Africa's decision will be felt
throughout the continent. Many Christians argue that homosexuality is
un-African, not unchristian – and herein lies a danger. As African
traditional values disappear with urbanisation, leaving many culturally
adrift, moral criteria and perceptions may change.
Recently urbanised Africans, especially the younger generation, out of
touch with their roots, cultural and religious, have begun to adopt
Western ways, sometimes unconsciously. As more and more migrate to the
cities and eventually join the middle class, they become cut off from
their ancestral traditions. This could mean in future a two-tiered
society: a majority who are poor, traditional and still fighting the
elements; and a small but growing minority which comes face to face
with individualism, family break-ups and the many social challenges
associated with wealth and the erosion of the "old ways". In this South
Africa is blazing a trail, for better or for worse. The question
remains: will the rest of Africa follow, or will the African practical
and religious sense help to turn the tide?
Martyn Drakard is MercatorNet's African Contributing Editor.
Don't apologise for reading great books
By Ross Farrelly
Children find the classics relevant and interesting if they are taught well.
Parents who wish
their children to learn something of the classics of English literature
at high school must be tearing their hair out in the state of Western
Australia. It would be a travesty if school students were directed to
study the reality television show Big Brother during the precious few
hours they have in the English classroom. However the state's
curriculum council has gone one better, suggesting that students study
the ads screened during this program.
This tragic situation is a result of the post-modern view that any
text is as good as any other, that there is no absolute truth and that
no books have anything meaningful to say about the human condition.
Homer Simpson is thought to be as good as Homer’s Odyssey and students
end up wasting precious time watching the ads screened during Big
Brother when they could be studying George Orwell’s 1984, thereby
discovering the origin of the term.
In "Australia’s wackiest postmodernists" James Franklin says that it
is difficult to formulate a workable alternative to post-modernism in
academia. This may well be the case but in K-12 education it is not so difficult.
One example of an alternative to studying television ads is the
Junior Great Books program. Earlier this year I visited Chicago, the
home of the Great Books Foundation to attend a training course in this
excellent program. Since then I have been running the program at my
school and helping other teachers to trial the course in their classrooms.
The Junior Great Books program comprises a series of high quality
texts from a variety of cultures. Each story addresses a fundamental
problem of human existence in a manner which is age-appropriate and
attractive to the young reader. The students study these stories in
great depth. They listen to the teacher read the story aloud, make
directed notes on the story and write compositions based on the
linguistic features employed by the author.
The culmination of the study is a shared inquiry discussion, a
disciplined group discussion which examines a single question raised by
the text. Students are free to formulate responses but are encouraged
to cite evidence from the text to back up their arguments. This
approach assumes that the author has written something meaningful and
worthy of sustained study and that the text under discussion holds some
authority -- a view which is the antithesis of the post-modern celebration
of the death of the author.
The shared inquiry discussion is conducted according to four basic
guidelines set down by the Great Books Foundation:
Only those who have read the selection may take part in discussions.
Discussions are restricted to the selection that everyone has read.
Support for opinions should be found within the selection.
Teachers may only ask questions -- they may not answer them.
The advantage of this method of collaborative learning, especially
when the teacher rigorously follows the fourth rule of shared inquiry,
is that the students get to see a living example of sustained
intellectual curiosity. The importance of this cannot be
overemphasised. Children learn much by example. Teachers can tell their
students that books are interesting and important but nothing is more
powerful than seeing them actively engaged in careful examination of a
piece of literature, striving to find meaning in it and actively
pursuing an aspect of the story which they find personally meaningful.
This type of teaching, a form of Socratic dialogue suitably
modified to meet the needs of primary students, has the added benefit
of connecting students with this aspect of their heritage, the
rational, open minded pursuit of truth which was introduced to the West
in Classical Athens and which endures to this day.
The Junior Great Books Program comprises literature of the highest
quality. The stories are selected with several criteria in mind.
Obviously the stories must be well written. If a traditional story such
as a Grimm’s fairytale or one of Aesop’s Fables is selected, the
retelling is very carefully chosen. The story must be profound enough
to sustain at least four readings; it must be age-appropriate; and it
must deal with an issue which is relevant, interesting and meaningful.
Furthermore, the stories chosen must be somewhat ambiguous. Morality
tales do not lend themselves to lively debate while stories which can
be interpreted in a number of ways encourage students to draw divergent
conclusions and to justify their conclusions with reasoned arguments.
For example, at the beginning of a series aimed at eight and
nine-year-old children, students study The Happy Lion, by Louise Fatio,
which deals with the question of what makes a true friend, The Tale of
Squirrel Nutkin, by Beatrix Potter, which deals with one’s approach to
authority figures, How the Camel Got His Hump, by Rudyard Kipling,
which deals with one’s duty towards society and Kanga and Baby Roo Come
to the Forest and Piglet Has a Bath, by A. A. Milne, which raises the
issue of dealing with strangers. Each of these is accessible to primary
students and opens up discussions on important and profound issues
which children meet as they grow up. Studying such texts prepares them
for more difficult questions and helps make them more thoughtful, more
considerate, more humane people.
Some critics assert that literature of the type found in the
Junior Great Books program is irrelevant and boring. My experience is
otherwise. Because the texts embody fundamental questions which lie at
the heart of the human condition, students find them incredibly
relevant. It does take a certain amount of teaching skill and
enthusiasm to involve all students in the discussion -- but it is well
worth the effort.
Finally, the study of good literature refines children’s tastes.
They develop a taste for leisure reading which makes them think,
consider, reflect and reason. Big Brother and the accompanying
advertisements lose their appeal. And this is no bad thing.
Ross Farrelly is a Sydney based educator and writer.
Enter Modernism | Philip Trower
From Truth and Turmoil: The Historical Roots of the Modern Crisis in
the Catholic Church
The Bible, the Word of God in human speech, is not like a manual of
instructions - though it has often been treated like that. While most
of it is straightforward enough, there are also many passages whose
meaning is far from immediately self-evident. This is why Bible study
has a history going back to Old Testament times.
The obscurities are basically of three kinds.
The first are due to mistakes by copyists. In the transmission of the
manuscripts down the ages, the attention of the copyists sometimes
wandered, or they added comments in the margin which later became
incorporated in the text. As a result, the surviving manuscripts
contain numbers of variant readings. The kind of scholarship that tries
to determine which of these different readings comes nearest to the
original is called textual criticism. It is largely a matter of
comparing manuscripts to determine which seems most reliable. 
It is not difficult, I think, to see why God, in his providence,
allowed the texts to become corrupted in this way. Had he prevented it,
had he ensured that the thousands of copyists working over two to three
millennia had never made a mistake, the Bible would so obviously be a
work of divine origin that faith would no longer be a free act. The
variant readings are never sufficient to make the main substance of the
biblical books uncertain. They only affect particular sentences or phrases.
Obscurities of the second kind flow from the human limitations and
character traits of the inspired human authors. While ensuring that
they wrote what he wanted, God did so through the medium of their
particular personalities and styles of writing and the kinds of
literary composition characteristic of their age. Since they were
writing a long time ago, they, not surprisingly, used modes of
expression or referred to events and things sometimes beyond the
comprehension of later readers.
Difficulties arising from this second class of causes are resolved, in
so far as they can be, by the study of ancient languages, history,
archaeology, and literary forms or genres (not to be confused with
"form criticism"). Are some words to be taken literally or
metaphorically? Is a certain book or passage intended to be history in
the strict sense, or an allegory or parable, or is it some combination
of the two? The search is for what the human author intended to say and
how. This is called "the literal sense".
These first two forms of Bible study simply prepare the ground for what
in the Church's eyes has always been the most important branch; the
study of the religious significance or theological meaning of the texts.
Obscurities in this field are due to the mysterious nature of the
subject matter, or, according to St. Augustine, are deliberately put
there by the divine author himself. "The Sacred Books inspired by God
were purposely interspersed by him with difficulties both to stimulate
us to study and examine them with close attention, and also to give us
a salutary experience of the limitations of our minds and thus exercise
us in proper humility".  God does not disclose the full meaning of
what he is saying to mere cleverness or sharp wits.
Most of the problems connected with these three branches of Bible study
were familiar to the scholars of the ancient world, with the school of
Antioch concentrating on the literal meaning and those of Alexandria on
possible symbolic or "spiritual" meanings. The critical approach was
not unknown either: Origen and St. Jerome, for instance, on the basis of
internal evidence, doubted whether the Epistle to the Hebrews was
really by St. Paul.  But whatever the problems, down to 200 years
ago the end in view was always the same: to strengthen belief, deepen
understanding and increase love of God.
Since around 1800, on the other hand, "advanced" biblical scholarship
has followed a markedly different course with the precisely opposite
results. The critical method has been given pride of place over every
other approach; attention has focused on technical rather than
spiritual questions (when and in what circumstances were the books
written), with a high percentage of those trying to answer the
questions losing most of their beliefs in the process. This is a plain
historical fact which receives surprisingly little attention. Does it
mean that the Bible cannot stand up to close examination? No. We have
to distinguish between the method and the spirit in which it is used,
or between the critical method and the critical movement.
That the critical method, once formulated, would be applied to the
Bible was more or less bound to happen, but it was clearly a much more
sensitive business than applying it to other historical documents,
seeing that implicit in its use was the assumption that the origin of
at least some of the books would turn out not to be what had hitherto been believed.
The method also carries with it a number of temptations. Experts like
to exercise their skills. But if a text is the work of a single author,
without additions or interpolations and written when it was thought to
have been, there is nothing for the critic to do. The method, of its
nature, therefore carries within it a kind of bias against single
authorship. There will be a tendency to see any ancient text as
necessarily a patchwork of literary fragments put together by groups of
editors at some considerable time after the events described. This is
different from recognizing, as has always been done, that the biblical
authors, like other writers about past events, when not writing about
events they had themselves taken part in, depended on external sources.
We can see the tendency at work in 19th-century Homeric studies, where
it came to be more or less taken for granted that any work before the
fifth or sixth century B.C. must be of composite authorship. Homer's
very existence was doubted, and the authorship of the Iliad and Odyssey
assigned to a mob of Greek poets spanning several centuries. Since then
Homeric studies have changed course. A real Homer is credited with the
bulk of the epics.  But there has been no such change of course in
advanced biblical scholarship.
Another temptation will be to try to ape the exact sciences by
assigning a certainty to conclusions, which, because of the nature of
the subject matter, can only be conjectural.  Nevertheless, as we
have already said, there is nothing objectionable about the method
itself. The Church has approved it, and its use by biblical scholars
with faith and a sense of proportion has thrown light on numbers of
incidental scriptural obscurities.
The critical movement is another matter. Although forerunners like the
17th-century French Oratorian priest Richard Simon and the 18th-century
French physician Jean Astruc were Catholics, we can take as the
movement's starting point the publication of The Wolffenbuttel
Fragments (1774-1778) by the German Lutheran dramatist and writer
Lessing. The "fragments" were actually extracts from an unpublished
manuscript by the rationalist scholar Reimarus, which Lessing pretended
he had found in the royal Hanoverian library at Wolffenbuttel. A few
years later, Gottfried Eichorn, the Lutheran professor of oriental
languages at Jena (and subsequently Gottingen) published his
Introductions to the Old and New Testaments (1780-1783 and 1804-1812),
and from then on the movement was dominated by scholars whose
conclusions about the time and the way the biblical books were written
were influenced as much by philosophical assumptions and cultural
prejudices as by concrete evidence.
Their principal assumption was that supernatural phenomena like
miracles and prophecy are impossible, and therefore a large part of the
Bible must be folklore. They also tended to see people in the past as
necessarily inferior, uninterested in objective truth and incapable of
transmitting facts accurately, while regarding priests as by nature
deceitful and only interested in the maintenance of their collective
authority. Evidence that the art of writing was practised by the
Hebrews at least by the time of the Exodus, and of the capacity of
non-literate peoples to orally transmit religious traditions faithfully
over long periods of time was either downplayed or ignored.  These
assumptions had in most cases already been made before they set to work.
The Pentateuch and Gospels were the main objects of attention. The
crucial question about the composition of the Pentateuch is not "When
were the books written or put together in the form we now have them?"
but "Was the information they contain, whether recorded by Moses or
others, transmitted accurately down the centuries?"
The crucial question about the composition of the Gospels is "Were
they, or were they not, written by eye-witnesses, or by men with more
or less direct access to eye-witnesses?"
To both questions the critics' conclusions tended towards a negative answer.
If Moses existed, it was maintained, little could be known about him
except that he was neither the Pentateuch's author nor Israel's
lawgiver. The Pentateuch was put together after the Exile out of four
collections of documents and oral traditions, the earliest written four
or five hundred years after Moses' death, with the books of the Law
coming last. Deuteronomy had been composed at the time of King Josiah's
religious reform (640-609). The clergy responsible pretended they had
found the book in a part of the temple undergoing reconstruction.
Before that the Jews had no fixed laws. They lived by a shifting mass
of customary rules and regulations. Most of Leviticus, also the work of
priests, was written during and after the Exile. But in order to
convince the Jewish people that these two codes of laws were not the
innovations they must have appeared to be, the post-exilic clergy
combined them with two sets of oral and written traditions ("Yahwistic"
and "Elohistic") about the supposed early history of the world and the
Jewish people, now found in Genesis, Exodus, Numbers and Joshua.
Most of these ideas are associated with Julius Wellhausen (1844-1918).
But long before he was born, Eichorn had been suggesting that
Leviticus, for which he invented the name "priestly code", had a
different origin from the other four books of the Pentateuch, while
between 1802 and 1805, J. S. Vater had introduced the "fragment theory"
of the suspended Scottish Catholic priest, Alexander Geddes. According
to Geddes, the Pentateuch had been put together at the time of the
Exile from 39 separate sources. In 1833, E. Reuss was teaching that no
traces of the law can be found in the early prophetical and historical
writings, and that consequently the law could not have existed in the early
period of Jewish history. In a book published at Gotha in 1850, Eduard
Riehm attributed Deuteronomy to the reign of King Manasses.
It was less easy to dismiss the New Testament miracles as myths and the
Gospels as patchworks of folklore. Between the death of Christ and the
writing of the Gospels there were no long centuries during which myths
could form and orally transmitted information become garbled. The best
the critics could do was date the Gospels as long after the death of
the last eye-witnesses as possible. This in a sense is what a great
part of New Testament scholarship outside the Catholic Church has ever
since been about.
For Reimarus the New Testament miracles were due to conscious
deception. In the case of the Resurrection, the apostles simply stole
the body, then lied about it. (Reimarus also seems to have been the
first modern scholar to present Christ as a political agitator.) Less
crude were the theories of critics like Semler (d. 1791) and Paulus (d.
1803). They attributed the miracles to natural causes misunderstood by
the witnesses. The apostles thought they saw Christ walking on the
water when he was actually walking on the lake-shore. But if this was
the way Christianity began (lies or poor eyesight), how do we explain
its phenomenal expansion and later triumph? Efforts to answer this
question took a more sophisticated philosophical form.
The leader of this new school of thought, Ferdinand Christian Baur,
founder of the Tübingen school, side-stepped the question as to
what prompted the apostles to invent the myths, or give them the form
they did. He concentrated on the way the myths developed. The rise of
Christianity was explained in terms of Hegel's theory that progress
takes place through the clash of contradictory ideas.
According to Baur, a conservative Jewish party under St. Peter and St.
James (thesis) came into conflict with the Gentile-oriented party under
St. Paul (antithesis). The eventual result was a compromise (synthesis)
from which sprang the Catholic Church. St. Matthew's and St. Mark's
gospels represent the conservative view, St. Luke's gospel and St.
Paul's epistles that of the innovators, and the "Johannine writings"
(not from the pen of St. John) the standpoint of the party of
compromise. Baur attributed the bulk of the New Testament to the late
second century. He was also one of the first critics to regard the
Gospels as primarily a record of the early Christians' collective
thinking rather than a record of events and facts. However, he was at
least honest enough to admit that if the Gospels were written by
eye-witnesses or the friends of eye-witnesses, his theories fell to the ground.
But how, asked Bruno Bauer, another critic of the period, can a
collective consciousness produce a connected narrative? A good
question. However Bauer (with an "e") was even more radical than Baur
(without an "e"). For Bruno Bauer, Christianity originated with the
author of St. Mark's Gospel, an Italian living in the Emperor Hadrian's
time, who never intended his book to be anything but a work of fiction.
Nevertheless the idea got about that the hero was a real person, a
sect of admirers formed, and the other New Testament books followed.
Bauer eventually lost his teaching post.
Such, roughly, were the beginnings of the biblical critical movement.
The Bible, it would seem, is like an atomic reactor. Anyone working on
it without the protective coating of prayer and reverence rapidly has
his faith burned to cinders.
This is not the place to consider how many of the theories we have
been describing still carry weight in contemporary scholarship. Here we
are only concerned with the immediate results.
At first sight it may not seem to matter much when or by whom the
biblical books were written, provided they are still believed to be
inspired by God, in the sense intended by him. It is true, however, that
most men and women will, rightly or wrongly, assume that the greater
the span of time between the occurrence of an event and its being
recorded in writing, the less likely the record is to be true.  It
was therefore not long before the readers of Reimarus, Eichorn and
their successors were believing the Bible to be largely a work of
fiction too, the critics' immense erudition being the principal factor
enabling them to carry the day. Their readership included growing
numbers of Lutheran pastors, who were simultaneously being exposed to
Kant's idea that God's existence could no longer be proved from his works.
Seeing that, as Lutherans, they believed neither in an infallible
Church nor a tradition complementary to Scripture, there seemed no
longer to be any reliable basis for belief. Religion appeared to be at
its last gasp, and for many it was in fact so. Most of the fathers of
modern German atheism, like Feuerbach, the forerunner of Karl Marx,
began life as Lutheran theological students.
However, men can rightly want to go on believing in God even when they
are unable to answer the formal objections to belief, and so it often
was in this case. The situation was saved for the poor victims of
Reimarus' scepticism, Eichorn's doubts, and Kant's agnosticism - or
they thought it had been - by the Lutheran theologian Friedrich
Schleiermacher.
* * *
Schleiermacher (1768-1834), a leading figure in the German romantic
movement, had likewise had his belief in the reliability of Scripture
and the value of natural theology undermined by Eichorn and Kant, but
he thought he had discovered a way out of his impasse.
His message was roughly this: "Take heart. All is not lost. Religion
does not need outside evidence to justify its existence. Religion is
not knowledge, whether in the form of creeds, doctrines or the content
of sacred books. It does not need philosophical reflection either. The
essence of religion is piety, and piety is feeling. If you have a
feeling of dependence on God you have all that is necessary to make you
a member of the worldwide 'communion of saints' or company of the truly
religious. The separate beliefs and practices of the various religions
scattered through time and space are simply different ways, all more or
less valid, of cultivating and expressing this fundamental instinct or
attitude, which by itself is sufficient". 
Such was the tenor of the book which first made Schleiermacher famous:
On Religion - Addresses to its Cultured Despisers (1799).
Equating religion and feeling had of course long been a feature of
certain kinds of Protestantism, not least with the Moravian brethren,
one of whose schools Schleiermacher had attended as a boy. But no
professor of theology had hitherto denied the Bible and creeds any
objective value, or made feeling - even if it was a feeling of absolute
dependence on God - the sole substance of Christianity.
In 1811, Schleiermacher, who had been teaching at Halle, was offered
the chair of theology at the recently founded university of Berlin, a
post he held until 1830, and in 1821 and 1822 he published in two parts
the other book on which his fame chiefly rests, The Christian Faith.
In The Christian Faith, in spite of its title, Schleiermacher does not
retreat from his previous position. Christianity remains only one of
many expressions of the feeling of dependence or "God-consciousness".
But he tries to show why it is the best expression so far: Christ was
the man in whom God-consciousness reached the highest intensity. Christ
was not God. He did not found a Church. But the followers who naturally
gathered round so remarkable a man received the impress of his
personality, his special way of feeling dependence on God, and later,
by forming themselves into a permanent community were able to transmit
his special way of feeling or personhood down the ages. We do not know
how many, if any, of the words attributed to Christ by the Gospels
actually come from him. But each Christian receives the impress of
Christ's way of feeling, by living and experiencing the sense of
absolute dependence within the Christian community.
What differentiates the Christian religious consciousness from other
forms of religious consciousness, and makes it superior to them, is the
sense of having been redeemed from sin by Christ. This does not mean
that Christ paid the debt for mankind's sins by his death on Calvary.
Such a notion borders on magic. Redemption means that by receiving the
impress of Christ's personhood, the Christian is better able to
overcome sin (or whatever is an obstacle to the feeling of absolute
dependence) and reach the highest level of God-consciousness of which
he is capable.
One is inclined to agree with Karl Barth a century later that a
characteristic note of Schleiermacher is an astonishing self-assurance.
Schleiermacher is the real founding father of modernism. With
Schleiermacher, everything essential to modernism has arrived. Radical
biblical scholarship destroys belief. There follows a desperate attempt
to construct a gimcrack religious shelter out of the ruins with the
help of some form of modern philosophical subjectivism. This in turn
leads to the positing of the two fundamental modernist theses. First,
since there is no reliable external source of religious knowledge, it
can only be found in personal experience (early modernists inclined to
stress individual experience, today's modernists communal experience).
Secondly, doctrines - those at least which are found "difficult", or,
as would be said today, "lacking in credibility" - should not be
regarded as statements of fact, but symbolic expressions of personal
experience. Supernatural happenings, like the parting of waters at the
Red Sea or the Resurrection, take place in people's minds or
imaginations, never in the real world.
Personal experience is therefore the judge before which every objective
statement of belief, whether in the Bible, the creeds, or any other
source, will have to justify itself. If a teaching finds an echo in
personal experience it can be accepted, if not, it should be left on
one side or rejected. That is why, in The Christian Faith,
Schleiermacher relegates the Trinity to an appendix: "What is not
directly given in Christian consciousness", as a contemporary admirer
of Schleiermacher puts it, "is of no primary concern to faith". We can
have a feeling of sinfulness (concupiscence), or of having had our sins
forgiven (redemption). These ideas are therefore "meaningful", but we
no more feel that there are three persons in the One God, than that
there are four, five or six.
Schleiermacher stands at the turning point in the history of
Protestantism where the fierce certainties of Luther, Calvin and the
other reformation patriarchs start to crumble, and doctrine or any
clear statement of belief comes to be seen as something repulsive,
something that, instead of giving light to the mind, weighs on it like
a sack of cement which the mind wants to throw off.
As the 19th century proceeds, this turning away from doctrine will
become first a flight, then a stampede, and finally a Gadarene rush,
until in the mid-20th century it hits the rocks at the bottom of the
cliff in the patronising agnosticism of Bultmann and the barely
disguised unbelief of Tillich. Catholics swept into the stampede
usually express their dislike of religious certainty with the lament
"Oh, no! Not another infallible doctrine".
The one interesting feature of Schleiermacher's theology, from the
Catholic standpoint, is his shift of attention away from the Bible to
the "Christian community". What Schleiermacher meant by that term is
not what Catholics mean. Nevertheless he reintroduced into
Protestantism as a whole an awareness of the Church as a factor in
Christianity of at least equal importance with the Bible. The Bible
might be untrustworthy. But the Christian community with its personal
experiences was an indisputable past and present fact.
 The term "higher criticism" was reserved for analysing texts,
whether biblical or profane, in order to elucidate their authorship,
date and meaning. The higher critics regarded textual criticism as a
lower branch of scholarship.
 Quoted by Pius XII, Divino Afflante Spiritu, 47.
 For Origen's doubts, see Eusebius, Hist. Eccles., 6.25, 11-13. For
St. Jerome's: Ep. 129, C.S.E.L. 55, 169.
 See Geschichte der Griechischen Literatur, Franke Verlag, Bern,
1963 (English translation 1966) by Albin Lesky, professor of Greek,
University of Vienna.
 Some examples will help to illustrate the difficulty of assessing
the significance of stylistic differences. (a) Dr Johnson's two
accounts of his journey to the Western Isles - one in letters written
on the spot, the other in book form published after his return - are so
unalike in style that, in Macaulay's opinion, if we did not know
otherwise, we should find it hard to credit that they were written by
the same man. (b) The 17th-century mystic St. Margaret Mary Alacoque
was ordered by her superiors to write her memoirs. The result was found
too unpolished for the intended readership, so they were rewritten in a
style suited to the Grand Siècle. Should we infer from this that
St. Margaret Mary had nothing to do with them? (c) There are versions
of Chaucer in contemporary English. If these alone were to survive,
what conclusions would be drawn about their authorship? The style of a
text can belong to a period later than that of the author, with the
content remaining essentially his product.
 See Ricciotti, History of Israel, Vol. 1, Milwaukee 1955, who cites
a succession of cases where texts of enormous length have been handed
down orally, with apparently little if any alteration, for centuries.
See also William Dalrymple, City of Djinns, HarperCollins, 1996.
According to this author, in India today there are still "bards" who
can recite from memory the whole of the Mahabharata, an epic longer
than the Bible.
 In taking this line, the critics were making, even by their own
standards, an illegitimate inference; namely that the books of the New
Testament were necessarily formed in the same way as those of the Old
Testament, as though literary composition and culture had remained
unchanged between the period of Sennacherib or Cyrus and the age of the
early Caesars. In fact, after two centuries of debate, there seems to
be no compelling reason not to accept the already ancient tradition
enshrined in the History of Eusebius of Caesarea (264-340), that the
Gospels were written by the four Evangelists at roughly the time and in
the way always believed. Justin Martyr (100-165) calls them the
"Memoirs of the Apostles". Vatican II affirms both their "apostolic
origin" and "historicity". (Dei Verbum 18 & 19). How could St. John
have recalled lengthy speeches like Our Lord's at the Last Supper? We
have only to recall similar feats of memory on the part of Macaulay and
Mozart to realise it is entirely possible even without special divine assistance.
 It is now common, in Catholic Bible study groups and popular
commentaries, to hear the Exodus miracles described as merely literary
devices used by the author to convey the idea of God's power. See, for
example, A Catholic Guide to the Bible, Oscar Lukefahr C.M., Liguori
Publications, Liguori, MO.
 Livingston, Modern Christian Thought, Macmillan, New York, 1971.
Philip Trower is a British writer and journalist who covered five
episcopal synods in Rome from 1980 to 1990. Born in 1923, Trower was
educated in English private schools and attended Eton from 1936-40. He
earned a B.A. in modern history from Oxford University 1941-2. He
worked in literary journalism for the Times Literary Supplement and
Spectator; in 1951 he published the novel Tillotson. He is also the
author of the novel, A Danger to the State (Ignatius Press), about the
19th century suppression of the Jesuits.
By Seamus Grimes
Thursday, 06 July 2006
Experience around the world shows that building societies upon
post-modern tolerance instead of old-fashioned respect just doesn't work.
To be tolerant of others is admirable, but in our “post-virtue” era
tolerance has become an end in itself. Ever since the ideals of our
Judaeo-Christian heritage were first enunciated, we have repeatedly
fallen short in how we have treated the “other”. We have compounded
this failure more recently by re-interpreting those ideals in ways
which make it difficult for many to fully appreciate our shared heritage.
Many are the deficiencies and inconsistencies of the West. By the 19th
century, rationalisations of Western racial superiority were common.
Peoples from radically different cultures like Australian aborigines
were sometimes viewed as sub-human. In some instances, this even
stemmed from a distorted interpretation of Christian teaching,
resulting in institutionalised forms of racism such as the “White
Australia” policy, South Africa’s apartheid regime and the horrors of
Nazi Germany. In fact, most societies had their marginalised “other”
communities which were treated as second-class citizens, despite
high-minded constitutional aspirations “to cherish our children
equally”. More recently the challenge of not diminishing our humanity
by not diminishing the inalienable rights of others has arisen
particularly in relation to immigration.
By 2006 much progress had been made on the basis of “multiculturalism”,
an approach which aspires to allow for some flowering of cultural
differences within countries with large immigrant populations.
Multiculturalism took over from the “assimilationism” of an earlier
period, which sought cultural uniformity between immigrants and their host societies.
In 2006, however, multiculturalism is in crisis. This is partly because
it is a policy constructed on weak post-modern, relativistic pillars
which emphasise tolerance as an end in itself. In practice this means
standing for everyone’s right to be different in fairly petty ways,
without specifying how they should be the same in very important ways.
Girls’ right to wear head scarves is affirmed, and girls’ right to
marry whom they choose is ignored.
In a recent speech to the Australian Parliament, Tony Blair spoke of
humanity’s common ownership of universal values such as justice and
fairness and noted the growing threat of Islamist extremism. But we
also face the more insidious undermining of long-cherished values by
intellectuals who have developed a strange loathing of their own
heritage. Under the umbrella of “multiculturalism” also shelter
supporters of formerly objectionable lifestyles, notably gays and
lesbians. The dogma of multiculturalism has shown itself to be
incapable of distinguishing between social norms and social customs to
the exasperation of many voters. It seems absurd to equate the right to
serve chicken tikka masala with the right to teach school children
about contraceptives and gay lifestyles.
So, despite apparent progress, multiculturalism is unlikely to provide
a solid basis for constructing a more just society in 2036. The strain
is already showing in growing racial and cultural tensions in the
larger European cities. My prediction is that government-endorsed
multiculturalism will be extinct in 30 years’ time. What will replace
it as the political cement for increasingly diverse Western societies
is harder to pick. Growing hostility towards Muslims does not augur
well. At the worst, it could be a resurgence of 19th century racial
discrimination and disenfranchisement.
But we can hope for better than this, thanks to growing levels of
interconnectivity through media like the internet, growing literacy and
so on. The world in 2036 is likely to be more urbanised and more
globally integrated, with significant clusters of highly skilled people
from many different countries making significant contributions to the
older and more established core regions of the world. This greater
mixing of peoples will also foster a greater homogenisation of society.
While there are many reasons to encourage the preservation of
humanity's varied cultures, it is vital also to create a society which
emphasises our common humanity. The multicultural model, while being
well-intentioned, has not been particularly successful in creating a
coherent approach towards solidifying the common good. Much can be
gained from building social policy on the foundations of the best of
our Judaeo-Christian background which emphasises the fundamental values
of fairness, justice and respect.
Seamus Grimes is a professor of geography at the National University of
Ireland in Galway.
Putting God back in the
By Peter Sellick - posted Tuesday, 13 June 2006
It is a great pity that the move in philosophy called postmodernism has
attracted negative sentiments in the general public.
There is a fear this movement removes all that we know is solid -
morality, values, and the scientific world view - to replace it with a
sea of relativities and uncertainties.
It is thought postmodernism is just more radical scepticism emptying
all that we thought we knew. Certainly the more outrageous expressions
- particularly those originating in France - give off this odour,
conjuring up a vision of young philosophers eager to make their names
rampaging through our universities and overturning all the old
authorities and certitudes.
An example of how postmodernism has been misrepresented is the silly
questions set in English literature exams asking students to criticise
texts from a particular position such as Marxist, feminist,
postcolonial, or queer.
The origin of these questions lies in the program of deconstruction
with its critical examination of texts for their underlying and often
unrecognised biases and agendas.
While this process can be recommended for identifying false
consciousness, it is important to understand how our position in life -
as man, woman, indigenous, privileged, and so on - informs the kind of
world we construct. Its deconstruction lays bare our soul and allows
other points of view room to breathe.
However, when this technique becomes the be-all and end-all for
studying literature, then students standing back from the text with
judgment in their heads are likely to miss the text altogether. The
result can be moral megalomania, with the student self-righteously
assuming the position of moral arbiter over our great texts of literature.
When suspicion rules there is no way texts may engage us. The greatest
advantage of studying the great literary texts is that they are
“other”. Student-centred learning, which insists on texts from the
students’ own world, will not be confronted by difference and will
remain in its own comfort zone.
Surely the importance of the great texts is their accurate
representation of the human - something Big Brother as an alternative
text does not do.
It is increasingly being recognised there was never room for theology
in the modern project of setting up clear and distinct ideas based on
firm foundations. This philosophical absolutism found its ground
primarily in the natural sciences where theory could be tested and
affirmed, or found wanting.
The success of this method ensured that all other forms of knowledge
were relegated to the margins as subjective, with little or no
foundation in the world. Theology suffered most from this move because
the God that it investigated could be imagined only in terms of being
under the auspices of natural science.
As a being among other beings God was vulnerable to negative evidence.
And by seeking to lay the foundations of proof for God’s existence, we
also lay the foundations of proofs for his nonexistence.
The whole argument about God’s existence or nonexistence is easily
exposed as a kind of idolatry for both atheist and theist. This kind of
god is a simple projection of our hopes, needs, desires and fears. As
such, it acts as a mirror reflecting these concerns to us but closing
the path to an interaction with the divine as “other”.
So, far from being a revision of Pyrrhonian scepticism in proclaiming
the emptying of all known things, postmodernism - as its name implies -
seeks to correct the errors of modernism.
By insisting on firm foundations for knowledge, modernism artificially
limited what we thought we could know. Time and again in On Line
Opinion’s comments section I am confronted by those who tell me I can’t
know what I claim to know.
That was the whole point of modernism, but postmodernism - even though
it denies absolute foundations of knowledge - allows us to know enough
to get along together.
We are reminded how recent the modern project is when we read St Paul,
who tells us;
For now we see in a mirror, dimly, but then we will see face to face.
Now I know only in part; then I will know fully, even as I have been
fully known. (1 Cor 13:12 NRSV)
But we have this treasure in clay jars, so that it may be made clear
that this extraordinary power belongs to God and does not come from us.
(2 Cor 4:7 NRSV)
Rather than being about absolute foundations, scripture is conditioned
by the transient nature of being and the movement towards a new
reality. It’s the mentality of the nomad rather than of the settled
people - of a people straining to see an emerging future rather than
settling down to old verities.
The demise of the modern project of certitude allows theology to move
outside the categories of being and absolute certainty to reclaim
biblical speech about God.
What does it mean that God, in the first creation narrative, brings the
world into being by his Word? “And God said, ‘Let there be …’, and it
was so.” What does it mean when, at the end of a prophecy, the prophet
says, “Thus says the Lord”?
Who is this God who is identified with the sound of pure silence? But
most puzzling of all, what do we make of the creative Word of God,
which spoke the universe into being, becoming flesh?
Our problem is that viewing biblical texts through the eyes of modernity
blinds us to the texts’ original meaning - to the extent that we miss
the extremely puzzling and paradoxical nature of biblical speech about
God. The many examples of this could be teased out only in a larger work.
When our speech about God begins in the Bible instead of philosophical
presuppositions, the God we arrive at is quite different - so different
that John can tell us:
So we have known and believe the love that God has for us. God is love,
and those who abide in love abide in God, and God abides in them. (1
John 4:16 NRSV)
This language about God has no counterpart in the modern project, and
only when we escape from its limitations can such a text make any
sense. Only when we escape from the God of the 17th century
scientist-theologians - the God that Newton believed in - can we hear
what the Bible is saying to us.
God is not a being among beings. By centering his theology on the Word
of God rather than the being of God, Karl Barth moved from the modern
paradigm to the postmodern - before it was a recognised movement.
God thus ceases to be the subject of philosophical or scientific
speculation, but is experienced as a spoken word. This word has
content, not in the words of scripture, but in the reality to which
scripture points - the humanism that emerged from Israel’s struggle
with truth and in the man, Jesus.
This is the central reality of Christian worship, where the Word is
faithfully preached and the sacraments celebrated. God is with His
people. When this is affirmed, all speculation about God’s existence as
being evaporates, making the scientist-theologians with their
cosmological proofs redundant.
The Church’s long decline during the past few hundred years began when
God was taken out of the Church and became an object of scientific
speculation. It was entirely predictable this would produce the
dominant heresy of the modern age, Unitarianism, since the modern
paradigm could not cope with God as Father, Son and Holy Spirit.
When Newton could not affirm the doctrine of the Trinity (ironically
residing, himself, at Trinity College, Cambridge), the king gave him
dispensation to retain his post. In retrospect, this was a major
concession to theological error and, given Newton’s fame, helped
fragment the faith of the Church.
Understanding God in terms other than being requires considerable
education against our natural tendency to think in terms of person.
We must speak about God in terms of person because He cannot be reduced
to knowledge or force or process. Rather, he is in relation with us.
But we must also remember He is not a person among persons. He does not
exist in matter - especially not in a contradictory supernatural matter
- but in the spirit of freedom and truth.
Only when we escape from the false orientations and restrictions
modernity has imposed on us, can God be given back to the
Church. Stanley Hauerwas has entitled an essay “In a world without
foundations: all we have is the church.”
In modernity speech about God was a poor amalgam of the secular and the
biblical, with the secular eventually crowding out the biblical. In
postmodernity talk about God will be impossible without talk about the
people of God.
God will no longer be a foundation for morality or existence to shore
up our shaky lives but will be experienced as in the days gone by - as
his creative, judging, forgiving, loving Word in the midst of the
congregation. To hear this Word is to be saved from the powers of death
around us and set free from the idolatry so natural to our hearts.
A theology centered around the Word of God rather than the being of God
requires subtle changes in our understanding of the key occupation of
Christian prayer. Prayer is certainly a conversation of sorts.
Paul told us we should “pray without ceasing”. Surely he did not mean
we constantly press our concerns upon God, but that we unceasingly
listen for His Word.
A medieval painting of Mary shows her being impregnated by the Word
through her ear. This is an image of prayer. Prayer consists in us
listening to the Word. It is not something we do occasionally but is a
medium in which “we live and move and have our being”.
The seed crystal around which all Christian prayer grows is to be found
in the opening of the liturgy: “The Lord be with you.” This desire,
particularly expressed in the Gospel of John as God dwelling with His
people, is the primary desire of all Christians and the focus of all prayer.
This hope is fulfilled in our listening to the Word, and also in
intercessory prayer as hope for the “other”. Thus prayer is a different
kind of conversation with an “other”, and different from private
thought, which is insulated within the self. It is the opening of the
self to the depths of the Word.
The modern paradigm never had room for biblical theology, and the
Church’s task now is to throw it off and reclaim faithful speech about God.
This will take great educative effort, directed to both the churched and
the unchurched, such is the tenacious hold of modernity on our minds.
Article edited by Allan Sharp.
This article was helped considerably by several chapters of Overcoming
Onto-theology by Merold Westphal.
FOCUS ON POSTMODERNISM
Your pocket guide to PoMo’s history
By Martin Fitzgerald
Wednesday, 31 May 2006
Before all the hooha about postmodernism, there was something called
modernism. What was all that about?
It all began with one simple idea which multiplied dangerously.
How often have you been reading something quite pleasant and suddenly the author
drops the fateful word – postmodernism? The argument begins to get
fuzzy. You muddle through and hope that the rest of the article becomes
clear. You also have a sneaking suspicion that a dictionary will be
useless. And, anyhow, how are you supposed to know what postmodernism
is if you’re not even sure what modernism is?
So let’s start with modernism. This is the umbrella term for the
philosophical offshoots of the Enlightenment, the complex of ideas that
has shaped the modern world from the 18th century until the mid-20th
century. Its characteristic features were -- and still are -- suspicion
of authority and tradition as sources of knowledge and the conviction
that human reason is the engine of progress. This implied that
religious faith was a bad guide to understanding the world and that the
unimpeded march of science and technology was a very good thing. The
Enlightenment was optimistic: knowledge through reason alone would
produce an ideal world which goes forever forward.
These key ideas sprang from a tectonic shift in philosophy begun by
Rene Descartes (1596-1650). Instead of asking “How can I explain the
world and understand it?”, he asked, “How can I be certain about
things?” This led him to the assertion that cogito ergo sum, "I
think therefore I am", was the starting point of philosophy. This
launched a long tradition of which postmodernism is the most recent development.
After Descartes, philosophy followed two strands. The first strand was
rationalism, the notion that sense data is suspect and that the truth
can only be reached through reason. Ultimately this led to idealism, an
attempt to explain the universe in terms of an absolute principle. More
about this later.
The second strand was empiricism, the view that only sense knowledge is
worthy of being trusted. Ideas, that is, concepts or reasoning which
are unverifiable by the senses, distort the truth. Empiricism fostered
progress in science and technology. And in the human sciences of
sociology, psychology and history it led to positivism, which demanded
that the human sciences should be based on the same methods as
experimental science. This required the exclusion of certain notions
from what had commonly been regarded as true knowledge. David Hume, for
example, denied causality. Just because a green billiard ball strikes a
blue billiard ball does not mean that the green causes the blue to
move. It might have been an accident. So, although the word empiricism
has a “scientific” ring, it really leads to a radical scepticism about
the common sense world.
In the 20th century, the methodology of positivism was applied to
language, what words mean and how we structure them. This has a long
history, but it characteristically resulted in an analysis of texts on
their own, without reference to the author and without reference to any
truth that the author might be aiming at. An important offshoot of this
approach was structuralism. It regarded language as just the most
sophisticated way of using signs and symbols. The Columbus of
structuralism was Claude Lévi-Strauss, an old leftie from Paris and an
anthropologist of primitive societies. He taught that all discourse was
simply a power play. Those who wrote used writing to subject and oppress
those who did not write, either because they were illiterate or because
they had no access to the apparatus of publishing. Understanding
discourse involves analysing its assumptions, its prejudices and its
access to the public forum. Without any reference to the truth or to
the author’s intention, all discourse had to be “deconstructed”.
The recently-deceased Jacques Derrida is the most famous practitioner
of deconstruction. This analyses sign systems, from billboards for Coke
to Hamlet and the Mona Lisa, not in terms of the truths conveyed, but
in terms of what they reveal about the power structures generating
them. The corollary is that if you want to change the power structures,
you must change the discourse, particularly the public discourse. This
has some plausibility in literature, but in anthropology, sociology or
psychology its effects have been quite pernicious. Deconstruction
transforms every form of intellectual endeavour into a kind of fiction;
they all are ways of constructing reality with words. All discourses
are equally valid because they are not about reality anyway. Everything
fed into the deconstructive shredder emerges as power. There are no
rights, no justice, no truth, no value, no worth. This is the
postmodern endpoint of empiricism.
The 1998 film The Truman Show, directed by Peter Weir, captures the
issues raised by the empirical strand of postmodernism. Jim Carrey
plays Truman Burbank, an insurance salesman living an idyllic life in a
small American town who discovers his entire life is actually a reality
TV show. Its message is that in a world in which all discourses are the
same, the edges between entertainment fiction and reality are blurred.
People live vicariously through their favourite TV characters, but to
some extent we are all TV characters because we too are manipulated by the media.
Let’s return to the other strand of the Enlightenment, rationalism. It
aspires to an understanding of the world which is free from the
vagaries of sense perception. Thought leads to truth. In an odd sort of
way, it recognises that there is a spiritual reality, even if only
in mathematical abstraction. The giant of rationalism is Georg Wilhelm
Friedrich Hegel (1770-1831). He believed that there is a world spirit
which is the result of the accumulation of rational knowledge by
humanity, especially by intellectuals. In the course of history this
world spirit is refined through a dialectic, the clash of progressive
and liberating ideas with the conservative ideas that have preceded
them. It will culminate in an omega point in which everything is
absorbed into the absolute. There will be no differences or
distinctions; all will be part of the Absolute.
The State, he taught, embodied the Absolute because it somehow
crystallised the best things of society in itself. While Hegel is far
more sophisticated than I give him credit for here, the logical outcome
of his philosophy was Communism and Fascism. In these regimes, the
state was supreme and the individual counted for almost nothing. Rivers
of blood ran from these “isms”, and after the fall of the Berlin Wall
in 1989, Hegelian rationalism seemed to be bankrupt. The world had
experimented with Absolute Ideas which explained all of history -- and
they hadn’t worked. Postmodernism is often seen as a sceptical response
to the eclipse of these grandiose schemes.
Despite the temporary success of idealism, it had formidable
intellectual foes. The first was Søren Kierkegaard (1813-1855).
Reacting against the lack of attention paid to the individual by the
idealist juggernaut, he focused on how the individual must assert his
existence by constructing himself through his decisions. Freedom and
choice become supreme in his philosophy. A devout Danish Lutheran, he
held that the most important choices were those associated with
religious faith and religious commitment.
An atheistic response to this all-consuming idealism came from the
French philosopher Jean Paul Sartre (1905-1980). He borrowed
Nietzsche’s analysis of the death of God and the loss of meaning in the
world. Life was an absurdity, a tragi-comedy with no meaning other than
the one each person chose for himself. The Czech writer Franz Kafka captured this
existentialist dilemma in his bizarre tales, The Trial and
Metamorphosis and Sartre in his novel La nausée (Nausea). On a
more popular level Alan Alda portrayed it in the absurdities of M*A*S*H.
The German philosopher Nietzsche (1844-1900) actually took this view of
the world, known as existentialism, a step further than Sartre. In a
world without God, is there any meaning? No, he says. Nietzsche claims
that all humans act only for power. Some humans are more gifted than
others and are not bound by the ordinary rules which bind everyone
else. He calls them supermen, in the Nazi sense, not in the comic book
sense. The superman imposes meaning on his own life and the lives of
others, thus proving his freedom and establishing his identity. This strand of
postmodernism is captured in the nightmarish 1982 film Blade Runner,
starring Harrison Ford. It highlights the creation of
identity in the “replicants”, the struggle for power and the
pervasiveness and power of propaganda.
Postmodernism is a set of ideas to be studied at university. But it is
also an attitude to life. People not only think postmodern thoughts,
they also live postmodern lives. They live without ideals, or ideas;
their morality is homemade relativism; their commitments are fleeting;
they distrust authority and “canonical” texts; they are sceptical about
assertions of truth and falsehood. Films are a useful way of capturing
this. The Truman Show and Blade Runner are two thought-provoking
examples, but there is one which sums them all up, The Matrix. See that
and you’ll understand more or less what postmodernism is all about.
Martin Fitzgerald is Head of Philosophy at Redfield College in Sydney.
By Philip Elias
Thursday, 01 June 2006
A Sydney seminar on the impact of postmodernism upon education provides
some thought-provoking reflections on a philosophy which pervades the
teaching of liberal arts.
Australia is not a safe place for postmodernists at the present
moment. Over the past months, academics, journalists, and even Prime
Minister John Howard have publicly attacked their influence on school
curricula around the country.
At a recent seminar at Warrane College, a residential college at the
University of New South Wales, in Sydney, academics from three
universities took the chance to lambaste the postmodern position. In
this issue of MercatorNet, we are featuring some of the contributions
that they made.
In "Your pocket guide to PoMo's history" Martin Fitzgerald provides
some historical background. His paper traces two philosophical strands
that have shaped postmodernism. One is the deconstructionism of
thinkers like Jacques Derrida and Michel Foucault. This developed from
the structuralist and logical positivist movements of the 20th century,
which in turn sprang from the empiricist tradition. The other is the
atheistic existentialism of Jean Paul Sartre and Friedrich Nietzsche.
The convergence of these ideas has led to the “theory” -- or more
correctly the attitude -- that characterises postmodernism.
In "What does this mean for education?" James Franklin argues that
postmodern-inspired curricula foster “unteachable suspicion” in
students. Great works of literature, historical documents, and even
scientific research are “texts” to be analysed rather than appreciated.
Power becomes the essence of human communication; truth, goodness and
human nature are merely fronts for the giant power play that we call culture.
Postmodernism manifests itself in specific ways in the various teaching
disciplines. Barry Spurr’s paper, “What is the difference between King
Lear and Ginger Meggs?” deals specifically with the disastrous effects
of a postmodern approach to English. At its worst, Spurr argues,
postmodernism in the study of literature is “a synonym for intellectual
chaos and ignorance”.
Defenders of postmodernism will argue that any summary statements are
misrepresentative and create a “straw man” of their ideas. But it is a
bit rich for postmodernists, with their emphasis on the fluidity of
meaning, to ask for a watertight definition of their philosophy.
Another contributor, Alan Barcan*, pointed out that postmodernism is
“sometimes used as an umbrella term for the vast range of ideological,
curricular and pedagogical changes since the cultural revolution of
1967-74.” It encompasses the ideologies of feminism, environmentalism,
neo-Marxism, and so on. Despite their differences, these outlooks share
the philosophical approaches outlined above and unanimously encourage
“unteachable suspicion” towards all knowledge (except, of course, the
knowledge they provide).
So what is the big deal? After all, postmodern ideas are now
passé at most tertiary institutions in Australia. But as James
Franklin points out, there is a strong and more lasting trickle-down
effect to schools. For most people, the years at primary and secondary
school are the most formative, if not the only, part of their education.
There are perhaps four areas of major concern for these students:
1. Fundamental facts and concepts are being skipped over
in a child's education. In the rush to initiate students in the cult of
cultural theory, grammar, great works of literature, and the broad
brushstrokes of history are neglected.
2. The gaps in the wall left by a postmodern approach to
education are often filled with substandard works and decontextualised
fragments of the classical curriculum. Australian Idol is considered as
worthwhile a “text” as Othello. Shakespeare’s works themselves are
clapped in feminist or postcolonial shackles. National histories are
first and foremost to be considered as shameful tales of oppression and
violence. Science is to be considered primarily as social construction.
3. The attitude of “unteachable suspicion” is presented to
students dogmatically, as a moral imperative in learning. While paying
lip service to an “anything goes” approach to knowledge and truth, “the
promoters of the brave new world of so-called postmodernism are
authoritative and prescriptive to a fault,” says Barry Spurr. In fact,
the only thing students are not encouraged to be suspicious of is the
analyses offered by their teachers. This often comes later, and
naturally, when the student completes university. The by-product is
usually a crudely pragmatic approach to education, or a sense of disillusionment.
4. A vicious cycle of artistic mediocrity is established.
Victims of a postmodern education grow up and become teachers. They are
unable to draw upon the essential texts and ideas in the liberal arts.
They cannot provide students with the grounding from which true
innovation can take place.
I suspect postmodernists want effortless wisdom. There is a sense in
which it is thought to be clever to identify the ways in which the
comic strip Ginger Meggs and King Lear are similar. But this is a
no-brainer. They are similar because they are both expressions of human
experience and imagination. The similarities are most properly examined
at university, in cultural studies or anthropology, not in high school.
The really clever thing is to be able to say why Ginger Meggs and King
Lear are not similar. What makes one work great and another mediocre?
To answer this adequately years of study and experience are required.
Competent teachers are indispensable. Find them; poach them; train them
-- and I’d wager that “unteachable suspicion” will melt into a sense of
wonder. And this is the starting point for all knowledge.
Phillip Elias is studying medicine at the University of New South Wales
in Sydney. He was the organiser of the seminar on postmodernism and
Australian education at Warrane College, University of New South Wales,
Sydney, in April.
* For reasons of space, the paper presented by Alan Barcan (Honorary
Associate, School of Education, University of Newcastle),
“Postmodernism and the Fractured Curriculum”, has not been published on
MercatorNet. His analysis of the ideologies competing for control of
education is presented in “Ideology and the Curriculum” in Naomi Smith
(ed.) Education and the Ideal (New Frontier, 2004).
By James Franklin
Thursday, 01 June 2006
It's easy to laugh off the fashion for PoMo, but harder to find a remedy.
Postmodernism is not so much a theory as an attitude.
It is an attitude of suspicion – suspicion about claims of truth. So if
postmodernists are asked “Aren’t the claims of science just true, and
some things objectively right and wrong?” the reaction is not so much
“No, because…” but “They’re always doubtful, or relative to our
paradigms, or just true for dominant groups in our society; and anyway,
in whose interest is it to think science is true?”
Postmodernism is not only an attitude of suspicion, but one of
unteachable suspicion. If one tries to give good arguments for some
truth claim, the postmodernist will be ready to “deconstruct” the
concept of good argument, as itself a historically-conditioned paradigm
of patriarchal Enlightenment rationality.
Finally, the postmodernist congratulates her/himself morally on having
unteachable suspicion. Being “transgressive” of established standards
is taken to be good in itself and to position the transgressor as a
fighter against “oppression”, prior to giving any reasons why
established standards are wrong. In asking how to respond to
postmodernism, it is especially important to understand that its
motivation does not lie in argument but in the more primitive moral
responses, resentment and indignation.
PoMo at work
Barbara Kruger, “Untitled (Your Body is a Battleground)”, 1989.
To illustrate, let us take a few examples from my webpage of Australia’s
Wackiest Academic Websites. Only the worst results are shown. We will
need to consider later how widespread and dangerous such examples are
and hence how seriously the problem should be taken and what should be done.
The University of Western Sydney used to be a leader in the field but a
couple of years ago their central marketers cleaned up their website
and it is now harder to find what is going on. Through the miracle of
web archiving, however, one can browse such past gems as the project of
Dr Arnd Hofmeister on “Queer embodiment”:
Based on a project with/of the Japanese Artist Erika
Matsunamie about masculinities and femininities with German male
“Cross-Dressers” this research project seeks to investigate the
phantasmatic dimension of embodiment. Embodiment is understood as a
highly overdetermined and contradictive inscription of practices in the
body with continuously shifting investments. Using in-depth interviews
and free association over significant self-portrait photographs modes
of articulations over embodied experiences are analyzed to get insights
in the heterogeneous processes of gendering and sexing the body.
(Let me make it clear that I have nothing against German transvestites.
It is just the way they are being used as an excuse for bullshit that
is a problem.)
Still very much with us is the oeuvre of Dr Alison Moore, who joined
the University of Queensland’s Centre for the History of European
Discourses as a postdoctoral fellow in 2005.
Her ongoing project is about the history of
excretory taboos in Europe of the mid-to-late nineteenth-century and
their relationship to visions of progress, bourgeois class conformity
and colonial identification. In this vein she had published `Kakao and
kaka: Chocolate and the Excretory Imagination in Nineteenth-Century
Europe’, in Carden-Coyne and Forth (eds), Cultures of the Abdomen:
Diet, Digestion and Fat in the Modern World, New York: Palgrave, 2004,
51-69. She is now working on a book manuscript entitled, The Anal
Imagination: Psychoanalysis, Capitalism and Excretion.
The two “Body modifications” conferences that have been held at
Macquarie University included some choice items. The first conference,
in 2003, had a keynote paper `A spectacular specimen: hermaphroditic
strategies for survival’:
Del LaGrace Volcano (formerly known as Della Grace)
is a gender variant visual artist and intersex activist who is the
author/photographer of three books, Lovebites (Gay Men’s Press, 1991),
The Drag King Book (Serpent’s Tail, 1999) and Sublime Mutations
(Konkursbuchverlag, 2000). sHE has been documenting and creating heroic
re/presentations from the queer communities sHE belongs to for over 25
years. Film credits include: Pansexual Public Porn (1997), A Prodigal
Son? (1998), Journey Intersex (2000) and most recently The Passionate
Another paper in the same conference was `What an arse can do: affect,
time and intercorporeal transformation’: “Transformations in anal
capacity, in what an arse can do, are sought-after … let’s stop there…
Good taxpayers’ money, it must be remembered, is going into this
research. Last year’s ARC Discovery grants, the major large grants
competed for across all areas, included one to the University of
Technology, Sydney for a project on “Local noise: Indigenising hip-hop”.
Although “the body” is a favourite topic, owing to its multiple
transgressive possibilities, it must be said that most postmodernist
writing is much less colourful than the examples just quoted. More
typical is this paragraph, the first one on the website of the
University of Wollongong’s Hegemony Research Group, which introduces to
the interested public what the Group is doing:
The originality of Gramsci’s conceptualisation of
hegemony has long been recognized, and is evidenced by the extremely
wide-ranging intellectual applications of, and the amazing corpus of
published writings organised around, the Gramscian conceptualisation.
In cultural writing, historical interpretation and studies of states,
nations and global power it has proved remarkably versatile. Gramscian
understandings of hegemony have shaped – overtly or implicity – such
crucial but diverse studies as Edward Said’s analysis of Orientalism;
Louis Althusser’s theory of ideology; Michel Foucault’s concept of the
episteme; the writings of social historians such as … etc etc
The idea that a dense thicket of unexplained references to continental
theorists is the way to introduce an idea is absolutely typical of the genre.
A 1998 press release from the University of Adelaide shows where this
is heading as regards respect for scientific truth. It concerns a
course on “Indigenous Australian Perspectives in Science and
Technology”. There is nothing wrong with studying aboriginal
perspectives on the natural world, but the claims made for it include:
At Wilto Yerlo we believe it’s important that
indigenous students realise Western science is only one way of
understanding the natural world. Of equal value is their own indigenous
way of knowing the world.
That is not correct. Western science is a way of knowing the natural
world, but it is the only way of knowing it that is likely to make an
impact on the severe health problems of remote indigenous communities,
because it has found the uniquely right way to study causes and effects.
Defining the problem
How serious is the problem? Have humanities departments
been taken over by this sort of rubbish?
Not exactly. The kind of people just quoted regard themselves as an
embattled minority, and not without reason. There are plenty of
humanities academics doing serious work, probably a majority in the
older universities. Still, the trickle-down effects of the
postmodernist industry are quite serious in a number of areas.
Humanities academics of a more respectable persuasion have to spend
time fighting for positions and grants against an enemy that never
gives up; it is a tiring business. Equally exhausting is trying to
persuade students to take serious subjects that will force them to
think and to learn something instead of grabbing easy marks from trendy
courses that give out high marks for the illiterate pooling of
politically-correct prejudices. Since universities allocate teaching
monies on the basis of enrolments, lecturers in logic or classics are
always at a disadvantage in an Arts faculty that offers Critical
Feminist Research Methodologies.
Other effects are felt in school syllabuses, as other contributors to
this symposium have described. Schoolteachers themselves generally
retain a fund of common sense, but curriculum designers and
educationists are not kept down to earth by the discipline of dealing
with school students and parents. I do have some positive news to
report on this front: I recently marked Sydney Grammar School’s
Headmaster’s Exhibition, an essay competition for the school’s top
students. I am pleased to say that the standard of argument in the
essays was uniformly excellent and there was not a trace of
postmodernism in any of them. Undoubtedly, a student with the good
fortune to have well-educated parents or to go to a top school will be
able to avoid infection by the postmodernist virus. The young person most
exposed to corruption is the one who moves from a not-so-good school to a
second-rate humanities faculty and takes his teachers’ attitudes
seriously for want of access to anything better. An intelligent student
in that position suffers a grave injustice. It is especially because of
the trashing of the talent of such students that I maintain my anger.
The last serious consequence of postmodernism, one that extends well
beyond the small clique of card-carrying jargon-laden theorists, is the
moralising of public debate on questions that should be factual, such
as the “History Wars” and debates on economic rationalism. The
relentless assault of postmodernism on truth and its replacement of
rational debate with resentful “deconstruction” has, so to speak, given
permission for public intellectuals to lead with denunciations and
rancour prior to getting their facts straight. The “History Wars” began
when Keith Windschuttle wrote a book, The Fabrication of Aboriginal
History, claiming on the basis of his archival research that the
Tasmanian aboriginals were not massacred but mostly died of diseases.
It is astounding how few of the replies to him bothered to examine his
factual claims and the evidence he provided. Almost all of the
ferocious attacks on him consisted of denunciations of his alleged
racism, abuse about his supposed lack of imagination, comparisons with
the Holocaust denier David Irving, and snide remarks about his not
having a PhD. Though only one of his major opponents descended to any
explicit postmodernist claims about the relativism of truth, the
standard of the debate was extremely low, in a way that I believe would
not have been tolerated forty years ago before the advent of
postmodernism. Something of the same shallow moralism infects the
debate on economic rationalism. According to its supporters, a free
market is the best method of delivering prosperity to both rich and
poor. That may or may not be so, but the way to debate it is to look at
economic evidence. It is not to the point to try to short-circuit that
difficult economic debate by abusing economic rationalists for
“reducing humans to mere consumers” or for approving of “obscene”
inequalities of income. Arguments on matters of fact need to be sorted
out before moral judgments are made, not, as postmodernism would have
it, the reverse.
Facing the problem
[Image: Barbara Kruger, “Untitled (Your Manias Become Science)”, 1981]
If it is
agreed that postmodernism is a problem, what should be done about it?
There are four possible plans:
Plan A: Do nothing and hope it goes away
Plan B: Take political action in an effort to have
postmodernists sacked and deprived of grants
Plan C: Refute postmodernism with arguments
Plan D: Provide a more exciting, positive alternative
Defeatist though it sounds, there is something to be said for plan A:
sit and wait for it to go away. We all have other things to do, and
given that postmodernism is not exactly forging ahead, we might well
decide to take a relaxed approach and not grant it the oxygen of
publicity. And after all, flared jeans and big hair did not disappear
because anyone refuted them – they just came to their use-by date and
no one bothered with them any more. Still, to take the same approach
with postmodernism would neglect the claims of the young whose minds
will be corrupted by falling in with postmodernists. And since academia
still in most cases provides jobs for life (especially for those
unemployable elsewhere), timescales in academic fashions are very long
– a present PhD student could still be teaching in forty years’ time.
There have been some interesting recent attempts along the line of Plan
B: political action. Brendan Nelson, until recently the Australian
minister in charge of higher education, refused to fund about half a
dozen of the worst grants recommended by the Australian Research
Council’s grant evaluation process, and appointed the conservative
editor of Quadrant, Paddy McGuinness, to the panel that evaluated the
grants. That is fiddling at the margins, and there seems no prospect
of anything more forceful. Since academic freedom is a principle of
some value, that may be reasonable. There is an inevitable and largely
unresolvable conflict between the principles of academic freedom and
quality control. I do not call for anyone to be sacked. (Though I do
call for certain persons to resign in shame.)
Refutation, Plan C, would be a good plan in an ideal world where there
was a level playing field in the conflict of ideas, where theories
fought man to man on the basis of fair arguments. That is not our
world. You might as well expect damsels in distress to be rescued by
knights in armour. Because postmodernism is not accepted by its
followers on the basis of argument, deploying arguments against it is
like boxing with shadows. It is just met with a smokescreen of
“deconstructions” of the appeal to argument as itself implicated in the
modernist rationalist problematic, and so on.
Still, perhaps there is an uncommitted audience out there somewhere, so
I have two excellent thinkers to recommend who identified and exposed
what arguments there are at the bottom of postmodernism. The first is
Raymond Tallis, whose brilliantly-titled book Not Saussure shows how
the ideas of such later stars as Derrida repeat the fundamental mistake
in the philosophy of language made by Saussure a hundred years ago.
Saussure believed that the structured nature of language meant, for
example, that the meanings of “black” and “white” were defined merely
by their opposition to each other, rather than being tied to our
perception of those colours; the disconnection of language from reality
that his theory implies has been relied on by all postmodernists since
to emphasise the “constructed” (hence political, hence probably wrong,
hence open to remaking at our pleasure) nature of whatever we say.
The second thinker to expose the confusions at the heart of
postmodernism was the Sydney philosopher David Stove, who in 1985 ran a
“Competition to find the worst argument in the world”. The argument had
to be both very bad and very widespread. He awarded the prize to
himself with the following argument:
We can know things only
• as they are related to us
• under our forms of perception and understanding
• insofar as they fall under our conceptual schemes,
therefore, we cannot know things as they are in themselves.
Stated as baldly as that, the argument is probably not recognisable.
Here is an example that most will recognise. Speaking of the typical
products of a modern high school, he writes:
Their intellectual temper is (as everyone remarks)
the reverse of dogmatic, in fact pleasingly modest. They are quick to
acknowledge that their own opinion, on any matter whatsoever, is only
their opinion; and they will candidly tell you, too, the reason why it
is only their opinion. This reason is, that it is their opinion.
That is a version of the “worst argument” because it says, in effect,
“my opinion is just my opinion – created by my genes, education etc –
so it cannot be an opinion that there is any reason to believe”. The
version that lies at the heart of postmodernism is similar, but more
general, applying to whole cultures. As Stove puts it:
The cultural-relativist, for example, inveighs bitterly against our
science-based, white-male cultural perspective. She says that it is not
only injurious but cognitively limiting. Injurious it may be; or again
it may not. But why does she believe that it is cognitively limiting?
Why, for no other reason in the world, except this one: that it is
ours. Everyone really understands, too, that this is the only reason.
But since this reason is also generally accepted as a sufficient one,
no other is felt to be needed.
I hope it is clear why the “worst argument” is so bad. As another
Sydney philosopher, Alan Olding, pointed out, it is of the same form as
“We have eyes, therefore we can’t see.”
It is hard to believe that a real live postmodernist will concentrate
long enough to take on serious arguments like those of Tallis and
Stove. The postmodernist mindset does not bother to reply to
objections. So here is my recommendation on what to say if you find
yourself arguing with one at a party. You will find that whatever you
say is met with an attempted “deconstruction” as just another symptom
of your indoctrination by the capitalist rationalist oppressors. So ask
this: “What would count as evidence against your position?”
If something is suggested, you have something to work on. Most likely,
it will become clear that nothing would count as evidence against that
position. But a position that nothing would count as evidence against
is vacuous. (If your position is “snow is white”, it is clear what
counts against it, such as seeing black snow; if your position is “snow
is white or snow is not white”, nothing counts against it because you
haven’t said anything with content.)
I have a recommendation also on what to say to the friends of the
postmodernist at the party who are shocked by your lack of tolerance
and urge you to read all of Derrida and Foucault before you rudely
dismiss their important contributions to thought. Ask them: “What is
one good idea that postmodernists have come up with?” Ten to one they
will be unable to state one idea postmodernists have come up with,
good, bad or indifferent.
A better alternative
[Image: August Highland]
In the longer term, the answer to postmodernism,
especially to its ethical appeal, must rely on Plan D: presenting a
better alternative. If the youth are being corrupted by postmodernism
through its appeal to their indignation and to their sense that there
must be more to life than the pursuit of material gain, then they can
only be rescued by presenting a more credible alternative moral vision.
So what vision? Unfortunately, there are a number of fundamentalisms
available – Islamic, Sydney Anglican, Hillsong, Environmentalist and so
on – which play well in the market. (I use “fundamentalism” here
somewhat loosely, for any position that hands down a complete scripture
and simply urges “have faith, take it or leave it”.) Fundamentalist
leaders are always encouraged by the number of fourteen-year-olds
joining up. What do you expect? It is fortunate that an Australian
teenager who signs up is not as badly off as one in the Gaza strip who
will soon find himself strapping on a bomb, but blind commitment is no
way to find the meaning of life. The Catholic tradition does not lend
itself so well to fundamentalism, since it has always approved of
philosophy, but a kind of Catholic fundamentalism is certainly possible
– for centuries, many Catholics said “If the Pope says Galileo is
wrong, then Galileo is wrong as far as I’m concerned.” That kind of
“loyalty” is not helpful.
Another alternative vision might be called “imitatory” – it is based on
presenting models that will inspire the young in the right course: the
life of Jesus in the religious realm, literary works with solid values
such as Jane Austen’s novels and Harry Potter, stories of real heroes
such as medical researchers and peace negotiators. That is a good plan
as far as it goes. It is very appropriate for the earlier years at
school. The fight against postmodernism, however, is really on the
level of theory. There needs to be a positive theoretical vision that
will support one’s initial positive reaction to heroes, instead of,
like postmodernist suspicion, undermining it.
I have a plan. It is based on presenting the absolute basics of ethics
in a way that shows their objectivity, but free from any religious
commitment. I have come to that view from a perspective of Catholic
natural law ethics, but there are other ways of seeing it – my closest
collaborator in this area is Jean Curthoys, author of an excellent book
attacking postmodernist feminist theory, Feminist Amnesia. She has a
Marxist background and sees what we are doing as a continuation of the
“liberation theory of the Sixties”.
The idea is that ethics is not fundamentally about what actions ought
to be done, or about rights, or virtues, or divine commands. Ethics
does indeed have something to say about those matters, but they are not
basic. Where ethics should start is well explained in a page of Rai
Gaita’s Good and Evil: An Absolute Conception. He asks us to consider a
tutorial in which one of the members had suffered serious torture, a
fact known to all the others in the group. If the tutor then asked
the group to consider whether our sense of good and evil might be an
illusion, “everyone would be outraged if their tutor was not serious
and struck by unbelieving horror if he was”. Scepticism about the
objectivity of good and evil, Gaita says, is not only false but a moral
offence against those who have suffered real evil.
Ethics should start, then, with a direct sense of what is good and what
is evil. To what things can good and evil happen? The death of a human
is a tragedy but the explosion of a lifeless galaxy is just a firework.
Why the difference? There is something about humans, an irreducible
worth or equal moral value, that means that what happens to them
matters a great deal. That equal worth of persons, brought home
directly to us when someone we care about suffers loss or when we
ourselves suffer an injustice, is what ethics is fundamentally about.
Other aspects of ethics follow from that. Why is murder wrong? Because
it destroys a human life, something of immense intrinsic value. (And
why is it arguable that capital punishment might nevertheless be
permissible in some extreme circumstances, although it takes a human life?
– because there is a possibility that it might deter someone from
taking many valuable lives.) Other rules of right and wrong should
follow from the worth of persons similarly (together with necessary
information about the psychological makeup of humans, which gives
insight into what is really good for them). Rights? They follow in the
same way as rules: the right to life is just the prohibition on murder,
but seen from the point of view of the potential victim; it too follows
directly from the intrinsic moral worth of the person under threat.
Virtues? The virtue of restraint or temperance, for example, is a
disposition to act so as not to harm oneself and others, so it too is
directly explicable in terms of the harm done (by drugs, for example)
to humans. Divine commands? They must be in accordance with what is
inherently right. In the Christian vision, God does support the value
of all humans. “Look at the birds of the air”, says Jesus. “They
neither sow nor reap nor gather into barns, yet your heavenly Father
feeds them. Are you not of more value than they?” Any god or purported
god who issues commands contrary to human worth, such as edicts to make
war on unbelievers, must be resisted in the interests of humanity.
Much more is needed to explain how that moral vision works itself out
in practice. It does not follow from the fact that the principles of
ethics are simple that it is easy to decide on ethical questions. On
the contrary, the fundamental equal worth of persons itself creates
conflicts when there is tension between what different people need.
Some of the issues are discussed further in my new book, Catholic
Values and Australian Realities. But I hope enough has been said to
indicate where to find an alternative, and more optimistic, vision of
human life than the simplistic travesties foisted on the long-suffering
youth of the world these past forty years by postmodernism.
James Franklin is an Associate Professor of mathematics at the
University of New South Wales in Sydney.
What is the difference between
King Lear and Ginger Meggs?
By Barry Spurr
Thursday, 01 June 2006
The real victims of Postmodernism are students who have never been
introduced to the classics of English literature.
In the best postmodern way, I should let you know at the outset that I
am not going to talk about either King Lear or Ginger Meggs. I have
juxtaposed Shakespeare’s tragic monarch and the hero of the
once-popular cartoon strip – as indeed I have juxtaposed Andrew
Marvell, the late Metaphysical poet and Mickey Mouse – in various
public ruminations about the problems associated with the reading,
teaching and appreciation of literature in English in the contemporary
classroom, specifically, in the current New South Wales Higher School
Certificate English syllabus (but, of course, not only there).
Such juxtapositions are meant to highlight the jettisoning of value in
education, in general, reflected earlier this week, for example, when
the Australian Catholic University saw fit to confer honorary doctoral
degrees on the Wiggles. The thinking (if it might be so called) behind
such events as this reveals a degraded idea of the university – if I
may use Cardinal Newman’s term of high conception in reference to
such a debased context. It goes well beyond a modern re-consideration
(and, at times, a healthy re-valuation) of received ideas about the
university and educational ideals in general to expose, in
postmodernism, an utter disconnection from and ignorance of what those
ideals might be. The Sydney Morning Herald, to its credit, derided the
“doctoring” of the Wiggles for the stunt it was. But that it was
possible at all is indeed stunting, diminishing, demoralising – and
this, in a Catholic University during Holy Week.
Yet “postmodernism” is a clumsy and unilluminating term, for various
reasons. The first has to do with “modernism” itself. In art –
literary, musical and visual – Modernism, with a capital “M”, was a
movement, largely taking place between the two world wars and, in
literature at least, having its annus mirabilis as early as 1922, with
the publication of arguably the greatest poem and novel of the
twentieth century, T.S. Eliot’s The Waste Land and James Joyce’s
Ulysses. The next generation of writers – someone, for example, like
W.H. Auden, whose artistry was maturing through the 1930s – were,
strictly speaking, post-Modernist: drawing upon what Modernists like
Eliot had achieved, but subverting aspects of that achievement, making
their own distinctive contributions. A generation later, in the 1950s,
Philip Larkin and other members of the so-called “Movement” school of
poetry, were also (and more obviously) reacting against Modernism, so
were post-post-Modernists, in the sense of their relationship with the
original (and, by now, distant) Modernist movement. So, it is hard to
pin down the Modernism to which postmodernism is related, to date its
inception, and, most challengingly, to find what common set of beliefs
and attitudes it is supposed to embody and to which domains it can be
restricted. I think we may have some idea about what
postmodernism in architecture might entail. But is there a
postmodernist approach to mathematics, for example? And what might that
be?
[Image: A scene from King Lear]
What does seem to be agreed is that the essence
of postmodernism, in relation to the reading, teaching and appreciation
of written texts, is that, first, there is no limit to be set on what
might qualify as a “text” (a bus ticket will do) and no absolute value
to be placed on any particular quality of a text: with regard to its
aesthetic value, or its significance with reference to its
meaningfulness or meaninglessness – let alone any qualities of a moral
or spiritual kind, its celebration of eternal verities (which are a
chimera in any case). Therefore, there are no “canonical” texts, for
example, in the study of literatures in English – no necessary,
required reading for graduates with an English Literature degree: a
qualification it is perfectly possible to obtain, today, without having
read a word of Milton, Pope, Wordsworth, Yeats or T.S. Eliot – the
greatest poets of the seventeenth, eighteenth, nineteenth and twentieth
centuries. Here, postmodernism is a synonym for intellectual chaos and
anarchy.
What is needed is a term for this approach to art and literature that
is not parasitical upon a previous classification like “Modernism”, but
which simply presents itself as itself, as much as this elusive
quantity can be identified. I would suggest “Anarchism” had it not been
used before, in a variety of contexts. In Greek, of course, “anarchy”
means “without authority”, the absence of an “archon”, a chief
magistrate in ancient Athens. But this is inadequate, too, because the
promoters of the brave new world of so-called postmodernism are
authoritative and prescriptive to a fault. Their “Thou shalt nots” are
at least as strident as those of the defenders of canonical texts.
Having achieved their kudos from berating and destroying the
Establishment, in the silly sixties, they are now the Establishment
themselves and will brook no contradiction – are, in fact (and I have
been around long enough to have experienced this) far less liberal than
the hierarchies they demolished, while, of course, endlessly
proclaiming their tolerance of diversity, “difference” (so much better
if you can say it in French, giving it the patina of theoretical
respectability) and all the other claptrap of a pseudo-intellectual
system which has no centre other than the individual’s conviction about
his or her ownership of The Truth. Yeats saw it clearly in “The Second
Coming” in 1919: “the centre cannot hold; / Mere anarchy is loosed upon
the world, / The blood-dimmed tide is loosed, and everywhere / The
ceremony of innocence is drowned”.
The reasons for this sustained assault upon the study of English
literature are many and varied and have some particular Australian
components which make the resistance to them even more difficult here
than it might be in, for example, Britain or North America.
Essentially, postmodernism is a political phenomenon, deriving from the
culture of resentment and victimhood which is one of the least edifying
outcomes of the increasingly democratic and demotic twentieth century
in Western societies. It is a peculiarly self-defeating,
self-destructive and paradoxical phenomenon because, in its opposition
to what – in a shorthand term – we might call “high art”, the very
people whom it is depriving of access to the classics (through
demonising them, their creators and their purveyors, as a conspiracy of
oppressive elitism or proposing the Marxist dismissal of them as
conspicuous waste) are left with a mess of pottage of works which are
condescendingly and patronisingly deemed to be the only suitable and
“relevant” study for the demos.
That deprived constituency, recognising its deprivation, will, in time,
turn upon its self-righteous persecutors and inhibitors. In a mild way,
we are encountering this already, at the university, where students
come to us after the miseries of the NSW HSC [New South Wales Higher
School Certificate] English syllabus (not to mention what has gone
before it, over those twelve years of school so-called education) and
say that, now, they want “to read the classics”. Once, in the
disreputable dead days beyond recall, pupils were exposed to a good diet
of this material even by the time of the Intermediate Certificate, let
alone the Leaving Certificate (taken in the equivalent of today’s Year
Eleven).
Nowadays, you have one of the febrile supporters of the New South Wales
Board of Studies arguing that we could not possibly expect the senior
school students of western Sydney to read Milton. That great mind has
nothing to do with their lives; they could not relate to Paradise Lost;
therefore, it must not be read. Large-scale works of English provenance
are revolting expressions of the dated grandiose imperialist patriarchy
of Britannia and old Christendom, irrelevant to our enlightened and
advanced age. And, at this point, insular Australianism usually kicks
in, with the theme of repudiating anything and everything that might be
Eurocentric to affirm our liberation from our disreputable European
past. So, prescribe some contemporary Australian trash and then claim
to be affirming the young and their class struggle against the
oppressive and supposedly monolithic past, still defended in some
reactionary quarters by dinosaurs, as I have been called (and revel in
the title).
[Image: Andrew Marvell]
One of the great defenders of the current syllabus goes
about telling students that, instead of studying Wordsworth, they
should be concentrating on his sister, Dorothy, who, oppressed by his
phallocentric, patriarchal, masculinist presence, was thwarted in her
own poetic ambition and who, had she been allowed to flourish instead
of being silenced by her brother and his work, would have written works
of genius comparable to those Wordsworth himself composed. And then we
wonder that students are skeptical about the whole process of reading
and appreciation served up to them in this context of resentment and
denigration.
What are the qualities that distinguish a great work of literary art?
All but one of these are offensive to what we might generally gather
under the umbrella term of a postmodernist approach to the reading,
teaching and appreciation of literary texts. The exception is the close
attention to structures of language (or “discourse”, as they like to
call it), animated by and expressive of that complexity and subtlety
which we expect to find in great literary texts. This, in postmodernist
textual study, at its best, is salutary. Unfortunately, it has two
major drawbacks. First, the weaker brethren find it the least congenial
process of reading, so are inclined to resort to other boiled-down
aspects of postmodernist theorising, such as the non-idea of reading
and evaluating a text purely in terms of what it says to you, the
reader, and how it speaks to your life (and your “journey”, to use one
of the terms beloved of syllabus composers), without regard to the
contexts – biographical, intellectual, historical and social – which
produced it and to which any intelligent reading of a text must submit
in order for a cogent comprehension and assessment of it even to be
possible.
And secondly, it misses the essential point of literary study, by
focusing attention on structure rather than meaning which, in
combination with the reader-centred evaluation, counteracts the power
of a great text to lift us out of our own inevitably limited selfhood
and contemporary situation to focus on a larger interpretation of life
and human existence which may utterly contradict everything that we, to
date, have believed or accepted as valuable, but which encourages our
attention because of the combination of intellectual substance and
aesthetic accomplishment which are the hallmarks of great artistic
expression in literature and which, in time, may come to sustain us in
life itself. This, for the postmodernist, is a bourgeois fantasy. The
self is the only self-sustaining entity, alone and palely loitering in
the wasted land of postmodernist subjectivism, in the final
death-throes of Romanticism which is contemporary culture.
Part of the problem with the present-day teaching of literature – apart
from the initially disabling conviction that it would be better if
“literature”, as a concept, didn’t exist – is that we know too much.
Burdened by the daunting mass of knowledge about the past, for example
– now available at your fingertips on the Internet – readers and
teachers and syllabus-composers are unsurprisingly drawn to the
watered-down versions of postmodernist theory (which, the philosophers
tell me, are so watered down as to be a contradiction of its genuine
theoretical bases in the thought of such as Derrida and Foucault).
These can, by a theoretical sleight of hand, dispose of the
requirements of layers of knowledge which were once required to be
brought to the reading of any text worth reading. When this is linked
to a politically-driven program to discredit the past in general –
which was wrong about everything (only the present, and your present,
precisely, having any value or validity) – and an aggressive rejection
of any requirement to be humble (or humbled) before the works of genius
(derided as a social construction imposed upon the powerless to ensure
their submission to elites), that anything of value from the past
survives is astonishing.
When I suggested, in a Herald article, that nobody should be able to
graduate in English Literature without having undertaken the serious
study of Milton, a furious correspondent (an English teacher) decried
my defence of Milton, asking why on earth anybody would require his
presence, as a sine qua non, on an English syllabus. That Wordsworth
himself wrote one of the most celebrated sonnets in the language about
Milton would only have confirmed her view of the conspiracy of men of
genius from which we are now being liberated by a congeries of
feminist-Marxist-ersatz PoMo-theoretical enlightenment:
Milton! thou shouldst be living at this hour:
England hath need of thee: she is a fen
Of stagnant waters: altar, sword, and pen,
Fireside, the heroic wealth of hall and bower,
Have forfeited their ancient English dower
Of inward happiness. We are selfish men;
Oh! raise us up, return to us again;
And give us manners, virtue, freedom, power.
Thy soul was like a Star, and dwelt apart;
Thou hadst a voice whose sound was like the sea:
Pure as the naked heavens, majestic, free,
So didst thou travel on life’s common way,
In cheerful godliness; and yet thy heart
The lowliest duties on herself did lay.
This irrelevant effusion, from the boys’ club of the dead poets’
society, in praise of a man of genius, by a man of genius, is in fact
literature occupied about its proper and ancient business, of the
immortal expression of profound truths, challenging the decay of
present mores – Wordsworth has the Industrial Revolution and its
aftermath in mind – but ranging over the centuries, recalling the
challenges Milton himself faced to his evolving principles at the time
of the English Civil War and celebrating the qualities of his poetic
voice and the profundity of his life, not least in his courageous
bearing of the tragic blight of blindness for a man of letters.
Wordsworth’s specific references to such as the “altar”, symbolic of
the Church, and “the heroic wealth of hall and bower”, to “manners” and
“virtue”, not to mention “godliness” and the great poet’s humility, his
“lowliest duties” construct a multi-layered poetic petition (within the
tight constraint of the sonnet-form) of moral and spiritual dignity and
urgent social concern which, certainly, has no immediate relevance to
the superficial realities of 21st-century Australian life as
experienced by an 18-year-old boy or girl. Instead, it presents a
vision of and response to life that is perennial in its scope and
expression. The challenge – and no-one is denying that the task is
difficult (that is part of what makes it worthwhile) is to submit to
what it has to say, connect with it through a considerable amount of
research (into the circumstances of the poem’s composition and its
various references), and then see how and why it has spoken to readers,
strikingly and memorably, for 200 years. But that pedagogical and
intellectual exercise, into which a gifted and dedicated teacher will
draw his or her pupils, requires a profound belief in the worthwhile
character of the exercise itself, founded, in turn, on a love of
literature. And that’s where the problems lie. The poison of
postmodernism – at least, in its boiled-down version, peddled by the
politically-driven syllabus-composers from the School of Resentment –
has effectively jettisoned such works and their appreciation (and,
indeed, love) from the curriculum. It is a betrayal of the young which
is nothing less than a disgraceful scandal.
The intelligent young have seen through it. Several have told me, in
recent years, how they went through the motions of conforming to the
syllabus formulae for the “correct” discussion of texts, using the
jargon, saying the “right” things, knowing that, in the future, they
could return to the study of literature and nurture their love for it
untrammeled by this straitjacket of the mind. But this is no
consolation for the less gifted students who should have as much right
to be exposed to the best that has been known and thought in the world,
to the great books, but who are being denied this access by soi-disant
educators who preach social liberation through intellectual and
cultural deprivation.
Dr Barry Spurr is Senior Lecturer in English Literature at the
University of Sydney.