The Prognosis

Looking the consequences in the eye

David Cayley

It is striking how often historical events arrive with their meanings plainly legible. When Napoleon clattered by Hegel’s window in Jena, in October 1806, the philosopher famously perceived in the emperor’s appearance “the soul of the world . . . seated on a horse.” The thought that fit the moment was ready for its occasion. In summer 1914, when Europe went to war “like a sleepwalker,” as the economic historian Karl Polanyi later recalled, it enacted a fate it was already dreaming. In our time, 9/11 declared its significance almost the instant it happened, as if everyone had just been waiting. With the ruins of the World Trade Center still smoking, the patent meaning of the towers’ collapse was easily parsed in the next morning’s papers: it was the end of every decadence, the beginning of a new unity, a new discipline, a new age. And so it seems to have been with the pandemic.

Everyone appeared to know right away what COVID‑19 meant. Some, like George Monbiot in the Guardian, heard “nature’s wake‑up call to a complacent civilization”; others sensed the advent of a bigger, more caring government; a few even welcomed the chance to test-drive a new health-security state, to be better prepared for even worse pandemics in the future. But everyone agreed that the world “had changed forever,” that a judgment had been passed on our heedless way of life, and that a new social condition — a “new normal,” as so many have said — was dawning.

What has impressed me about the coronavirus is the extent to which its fearsome reputation has eclipsed and occasionally exceeded its actual effects. This is not to deny that some of these effects have been, in places, quite terrible. It is only to point out that the myth of the pandemic — the story that already clothed it upon arrival — has sometimes had more influence on policy than the facts of the matter, which are more difficult to ascertain. Two events seem to have had an outsized influence. The first was the announcement by the director of the World Health Organization on March 11 that the spread of COVID‑19 should be considered a pandemic. The word hit with extraordinary force. A National Post headline encapsulated the reaction: “PANIC,” it simply said, in a font so big and bold that it occupied a good part of the front page. At the time it was written, this headline was not an accurate description of things in Canada. Aside from the play on words — pan-ic, pan-dem-ic — it can only have been an instruction or permission. From that day on, the virus became the almost exclusive preoccupation of daily newspapers, as if, suddenly, there were nothing else in the world but the contest between it and civilization.

The second signal event was the publication, on March 16, of a speculative model that had been hastily assembled by the COVID‑19 Response Team at Imperial College London. The model tried to predict what might happen in three possible cases: no intervention, moderate intervention, and aggressive intervention. In the first case, the forecast was a disaster: 2.2 million deaths in the United States, more than half a million in the United Kingdom, and so on. The second case was also pretty bad, but the predicted outcome with aggressive action was much better. This model had less foundation than the average weather forecast, since the disease was new, and, at that point, little was definitively known about either its virulence or its communicability. Nevertheless, the predictions quickly carried the day. “I don’t think any other scientific endeavor has made such an impression on the world as that rather debatable paper,” stated Johan Giesecke, a former chief medical officer in Sweden. Without visible deliberation or consultation, a direction was set: we would fight, as Winston Churchill once said, “on the beaches . . . in the fields and in the streets,” and we would “never surrender.” The reference to Churchill is not an idle one: the pandemic seemed instantly to awaken his memory. His defiant attitude and stirring rhetoric in June 1940 would become a touchstone in the weeks and months that followed, remembered and referred to again and again.

Horace Vernet, Bataille d’Iéna, 1836; Wikimedia Commons. An event that arrived already clothed in its own mythology.

Some voices were more cautious. John Ioannidis of the Stanford School of Medicine, a recognized expert in the fields of epidemiology, population health, and biomedical data science, warned of “a fiasco in the making” if draconian political decisions were taken in the absence of evidence. A number of other equally qualified doctors and medical scientists followed suit. The epidemiologist Knut Wittkowski, formerly at New York’s Rockefeller University, recommended that the disease be allowed to spread through the healthy part of the population as rapidly as possible. John Oxford, a virologist at Queen Mary University of London, warned that what we were experiencing was “a media epidemic.” In Canada, a former chief public health officer in Manitoba, Joel Kettner, phoned CBC Radio’s Cross Country Checkup on March 15 to warn against overreaction and to point out that “social distancing” was a largely unproven technique. “We actually do not have that much good evidence,” Kettner said. While it might work, he went on, “we really don’t know to what degree, and the evidence is pretty weak.” Such opinions — contrary to the headline news — were easily available to those who sought them out, but they made little dent in the emerging consensus. Kettner, for example, was treated with strained courtesy by Cross Country Checkup host Duncan McCue and then dismissed with little follow‑up. The larger narrative had already developed such momentum, and such an impressive gravity, that marginal voices had little effect.

One of the interesting features in all of this was the role the word “science” played. I have yet to hear a statement by either Justin Trudeau or Doug Ford, the two main political figures for a citizen of Ontario like myself, that fails to emphasize that he is “following science” or, often enough, “the best science,” as if others might be following the inferior kind. Yet when this began, there was little science — good, bad, or indifferent — to follow. In place of controlled, comparative studies, we had informed guesswork. No one had seen this virus before, and certainly no scientist had ever studied a situation in which an entire healthy population, minus its essential workers, was quarantined to try to “flatten the curve” or to “protect our health care system.” Such a policy had never been tried.

Behind claims that our political leaders are following science lies a fateful confusion. Does science mean merely the opinions of those with the right credentials, or does it refer to tested knowledge, refined by careful observation and vigorous debate? My impression is that when the premier of Ontario says he is following science, he is referring to the former — the opinions of his expert advisers — but, at the same time, invoking the aura of the latter — verifiable knowledge. The result is the worst of both worlds: we are governed by debatable positions but can make no appeal to science, since the general population has been convinced, in advance, that we are already in its capable hands.

This is a dangerous situation on two counts. First, it disables science. What is best understood as a fallible and sometimes fraught quest for reliable evidence becomes instead a pompous oracle that speaks in a single mighty voice. Second, it cripples policy. Rather than admitting to the judgments they have made, politicians shelter behind the skirts of science. This allows them to appear valiant — they are fearlessly following science — while at the same time absolving them of responsibility for the choices they have actually made or failed to make.

Science, in other words, has become a political myth — a myth quite at odds with the messy, contingent work of actual scientists. What suffers is political judgment. Politicians abdicate their duty to make the rough and ready determinations that are the stuff of politics; citizens are discouraged from thinking for themselves. With science at the helm, the role of the citizen is to stand on the sidelines and cheer, as most have done during the present crisis.

The decisions made at the beginning of this pandemic will have consequences that reverberate far into the future. These will include unprecedented debt, deaths from diseases that have gone undiagnosed and untreated during the COVID‑19 mobilization, lost jobs, stalled careers and educations, failed businesses, and the innumerable unknown troubles that have occurred behind the closed doors of the lockdown. Whether these harms outweigh the benefits of flattening the curve is a moral question, not a scientific one. It would remain a moral question even if the Imperial College wizards had had an infallible crystal ball and could have given us an accurate forecast.

A great part of the panic this past spring was about saving our health care system and not putting overwhelmed doctors into the position where they would have to decide who lived and who died in hospital wards. But did we not quietly make equivalent decisions about others, all the while hiding the fact that we were making them? If someone loses a business, in which they have invested everything, and then their life falls apart, have they not been sacrificed or triaged, just as surely as the old person who we feared might not get a ventilator? Moral decisions are difficult, but they should at least be faced as moral decisions.

Whenever I have seen the costs of total mobilization compared with the benefits, the costs invariably come out as substantially greater — sometimes by several orders of magnitude. For example, the epidemiologist Jayanta Bhattacharya, of Stanford University, and the economist Mikko Packalen, of the University of Waterloo, have argued in The Spectator that infant mortality will increase dramatically during the economic downturn induced by the shutdown, resulting in as many as six million deaths over the next decade. Other studies predict increased deaths from cancer and tuberculosis, as preoccupation with COVID‑19 interrupts diagnosis, treatment, and vaccination programs. Yes, these studies are speculative and may rest on questionable assumptions, but in this respect they are just like the many coronavirus models that have induced such fear. They may also involve invidious, fanciful, or impossibly abstract comparisons where one is asked — to take an instance I recently heard — to choose between “saving Granny” and “saving the economy.”

My point is not that a particular model is right or wrong. The variety of plausible scenarios indicates that we are in a condition of ignorance and uncertainty — a condition that should not be hidden by the pretense that science is lighting the way. Nevertheless, such models, as in the case of Bhattacharya and Packalen’s work, can remind us that in saving some, we may have abandoned many others, and that the ones saved will often be those who are already in the best position to protect themselves, while the abandoned will often be the weakest or most vulnerable. Put another way: political deliberation may have stopped — transfixed by the threat of the virus — exactly where it should have started.

Following the early instruction to panic, newspapers excluded all other subjects from their pages for weeks on end — as if it were almost indecent to speak of anything else. CBC Radio, with a few exceptions, followed suit. Soon, the pandemic filled the sky. Extravagant rhetoric became commonplace. One heard that everything had changed, that there had never been anything like this, that there would be no going back. The prime minister, speaking on March 25, called the pandemic “the greatest health care crisis in our history” — an astonishing remark. How can one even compare the flu‑like illness that will be suffered by most of the people affected by COVID‑19 with the ravages of cholera or the devastating impact of smallpox on Indigenous communities? Yet the prime minister’s hyperbole attracted little comment. It fit seamlessly with all the other excited talk about how “unprecedented” this all was.

The media onslaught had two great effects. The first was to transfer all agency to the virus. Governments took the measures that closed businesses and immured people in their homes — measures that really were unprecedented — but these steps were never treated as problematic or debatable, because constant reiteration of the threat posed by the virus made them seem unquestionably necessary. It was not the government that had turned the world upside down. It was the virus’s doing.

The second great effect was to establish a war psychology. That we were fighting a war, that the virus was a mighty and relentless adversary, that we must win no matter the cost — these all quickly became commonplace ideas. People love war, just as much as they hate it, and a war against an invisible foe, belonging to no race, nation, or class, was ideal. It generated solidarity; it fortified purpose; it empowered heroism; it provoked repentance. How careless we were, we said, before our invisible enemy reminded us of the things that really matter. How brave are the nurses and grocery clerks who serve on the “front lines.” Political careers have been rehabilitated, without the slightest taint of opportunism touching those who were thus revived. The becalmed government of Doug Ford suddenly had the wind back in its sails. The prime minister, hopelessly impaled by contradictions that his sunny ways had failed to overcome, became once again a healer, a generous and resolute friend, a stern father. “Enough is enough,” he reprimanded his wayward children. “Go home and stay home.” Does it sound cynical to say this? Of course it does, even if, by now, the magic has already worn off a little. Who undermines confidence in the government, or questions its motives, during a war?

Quarantine of the sick is ancient. The attempt to quarantine an entire healthy population by keeping everyone apart is novel. Has it worked? Some research says yes, some says no. In April, Isaac Ben-Israel, an Israeli scientist, published a study in the Times of Israel that suggests COVID‑19 infections have followed a remarkably similar pattern in affected countries, no matter what attempts at containment have been made. Ben-Israel, the chairman of the Israeli Space Agency and of its National Council for Research and Development, wrote,

Some may claim that the decline in the number of additional patients every day is a result of the tight lockdown imposed by the government and health authorities. Examining the data of different countries around the world casts a heavy question mark on the above statement.

It turns out that a similar pattern — rapid increase in infections that reaches a peak in the sixth week and declines from the eighth week — is common to all countries in which the disease was discovered, regardless of their response policies.

Ben-Israel’s observations may be impressive, but they don’t account for the resurgence in infections that has since occurred in Israel, the U.S., and other places.

The case of Sweden, a country that tried to steer a middle course, is likewise hard to judge. It took many precautions: shutting down universities and senior secondary schools, closing old‑age homes to visitors, encouraging social distancing, and prohibiting large gatherings. But it also kept its borders and businesses open; and its government trusted the good sense of its citizens far more than in other places. Sweden’s per capita mortality has been relatively high — lower than in the worst-affected countries but still dramatically higher than in its more tightly locked-down neighbours. How are we to interpret these numbers? Sweden is different from its neighbours — more heavily industrialized, with a bigger immigrant population and larger old‑age residences. And the countries that have suffered even worse per capita mortality did lock down.

Perhaps those who suffered more at first will suffer less later. If an effective vaccine proves elusive, as many predict, then “flattening the curve” may have meant only postponing the day of reckoning. Variations in the constantly mutating virus, along with differences in ecology, age structure, and genetic makeup, may turn out to be more significant than initially thought. I return to our fundamental ignorance — even the question of whether, and for how long, infection confers future immunity is still under active and disputed consideration. When asked to compare Sweden’s numbers with those of its neighbours, Giesecke, the former chief medical officer, gave an unusually good and truthful response: “Call me next year at this time.”

The fundamental difficulty with assessing the mass quarantine lies in the distinction between correlation and cause. The lockdowns may have little effect on the progress of the disease, as Ben-Israel tried to show, but since they occurred at the same time, they can always be assigned the credit when infections begin to diminish. Controlled study of the question would be fiendishly difficult, if not impossible, and so the whole matter must remain moot. There is no “settled science.” The question then arises: Why were we so quick to adopt such a debatable policy, and why has it been so widely acclaimed?

In many ways, we’ve been practising for this day. Consider the growing emphasis placed on safety. When I was young, people did not urge one another to “be safe,” but now it is a synonym for “see you later.” Many children have entirely lost their independence in the name of safety. Houses and cars have been fortified and securitized. Surveillance has expanded. And every new increment in safety has quickly become mandatory. It’s incredible to recall that the old CBC building on Jarvis Street in Toronto, where I worked for many years, had minimal security and more or less open public access until the late 1980s, when a frightening intruder caused some alarm. Then, in 1992, we moved to the new broadcasting centre, where gates barred the public from work areas, key cards were required for access, and we were asked to display our dog tags at all times (though few did). Immediately the former regime began to seem almost unthinkably unsafe. Good enough had turned into zero tolerance.

Risk consciousness has run on a parallel track. The idea itself is old — traders began sharing the risks of dangerous ventures millennia ago — but it has become more pervasive and more mathematical in our time. Expecting parents, long before they ever meet their child, know the probability of various conditions for which he or she may be at risk. People are regularly checked for diseases they don’t yet have, because they are at risk of getting them. We are, as the health researcher Alan Cassels once joked, pre-diseased. This fosters what might be called a hypothetical cast of thought: a habit of living in the future or acting in advance. It also accustoms people to thinking of themselves in statistical terms rather than as unique individuals. Risks pertain not to individuals but to a population, a hypothetical entity composed of statistical figments that resemble each other in some way. “My risk,” in other words, does not pertain to me personally — I remain terra incognita — but rather to my statistical doppelgänger. When awareness of risk, in this sense, reaches a certain intensity, a habit of thought forms. People are primed for impending risks. It makes sense when we are told that we have to act now, before we know anything for sure, because, if we wait, it will surely be too late.

Risk has another aspect that is relevant to the present moment. In 1986, the German sociologist Ulrich Beck described a “risk society” — a social formation that amounts to an ongoing science experiment with risks we can neither assess nor control. We have no other Earth on which we can conduct a nuclear war and observe the consequences; no spare atmosphere that we can heat up experimentally to see how things turn out. This is a terrifying situation, and it has the consequence of making us extremely risk averse. At the mercy of towering risks that we can barely comprehend, we become all the more zealous in attempting to contain more manageable ones. At least we can try to “wrestle the virus to the ground,” as multiple politicians and editorial boards have put it this year.

Then there is management, and our collective expectation that everything can and should be managed. Fifty years ago, humans first saw images of our planet hanging in space. Very quickly, these awe-inspiring photographs were domesticated, appearing on T‑shirts and key chains, fundraising flyers and advertisements. Twenty years later, a Scientific American cover featured a stylized version of the Blue Marble and the words “Managing Planet Earth.” By then, the idea had begun to seem almost plausible. Not only were we capable of managing the earth, but we had an obligation to do so in the interests of our survival. A few old souls sensed the astonishing hubris of this claim, but it was soon taken as a given. At the same time, management institutes and faculties grew more and more influential. People got used to the talk of corporate and civic re-engineering and reinvention. This bred another habit of thought: problems must be proactively managed, never simply avoided or endured.

I would note two other factors predisposing us to the supposed scientific consensus. The first is that we have grown accustomed to a state of emergency or exception, as crisis succeeds crisis in our media and our minds. It’s worth remembering that just before the pandemic struck, we were at another extremity: Indigenous protesters were disrupting the national transportation system, and the very legitimacy of Canada as an inclusive political community was being called into question. The second is the sentimentality that has become pervasive in our social and political affairs. By sentimentality, I mean a tendency to pretty things up, to speak and act as if we all felt an almost saintly ardour for the common weal, and to continually dramatize feelings we do not actually have — for example, the “thoughts and prayers” that are transmitted day and night over our airwaves. This tendency has made it easy to turn the pandemic into a morality play, with heroic front-line workers risking their lives to keep the housebound safe from harm. I do not mean to disparage the real dangers some have faced, only to point to the habit of exaggeration that endows everyone who ventures out in public to do a job with an aura of sanctity. The unctuous, honeyed tones in which the prime minister has addressed the country have been particularly egregious, but many agencies have participated in this agony of solicitude. Hospitals, for example, regularly praise their “champions” who “stop at nothing” in their exercise of “relentless care.” These effusions are a kind of blackmail that sets policy beyond the reach of careful thought by investing it with unimpeachable feeling.

To complete my litany of preconditions that led to total mobilization against the virus, I will add the halo that has appeared around the word “life.” This amounts to a new religiosity, and perhaps even to a new religion. In The New Religion of Life in Everyday Speech, from 1999, the theologian Don Cupitt argued that in daily talk, “life” has assumed all the attributes formerly possessed by God: it is providence, guardian, and guide. Life leads us, teaches us, and has its way with us. It embodies sanctity and inspires devotion. “Life,” Cupitt wrote, signifies “a thing or power or agency that carries us along as a fast-flowing river carries a boat, this way and that; a moving Power that is both immanent within us and (poetically) over against us and surrounding us; that is thought of as not only filling us and inspiring us, but also as having quasi-personal attributes [like] having things in store for us.”

This impalpable power that we so revere has also become an almost palpable entity, which we have an unquestionable duty to foster, protect, and administer. Life has become a topic in law, where one can sue for “wrongful life”; in ecology, which takes life as its subject; in theology, where it is announced as “the highest value”; in business, where every corporation fosters its “human resources”; and in bioethics, where life becomes the object of moral deliberation. All this works to turn life into a definite thing. In the discourses of life, what had been, for secular society, a quality or condition and, for the person of faith, an expression of God’s sustaining will has become a discrete quantum for which “we” feel ourselves responsible.

A ready example is the quantification of life in the news media. Catastrophes are measured by their death toll. Lives “saved” are a gauge of success. “Saving lives,” boasts Toronto’s Sunnybrook Hospital, “one innovation at a time.” These lives are an aggregate, an abstraction. We do not need to know anything about any of them to know that their conservation is an unrestricted good. They are an amount — captured by a new metric, gross national lives. In New York, Andrew Cuomo typified this attitude when he said in March, “I want to be able to say to the people of New York: I did everything we could do. . . . And if everything we do saves just one life, I’ll be happy.” There is an echo here, conscious or not, of the Talmudic teaching that whoever saves a single life saves the whole world. At the very least, the governor’s remark must be understood as a religious statement, since as a political statement it is almost criminally irresponsible. And that is my point.

The most important consequence of this new religion of life, in the present case, is the attitude it engenders toward death. When life is something that we have, not as a loan or a gift or a quality, but as a possession we’re duty-bound to secure, conserve, and extend, death becomes an obscene and meaningless enemy. To think of “the hour of our death” — the ancient formula of the “Ave Maria” — as an occasion we should contemplate and seek out comes to feel defeatist. The old, as I have discovered, are barely permitted to speak about their age without getting a pep talk in return. This makes it difficult to take the losses inflicted by the virus gracefully, even when they are unavoidable. Better to pretend with Cuomo that we will fight to the last ditch. This inability to face death, or to speak frankly about it, makes even the manifest destruction created by the lockdown seem preferable.

The most terrible aspect of the obsession with saving lives, for me, has been the way the old have been left to die alone during these past few months. This is unconditionally wrong. To justify it as an unfortunate, temporary trade-off — or as a necessity in service to the greater good — misses something fundamental. The dying should be accompanied and held, comforted and mourned by those they have loved and who have loved them. No calculus of health and safety should limit this defining obligation: it simply belongs to us as human beings. That safety has taken precedence over humanity in this way helps illustrate the substitution of lives for persons. Persons are unique — each will be born and die only once, and the respect due to these two great passages is absolute. There are fates worse than death, and one of them is the bullying of the old into the self-serving belief that we have incarcerated and abandoned them for their own good.

Many preconditions converged in a perfect storm with the onset of COVID‑19. Apocalyptic fear, sanctification of safety, heightened risk awareness, glorification of management, habituation to a state of exception, the religion of life and the fear of death — all came together. And, together, they have made it seem perfectly obvious that total mobilization was the only possible policy. How could any politician have resisted this tide? But recognizing the force of the safety-at-all-costs approach shouldn’t prevent us from looking its consequences in the eye.

Mortality will increase from all the other illnesses that have been forced to take a back seat. Many small and even large businesses will fail, while a few gigantic ones, like Amazon, will prosper even more mightily. Small businesses add colour and conviviality to our neighbourhoods and cannot be replaced by drones and trucks dispatched from distant warehouses. Jobs and opportunities will be lost, predominantly among those who are young and least established, the so‑called precariat. Civil liberties will suffer, as they already have. At the beginning of the crisis, for example, the federal government tried, unsuccessfully, to give itself broad powers to spend, tax, and borrow without consulting Parliament. Shortly afterwards, the Alberta legislature passed Bill 10, which authorizes, among other things, the seizure of property, entry into private homes without warrant, and mandatory installation of tracking devices on phones. Authoritarian governments, like Hungary’s, have gone much further in consolidating and aggrandizing power under the cover of emergency. Experience shows that all of these new powers, once assumed, will not be readily relinquished. Habits of compliance, developed in the heady days when we were “all in this together,” may prove equally durable.

We are seeing the beginnings of a thoroughgoing virtualization of civic life, not all of which will end with the pandemic. Writing in The Intercept, Naomi Klein wittily called this development the Screen New Deal. Among her evidence: the announcement that the former Google CEO Eric Schmidt will chair a blue-ribbon commission charged with “reimagining” New York. The work will focus, Schmidt says, on telehealth, remote learning, broadband, and other “solutions” that “use technology to make things better.” New York has also announced a partnership with the Bill & Melinda Gates Foundation to develop “a smarter education system.” In doing so, Andrew Cuomo called Bill Gates “a visionary,” while asserting that this is “a moment in history where we can actually incorporate and advance [his] ideas.” Do we really need “all these buildings, all these physical classrooms,” the governor asked rhetorically, given “all the technology” now available? And let’s not forget that Mark Zuckerberg has been in Washington promoting the saving role of technology in a world where people are afraid to get close to each other.

So this is the heritage: the possibility that the deaths averted by lockdowns will be offset by the deaths caused by them; spectacularly indebted governments whose deficits may threaten basic state functions; increased surveillance; reduced civil liberty; lost jobs and ruined careers; a frightened, more pliable citizenry; and an economy that has shrunk in the worst possible way by casting off the poorest and the weakest — a terrible irony for long-time advocates of degrowth, including me.

Mass quarantines and social distancing measures are easier to begin than to end. Such policies, once undertaken, assume lives of their own and tend to become the reason for their own existence. They generate the fear they ostensibly address: if we weren’t in danger, we wouldn’t be “sheltering in place” or “keeping two metres apart” or wearing masks or giving restaurants our personal details for potential contact tracing. When this all began, we were told that we must protect our health system from overload and our doctors from agonizing decisions about who should get scarce resources. Even when hospitals were not overtaxed here, I have met people, still under the influence of the initial panic, who believe they were. Nevertheless, the lockdown persisted long past the time when there was any reason to fear that our hospitals would be swamped.

To be sure, an unknown, highly infectious virus does present a serious public health emergency. And yes, we had to do something. But perhaps we responded with an extremely destructive policy; perhaps we responded without waiting to find out what we were dealing with. Our reaction requires a degree of justification that I have not yet seen from those in charge. We’ve had plenty of Churchillian rhetoric, lots of flattery, cheerleading, and sentimentality, but little that I would call debate over policy. Nonetheless, three elementary points ought to be plainly legible. First, in the absence of a vaccine, we have only postponed our reckoning with this virus. Second, our efforts to temporize rather than improvise in the face of threat have done a huge amount of harm. And, finally, the almost instant willingness to accept that “everything has changed” has opened the door to far worse evils in future. Perhaps we have been afraid of the wrong things.

David Cayley is the author of the forthcoming Ivan Illich: An Intellectual Journey.

