Archive for the ‘Mathematics’ Category

The Sherlock Holmes Law

Friday, April 3rd, 2020

I rather like Arthur Conan Doyle’s Sherlock Holmes stories. I should also admit that I’m not a hard-core devotee of mysteries in general. If I were, I probably would find the frequent plot holes in the Holmes corpus more annoying than I do. I enjoy them mostly for the period atmosphere, the prickly character of Holmes himself, and the buddy-show dynamic of his relationship with Doctor Watson. To be honest, I’ve actually enjoyed the old Granada television Holmes series with Jeremy Brett at least as much as I have enjoyed reading the original works. There’s more of the color, more of the banter, and less scolding of Watson (and implicitly the reader) for not observing the one detail in a million that will somehow eventually prove relevant.

Irrespective of form, though, the Holmes stories have helped me articulate a principle I like to call the “Sherlock Holmes Law”, which relates to the presentation of fictional characters in any context. In its simplest form, it’s merely this:

A fictional character can think no thought that the author cannot.

This is so obvious that one can easily overlook it, and in most fiction it rarely poses a problem. Most authors are reasonably intelligent — most of the ones who actually see publication, at least — and they can create reasonably intelligent characters without breaking the credibility bank. 

There are of course some ways for authors to make characters who are practically superior to themselves. Almost any writer can extrapolate from his or her own skills to create a character who can perform the same tasks faster or more accurately. Hence though my own grasp of calculus is exceedingly slight, and my ability to work with the little I do know is glacially slow, I could write about someone who can look at an arch and mentally calculate the area under the curve in an instant. I know that this is something one can theoretically do with calculus, even if I’m not able to do it myself. There are well-defined inputs and outputs. The impressive thing about the character is mostly in his speed or accuracy. 

This is true for the same reason that you don’t have to be a world-class archer to describe a Robin Hood who can hit the left eye of a gnat from a hundred yards. It’s just another implausible extrapolation from a known ability. As long as nobody questions it, it will sell at least in the marketplace of entertainment. Winning genuine credence might require a bit more.

Genuinely different kinds of thinking, though, are something else. 

I name this principle for the Holmes stories because, though Mr. Holmes is almost by definition the most luminous intellect on the planet, he’s really not any smarter than Arthur Conan Doyle, save in the quantitative sense I just described. Doyle was not a stupid man, to be sure (though he was more than a little credulous — apparently he believed in fairies, based on some clearly doctored photographs). But neither was he one of the rare intellects for the ages. And so while Doyle may repeatedly assure us (through Watson, who is more or less equivalent to Doyle himself in both training and intelligence) that Holmes is brilliant, what he offers as evidence boils down to his ability to do two things. He can:

a) observe things very minutely (even implausibly so);

and

b) draw conclusions from those observations with lightning speed. That such inferences themselves strain logic rather badly is not really the point: Doyle has the writer’s privilege of guaranteeing by fiat that they will turn out to be correct.

Time, of course, is one of those things for which an author has a lot of latitude, since books are not necessarily (or ever, one imagines) written in real time. Even if it takes Holmes only a few seconds to work out a chain of reasoning, it’s likely that Doyle himself put much more time into its formation. While that probably does suggest a higher-powered brain, it still doesn’t push into any genuinely new territory. Put in computer terms, while a hypothetical Z80 chip running at a clock speed of 400 MHz would be a hundred times faster than the 4 MHz one that powered my first computer back in 1982, it would not be able to perform any genuinely new operations. It would probably be best for running CP/M on a 64K system — just doing so really quickly.

It’s worth noting that sometimes what manifests itself chiefly as an increase in speed actually does represent a new kind of thinking. There is a (perhaps apocryphal) story about Carl Friedrich Gauss (1777-1855), who, when he was still in school, was told to add the numbers from one to a hundred as punishment for some classroom infraction or other. As the story goes, he thought about it for a second or two, and then produced the correct result (5050), much to the amazement of his teacher. Gauss had achieved his answer not by adding all those numbers very rapidly, but by realizing that if one paired and added the numbers at the ends of the sequence, moving in toward the center, one would always get 101: i.e., 100 + 1 = 101; 99 + 2 = 101; and so on. There would then be fifty such pairs — hence 50 x 101: 5050.
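Gauss’s trick generalizes: pairing the ends of the sequence from 1 through n yields n/2 pairs that each sum to n + 1. A few lines of Python (purely my own illustration, of course, not anything Gauss had) confirm the formula against brute-force addition:

```python
# Gauss's pairing trick: the numbers 1..100 form 50 pairs (100+1, 99+2, ...)
# that each sum to 101, giving 50 * 101 = 5050 without any long addition.
def gauss_sum(n):
    """Sum 1..n by pairing the ends: n/2 pairs, each totaling n + 1."""
    return n * (n + 1) // 2

# Check the shortcut against the schoolboy method.
assert gauss_sum(100) == sum(range(1, 101)) == 5050
```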

A character cannot produce that kind of idea if the author doesn’t understand it first. It makes the depiction of superintelligent characters very tricky, and sometimes may even limit the portrayal of stupid ones who don’t think the way the rest of us do.

For readers, however, it is different. Literary works (fictional or not) can open up genuinely new kinds of ideas to readers. While a writer who has achieved a completely new way of thinking about some technical problem is less likely to expound it in fiction than in some sort of a treatise or an application with the patent office, fictional works often present ideas one has never considered before in the human arena. It need not be a thought that’s new to the world in order to be of value — it needs merely to be new to you.

Such a thought, no matter how simple it may seem once you see it, can blow away the confines of our imaginations. It’s happened to me at a few different stages in my life. Tolkien’s The Lord of the Rings awakened me when I was a teenager to something profound about the nature of language and memory. C. S. Lewis’ “The Weight of Glory” revolutionized the way I thought about other people. Tolstoy’s War and Peace laid to rest any notion I had that other people’s minds (or even my own) could ever be fully mapped. Aquinas’ Summa Theologica (especially Q. 1.1.10) transformed forever my apprehension of scripture. The list goes on, but it’s not my point to catalogue it completely here.

Where has that happened to you?

Reflections on Trisecting the Angle

Thursday, March 12th, 2020

I’m not a mathematician by training, but the language and (for want of a better term) the sport of geometry has always had a special appeal for me. I wasn’t a whiz at algebra in high school, but I aced geometry. As a homeschooling parent, I had a wonderful time teaching geometry to our three kids. I still find geometry intriguing.

When I was in high school, I spent hours trying to figure out how to trisect an angle with compass and straightedge. I knew that nobody had found a way to do it. As it turns out, in 1837 (before even my school days) French mathematician Pierre Wantzel proved that it was impossible for the general case (trisecting certain special angles is trivial). I’m glad I didn’t know that, though, since it gave me a certain license to hack at it anyway. Perhaps I was motivated by a sense that it would be glorious to be the first to crack this particular nut, but mostly I just wondered, “Can it be done, and if not, why not?”

Trisecting the angle is cited in Wikipedia as an example of “pseudomathematics”, and while I will happily concede that any claim to be able to do so would doubtless rely on bogus premises or operations, I nevertheless argue that wrestling with the problem honestly, within the rules of the game, is a mathematical activity as valid as any other, at least as an exercise. I tried different strategies, mostly trying to find a useful correspondence between the (simple) trisection of a straight line and the trisection of an arc. My efforts, of course, failed (that’s what “impossible” means, after all). Had they not, my own name would be celebrated in different Wikipedia articles describing how the puzzle had finally been solved. It’s not. In my defense, I hasten to point out that I never was under the impression that I had succeeded. I just wanted to try and to know either how to do it or to know the reason why.

My failed effort might, by many measures, be accounted a waste of time. But was it? I don’t think it was. Its value for me was not in the achievement but in the striving. Pushing on El Capitan isn’t going to move the mountain, either, but doing it regularly will provide a measure of isometric exercise. Similarly confronting an impossible mental challenge can have certain benefits.

And so along the way I gained a visceral appreciation of some truths I might not have grasped as fully otherwise.

In the narrowest terms, I came to understand that the problem of trisecting the angle (either as an angle or as its corresponding arc) is fundamentally distinct from the problem of trisecting a line segment, because curvature — even in the simplest case, which is the circular — fundamentally changes the problem. One cannot treat the circumference of a circle as if it were linear, even though it is much like a line segment, having no thickness and a specific finite extension. (The fact that π is irrational seems at least obliquely connected to this, though it might not be: that’s just a surmise of my own.)

In the broadest terms, I came more fully to appreciate the fact that some things are intrinsically impossible, even if they are not obvious logical contradictions. You can bang away at them for as long as you like, but you’ll never solve them. This truth transcends mathematics by a long stretch, but it’s worth realizing that failing to accomplish something that you want to accomplish is not invariably a result of your personal moral, intellectual, or imaginative deficiencies. As disappointing as it may be for those who want to believe that every failure is a moral, intellectual, or imaginative one, it’s very liberating for the rest of us.

Between those obvious extremes are some more nuanced realizations. 

I came to appreciate iterative refinement as a tool. After all, even if you can’t trisect the general angle with perfect geometrical rigor, you actually can come up with an imperfect but eminently practical approximation — to whatever degree of precision you require. By iterative refinement (interpolating between the too-large and the too-small solutions), you can zero in on a value that’s demonstrably better than the last one every time. Eventually, the inaccuracy won’t matter to you any more for any practical application. I’m perfectly aware that this is no longer pure math — but it is the very essence of engineering, which has a fairly prominent and distinguished place in the world. Thinking about this also altered my appreciation of precision as a pragmatic real-world concept.
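That interpolation between too-large and too-small cuts can be sketched numerically. The function below is only an illustration of the idea, not a geometrical construction; it is worth noting, though, that each halving step it takes corresponds to an angle bisection, which compass and straightedge can legally perform:

```python
# Iterative refinement: close in on one third of an angle by repeatedly
# interpolating between a cut known to be too small and one known to be
# too large. Each step halves the interval of uncertainty.
def approximate_third(angle, tolerance=1e-9):
    low, high = 0.0, angle           # undershooting and overshooting trial cuts
    while high - low > tolerance:
        mid = (low + high) / 2       # bisect the current interval
        if 3 * mid < angle:          # tripling the guess falls short?
            low = mid                # then the true third lies above it
        else:
            high = mid               # otherwise it lies below
    return (low + high) / 2

third = approximate_third(60.0)      # "trisect" a 60-degree angle numerically
assert abs(third - 20.0) < 1e-6
```

No finite number of steps gives the exact third, which is just the impossibility theorem wearing engineer’s clothes; but any required precision is reached in a modest, predictable number of bisections.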

A more general expression of this notion is that, while some problems never have perfect solutions, they sometimes can be practically solved in a way that’s good enough for a given purpose. That’s a liberating realization. Failure to achieve the perfect solution needn’t stop you in your tracks. It doesn’t mean you can’t get a very good one. It’s worth internalizing this basic truth. And only by wrestling with the impossible do we typically discover the limits of the possible. That in turn lets us develop strategies for practical work-arounds.

Conceptually, too, iterative refinement ultimately loops around on itself and becomes a model for thinking about such things as calculus, and the strange and wonderful fact that, with limit theory, we can (at least sometimes) achieve exact (if occasionally bizarre) values for things that we can’t measure directly. Calculus gives us the ability (figuratively speaking) to bounce a very orderly sequence of successive refinements off an infinitely remote backstop and somehow get back an answer that is not only usable but sometimes actually is perfect. This is important enough that one classical definition takes the circumference of a circle, and hence the value of pi, as the limit of the perimeters of inscribed regular polygons as the number of sides increases without bound.
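Archimedes worked a version of this scheme by hand, stopping at a 96-sided polygon. A short sketch (my own illustration, using an algebraically equivalent but numerically stable form of the side-doubling formula) shows the polygon perimeters closing in on pi:

```python
import math

# Archimedes-style refinement: the perimeter of an inscribed regular polygon
# approaches the circle's circumference as the number of sides doubles.
# Start from a hexagon (side length 1 in a unit circle); each doubling needs
# only square roots. The rearranged formula below is mathematically equal to
# s' = sqrt(2 - sqrt(4 - s^2)) but avoids catastrophic cancellation.
def pi_by_polygons(doublings):
    n, side = 6, 1.0                 # inscribed hexagon in a circle of radius 1
    for _ in range(doublings):
        side = side / math.sqrt(2.0 + math.sqrt(4.0 - side * side))
        n *= 2
    return n * side / 2.0            # perimeter divided by diameter

approx = pi_by_polygons(15)          # a 196,608-sided polygon
assert abs(approx - math.pi) < 1e-9
```

No single polygon ever equals the circle; only the limit of the whole infinite sequence of refinements does, which is precisely the backstop metaphor above.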

It shows also that this is not just a problem of something being somehow too difficult to do: difficulty has little or nothing to do with intrinsic impossibility (pace the Army Corps of Engineers: they are, after all, engineers, not pure mathematicians). In fact we live in a world full of unachievable things. Irrational numbers are all around us, from pi to phi to the square root of two, and even though no amount of effort will produce a perfect rational expression of any of those values, they are not on that account any less real. You cannot solve pi to its last decimal digit because there is no such digit, and no other rational expression can capture it either. But the proportion of circumference to diameter is always exactly pi, and the circumference of the circle is an exact distance. It’s magnificently reliable and absolutely perfect, but its perfection can never be entirely expressed in the same terms as the diameter. (We could arbitrarily designate the circumference as 1 or any other rational number; but then the diameter would be inexpressible in the same terms.)

I’m inclined to draw some theological application from that, but I’m not sure I’m competent to do so. It bears thinking on. Certainly it has at least some broad philosophical applications. The prevailing culture tends to suggest that whatever is not quantifiable and tangible is not real. There are a lot of reasons we can’t quantify such things as love or justice or truth; it’s also in the nature of number that we can’t nail down many concrete things. None of them is the less real merely because we can’t express them perfectly.

Approximation by iterative refinement is basic in dealing with the world in both its rational and its irrational dimensions. While your inability to express pi rationally is not a failure of your moral or rational fiber, you may still legitimately be required — and you will be able — to get an arbitrarily precise approximation of it. In my day, we were taught the Greek value 22/7 as a practical rational value for pi, though Archimedes (c. 287-212 BC) knew it was a bit too high (3.1428…). The Chinese mathematician Zu Chongzhi (AD 429-500) came up with 355/113, which is not precisely pi either, but it’s more than a thousand times closer to the mark (3.1415929…). The whole domain of rational approximation is fun to explore, and has analogical implications in things not bound up with numbers at all.
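Both of those classic fractions fall out naturally from continued-fraction approximation, which Python’s standard fractions module happens to expose. This is merely a modern check of the ancient results, not anything Archimedes or Zu had at hand:

```python
from fractions import Fraction
import math

# Best rational approximations to pi with a bounded denominator:
# limit_denominator recovers 22/7 and Zu Chongzhi's 355/113 directly.
assert Fraction(math.pi).limit_denominator(10) == Fraction(22, 7)
assert Fraction(math.pi).limit_denominator(1000) == Fraction(355, 113)

# 355/113 really is more than a thousand times closer to the mark.
err_archimedes = abs(22 / 7 - math.pi)
err_zu = abs(355 / 113 - math.pi)
assert err_archimedes / err_zu > 1000
```

Remarkably, 355/113 remains the best rational approximation of pi until the denominators climb into the tens of thousands, which is part of what makes Zu’s achievement so striking.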

So I personally don’t consider my attempts to trisect the general angle with compass and straightedge to be time wasted. It’s that way in most intellectual endeavors, really: education represents not a catalogue of facts, but a process and an exercise, in which the collateral benefits can far outweigh any immediate success or failure. Pitting yourself against reality, win or lose, you become stronger, and, one hopes, wiser. 

STEMs and Roots

Tuesday, February 2nd, 2016

Everywhere we see extravagant public handwringing about education. Something is not working. The economy seems to be the symptom that garners the most attention, and there are people across the political spectrum who want to fix it directly; but most seem to agree that education is at least an important piece of the solution. We must produce competitive workers for the twenty-first century, proclaim the banners and headlines; if we do not, the United States will become a third-world nation. We need to get education on the fast track — education that is edgy, aggressive, and technologically savvy. Whatever else it is, it must be up to date, it must be fast, and it must be modern. It must not be what we have been doing.

I’m a Latin teacher. If I were a standup comedian, that would be considered a punch line. In addition to Latin, I teach literature — much of it hundreds of years old. I ask students, improbably, to see it for what it itself is, not just for what they can use it for themselves. What’s the point of that? one might ask. Things need to be made relevant to them, not the other way around, don’t they?

Being a Latin teacher, however (among other things), I have gone for a number of years now to the Summer Institute of the American Classical League, made up largely of Latin teachers across the country. One might expect them to be stubbornly resistant to these concerns — or perhaps blandly oblivious. That’s far from the case. Every year, in between the discussions of Latin and Greek literature and history, there are far more devoted to pedagogy: how to make Latin relevant to the needs of the twenty-first century, how to advance the goals of STEM education using classical languages, and how to utilize the available technology in the latest and greatest ways. What that technology does or does not do is of some interest, but the most important thing for many there is that it be new and catchy and up to date. Only that way can we hope to engage our ever-so-modern students.

The accrediting body that reviewed our curricular offerings at Scholars Online supplies a torrent of exhortation about preparing our students for twenty-first century jobs by providing them with the latest skills. It’s obvious enough that the ones they have now aren’t doing the trick, since so many people are out of work, and so many of those who are employed seem to be in dead-end positions. The way out of our social and cultural morass lies, we are told, in a focus on the STEM subjects: Science, Technology, Engineering, and Math. Providing students with job skills is the main business of education. They need to be made employable. They need to be able to become wealthy, because that’s how our society understands, recognizes, and rewards worth. We pay lip service, but little else, to other standards of value.

The Sarah D. Barder Fellowship organization to which I also belong is a branch of the Johns Hopkins University Center for Talented Youth. It’s devoted to gifted and highly gifted education. At their annual conference they continue to push for skills, chiefly in the scientific and technical areas, to make our students competitive in the emergent job market. The highly gifted ought to be highly employable and hence earn high incomes. That’s what it means, isn’t it?

The politicians of both parties have contrived to disagree about almost everything, but they seem to agree about this. In January of 2014, President Barack Obama commented, “…I promise you, folks can make a lot more, potentially, with skilled manufacturing or the trades than they might with an art history degree. Now, nothing wrong with an art history degree — I love art history. So I don’t want to get a bunch of emails from everybody. I’m just saying you can make a really good living and have a great career without getting a four-year college education as long as you get the skills and the training that you need.”

From the other side of the aisle, Florida Governor Rick Scott said, “If I’m going to take money from a citizen to put into education then I’m going to take that money to create jobs. So I want that money to go to degrees where people can get jobs in this state. Is it a vital interest of the state to have more anthropologists? I don’t think so.”

They’re both, of course, right. The problem isn’t that they have come up with the wrong answer. It isn’t even that they’re asking the wrong question. It’s that they’re asking only one of several relevant questions. They have drawn entirely correct conclusions from their premises. A well-trained plumber with a twelfth-grade education (or less) can make more money than I ever will as a Ph.D. That has been obvious for some time now. If I needed any reminding, the last time we required a plumber’s service, the point was amply reinforced: the two of them walked away in a day with about what I make in a month. It’s true, too, that a supply of anthropologists is not, on the face of things, serving the “compelling interests” of the state of Florida (or any other state, probably). In all fairness, President Obama said that he wasn’t talking about the value of art history as such, but merely its value in the job market. All the same, that he was dealing with the job market as the chief index of an education’s value is symptomatic of our culture’s expectations about education and its understanding of what it’s for.

The politicians haven’t created the problem; but they have bought, and are now helping to articulate further, the prevalent assessment of what ends are worth pursuing, and, by sheer repetition and emphasis, crowding the others out. I’m not at all against STEM subjects, nor am I against technologically competent workers. I use and enjoy technology. I am not intimidated by it. I teach online. I’ve been using the Internet for twenty-odd years. I buy a fantastic range of products online. I programmed the chat software I use to teach Latin and Greek, using PHP, JavaScript, and mySQL. I’m a registered Apple Developer. I think every literate person should know not only some Latin and Greek, but also some algebra and geometry. I even think, when going through Thucydides’ description of how the Plataeans determined the height of the wall the Thebans had built around their city, “This would be so much easier if they just applied a little trigonometry.” Everyone should know how to program a computer. Those are all good things, and help us understand the world we’re living in, whether we use them for work or not.

But they are not all that we need to know. So before you quietly determine that what I’m offering is just irrelevant, allow me to bring some news from the past. If that sounds contradictory, bear in mind that it’s really the only kind of news there is. All we know about anything at all, we know from the past, whether recent or distant. Everything in the paper or on the radio news is already in the past. Every idea we have has been formulated based on already-accumulated evidence and already-completed ratiocination. We may think we are looking at the future, but we aren’t: we’re at most observing the trends of the recent past and hypothesizing about what the future will be like. What I have to say is news, not because it’s about late-breaking happenings, but because it seems not to be widely known. The unsettling truth is that if we understood the past better and more deeply, we might be less sanguine about trusting the apparent trends of a year or even a decade as predictors of the future. They do not define our course into the infinite future, or even necessarily the short term — be they about job creation, technical developments, or weather patterns. We are no more able to envision the global culture and economy of 2050 than the independent bookseller in 1980 could have predicted that a company named Amazon would put him out of business by 2015.

So here’s my news: if the United States becomes a third-world nation (a distinct possibility), it will not be because of a failure in our technology, or even in our technological education. It will be because, in our headlong pursuit of what glitters, we have forgotten how to differentiate value from price: we have forgotten how to be a free people. Citizenship — not merely in terms of law and government, but the whole spectrum of activities involved in evaluating and making decisions about what kind of people to be, collectively and individually — is not a STEM subject. Our ability to articulate and grasp values, and to make reasoned and well-informed decisions at the polls, in the workplace, and in our families, cannot be transmitted by a simple, repeatable process. Nor can achievement in citizenship be assessed simply, or, in the short term, accurately at all. The successes and failures of the polity as a whole, and of the citizens individually, will remain for the next generation to identify and evaluate — if we have left them tools equal to the task. Our human achievement cannot be measured by lines of code, by units of product off the assembly line, or by GNP. Our competence in the business of being human cannot be certified like competence in Java or Oracle (or, for that matter, plumbing). Even a success does not necessarily hold out much prospect of employment or material advantage, because that was never what it was about in the first place. It offers only the elusive hope that we will have spent our stock of days with meaning — measured not by our net worth when we die, but by what we have contributed when we’re alive. The questions we encounter in this arena are not new ones, but rather old ones. If we lose sight of them, however, we will have left every child behind, for technocracy can offer nothing to redirect our attention to what matters.

Is learning this material of compelling interest to the state? That depends on what you think the state is. The state as a bureaucratic organism is capable of getting along just fine with drones that don’t ask any inconvenient questions. We’re already well on the way to achieving that kind of state. Noam Chomsky, ever a firebrand and not a man with whom I invariably agree, trenchantly pointed out, “The smart way to keep people passive and obedient is to strictly limit the spectrum of acceptable opinion, but allow very lively debate within that spectrum — even encourage the more critical and dissident views. That gives people the sense that there’s free thinking going on, while all the time the presuppositions of the system are being reinforced by the limits put on the range of the debate.” He’s right. If we are to become unfree people, it will be because we gave our freedom away in exchange for material security or some other ephemeral reward — an illusion of safety and welfare, and those same jobs that President Obama and Governor Scott have tacitly accepted as the chief — or perhaps the only — real objects of our educational system. Whatever lies outside that narrow band of approved material is an object of ridicule.

If the state is the people who make it up, the question is subtly but massively different. Real education may not be in the compelling interest of the state qua state, but it is in the compelling interest of the people. It’s the unique and unfathomably complex amalgam that each person forges out of personal reflection, of coming to understand one’s place in the family, in the nation, and in the world. It is not primarily practical, and we might well eschew it altogether if our highest goal were merely to get along materially. The only reason to value it is the belief that there is some meaning to life beyond one’s bank balance and material comfort. I cannot prove that there is, and the vocabulary of the market has done its best to be rid of the idea. But I will cling to it while I live, because I think it’s what makes that life worthwhile.

Technical skills — job skills of any sort — are means, among others, to the well-lived life. They are even useful means in their place, and everyone should become as competent as possible. But as they are means, they are definitionally not ends in themselves. They can be mistakenly viewed as ends in themselves, and sold to the credulous as such, but the traffic is fraudulent, and it corrupts the good that is being conveyed. Wherever that sale is going on, it’s because the real ends are being quietly bought up by those with the power to keep them out of our view in their own interest.

Approximately 1900 years ago, Tacitus wrote of a sea change in another civilization that had happened not by cataclysm but through inattention to what really mattered. Describing the state of Rome at the end of the reign of Augustus, he wrote: “At home all was calm. The officials carried the old names; the younger men had been born after the victory of Actium; most even of the elder generation, during the civil wars; few indeed were left who had seen the Republic. It was thus an altered world, and of the old, unspoilt Roman character not a trace lingered.” It takes but a single generation to forget the work of ages.

But perhaps that’s an old story, and terribly out of date. I teach Latin, Greek, literature, and history, after all.

Computer Programming as a Liberal Art

Monday, September 3rd, 2012

One of the college majors most widely pursued these days is computer science. This is largely because it’s generally seen as a ticket into a difficult and parsimonious job market. Specific computer skills are demonstrably marketable: one need merely review the help wanted section of almost any newspaper to see just how particular those demands are.

As a field of study, in other words, its value is generally seen entirely in terms of employability. It’s about training, rather than about education. Just to be clear: by “education”, I mean something that has to do with forming a person as a whole, rather than just preparing him or her for a given job, which I generally refer to as “training”. If one wants to become somewhat Aristotelian and Dantean, it’s at least partly a distinction between essence and function. (That these two are inter-related is relevant, I think, to what follows.) One sign of the distinction, however, is that if things evolve sufficiently, one’s former training may become irrelevant, and one may need to be retrained for some other task or set of tasks. Education, on the other hand, is cumulative. Nothing is ever entirely lost or wasted; each thing we learn provides us with a new set of eyes, so to speak, with which to view the next thing. In a broad and somewhat simplistic reduction, training teaches you how to do, while education teaches you how to think.

One of the implications of that, I suppose, is that the distinction between education and training has largely to do with how one approaches it. What is training for one person may well be education for another. In fact, in the real world, probably these two things don’t actually appear unmixed. Life being what it is, and given that God has a sense of humor, what was training at one time may, on reflection, turn into something more like education. That’s all fine. Neither education nor training is a bad thing, and one needs both in the course of a well-balanced life. And though keeping the two distinct may be of considerable practical value, we must also acknowledge that the line is blurry. Whatever one takes in an educational mode will probably produce an educational effect, even if it’s something normally considered to be training. If this distinction seems a bit like C. S. Lewis’s distinction between “using” and “receiving”, articulated in his An Experiment in Criticism, that’s probably not accidental. Lewis’s argument there has gone a long way toward forming how I look at such things.

Having laid that groundwork, therefore, I’d like to talk a bit about computer programming as a liberal art. Anyone who knows me or knows much about me knows that I’m not really a programmer by profession, and that the mathematical studies were not my strong suit in high school or college (though I’ve since come to make peace with them).

Programming is obviously not one of the original liberal arts. Then again, neither are most of the things we study under today’s “liberal arts” heading. The original liberal arts included seven: grammar, dialectic, and rhetoric — all of which were about cultivating precise expression (and which were effectively a kind of training for ancient legal processes), and arithmetic, geometry, music, and astronomy. Those last four were all mathematical disciplines: both music and astronomy bore virtually no relation to what is taught today under those rubrics. Music was not about pavanes or symphonies or improvisational jazz: it was about divisions of vibrating strings into equal sections, and the harmonies thereby generated. Astronomy was similarly not about celestial atmospheres or planetary gravitation, but about proportions and periodicity in the heavens, and the placement of planets on epicycles. Kepler managed to dispense with epicycles, which are now of chiefly historical interest.

In keeping with the spirit, if not the letter, of that original categorization, we’ve come to apply the term “liberal arts” today to almost any discipline that is pursued for its own sake — or at least not for the sake of any immediate material or financial advantage. Art, literature, drama, and music (of the pavane-symphony-jazz sort) are all considered liberal arts largely because they have no immediate practical application to the job of surviving in the world. That’s okay, as long as we know what we’re doing, and realize that it’s not quite the same thing.

While today’s economic life in the “information age” is largely driven by computers, and there are job openings for those with the right set of skills and certifications, I would suggest that computer programming does have a place in the education of a free and adaptable person in the modern world, irrespective of whether it has any direct or immediate job applicability.

I first encountered computer programming (in a practical sense) when I was in graduate school in classics. At the time (when we got our first computer, an Osborne I with 64K of memory and two drives with 92K capacity each), there was virtually nothing to do with classics that was going to be aided a great deal by computers or programming, other than using the very basic word processor to produce papers. That was indeed useful — but had nothing to do with programming from my own perspective. Still, I found Microsoft Basic and some of the other tools inviting and intriguing — eventually moving on to Forth, Pascal, C, and even some 8080 Assembler — because they allowed one to envision new things to do, and project ways of doing them.

Programming — originally recreational as it might have been — taught me a number of things that I have come to use at various levels in my own personal and professional life. Even more importantly, though, it has taught me things that are fundamental about the nature of thought and the way I can go about doing anything at all.

Douglas Adams, the author of the Hitchhiker’s Guide books, probably captured programming’s most essential truth in Dirk Gently’s Holistic Detective Agency:

“…if you really want to understand something, the best way is to try and explain it to someone else. That forces you to sort it out in your mind. And the more slow and dim-witted your pupil, the more you have to break things down into more and more simple ideas. And that’s really the essence of programming. By the time you’ve sorted out a complicated idea into little steps that even a stupid machine can deal with, you’ve learned something about it yourself.”

I might add that not only have you yourself learned something about it, but you have, in the process, learned something about yourself.

Adams also wrote, “I am rarely happier than when spending an entire day programming my computer to perform automatically a task that it would otherwise take me a good ten seconds to do by hand.” This is, of course, one of the drolleries about programming. The hidden benefit is that, once perfected, that tool, whatever it was, allows one to save ten seconds every time it is run. If one judges things and their needs rightly, one might be able to save ten seconds a few hundred thousand or even a few million times. At that point, the time spent on programming the tool will not merely save time, but may make possible things that simply could never have been done otherwise.
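Adams’s joke conceals a sober calculation. As a back-of-the-envelope sketch (the figures here are illustrative assumptions, not Adams’s):

```python
# When does a day spent automating a ten-second task pay for itself?
# All numbers below are illustrative assumptions.
seconds_saved_per_run = 10
day_of_programming = 8 * 60 * 60  # one eight-hour working day, in seconds

# Runs needed before the automation breaks even:
break_even_runs = day_of_programming / seconds_saved_per_run  # 2880.0

# At a million runs, the one day invested returns hundreds of working days:
runs = 1_000_000
days_saved = runs * seconds_saved_per_run / day_of_programming  # about 347
```

Past the break-even point the tool is pure gain; the deeper gain, as the paragraph above suggests, is the work that would never have been attempted at all without it.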

One occasionally hears it said that a good programmer is a lazy programmer. That’s not strictly true — but the fact is that a really effective programmer is one who would rather do something once, and then have it take over the job of repeating things. A good programmer will use one set of tools to create other tools — and those will increase his or her effective range not two or three times, but often a thousandfold or more. Related to this is the curious phenomenon that a really good programmer is probably worth a few hundred merely adequate ones, in terms of productivity. The market realities haven’t yet caught up with this fact — and it may be that they never will — but it’s an interesting phenomenon.

Not only does programming require one to break things down into very tiny granular steps, but it also encourages one to come up with the simplest way of expressing those things. Economy of expression comes close to the liberal arts of rhetoric and dialectic, in its own way. Something expressed elegantly has a certain intrinsic beauty, even. Non-programmers are often nonplussed when they hear programmers talking about another programmer’s style or the beauty of his or her code — but the phenomenon is as real as the elegance of a Ciceronian period.

Pursuit of elegance and economy in programming also invites us to try looking at things from the other side of the process. When programming an early version of the game of Life for the Osborne, I discovered that simply inverting a certain algorithm (having each live cell increment the neighbor count of all its adjacent spaces, rather than having each space count its live neighbors) achieved an eight-to-tenfold improvement in performance. Once one has done this kind of thing a few times, one starts to look for such opportunities. They are not all in a programming context.
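The inversion is easy to sketch. What follows is a minimal modern reconstruction in Python, not the original Osborne code: only the live cells do any work, each pushing a +1 into its neighbors’ counts, so a sparse board costs far less than scanning every space.

```python
def step(live, width, height):
    """Advance one generation of Life on a wrapping width x height grid.

    `live` is a set of (x, y) coordinates. Rather than having every
    space count its live neighbors, each live cell increments the
    neighbor count of its eight adjacent spaces.
    """
    counts = {}
    for (x, y) in live:
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                if dx or dy:
                    neighbor = ((x + dx) % width, (y + dy) % height)
                    counts[neighbor] = counts.get(neighbor, 0) + 1
    # Standard rules: three neighbors gives birth; two keeps a live cell alive.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}
```

A vertical “blinker” at the center of a five-by-five board flips to a horizontal one, as expected, and the loop never visits an empty region of the board.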

There are general truths that one can learn from engaging in a larger programming project, too. I’ve come reluctantly to realize over the years that the problem in coming up with a really good computer program is seldom an inability to execute what one envisions: it’s much more likely to be a problem of executing what one hasn’t adequately envisioned in the first place. Not knowing what winning looks like, in other words, makes the game much harder to play. Forming a really clear plan first is going to pay dividends all the way down the line. One can find a few thousand applications for that principle every day, both in the computing world and everywhere else. Rushing into the production of something is almost always a recipe for disaster, a fact explored by Frederick P. Brooks in his brilliantly insightful (and still relevant) 1975 book, The Mythical Man-Month, which documents his own blunders as the head of the IBM System/360 project, and the costly lessons he learned from the process.

One of the virtues of programming as a way of training the mind is that it provides an objective “hard” target. One cannot make merely suggestive remarks to a computer and expect them to be understood. A computer is, in some ways, an objective engine of pure logic, and it is relentless and completely unsympathetic. It will do precisely what it’s told to do — no more and no less. Barring actual mechanical failure, it will do it over and over again exactly the same way. One cannot browbeat or cajole a computer into changing its approach. There’s a practical lesson there, and probably a moral one too. People can be persuaded; reality just doesn’t work that way — which is probably just as well.

I am certainly not the first to have noted that computer programming can have this kind of function in educational terms. Brian Kernighan — someone well known to the community of Unix and C programmers over the years, and a central figure in the Bell Labs group that created C and Unix — has argued that it’s precisely that in a New York Times article linked here. Donald Knuth, one of the magisterial figures of the first generation of programming, holds forth on its place as an art, too, here. In 2008, members of the faculties of Williams College and Pomona College (my own alma mater) collaborated on a similar statement available here. Another reflection on computer science and math in a pedagogical context is here. And of course Douglas Hofstadter in 1979 adumbrated some of the more important issues in his delightful and bizarre book, Gödel, Escher, Bach: An Eternal Golden Braid.

Is this all theory and general knowledge? Of course not. What one learns along the line here can be completely practical, too, even in a narrower sense. For me it paid off in ways I could never have envisioned when I was starting out.

When I was finishing my dissertation — an edition of the ninth-century Latin commentary of Claudius, Bishop of Turin, on the Gospel of Matthew — I realized that there was no practical way to produce a page format that would echo what normal classical and mediaeval text editions typically show on a page. Microsoft Word (which was what I was using at the time) supported footnotes — but typically these texts don’t use footnotes. Instead, the variations in manuscript readings are keyed not to footnote marks, but to the line numbers of the original text, and kept in a repository of textual variants at the bottom of the page (what is called in the trade an apparatus criticus). In addition, I wanted to have two further sets of notes at the bottom of the page, one giving the sources of the earlier church fathers that Claudius was quoting, and another giving specifically scriptural citations. I also wanted to mark in the margins where the foliation of the original manuscripts changed. Unsurprisingly, there’s really not a way to get Microsoft Word to do all that for you automatically. But with a bit of Pascal, I was able to write a page formatter that would take a compressed set of notes indicating all these things, and parcel them out to the right parts of the page, in a way that would be consistent with RTF and University Microfilms standards.
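I can only gesture at what that Pascal formatter did, but the parceling step it performed reduces to routing notes, keyed by line number and register, into the right block at the foot of the page. Here is a minimal sketch in Python rather than Pascal; the register names and note format are my own assumptions, not the dissertation’s actual encoding.

```python
def parcel_notes(notes):
    """Sort notes into the three bottom-of-page blocks described above.

    `notes` is a list of (line_number, register, text) tuples, where
    register is 'app' (the apparatus criticus of manuscript variants),
    'fontes' (patristic sources), or 'script' (scriptural citations).
    """
    blocks = {'app': [], 'fontes': [], 'script': []}
    for line_number, register, text in sorted(notes):
        # Each note is keyed to a line number of the text, not a footnote mark.
        blocks[register].append('%d %s' % (line_number, text))
    return blocks
```

A real formatter would then have to fit these blocks onto the page and emit RTF, which is where the genuine work lay; this only shows the routing idea.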

When, some years ago, we were setting Scholars Online up as an independent operation, I was able, using JavaScript, PHP, and MySQL, to write a chat program that would serve our needs. It’s done pretty well since. It’s robust enough that it hasn’t seriously failed; we now have thousands of chats recorded, supporting various languages, pictures, audio and video files, and so on. I didn’t set out to learn programming to accomplish something like this. It was just what needed to be done.

Recently I had to recast my Latin IV class to correspond to the new AP curriculum definition from the College Board. (While it is not, for several reasons, a certified AP course, I’m using the course definition, on the assumption that a majority of the students will want to take the AP exam.) Among the things I wanted to do was to provide a set of vocabulary quizzes to keep the students ahead of the curve, and reduce the amount of dictionary-thumping they’d have to do en route. Using Lee Butterman’s useful and elegant NoDictionaries site, I was able to get a complete list of the words required for the passages in question from Caesar and Vergil; using a spreadsheet, I was able to sort and re-order these lists so as to catch each word the first time it appeared, and eliminate the repetitions; using regular expressions with a “grep” utility in my programming editor (BBEdit for the Macintosh) I was able to take those lists and format them into GIFT format files for importation into the Moodle, where they will be, I trust, reasonably useful for my students. That took me less than a day for several thousand words — something I probably could not have done otherwise in anything approaching a reasonable amount of time. For none of those tasks did I have any training as such. But the ways of thinking I had learned by doing other programming tasks enabled me to do these here.
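The tail end of that pipeline can be sketched in a few lines of Python rather than spreadsheet-and-grep form. The tab-separated input layout and the GIFT question wording here are my own assumptions, not the actual files:

```python
import re

def first_occurrences(words):
    """Keep each word the first time it appears, preserving order."""
    seen = set()
    return [w for w in words if not (w in seen or seen.add(w))]

def to_gift(tsv_lines):
    """Convert 'word<TAB>gloss' lines into Moodle GIFT short-answer questions."""
    out = []
    for line in tsv_lines:
        m = re.match(r'([^\t]+)\t(.+)', line.strip())
        if m:
            word, gloss = m.groups()
            out.append('::%s::What does "%s" mean? {=%s}' % (word, word, gloss))
    return out
```

Moodle’s GIFT importer also expects special characters such as braces, equals signs, and tildes to be escaped, so a production version would need an escaping pass that this sketch omits.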

Perhaps the real lesson here is that there is probably nothing — however mechanical it may seem to be — that cannot be in some senses used as a basis of education, and no education that cannot yield some practical fruit down the road a ways. That all seems consistent (to me) with the larger divine economy of things.

Learning and teaching…and learning

Monday, February 28th, 2011

When we first started homeschooling our kids, Christe and I generally divided our tasks according to our general areas of relative expertise — she took the more scientific and mathematical subjects, while I dealt with the more humanities-oriented ones, especially those having to do with language. But it didn’t always fall out that way, and sometimes we had occasion to cross those lines.

One of the more surprising and delightful discoveries to emerge from this process was that it offered, on occasion, an opportunity to do right what I hadn’t done terribly well the first time around. My high school math career was not a progress from glory to glory: I did pretty well in geometry, but that experience was sandwiched between twin skirmishes with algebra from which I emerged somewhat bloodied and perhaps prematurely bowed. After algebra/trig, I generally concluded that math was not for me (or that I was not for it) and I set a course that wouldn’t require me to take any more of it. I completely avoided it in college — something that I now rather regret.

But a number of years later, after college and partway through my graduate career, I found myself teaching both geometry and algebra to our kids. It was liberating to have the controlling hand on the algebra, and to realize that once I was able to see the overall rationale behind the subject, it wasn’t so hard. From here it seems painfully obvious that the whole point of algebra is to isolate the key variable for which you are solving, and simplify the expression on the other side of the equal sign as much as possible. This is the invariable task in every algebra problem. Why none of my teachers ever made that clear to me at the time is a mystery to me, though in all honesty, I’m not sure whether my failure to grasp it was their fault or my own. In any case, I’ve actually come to like and appreciate algebra after all these years. Do I use it in my daily work? No, generally not: its intersection with Latin and Greek is fairly slight. But I use it just enough, to solve for all manner of things, that I wouldn’t be without it.

It was not in algebra, however, but in geometry that I encountered my most humbling but exhilarating experiences as a homeschooling dad. We had a copy of the old Jurgensen, Brown, and Jurgensen geometry book — a traditional, solid member of the Dolciani family of texts that many of my generation used in high school. We did not own a teacher’s manual. Most of the problems in the book were reasonably straightforward, once you knew how to tackle geometry in general, and, as I said, I was fairly good at geometrical thinking. A minority of them, however, were considerably less straightforward, and a handful just stopped us — my high-school aged daughter Mary, her mathematically precocious younger brother David, and me — in our tracks. There were a few of them that occupied us for hours over the space of several days, while our progress through the text came to a standstill.

I would be lying if I didn’t admit that I was sometimes afflicted with self-doubt on these occasions. What, I wondered, am I doing to my kids? Don’t they deserve someone with more expertise here? And doubtless in some situations they would have benefited from that expertise. But they did ultimately become very good in geometry anyway, and they learned into the bargain another lesson that I couldn’t have predicted or contrived, but that I wouldn’t trade for anything. We did, I think, eventually come up with a workable solution in each case, but the most important lesson Mary and David got from the experience as a whole was a lesson in gumption. They learned that it was possible to be stuck — not just for them to be stuck, but for us all to be stuck — and still not give up. I wasn’t holding the right answer in a sealed envelope or a crystal box, ready to produce it when I figured they’d evinced enough character or good will. They came to realize that it just wasn’t about them. It was about it — what we were trying to learn and figure out. There was an objective reality out there that was the implacable goal of our efforts. We could get ourselves to the finish line, or we could not; but the finish line wasn’t moving. It wouldn’t come to us, no matter what. It was what it was.

Gumption is a virtue that has largely gone out of fashion of late in educational circles. Concern for self-esteem has in some places eclipsed it, I think, but it’s a bad bargain. Gumption of the sort I’m talking about is rooted in a healthy regard for objective truth, and for the fact that the world as a whole really doesn’t care about our self-esteem. By the same token, real self-esteem comes from measuring oneself up against that objective reality and doing something with it. The value of learning that lesson cannot be overestimated, and only a genuine appreciation and realization of what it means will turn the passive — perhaps even docile — student into a scholar in the more meaningful sense of the term. I don’t mean a professional academic, necessarily: out of our three kids, only one of them has gone on to pursue formal academics as a career path; many people in professional academics today, moreover, aren’t really scholars in the sense I’m talking about anyway. I mean something else — I’m talking about the cultivation of a bull-terrier mind that won’t take no for an answer or be deterred from finding out. I think all three of our kids got that.

That transformation is not, of course, instantaneous, and in our kids’ case it was not entirely a consequence of wrestling with a few geometry problems. But I think they helped crystallize the process, precisely because it was not a set-up thing, contrived as an object lesson, in which the answer would emerge after one had played the game for a certain number of hours, or had demonstrated a sufficient degree of effort or frustration. In the real world, the truth is not dispensed, like a treat tossed to a dog who has done a trick. It’s won through struggle. The problems may have been contrived, but our engagement with them was genuine. We realized soon enough that if we didn’t figure the problem out, we wouldn’t get the answer. We worked on these problems together, and we worked on them separately too. In retrospect, I think that the most important thing I was able to do for my kids in homeschooling them was to model my own real response to my own real ignorance.

In a world where there is so much to know, we are surrounded by nothing so much as our own ignorance. It’s with us all the time, and if we don’t confront it honestly, we’re certainly fairly far gone in a pattern of self-delusion. Having a sane attitude toward it, and a way of dealing with it, is essential to overcoming it. The victory over ignorance, however great or small, is never assured: it’s always at stake. Sometimes you just don’t get what you were striving for. There are things we still don’t know, that people have been trying to figure out for a long time. That’s okay. Victory over ignorance is not given as a reward for diligence, but it will seldom be won without hard work. Ultimately the cold fact is that each student must take responsibility for his or her own learning. Nobody else can carry that burden. Nobody — not a parent, not a teacher, not anyone — can learn for you, any more than someone else can eat, sleep, or perform any of the other basic functions of life in your place; neither can you win a race by proxy. The student who grasps that lesson, and is willing to embrace it, despite the lack of assurances, is the one that really stands to make something of education and of life.

Making Sense and Finding Meaning

Sunday, October 4th, 2009

My intermediate and advanced Greek and Latin classes are largely translation-based. There’s a lot of discussion among Latin teachers about whether that’s a good approach, but much of the dispute is, I think, mired in terminological ambiguity, and at least some of the objections to translation classes don’t entirely apply to what we’re doing. What I’m looking for is emphatically not a mechanical translation according to rigid and externally objective rules (“Render the subjunctive with ‘might’,” “Translate the imperfect with the English progressive,” or the like), but rather the expression of the student’s understanding of each sentence as a whole, in the context of the larger discussion or narrative.

We aren’t there to produce publishable translations: that’s an entirely different game, with different rules. For us, translations are the means to an end: the understanding is the real point of the process, but it’s hard to measure understanding unless it’s expressed somehow. The translations, therefore, are like a scaffold surrounding the real edifice — engagement with the text as a whole: its words, its sounds, and its various levels of meaning. That engagement is hard to pin down, but it allows us to make a genuine human connection with the mind of the author. A detached mechanical “translation”, though, is like a scaffold built around nothing, or the new clothes without the emperor. Even were artificial intelligence able to advance to the point that a computer could produce a flawless rendition of a text into another language, it still would not have achieved what is essential. It will not have understood. It will not have savored the words, grasped the concepts, combined them into larger ideas, applied them to new contexts, or come to a meeting of the minds with the author.

This is not always an easy concept for students to grasp. Some fret about getting exactly the right wording (as if there were such a thing), but are apparently less concerned with understanding the essential meaning. At the beginning of the year, I usually have a few students who make the (to me bizarre) claim, “I translated this sentence, but I don’t understand it.” My response is always some variation on, “If you didn’t make sense of it, you didn’t really translate it.”

We talk about making sense of the passage, but even that turn of phrase may be one of the little arrogances of the modern world. The prevalent modern paradigm suggests that the world is without order or meaning unless we impose it; Christianity, however, presupposes a world informed by its Creator with a consistent meaning that we only occasionally perceive. For us, it would probably be more accurate, and certainly more modest, to talk of finding or discovering the sense in the passage.

Whether we call it “making sense” or “finding sense”, though, it is not just the stuff of language classes. Every discipline is ultimately about finding meaning in and through its subject matter. In language and literature we look for the informing thought behind speech and writing. In history we look to understand the whole complex relationship of individuals and groups through time, with their ideas, movements, and circumstances, and what it all meant for them and what it means for us today. The sciences look to find the rationale in the order of the physical universe, mathematics the meaning of pure number and proportion, and philosophy the sense of sense itself. Each discipline has its own methods, its own vocabulary, and its own techniques. Each has its own equivalent of the translation exercise, too — something we do not really for its own sake, but to verify that the student has grasped something larger that cannot be measured directly. But behind those differences of method and process, all of them are about engaging with the underlying meaning. All real learning is. (In that respect it differs from training, which is not really about learning as such, but about acquiring known skills. Both learning and training are essential to a well-rounded human being, but they shouldn’t be confused with one another.)

From a secular point of view, this must seem a rather granular exercise with many dead ends. That each thing should have its own limited kind of meaning, unrelated to every other, seems at least aesthetically unsatisfying; it offers us Eliot’s Waste Land: a “heap of broken images”, pointing nowhere. Language is fractured, and our first great gift of articulate speech clogs and becomes useless.

Our faith offers us something else: we were given the power to name creation — to refer to one thing through or with another — as a way of proclaiming the truth of God, surely, but also, I think, as a kind of hint as to how we should view the whole world. Everything, viewed properly, can be a sign. As Paul says in Romans, “For since the creation of the world God’s invisible qualities—his eternal power and divine nature—have been clearly seen, being understood from what has been made, so that men are without excuse” (1:20, NIV); Alanus ab Insulis (1128-1202) wrote, about 1100 years later, “Every creature in the world is like a picture to us, or a mirror.” Signification itself is transformed and transfigured, sub specie aeternitatis, from a set of chaotic references into a kind of tree, in which the signifiers converge, both attesting the unitary truth of the Lord and endowing every created thing in its turn with a holy function.