On Dewdrops and Turning Fifty

Out of curiosity about the affinities between Stoicism and Buddhism, I decided to read Marcus Aurelius’s Meditations over the summer. There are clear and significant differences -- Buddhism, at least the Buddhism of the early scriptures, is a path of reclusive detachment, while Stoicism counsels something more like inner renunciation coupled with wise engagement with society and commitment to the public good. Still, the points of overlap are hard to overlook.

Both are preoccupied with impermanence: “all is ephemeral,” Marcus writes, advising us to “constantly observe all that comes about through change.” Surveying the ordinary human lot, he regards it as a repetitive cycle of transient preoccupations: “people marrying, having children, falling ill, dying, fighting, feasting, trading, farming, flattering, pushing, suspecting, plotting, praying for the death of others, grumbling at their lot, falling in love, storing up wealth, longing for consulships and kingships” -- with none of this leading to true happiness.

Likewise, the Diamond Sutra, a Mahayana text, compares our experience of this world to “a tiny drop of dew, a bubble in a stream, a flickering lamp, an illusion, a phantom or a dream.” A much earlier scripture, possibly recording actual words of the historical Siddhartha Gautama, puts the matter more bluntly: “Insofar as it disintegrates, it is called the ‘world.’”

Having just turned fifty, I’m tempted to say “insofar as I’m disintegrating, I’m called ‘me.’” Call me disintegration-in-progress: astigmatic eyes, muscle pains, bags and sags in new places. I’m getting to be an old hand at this, except that I’m not: we age slowly in comparison to people in past eras. With increasing life expectancy, the threshold of “old age” shifts upward: a fiftysomething in Marcus Aurelius’s time, when adult life expectancy was around 47.5, could be considered an old man.

Nor can I say that my life has been particularly imbued with flux. The lives of twenty-first-century Americans are almost fantastically stable and serene compared to the historical norm. Through our neglect and shortsightedness, we may be sowing conditions of instability for our descendants, but as of now things are as copacetic as they’ve ever been. Two centuries ago it was unusual not to have lost family members at an early age to any number of now-tamed diseases. It was unusual for a male not to have had some direct personal contact with warfare. Middle-class Americans today may face unexpected and grave disruptions: cancer and car accidents are among the most likely. But here’s the rub: people in other times and circumstances also faced the likelihood of illness or accident, along with a whole slew of other upheavals which most of us don’t even think about. Given all this, it seems presumptuous, even churlish, to complain about the degree of impermanence in my life.

On the other hand, it may be that we’re most obsessed with that which, like the sword of Damocles, can drop onto our heads at any moment. I find it interesting that the sages most closely identified with the study of impermanence were generally born into conditions of relative ease. Siddhartha Gautama was a prince of the Sakya clan, a member of the 1% of his time. When he became a wandering hermit, he left behind a milieu of affluence, comfort and power. Likewise, the early Buddhist scriptures repeatedly refer to “sons of noble family” as potential disciples. Many of the Stoic philosophers were either in the top social strata or on their way there; Marcus Aurelius, whose family owned brick and tile factories, became emperor.

It’s as though these individuals recognized that they enjoyed what most people take as benchmarks for happiness -- wealth, prestige, power -- and had the insight to see that the fundamental problem remained unsolved. They understood that the stability they enjoyed was precarious and that catastrophic upheaval was being deferred, not cancelled. Modern consumer society presents a similar problem: it promises the optimization of happiness, but a perfect happiness would, by definition, not be ephemeral, and therefore can’t be realized without somehow transcending the problem of impermanence. Failing that, the effort will at some point yield diminishing returns. And meanwhile we’re killing the planet.

I haven’t come across many -- or any, really -- practicing Stoics. Buddhism is another matter. Not only can we find Americans who have aligned themselves with the established Buddhist traditions, but Buddhist influence has extended into psychology and provides a template for many secular self-help movements. Affluence and education surely play a role here: people with means and relative leisure are better able to undertake a meditation practice, go on extended retreat, or attend expensive classes at Kripalu. Another contributing factor is the modern-day shakiness of Christianity, which used to be the go-to answer for anyone grappling with the fleetingness of our lives. The more we understand about science, the more difficulty we have in believing that prayer is effective, or that we can faith our way into an eternal realm where everything and everyone departed will return to greet us. Also, we’re consumers in a market society, and we like choice -- including in the spiritual marketplace.

Should we welcome such multiplicity? It often arouses disdain in those who have gone “all in” with one of the available paths. The really hardcore yogis, for instance, look down on the dabblers and their “dharma lite,” just as devout Catholics look down on “cafeteria Catholics” and evangelicals look down on everyone else. Whether multiplicity is desirable or not, though, it’s probably inevitable: we’re too various and restless a society for any approach to convince everyone, and we’re increasingly too complex -- a product of too many competing influences -- to be purists. We have a proliferation of answers, all aimed at addressing more or less the same unresolved questions.

All this is probably for the best. Breakthroughs in most human fields of endeavor -- science, engineering, art -- typically happen when a lot of different people are experimenting with approaches to a problem, and I don’t see why the spiritual/philosophical arena should be any different.

We might conceivably find ways for our species to be more at ease with the world and less inclined to destroy it. I sure hope so.

Quotations from the Meditations of Marcus Aurelius are from Martin Hammond's translation, published by Penguin in 2006.

Fear and Loathing

Some years ago, I had a conversation with a work colleague who told me we don’t place sufficient trust in our instincts when it comes to assessing threats. I'm not so sure. I think we may trust them more than we should.

As Steven Pinker suggests, “the mind is a system of organs of computation, designed by natural selection to solve the kinds of problems our ancestors faced in their foraging way of life, in particular, understanding and outmaneuvering objects, animals, plants, and other people.” We do a certain amount of “understanding and outmaneuvering objects, animals, plants, and other people” in our (contemporary) daily lives -- while driving, for instance. Even in those situations, though, an instinctive response is not necessarily the right response. We can kill ourselves by braking at the wrong time, out of instinct. We can create a disaster by speeding after a driver who has enraged us, chasing down this adversary as our ancestors might chase a raider from another tribe.

So while my conversation partner was right in observing that we have evolved a potent mechanism for perceiving and responding to danger, it’s probably best suited to a kind of environment that humans knew in the distant past. In a modern, urbanized setting it frequently sends out false alarms.

It’s bad enough when instinct misleads us in everyday life; it’s even worse when it distorts our view of social or political problems. For example, our perception of crime differs from the reality: over the past quarter century, crime has dropped steadily. Yet most Americans believe crime levels are going up, not down. Newt Gingrich, on CNN recently, said that while the data “theoretically may be right ... it’s not where human beings are.” Indeed, when perception differs from fact, most people will go with perception. Multiple studies, moreover, have found that being presented with contradictory evidence rarely changes opinion. Researchers at the University of Michigan found that “when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs.”

Another example: we dread terrorism but overlook other threats that are equally capable of killing us. We've evolved to be somewhat indifferent to risks arising from nature, while being extremely sensitive to risks posed by an identifiable human enemy -- militant jihadists, for instance. Because emotion is thought to be a major driver of risk perception, it's no surprise that we have little to no tolerance for the risk posed by Al-Qaeda and ISIS, even though, on average, we’re more likely to be crushed by furniture than killed by a terrorist. The 9/11 attacks were a national trauma that most Americans beyond a certain age threshold experienced, either directly or indirectly (that is, via the media), and that we associate with powerful emotions of horror, grief and fear.

As a result, we make calculation errors that inadvertently sustain terrorism by demonstrating its effectiveness in playing off the human psyche. Al-Qaeda goaded the United States into a costly and inconclusive war. ISIS and its adherents could, if we let them, have the power to sway an election (Hillary Clinton's declining poll numbers in June and July correlate with attacks in Orlando and Nice, though other factors were also in play). Terrorism is a method of influencing policy without the benefit of a large military or geopolitical clout; it works by taking advantage of known idiosyncrasies in the way we perceive risk.

We're stuck, it seems, with a flawed toolkit, and we can't simply exchange it for another; it's not easily removable. Almost everyone will, in certain circumstances, go on gut impulse even when presented with contradictory evidence. I doubt that many of us are able to be consistently rational in all situations, and that probably wouldn't be desirable anyway: for one thing, it would make movies and theme parks a lot less fun. We can and should, however, check our intuitive responses against the available data and learn to recognize false alarms.

In more situations than we’d probably care to admit, “trust your instincts” is really bad advice.

The Socioeconomics of Snow Days

Snowzilla, the Miller B-type storm that clobbered the Mid-Atlantic in late January, caused schools in my county to close for a week. That shouldn't have surprised anyone who has experienced more than one severe weather event in this area. In 2010, our kids had what amounted to a second winter break, with the HCPSS offering more or less the same reasons: it takes time to clear out the parking lots, lingering ice endangers children at bus stops, and students in rural parts of the county have to trudge a long way to school.

It shouldn't be a surprise, either, that frustrated working parents began to kvetch. During the aftermath of any snow event, there's a period of time during which everyone agrees that the schools should be closed. Then, as the driveways get dug out and the roads cleared, dissension begins to manifest. This started to happen around Wednesday. By Friday -- seven days after the onset of the storm -- the irritation index was high. Some took to the school system's Facebook page, only to be punched back by defenders of the closures.

"Not all of us 1) stay at home, 2) have relatives who can help out or 3) work [that] allows us to have make-shift take your child to work day," wrote one parent, Renee Kamen. "Not all of the residents in HoCo are privileged and [able to] just take off of work and it shouldn't be the assumption."

The opposing camp fired back, piling on the indignation. Schools are not daycare centers. The HCPSS knows whether it's safe out there, and you don't. There's still ice on the roads, dammit -- a kid could slip and fall. Don't you care about safety? What kind of parent are you, anyway?

The attempted emotional blackmail didn't faze Shannon Drury, another closure skeptic. "If we are going to wait to send students back to school until we can be guaranteed no child will be injured, we will never be sending students back to school," she pointed out.

Things took a creepy, passive-aggressive turn when some posters hinted at a wish for the children of school-criticizing parents to suffer accidents, so that these misguided folk would see the light.

I side with the misguided, personally -- if only because the other set is piling on the sanctimony and dishonesty, dismissing valid concerns, and refusing to see that this debate is about socioeconomic privilege. For the record, the extended closures are not a big problem for my family. It's good to see the kids out of doors and offline. My wife and I both do work that can be done at home, and we work for employers that make reasonable accommodations. We're lucky.

We know people who are even luckier, like a former neighbor -- a civil servant -- who is only required to be physically present at his office a few times a month. Pretty sweet deal, I say.

For some, snow days are no worry at all. They're an opportunity for hot chocolate, family memories and Facebook photo ops.

But what if you work in retail? What if you're at a company that regularly threatens its workers with layoffs? What if your work isn't transportable, and an unplanned week off the job leads to missed deadlines? What if your employer is stingy about general leave? What if you used it up already on doctor's appointments or school system "professional work days"?

Flexibility, in our society, is a perk. Just as some people work at organizations that offer excellent health care coverage, an on-site gym, or subsidized classes, some people also get the right not to feel anxious about snow days. Unfortunately, when we don't feel anxious about something, we often lose the capacity to understand others' anxiety. I see a lot of that going on in the Facebook comments.

If extended school closures aren't a headache for you, it probably means you're a stay-at-home parent, married to a stay-at-home parent, or employed at a job that lets you take leave on short notice or allows you to work at home.

And if you're snide and dismissive when someone speaks up on behalf of the less privileged, it's not because you're taking a brave stand on behalf of child safety.

You're just gloating.

Ulysses With Smartphone

For more than a quarter century, Joyce's Ulysses has been at or near the top of my literary bucket list.

I gave it a half-hearted shot on a hitchhike across Ireland in my twenties. Inspired by seeing the Martello Tower -- the outside of it, at least; the Joyce Museum was closed -- I managed about five or six pages. The tome mostly served to add extra weight to the backpack.

Over the last couple of years, I've felt that maybe I'm now focused enough to succeed where my younger self failed. (My younger self was more interested in crawling Irish pubs.) Also, my brother-in-law read it and has spoken to me on several occasions of his love for this book. And he isn't even a current or former English major. I have a graduate degree in English but have not read Ulysses, one of the defining literary works of the modern era. There's less Someday available than there used to be. It's time.

In late May, I bought a copy of the Vintage paperback edition and set a goal of 10 pages a day, which I've been able to keep (and sometimes exceed). So far, it's lived up to its reputation for being worth the trouble: it has the linguistic virtuosity that I love in Shakespeare, and the eye for human ridiculousness that I love in Monty Python. It's by turns earthy, cerebral, satirical and compassionate; high points for me include the "Hades" episode, which details Leopold Bloom's journey to the funeral of one Paddy Dignam, and the "Wandering Rocks" episode, a tour-de-force involving multiple points of view.

And yes, it's difficult -- although it presents varying kinds of difficulty. Often it's a matter of Joyce's methodology; he ditches a lot of conventional narrative glue in favor of jump-cuts, sudden shifts between interior monologue and exterior happening, free-associating banter among the characters, and so on. This requires more attention of me as a reader. In return for that closer attention, there's a sense of immediacy and direct access to the characters' experience of the world.

Another kind of difficulty in Ulysses arises from references and allusions, of which there are many -- particularly in the episodes centered on Stephen Dedalus, Joyce's fictional alter ego. If someone were to conduct a survey of readers who gave up on Ulysses, I wouldn't be surprised if the majority said they threw in the towel somewhere around page 36 or 37. This is the "Proteus" episode, which follows Stephen's trains of thought as he walks along Sandymount Strand. It's one philosophical, literary, cultural or political allusion after another, and making any sense out of the episode depends on understanding these allusions. Unless you happen to be a Jesuit-educated, early-20th-century Irish intellectual, that's not so likely.

Luckily, we have Google. And not only do we have Google, we have browser-equipped smartphones. For me, then, the experience of reading Ulysses has often involved alternating between a bulky printed book and a considerably lighter electronic device that serves as a dictionary/encyclopedia/study guide. Thanks to mobile technology, I now know that Mananaan is the Irish god of the sea, and that some accounts have the Virgin Mary claiming she was impregnated by a pigeon.

Google's also a help in navigating the novel's vocabulary, which includes numerous visitors from the hinterlands of English. Do you own a gamp? Are those birds over there rufous? How does one learn to walk in chopines? Not knowing these words makes me feel like a bosthoon.

Sometimes I get the sense that Joyce had in mind an ideal reader, one who understands Irish history, Catholic theology, Greek and Gaelic mythologies, and the landmarks of Dublin well enough to get all the references. With Google at hand, I can be that reader.

Tweenhood With Zombies

During my late childhood and tween years I often puzzled over a difficult Question. It was an especially perplexing one because I couldn't figure out how to articulate it correctly. I'm aware now that this hard-to-define Question was actually a combination of questions, and that the central one had to do with what philosophers of mind term qualia.

If I'd had the right terminology at hand, I might have asked: why is there a sense of subjective, conscious experience? And why do I have such a sense; why is it something that is instantiated in the form of a Self? Does everyone else have it too? This third question bothered me especially. It's difficult to prove that anyone else is having a subjective experience, because such experience is, well, subjective. Making statements about it requires knowledge of what's going on inside another person's head.

And even observing the contents of someone's head isn't enough: one could subject another person to an MRI and still be unable to detect whether any qualia were happening in there.

What if I'd been born into a world of zombies? Or suppose this were a world partly inhabited by zombies and partly inhabited by sentient beings who experience qualia? How could we distinguish one from the other? There seems to be no way to logically rule out the possibility that zombies are moving among us. Indeed, a now-famous thought experiment by David Chalmers invites us to conceive of a zombie twin: "someone or something physically identical to me (or to any other conscious being), but lacking conscious experiences altogether."

The creature is molecule for molecule identical to me, and identical in all the low-level properties postulated by a completed physics, but he lacks conscious experience entirely. ... He will be awake, able to report the contents of his internal states, able to focus attention in various places, and so on. It's just that none of this functioning will be accompanied by any real conscious experience. There will be no phenomenal feel. There is nothing it is like to be a zombie. (The Conscious Mind, 94-95)

Chalmers's purpose wasn't to suggest the existence of a zombie population, but to show the difficulty of explaining qualia in physical terms. If we can conceive of a zombie world, though, what's to prevent our actual world from being populated, to some degree, by zombies? How can we know?

Empathy is one way of establishing the reality of other people's subjective experience; humans generally have the capacity to "understand another person's condition from their perspective," to use Psychology Today's definition of the term. The possibility of zombies probably bothered me more at a younger age because younger people are still building their capacity for empathy, some more quickly than others (I was a slow learner). Very small children don't have the capacity at all; they are unabashedly narcissistic.

It's true that empathy is not always reliable; some people know how to fake and manipulate. In theory, a robot could mimic human body language, facial expressions, and vocal inflections so closely that we'd begin to empathize with it. We sometimes respond to the wrong cues: a dolphin, for example, is not really smiling. Its jaws are just shaped that way.

All that aside, though, empathy is reliable at least some of the time. That gives us enough to work with. We can recognize what others are going through; we can relate their experiences to ours. And empathy is not merely a process by which psychological functions in one person trigger a psychological response in another; a non-empathetic psychopath is capable of responding in this way. Rather, empathy involves the recognition that others are conscious beings. We can infer that not only are they having experiences, but that they have the experience of subjectivity -- we're all pinging the network: "here I am, here I am."

Another clue to the prevalence of qualia is the fact that people have philosophical discussions about the topic. If qualia weren't common to humans, we would have no reason to engage in such conversations. It would be like having late-night talks on the subject of zzkvgt or brgrwatsq. (That can happen, after enough booze.)

So, for me, the clincher is that the question has been defined and asked. It means others have wondered about it, intensely enough to figure out how to express it precisely. Whatever qualia actually are -- epiphenomena of the brain, the result of "psychophysical laws," or some kind of non-material property -- we can be sure others have them. Most of us, anyhow. I think.

The Historical Seventies

If you’re involved in raising a daughter in the United States, you’ll have heard of American Girls. And if you’ve heard of American Girls, you’re probably aware that there are Historical girls and Girls of the Year.

Each is a doll and also a character in a series of accompanying storybooks. But Girls of the Year are only available for one year, while their Historical counterparts stick around until the company decides to retire them.

I was mildly surprised to find out that one of the Historical girls -- Julie Albright -- has the same birth year as me. Julie (the character) and I both turned 48 this year, though she preceded me by a few months: she’s a Taurus and I’m a Leo. I grew up in the interval between Vietnam and the fall of the Berlin Wall, between the Summer of Love and the Apple IIe, between the Beatles and Wang Chung.

And so did Julie, whose name peaked in popularity in 1971, when it was #10 in the United States. (It is still in the top hundred. My name, Robert, peaked between 1920 and 1940 but was still in the top five circa 1971).

It feels odd to think of that era as “historical”. The term usually reminds me of dressed-up town centers or places like Colonial Williamsburg. Yes, significant history took place during the 1970s -- Watergate, busing, recession, the oil embargo, the non-ratification of the Equal Rights Amendment, the bicentennial, the death of Elvis Presley. But history is being made all the time; it is not quite the same thing as being “historical.”

Partly I’m surprised because I don’t think of life during the 1970s as being eye-poppingly different from life today. Families went on vacations in station wagons rather than minivans, but these are just permutations on a theme. Kids stocked up on yo-yos, bouncy balls and collectible cards, as they do now. Pokémon didn’t exist, and neither did the internet. In place of the internet, we had metal lunchboxes, preferably with Marvel Comics characters on them. There you go: the 1970s were the time before the internet. But so were the sixties, fifties, forties and every other era since the beginning of time.

I asked my seven-year-old, Heidi, what she thinks about the times described in the Julie books and in what ways they seem different from hers.

Although she took note of Julie’s cassette tape recorder, record collection and other obvious date-stamps, it wasn't the technological rewind that made the strongest impression. She said Julie’s childhood seemed “exciting, because she had lots of challenges.” In particular, she added, schools back then didn’t let girls play certain sports.

This is part of the main storyline of the first Julie book -- she’s great at basketball, but Coach Manley doesn’t want girls on the team. I won’t give away the ending, except to say that it involves Title IX.

Advocacy is a theme in the later installments, too. Fifth-grader Julie defies the status quo and runs for student council president, a post usually reserved for sixth-grade boys. She helps a deaf friend stand up to teasers and tormenters, and works to end an unfair detention system at her school. And all of this while adjusting to her parents’ divorce. If I’m making Julie sound like a Girl Power stick figure, I’m not being fair: the writer of the series has created a complex character. She has moments of doubt and despair, but is stubborn enough to keep pushing.

So there’s one angle on the seventies: an era when entrenched attitudes still kept women from realizing their potential, though things were starting to change thanks to organized campaigns, legal action and something called Feminism. That’s true about the times I remember, and in that sense they were certainly historic.

Still not sure why they’re “historical,” though -- the fight’s still on.

Unclaimed Time

Did you have a purpose-driven childhood? I didn’t.

A good portion of my pre-adolescence was spent doing nothing -- that is, nothing of particular benefit to my future college or job prospects. I drew maps of imaginary cities, hammered nails into blocks of wood, or went around to the back fence and pulled out honeysuckle pistils, hoping for a fat bead of nectar.

I also rode my bike a lot: up to "the world's oldest continually operating airport", where I could watch small planes take off and land. Or over to the post office, where the change machine sometimes malfunctioned and would give you back six quarters for each quarter you put in.

My sisters had their own ways of squandering time, though I’d be hard pressed to say what those were. They did their thing and I mine, though we’d team up when the back yard iced up in winter, enabling us to shoe-skate. Not infrequently, we squabbled and teased and whined about being bored. An undertone of monotony could be detected even when we were having fun. It was like TV hum or air conditioner noise: just part of the way things were.

We did some of the other kind of stuff, structured stuff, too. We had music lessons, played sports, learned how to swim. And of course there was school. I notice, though, that I've filed my memories of such organized, purposeful, parent-selected experiences under a separate category: I think of them as activities that occurred during childhood, but not so much as “my childhood.”

What I think of as “my childhood” took place during the other kinds of time, the unstructured intervals. Since such time wasn’t claimed by others, the ownership defaulted to me.

Ian Frazier, in his 1998 essay “A Lovely Sort of Lower Purpose,” writes about what he calls marginal activities, that is, ones that aren’t yoked to a program or plan. For a boy growing up in semirural Ohio, that meant time in the woods: constructing forts, picking berries, shooting frogs with slingshots. Time-tested fooling around, as he terms it.

“Especially as the world gets more jammed up, we need margins,” he writes. “A book without margins is impossible to read. And marginal behavior can be the most important kind. Every purpose-filled activity we pursue in the woods began as just fooling around. The first person to ride his bicycle down a mountain trail was doing a decidedly marginal thing. The margin is where you can try out odd ideas that you might be afraid to admit to with people looking on.”

Frazier connects the disappearance of margins to exoburbization and sprawl, and he’s probably right -- though maybe it's not the infrastructure itself that's at fault. Children bring a certain purposelessness to whatever they encounter; they tend to view things as a matrix of possibilities, rather than in terms of their designated functions. So anything from a mailbox to the sidewalk to the crawl space under a porch can be appropriated for their needs, which are the needs of the imagination.

The trouble, I think, stems more from the urge to optimize everything: goods, services, childhoods. The exoburb makes this easy to do: practically any capability we want our kids to acquire is on offer within reasonable driving distance. We can select from drop-down menus of possible friends, skills and pursuits.

More often than not, a kid in the exoburbs knows how to skate because at some point his or her parents said: “Would you like to learn skating? We can sign you up for Saturday mornings.”

But kids in another kind of environment take up skating because, well, there’s a pond that freezes over in winter, and the other kids are doing it, and what the frack else is there to do?

This isn’t a tirade against plans, structure or skating lessons, although it may sound like one. It’s just that structured time has a way of devouring the other kind; the impetus to optimize and maximize can make us see absence of structure the way a developer sees open land.

Kids, thankfully, will push back, demanding the right to keep some of their life off-the-clock, uncalibrated, not mapped to this or that desired outcome. True, often what they want instead is quality time with YouTube. But not always.

One of the things I'm pretty sure our kids will miss about the old house, the one we've just moved from, is the marginal space nearby -- a common area that is part of the nine-house community, but hasn’t had anything built on it. It consists of knee-high weeds, a ditch that rain sometimes turns into the tiniest of creeks, and an adjacent tree grove.

After school, if the kids and their neighborhood friends weren’t at home, or playing around the cul-de-sac, it usually meant they were there, in that use-free space.

I’d putter around in our front yard, meanwhile, pretending to do something purposeful -- rake leaves, plant grass. Every now and then one of the kids would show up asking for string or duct tape or a cardboard box.

I never had a clear idea of what they needed it for, or what they were doing with those stretches of unaccounted-for time. Whatever it was, though, it was theirs.