Elvis on My Elbow, Dylan on My Calf: Tattoos

Some time ago, I decided to get a tattoo.

There was a time when a statement like that might have inspired anything from a raised eyebrow to a rueful shake of the head to an incredulous gasp, but I confess I have no idea when that time would be. Most likely it wasn’t even within my lifetime. Tattoos are so ubiquitous today as to be something a little worse than banal — they’re predictable. In the suburbs, it’s tramp stamps and tasteful ankle and shoulder decorations; in Chicago, where I live, half-sleeves are apparently the minimum in order to get hired in any restaurant, bar or Apple retail store. Any overtones of rebellion or non-conformity that tattoos might have had are long gone. For a substantial portion of my age group, getting inked is simply an ordinary aspect of becoming an adult, about as out-there as getting a passport.

Back in the ’80s, George Carlin complained that wearing an earring had been drained of all its revolutionary impact: “It was supposed to piss off the squares. The squares are wearing them now!” Likewise, whereas getting tattooed once (literally) branded you as belonging to a group situated a marked distance from mainstream society, today it means almost the opposite, a necessary signifier of a certain urbane, would-be sophistication. It’s strange to think that something as radical as painting your own skin would become common enough to carry a faint whiff of conformity.

This has always presented something of a dilemma for me. I am a non-conformist of the quiet type, meaning I don’t have the balls to chuck my nine-to-five job and become a freegan with a vegetable-oil-powered van, but I do take a quiet pleasure in steering clear of the most egregious fads. Tattooing has reached the point of cultural saturation where my contrariness reflex normally kicks in. I ought to hate the whole idea of it. The proliferation of tattoos today, hundreds and hundreds of them everywhere I go, bothers me. I don’t want to be like these people (even though I probably already am, in more ways than I care to admit) … but then, I don’t want to be like any people. Continue reading

How Hot It Was, How Hot

Summers didn’t use to be like this. As a kid in the suburbs, I would run from the coolness of our front porch into a street warmed to perfection by a sun that never seemed to overstep its bounds. Temperatures tended to hover in the low eighties most days, with the occasional burst of rain or heat. There were breezes that actually felt refreshing, rather than like blasts from an oven.

This is not nostalgia for the past. A lot of things sucked back then. But in general, the weather wasn’t one of them.

I know this because the kind of hellishly hot days we now experience all the time in Chicago — days when the air lies on you like a searing blanket, thickening your breath, impossible to ignore — were the days I looked forward to as a kid, and there weren’t all that many of them. Hot days were swimming days. When I was young, my neighbors had a pool; later on, we had one. I was never one of those kids who could swim in any kind of weather. Eighty-two degrees is a perfect summer day, but too cool for swimming, at least for a skinny kid with a poorer-than-average ability to regulate his own body temperature. When the weatherman forecast a day in the nineties, it was a treat — it meant there was definitely going to be swimming somewhere, and on a warm day like that, there was no reason to get out of the pool, save for the occasional food and bathroom break. (There is nothing quite like the mix of pleasure and icy agony of going into an air-conditioned house wearing a wet bathing suit on a ninety-plus-degree day.)

[Photo: Various children of the '80s beating the heat. I'm the kid at center left with the somewhat skeptical look on his face.]

I don’t recall precisely when I noticed that this fine homeostatic balance was upset, but I’m pretty sure it was before I had a real understanding of what “global warming” meant. Seemingly all of a sudden, air conditioning went from being an occasional amenity to a vital tool for survival, running for days and even weeks on end. When I began driving, I rarely used my car’s air conditioner: better to feel the wind in my face, I thought, and besides, it’s not that hot out. Now it’s constantly that hot out, and that humid on top of it.

The only consolation we have is that the heat affects pretty much everyone to the same degree. There is a sense, at least on my part, that everyone is in the same boat, dealing with the same back sweat, the same lethargy, the same lingering sense that one should never be outdoors and among strangers while feeling this sticky, smelly and gross. Even with air conditioning, the heat cannot be managed away. It is nature intruding on us coarsely, insisting we give it its due. It’s a little humbling, when all is said and done, and in some lobe of my brain capable of abstract reasoning, I think that’s a good thing. But I still hate the goddamn heat.

The Last Pepsi

We were a Pepsi household growing up. We bought it in glass bottles, eight to a case, which we had to return to the store once they were empty; I remember riding my bicycle to the store holding a rattling case of empty Pepsi bottles on the handlebars. During the summer, some stores would sell them chilled, but usually the cases came home with us at room temperature and sat on the floor between our refrigerator and cabinet.

I loved it, when I was permitted to have it. My parents were responsible enough not to let me feed my soda monkey at will. I could not drink it at dinner, unless the meal was pizza; my mandated beverage at mealtimes was milk. I could get away with it in the evening, or with an afternoon snack. Gradually, as I came within sight of adulthood, I drank milk less and less, and Pepsi more and more. I went away to college, where no one was around to tell me what I should be drinking with dinner, or lunch, or in between meals.

I have easily drunk 10,000 Pepsis in my life; the real number could be half again as high. I drank it out of cans, glass bottles and, when neither of those was available, plastic bottles, and could taste the difference in each container. I figured out just how much ice to put in a glass to chill the liquid without diluting it too much; if it got flat, I threw it away. If I were looking for a place to grab lunch and had no particular taste for anything, I would pick a franchise that served Pepsi over one that didn’t. I didn’t drink it at breakfast, but I drank it pretty much any other time, with every food short of chocolate cake.

And now, to quote Henry Hill, it’s all over. I have been diagnosed with seriously high blood sugar and a severe (and surely not coincidental) sensitivity to cane and corn sugar. I drank my last Pepsi this past Tuesday, May 8, at lunch. Continue reading

I’m Like, I Said

Or, In Defense of a Much-Loathed Linguistic Trend

So I was talking to my boss the other day and I was like, “Does anyone know what they’re doing on this project?” And he was like, “I wish.”

Now, what did I just say there?

People have been lamenting the decline of the verb to say for a surprisingly long time — at least as long as I’ve been around, which is enough. When I was growing up, the culprit was goes:

“So he goes, ‘What are you doing this weekend,’ and I go, ‘Going to a stupid family reunion’.”

I never liked goes very much. As a writerly type, I always felt an obligation to speak properly, whatever that meant, and to not give in to imprecision, trends, laziness or other bad linguistic habits. (That doesn’t mean I correct other people when they do it, but that’s for another post.) In college I took some linguistics courses — well, all of two, but it didn’t take much to change the way I think about language. The thing that struck me most was the distinction linguists make between being descriptive and prescriptive. As far as I had always known, as far as I had ever been taught, the only relevant issues concerning writing, speaking and language related to what you should do. Don’t end a sentence with a preposition. (Actually, it’s OK to do that.) Avoid double negatives. Make positive statements rather than negative ones (“I forgot” versus “I didn’t remember”). It hadn’t really occurred to me that it was possible to take a different stance: that of the impartial observer, dissecting the ways in which people bend and shape the language to suit their needs, just as they’ve been doing ever since they started talking. Continue reading

Welcome to the Hamster Hotel

Reading blurrpy.com earlier this week, I came upon a link to a most wondrous thing. Some design firm called O*GE Creative (the asterisk adds a lovely note of pretension, don’t you find?) created a giant, human-habitable birds’ nest.

The giant birds’ nest was created “as a prototype for new and inspiring socializing space, which can be seen as a morph of furniture and playground … Ready to be used, to be played in, and be worked in.” I think it’s a marvelous idea, and one I am certain to have in my house, once I win the lottery and begin establishing my network of seasonal homes across the globe. But a work space? The thought of clambering into this thing with my colleagues to discuss our latest projects gives me the heebies. It would feel way too much like climbing into bed and I really want to stop thinking about it. Besides, I sometimes have a terrible time staying awake in meetings, and nestling into this, well, nest would be like mainlining an Ambien drip straight into my cerebellum, or whatever part of the brain gives me that happy tired feeling at the end of the day.

So I won’t be pushing to have the giant birds’ nest installed in our office anytime soon. But it did remind me of an idea I had a long time ago that I can’t seem to let go of. It concerns hamsters. Continue reading

Did You Ever Have to Remake Up Your Mind?

Or, How to Convert an Atheist in Seven Extremely Difficult Steps

Faith, defined a little too simply, is a belief one holds without evidence. Perhaps that definition sounds somewhat derogatory or appears to contain an implied rebuke. But people of all stripes have beliefs they cling to for no intellectually defensible reason, whether they be common superstitions (“Crime is more prevalent during the full moon” — it isn’t), personal idiosyncrasies (“Something good always happens to me when I wear my lucky sweater”) or even moral or philosophical precepts (“If I make a point of being trusting and kind, others will be encouraged to follow my example”). Most beliefs of this sort are quite harmless, a few are beneficial and the rest are a small price to pay for the freedom to be occasionally irrational. I think it would be a terribly dull world if everyone had a solid empirical basis for everything they did. Besides, I’d probably have to stop buying lottery tickets, and I like having something to fuel my daydreams.

The snag is that a belief held without evidence is also extremely resistant to change. Christopher Hitchens once said that anything that is claimed without evidence can be dismissed without evidence. That’s an intellectually justifiable position, but not a very satisfying one, at least not if you find yourself wrangling with someone whose judgment you otherwise respect about an issue you can’t agree on. Faith beliefs are felt in the gut; they accord with our sense of how the world operates and are the result of influences we are mostly unaware of, from our parents and families to the media messages we’re exposed to every day. Though I defend recreational irrationality, I don’t hold it as justification for never changing your mind. Resistance to evidence is usually rooted in fear: fear of admitting you may be wrong and feeling stupid, fear of having your worldview attacked, fear of having to start at square one in determining just what it is you believe. This kind of fear is unhealthy and ought to be stood up to, at least once in a while. So occasionally I undertake the mental exercise of determining what it would take to change my mind on an issue I care deeply about. Today’s issue: religion.

I am an atheist, and I am an atheist of a particular stripe: I do not believe in a god or gods. That is not the same as saying “there is no god.” The latter is a statement about the nature of reality, the former about one’s own knowledge and the limits thereof; another way of saying it might be “I have seen no evidence of a god.” This distinction is sometimes called “soft atheism” versus “hard atheism” (neither of which is to be confused with agnosticism, an oft-misused word that describes the belief that god’s existence or non-existence cannot truly be known by human standards). In practical terms, there is not much daylight between the two positions, and holders of either belief/nonbelief would be indistinguishable in how they lived their lives. The only difference is that one has come to a conclusion and the other hasn’t. In the spirit of jiggling a knife into that small chink in the armor of certainty, and in keeping with Carl Sagan’s dictum that “extraordinary claims require extraordinary evidence,” here are the conditions I would require to renounce my atheism and adopt a belief in god. Continue reading