Sunday, August 12, 2007

Reasons why Harry Potter has become a modern classic

“Dark and difficult times lie ahead. Soon we must all face the choice between what is right and what is easy.”—Albus Dumbledore in Harry Potter and the Goblet of Fire.

In the world of flashy media we live in, you’d think getting a kid to read a book would be hard. And if getting them to read any book is hard, you’d think it impossible to get a 10-year-old to voluntarily read an 800-page one.

But J.K. Rowling has made the seemingly impossible downright attractive to not just children, but adults as well.

Okay, I admit it. I’ve read all the Harry Potter books, in order, cover to cover.

But don’t mistake me for a fanatic. I can’t recite chapter and verse of the books. I can’t quote Dumbledore off the top of my head, and I definitely don’t remember what spell does what.

But I do love them.

It’s been such a long time since a set of books came out that became instant classics—and J.K. Rowling’s work definitely qualifies.

So what’s the real attraction?

Magic is always exciting, especially for the younger, more imaginative set. But I don’t really think that’s it. There are plenty of books that have magical themes.

And the story is good, really good and extremely well-written. But I don’t think that either is entirely responsible for why kids (and adults) can’t get enough of Harry Potter.

So why would millions upon millions of people flock to bookstores and stay up ‘til midnight to get the next installment? Why has this series of books been translated into more languages than I can count?

(By the way, I didn’t stand in line for my copy of the seventh and last book in the series, but I did buy it the next day, and read it in its entirety over a very delightful long afternoon, sitting in front of a fan, eating blueberries, chocolate and ice cream.)

Ironically, it may be because the books, though fantasy in nature, contain many truths, and they don’t make any attempt to sugar-coat bad things.

Obviously, I don’t know that this is consciously in the mind of every child that reads them, but even to my adult mind, the lack of sugar-coating is very refreshing.

Though the books are fantasy, the characters are real. They have flaws, they make mistakes. And they deal with their mistakes the best they can. Just like most good people do in real life.

Also, there is empowerment in the idea that though a situation may be daunting, with friends, skill and perhaps a little luck, even the most dire of circumstances can be dealt with.

These are themes one is quite unlikely to find on television or in the average movie. But it would seem that even though there is a young generation that has never known the world any differently, they still crave more than big media usually offers.

Today’s world can be scary, especially to the younger set, some of whom cannot remember a time when this country was not at war, or when “terrorist” was not in every American’s everyday vocabulary.

But even if this were not the case, the world can be a scary place. Heroes who actually do good and are skilled in an art they had to work for are attractive role models, something this world can always use a few more of. People who won’t bow down in the face of tremendous adversity. People who seek truth, and eschew what is easy for what is right.

And, in Harry Potter, such heroes have been born, and they will probably continue to delight and educate children, as well as some adults, for decades to come.

Which is a good thing, because every generation has its obstacles and burdens. And, no matter where or when you live, whether here or in Harry Potter’s world of fantasy, there are always dark and difficult times, and we always do choose between what is right and what is easy.

The Harry Potter novels not only remind you of this truth, along with many others that bear remembering, but provide a good, long, nearly 800-page opportunity to enjoy a wonderful adventure, whether you have the blueberries, ice cream and chocolate on hand or not.

See you soon.

Pseudo-‘green’ brings in the other kind of green—money

It seems like you can’t get very far these days without hearing something about the environment. That there are real environmental problems being caused by human activities, that we are affecting our habitat in a negative way, is not in dispute by anyone who is bothering to look at the facts being presented by scientists around the world.

So, it’s no wonder that phrases like “eco-friendly,” “greener (fill in the blank),” and “sustainable development” are being used to market all sorts of things. And, indeed, using these terms does help the bottom line, or we wouldn’t see them being touted so often.

That phrases like “sustainable development” are successful marketing tools shows that the public is at least somewhat aware of the damage being caused by runaway consumerism and land development, and some want to help lessen the impact (or at least feel less guilty about their own personal contribution to it).

But the more I hear about “greener” things, the more cynical I find myself becoming.

It’s not that I don’t believe in environmental causes; I most fervently do. But I think these phrases are being used to divert attention from the real environmental issue: at its core, our way of life, as it is, is not in any way, shape or form sustainable, and any truly sustainable society we create is going to require some major changes in the way we think in this country.

At first that sounds like a very scary prospect, one that brings to mind dreary pictures left over from the ’60s and ’70s of depressing, under-heated and dim dwellings reminiscent of Soviet Russia. (Indeed, this impression may have been purposely left to discourage any real conservation efforts on the part of the public. It may sound paranoid, but it has been documented that some so-called grassroots efforts have in fact been funded by large corporations, such as petroleum companies, to promote or oppose efforts on those companies’ behalf.)

Other “conservation” efforts may have left the public with the impression that public transportation or bicycles are the only real alternatives to owning a car of one’s own, and both of those options are perceived as being not only inconvenient, but for the lower classes (buses) or health nuts (bicycles).

We are told over and over that new, cleaner technology is a few years down the road. We’ve been told this for decades, but it never seems to materialize. In 1979, then-President Jimmy Carter climbed to the White House roof to show off the building’s new solar energy system. Granted, it did not power everything, but it was a start, not to mention a major show of support. The system supplied hot water to the White House for years afterward, until the Reagan administration had it dismantled and taken down in 1986. Guess which president had major stock and friends in the petroleum biz.

The technology doesn’t materialize because the effort is not being put into it. I can’t believe we can figure out how to find whole new ways of getting more oil out of shale in just a couple of years, but no one has managed to make and market an efficient solar collector in three decades. The answer lies in motivation—here, shortsighted profits. I have every confidence that if half the effort were put into marketing and developing reliable clean technologies, they would be very profitable—just not necessarily for the oil guys.

But new technology is just a small part of the picture. Our overall consumption and business models that are dependent upon ever-increasing profits, both nearly inextricably linked, have a lot more to do with our situation—and the solution.

First, let’s start with the idea that creating a truly sustainable society will require major sacrifices on everyone’s part, and that our standard of living will be lowered. I guess it depends on what you mean by sacrifices. If you will miss the countless tons of packaging that is created specifically to become trash, well, I suppose that will be a sacrifice. But if you realized how much of the price you pay for any given item is tied up in unnecessary packaging, you might feel differently. Consider that all that extra trash not only takes up precious space in a landfill, but used natural resources such as petroleum and trees to create, and used still more natural resources to transport from its origination point to assembly point to store to your trash bin, where it is finally picked up by, and takes up space on, a garbage truck that also uses petroleum. And if you or your child were healthier, and say, didn’t have asthma as a result of all the pollution that was never created along with all that extra trash, you might just decide you’re entirely better off without all those pieces of glitz designed to convince you to buy something whose lifespan was all of 35 seconds, but whose utterly useless remains may be with us for the next 20,000 years.

In many cases, the things we think of as more permanent aren’t much better. Durable goods and household appliances don’t have the longevity they once had, by design no less. The calculation of this started decades ago and was chronicled by Vance Packard, of “The Hidden Persuaders” fame, in his aptly titled book, “The Waste Makers.”

You see, the things we so often take for granted in today’s marketplace were actually carefully calculated and engineered decades ago, after World War II. The problem is, what was good for the short term then is definitely not good for the long term now, but we’re told over and over, “This is the way it is,” with the implication that we’re stuck, that there is no other way.

George W. Bush has proclaimed this country is addicted to oil, but it was force-fed to us, along with the concept of never-ending increase in profits and a throwaway society—and now our infrastructure is geared to it. We, the public, are not addicted to oil, but its purveyors are definitely addicted to its ever-increasing profits as it grows scarcer. We, the public, however, are definitely addicted to luxury and the “quick fix.”

And, as long as everyone “needs” a new cell phone and computer every year, with no way of recycling the old ones, as long as everyone “needs” the latest, greatest bit of plastic crap or gizmo or toy that will undoubtedly be outdated by next week, necessitating a completely new one, as long as everyone “needs” a new kitchen every few years instead of every few decades, this will not end.

Overall, the idea of changing our ways for real does not have a very good chance of succeeding if we don’t embrace some of the ideas of our forefathers, who got along very well with what they had. Well enough, in fact, that we refer to that undefined era as “the good old days.”

Which isn’t to say that we should go backwards, or that everything in the past was rosy. Certainly, it wasn’t. But there are some “old fashioned” ideas that, perhaps, should have never been discarded, and could even be expanded on.

Things that seem small, like the concept of returnable soda bottles, could really make an impact if implemented. It’s a pretty safe bet that, on average, at least one bottled or canned soft drink is consumed per person per day in the City of Easton. In just that one local urban municipality, then, some 27,000 containers could be kept out of a landfill daily, not to mention the energy that would be saved by not having to remanufacture or physically recycle those containers, but only wash and sanitize them for reuse.

“Yes, but what about all the different brands?” you ask.

Well, that’s where the problem of infrastructure comes in. But it’s not too difficult to solve, if we just had the motivation. Bottles could always be returned and sorted at their point of origin—in other words, where one bought the product in the first place. Certainly grocery stores would have no problem coming up with gimmicks to go with this and to entice customers to participate. And, then there’s always the idea of generic bottles, with paper or printed labels.

Another concept that shouldn’t have been abandoned is the well-stocked corner grocery store. While the big box grocery stores we have become accustomed to may seem like they offer more, very few really offer a better selection than one could get close to home in the past, and they actually offer the consumer less convenience, not more.

The big box store you have to drive to is placed for the corporation’s convenience, not yours. While it might seem efficient to have a tractor-trailer unload at only one central destination, it is only efficient for the supermarket, not the consumer. The consumers all drive individually to the store, racking up the miles at their own expense, and much more gasoline is used than if the one tractor-trailer delivered the goods to, say, five smaller locations that were actually located closer to where the consumers lived.

That’s not to say it’s a good idea to have a Wal-Mart every 10 feet (though things seem headed that way anyway), but for something basic that everyone needs, such as food, it seems only logical to make it more immediately accessible. Assuming that one big box grocery serves 10,000 people who drive an average of 5 miles to get there, wouldn’t it make more sense to have five grocery stores serving about 2,000 people apiece, each of whom drives a maximum of 2 or 3 miles? The same 10,000 people would collectively drive 20,000 to 30,000 fewer miles per grocery trip, and collectively save 800 to 1,200 gallons of gas (using an average of 25 miles per gallon) per trip, each gallon of which would have contributed 19 more pounds of the greenhouse gas carbon dioxide to the air.
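For readers who like to check the math, the savings above work out with a quick back-of-envelope calculation. This is a sketch only; every input is the column’s own assumed figure (10,000 shoppers, 5-mile versus 2-to-3-mile drives, 25 miles per gallon, roughly 19 pounds of carbon dioxide per gallon of gasoline burned), not measured data:

```python
# Back-of-envelope check of the grocery-trip arithmetic above.
# All inputs are the column's assumed figures, not measured data.

PEOPLE = 10_000          # shoppers served by one big box grocery
BIG_BOX_MILES = 5        # assumed average drive to the big box store
NEARBY_MILES = (2, 3)    # assumed drive range to a smaller local store
MPG = 25                 # assumed average fuel economy
CO2_LBS_PER_GALLON = 19  # approximate CO2 from burning one gallon of gas

for near in NEARBY_MILES:
    miles_saved = PEOPLE * (BIG_BOX_MILES - near)
    gallons_saved = miles_saved / MPG
    co2_lbs_avoided = gallons_saved * CO2_LBS_PER_GALLON
    print(f"{near}-mile drive: {miles_saved:,} fewer miles, "
          f"{gallons_saved:,.0f} gallons saved, "
          f"{co2_lbs_avoided:,.0f} lbs of CO2 avoided per trip")
```

Running this reproduces the column’s range: 20,000 to 30,000 fewer miles and 800 to 1,200 gallons per collective trip, or roughly 15,000 to 23,000 pounds of carbon dioxide avoided.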

These are just a couple of small examples of how we, as a society, are not really putting our money where our mouths are when it comes to the environment. “Greener” may sound good, but if it doesn’t address the issues of waste that got us here in the first place, it’s just another way of fooling you into thinking it’s all okay. It’s not really green at all.

Reporter’s rash deed deserves all the applause it’s getting and more

A few weeks ago, a very curious thing happened on MSNBC. One morning, reporter Mika Brzezinski refused, on air, to cover Paris Hilton’s release from jail. Competing for airtime with the ridiculous heiress’s ongoing saga was news that a bill proposing a timetable for withdrawal of troops from Iraq had been introduced on the House and Senate floors.

Brzezinski was so incensed by the story, and by her producer’s (and teleprompter’s) refusal to budge on running Hilton as the lead, that she first attempted to set the script on fire, then tore and shredded the pages, all on live television.

Her explanation for the deed? “I just don’t believe in covering that story, especially not as the lead story in a newscast when you have a day like today.”

Brzezinski deserves a big round of applause, and she’s been getting it.

That the news is not the news, particularly on the national level, is, well, unfortunately, not news anymore. But it’s really good to see that someone finally said something, even if it’s because they snapped. It proves some folks in national media actually still care.

This story didn’t get a lot of mainstream coverage, and the explanation on the show “Morning Joe” was fairly short; the incident was only brought back up after thousands upon thousands of emails made it clear that viewers were cheering Brzezinski’s act.

The incident also quickly made its way to YouTube, where it was popular and well-viewed, generating even more support for Brzezinski.

That the news on TV is not news, but “infotainment,” as I mentioned earlier, is nothing all that new. Television newscaster Edward R. Murrow prophetically warned of this problem more than 50 years ago, and we’ve been moving at an ever-increasing pace towards nearly no “hard” news. In fact, the problem is so bad I’m personally often reminded of scenes from Ray Bradbury’s “Fahrenheit 451” when I watch network news.

Constantly, when the powers that be deign to answer the question “Why is there no more ‘serious’ news?” we get the answer, “Because that doesn’t sell. ‘Infotainment’ does. It is what the public wants.”

This is complete and utter bunk. If nothing else, the droves of emails and intelligently written commentaries on the incident one will find on the Web belie this idea. And, I can personally vouch, as I cover community news, my readers tell me they want more, not less, ‘hard’ news complete with all the details we can muster.

The fact is, while the public is sometimes guilty of working from the lowest common denominator, big money media is far more guilty of force feeding this excrement to the public. Many, if not most, people turn on television news, hoping to glean some nuggets of real information of what is going on in the world. They dodge the bullets of “non-news” such as the latest highway car crash and stories of questionable merit, such as the Paris Hilton soap opera, looking in vain for some real news of what is going on in Washington D.C., Iraq, Afghanistan, or even just the world and nation in general.

From what I hear regularly, they are usually disappointed.

Given that the owners of big media also have many other corporate interests (such as petroleum, international industry, and even mundane things such as big box retail) that might be affected by news coverage, either in the form of incoming advertising dollars or in the possibility that a story might tarnish the image of an owner’s other interests and negatively impact the overall bottom line, the corporate masters of big media really have no interest in informing the public. It is much more convenient, and perhaps more profitable, to distract, shock and entertain the public with drivel than to tell the actual news of the day on what is supposedly news programming.

For myself, having “killed my television” some years ago (I do literally own two, but I don’t subscribe to a cable or satellite service.), most of my national and international news comes from radio or the Internet. And, I find, if I really want to know what’s going on, both here and abroad, tuning into news reported from other countries, such as the BBC, is a lot more efficient than trying to decipher what’s really happening from a couple of sound bites squished in between “non-news,” “infotainment,” and commercials.

A few friends have mentioned to me that they do the same thing—if they really want to know what’s going on, they turn to independent or overseas sources.

It struck me the other day how sad this situation is. Not so many years ago, the U.S. was a recognized source around the world for free information. During the Cold War, there were U.S. efforts to broadcast uncensored news to Communist bloc states and other totalitarian areas around the world. What happened? How did it come to be that we, in America, need to go to reporting sources overseas to get a more accurate picture, or any real picture, for that matter, of what is happening in our own country?

But Brzezinski’s act on air does give me some hope. When a national news reporter snaps on live television, possibly jeopardizing her career, because she’s being made to report a story that is not news once too often, we might be getting somewhere. It’s even more heartening that thousands, if not a few million, cheered by flooding the network with emails and posting commentary praising her on the Internet.

Let’s just hope that the network folks begin to listen. However, with comments during the incident from Brzezinski’s male colleagues such as, “Why are you such a journalist?” and calling her a “wench,” unfortunately, I have little hope that I will be enjoying—or trusting—what is being reported on national network news anytime soon.

To see Brzezinski’s refusal to lead the morning news with Paris Hilton’s release from jail yourself, visit www.youtube.com/watch?v=6VdNcCcweL0.

Is the art of polite debate truly lost in this country?

I recently received an email from a friend with the subject heading, “So how mad are you at me really????” The thing that really surprised me about this email was that I wasn’t angry with her at all, and it never occurred to me to be mad at her. I wasn’t even annoyed with her. I didn’t even know we had an issue.

Luckily, it turned out in the end that there wasn’t any issue at all. The email was based on an assumption, and the friend is a polite person who wanted to avoid a potential conflict. Both of us serve as board members for a local organization, and we had differing opinions on whether the organization should give a small bit of money to support an upcoming event. It doesn’t matter who was for or against what in this case—we disagreed on the issue. But what still mystified me about this email is why my friend would think this would be an issue to possibly put a rift in our relationship. She wasn’t even at the meeting (being unable to attend, she sent a statement that contained her thoughts on the matter), and we never spoke about the matter, let alone argued about it.

While I very much appreciate the email and the offer of letting me yell at her, and the fact that she didn’t have the urge (or at least didn’t tell me she had the urge) to yell at me, there just didn’t seem to be a cause for conflict at all. In any case, I never even saw the potential. It never occurred to me this could become a yelling issue at all.

In dealing with an ongoing family emergency, I’ve spent a bunch of time in medical waiting rooms recently. I cancelled my cable subscription a little more than three years ago, so I don’t see much TV, but having been subjected to the “second-hand smoke” of TV blather in these waiting rooms, I’ve begun to notice a trend.

It started one day when I was forced to watch (or at least listen to) “The View.” About a month and a half ago, there was a show in which the panel supposedly attempted to debate some of the issues surrounding the war in Iraq. The reason I say “supposedly” is because it was clear to me that the show was heavily scripted, even though this was supposed to be an impromptu debate on the issues.

The issue was brought up by one of the middle-aged panelists, and heavily “debated” by the younger panelists, who ended up practically screaming at each other. Every so often, the matriarchal type in the bunch would attempt to calm things down in an “intellectual,” grandmotherly fashion, which basically came across as, “There, there, children, don’t bicker. Play nicely.” By the time the segment was over, the panelists verbally came to the conclusion that the issue was “too emotionally charged to possibly debate civilly and intelligently.” Oh, and one young panelist brightly proclaimed that the entire panel was at a disadvantage, since they were all women and had a tendency to get more emotional than men about these issues anyhow.

Also, since the decisions are out of our hands, we shouldn’t get too worked up about it.

The next segment in the show was about how to properly choose colors for your living room, if I remember correctly.

If it weren’t offensive enough that the fruits of decades of fighting for gender equality were being eviscerated on a show that supposedly depicts modern, liberated women, the idea that an issue facing this country, the country that first proclaimed that government should be by and for the people, is too emotional for the common people to intelligently think about and debate is even more offensive.

But the more I was disgusted by what I saw, the more I thought about it. When was the last time you saw a good example of intelligent debate?

You certainly won’t find it come election season. A “debate” on national television these days seems nothing more than a series of speeches carefully crafted by professional speechwriters, designed to be at their best when sound-bitten. Carefully gathered supporters comprise the live audiences, in most cases, to provide applause for their guy on cue.

Gone seem to be the days when intelligent people gathered to discuss the issues of the day. Believe it or not, once upon a time (before television, of course), people used to gather for political debates, as well as other forms of debate. They even considered it entertaining.

But today, it seems, if two or more parties don’t see exactly eye-to-eye, that’s a conflict, and debate is unlikely. Mediation, perhaps, if the matter is big enough to become a legal issue, is possible, but talking without agreeing is apparently now taboo.

I have to wonder if this is political correctness gone too far.

Will it come to the point where anyone who says anything that offends anyone, anywhere, or has the potential to, will be considered horribly gauche? And who is to say what is offensive?

The fact is, that in the case of any disagreement or potential disagreement, communication is essential to resolution. And if we find ourselves getting so touchy that we can’t talk about issues, they WILL become actual conflicts.

I definitely appreciate that email. My friend was actually more astute than I to see the potential for conflict in today’s climate.

But I find myself wishing it weren’t like that and thinking that while humans have always been irrational, touchy creatures, in some ways, we may have been a bit more politically correct when people could still debate, occasionally disagree, be accepted for who they are and not worry about having to apologize for their beliefs—and then go out for coffee or a beer afterwards, without having to make sure their friends are still their friends.

We are all Easton’s heritage

I recently received a press release that cited the usual deplorable statistics about how little Americans really know about their country. Given that this is the season of all things patriotic, with Independence Day just past, this is not all that surprising.

According to the Intercollegiate Studies Institute, only 48 percent of college seniors surveyed correctly identified the Declaration of Independence as the source of the phrase, “We hold these truths to be self-evident, that all men are created equal.” The release goes on to state that 42 percent of those surveyed thought the source of the phrase was from the preamble of the Constitution. And, that more than 400 students (it does not reveal how many were actually surveyed) apparently identified the phrase’s source as Marx and Engels’ The Communist Manifesto.

So, apparently, if one adds up the numbers, only 90 percent of those surveyed (supposedly America’s best and brightest—they’re college students, right?) correctly identified the phrase as belonging to one of the United States’ key documents, and only slightly more than half of those knew which of those documents it belongs to.

These are sobering thoughts, surely, but this is not a column about the ills of American society. There are plenty of those already, and I’m fairly certain that on the day I decide to return to that topic, there will still be plenty to discuss.

I’m not certain how many of the hundreds who attended Easton’s Heritage Day reading of the Declaration of Independence really knew it well before they arrived. I suspect, from what I observed during the reading, that quite a few not only respect that document but do actually understand and know its contents.

But I’m sure it was far from everyone. If you surveyed the crowd before the event, I’m not sure how well they’d have fared.

Would the results be so different from the ISI’s findings? Probably not, though I think few Eastonians would say “that all men are created equal” originated in Communist Russia.

But whether they can recite the words or not, I know America, the REAL America, is alive and well in Easton.

How do I know this? I saw it myself.

When reenactor Adam Howard got to the parts about tyrants and taxes, the crowd booed both. Unabashedly. Unashamedly.

And not just in fun, due to the spirit of the moment either. It was for real.

The last couple of years, political protesters have quietly and peacefully voiced their opinions with large signs during the event. They too, as far as I am concerned, are a symptom that the democratic principles this nation was founded on are also alive and well in Easton.

When freedom, liberty and the pursuit of happiness were mentioned, the crowd cheered its approval.

In what now seems like another lifetime, I used to be a reenactor. In fact, I participated in Heritage Day numerous times, in period garb, for half a decade before I moved to Easton. Having been a member of “the hobby” (reenacting) for almost two decades, I’ve seen the Declaration read a countless number of times, both in Easton and elsewhere.

I’ve seen it done well, as well as poorly. (For the record, I’ve enjoyed Howard’s interpretation on all the occasions I’ve seen him.) But the one thing that is unique in Easton is the crowd.

In most places, at most events, though polite, the crowd looks bored. One finds oneself often wondering why they attended.

Perhaps it was supposed to be educational for their children, who upon taking the parental cue, also look bored. Some people inevitably drift off, either mentally, or even physically wander away before the presentation is through.

But Easton is different.

This past Sunday, I saw children watching attentively, nodding, and actually understanding the real meaning and importance of the Declaration of Independence as it was brought to life before them. I watched adults, equally rapt, as their faces showed the new meaning the reading gave their country’s founding document, in some cases grasping its true import for the first time.

Perhaps it is that it actually happened here, in Easton, that there is some remnant trace of memory, much more so than is passed down to us in the dusty, tired old history books that most of us have come to equate with the boredom of banal, required high school history classes, that makes the Declaration’s reading so much more meaningful here than elsewhere.

But I think it is something else too.

A few years ago, the Heritage Day Committee chose for its t-shirt slogan the phrase, “You are Easton’s heritage.”

It is my favorite slogan for Heritage Day, and my favorite t-shirt too. (I even bought a backup one this year, just in case.) The reason I like it is that it is so true.

A place is only as good as its people, and it is the people that make the history of a place. People and their achievements, both large and small, are what put a town on the map. Even if a place is renowned for something else, say its architecture, someone had to put it there. Hence, it is the people that are the impetus that make a place great or important.

Before and during the Revolutionary War, and for a long time after that, a century and a half, in fact, Easton was a place of importance and a center of commerce. Easton’s history of success is much longer than its current history of urban dilemmas.

But dilemmas we do have, and solve them we must.

Looking around on Sunday, I saw a few hundred Eastonians, all of who play their roles in what is Easton’s heritage to come, whether they realize it or not. But seeing the reaction and attentiveness at that reading, I’m not too worried about the future of Easton, not as long as we continue to boo tyrants and unfair taxes.

We, all of us, are Easton’s heritage, and Easton’s future. Let’s make sure we never forget it.

American ideal is still very real

This past week, as Americans celebrated Independence Day, and as Easton prepares to celebrate its own unique part in the birth of the United States through Heritage Day, I got to thinking a bit about what it’s all about.

No, this will not be yet another tedious, politically correct column spewing patriotic platitudes, though I love my country well. (I’ll even skip the jokes about fearing my government, at least for now.) But based on the current world situation, and the perception of the U.S. and Americans around the world right now, I did get to thinking about the dichotomy between what the U.S. actually is and what average Americans perceive to be “American values.”

When it comes to Independence Day, and the birth of this nation in general, I consider myself to be somewhat more knowledgeable than the average person, since I have been involved in the reenactment of various events pertaining to the Revolutionary War for almost two decades now. My current editorship of The Easton News has curtailed my participation to near nothing in the past year and a half, but if you look closely at Heritage Day photos from the past ten years, you may just find my antique alter ego peering back at you, holding either a distaff or a linstock, demonstrating two very different aspects of colonial life in the local area.

But it doesn’t take any special study or historical expertise to realize that the America we live in today does not really resemble the America our forefathers lived in, or even exactly what they envisioned.

One can debate where, or whether, things have gone wrong in the intervening 200-plus years since the Declaration of Independence was written, followed more than a decade later by the adoption of our Constitution and Bill of Rights, the three cornerstone documents that form the basis of the idea of the American way.

It’s plain to see that we, as a nation, have not always lived up to the ideals and laws we set for ourselves. It is immensely arguable that our government, particularly in its current incarnation, has strayed from these ideals, to the detriment of both our economy and our reputation in the world.

But there can be an immense difference between a government and its people, as another lesson of history has told us time and again. Indeed, I believe there is an ever-growing gap between what the American people perceive to be the right course of action in a given situation and what seems to be happening in our government—perhaps as great as the gap that is growing between rich and poor in this country.

All is not lost, however. Though it may seem hopeless, consider that this country was founded on an outrageous idea, one that no one 200 years ago thought could ever last. But last it has, and even thrived.

Recent scientific evidence confirms that information, the very stuff ideas are made of, does indeed have a physical component, though it is not yet entirely understood. That means that ideas are real, in a very physical sense.
It has always interested me that the first thing tyrants attack is ideas. One would think that once physical dominance is achieved, dictator types would feel secure, that ideas wouldn’t matter, only might.

But I think that those tyrants may have been ahead of contemporary science. They know that once an idea takes hold, it is very, very real in a physical sense.

That thought is comforting to me. See, even if our government has seemingly been usurped by folks that seem more likely to use the Declaration of Independence and our Constitution and Bill of Rights as toilet paper than to actually read the documents they’ve sworn to preserve, those ideals are embedded in the hearts and minds of nearly every American citizen.

See, if the idea of what the U.S. is and can be is intrinsically real, then the reality that idea represents is not so far away after all.

In other words, government by the people and for the people is the logical sum of the equation, not a privilege that is bestowed by some lofty power. Each and every one of us knows what America is really supposed to be about, even if those in the Oval Office seem clueless.

And that means that no matter what happens, our country’s best ideals will survive. We just have to make sure that we, as a nation, live up to them.

Wednesday, June 27, 2007

Laptop offers technology’s magic only dreamed of in yesteryears

Let me start by saying I am a Luddite at heart. I like my life simple and quiet, though it never really seems to work out that way.

Years ago, I avoided getting a computer for as long as possible, bowing to the dictates of modern convention only when it became apparent that my friends and neighbors would no longer respond to me just shouting over the fence, or leaving voice mail messages. I’m not technophobic; in fact, it’s quite the opposite. But I really, really didn’t want to be bothered with one more thing in my life that would need attention.

This, however, happened back in what is now the “dark ages,” when voice mail was still called leaving a message on someone’s answering machine. (Remember those quaint devices? You bought it once and never paid for the service again until the machine broke, unlike today where for about $7 a month anyone with an access code, and probably Big Brother too, can listen to your messages any time they want, which are no doubt archived for the government’s convenience as well. But I forgot, the concept of real privacy has also become quaint these days.)

So, what is now more than a decade ago, I broke down and bought a computer.

And it really wasn’t as bad as I thought it would be. The Internet, with all the instant access to information it offers, really did make up for the fact that I was beholden to yet another electronic device.

But as I said, I’m a bit of a Luddite, and I really don’t subscribe to the idea that a device that costs hundreds of dollars should become obsolete trash in just a year or two. So I was determined to make that computer last. This was what is now about 13 years ago.

And last it did, until a little over a month ago, when with no more than a whine followed by an ominous clacking noise, it likely breathed its last. No, it was not exactly the same old beast for the entire 13 years; by now it’s a bit more like Frankenstein’s monster, since the beast has three hard drives totaling a grand sum of 12 gigabytes of storage. But that said, reviving this particular beast, other than to retrieve 12-plus years of personal research, would rightly be considered some form of electronic cruelty, I think. (Though whether it would be more cruel to the beast or to me is not entirely clear.)
I know there’s a whole bunch of computer nerds out there reading this thinking, “Wow, what a relic...” I know they’re laughing. But I’d also bet the vast majority of them are under 30.

Perhaps it is a product of growing up in a world where nothing seems to stay the same for more than about 15 seconds, but once I find a good tool, I want to master it and use it comfortably. I don’t see carpenters buying really expensive newly designed hammers every fifteen minutes, and I just don’t see why I should have to buy myself a “new hammer” that often either.

That said, not having a “hammer” at all is a problem. For the last bunch of weeks, everything for the newspaper has had to be written in my office out on the far side of the airport, meaning I’ve more or less been chained to my desk in between trips to my mom’s house in New Jersey. It hasn’t been fun.

This weekend I decided to put an end to the slavery to my office computer. I’d been thinking about getting a laptop computer for a while now, but I’d also been putting it off. The problem with refusing to buy a new hammer is that one loses track of what new hammers do and how they do it. So I’d been doing a little research, but I wasn’t sure I was ready to take the plunge.

But standing on the edge of that diving board, with the prod of being chained to my desk every “spare” second for even another minute, let alone another week, was great incentive.

So I headed to the Allentown fairgrounds this past weekend for one of those computer shows they hold.

It was smaller than I expected, which is a good thing, because I also misjudged what time the show closed by an hour. In the end, I had just enough time to review the goods and sellers reasonably thoroughly. Five hundred dollars later, I’ve got a laptop “hammer” that is about eight times as powerful as my old beast, is portable, reads and writes DVDs in addition to CDs and has high-speed wireless Internet capability built in—and it runs on a battery for two hours or more before it needs recharging.

None of this technology is really new any more. And again, I’m sure the under-30 crowd is laughing.

But as I write this column on my new laptop, it strikes me how many times, as a kid reading sci-fi or watching Star Trek, I heard this technology mentioned and dreamed of it becoming a reality.

Cell phones are the same way. Less than half a lifetime ago, people talked about how cool it would be to have a “communicator” from Star Trek. Little kids “played Star Trek” and emulated other sci-fi stories the way they played cowboys and Indians in past decades.

I very specifically remember reading the book version of “The Wrath of Khan” the summer it came out, when I was a kid. The crew is on a short transport out to the Enterprise, and there’s a description of Uhura doing some programming on a portable mini computer—in other words, a laptop. At the time, I thought that was the coolest thing, at least as cool as communicators.

I wanted one.

I don’t think many of us expected to actually own these devices in our lifetimes. They were the toys of the future, of a place and time that didn’t exist, the stuff of sci-fi fantasy, in a world where telephones were still dialed manually, and one got out of one’s chair to change the channel on the television. But today, they are reality.

We don’t always realize how technology has given us things that just a decade or two or three ago would have practically been considered magic. And certainly technology has presented as many problems as it solves—it is not the magic panacea it was often made out to be. But as I write this on my new laptop, it does strike me—whatever magic we can conceive of may indeed come to pass, if we can just wait long enough.

Thinking is not hard work

It’s that time of year. Here in the Lehigh Valley, and everywhere in America, countless thousands of young people have graduated, or soon will, from high school or college.

Many, if not most of them will breathe a sigh of relief. Exams are over, and there is no more studying to be done.

Or is there?

I’m not sure when exactly it happens, but by the time most of us have made it through school, we’ve picked up the idea that thinking is work—hard work—and is therefore something to be avoided.

Where this starts is not difficult to figure out. How many times do teachers, especially grammar school ones, however well intentioned, tell their students to “think hard”?

Study is something we have to work hard at, we are told, over and over, and the message is repeated when we’re told to “work hard and get good grades.”

It is true enough that assigned scholarly tasks, ones we may not have chosen ourselves and may have little interest in, do indeed have the flavor of “work”—that is to say, learning something one does not want to, but does need to, is usually a bit onerous. And it can be frustrating. After all, it seems effortless to learn skills we’re eager to acquire.

But so many of the things we must gain knowledge about, either to get along in the world, or to get those good marks that are the keys to a “good future” are not on the list of what is utterly fascinating to us personally.

After 12 or more years of the “thinking is hard work” message, it seems that for most people it sinks in, sadly.

The message that thinking is hard work to be avoided is echoed throughout our adult lives by many, many media messages, particularly advertising.

If you don’t believe me, think (gently, don’t “work” at it) about it. Can you name any product that advertises itself as being difficult to use, or actually requiring thought?

Most products, particularly “new and improved” ones, carry the advertising message that says, “It’s so easy, you don’t even have to think about it.” When dishwasher detergent in ‘pods’ was first introduced, I even saw an ad for this inane product that said as much. Quite literally, the message from the ditzy blonde hired to push this stupidity was: “This is so easy, I don’t even have to think about whether I’ve put in the right amount.”

Wait a minute!!! Has thinking become such a chore that determining the right amount of dishwashing detergent to put in the machine is difficult? Does throwing in a “pod” really improve the quality of most dishwasher owners’ lives that much, saving them from the dreaded prospect of actually thinking about what one is doing?

If so, in my humble opinion, that is downright pathetic.

Once upon a time, each of us was young, very young, and the world was a new, fascinating place that we discovered something about every day. While there were definitely frustrations, often leading to tantrums and tears, there were also moments of incredible insight, those “AHA!” moments. Maybe you even remember one of those moments.

Unfortunately, it is rare that the usual school subjects lead to that kind of satisfaction (except for the class valedictorian), but that doesn’t mean real satisfaction is unattainable. The things that really interest us, if pursued, can and do provide “AHA!” moments and lasting contentment, things those who eschew deep thought will never know.

“Easy” may not be so easy in the end. “Easy” never leads to insight and often leads to boredom and discontentment. “Easy” does not build skills; it makes one more dependent on the purveyor of whatever it is, usually at a more exorbitant price, that has been “improved” to be so easy you don’t even have to think about it. And if one gets too into “easy,” one may find oneself so out of the habit of thought that thinking itself starts to feel like hard work.

Most things in life that are worth having are not easy things—they require work. But when it comes to thinking—that is, thinking things through, thoroughly—if the subject matter is interesting, it doesn’t really feel that way unless we’ve decided that any and all thought is, indeed, work.

So whether you graduated last week, or last decade, it might just be time to shed the idea that thinking is hard work, and instead remember how exhilarating it was when we were small, before the message of “this isn’t supposed to be fun” sunk in.

After all, you’re out of school now, and no one is telling you what to study. But everyone should study something; just make sure it’s something you like, and you’ll be an expert in that field before you know it. Not to mention smarter than nearly everyone else, because you’ll know once again that thinking is not hard work, not when you’re thinking for yourself.
It’s exhilarating.

Tuesday, June 5, 2007

Artificially inflated gasoline prices are inappropriate during holiday

Due to an ongoing family emergency, I’ve been driving a lot lately, and I can’t help but notice the price of gasoline.

This past weekend I had occasion to both make several trips to central New Jersey and also attend two Memorial Day events.

On the surface, it would seem these two things bear little connection with one another. But I noticed something that has been bothering me since this weekend, something I’ve not seen the pundits take note of.

This weekend, as it does every weekend, and especially on a holiday weekend, gasoline prices rose.

While this is not entirely unexpected, it struck me differently this year as I watched veterans and their families, along with “civilian” members of the public, honor the sacrifices of those who have served in our armed services. Many of those sacrifices have been of life, as well as limb.

It may seem a lifetime ago, but just a few short years ago, I very clearly remember seeing war protesters with signs that said, “No blood for oil.” The slogan also became a bumper sticker.

We were assured by the current president and his administration that the conflict in Iraq is not about oil. I have to say, in the face of current gas prices, coupled with some very suspicious fluctuations (always on a holiday or a weekend, when many people have a need or a tendency to travel), I actually agree. This war was not about oil—it was about profits.

Not too long ago, I read a national news article that estimated our current vice president, Dick Cheney, makes about $22,000 per day from oil stock dividends and related income. This figure did not include other war profits one would be reasonably certain would be forthcoming (not to mention increasing) from being a stockholder in companies such as Halliburton that profit from the sale of armaments and military equipment.

Such a figure would arguably make one wonder how much our president, George W. Bush, would profit from increased oil prices. After all, that is the empire that made his family rich in the first place—along with holdings in military contracting companies. But, alas, that figure was not to be had. The very well researched and fact-checked article I read could not come up with an accurate figure. The reason given was that Bush’s holdings are so intertwined with the rest of the Bush family holdings that the author felt it was impossible to be able to estimate the amount—but that it was likely several times the amount Cheney receives in war profits.

The more I think about this, the more disgusted I become.

As companies like Texaco, Exxon and Chevron unabashedly inflate prices and proudly post record profits, our U.S. service men and women put it all on the line daily—and they are dying daily.

So, this weekend, as gas prices rose, so did my temper. It is a poor excuse to blame the war—after all, it is the same oil, the same refineries and the same pipelines that carry that petroleum to us. The only thing that’s changed is the price—and someone, very likely our elected officials and their oil company PAC cronies, is laughing all the way to the bank, all the while explaining to the American public, with their bare faces hanging out, that this is the consequence of war, a war they refuse to even consider ending.

Okay, so there’s a lot of money at stake, you say. But how many people would you kill for a few million, or even billion, dollars? Is there some minimum amount of money that makes getting people killed so one can get rich (or richer) okay? Isn’t this the very definition of evil? And assuming one is actually depraved enough to buy into that sort of thinking, that this is somehow okay, when does it end?

But either way, during a time reserved to reflect on the sacrifices of American service people, literally our fathers, mothers, brothers and sisters, who have died and will continue to die until this current war is ended—Memorial Day weekend—it is especially pernicious to jack up gas prices just to make an extra buck.

It’s enough to make one wish for instant karma.

Tuesday, May 29, 2007

Whatever happened to all those silly summer pop tunes?

It’s that time of year. You know the moment when it hits. The sun is shining and the breeze is warm. You’re in the car, the windows are down, and that song, one you haven’t heard in forever, comes on the radio. And, all of a sudden, life is very good.

But what ever happened to those silly pop tunes? Granted, at the time, an awful lot of them seemed really annoying. There would always be some tune that got played incessantly, and it would usually be the most inane thing out there.

But it struck me recently that there really are no more new happy, mindless tunes to hum along to on the radio. Unless you actually like the very narrow list of old hits that get played endlessly on the canned, preprogrammed digital loops that pass for radio these days, you are out of luck.

I found myself pondering the death of good pop over a microbrew at one of my favorite Easton establishments recently. The bar’s house CD player, which holds about 100 of the owner’s picks, shuffled to a song I haven’t heard since I was in grade school—“Mr. Blue Sky” by the Electric Light Orchestra.

It had been so long since I’d heard it, I wasn’t sure I was really hearing it, and I wasn’t even sure at first it really was ELO, despite the distinctive sound.

But the experience hit a nerve, and I went out and bought a “best of” CD. While I was at it, I stumbled over Yes’ 90125 album, which was popular when I was in high school.

Okay, so I’m dating myself here. Both bands and albums are ancient history, right?

Yup, it’s true. But I’ve noticed something interesting while I’ve been basking in the complex vocal harmonies and sounds of analog synthesizers of yesteryear.

This music is actually designed to put one in a good mood. And it succeeds admirably, which is why it was so popular in the first place, even if it has been buried by Clear Channel today.

This is a concept that is missing from most of the currently popular stuff I hear on TV and the radio. While some of the best of today’s music is designed to make one think, TV and radio don’t generally play what I consider to be the best, and the message is more than just generally negative—it’s downright toxic.

I started to think even more about it, and I tried to remember the last popular, relatively new tune I’ve heard on the radio that specifically makes me feel good.

The most recent I came up with was “Give a Little Bit.” But interestingly enough, it’s a remake of a “silly pop tune” written by Supertramp in the 1970s.

So I thought a little bit more. The closest thing I could come up with was a few Green Day tunes, but much as I like them, they really don’t qualify because they’re cynical, even if the melodies are relatively happy.

I still can’t pinpoint exactly when I last heard a new, purely “feel good” pop song, but I think it may go back to the beginning of the war in Iraq, if not even before that. But really, can you think of any new, original hits with a positive tone that have come out since 9/11?

They may have been on their way out for even longer than that. Silly pop tunes are by nature happy, bubbly creatures, but the world is pretty far from that mentality right now.

However, it may be just what we need. I’ve been playing that ELO CD in my car a lot lately, and it’s been pretty nice out. You know, the weather’s fine, the breeze is warm and the windows are down most of the time. And, as I drive by, I have noticed that the music makes just about everyone smile, just like it was written to, three decades ago.

“Sun is shining in the sky
There ain't a cloud in sight
it's stopped raining
Everybody's in the play
And don't you know it's a beautiful new day, Hey-hey
Running down the avenue
See how the sun shines brightly
In the city
On the streets where once was pity
Mr. Blue Sky is living here today, Hey-hey

Mr. Blue Sky,
please tell us why,
you had to hide away for so long,
Where did we go wrong?”
—“Mr. Blue Sky,” by Jeff Lynne of the Electric Light Orchestra

Monday, May 21, 2007

What really happened to electric cars?

“It’s like waking up every morning with a full tank of gas—except that it’s not gas.”—an EV1 driver

“What the detractors and the critics of electric vehicles have been saying for years is true. The electric vehicle is not for everybody. Given the limited range, it can only meet the needs of 90 percent of the population.”—EV1 driver Ed Begley Jr.

You may have heard a little about the concept of electric cars. You may have also heard that they are not practical, that there were problems with batteries being able to store enough energy.

That may have once been true, but viable electric cars have been around since cars were first introduced. In fact, about a hundred years ago, there were more electric cars on American roads than gasoline-powered ones.

Modern electric cars were introduced in California in 1996. The state had introduced the Zero Emission Vehicle mandate in 1990. Since automakers had already been developing electric car technology and California had been working towards a charging infrastructure for electric cars, the idea initially seemed to work well.

GM first introduced the EV1, an entirely electrically powered vehicle, in 1996, leasing it through its Saturn dealerships. While in just a few months thousands of people signed up on a waiting list to lease the vehicles—they were never offered for outright sale—in the end only about 800 EV1s were actually made, and the competition for those vehicles was high. Mel Gibson, who eventually was able to lease one of the vehicles, had to fill out a detailed questionnaire to prove he was a worthy candidate. Sales representatives were required to get permission to lease the vehicles they were supposedly trying to “sell.”

When first introduced, the EV1 held a charge that was good for about 75 miles. But interestingly enough, GM didn’t use the best technology available, even though it was reasonably affordable. The first EV1s used standard lead-acid batteries manufactured by AC Delco, a GM subsidiary, even though GM had purchased superior nickel-metal-hydride technology for the project, which would allow a charge to take the car about 125 miles. This technology was not utilized until later, and when it looked like it might actually succeed, it was sold to Chevron Texaco.

In 2001, California was sued by car manufacturers over its zero-emission law, and the federal government, led by our current president, George W. Bush, joined the suit against California. Interestingly enough, the White House Chief of Staff at the time was Andrew Card, a former vice president of GM and president and CEO of the American Automobile Manufacturers Association.

Meanwhile, supposed grass-roots organizations that were lobbying against an electrical charging station infrastructure in California were exposed as being funded by the oil industry.

Eventually, in April of 2003, while war raged in Iraq, California killed its zero emissions mandate.
GM and other manufacturers of electric vehicles immediately took action—they began recalling leased electric vehicles.

Nearly every electric car was available only for lease, not purchase, and lessees were not allowed to buy or re-lease their vehicles, despite the fact that many tried. Complaints from customers fell on deaf ears.

The fleet of EV1s was collected by GM and taken off the road. Research and photos gathered by electric car enthusiasts and activists indicate that, nearly without exception, the EV1 fleet was systematically crushed, shredded and dumped in a landfill. Similar electric vehicles, including the Ford Think, Toyota’s RAV4 EV and Honda’s electric vehicles, were similarly gathered and destroyed.

In a number of California cities, protests were held, and activists and EV drivers even staged a funeral. Even local officials attended, as many were EV drivers and supporters, including Los Angeles City Council President Eric Garcetti and California State Senator Alan S. Leventhal, both of whom were EV1 drivers and enthusiasts.

None of this is ancient history—the calculated fall of electric vehicles has taken place in just the last few years, all during the war in Iraq. The destroyed electric cars were still only a few years old, and they were mechanically sound. People wanted to drive them, and in fact, EV1 drivers and activists offered GM $1.9 million to buy out the leases on the last remaining 78 vehicles. GM never responded and destroyed the cars.

So what was the real motivation behind the destruction of the EVs? There are a lot of answers and nearly all of them lead back to big oil.

But I think the problem is a little larger, if you can believe that. See, you generally charge up an EV at home—you just plug it in and it charges overnight, at an equivalent cost of about 60 cents a “gallon.” Enterprising homeowners in appropriate geographical locations could even use solar or wind power to supplement their charging. This “autonomy,” not being dependent on big oil or big corporations, combined with the possibility of using an actual renewable power source, is what really irks big oil and car manufacturers.

Also, electric vehicles do not depend on the internal combustion engine, a device that requires a lot of costly parts and maintenance, which means further “lost” profits for those companies. Electric cars don’t require oil changes, catalytic converters, or even an exhaust system, since they produce no exhaust. They even require less brake maintenance, since the car slows itself through regenerative braking when one eases off the accelerator.

Each gallon of gasoline burned produces 19 pounds of carbon dioxide. Existing electrical plants supply the power for EVs, and even coal-fired plants are more efficient than current gasoline-powered vehicles, especially since pollution created by power plants can be controlled and collected at the source, as opposed to being released everywhere one’s car goes.

In his 2003 State of the Union address, Bush proudly said that his administration was working towards alternative fuel technologies, particularly hydrogen fuel cell technology. But there are still major problems with this concept, and a working fleet of vehicles is still well over a decade away. In addition, hydrogen vehicles will never allow the consumer to fuel up at home—a large infrastructure of at least 20,000 to 30,000 hydrogen filling stations will be needed before these vehicles will be viable.

Meanwhile, oil companies continue to profit. Gasoline prices in the Lehigh Valley hover around $3 per gallon. In 2003, oil company profits were about $33 billion. In 2004, profits were approximately $47 billion, and in 2005, they were $64 billion.

Clearly, some people have monetary incentive to keep the status quo, no matter what the cost to the rest of us, or the planet we live on and call home.

Clean, quiet, fast (some EVs have done 0 to 60 miles per hour in under 4 seconds), non-polluting technology that is not dependent on foreign oil exists—and it was literally shredded less than five years ago. While an unjust war based on lies over oil rages in Iraq, or anywhere else for that matter, I cannot think of a less wise move, nor a less moral one.

To find out more about the rise and fall of electric vehicles, see the documentary, “Who Killed the Electric Car?” or go to www.whokilledtheelectriccarmovie.com

Christina Georgiou is the editor of The Easton News. She is currently on extended leave due to a family emergency, but can be reached at cgeorgiou@lehighvalleynewsgroup.com.

Wednesday, April 25, 2007

God’s self-appointed sales crew and other nuisances of modern living

So I’m at home at my desk writing one Monday afternoon recently, and the doorbell rings. Not once, but a whole bunch of times in a row, hard, like it’s really urgent—a delivery guy who has no time to wait, or an actual emergency.

I’m not expecting anyone, and it’s never a good idea to just buzz someone into the building, so I grab shoes and keys and bolt down two flights of stairs to see who it is, only to find no one there.

I step outside and look up and down the block. Maybe it was the postal carrier and something needs to be signed for. I see no one.

Just as I’m giving up and going back in, a woman, accompanied by two men, comes out of the hair salon downstairs from me, and says, “Oh, there you are. I just rang your doorbell.”

“Yes, I heard you. That’s why I tried to answer the door,” I say. “Can I help you?”

“I just wanted to tell you about the Bible,” she says, smiling, taking a step forward.

The look on my face stopped her cold.

“You dragged me down two flights of stairs in the middle of my work day to tell me about the Bible? You’re kidding, right? I’m a Greek news editor—I promise, we’ve heard about the New Testament,” I tell her.

She takes not one, but two steps back.

“And by the way,” I continue, “if God is up in heaven, then I was definitely closer to Him a few minutes ago, when I was still working peacefully in my apartment on the third floor.”

I turned and headed back inside, but I’d swear I saw all three raincoated (why do door-to-door missionaries always seem to wear suits and raincoats? It may be mean to say, but I always wonder if it’s because they want to be prepared in case of an unexpected rain of frogs...) figures actually running away before I firmly slammed the door.
_______

It’s approaching dinnertime on a weeknight, and the phone rings.

“Hello,” I answer, in the customary, time-honored fashion.

“Hi, this is Danny from TruGreen Chemlawn. How are you this evening?” says an eager voice on the phone.

“Lawnless,” I answer, deadpan.

This is apparently a new one on Danny. He loses his rhythm. “What?” Danny stammers.

“Lawnless. I live in a third-story city apartment, and I don’t have a lawn,” I explain.

This is apparently exactly the right thing to say, because he tells me he is very sorry for bothering me and hangs up before I can utter another syllable.

I bet I never hear from Danny again.

Yay! I win.
_________

Again, the phone rings. (I am beginning to develop an unconscious habit of flinching when the phone rings at certain times of day...)

“Hi, this is Ed from Soandso Marketing in Somewhere, Virginia. I’m looking for William.”

Yes! They’re a marketing company, but they’re not looking for me. I can tell them they have the wrong number and be rid of them.

“Sorry, you have the wrong number,” I say. “There’s no William here and never has been.”

“Well, is this 610-555-9999?” he asks.

“Yes,” I say, repeating, “And there’s no one named William here, and there never has been.”

“Well, that’s okay, because we really just wanted the number,” he tells me.

Internally, I groan. Great. Telemarketers have somehow figured out yet another way to bug the ever-living tar out of us all by calling and pretending to look for someone who doesn’t exist.

“What for? And what are you selling?” I ask warily.

“We wanted to inform you that you have been entered in our dream prize sweepstakes.”

“Great,” I reply. “And just how would you know what my dreams are? More to the point, what are you selling?” (I’m amazed at how many times one has to repeat oneself with these folks. Maybe because they are forced to repetitively say the same thing, they don’t hear incoming data until it’s been repeated several times?)

“We have a bunch of prizes. One of them must fulfill one of your dreams,” he tells me.

I think this is rather presumptuous and tell him so.

He tells me my dreams must be rather special, why don’t I tell him about them?

EWWW!! This is getting far too touchy-feely for me. A telemarketing stranger calls up and wants to know my dreams? Why do I even answer the phone?

I tell him he’s dreaming if he thinks I’m going to tell him my dreams and could he get to the point already please, before I’m forced by sanity’s dictates to hang up the phone and end the conversation?

Get this—he tells me that’s it. I’ve been informed I’ve been entered.

I’m not sure that’s the case, since I didn’t give any real information about me other than that I am not William.

In what contest, for what supposed prize, I still don’t know. And I don’t care.

Because the day I really win, I will know it. It will be a day when no one contacts me needlessly, trying to parasitically waste my time or to sell me something I don’t want or need.

Monday, April 23, 2007

Goodling scandal may indicate a deeper problem

In these United States, we tolerate many ideas—nearly all ideas, in fact. We trust that everyone has an equal say by way of their vote, and that our elected officials and those they appoint are a fair representation of the diverse viewpoints that make up our society.

But what would happen if one group of people, possibly a large group but still a minority, decided that things were not going to their liking and that they had been instructed by God to rectify the situation? What if they decided that they were specifically entitled to rule over others because of their morally superior position? What if they decided to make a concerted effort to get their people into government, even if it meant a couple of decades of work, and then perhaps to violate the very principles they espouse to further that cause? What do you think about an America run on the principles of Pat Robertson?

These are not entirely hypothetical questions, though few people have been asking them—at least not very publicly, and not as many as should be asking.

In the news over the last few weeks has been the question of firings in the U.S. Attorney General’s office.

Folks who have been watching trends in government have been increasingly alarmed by the droves of longtime experienced staffers that are resigning, being forced to resign or retire, or are outright fired or removed from positions in which they have years of expertise. The Bush administration has replaced these people with its own people—people loyal to its neo-conservative cause, but not necessarily (and more often than not, not) the best in their field. Hence, the Katrina fiasco.

A few weeks ago, the name of Monica Goodling surfaced in connection with those firings, when she resigned abruptly without explanation after refusing to testify about her role in the firings of several U.S. attorneys for what appear to be partisan reasons, while asserting her Fifth Amendment privilege against self-incrimination. Goodling, who was senior counsel to Attorney General Alberto Gonzales and Justice Department liaison to the White House, like many new legal appointees in the White House, is a graduate of Regent University, the college founded by evangelist Pat Robertson 29 years ago.

Regent’s Web site boasts that 150 of its graduates have been placed in high-ranking federal government positions since 2001 and are currently serving in this administration and that “approximately one out of every six Regent alumni is employed in some form of government work.” About 5,000 students currently attend Regent University.

That seems an amazing feat for a college that is less than three decades old. Not so long ago, Regent grads had trouble passing their bar exams, though the school has apparently raised its standards since then. Former Attorney General John Ashcroft teaches at Regent, and other graduates have been appointed to positions in the Bush administration.

The school’s motto is “Christian Leadership To Change the World,” and that is precisely what it and its graduates seem to be attempting. Their goal is not only to tear down the wall between church and state in America, but to enmesh the two.

I don’t know about you, but I don’t really want to wake up one day to find I live in some alternate version of “The Handmaid’s Tale,” which is what this is eerily beginning to sound like. There are very good reasons for a secular government. Having specifically Christian standards may sound like a good idea to some, but who gets to decide which Christian standards? Never mind the millions of Jews, Muslims and Hindus, as well as the myriad of other religions practiced by American citizens.

And then there is the question, as in the case of Goodling, of people who appear to start out having the best of intentions, only to have them manipulated for political gain, which seems to be the case every single time in all of history that religion and politics are entwined.

Have folks like Robertson always vied for political power and gain? Yup. But the problem here is that he and folks like him have been quietly gaining ground.

If you actually think Robertson is just a man inspired by God to become a preacher to millions, above political ties or thought of power and personal gain, think again. His biography alone, quoted from Robertson’s own Web site, is evidence against that idea:

“Marion Gordon ‘Pat’ Robertson was born on March 22, 1930, in Lexington, Virginia, to A. Willis Robertson and Gladys Churchill Robertson. His father served for 34 years in the U.S. House of Representatives and Senate. Robertson’s ancestry includes Benjamin Harrison, a signer of the Declaration of Independence and governor of Virginia, and two United States presidents, William Henry Harrison and Benjamin Harrison, the great-grandson of the signer of the Declaration of Independence. Robertson also shares ancestry with Winston Churchill.”

Okay, so Robertson claims a few famous ancestors. So what?

He’s also the founder and chairman of the Christian Broadcasting Network.

Still, despite a few million viewers, so what? What does that have to do with the rest of us?

Well, the problem may just go deeper than a bunch of lawyers in Washington, though that situation is more than a little disturbing. There is a bit of evidence that there’s a whole bunch of folks that have quietly been working to push their own “Christian” agenda, regardless of what the rest of us might think.

Lest you think I’m being entirely paranoid, I offer the following quote from a press release I received recently from a local evangelical group looking to promote the National Day of Prayer.

“The National Day of Prayer Task Force’s mission is to communicate with every individual the need for personal repentance and prayer, mobilizing the Christian community to intercede for America and its leadership in the five centers of power: Church, Education, Family, Government and Media.”

The National Day of Prayer Web site does go on to state that “The National Day of Prayer Task Force was a creation of the National Prayer Committee for the expressed purpose of organizing and promoting prayer observances conforming to a Judeo-Christian system of values. People with other theological and philosophical views are, of course, free to organize and participate in activities that are consistent with their own beliefs.”

(By the way, I didn’t change anything in these quotes—the grammatical error and capitalizations are theirs.)

It’s nice that they put in a disclaimer that we’re still free to organize around our own beliefs, but ultimately the underlying message here is that if you’re not with them, you’re inferior. And that’s a little disturbing, coming from a group that wants to “intercede for America and its leadership in the five centers of power: Church, Education, Family, Government and Media.”

There’s a big difference between taking initiative on fixing a perceived problem, and taking it upon oneself to decide that one’s way is the only right way and that it must be imposed at all costs, even by a stealth coup, if necessary. One way has a chance of success for all, and the other is a delusional psychosis that, if successful, could take us all to hell in a handbasket, literally.

I really do hope I’m just being paranoid.

Friday, April 13, 2007

In the Age of Information, misinformation abounds

It probably wouldn’t be much of a warning to tell you to not believe everything you read on the Internet—after all, everyone knows that, right?

But there are some sites that are trusted more than others. If you Google a topic, you can be reasonably sure you will actually turn up some Web sites that will give you the information you need, even if it is up to you to determine which of those sites are reliable and offer accurate information.

Another site many people have come to trust is Wikipedia, which bills itself as an encyclopedia. While it doesn’t have the reputation of, say, Encyclopedia Britannica, countless people use it every day to find out more about topics of interest.

But the big difference between Encyclopedia Britannica and Wikipedia is the source of the listed information. While Britannica utilizes paid researchers and fact-checkers, Wikipedia is completely driven by contributions—submissions and editing done by its readers, with relatively few moderators.

That in and of itself is not a bad thing, provided care is taken. Wikipedia is free, which is a big plus, or should be, to those seeking knowledge on the cheap.

But often you get what you pay for, and while Wikipedia does monitor its listings and require citations, it really doesn’t do the job that a well-researched encyclopedia does.

The problem is that many things can “slip” through, and Wikipedia is actually susceptible to having misinformation posted.

Particularly worrisome is its policy on biographies of living people. Despite this vulnerability to factual errors, Wikipedia does not allow the subjects of its bios to correct mistakes.

While I can see there may be some wisdom in prohibiting people from posting their own biographies on the site, it just seems plain stupid to not allow corrections by the people who probably know the subject—themselves—best of all.

While I have caught some Wikipedia mistakes myself, the problem was brought to my attention recently on Dean Radin’s blog. I’m not a big fan of most blogs, but Radin’s research into the possible relationship between psi phenomena and quantum physics is fascinating—and controversial.

It is the controversial nature of his work that has made him the target of true-believer skeptics (those who fervently believe psi does not exist), as well as a target of some religious communities.

But Radin is a serious scientist, utilizing recognized methodology and statistical analysis to examine one of the more mysterious aspects of human potential. To knowingly leave factual errors about his life’s work is to knowingly perpetuate misinformation, and an “encyclopedia” that does so loses all credibility, as Radin accurately points out on his blog.

“I discovered this when attempting to correct factual errors in the entry page on my name, and for the Institute of Noetic Sciences. I’ve been asked not to edit these pages, even though I am arguably the expert on me, and an expert on (the Institute of Noetic Sciences), because it violates Wikipedia's guidelines… Wikipedia’s absurd guidelines means that for topics of interest to many people, namely controversies, the articles are guaranteed to be of poor quality. What a ridiculous state of affairs this good idea has come to, one that very effectively does one thing well—it perpetuates stupidity,” Radin wrote.

I couldn’t agree more, and upon closer inspection, it would seem that the more controversial the subject, the less likely it is that a balanced perspective will be presented. Because Wikipedia’s authors are anonymous, there is no one specific to approach about accuracy, and since Wikipedia is run by a decentralized group of volunteers, there is no one who is responsible.

Wikipedia may have started out as a good idea, but a reference resource is only as trustworthy as it attempts to be—and Wikipedia isn’t trying very hard at all, unless it is to, as Dr. Radin so aptly stated, perpetuate stupidity, only in the false guise of wisdom.

For more on Dean Radin’s work, go to www.ions.org, www.deanradin.com or http://deanradin.blogspot.com.

(Originally published in The Easton News, April 12, 2007)

Discriminating between bad and good news is a must for sanity

You turn on the evening news and hear it every day. Bad news again. Nineteen killed in a bus accident in Bangladesh. Shootout in Detroit leaves three dead, nine wounded. Tornado flattens church in Kansas.

Or take these real-life examples of “news” briefs that recently ran in the local dailies:
“Police detain suspect in dog beheading—(Minneapolis) A man suspected of cutting the head off a teenage girl’s dog and leaving it at her front door in a gift-wrapped box was in jail Friday on suspicion of terrorist threats. The 24-year-old man, who was not immediately charged, used to date Crystal Brown, the girl’s grandmother said.”

And:
“Teen found dead in school bathroom—(Hyrum, Utah) A 14-year-old girl who had talked about suicide died in a hospital a day after hanging herself in a school bathroom, school officials said. A classmate had found Kailey Mathews unconscious Wednesday in the bathroom at South Cache 8-9 Center.”

Isn’t news supposed to inform you of things that may affect you? How does either of these items affect anyone in the Lehigh Valley, excepting the vague possibility that the alleged animal abuser, Crystal Brown or the poor suicidal teen is related to someone in Pennsylvania? And if that is the case, how does putting either of these items in the paper inform anyone of anything they need to know?

Are we supposed to feel relieved that our dogs are safe from some twisted maniac in Minneapolis? Was he headed east? And maybe I’m incredibly unhip, but who is Crystal Brown, and why should I care who she is? A Google search turned up very little on her. Apparently she is a poet, but having a dog that was a murder victim recently is probably her bigger claim to fame at this point, which is pretty sick. Not quite as sick as a person who would behead a dog, but still pretty sad commentary.

The second “news” item is just worthless, in my opinion. Reporting something like that so far away serves absolutely no purpose whatsoever, unless there was some mitigating information that makes it locally newsworthy, such as a sudden nationwide rash of teenagers with an urge towards suicide in school bathrooms. There is nothing anyone in the Lehigh Valley is going to do about this particular sad incident—it only serves to depress and to distract one from more relevant issues.

That is a big problem with some “news” these days.

There seems to be more and more of a trend toward what I call “non-news.” These are items that have absolutely no chance of having any bearing on your life, yet are played continually on the mainstream news scene. When exactly an entire industry got together in some big conspiracy to waste newsprint, ink and airtime, I’m not sure, since they didn’t send me that memo, but it does seem to have come to pass.

The worst of “non-news” items are the negative briefs. Too short to actually flesh out the story with any useful information that would allow the reader to make sense of whatever issue is at hand, they basically drop a bombshell and move on to the next “story,” inevitably leaving the reader feeling both unsatisfied and uneasy.

It also wastes the reader’s time. Why bother reading about something that has no bearing on one’s life, that one can do nothing about, that only serves to depress? Most people wouldn’t, but most “non-news,” especially the kind that comes in brief, sneaks up on the reader—by the time one realizes one doesn’t care about the piece, that one would really rather move on to something else, something more worthwhile—it’s over.

There are several nasty effects of this. The first is that the reader begins to approach news with some caution and trepidation. Is this news worth it, or is it just going to depress for no apparent reason?

When people tell me they’re “not into” following the news, I often suspect this is the reason—they’ve been stung by irrelevant, bad “non-news” so often they are numb to the difference and just shy away from the entire arena out of fear of getting depressed about something they can do nothing about.

That is the biggest danger of bad “non-news.” It’s disempowering, and being swamped with it day in and day out is even more disempowering. Being showered with negative information one can do nothing to change eventually gives one the impression that one can do nothing about anything negative, regardless of where the problem originates.

“Non-news” is also a bit confusing. In this Information Age, we’re literally flooded with information at a rate never before seen. More may be defined as better, but is it really better if the more isn’t quality stuff? There may be an endless stream of information, but there’s still only so much time in which to absorb it.

The “news” these days requires a new set of hyper-developed skills to filter the “good” news, the kind with useful information, from the “bad,” or useless, kind of non-news, in order to keep oneself from drowning rather than surfing in the “news” world.

(Originally published in The Easton News, April 5, 2007)

Encouraging self-esteem is a mistake if you don’t encourage self-respect

It was a downright rotten thing last week to hear there was another life lost in the Lehigh Valley to the gang “culture.” Paul “Bam Bam” Serrano III, 18, was charged by Bethlehem police in the shooting of 15-year-old Kevin Muzila, and police say it was a case of mistaken identity, that Muzila wasn’t even the intended target. Likely, Serrano was “auditioning” for a gang.

Though the murder took place in Bethlehem, it could just as easily have been Easton. When these tragedies happen, everyone at first grieves and asks, “Why?” It’s a valid question. Then they quickly stop asking “Why?” and start assigning blame. If you don’t believe me, go to the comments section on the daily newspapers’ Web sites and see the vitriolic diatribes. Blame the parents, blame the schools, blame Youth Services, blame Serrano—just blame someone, it’s got to be someone’s fault.

And it is. If Serrano did indeed murder young Kevin Muzila, then, as an 18-year-old citizen of this country, he is responsible, regardless of what his parents, the schools or Youth Services did or did not do.

But this is not any sort of isolated incident. This is certainly not the first time in the Lehigh Valley, unfortunately, that some dumb young punk has gotten it into his head that murder will somehow make him “more of a man” or a “better” person. In fact, it’s happened recently in just about every state in the nation.

Sure, you can blame violence on television, or bad parenting, and I don’t deny these are contributing factors.

But if you look back on it, a lot of the increased violence seemed to start a few years after there was a public push to build kids’ self-esteem.

Don’t get me wrong. I do think kids should be encouraged to feel good about themselves—at least, when they’ve done something good and deserve to feel good about themselves. It’s a bad message to send, that one should still feel good about oneself, even if one knowingly does bad things. Sorry, if you go around hurting people, you do not deserve to feel good about yourself, and a person who habitually goes around hurting people is not a good person.

But instead, the message has been, “Well, you did something very bad, but it’s okay. You’re still a good person.”

It’s not okay. That’s entirely the wrong message to send.

And there’s a subtle, but important difference between self-esteem and self-respect.

The message of self-respect would go something like this: “Well, you’ve done something very bad, and it’s definitely NOT okay. I know you can be a better person than that. Now live up to it.”

The kid with self-respect would not lower himself to become a murderer or a criminal, but unfortunately, the kid with self-esteem but no self-respect might not stop himself, if he thinks there’s a chance of gain. After all, you might murder someone (or rape them, or steal their stuff), but underneath it all, you’re still a good person, right?

Obviously, that’s wrong. But there’s a good chance Serrano was thinking something along those lines.

So, whose fault was it? If proven guilty, Serrano’s. But every one of those folks who told him it was okay, that he was still a good person anyway, who didn’t set him straight along the way, whether they be parents, teachers, coaches or even strangers, might just share a bit of the blame.

The gangs, who promote this “culture” of violence in our neighborhoods, near our homes, poisoning our children, need to be deglamourized. And, underneath it all, because they do what they do, they are not good people.


A tale from the West Ward

Last Friday, I was at a stop sign approaching Ferry Street, when I noticed two young boys on the corner, one of whom was repeatedly making at me a gang gesture reminiscent of the censored Beardsley Lysistrata illustrations. I didn’t immediately notice, if you can believe that, and then did a double take to see if I was really seeing what I was seeing. These boys were no more than 9 or 10.

I decided to stop for a moment and stare. The boy, egged on by his friend, continued to make the dramatic, vulgar gesture.

I pondered this for a moment. Obviously, they were trying to get me to react. It’s not like young boys haven’t been known to make rude gestures at women for centuries. But seriously, you had to be there; this was way over the top. I really couldn’t just let a little kid stand ignorantly on a busy street corner gesturing like that at his crotch, especially when I was pretty certain he didn’t have a clue what he was really doing.

I rolled my window down and asked him pleasantly if he knew what that gesture meant. I was right; clearly he didn’t. But his friend, the one who’d egged him on did—sort of.

“Suck it, baby!” he told me exuberantly, as though he would get some sort of prize for the right answer.

So I decided to push it a bit, to teach them a lesson.

“Suck what?” I asked.

“Uh...”

“Suck what? That is an action involving opening one’s mouth and wrapping one’s lips around something. Suck what?” I said. “I assume by ‘baby,’ you mean me, even though I’m 37, and you appear to be considerably younger.”

The egged-on perpetrator got a burst of courage at this point, and tried one more time. He made the gesture again, and said, with a shrug, “Suck it.”

“Rest assured it will not happen, but are you actually telling me you want to pull down your pants, underpants included, and somehow make me wrap my lips around your privates, here in public, on this street corner? Because that is what you are proclaiming, for all the world to see,” I said to him.

He stopped cold, eyes wide.

“I’m in fourth grade,” he said.

“I thought you might be about that old,” I told him. “And I pretty much thought you might not really know what you were doing when you made those gestures. I don’t know who showed you that, but they are not a good person to listen to, whoever they are. They are definitely not cool. What do you suppose could happen if you made that gesture at someone who took you seriously? Or who’s crazy, and got offended and tried to hurt you? Are you sure you’re really ready to deal with that?”

Things were pretty somber after that. I told them to have fun but be careful, and they waved goodbye as I drove off.

The whole conversation took less than two minutes, but they were the best-invested two minutes of my day, if not my week.

(Originally published in The Easton News, March 29, 2007)