Mass communication

One of the latest fads to grace the conventional wisdom is the assertion (and I paraphrase) that unfettered mass communication is tantamount to the Second Coming.  Whether it’s the credibility-shaking claim that the Time “Person of the Year” is “you,” or the deification of Wikipedia/MySpace/YouTube/Linux as the harbingers of a new and enlightened social consciousness, the message of the technorati is the same: New tools of mass communication are freeing the flow of information, and this is an unmitigated Good Thing.

Well, OK.  I’ll admit to an almost slavish addiction to my myriad e-mail accounts, and I’ve been known to grace an IM conference or two in my day.  I even, on rare occasion, use SMS (text messaging) on my BlackBerry.  Yet there is a world of difference between taking advantage of the sundry conveniences wrought by our relentlessly advancing technology and claiming that this technology marks a new epoch in human social relations.

Consider the argument offered by those sophisticates most inclined to drink the bleeding-edge Kool-Aid.  They suggest that the proliferation of free or very-low-cost tools permitting the widespread distribution of information means a great number of emotionally pleasing but astonishingly vague benefits will logically follow:  stronger democracy, better information, more enlightened debate unconstrained by the traditional limits on engaging in mass democracy (e.g., owning a press or possessing the financial means to influence the organs of government).

Trend-watchers will point to the bloggers’ exposure of Dan Rather’s memos as a crowning achievement of the citizen journalists.  All hail the common man as he slays the beast of corporate media!

I don’t dispute that there are circumstances in which the touchy-feely argument holds.  China and Iran seem good case studies; in both countries, dissidents are using the Internet to facilitate collaboration and communication in ways that are undermining (however subtly) an otherwise authoritarian regime.  Such examples, however, are not especially controversial, and raising them in defense of the original major premise seems to border on the tautological.

The more interesting aspect to the spread of tools of mass communication rests not in the developing world, but in the industrialized North.  The very vanguard of the revolution points to its eventual Thermidor — since the masses tend to use new tools like MySpace or YouTube for disgustingly narcissistic pursuits, it seems inevitable that the “old media” owners of the established communications channels will eventually purchase or license the hammers and sickles of the new media proletariat.  It’s hard to claim the high road against commercial exploitation, after all, when the audience has already laid the paving-stones on the low road.

For no matter how much some are cheering the growth of blogs, video-sharing services and social-networking hubs, the overwhelming evidence points to the ugly truth that The People are not interested in becoming little Patrick Henrys, but rather prefer to use the Internet as a giant exhibition hall, so that the artifacts of self-glorification of the voyeurs, by the voyeurs and for the voyeurs shall not perish from the Google Search Results window.

Ever been to MySpace?  Or perused the roster of clips on YouTube?  They offer innumerable mini pantheons erected to the great god Ego.  Bluster, name-dropping, “being seen” — these are the primary purposes to which “mass communication” is being directed.  Even a project that seems to have some redeeming value, such as Wikipedia, has to be protected from the very people intended to contribute to it, lest they spoil it with pranks or false information.

There is a subset of people who take the responsibilities of living in a world of instant mass communication to heart.  Many of them are the Old Guard of the early days of the Internet, who remember what Usenet was like before AOL granted its members access to public newsgroups, and who built e-mail protocols never imagining that forging headers would become routine among people hoping to make a quick buck or to infect an unsuspecting user’s computer.

I do not mean to sound like a curmudgeon.  I am generally enthusiastic about the innovations in computer and communications technology that are lowering the long-standing barriers among people.  Heck, this is a blog entry, after all.  But I am loath to invest this phenomenon with the same degree of pseudo-messianic significance that seems to be trickling through the e-salons of über-enlightened hipdom.

With every complex good thing comes the potential for many unintended consequences.  E-mail, wikis, IM protocols and the like can provide benefits — real benefits, as in Iran, or mere conveniences — but they remain, in the end, nothing more than tools manipulated by ordinary people.  And the essence of people will not change simply because they get these shiny new tools.  Our impulses are still governed by the same biological drives, and our actions are still stained by the same fallen nature.  We remain just as capable of dignity or depravity, of beauty or barbarism, of wisdom or witlessness, as we were ten years ago — and we will remain thus, ten or a thousand years hence.

By all means, use the technology, if you can.  But do not mistake a material cause of our ongoing social development for its efficient cause.  The new tools of mass communication are merely a means to an end, and their presence or absence does not change the essence of who we are as thinking, feeling beings.  And they never will.

In other words … skip the Kool-Aid.

Thank you, President Ford

I just watched the service inaugurating the viewing of President Gerald R. Ford in the U.S. Capitol.  The speeches by Speaker Dennis Hastert and Vice President Richard Cheney were appropriate.

I reside in Ford’s Congressional district.  Much of Grand Rapids honors the man; we host the Ford Museum, the Ford International Airport, and the Gerald R. Ford freeway (I-196).  I was even born during his administration.

Cheney and Hastert echoed similar themes — that over the history of the American Republic, Providence has blessed us with the right man for the times.  Ford took the high road during the political turmoil of Vietnam and Watergate.  He pulled us out of Vietnam, and pardoned President Richard Nixon. 

The pardon probably cost him the 1976 election.  But he did the right thing.  Now, 30 years later, we are better able to appreciate the wisdom and the foresight of his decision.

As Cheney said, we don’t know what turmoil, what pain, was circumvented because of that pardon.  We can guess, though, that it would not have been pleasant.

So thank you, President Ford.  May history judge you more fairly than your contemporaries did.

Social reality

I made my first actual trip to the gym today.  I joined a little before Christmas, but hadn’t had the time to really get into it until the urge hit in the early afternoon.  Interesting mix of people — some older folks trying to stay in shape, some younger guys trying to bulk up, and a lot of people on the treadmills.  Including me: Those things are phenomenal, with full-screen touch-sensitive displays and the whole works.  Impressive place.

Anyway, as I was jogging away on the machine, my mind wandered to thoughts about social reality.  Yes, surrounded by hot chicks jiggling away on stationary machines, I was thinking philosophy.  I am a nerd, albeit an unusually healthy one.  Nevertheless ….

A few years ago, I read John Searle’s book, The Construction of Social Reality.  Although I grasped his main argument at the time, the impact of it hasn’t really hit me until recently.  The book can best be summarized by (believe it or not) its back-cover blurb:  “… Searle examines the structure of social reality (or those portions of the world that are facts only by human agreement …), and contrasts it to a brute reality that is independent of human agreement.  Searle shows that brute reality provides the indisputable foundation for all social reality, and that social reality, while very real, is maintained by nothing more than custom and habit.”

Well and good.  What this means is that much of what we take for granted as objectively true is “true” only insofar as we all agree to think it so.  We believe, for example, that there are facts about money, or marriage, or art — but these facts have no basis in a world without humans.  What good would it do to say that a penny is 1/100 of a dollar if there were no people around to use currency?

It’s interesting to see how a localized social reality, e.g. in the home or office, can shift simply as a matter of public perception.  I have been watching just such a scenario play out over the last few weeks.  A generally accepted understanding about a particular person’s role within a group has shifted dramatically merely because a few key players have allowed themselves to form a different opinion about that person’s contribution to the overall effort.  Although nothing specific changed, and there were no incidents to prompt a paradigm shift, the change in the center of gravity of the group meant that the person in question went from insider to outcast in short order.  And once the prevailing winds turned, the others acted as if the new paradigm had been true all along.

Whether it’s the living room or the conference room, I think we too often take for granted that so much of what we believe to be “true” is merely a matter of convention.  As my friend Emilie so eloquently noted, people don’t take kindly to the black sheep in the flock.  When the conventional wisdom changes the social reality of any group of people, the folks clad in dark-colored wool can only rarely use reason and logic to advocate for change, since logic — that is, the art of argument — is essentially the manipulation of fact and not fact itself.  Or:  If a “social fact” becomes the conventional wisdom, then reason alone is disadvantaged against it.

[All of this commends the written works of Robert Greene.  I have read The 48 Laws of Power and have finished 31 of The 33 Strategies of War.  Next up is The Art of Seduction.  Each is written in a dispassionate, almost amoral tone, unapologetically infused with power dynamics — the very essence of the popular misconception of Machiavelli’s works — and the derogatory reviews his books sometimes receive betray a lack of appreciation for the literary devices pervading them.  Yet once you peek behind the curtain, you encounter some rather interesting insights into managing interpersonal relationships (seduction), group dynamics (war), and personal ambition (power).]

Taking Searle’s epistemological argument to a small-group scenario prompts ethical questions about the appropriate methods of social interaction.  We like to think that being honest, rational, and direct with people is the best policy.  Best, because most noble and most effective in the long run.  But is this necessarily true? 

If the perceptions of a group ascend to the level of social fact, then logic alone is largely incapable of changing them.  Absent logic, only indirect appeals to emotion have the power to shift perception in most people.  Granted, there are rational people who respond well to well-reasoned arguments, but it seems that most people remain affixed to their social facts and will change them only when the conventional wisdom shifts (we see this herd mentality with the punditocracy, for example).  So if reason doesn’t work, must we resort to emotional manipulation?

I realize that this is not a binary proposition.  Yet the logic of it does suggest that “good boy” behavior is only truly effective for those who can afford to use it.  For the rest of us, other means of ensuring success may be more strongly indicated. 

At any rate, this has been a fun topic to ponder, and I’m not finished with it yet.

Modes of discourse

Some groups have a dynamic wherein certain forms of communication are more valued than others. This phenomenon holds true across broad swathes of human endeavor — and the workplace, not least of all. This prompts a question: What is the appropriate response for a person who does not subscribe to the dominant mode of discourse in a given social context?

I ponder this as I struggle to arrive at a coherent response to some ongoing disruptions in the workplace. Healthcare has its own rhythm, and mixing revenue-cycle operations with clinical care can sometimes lead to curious cultural hybrids. But my hospital in general, and my department most especially, is moving along a path that is offering a privileged place to one particular way of communicating. And I’m not sure this is a good thing.

In brief … My department is hewing ever more closely to a communications culture that elevates relationship building and indirect influencing as the officially prescribed means of discourse. It is a very feminist model — the highest virtue is preserving relationships and avoiding direct conflict. It also encourages multilateral negotiation over unilateral assertion, and legitimizes a wide range of emotional responses to workplace stresses as not just valid, but encouraged. More masculine modes of discourse — especially those that give pride of place to logic, authority, stoicism and bluntness — are frowned upon, and practitioners thereof are de-legitimized as being difficult, arrogant, or power-hungry lone rangers who don’t care about the team.

To be sure, there is great value in building relationships and in finding indirect means of encouraging a particular outcome short of direct conflict. But however valuable these skills may be, they do not represent the source and summit of professional behavior. The feminist modes are excellent tools in one’s communication toolbox, but they cannot be the only tools, and they cannot be used indiscriminately. Sometimes, persuasion and consensus-building are appropriate; sometimes, someone just needs to make a decision and be done with it.

It amazes me, still, to see the number of times that logic is trumped by the desire to avoid “burning a bridge” — especially when the logical position must yield to an irrational emotional response (actual or anticipated) by others. And I lose count of how many times a conversation has shifted from the “what” to the “how” of communication; when concerns about “trust” are permitted to cloud the substance of a disagreement, everyone loses.

It takes only the briefest survey of the day’s headlines to conclude that society is increasingly incapable of channeling natural male aggression to socially useful ends. Suicide bombing, prolonged “college” adolescence and inner-city gangs provide ample evidence of a certain degree of widespread social decline. It does not help when traditionally masculine behaviors are banished from the pale of professional behavior and feminist approaches to communication are considered normative, deviation from which is considered to reflect poorly on the transgressor.

A healthy approach to workplace communication recognizes that there are several (often contradictory) approaches that are equally valid. Superior communicators realize that there are many different tools in the idea-sharing toolbox, and have an understanding of which tool is right for a specific job.

But all this notwithstanding, it is a difficult task to move two dozen leaders away from the Kool-Aid of feminist discourse and toward a more healthy and comprehensive understanding of communication excellence. What an adventure this should prove to be!

The Election, Briefly Remarked Upon

Let me begin this post-election analysis by admitting a few things.

First, I do not claim to have any special wisdom about what happened on Nov. 7. Second, I did not make any sweeping predictions about the outcome, because I was not confident enough in my usual sources to even hazard a guess — and I freely admit that the informal predictions I made to friends proved significantly off the mark. (Yes, Tony, you’ll get your $50.) Third, I disdain Wednesday-morning quarterbacking and am dismayed at how simple and crystal-clear the reasons behind the “sweep” have proven… in hindsight. Yet, if the lessons were that obvious, then why couldn’t they be discerned before the first ballots were cast?

That said, there are a few things that warrant comment.

1. The voter-suppression effect of reporting in the mainstream media cannot be overstated. The drumbeat of months of uniformly negative reporting about the GOP and the war in Iraq took a toll, and given the closeness of so many of the deciding races, this effect may have contributed to the Democratic margin. After all, what better way to depress the Republican base and energize the Democrats than story after story about how corrupt and aimless the GOP Congress has become and how badly the war in Iraq has been conducted?

2. The margins in the deciding races were close, and a lot of otherwise safe Republican seats were lost for unusual reasons (e.g., resignations). Although the Democrats were smart in recruiting conservative challengers for these seats — it did prove to be a winning strategy — it remains to be seen whether the Democratic hard-left base can deal with the disappointments that will follow from having a closely divided Congress that includes even more conservative Democrats in the House. The odds that a torrent of radical legislation will flow from Washington are virtually nil; the ideological cast of the Democratic contingent is less to the left than it had been, despite the presence of so many of the old guard in leadership positions. The DNC has its majority; whether it’s sustainable given intra-party disagreements is a different matter altogether.

3. It appears that the Republicans underestimated the Democrats’ ability to figure out how to turn out voters in a 72-hour effort. Let us hope that this error in judgment will not be repeated in 2008.

4. In Michigan, the decades-long problem with the GOP is that the Republican leadership in the state has been more motivated by political evangelical Christianity than the people have ever been willing to tolerate. John Engler won because he was pragmatic; he could appeal to the Reagan Democrat tradition that pervades the rank-and-file members of the automotive industry. The DeVos family, however, has pushed its preferred policies on the state to the detriment of the party’s overall electoral viability. For example, the local party elite’s insistence on putting a universally unpopular school-voucher initiative on the 2000 ballot cost George W. Bush the state’s electoral votes and Spence Abraham his Senate seat. The “I’d rather be pure than victorious” strand of Republicanism is alive and well in Michigan.

5. The Republicans were their own worst enemy over these last few years. Speaking as a committed Republican, I can admit to a sense of disappointment over the gross overspending, lax oversight, coziness with lobbyists, and lack of legislative initiative that has marked recent GOP Congresses. I am loath to suggest that the GOP deserved to lose, but party leaders didn’t make a compelling case for retaining the majority.

6. A lot of pundits are saying that it’s all about Iraq. I have my doubts. Distilling the “why” of electoral results is rarely so simple and elegant, for starters. But Iraq is complicated, and few even among the educated have a decent grasp of what’s really going on. To argue that the election results were a referendum on Iraq is therefore misleading; it might be more accurate to say that the election was tinged by public misperception about the state of affairs in the Middle East. And as such, changing policy as a response to poorly informed public reaction is a bad idea. It does, after all, tend to reinforce the inappropriate behavior of those who use terror as an instrument of political persuasion.

This election has come and gone. The story of its long-term results and its impact on the 2008 race has yet to be written. But one thing is clear — there is a lot going on, and not all of it can be distilled into a handful of talking points.

A Clash of Nihilisms

Two phenomena dominating U.S. politics are intertwined in a manner that few seem willing to appreciate.

The first is the reality of extremist Islamism, and its culture of jihad waged through means that include suicide bombing as an acceptable, even routine, tactic. The second is the moral collapse of the Left in the Western world. The wellspring, in both cases, is the same — radical nihilism, and the socially appropriate methods of finding existential meaning in a nihilist culture.

In the West, nihilism is essentially philosophical and has its roots in the logic of the Enlightenment. In reaction to the political role played by organized religion in Europe in the later Middle Ages (itself the result of the social collapse whose defining moment was the fall of the Western Roman Empire in 476), the trend among European elites was to move increasingly far from the Church. Thus, scientism became supreme, and with it, a toleration for difference that culminated in today’s diversity movement.

The assumption underlying classical liberalism is that only through objectivity of fact and relativism of belief could a free society advance. The only truths to which the Left can admit, then, are those it holds (as a matter of faith, ironically) to be absolutely scientific; religion must be limited exclusively to the private sphere, if it is even to be permitted at all, and any other system of belief must not make claim to objective truth.

Of course, this is a problematic position. To claim that scientism is the only approach to truth is to ignore the intellectual problems wrought in areas where scientific rationality simply cannot hold court. And, there is a logical contradiction at the core of relativism — to wit, that all things are relative except for the one absolute that all things are relative.

This leaves today’s Left with … not much in its toolbox. Its scientism, being largely triumphant, is no longer capable of rallying the troops (despite the occasional jeremiad against “the theocons” in Washington). Its relativism, barring it from rendering definitive value judgments, leaves it incapable of responding forcefully to strategic shifts among civilizations that may imperil the very viability of Western civilization. With no core belief system that it can hold to be true (apart from scientism and relativism themselves), the Left cannot articulate a coherent defense of the West or even of liberalism. Hence the inanity of debate in Europe about Islamic immigration. Thus denuded of both sword and shield, all that remains for the Left is mere spit and bile.

This phenomenon is tellingly demonstrated in the behaviors of far-Left politicians in the United States and their cohorts abroad. They are capable, for example, of denouncing President George W. Bush and Prime Minister Tony Blair in the strongest of terms, but those terms are almost always ad hominems. Bush is stupid; Bush is evil; Bush is a monkey; Bush is a war criminal. Each claim is patently false, but for the Left, just making the claim is considered a heroic act of sophisticated truth-telling. Without core beliefs and an openness to non-scientific truths, the Left’s politics is little more than stone-throwing.

Of course, not all denizens of the Left are irrational hate-mongers. Even so, the response across the spectrum of Left-wing civility has been to rely increasingly on asserted value claims rather than reasoned arguments, and whether your source is the Daily Kos or the New York Times editorial board, too much of the Left’s political commentary is torn by its desire to assert value-laden truth-claims about non-objective subjects while attempting (usually inadequately) to preserve its scientific, relativistic orthodoxy. I am reminded of the trope used by a priest at my church, long ago, who punctuated an especially animated homily with the statement: “God cannot sin against Himself.” Neither can the Left betray its own basic assumptions without a fair amount of long-term psychological damage.

In the Islamic world, by contrast, the nihilism is more recent and is rooted in demography. Because Islamic philosophy embraced a fundamentally Platonic worldview, it was more comfortable with authoritative pronouncements about the world than the Aristotelian West ever was. In an intellectual milieu wherein Koranic philosophy contains the definitive delineation of metaphysics, there is less need for experimentation or even a spirit of inquisitiveness. Thus, the Islamic world fell behind as the West’s technological lead widened — and with that gap came a socioeconomic disadvantage that is fundamentally incompatible with Islamic self-perception. Nihilism is a rational response to the divergence between the ontological claims of one’s faith tradition and the oh-so-obvious reality in which that tradition is lived. In other words: If the logical validity of a person’s scripture is undermined by the discrepancy between the world and what the scripture says about the world, then the person must either abandon (or at least re-interpret) the scripture, or abandon the world. Radical Islamists have chosen the latter path.

The demographic problem of the Middle East, then, is fueling the radical Islamist assault. If the Koran says that God blesses Muslims with happiness and prosperity in this world, but you live in backward squalor, then the Koran must be false. But for idealistic youths who have no other socially acceptable outlet for their natural, biological aggression, there is a second option — to assume that Muslim civilization is under assault by the Other (Christians, for example) and that therefore it is the will of God that the oppressors of Islam be brought to earthly justice. Hence the attractiveness of suicide bombing. And given the number of young males, and the high birthrates in the Arab world, the nihilism of the contemporary Islamic intellectual position is explosively aligned with a burgeoning youth culture that grasps for meaning, recognition and tribal solidarity … and finds it in radicalized religion.

The twin gorillas of contemporary American politics are the moral vacuousness of the Left in response to a civilizational assault, and the proper response to militant Islamism per se.

The Democratic Party, the standard-bearer for the American Left, simply denies that radical Islamism actually constitutes a coherent threat. Against all the objective evidence presented over the last few decades (not to mention the assertions of radical imams across the globe), the Left rationalizes its inability to respond to the threat by denying that the threat actually exists.

The Republican Party, for its part, has responded to the threat of Islamist demography but not to the ethos that feeds it. Invading Afghanistan or Iraq (or Saudi Arabia, or Iran) won’t solve the problem. True, it will mitigate it; Bush may be right that the best we can do is take the fight to the enemy on the enemy’s own soil. But the definitive resolution to radical Islamism can originate only from within Islam; Muslims need to reform from within. Until that happens, the best the West can do is simply to police the borders.

The conflict in the West between the desire to respond militarily to provocation and the desire to ignore the basic problem by treating terrorism as a legal matter has created a less-than-robust reaction to militant Islamofascism. This weakness, in turn, is incorrectly perceived by radicals as a sign of (a) the emasculation of the West and (b) the favor of God. Both justify the tactics and beliefs of radical Islamism. The circle becomes vicious, and the body bags pile up.

We are left, then, with a clash of nihilisms. The politically potent and aggressive manifestation of Islamic demographic nihilism is clashing with the morally vacuous and passive manifestation of Western philosophical nihilism. Whether a shift in demography or in political reality will affect the interplay of these nihilisms remains to be seen, but the outlook, if things continue as they are, is not encouraging — for Islam, or for the West. Perhaps the West will rediscover its faith in its own beliefs and institutions. Perhaps moderate Muslim leaders will stem the worst of radical behavior. Or perhaps the struggle will continue for generations.

Or perhaps one side will learn the wisdom of repudiating its own nihilism, thus freeing it to respond more effectively to the other. Some of us still dare to hope.

Stewardship as metaphor

My church, a Roman Catholic parish of more than 1,500 families, is officially big on the concept of stewardship. You know the drill — you give your “time, talent and treasure” to the church and in return you will get various blessings and happiness. Fork over 10 percent of your gross income, and somehow God will give you even more in return (often in vague and undetectable ways). Something like the “Prayer of Jabez” with a distinctly Catholic spin. It’s a wonder Wall Street hasn’t been more bullish on the concept.

Anyway, we are subjected to relentless preaching about the virtues of stewardship. We are unceasingly exhorted to give, give, give in order to improve our faith lives. Whether one accepts the hidden premises here is irrelevant; what is interesting, from a philosophy-of-religion perspective, is the sequencing of stewardship.

If we concede the religious principle, that acts of mercy or acts of charity are morally good and spiritually beneficial, we must ask: Which comes first? The faith or the act?

Stewardship as an organized program presupposes that good stewards are already good Catholics of strong faith. Yet everyone is encouraged to be a steward. It’s trivially true that not every Catholic is a good, practicing Catholic with strong faith. So what gives? Is this a form of Pascal’s Wager, wherein a life of faith is to be cultivated through habituation? You act like you believe in order to gain faith?

The role of the church is to work to ensure the salvation of souls. Although the parable of the good steward is a great metaphor, the metaphor cannot substitute for reality. Nor can a metaphor, no matter how applicable it might be to some aspects of our lives, serve as a guiding principle for the totality of our existence.

There is more to being a good Catholic than merely following the formal precepts of stewardship, but many of the parishes aren’t teaching these other aspects to the same degree. It’s as if “stewardship” is the one-size-fits-all method for living an authentically Catholic existence.

I have no objection to occasional reminders to give more. But if the link between faith and practice is as strong and as logically necessary as that presupposed by the theory of stewardship, then instead of incessant exhortations, perhaps the church should focus, as I once told a former pastor, on helping the faithful to center their lives on Christ. If, after all, the faith is there, then the act should follow. If the faith is lacking, then no amount of nagging will achieve the desired outcome.

Focusing on the act to build the faith seems backward, but it’s the central (if unspoken) conceit of stewardship. Although this may be excusable at the periphery, elevating the concept to a position of centrality in parochial catechesis seems detrimental to the long-term spiritual health of the faithful.

Natural slavery

When I was on staff at the Western Herald, I often promised (or threatened, depending on one’s perspective) to write a column in defense of Aristotle’s theory of natural slavery.

To recap:  Aristotle often takes heat for defending the institution of Greek slavery.  It should be noted, of course, that Greek-style slavery was substantially different from the American experience with African slavery.  Aristotle argued that some people were “slaves by nature” – that is, their capabilities and their outlook left them especially well-suited to the life of a slave.  Contemporary commentators forget the myriad points of departure between today’s Judeo-Christian moral climate and the virtue-based environment of antiquity, which accorded honor and moral praise to those who most completely fulfilled their social function; in light of the history of moral philosophy, it is therefore not especially difficult to understand Greek slavery and the rationale behind Aristotle’s defense of it.

Slavery is illegal in the United States.  But the institution of “natural slavery” seems to be as strong as ever.  Want proof?  Look at the local gas-station attendants or grocery-store cashiers.  It’s one thing to do menial work for minimal remuneration for extended periods; for some, circumstances do not allow for a realistic exit from this economic reality.  For others, though, the scenario is escapable, yet the “natural slave” has absolutely no desire to do anything different.  He or she fully understands how to work the cash register and finds some degree of fulfillment in being promoted to chief clerk or assistant manager.

Indeed, I met many of these people during my first job.  I worked as a clerk at a grocery store while in high school.  We had a number of senior cashiers, some of whom had been employed for longer than I had been alive, whose daily working life focused on who got to be the shift team leader or who was given their very own supervisor number for the cash registers.  Some of these people even had college degrees, but they lacked ambition.  They were comfortable working as retail clerks.  They were good at it, and whatever their aspirations, they did not possess the gumption to improve their lot in life.

Slavery is possible even without whips and chains.  We are slaves to our own needs, wants and desires.  It is trite to observe that we are our own worst enemies, but if the shoe fits ….

I am a victim!

My name is Jason, and I’m a victim of road rage.

On Saturday, as I was driving to church (church!), a male in his thirties in a minivan backed out of a driveway directly into the path of my oncoming Jeep.  So what did your friendly blogger do?  He passed him, so as to avoid killing him.  Right decent of me, I think.

My kindness earned me an assault of the most bizarre kind.  See, the road upon which we traveled was zoned as “no passing,” but to avoid hitting the minivan, I had to pass him.  Not the worst thing in the world, since there was no oncoming traffic.  But I guess I hit a nerve with this guy, who sped up, blew a stop sign, and cut me off at a choke point along the road.  He then exited his vehicle, ran up to me, and started screaming about me passing him in a no-passing zone.

Well, I’m one of those strange people who gets very calm and rational in the face of aggression.  Whenever people blow up in my face, my mind clears and I become almost tranquil.  So I very gently reminded the angry man that, in fact, he pulled out in front of me.  At which he became even more angry — apparently, the fact that he didn’t bother to look for oncoming traffic means that I must’ve been going at least 40 in a 25 zone.  Why those numbers?  Don’t ask me.  I was too busy watching his spittle coat my window to focus on his math.

OK.  Guy in a minivan with a horrible case of road rage.  I can deal with that.  What astonished me, however, was the behavior of those present for the adventure.  Two other vehicles served as effective barriers, boxing me in place, while the occupants simply observed the encounter.  A second minivan, which was nearly sideswiped by the road-rage guy, also simply sat there while the nut was screaming at me.  One shudders to imagine how supine my fellow travelers would have been if the road-rage guy were an Islamist terrorist with a bomb.  Would they have watched in silent fascination as he carefully armed his suicide fanny pack?

“America Alone”; the objectification of women

This past week was spent in sunny Central Florida for the annual Cerner Health Conference.  Overall, things went quite well, from my perspective — I traveled with a great group of co-workers, the sessions were mostly informative, and the hotel (Gaylord Palms) spectacular.

While in Florida, I read Mark Steyn’s new book, America Alone.  His thesis is that militant Islamism presents a serious civilizational threat because the political systems of the West have denuded Western Man of a certain vitality — and that this tendency gives Islamists the upper hand because (a) the West’s preoccupation with “diversity” means we downplay the threat, and (b) the nihilistic vacuousness of Western ideology is being rationally displayed through dangerously low birthrates.  In short:  Radical Islam might win because radical Muslims might outbreed us.

Speaking of breeding … the dehumanization of women as sexual beings may be more advanced than I thought.  Many are familiar with the colorful terms used in, say, gangsta rap — but what is more chilling is the reaction in a closed and relatively informed discussion group to one person’s comments about casual sex.  The short version is that he has male friends who like to sleep with different women without using contraception — even pressuring the women into not using contraception — so that these virile studs might “breed” them (his term, which went largely unchallenged and unanalyzed by the group).  For these men, the thought of having dozens of unknown children by buxom, servile women is a psychological turn-on of the first rank, and the titillation factor is only enhanced by explicitly referring to women in terms usually reserved for livestock.  That this has always been true in a latent sense is probably trite; that social conventions are loosening to the point that sexually predatory male behavior is essentially uncontrolled is a development with complex outcomes whose advent has not received the attention it deserves, not the least of which is an acceleration of the sexual objectification of women.

On a not-very-related note, I’ve had several people mention dating problems to me.  Which is sorta funny, in a way.  But the theme is similar: No one is out there, I’m all alone, men/women only want one thing (that “I” don’t have).  OK; fair enough.  But everyone presents a package of strengths and weaknesses to potential partners.  I firmly believe that anyone who tries really hard can find a mate.  The challenge, though, is that desperation tends to work in contradictory ways.  For some, it relaxes their standards; for others, it tightens them.  As it happens, for some of my friends the latter is happening, and so they’ve narrowed their “minimum acceptable criteria” in such a way that anyone would have trouble finding the ideal him or her — more so given the limitations of each person’s own ante-up into the dating game.  Until they realize the improbability of a Royal Flush, their luck at the table will probably be disappointing.