Thursday, January 12, 2006

To Compromise a Principle is to Abandon It

I am becoming increasingly sensitive to a lot of incorrect doctrine and dogma that is a part of modern American Christianity. Instead of blogging about my toothache or my bills, I would like to get some things out in the open.

I keep hearing variations on a theme that I might have accepted without question a few years ago, but that now, as a more mature and discerning man, I immediately blanch at. Here it is: “America is a Christian nation” or “America was founded on Christian principles” or something to that effect.

After pondering this in solitude for a while, and doing some research for a while after that, I have to say that I don’t believe this to be true. My inner Black man says, “Duh!!” After all, if America started out as a Christian nation, and all these fools are looking backward to the glory days of the past, then hey, I should just find the nearest landowning Southerner and offer to be his slave. Slavery was legal and encouraged at America’s founding. And for the squeamish, you don’t even want to know the intimate details of what it took to turn free peoples into slaves. My inner Black man says that on the fact of the most brutal form of slavery the planet has ever known alone, America was not a Christian nation circa 1776.

But apologetics will excuse the barbaric behavior of church-attending, Father-Son-and-Holy-Ghost-worshipping, slave-raping-and-murdering Christians, and say that they were an anomaly. And what about old Thomas Jefferson, who had Sally Hemings (and who knows how many more like her)? No, if I am going to make my case, I have to take it off racial grounds, because the good Christian Americans of the day can’t be faulted for their treatment of the Africans and Indians; after all, these nonwhites had no souls. So let me change my approach.

What are Christian principles? I’ve been a Christian for about 25 years, and this is something I actually had to think about. I would say that the fundamental Christian principle is to love your neighbor as you love yourself. Other Christian principles are turning the other cheek, being honest, loving nothing more than God, and taking care of those less fortunate. Anybody disagree with that?

Okay, so I’m looking at American history, and I am struggling to see cohesive and consistent application of these supposedly obvious Christian principles. I see land grabbing and greed. I see exploitation. I see genocide and slavery. I see oppression and arrogance. I see hypocrisy and hate. I see the name of God used to justify killing women and children as if this were the Old Testament. I don’t recall God anointing America the new Israel. I am looking through a few of my history books, and I don’t see how anyone could honestly claim that America started out on Christian principles unless they are either ignorant or propagandists. I’d prefer to think they were merely ignorant, rather than willfully distorting facts to push an agenda.

I need someone to name a Christian principle that America has not been in violation of since its inception, because I can’t think of any. I think of the Compromises of 1820 and 1850, Dred Scott, Reconstruction, and... it’s quite distasteful. If America was founded on Christian principles, then Christianity has different meanings to different people, and all the meanings can’t be correct. I can’t believe this is something people are debating. What is there to debate? Invoking God’s name here and there makes neither man nor nation-state righteous or holy. But you shall know them by their fruits.
