America, a Christian Nation?

Coming to Terms with a Sad Truth

A number of different thoughts have been rattling around in my head recently. And as the months have passed, they’ve coalesced into something larger that I can’t seem to escape. There’s something frightening about it because it’s paradigm-altering. But the longer I meditate on these things and read the scriptures and watch the news, the surer I become that our paradigm needs shifting. So, what am I talking about?

I’m talking about colonizing the Kingdom of Darkness.

I probably should throw up a *TRIGGER WARNING* right about now. What I’m about to say is going to make some of you mad. Some of you probably won’t even be able to finish reading it because the thoughts I’m going to share run so contrary to what you want to believe.

So before I go any further, let me say this. I love America. I love American history. Many of the US founding fathers are personal heroes of mine. I have immense respect for those who have risked and given their lives to protect America. Nothing I say here is meant to disparage any of that.

Instead, like the news that you have a terminal illness, what I’m about to say is something we need to face head-on. We can’t escape it. We can’t hide from it.

It’s time we come to terms with it.

If we’re going to effectively reach the nations with the Gospel, we must realize this simple fact: America is not a ‘Christian nation.’

America is not a ‘Christian Nation’

Before we go any further, I figured I’d throw the big one down first. If there’s anything that Christians need to come to terms with, it’s this.

Now, before you get out your pitchforks and torches, hear me out.

This is a nation in which abortion – the murder of unborn children – has been legal for 46 years. And in that time, over 60 million children have been murdered – most of them for personal reasons.

This is a nation that, over a year ago, redefined marriage to include same-sex couples – something no Christian theologian or ethicist could ever have imagined until very recently.

This is a nation where 40-50% of marriages end in divorce.

This is a nation where 66% of men and 41% of women admit to viewing pornography at least once per month.

This is a nation that revels in consumerism and voyeurism.

This is a nation that has been at war for 93% of its existence.

This is a nation which, according to polls done in 2014, is only 70% Christian to begin with (46% Protestant, 20% Catholic).

So, if you met someone who had just divorced his spouse, married someone of the same sex, thought abortion was a great lifestyle choice, viewed pornography regularly, constantly fought with everyone around him, considered himself ’70ish% Christian’, and was unrepentant about all of it, would you consider him a Christian?

My guess is, if you’re of the conservative evangelical stripe, your answer would be, “No. Of course not. Now, if he repents, that would be a different story, but until then…absolutely not.”

So my follow-up question would be, “Then why do we continue to call America a ‘Christian nation’?”

Why Do Many Continue to Call America a ‘Christian Nation’?

Is it because of her founding? America was founded largely by Christians who held to a Judeo-Christian worldview (albeit one affected by the Enlightenment, but Christian nonetheless). But arguing that America is a ‘Christian nation’ because of her founding is akin to arguing that a person is a Christian because he was born to Christian parents who had the best Christian intentions and tried to raise him in a Christian environment.

That’s great and all, but regardless of his parents’ intentions, when little Sammy grows up to be Uncle Sam and decides to go out and eat slop with the pigs, he’s no longer a Christian. It doesn’t matter that he was born into a Christian home or that his parents raised him right. It doesn’t matter that his birth certificate was signed by 54 godly, Christian men (four of them preachers). All of these things may be important for historical reasons, but there’s a reality that needs to be faced today.

And that reality is this: America is not a ‘Christian nation.’

Regardless of who wins this election, we need to come to terms with this if we’re going to move forward.

As long as we deceive ourselves into believing that America is a ‘Christian nation’, we will approach the task of living as Christians in America in the wrong way.

What do I mean by this? If we view America as some sort of New Israel, we will assume that it is our task to impose Christianity from the top-down. In other words, the primary goal of Christians will be to elect Christian leaders and vote for Christian laws. When we view America through this paradigm, we see ourselves in the same vein as the Old Testament prophets, calling Israel back to God and his Gospel. But America isn’t Israel.

So, this leads us to another important question: If America isn’t the New Israel, what is she?

Next Time: American Babylon


