Thursday, December 17, 2009

Taking Christianity Seriously?

I have often wondered why our culture at large reserves special distaste for the Christian religion. It tends to come to the fore this time of year, but it exists year-round. How do political commentators get away with claiming that Christianity is the most dangerous religion on earth? How is it that Christian symbols seem to be singled out as targets during holidays?

I have a new theory: it is because Christianity is the only religion our culture takes seriously.

Every other faith is viewed as a personal spiritual fad. None of them, according to popular culture, makes objective claims on a human life, and none of them is exclusive. Now, all of this would be news to the adherents of those religions, but our American culture has reduced them in this way. As a result, Christianity stands alone in the eyes of the populace as the one religion that makes objective moral and exclusive religious claims.

If I am right, maybe we have something to celebrate: a few of the core truth claims of the Christian faith still register with the culture at large.
