Is America a “Christian” nation? What do we even mean by that question? Historically, as I argued in my recent book, even secular conservatives have tended to assume that the United States needs Christianity. Is that still the case? Will it be the case for the foreseeable future? A couple of new polls offer contradictory impressions.
LifeWay, an evangelical research organization, asked 1,000 Americans if they thought God and America had a “special relationship.” Most did: fifty-three percent overall agreed. Among certain groups, the percentages were higher. Just over two-thirds of self-described evangelical Protestants thought America and God had a special bond. African Americans believed it more commonly (62%) than white Americans (51%). Tellingly, some of the highest agreement came from white evangelicals over 45, seven in ten of whom (71%) agreed.
Perhaps this demography helps explain a seeming paradox. Another Fourth of July-timed research report, from the Public Religion Research Institute, asserted that the number of Americans who think the USA is a “Christian nation” is declining, fastest among the young. Only about a third of respondents agreed that the USA is now and has always been a Christian nation, and older respondents were more likely to agree.
What does it mean to think of the country as a “Christian” nation? These polls are tricky, since different respondents can think different things, even if they check the same boxes. For some respondents, the idea of a Christian nation likely evokes a tight bond between public spaces and evangelical religion. In a properly Christian nation, some think, public schools and meetings ought to be guided by Christian prayers and ideas.
Other people might simply mean that a majority of Americans are Christians. If we understand “Christian nation” in this sense, public spaces can still be secularized, even if a majority of citizens share a Christian faith.
As I’ve argued in the past, it is a mistake to try to pinpoint a single moment when America turned into a secular society, or to frame the history of American evangelicals as a shift, at some specific point, from unchallenged majority to beleaguered minority. We hear these sorts of claims all the time, of course, especially when the Supreme Court issues a seemingly anti-Christian decision.
From a historical perspective, evangelical Protestants have long held enormous influence over American society as a whole. Over the long haul, certainly, that influence in public life has waned.
Does that mean America is moving away from its history as a “Christian” nation?
Or does it only mean that the United States is now and has always been a pluralistic society, with evangelicals battling for control with all kinds of other religious and non-religious groups?