Are Belief in God and Christianity Really Dying in America?
What’s really happening to religion in America? Plainly stated: it’s complicated.
Perhaps the title of the latest Pew Research Center report — “U.S. Public Becoming Less Religious” — provides the most concise overview, though there’s some debate over what, exactly, is going on beneath the numbers when it comes to religious adherence and practice.
This was the second of two extensive religion reports released this year by Pew, offering a snapshot of the beliefs and practices of the American populace. The first report, “America’s Changing Religious Landscape,” was released in May and focused mainly on overarching demographic changes.
The takeaway from both reports was that the American populace is becoming less religiously devout, but answering the “how” and “why” gets dicier, as pastors, faith leaders and sociologists all have theories about what’s really happening, culturally speaking.
Read the article “Are Belief in God and Christianity Really Dying in America?” on theblaze.com.