Are Belief in God and Christianity Really Dying in America?

Published on November 13, 2015

What’s really happening to religion in America? Plainly stated: it’s complicated.

Perhaps the title of the latest Pew Research Center report — “U.S. Public Becoming Less Religious” — provides the most concise overview, though there’s some debate over what, exactly, is going on beneath the numbers when it comes to religious adherence and practice.

This was the second of two extensive religion reports Pew released this year, and its data provides a snapshot of the beliefs and practices of the American populace. The first report, “America’s Changing Religious Landscape,” was released in May and focused mainly on overarching demographic changes.

The takeaway from both reports is that the American populace is becoming less religiously devout, but answering the “how” and “why” gets dicier, as pastors, faith leaders and sociologists all have theories about what’s really happening, culturally speaking.

Read the article “Are Belief in God and Christianity Really Dying in America?” on theblaze.com.
