The Decline of Christianity in America

The United States is a significantly less Christian country than it was seven years ago.
This finding from the Pew Research Center’s newest report, "America’s Changing Religious Landscape," is ricocheting through American faith, culture and politics. The familiar map of American religion is rapidly changing. The map that usually comes to mind shows the South as the "Bible Belt" of white evangelicals, the Northeast as the cradle of Catholics, the Midwest full of mainline Protestants, and the Latter-day Saints in the West. Now the map reflects more and more people who claim no religious label at all.

Image: Adherents

According to the survey, Christianity still dominates American religious identity (70 percent), but atheists and agnostics have nearly doubled their share of the religious marketplace. Furthermore, the study demonstrates how more people are moving beyond the confines of denominations, an example being the rise in interfaith marriages.

An earlier sign of the decline appeared in the 2010 Pew Research "U.S. Religious Knowledge Survey." That study found that atheists and agnostics, Jews and Mormons were among the highest-scoring groups, outperforming evangelical Protestants, mainline Protestants and Catholics on questions about the core teachings, history and leading figures of major world religions. Another sign of the times is the ongoing push from atheists to remove "In God We Trust" from U.S. currency. H/T PewResearch, ReligionNews, RawStory
