We Are Losing Our Culture
America was founded by religious Christians who believed the Bible and followed it in their daily lives. We have gradually reached the point where Christianity has no place in public discourse, and because of this, we are losing our culture. American culture is gradually being destroyed by left-wing socialist ideas and a foreign invasion that …