Is the West Becoming Pagan Again?

What is ending is not the Christian faith, with its rites and dogmas, but only Christian culture: the way Christian societies are governed.