The Future Belongs to Christianity?

Christianity may be growing in the rest of the world, but it is in steady decline in the West. This is deeply unfortunate, as more and more people are stumbling into a worldview that acts as a cancer on Western society: postmodernism, and all that it encompasses. Existentialist thought, nihilism, moral relativism, all of it is causing the West to decay. It is a sad sight; what was once the bastion of human civilization has been reduced to a fraction of what it was. Nuclear families are now a rarity, hope is scarce, and individuality and responsibility are downplayed. The psychological toll of this is plain across the West, with rates of depression and other mental illnesses skyrocketing. You cannot expect Western values to stay afloat when you tear apart the very institutions, ideas, and beliefs they stand on. You can't have the West without God; it is as simple as that.