Lisa, I hear you...and get you! It is my conviction that Christianity, as the dominant religion in the West for centuries, co-opted the ethics and posture of "empire" rather than doing what I believe is core to the Christian faith: embodying the ethics and teachings of Jesus. And while I believe that the ethics of "empire religion" have led to historically tragic outcomes in how the Christian religion relates to the other religions and philosophies of the world, the current decline of Christianity in the West is a remarkable opportunity for the Church's ethos to be distilled down and refined to look more like the core principles and ethics of Jesus, not nationalism or imperialism. Jesus led with compassion, mercy, justice, and love, not the exclusionary and vilifying sentiment that you feel so deeply. Thanks for sharing.