
I would like to challenge Christian churches today, which have mostly sidelined the matter of healing from their efforts to live as Christians, apparently leaving it up to the medical profession alone to fulfill that task, and in fact making it almost mandatory to do so.

Clearly healing was one of Jesus's primary instructions to us, along with other matters that Christian churches do work mightily to address, like spreading the word of God and giving to the poor.

A young minister in a Protestant church I recently attended asked the youngsters in the children's sermon what Jesus did for the people when he was alive. They dutifully described everything except healing and raising the dead!

Why is that no longer a part of Christianity? (Is the Apostles' Creed, which also leaves it out, partly to blame?)