There's little to indicate that God heals our bodies when we become Christians. We can experience emotional healing, freedom from addictions, and spiritual growth. But physical healing of the body, like recovering eyesight, walking when you couldn't before, regaining hearing after deafness, or having failed organs suddenly start working: those kinds of miracles rarely, if ever, happen.
Hi barncat, I don't think bodily healing is common in this day and age, but I don't know that it's super rare either. Thanks for commenting :)