I am really struggling with this right now. To be honest, I have always had a hard time sorting through the scriptures and beliefs that say if we have faith we will be healed. I have a niece who is very seriously ill, and her parents believe VERY strongly that she WILL be healed. They believe that because of their faith in God and their understanding of His will, she will be healed.
I am reading through the Bible, and I am in Luke right now, so I have read over and over the many stories where the words go something like "because of your faith, you are healed." I trust God's word to be truth 100%.
I have more of an I-have-faith-that-God-is-powerful-enough-to-do-whatever-He-wants-and-I-pray-that-it-is-in-His-will-to-heal type of faith.
I am praying to God to help me in this area, and if nothing else, I do know God loves to reveal Himself to us!
What have you all learned about this?