I always tell people that you haven't found the right doctor until you find one who routinely talks about nutrition. Western doctors are indoctrinated by universities into practicing a "big pharma style of medicine". Western medicine treats the symptoms of disease but never addresses the root cause: the toxic conditions within the body that create disease. Eastern medicine, nutrition, and spiritual healing should be practiced before turning to western medicine as a last resort.
Totally agree
Nutrition is the key to medicine!
A great documentary series that helps you understand the flaws in western medicine is The Truth About Cancer. Episode one is posted below: