Naturism

• Naturism – The belief or doctrine that attributes everything to nature as a sanative agent.