The practice of treating or healing oneself, or the body's natural ability to heal itself without medical intervention.
From Greek 'auto-' (self) + 'therapy' (from 'therapeia', meaning healing). The term became prominent in 20th-century medical literature as physicians studied natural healing processes.