Animalism

/ˈænɪməlɪzəm/ noun

A doctrine or belief system that emphasizes humanity's animal nature; the theory that humans are fundamentally animals driven by instinct.

From 'animal' plus '-ism' (denoting a doctrine or practice); the term developed in the 18th–19th centuries as philosophers debated instinct versus reason in human nature.
