Dechristianize

/diːˈkrɪstʃənaɪz/ verb

Definition

To remove Christian influence, beliefs, or practices from a place, institution, or society.

Etymology

Formed from the prefix 'de-' (expressing reversal or removal) and 'Christianize.' The word entered European intellectual discourse in the 18th century, as Enlightenment thinkers questioned religious authority, and is closely associated with the dechristianization campaigns of the French Revolution.

Kelly Says

Some countries dechristianized their schools by removing religious instruction and prayer—a shift that happened gradually in much of Western Europe but sparked huge debates about values and culture!
