To remove Christian influence, beliefs, or practices from a place, institution, or society.
Formed from the prefix 'de-' and the verb 'Christianize.' The word emerged in 18th-century European intellectual discourse as Enlightenment thinkers questioned religious authority.
Some countries dechristianized their schools by removing religious instruction and prayer, a shift that unfolded gradually across much of Western Europe and sparked intense debates about values and culture.