A doctor who is trained to care for people’s teeth and gums, including cleaning, repairing, and removing teeth.
From French *dentiste*, from *dent* meaning 'tooth', from Latin *dens, dentis* ('tooth'). The specialized term emerged as dentistry developed into a distinct medical profession.
The word is literally 'tooth-ist'—a specialist of teeth. Dentistry is one of the few medical areas almost everyone visits regularly, which makes the dentist a kind of 'maintenance engineer' for your mouth.
Dentistry, like many medical professions, was historically male-dominated, and language and imagery often assumed male dentists and female assistants or patients. Women dentists faced barriers to training and professional recognition.
Use 'dentist' as a gender-neutral professional term and avoid assuming a dentist's gender; specify pronouns only when known or relevant.
Women dentists and dental researchers have significantly expanded access to oral healthcare and diversified the profession.