Dermatophyte

/dɜːˈmætəfaɪt/ noun

Definition

A fungus that causes infection of the skin, hair, or nails, such as the organisms responsible for ringworm or athlete's foot.

Etymology

From Greek 'derma' (skin) + 'phyton' (plant), coined in the 1880s, when fungi were still classified as plant-like organisms and were first recognized as the cause of these skin infections.

Kelly Says

Dermatophytes are among the few fungi adapted to live on dead, keratinized tissue: they digest the keratin in skin, hair, and nails, which is why their infections stay superficial and rarely cause systemic illness.
