Director

/dəˈrɛktər/ noun

Definition

A director is a person in charge of guiding and controlling how something is done, such as a film, a play, or a company. A director makes key decisions and coordinates the work of others.

Etymology

From Latin 'director' meaning 'guide, teacher, ruler', from 'dirigere' meaning 'to set straight, direct'. The word came into English to describe someone who gives direction or guidance.

Kelly Says

A director is literally a 'pointer'—someone who points everyone else in the same direction. Whether it’s actors on a stage or workers in a company, the job is the same: align all the moving parts.

Ethical Language Guidance

Gender History

"Director" in film, theater, and corporate contexts has historically been associated with men, with women underrepresented and often overlooked in awards, funding, and leadership positions. Language around "director" has sometimes assumed a male default.

Inclusive Usage

Use "director" as a gender-neutral term and avoid assuming a director is male; mention gender only when relevant (e.g., discussing representation).

Inclusive Alternatives

leader, head, manager, filmmaker

Empowerment Note

When discussing directing in arts or business, include women directors and their work, especially where they have been excluded from canons or leadership narratives.
