Football

/ˈfʊtbɔːl/ noun

Definition

Football is a team sport played with a ball that is kicked, carried, or thrown, depending on the code being played. In most of the world, the word refers to association football (soccer); in some countries it refers to American football or other codes.

Etymology

“Football” joins “foot” and “ball,” originally describing games played on foot (not on horseback) with a ball. Different cultures developed their own rules, leading to modern soccer, rugby, American football, and others.

Kelly Says

The global argument over whether “football” means soccer or American football is built into the word’s history. All versions share the same core idea: a ball, a field, and people on their feet battling for territory.

Ethical Language Guidance

Gender History

Football (both association and American) has been historically framed as a masculine domain, with women’s participation restricted, underfunded, or trivialized. Media and institutional support have long centered men’s leagues, sidelining women players and fans.

Inclusive Usage

Specify gender when relevant (e.g., “women’s football league”) and avoid assuming players, coaches, or fans are men by default. Use neutral terms like “players,” “team,” and “fans” instead of gendered collective labels.

Inclusive Alternatives

soccer, American football, the sport, the team sport

Empowerment Note

Acknowledge the achievements of women’s football—such as pioneering women’s clubs and national teams—and how their success has challenged assumptions about who “belongs” in the sport.

Get the Word Orb API

Complete word intelligence in one call. Free tier — 50 lookups/day.
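As a minimal sketch of what a single lookup call might look like: the base URL, endpoint path, and parameter names (`q`, `key`) below are assumptions, not the documented Word Orb API interface; consult the actual API reference before use.

```python
# Hypothetical sketch of building a Word Orb API lookup URL.
# The base URL and parameter names ("q", "key") are assumptions.
from urllib.parse import urlencode

BASE_URL = "https://api.wordorb.example/v1/words"  # placeholder, not the real host


def build_lookup_url(word: str, api_key: str) -> str:
    """Build the URL for one word lookup (one lookup = one call)."""
    query = urlencode({"q": word, "key": api_key})
    return f"{BASE_URL}?{query}"


print(build_lookup_url("football", "demo-key"))
# → https://api.wordorb.example/v1/words?q=football&key=demo-key
```

On the free tier (50 lookups/day), a client would typically count requests locally and stop before hitting the limit rather than relying on error responses.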