A part of the body that is not the normal or expected shape, often present from birth or caused by injury or illness.
From Latin 'deformis' (de- 'away from' + forma 'shape, form'), meaning 'misshapen' or 'disfigured.' It entered English in the 14th century, originally used to describe any distortion of natural form.