To remove the social shame or negative attitudes attached to something, making it more acceptable to discuss or pursue openly.
From de- (removal, reversal) + stigmatize (to mark with shame or disgrace; itself from stigma + -ize); a late 20th-century coinage popularized by social movements and healthcare reform.
Public health campaigns to destigmatize HIV testing, therapy, and LGBTQ+ identities have saved lives by making people comfortable enough to seek help instead of hiding in shame, showing that shifts in language and attitude can have concrete health consequences.