Body image is the thoughts and feelings a person has about their own body. Having a positive body image affects self-esteem. Studies show that the majority of women and girls in Western societies have a negative body image. The American ideal is to be thinner than is normal or healthy. Men and boys are thinking more about their bodies and wanting to be more muscular. Unrealistic ideals are learned from parents, friends, and the media.[1]
In non-Western cultures, body image does not have the same meaning. In some societies, people think of themselves as part of a group, not as individuals. Where getting enough food is a problem, growing thinner is seen as unhealthy. The Westernization of cultures has decreased positive body image worldwide.[2]
A 2007 report by the American Psychological Association found that a culture-wide sexualization of girls was contributing to increased anxiety among females associated with body image. The report also found that women worry more about their body image than men do.[3]