For as long as America has been a country, the straight white American man has been king of the hill. But as society changes and culture evolves, the ground beneath that hill is shifting. Economically, physically and emotionally, many American men are fighting to maintain a foothold.
“What it means to be a man today is different than what it meant 20 years ago,” says James O’Neil, PhD, a psychologist at the University of Connecticut who studies gender role conflict.