Rape Culture in America
Rape culture is a huge problem in America that few people focus on. Our society teaches young men that sexual violence is acceptable and teaches women that they are to blame.
Dear future President,
As a young woman living in America, rape culture scares me greatly. Rape culture is a phrase that describes the ways in which society blames victims of sexual assault and justifies male sexual violence. I do not want to grow up in a country where sexual assault is excused and considered normal. I do not want to be afraid to walk alone at night or be scared to be alone with a boy. As much as I try, I cannot escape rape culture; it is everywhere.
Rape culture is all around us: in music, TV shows, movies, laws, and more. Magazines objectify women and their bodies, making it seem okay for men to do the same. There are countless cases of victims of rape or sexual assault being called liars or told they're overreacting. Public figures who objectify and rape women are teaching the young generation that it is okay. People in our country believe rape is just something that happens, that it's inevitable and just the way things are. They believe it is the victim's fault because of what they were wearing or their body language. They think "boys will be boys," as if sexually assaulting someone is just part of growing up. They make jokes and trivialize rape, acting like it's not a problem. It is. This way of thinking is a plague infecting our country. Soon you will have the power to speak out and cure it. Instead of teaching young girls to be scared and to cover up, teach young men to respect all living things, including women.
Rape culture doesn't only affect women. Rape is something that could happen to anyone, male or female. It is a problem that affects everyone. So, future President of America, please be a President for everyone, and help end the way our society promotes rape culture.