What role do women play in societies around the world? We speak of repressive cultures elsewhere while casting the West as a safe haven for women. Is that an accurate depiction? Are women in the West truly freer from the influence of patriarchy? What comes to mind when we think of a Middle Eastern woman? Is it not time we break down our stereotypes and seek to understand instead?