What do you think it means when you hear people say things like “she doesn’t act or behave like a woman”? What do you think it means to “act like a woman”?

Do you think society is telling women to be passive, submissive, and to stay in the background, while at the same time advocating for them to embrace feminism? What do you think it means to hear someone tell a guy to “act like a man”?

Are they saying that men should be more aggressive, in your face, and active than women? Does society indirectly support the idea of women being the weaker sex? Does society indirectly expect more from men than from women? If so, doesn’t that contradict the idea of men and women being equal?