Only if women started believing what they're seeing, and yes, they would: girls would grow up with it and accept it.
Men ARE objectified as rapists, woman beaters, and bumbling idiots all the time, mainly in the media, in commercials, and in women's pop culture, and it DOES bother me. A lot of women think men are dangerous and evil. Little girls might also grow up thinking they're in danger, and it all obscures the truth that men have good qualities to offer women; girls and women end up seeing only the danger, not the good.
The EXACT same goes for women marketed as sex objects/ornaments: it makes boys grow up seeing only what the media wants them to see, and instead of noticing the genuine qualities girls and young women have to offer, they only think "I wanna get laid."
The real problem I have is that so many women and ALL feminists deny that men are objectified too, just in a different way, and that it is equally damaging to society.