I don't get why people are so offended by nudity in films, especially when there's nothing sexual about it. We all have bodies, so why should we be ashamed of them? I'm not saying it's okay to whip out your goods on a public street, but why is nakedness on television such a taboo...