Guest
A lot of magazines and stuff often have articles saying "how to have a healthy sex life". I mean, I know that having sex is a natural act and everything, but are people who have sex often healthier than those who don't? I mean, does it "feel" healthy? I know that there are numerous diseases and stuff out there, so I'm just wondering. I'd appreciate it if only experienced people would answer. And btw, how do you have a healthy sex life?