Do women in America like to dine out?

tom h

Women in America love to eat, love to be taken out to eat, and love to be taken to nice places. Don't expect any favors in return, though.