I graduated from Florida State University with a Bachelor of Science in Political Science and English Literature. I'm very proud of the degree, but I favor my English major. That's why I now teach English. I could teach Civics, which is perhaps a traditionally more manly discipline, but I prefer English Literature. I just really like the language.
I've been single for a while now, and I'm not sure if women are turned off by my profession. I don't make very much money, but I barely have any bills either--a one-room townhouse, a dog, some fuel for my car, and food money. I don't hesitate to tell women what I do for a living, and I don't think it's ever hurt my chances, but I have been single for a while.
Do you think women are turned off by the fact that I teach high school English? Is there anything inherently wrong with guys teaching English?