I mean when/if everybody gets over thinking that being homosexual is such a sin and a disgusting thing. To me, there's really no difference between it and heterosexuality, but people still don't seem to get that, which bothers me. Being gay, bi, or trans is something that exists deep in people's hearts, and we can't help being attracted to the same sex any more than heterosexual people can help being attracted to the opposite sex, in all the same ways. (The same goes for transgender people; from my understanding, they don't just wake up one morning and decide, "Oh, I'm going to be the opposite sex today, no big deal..." It's actually something they struggle with every day, though people never understand that unless they take the time to.)
I feel that completely heterosexual people/families who never have anything to do with it don't really understand what it means to be homosexual. Stereotypes are all they know, and it's very frustrating; honestly, many of them never take the time to learn the real facts unless somebody close to them is gay or trans.
But once everybody gets over thinking so badly of people being LGBT, do you think it should be taught (at least briefly) in schools' health classes, so that everybody has more of an understanding of homosexuality and other LGBT identities?
To me, teaching that is just as important as teaching the "ins and outs" of heterosexuality, because it's not something that can be taken lightly. Does anybody else agree? I'm sort of surprised that non-LGBT people these days still don't understand, but I think the main reason is that it's never taught to them directly in the right way, such as in school.
The only way they naturally "learn" about it is from the people around them, who are, most of the time, against it and make fun of it, which is not the way it should be. Perhaps explicit, factual education about homosexuality would help more people understand that it's not just a laughing matter.