Really Now?

I don't get it. If religion is the root of all evil (an opinion held by some), why have religious education? What is the purpose of it? Does it put religion in a negative, positive, or fair light? What if the teacher is biased? Is atheism taught in such a course? If so, how is it taught?