Christians and Others: Is "indoctrination" necessarily a bad thing?

Pastor Dick

The word "indoctrination" has taken on a negative connotation in English. And yet, isn't it simply the act of teaching doctrine to a disciple? Isn't indoctrination essential to any organized religion? And is it necessarily bad?

1. To instruct in a body of doctrine or principles.
2. To imbue with a partisan or ideological point of view.

from thefreedictionary.com
 