Which book should you read first - the Bible or the Dictionary?

It strikes me that Christians need to learn what 'faith' really means:
belief without evidence,
i.e. NOT 'a wonderful ability to magically know the truth'.

How can you tackle ANY book if you allow anything OTHER than the dictionary to tell you what that book's words mean?
 