If you look at the US News college rankings (or any similar list, really), the top universities are all in Democratic-leaning states: Harvard, Yale, Stanford, MIT, Princeton, and so on. You have to go all the way down to #8 to find one in a Republican-leaning state (Duke, in North Carolina).
Do you think that Democrats simply provide more funding to education, which creates better universities, or do you think that people in these states vote Democrat because they are more educated?