I've looked into a lot of schools and the types of business majors they offer, but after digging deeper I was told that being a business major doesn't really prepare you to own or manage a business; it's more something you learn through experience. I know a few colleges have fairly rare majors tailored to specific goals, so I was hoping someone here might know of a major I could look up that would actually teach me how to run a business, or a college that offers one. I know I could read books on my own time, but I'd prefer to have a degree.