I was talking to one of my Southern friends today, and he was saying how "the South would rise again..." At first I thought he was crazy, but then I started thinking about sports. A Southern team won the Super Bowl and the NCAA Championship this year, could easily win the World Series this year, and will likely win the NBA Championship next year and for many years to come. That got me wondering... has the South risen again in the form of professional sports?