Why don't people acknowledge that 'The South' has changed?
Whenever someone wants to talk about a group of people who are uneducated, behind the times, or racist, why do they use the South?
Why are people ignoring the changes that have happened in the South during the past thirty years?
We now have tech, medical, and financial industries. We have a huge migration to the region because of our cost of living, so we have a lot of different cultures and races that manage to get along just fine. A student in Kentucky was just awarded a Rhodes Scholarship and her parents didn't complete high school, so shouldn't that be a beacon of change?
Why do people ignore racism in the North, in red states out West, and illiteracy in California?
Why are people still clinging to this antiquated idea that the South has nothing but farmers, Bubbas, and black people who say 'massa'? Why is it so acceptable to make a blanket statement about the South (one that usually isn't true) without a single person standing up to correct them?