Is it still relevant to function as a United States of America? Why or why not?
One state is red, the next is blue (aside from the swing states), and so on. So what's the point of being a United States of America anymore? The opposite ends of the spectrum are so night and day that I no longer see where politics can agree or disagree; there never seems to be any room in the middle.
Socially speaking, what benefits or negative outcomes do you see arising if the United States decided to break apart into individual states?
**Disclaimer:** I’m just trying to promote discussion, so please, no personal attacks. This question came to me when I realized, through Facebook comments, that neither reds nor blues ever meet in the middle in a discussion. It nearly always ends with the Benghazi scandal being brought up, or Obama's this or that, or the liberal media's this or that.