Why is the military the go-to career choice for so many advice-wielding civilians?
Just to get this out of the way: I’m not asking why so many people choose to join the military themselves.
It seems that some people are always ready to tell those around them to go into the military. It’s their go-to career advice for anyone and everyone. “You need to join the military.” No, I don’t think I do.
Unsolicited career advice is annoying in itself, but what exactly is so great about the military that people are so eager to suggest it? I never hear anyone actually in the military recommend it to others; if anything, I more often hear them complain about their pay, their benefits, and all the downsides of their jobs.
Is this a Southern thing, or does it happen all over the US? Does anyone have an idea why? Am I the only one who finds this odd and borderline aggravating?