Many say that the USA is doomed. What do you think? Will we pull off a Roman Empire (not conquering people...er...at least not calling it conquering), get invaded by "barbarians," and end up totally humbled and changed?
The USA will eventually fall, as has every great empire before it. I can't see it happening anytime in the near future, but eventually it will. It might not be barbarians, though...it could even happen from the inside...but somehow, one day, it will.