You are viewing a single comment's thread from:

RE: Will American Football Be Dethroned?

in #sports · 7 years ago

I am sorry, but this post had me triggered a little bit, though not for the reason you might think.
Like, when will Americans catch up with the rest of the world and realize that the one true sport is FOOTBALL? And by football I mean real football, not the American version of rugby. It is big in every country in the world except the US and Canada. And who came up with the made-up term "soccer"? I absolutely hate that ridiculous-sounding invented word.
And why do Americans call the Super Bowl winner "World Champions"? How can they be World Champions when all the teams are American, plus one or two Canadian ones?

Speaking of world competition, how many people know the US didn't qualify for Russia 2018? They lost to an island nation that 90 percent of Americans couldn't find on a map, and all they needed was a draw.
The sad thing is, if America invested half as much money in football as it does in American rugby, it would be a world contender.

But instead, football isn't even a top-5 sport in the US.


Soccer/football is the world's sport, and justifiably so. Kids in the US play it widely, but it has never really caught on at the pro level with the reach the other major US sports have. Good point about the "World Champions" label. As you know, we Americans can be rather US-centric. :)