I'm hoping someone can help me out with this. I have no idea why Republicans are so fascinated with such an arbitrary claim. What defines being center-right?
Even if the US happens to be a center-right country, who cares? Thanks to the Tea Party, Republicans are being pushed further right, so are they moving away from what they claim the country wants? If so, why push this idea?
I suppose if I had to engage in a debate over this term, I would say that the Republican zeal here rests on a misunderstanding of elections. Polls don't show the US as a center-right country if that definition is based on political affiliation. Voting patterns tend to be center-right, but the people themselves are not.
Regardless, I just don't get it. It reminds me of the guy who is so infatuated with his own fantasy football team that he thinks other people care. I don't care about how Republicans see the country. I do care how they govern it. But Republicans always need a simple-minded slogan to rally around, regardless of whether that slogan is rooted in reality, and I guess "center-right" is the new thing for the simpletons.
The reason that Republicans are so obsessed with that particular sound bite is really simple.
It's their way of defending what they're doing.
Declaring the United States to be "center right" is basically a way to marginalize the left.
After all, if the country is "center right," then liberalism is some sort of extremism, little more than a fringe element.
It doesn't matter that even in a "Republican year," in contests with huge margins for the Republican candidate, at least one person in three voted for the Democrat.
The really odd thing is that by insisting we are center-right, they are admitting that the US sits on the extreme right relative to the rest of the world. That makes liberals more "normal" as far as the rest of the world is concerned.