A rare case in which a news outlet actually tries to undermine its own poll by pointing out problems in its methodology. Is that because it’s terrible for the Unicorn Prince, or because, to be fair, it’s hard to draw any conclusions from numbers this vague?
In 2010, Nate Silver of the New York Times blog FiveThirtyEight wrote the article “Is Rasmussen Reports biased?”, in which he mostly defended Rasmussen from allegations of bias. Later that year, however, Rasmussen's polling results diverged notably from those of other mainstream pollsters, which Silver labeled a "house effect". He went on to explore other factors that may have explained the effect, such as the use of a likely voter model, and claimed that Rasmussen conducted its polls in a way that excluded the majority of the population from answering. Silver also criticized Rasmussen for often polling races only months before the election, which prevented those races from having polls just before Election Day that could be assessed for accuracy. He wrote that he was “looking at appropriate ways to punish pollsters” like Rasmussen in his pollster rating models who don’t poll in the final days before an election.
After the 2010 midterm elections, Silver concluded that Rasmussen's polls were the least accurate of the major pollsters in 2010, having an average error of 5.8 points and a pro-Republican bias of 3.9 points according to Silver's model. He singled out as an example the Hawaii Senate race, in which Rasmussen, in a poll completed three weeks before the election, showed incumbent Daniel Inouye only 13 points ahead, whereas in actuality he won by a 53-point margin – a difference of 40 points from Rasmussen's poll, or "the largest error ever recorded in a general election in FiveThirtyEight’s database, which includes all polls conducted since 1998". Silver named Quinnipiac University Poll as the most accurate poll of the election cycle. However, according to RealClearPolitics, in toss-up races where both Rasmussen Reports and Quinnipiac polled, the Rasmussen Reports final poll was closer to the mark in every race. The two firms projected the same candidate to win every race but the Florida gubernatorial race, where Rasmussen correctly projected Rick Scott's victory, while Quinnipiac showed Alex Sink with the lead.
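The arithmetic behind figures like these is straightforward: a poll's error is the gap between its projected margin and the actual margin, and a "house effect" or bias shows up as the average *signed* gap across many races. A minimal sketch (this is an illustration of the general idea, not Silver's actual rating model, which weights by sample size, recency, and other factors):

```python
# Illustrative sketch of poll error and house bias, NOT FiveThirtyEight's
# actual model. Margins are in percentage points; the sign convention here
# is incumbent-party margin (positive = incumbent ahead).

def poll_error(poll_margin: float, actual_margin: float) -> float:
    """Absolute error of a single poll, in points."""
    return abs(poll_margin - actual_margin)

def house_bias(poll_margins: list[float], actual_margins: list[float]) -> float:
    """Mean signed error across races; a consistent sign suggests a lean."""
    signed = [p - a for p, a in zip(poll_margins, actual_margins)]
    return sum(signed) / len(signed)

# Hawaii Senate 2010: Rasmussen showed Inouye +13; he won by 53 points.
print(poll_error(13, 53))  # → 40.0, the miss cited above
```

The same `house_bias` helper, run over a pollster's full slate of final polls, is the kind of summary statistic behind the "pro-Republican bias of 3.9 points" figure, though the real model is considerably more involved.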
TIME has described Rasmussen Reports as a "conservative-leaning polling group". According to Charles Franklin, a University of Wisconsin political scientist who co-developed Pollster.com, “He [Rasmussen] polls less favorably for Democrats, and that’s why he’s become a lightning rod.” Franklin also said: “It’s clear that his results are typically more Republican than the other person’s results.”
The Center for Public Integrity listed "Scott Rasmussen Inc" as a paid consultant for the 2004 George W. Bush campaign. The Washington Post reported that the 2004 Bush reelection campaign had used a feature on the Rasmussen Reports website that allowed customers to program their own polls, and that Rasmussen asserted that he had not written any of the questions or assisted Republicans.
Rasmussen has received criticism over the wording in its polls. Asking a polling question with different wording can affect the results of the poll; critics allege that the questions Rasmussen asks in its polls are skewed in order to favor a specific response. For instance, when Rasmussen polled whether Republican voters thought Rush Limbaugh was the leader of their party, the specific question it asked was: "Agree or Disagree: 'Rush Limbaugh is the leader of the Republican Party -- he says jump and they say how high.'"
Source at Wikipedia