Metacritic Defends Self from Gamasutra 'Study' Report

Score collector goes to war with actual review writers

A chap called Adams Greenwood-Ericksen from somewhere called "Full Sail University" (an "entertainment media institution") gave what he called 'A Scientific Assessment of the Validity and Value of Metacritic' at GDC in SF this week, in which he 'revealed' how the score-scraper works. Metacritic begs to differ...

First up, Adams is not actually a scientist. He is "User Experience Consultant at Full Sail Institute for Research in Entertainment (FIRE)". His boldest claim is that Metacritic assigns greater weight to certain sites' game scores. Yes, certain big sites get more weight in what nobody at all ever thought was a totally objective view of all review scores.

Gamasutra has reported that "Metacritic confirmed to Greenwood-Ericksen during the course of his research that the site applies different weightings to incoming critics and publications' reviews in order to calculate its 'averaged' numerical score for any particular title."

Metacritic would not, however, reveal the weighting. No one knew why... yet.
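For context, the mechanics being guessed at are not exotic: a score of this kind is just a weighted mean, with each publication's normalised review score multiplied by its weight, summed, and divided by the sum of the weights. A minimal sketch in Python, with publication names, weights and scores invented purely for illustration (they are not Metacritic's real, undisclosed values):

```python
# Invented publications and weights -- NOT Metacritic's actual values.
weights = {"BigSiteA": 1.5, "BigSiteB": 1.2, "SmallBlogC": 1.0}

# Review scores already normalised to a 0-100 scale.
scores = {"BigSiteA": 90, "BigSiteB": 80, "SmallBlogC": 60}

def metascore(scores, weights):
    """Weighted mean: sum(weight * score) / sum(weight)."""
    total = sum(weights[pub] * scores[pub] for pub in scores)
    return round(total / sum(weights[pub] for pub in scores))

print(metascore(scores, weights))  # 79, versus 77 for an unweighted average
```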

Adams and his students, Gamasutra reports, "then set about modelling the weightings based on data pulled from the site. Finally, after six months of work, the researchers compared their modeled scores to the actual scores and discovered that across the 188 publications that feed into Metacritic's video game score work, their findings were almost entirely accurate."
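How do you model an undisclosed weighting? One plausible approach (our assumption; the presentation's actual method isn't detailed here) is to treat each game's Metascore as a weighted mean of its review scores and estimate the unknown weights by least-squares regression over many games. A rough sketch with fabricated data:

```python
import numpy as np

# Rows: games; columns: review scores from 4 hypothetical publications (0-100).
# Real data would span hundreds of games and 188 publications, with gaps
# where a publication didn't review a particular game.
X = np.array([
    [90, 85, 70, 60],
    [75, 80, 65, 55],
    [88, 90, 80, 70],
    [60, 65, 50, 45],
    [95, 92, 85, 75],
], dtype=float)

# Published Metascores for those games (fabricated to fit this example).
y = np.array([80, 71, 84, 57, 90], dtype=float)

# Ordinary least squares: find w minimising ||X @ w - y||.
# The fitted w estimates each publication's relative influence on the score.
w, *_ = np.linalg.lstsq(X, y, rcond=None)
print(w / w.sum())  # weights as fractions of the total
```

Compare the modelled scores (X @ w, rounded) against the published ones and you have exactly the kind of accuracy check the researchers describe.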

Metacritic took umbrage. In a hilarious turn of events, it's not pissed at Adams and his team but at Gamasutra. It took to Facebook to respond, using words such as "guesses are wildly, wholly inaccurate".

Let's not edit Metacritic though... here's the wording:

"Today, the website Gamasutra "revealed" the weights that we assign to each gaming publication (for the purpose of calculating our Metascores), based on a presentation given at the Game Developers Conference this morning. There's just one major problem with that: neither that site, nor the person giving the presentation, got those weights from us; rather, they are simply their best guesses based on research (the Gamasutra headline is misleading in this respect).

"And here's the most important thing: their guesses are wildly, wholly inaccurate. Among other things:

"* We use far fewer tiers than listed in the article.

"* The disparity between tiers listed in the article is far more extreme than what we actually use on Metacritic. For example, they suggest that the highest-weighted publications have their scores counted six times as much as the lowest-weighted publications in our Metascore formula. That isn't anywhere close to reality; our publication weights are much closer together and have much less of an impact on the score calculation.

"* Last but definitely not least: Our placement of publications in each tier differs from what is displayed in the article. The article overvalues some publications and undervalues others (while ignoring others altogether), sometimes comically so. (In addition, our weights are periodically adjusted as needed if, over time, a publication demonstrates an increase or decrease in overall quality.)"

Read it here.

Comments

ergo 28 Mar 2013 20:10
"...nobody at all ever thought was a totally objective view of all review scores."

Except the publishers who tie bonus checks to those scores and won't pay out even if you hit sales milestones if your score is below a certain level? Glad you cleared that up for us, Spong.