Professional critics may think of themselves as defending true art from the savage tedium of the marketplace (and they may be right about that), but most people read record reviews hoping to buy something good, or avoid buying something lousy: They want an in-depth opinion from someone who’s good at forming and presenting opinions. In other words, they’re looking for qualitative analysis. And that can be extremely helpful: To name just one example, I’ll be forever grateful to Maura Johnston for hipping me to Sleigh Bells. But one limitation of qualitative analysis is that respondents (like torture victims) tell you what they think you want to hear, and in the case of music critics that results in an oddly non-conformist conformism: As Lester Bangs pointed out forty years ago, anyone looking to make a name in the field will seek out stuff the masses aren’t interested in, and try to get them interested in it. Over time - as the scribes push their pet acts, and their colleagues don’t want to be seen as falling behind - groupthink develops, and the next thing you know there’s critical consensus on boring, trivial acts like the BLAND (Beach House, LCD Soundsystem, Arcade Fire, The National and Deerhunter) Class of 2010.
Being aware of these pitfalls, market researchers usually seek to combine quantitative and qualitative analysis, and that’s exactly what Metacritic tries to do: filter all the blathering of critics into recommendations that a consumer can use. The trouble is, their methodology not only doesn’t cut through groupthink, but actually celebrates and elevates it: flavor-of-the-month acts clutter up the top of their list, and those also tend to be the artists you’re already familiar with, because every other article is about Frank Ocean or Jack White. So it ends up being a technology-based version of the Pazz & Jop Poll: useful if you need to know who critics are hyping, not useful if you’re looking to find compelling art that might enrich your life or at least inspire you to put down that bag of chips and get off the couch.
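If you want to see that failure mode in miniature, here’s a back-of-the-envelope Python sketch. Every album name, score, and the four-review cutoff below are invented for illustration; this is not a claim about Metacritic’s actual formula. Rank albums by the average of their critic scores, require a minimum number of reviews before an album gets a score at all, and the widely covered consensus pick tops the list while the barely reviewed gem never shows up.

```python
from statistics import mean

# album -> critic scores on a 0-100 scale (all numbers invented for illustration)
reviews = {
    "Flavor Of The Month": [88, 90, 85, 87, 91, 89],  # every outlet covers it, everyone agrees
    "Obscure Gem":         [95, 92],                   # loved, but barely reviewed
    "Divisive Record":     [98, 45, 97, 50, 93],       # splits the room
}

MIN_REVIEWS = 4  # hypothetical cutoff: too few reviews, no aggregate score at all

# Rank by plain averaging: broad consensus coverage is rewarded, scarcity is punished.
scored = {album: mean(scores) for album, scores in reviews.items() if len(scores) >= MIN_REVIEWS}
for album, score in sorted(scored.items(), key=lambda item: item[1], reverse=True):
    print(f"{album}: {score:.1f} ({len(reviews[album])} reviews)")
```

Run it and "Flavor Of The Month" wins comfortably, "Divisive Record" sinks, and "Obscure Gem" never gets a score at all.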
Surprisingly enough, it’s that venerable Village Voice poll itself that suggests a way out of this mess: They’ve taken the annual vote numbers and thrown them up on a sortable page that lets you see, for example, which records were picked in the Top Ten by only one critic; which records were picked mostly by non-groupthink critics; and (interesting for a dweeb like me) which reviewers had the most similar lists to a given critic. That dataset only includes Top Ten picks and only covers five years, but it’s a big step in the right direction. We’re getting within hailing distance of the system I consider ideal: You enter your favorite albums, and a database matches you up with a critic whose taste is similar to yours but who (we assume) listens to a lot more stuff, thinks about it a lot more, and is thereby in a position to bring lots of amazing, transcendent, mind-exploding music to your attention. Wait, did I just give Metacritic that idea for free? Dang.
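The matching step itself wouldn’t require anything fancy. Here’s a minimal Python sketch of the idea, with made-up critic names and album titles: score every critic by how much their Top Ten picks overlap with yours (plain Jaccard similarity) and return the closest match. A real version would obviously want ratings, weighting, and more than Top Tens, but the core really is this simple.

```python
def jaccard(a: set, b: set) -> float:
    """Overlap between two sets of album titles: |intersection| / |union|."""
    union = a | b
    return len(a & b) / len(union) if union else 0.0

def closest_critic(my_picks: set, critics: dict) -> tuple:
    """Return (name, similarity) for the critic whose picks overlap most with mine."""
    return max(
        ((name, jaccard(my_picks, set(picks))) for name, picks in critics.items()),
        key=lambda pair: pair[1],
    )

# Invented critics and album titles, purely for illustration.
critics = {
    "Critic A": ["Album 1", "Album 2", "Album 3", "Album 4"],
    "Critic B": ["Album 3", "Album 5", "Album 6", "Album 7"],
    "Critic C": ["Album 1", "Album 3", "Album 8", "Album 9"],
}
my_picks = {"Album 1", "Album 3", "Album 9", "Album 10"}

name, score = closest_critic(my_picks, critics)
print(f"Closest critic: {name} (Jaccard similarity {score:.2f})")
```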
PS In case anyone’s interested, just two of my 2012 Top Ten albums received votes in P&J: Regina Spektor’s What We Saw From The Cheap Seats and Angel Haze’s Reservation. I was much more groupthink-y in 2011, when seven of my Top Ten received votes and my #1 pick won the poll.