NEILSTEIN SOUNDSCAM: PONDERING A NEW BAND RANKING SYSTEM
In response to the last Neilstein Soundscam column, MS commenter “Ogre” pointed out the alleged hypocrisy of my focusing on music sales each and every week while simultaneously celebrating the death of the CD and the rise of streaming services. Periphery’s Misha Mansoor counter-commented with a good response to that accusation: the industry is still very old-fashioned and looks to first-week numbers as one major relative gauge (though not the only one) of a band’s success and worth for tours and the like. While I agree with Misha and appreciate him getting my back, Ogre does have a point: analyzing album sales seems increasingly irrelevant, and whether or not you think those sales matter, they definitely don’t capture the whole picture.
There needs to be some kind of a new ranking system that takes into account, for lack of a better phrase, the real world — EVERYTHING a band does — as opposed to just the cottage industry of album sales. Since the new release season for 2011 is all but over and Nickelback’s new album was the best-selling “heavy” record for the second week in a row (and since re-hashing and re-re-hashing how much we all hate Nickelback has stopped being fun), let’s dive deeper into what such a new system could look like.
What I’m imagining is some kind of Power Ranking Chart analogous to the way the BCS standings are calculated in college football. For those who don’t follow college football, the BCS (Bowl Championship Series) is a computer-generated ranking published every Monday, after all the previous week’s games (most of which take place on Saturday) are completed. The weekly BCS rankings take into account several factors: a team’s won-loss record, the strength of the teams they’ve beaten and lost to, how convincing those wins and losses were, their ranking the previous week relative to other teams that won and lost and how THOSE teams performed, and a whole host of other variables that frankly make my head spin. All of that data is fed into a computer, and bam: a very complicated algorithm spits out a chart. To add a human element that accounts for so-called intangibles a computer could miss, the final BCS ranking also incorporates two ranking systems compiled by real, live people: the Associated Press Top 25, voted on mainly by sports writers and broadcasters, and the USA Today Coaches Poll, compiled by, you guessed it, college coaches. Those two polls are reconciled with the BCS computer rankings to produce the final BCS chart every Monday.
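To make the “reconciling polls with the computer” idea concrete, here’s a toy sketch of one way to average several ordered rankings into a single list. To be clear, this is NOT the actual BCS formula (which is far more involved); the function name and the simple position-averaging scheme are my own illustration.

```python
def combined_rank(computer: list, ap_poll: list, coaches_poll: list) -> list:
    """Reconcile three ordered ranking lists into one.

    Each argument is a list of names ordered best-to-worst. A team's
    combined score is its average position across the three lists
    (lower is better); the result is re-sorted by that average.
    All teams must appear in all three lists.
    """
    teams = set(computer)
    avg_position = {
        t: (computer.index(t) + ap_poll.index(t) + coaches_poll.index(t)) / 3
        for t in teams
    }
    return sorted(teams, key=lambda t: avg_position[t])
```

So a team ranked #2 by the computer but #1 in both human polls would leapfrog a team the computer loved but the voters didn’t.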
How could we adapt this model for music? Off the top of my head, here are a few categories that should feed the computer ranking: tour income, merch income, pre-sale numbers, social network activity, press activity, radio play, streaming/cloud service plays, and torrent/P2P downloads. For the time being, album sales should be included as well; they’re still relevant, even if that relevance has diminished. The algorithm would have to assign weights to those categories, and the formula used to calculate them could be ever-changing to reflect changes in the industry (e.g. as album sales become less and less important and streaming becomes more and more important, the weights assigned to each could be adjusted so the final number remains an accurate representation of a band’s overall worth). Based on the score the computer generates, bands would be assigned a ranking between 1 and 200, as they are now. I’m not sure whether a human element would be necessary for this kind of ranking system, but if we were to introduce one it could consist of rankings submitted by influential writers and bloggers such as myself (hardy har!), industry pundits like Bob Lefsetz, and generally anyone “in the know” without a financial interest in any band on the list. It might be best to forgo the human element altogether to eliminate possible corruption from the equation, but I’m just throwing it out there for the sake of comparison to the BCS. Bands could be further broken down by genre, as they are on SoundScan now, and categories like “Heatseekers” and “Top New Artists” could remain intact as subsets of the overall list.
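For the nerds in the audience, the weighted-score idea above can be sketched in a few lines of code. The category names and weight values here are entirely hypothetical placeholders; the whole point is that the weights would live in one place and get tuned as the industry shifts.

```python
# Hypothetical weights for each data category (chosen to sum to 1.0 here,
# but any normalization scheme would do). Tweaking the industry's
# priorities means editing this one table.
WEIGHTS = {
    "album_sales": 0.15,
    "tour_income": 0.20,
    "merch_income": 0.10,
    "presales": 0.05,
    "social_activity": 0.15,
    "press_activity": 0.05,
    "radio_play": 0.10,
    "streaming_plays": 0.15,
    "p2p_downloads": 0.05,
}

def power_score(metrics: dict) -> float:
    """Combine a band's per-category metrics (assumed pre-normalized to a
    common 0-100 scale) into a single weighted score. Missing categories
    count as zero."""
    return sum(WEIGHTS[cat] * metrics.get(cat, 0.0) for cat in WEIGHTS)

def rank_bands(bands: dict) -> list:
    """Sort band names by score, highest first: position 1 through N."""
    return sorted(bands, key=lambda name: power_score(bands[name]), reverse=True)
```

Under these made-up weights, a band crushing it on tour and on streaming services could outrank a band with strong album sales but nothing else going on, which is exactly the behavior the chart is supposed to have.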
An added benefit of a system like this is that the emphasis would come off first-week numbers, because… first week of what? Album releases in general would stop mattering as much as they do now. A band could shoot up the chart even without a new album out, on the strength of a booming tour; a new album that’s selling and getting radio play could push the band up even further. The downside is that bigger bands could dominate the top of the chart even when they’re doing nothing at all, but is that really so different from today’s tactics of releasing a greatest hits comp or pushing catalog albums with a blowout sale? At least this kind of chart would paint a more accurate picture of relative popularity, and the “New Artist” chart would still give us a meaningful place to look for emerging artists without Metallica dominating the top spot every week.
Who would construct such a system, and how they’d do it… well, we’ll cross that bridge later.
SO, what do you think? Nielsen SoundScan is ridiculously outdated and needs to go. Whether it’s the above system or something else, we need a new ranking that accurately reflects a band’s stature with regard to ALL of the elements of their career.