Customer Discussions > top reviewers discussion forum

Researching rankings

Showing 1-22 of 22 posts in this discussion
Initial post: 18 Jan 2013 14:04:37 GMT
Last edited by the author on 18 Jan 2013 14:24:13 GMT
On another thread, somebody said that there is no proof for anything I say about Amazon's voting and ranking system. That person is a non-reviewer who hasn't done any research and probably never will. I therefore feel it is worth discussing research.

1) Disclaimer

It is worth confirming that nobody outside Amazon has been able to work out the algorithm used on the current ranking system. Some people have tried to work out individual aspects, such as how many votes a customer can give a reviewer, or the weight ratio of an unhelpful vote to a helpful vote. Some are adamant that their research "proves" that you can't give more than 5 negs to any customer, but I regard the source as unreliable and haven't seen the detail. I'll concede that it is plausible and even that it might apply in most cases, but no more than that. And I won't be carrying out testing to see if I can get 6 negs to stick, if those negs appear over several years and are interspersed with a far greater number of pozzies. That would be unethical.

2) The old ranking system was reverse-engineered

While the current system only displays tables showing the top 10,000 reviewers, the old system showed tables listing every ranked reviewer, down to the millions. By fiddling the page number in the URL, it was possible to identify the lowest ranked reviewers (the ranking basement) easily, and also to jump to whatever ranking one wanted to analyse. I did not do the early research on that ranking system; it was before my time. Others did, and explained the method. I posted that method in a blog post, but the blog is currently offline; it will eventually re-emerge as part of a website.
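The page-jumping trick described above is just arithmetic: if each ranking-table page lists a fixed number of reviewers, the page that contains any given rank follows directly. A toy sketch of that arithmetic (the page size of 25 and the function name are my own illustrative assumptions, not the old system's documented values):

```python
def page_for_rank(rank: int, per_page: int = 25) -> int:
    """Return the 1-based page number containing the given rank,
    assuming each ranking-table page lists `per_page` reviewers.
    The per_page value here is a hypothetical placeholder."""
    if rank < 1:
        raise ValueError("ranks start at 1")
    return (rank - 1) // per_page + 1

# With 25 reviewers per page, rank 1,000,000 would sit on page 40,000 -
# so editing the page number in the URL jumps straight to that region.
print(page_for_rank(1_000_000))  # -> 40000
```

The same formula run in reverse is why the lowest page number that still displayed reviewers revealed the "ranking basement".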

That system was reverse-engineered in all the main details, with only slight doubts about how many times a customer could vote for a reviewer and their votes still count. Everything else was known and could be confirmed using the known method by anybody who wanted to confirm it for themselves, up to the point when the current system was introduced.

3) Ranking tables restricted

When the current system was installed in America in October 2008, the old system remained and was updated alongside it - a situation that existed for a further three years and a few weeks. While we were no longer able to see tables beyond the top 10,000 on either system, we were able to compare rankings for each reviewer.

4) How the current system was analysed

Being able to compare the current and old systems made the early analysis a lot easier than it would otherwise have been, but it is possible to confirm the findings by other methods. To begin with, we compared the top 100s. About half of the old USA top 100 reviewers were also in the current top 100 when it was new, but a lot has changed since then - the figure is now under 30% and dropping steadily. Early researchers had three main types of reviewer to look at -

1) those in both the current and old top 100
2) those only in the current top 100
3) those only in the old top 100

Those who were only in one or the other top 100 varied considerably according to their ranking in the other system, so the first obvious points of research were those that had made the biggest improvements and those who had made the biggest falls. Looking at the profiles of these reviewers made it easy to see that those who had negative votes in large numbers were the most severely penalised. Inactive reviewers were also penalised, but to nothing like the same extent. It later emerged that those in the ranking basement started to move away from the bottom just a little. Joe Mac Guy once propped up the rest but is close to escaping from the bottom 100 due to his years of inactivity. Not that he would care, especially as with thousands of new reviewers added each day, his ranking still drops like a stone.

Over time, further analysis has been carried out, but now we can only compare reviewers in the same ranking system. As no two reviewers have the same mix of products, the same reviewing habits or the same pattern of votes on their reviews, we aren't truly comparing like with like, but we have learned the basic patterns. You can see those patterns by comparing reviewers with each other, if you find two reviewers who are similar in many ways, except for one thing.

For example, you can find plenty of reviewers with just one review each (one-hit wonders). If they have identical vote totals but different dates, the main difference in their ranking is due to the dates of their respective reviews - unless some other factor is involved. If they have identical dates but different vote totals, the main difference in their ranking is due to the voting on their respective reviews - unless some other factor is involved. If both votes and dates are identical, they might be very close in the rankings, but unless the researcher has been tracking the reviews and knows when the votes were cast, the difference in voting dates is unknown. In such cases, any ranking discrepancy may provide a clue.

If you compare enough reviews by one-hit wonders, you can either satisfy yourself that dates and votes are the only things that affect their rankings, or you can find something suggesting that another factor is involved. In doing research, it is best to keep all links so that you can go back and check - not least because more votes may have been added, or more reviews.
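The controlled-comparison idea above can be sketched as a small helper: hold everything equal except one factor, and only then attribute a ranking difference to that factor. This is purely illustrative - the tuple layout and wording are my own, not anything Amazon exposes:

```python
from datetime import date

def compare_one_hit_wonders(a, b):
    """Compare two single-review reviewers, each given as a
    (helpful_votes, unhelpful_votes, review_date) tuple, and name
    the factor that differs. Illustrative structure only."""
    same_votes = a[:2] == b[:2]
    same_date = a[2] == b[2]
    if same_votes and same_date:
        return "no visible difference (unknown factors, e.g. vote dates)"
    if same_votes:
        return "difference likely due to review dates"
    if same_date:
        return "difference likely due to voting"
    return "not a controlled comparison (both factors differ)"

# Identical 6/6 vote records, reviews a year apart:
print(compare_one_hit_wonders((6, 0, date(2012, 5, 1)),
                              (6, 0, date(2011, 5, 1))))
# -> difference likely due to review dates
```

The last branch matters most in practice: if both votes and dates differ, the pair tells you nothing, which is why the research needs many pairs rather than a handful.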

If, in doing research, you find something that doesn't fit the pattern, say so on a forum but keep your mind open to the possibility that there is an explanation that somebody else already knows. If there is no explanation, your finding could be the basis for further research.

5) Facts v speculation

Like I said earlier, nobody outside Amazon knows the algorithm, but we do know the pattern. So, for example, it is a FACT that unhelpful votes weigh more heavily than helpful votes, but their relative weighting is SPECULATION.
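To make the fact/speculation split concrete, here is a toy scoring model. The direction (negs weigh more than pozzies) reflects the FACT above; the 2:1 ratio is pure SPECULATION, which is why it is left as an adjustable parameter rather than hard-coded:

```python
def toy_review_score(helpful: int, unhelpful: int,
                     neg_weight: float = 2.0) -> float:
    """Toy model: unhelpful votes count against a reviewer more than
    helpful votes count for them. The 2.0 default is speculative -
    only the asymmetry itself matches observed behaviour."""
    return helpful - neg_weight * unhelpful

# Same net votes (+10 each) under equal weighting, but once negs
# weigh double, the cleaner record scores higher:
print(toy_review_score(10, 0))   # -> 10.0
print(toy_review_score(12, 2))   # -> 8.0
```

This is the kind of pattern the one-hit-wonder comparisons reveal: the ordering is reproducible even though the exact weight is not.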

Amazon's current ranking system is mysterious but that makes it intriguing to some of us.

Posted on 18 Jan 2013 14:27:46 GMT
The Leveller says:

Has it ever entered your thoughts that Amazon could be manipulating the rankings to 'some' extent by awarding positive/negative votes to certain reviewers, just to keep people on their feet so to speak? As in the ghost vote phenomenon mentioned on here a few times.

If you post around 10 or so reviews, sometimes within an hour they can all gain a plus vote. That said, I noticed one reviewer in the last week who washed their account, posted 80+ reviews overnight, and gained only a handful of + and - votes.

Any thoughts?

In reply to an earlier post on 18 Jan 2013 14:38:38 GMT
The Truth has posted at various times that it's a software glitch. I know this nonsense doesn't happen in America.

There are any number of scams on all Amazon sites. Washing reviews is not against the rules but while I understand people doing it to get rid of particularly nasty comments that Amazon won't remove, some people just do it to get rid of negs - and not just top reviewers.

When I reviewed an album last year, it had one other review and it was a good one. Other reviews eventually appeared but I wasn't impressed by them. Unusually for me, I watched that page most days for a while and the other review made it to 6/6 as mine did. I forgot about the page for a while and next time I looked, my review had more votes including some negs, but I didn't see the other review. Eventually, I found it among the reviews with no votes, so it had been washed. Going to the reviewer's profile page, I saw it was his only review. Sad.

In reply to an earlier post on 18 Jan 2013 14:39:18 GMT
The Truth says:
Don't take this the wrong way, it's a compliment, but I sometimes wonder if you have Asperger's - either that or you are a cyborg and part computer :-?

Was that 80-plus new reviews, Leveller, or did they wash and repost 80-plus reviews that had already been written and posted previously?

In reply to an earlier post on 18 Jan 2013 14:42:16 GMT
The Truth says:
True - I reckon it was a glitch (though of course I have no proof). It seems to have stopped now though... at least for me.

In reply to an earlier post on 18 Jan 2013 14:49:07 GMT
You aren't the first to mention Asperger's, but based on what I've heard and read about it, I think it's a load of bull anyway.

In reply to an earlier post on 18 Jan 2013 14:55:00 GMT
The Leveller says:

It was 80 odd new reviews. A mammoth effort of copy and pasting overnight (I don't sleep much). Same review on a set of 20 books, four times from different sellers/publishers. A lot were removed the next day due to uber negging by the looks of it.

Just a long time top reviewer trying to protect their position from dropping further I think.

Posted on 18 Jan 2013 15:56:14 GMT
Last edited by the author on 18 Jan 2013 15:57:59 GMT
Bob says:
In summary, does it matter? It is what it is. Unless anyone wants to manipulate the rankings, there is no need to know.
We review; we move up and down the rankings in a mysterious way.
I actually would not like to know how it all works, as if it were known, even more people would try to manipulate it and it would become even more meaningless.

For the first time ever, when I tried to post this I got one of those jumbled "captcha" words to enter before I could post. Perhaps Amazon are now worried about computer-generated votes/reviews/comments.

In reply to an earlier post on 18 Jan 2013 16:19:20 GMT
Damaskcat says:
They are doing that quite a lot at the moment - I've had to type the letters in twice so far today.

In reply to an earlier post on 18 Jan 2013 16:43:25 GMT
"It is worth confirming that nobody outside Amazon has been able to work out the algorithm used on the current ranking system."

This is exactly what I was saying and the only "fact" I was contesting.

There is nothing more to say.

In reply to an earlier post on 19 Jan 2013 09:31:01 GMT
Last edited by the author on 19 Jan 2013 09:38:44 GMT
I guess you ignored the rest of the post. But I see it isn't just Amazon that you are clueless about. You are also clueless about getting disability benefits.

In reply to an earlier post on 19 Jan 2013 09:38:27 GMT
I agree with the idea that the more that is known, the more likely people are to manipulate the system, although people who want to manipulate the system know enough already. It isn't necessary to know the precise numbers; the major factors are sufficient, and they were established long ago.

Nevertheless, people will always ask questions, like why do people with fewer reviews and fewer votes sometimes have a better rank than those with more reviews and more votes? That and other questions have been answered many times, not just by me, but there's a continuous stream of newbies who don't know the answers.

Posted on 19 Jan 2013 10:32:51 GMT
For those who just want a small selection to research, a new Amazon site is always useful.

In reply to an earlier post on 19 Jan 2013 11:23:00 GMT
That's rather sweet!

Posted on 19 Jan 2013 11:30:03 GMT
Last edited by the author on 19 Jan 2013 11:41:08 GMT
Bob says:
People have even written "learned" papers on it.


There may be data here if you can extract it.

In reply to an earlier post on 19 Jan 2013 14:21:51 GMT
Last edited by the author on 19 Jan 2013 14:33:42 GMT
I have no interest in disability benefits whatsoever and was simply repeating what my GP told me in person. I have no reason to believe my GP is a liar and I suggest you visit your own doctor and ask how much time they are spending dealing with disability benefit claimants.

Posted on 19 Jan 2013 15:30:43 GMT
The Truth says:
Someone should make a King of Kong style geekumentary on all us reviewers :-D

In reply to an earlier post on 19 Jan 2013 15:33:20 GMT
The Truth says:
Phwoar! Hubbard Hubbard!!! Sixth place Marina Moura gets my vote!!!

In reply to an earlier post on 21 Jan 2013 09:54:24 GMT
I am not registered with any doctor, but I do know from the disabled community that government-approved doctors check on disability claimants. Even if (as you claim your GP says) it is possible to get disability benefits purely on a request from your local GP, it will at some point be vetted by a government-approved doctor. How long ago was it that your local GP told you about this stuff anyway? It may have been in the days when rules were less stringent.

In reply to an earlier post on 21 Jan 2013 10:37:29 GMT
[Customers don't think this post adds to the discussion.]

In reply to an earlier post on 6 Feb 2013 19:42:12 GMT
sally tarbox says:
I did wonder this last year when I posted a couple of reviews for the hell of it (and to practise my French!). Almost immediately, one of the three reviews got a positive! In the year or so since, I've only ever had two more votes on them.

In reply to an earlier post on 7 Feb 2013 10:02:32 GMT
When I had a job, I bought some stuff from Amazon's French and German sites including music in their own languages. I don't speak, read or write in their languages but I did post some reviews there. After all, English evolved from German (Anglo-Saxon origins) and acquired a heavy dose of Norman French along the way. But the French don't always appreciate English language reviews. The Germans are more tolerant.
This discussion

Participants:  8
Total posts:  22
Initial post:  18 Jan 2013
Latest post:  7 Feb 2013
