Ghostery recently updated its add-on, and the majority of users did not like the new version.
I have been reading the reviews, and I wondered why the average rating for Ghostery on AMO remained unchanged even though there were dozens of negative reviews.
So, I now have 3 screenshots of Ghostery’s reviews, and it would be very helpful if someone could explain why there are fewer reviews today and why some 1-star reviews have mysteriously disappeared. Is there manipulation of AMO ratings?
@TheOne
I hope I’m still entitled to have an opinion on it.
When several people independently report that negative reviews disappear for some add-ons (and I’m talking about my local support forum, which hardly has the traffic and population of New York), and you take the time to file a bug and then an issue that basically gets wontfixed, that counts to me as “this is not a problem”.
I’ve asked for data (How many reviews are deleted for these add-ons? What’s the average for other add-ons?), but I didn’t get any. Also, who is part of the independent reviewer team? Is the list available somewhere?
If you post a negative review and it’s flagged and removed, and the following ones are removed because they don’t comply with the review guidelines, that sounds like a structural problem to me. People get upset when they take the time to write reviews and those reviews disappear.
As a developer, I get upset when a user writes a negative review, sometimes even a very comprehensive one with several paragraphs detailing every step of whatever, blaming me for something that in the end has nothing to do with the add-on they’re reviewing (this happens more often than you’d think).
The review guidelines are open to some interpretation, after all. For example, riddle me this: would you consider this to be a helpful review?
1 star - “Meh…”
Meh? Why meh? What’s meh? Does it not work? Is it not useful in your case? Does it look bad? What/when/where/how/who…? Does a review like that deserve to be shown to every single user viewing the page with the same full weight as all the other reviews?
The structural problem you mention has already been recognized: the lack of a qualification system for the reviews themselves. Which basically means this: currently, all reviews are valued equally and presented to all users as such. Under those conditions, I definitely don’t want unhelpful reviews scaring away potential users, especially not just because “they took the time to write them”.
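To illustrate what a qualification system could change, here is a minimal sketch (entirely hypothetical, not AMO’s actual algorithm) comparing today’s flat average, where every review counts the same, with a weighted average that keeps unhelpful reviews but down-weights them instead of deleting them. The helpfulness weights are invented for illustration.

```python
# Hypothetical sketch only -- this is NOT how AMO computes ratings.
# It compares a flat average (every review counts equally) with a
# weighted average that down-weights "unhelpful" reviews instead of
# deleting them.

def flat_average(reviews):
    """Every review counts the same, regardless of quality."""
    return sum(stars for stars, _ in reviews) / len(reviews)

def weighted_average(reviews):
    """Each review carries a helpfulness weight between 0 and 1."""
    total_weight = sum(weight for _, weight in reviews)
    return sum(stars * weight for stars, weight in reviews) / total_weight

# (stars, helpfulness weight) -- weights are made up for illustration.
reviews = [
    (5, 1.0),  # detailed positive review
    (1, 1.0),  # detailed negative review
    (1, 0.2),  # "Meh..." -- kept, but counted for less instead of removed
]

print(round(flat_average(reviews), 2))      # 2.33
print(round(weighted_average(reviews), 2))  # 2.82
```

The point isn’t the exact formula; it’s that a review like “Meh…” could stay visible without carrying the same weight as a detailed one, so nothing needs to be deleted at all.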
I’m not saying that all of those Ghostery reviews should have been deleted. If I were part of the AMO moderation team, I probably would have kept a few of those in the screenshots, but definitely not all of them. But I believe the ability to flag and remove unhelpful or otherwise invalid reviews is necessary until such a qualification system is put in place. And specific cases like these should be handled as what they are, specific cases, not as an excuse to blame and take down the whole system.
I maintain a dictionary with ~115k users, and it has some negative reviews completely unrelated to the add-on (it contains no code, so there isn’t much to blame it for). Have I ever deleted a negative review? No, because I expect users who check reviews to be smarter than these commenters.
The current system is flawed; I’m glad you agree. While it’s being fixed, why not reply to these reviews instead of flagging them? The system allows for that; it obviously takes time, especially for big add-ons.
What people seem to be missing is that users won’t blame the system; they’ll blame the developer (thinking he can delete reviews on his own, and I agree that’s exactly what it looks like from the outside), and most importantly they’ll blame Mozilla for allowing that.
Anyhow, I reckon this discussion is going nowhere, much like the issue on GitHub did, so I’ll take a step back and let you enjoy it. I’d still be interested in data and stats about flagged reviews, but at this point I’m not even sure they’re available.
Are you sure about that? Did you follow the discussions about Ghostery’s reviews? There are many negative ones that have been deleted even though they complied with the review guidelines.
So, who is this “independent” team that removed about 50% of Ghostery’s negative reviews?
It is obvious that something is going wrong. Flod is 100% right: “they’ll blame the developer and most importantly they’ll blame Mozilla for allowing that”.
For the sake of argument, I’ll just respond to these two points, as I feel they are relevant when considering a possible future review qualification system.
Precisely. The people who write reviews and the people who read them come from the same user base. In my experience, only developers can tell the difference between a “helpful” review and an “unhelpful” one, because only they know what actually happens beyond clicking the “Add to Firefox” button; most users take every review at face value simply because it’s there (just like I did the other day when reading opinions about cars with my friends: I know absolutely nothing about the subject, so I believed everything I read, since I had no reason not to until one of my friends either agreed or explained otherwise).
Absolutely agreed! It’s always better to reply. But if the user never responds again afterwards, what’s the point in keeping it indefinitely? Which cycles back to the previous point and my previous comment: without a way to qualify these reviews somehow, there needs to be some way not to mislead readers. The credibility of a developer’s response can easily be put into question if there are many reviews complaining about the same thing (mostly because the response requires interaction to be seen, but also because, unfortunately, it’s human nature that negativity will always be louder than positivity).
Perhaps it’s better to give specific examples of reviews you think should not have been removed (I do agree that there are some, as I said), so that the moderators can adapt their methods to better comply with the guidelines themselves if necessary. Both this thread and the GitHub issue linked above focus on numbers and dates, and those aren’t really the problem in themselves.
It’s pretty crazy how this all works. Have you guys seen AdBlockPlus’s reviews? They had around 100 in the last two weeks, and today all the 1-star and 2-star reviews have been deleted, so there are zero left. I’ll attach a screengrab. What gives?
This is clearly none of my business… but I’ve been a Firefox user for as long as I can remember… To be perfectly honest, I doubt the independent reviewer team is as “independent” as it should be… but as long as it doesn’t concern me too much, the truth is I don’t really care… It’s disconcerting, though, that valid reviews were deleted while invalid reviews stay up.