New York Times: Fake Reviews A Growing Trend
Back in 2010 I wrote a piece about reviews on software download sites. In it I discovered that some developers manipulated the public perception of their programs by adding fake reviews, usually in the form of five-star ratings. The problem was that some developers went overboard with the fake reviews, so that their lesser-known program received as many or even more reviews than very popular software on the same portal. Some reviews also did not add up, especially when low and high ratings were compared with each other.
The New York Times yesterday reported that fake reviews are a growing trend. The story concentrates on tourism and product review sites.
Product owners, marketing agencies and individuals can buy reviews online for little money. If you visit Fiverr, for instance, you will notice that you can buy positive reviews for $5 on almost every site imaginable. And Fiverr is just one of the sites where you find people willing to post fake reviews on websites.
The ingenious aspect of this is that hiring people to post fake reviews bypasses most of a site's fraud-detection measures. If you were to post the reviews yourself, characteristics like your computer's IP address, browser version or operating system could be used to identify the manipulation, even if you routed your connections through proxy servers or virtual private networks. A single cookie could be enough for that.
But with unique users from all over the world, it is not possible to use hard facts to identify fraud.
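To illustrate the kind of check such sites can run against a single author, here is a minimal sketch in Python. The data records and field names are invented for the demo; real sites log far more signals than an IP address and a cookie:

```python
from collections import defaultdict

# Hypothetical review submissions with the characteristics a site might log.
# All values here are invented sample data.
reviews = [
    {"product": "AppA", "ip": "203.0.113.5", "cookie": "c1", "rating": 5},
    {"product": "AppA", "ip": "203.0.113.5", "cookie": "c1", "rating": 5},
    {"product": "AppA", "ip": "198.51.100.7", "cookie": "c2", "rating": 2},
]

def flag_duplicate_fingerprints(reviews):
    """Group reviews by (product, ip, cookie) fingerprint and flag any
    fingerprint that submitted more than one review for the same product."""
    groups = defaultdict(list)
    for r in reviews:
        groups[(r["product"], r["ip"], r["cookie"])].append(r)
    return [key for key, grp in groups.items() if len(grp) > 1]

print(flag_duplicate_fingerprints(reviews))
# flags the repeated (AppA, 203.0.113.5, c1) submissions
```

A hired crowd of real users defeats exactly this kind of check: every submission arrives with a genuinely distinct fingerprint.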
Cornell researchers recently published a paper about fake review detection. Their algorithmic approach looks for strong and slight deceptive indicators in a review to determine whether it is fake or not. Indicators, however, are not proof, and it will happen that the algorithm flags legitimate reviews as fake and vice versa.
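As a toy illustration of indicator-based scoring (not the Cornell classifier itself, which was a trained statistical model), one could weight strong indicators more heavily than slight ones. The indicator word lists and the weighting below are entirely invented for the demo:

```python
import re

# Invented indicator lists for illustration only.
STRONG_INDICATORS = {"amazing", "incredible", "perfect", "best"}  # hyped superlatives
SLIGHT_INDICATORS = {"i", "my", "me"}                             # heavy self-reference

def deception_score(text):
    """Count indicator words in a review; strong indicators weigh double.
    Higher scores suggest (but do not prove) a fake review."""
    words = re.findall(r"[a-z']+", text.lower())
    strong = sum(w in STRONG_INDICATORS for w in words)
    slight = sum(w in SLIGHT_INDICATORS for w in words)
    return 2 * strong + slight

print(deception_score("I had the most amazing, perfect stay; my room was incredible!"))  # 8
print(deception_score("The hotel was clean and quiet."))  # 0
```

The weakness the article points out is visible here: a gushing but honest guest scores high, while a restrained fake scores low.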
Some sites could implement better procedures to avoid the majority of fake reviews. Amazon, for instance, could allow reviews only from users who have purchased the product on the site. While that would certainly reduce the number of reviews on the site, it would eliminate the majority of fake reviews as well.
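Such a verified-purchase gate is conceptually simple. A sketch, with hypothetical users, products and data shapes:

```python
# Hypothetical purchase records: (user, product) pairs. Sample data is invented.
purchases = {("alice", "ssd-drive"), ("bob", "keyboard")}

def can_review(user, product):
    """Allow a review only if the user actually bought the product here."""
    return (user, product) in purchases

print(can_review("alice", "ssd-drive"))    # True  - verified buyer
print(can_review("mallory", "ssd-drive"))  # False - never bought it
```

A paid reviewer would now have to buy the product first, which raises the cost of each fake review considerably.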
Businesses that use these marketing techniques will adapt. They would simply have to do some initial testing, or write sample reviews of their own, to learn how to deceive the algorithm.
One element that has not been mentioned yet, and that is not addressed in the paper, is the option to write fake reviews with a less-than-perfect rating. I personally read the negative reviews first on most sites to get an understanding of what's wrong with a product. Some complaints are less serious than others. A picky user might complain that the product arrived late, or that the breakfast buffet at the hotel did not have enough carrots one day. Those might be serious issues for them, but they might not matter to the majority of potential customers.
My guess is that we will see better fake reviews in the coming years. We will see fake reviews with less-than-perfect ratings, and fake reviews that use the findings of the research paper to avoid detection.
Your take on fake reviews on the Internet? Let me know in the comments.