Determining the number of fake reviews on the Web is difficult. But it is enough of a problem to attract a team of Cornell researchers, who recently published a paper about creating a computer algorithm for detecting fake reviews. They were instantly approached by a dozen companies, including Amazon, Hilton, TripAdvisor and several specialist travel sites, all of which have a strong interest in limiting the spread of bogus reviews.
Wonderful! With all of the interest from big players, they shouldn’t have a problem funding future research and the development of this algorithm. Can the hostel industry help?
The Cornell researchers tackled what they call deceptive opinion spam by commissioning freelance writers on Mechanical Turk, an Amazon-owned marketplace for workers, to produce 400 positive but fake reviews of Chicago hotels. Then they mixed in 400 positive TripAdvisor reviews that they believed were genuine, and asked three human judges to tell them apart. They could not.
…
So the team developed an algorithm to distinguish fake from real, which worked about 90 percent of the time. The fakes tended to be narratives about the writer’s experience at the hotel, heavy on superlatives but light on concrete description. Naturally: they had never been there. Instead, they talked about why they were in Chicago. They also used words like “I” and “me” more frequently, as if to underline their own credibility.
90% is a pretty good success rate.
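To make the cues from the quoted passage concrete, here is a toy scorer that counts two of them: first-person pronouns and superlatives. This is only a sketch for discussion; the actual Cornell classifier was a trained model over many more features, and the word lists and sample reviews below are invented for illustration.

```python
import re

# Illustrative word lists (assumptions, not the Cornell feature set).
FIRST_PERSON = {"i", "me", "my", "mine", "myself"}
SUPERLATIVES = {"best", "amazing", "wonderful", "perfect", "incredible", "fantastic"}

def cue_scores(review: str) -> dict:
    """Return the per-word rate of each cue in the review text."""
    words = re.findall(r"[a-z']+", review.lower())
    n = len(words) or 1  # avoid division by zero on empty input
    return {
        "first_person": sum(w in FIRST_PERSON for w in words) / n,
        "superlatives": sum(w in SUPERLATIVES for w in words) / n,
    }

# Made-up examples in the spirit of the article: the "fake" leans on
# superlatives and the writer's own story; the "real" describes the hotel.
fake_like = "I had the best time! My husband and I came to Chicago for our anniversary and I loved it."
real_like = "The room overlooked the river, the lobby bar closes at midnight, and check-in took five minutes."

print(cue_scores(fake_like))
print(cue_scores(real_like))
```

A real detector would feed features like these (plus word n-grams) into a trained classifier rather than comparing raw counts, but even this crude version shows why the fake-style review scores higher on both cues.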
It has been argued that hotel guests and hostel guests are different types of people. Would an algorithm that works for hotels have to be adapted for the psychology of our clientele, or are we not so different after all? Do you think hostel reviewers would use different language than hotel reviewers, or would a fake review stand out as a fake review regardless of who writes it?