Motivated by potential financial gain, companies may hire fraudster groups to write fake reviews that either demote competitors or promote the companies' own businesses. Such groups are considerably more successful at misleading customers, as people are more likely to be influenced by the opinion of a large group. To detect such groups, existing approaches commonly model a fraudster group as a static network, thereby overlooking the longitudinal behavior of a reviewer and, thus, the dynamics of coreview relations among reviewers in a group. Hence, these approaches are incapable of excluding outlier reviewers, i.e., fraudsters who intentionally camouflage themselves within a group and genuine reviewers who happen to coreview with fraudster groups. To address this issue, we propose "FGDT," a framework for "fraudster group detection through temporal relations." FGDT first capitalizes on the effectiveness of the HIN-recurrent neural network (RNN) in learning reviewers' representations while capturing the collaboration among reviewers. The HIN-RNN models the coreview relations of the reviewers in a group within a fixed time window of 28 days. We refer to this as spatial relation learning to signify the generalizability of this step to other networked scenarios. FGDT then applies an RNN to the spatial relations to predict the spatio-temporal relations of the reviewers in the group. In the third step, a graph convolutional network (GCN) refines the reviewers' vector representations using these predicted relations, and the refined representations are used to remove outlier reviewers. The average of the remaining reviewers' representations is then fed to a simple fully connected layer to predict whether the group is a fraudster group. Exhaustive experiments show that FGDT improves on three of the most recent approaches by 5% (4%) in precision, 12% (5%) in recall, and 12% (5%) in F1-value on the Yelp (Amazon) dataset, respectively.
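The pipeline described above (per-window relation learning, temporal relation prediction, GCN refinement, outlier removal, and group classification) can be sketched schematically. The snippet below is a minimal illustration only, not the authors' implementation: the HIN-RNN and the temporal RNN are replaced by random placeholders and a simple average, the outlier criterion is a hypothetical distance-to-centroid rule, and all dimensions are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (not from the paper): n reviewers in a group,
# d-dimensional embeddings, T past time windows of 28 days each.
n, d, T = 6, 8, 4

# Step 1 (stand-in): per-window reviewer embeddings and coreview adjacency.
# In FGDT these come from the HIN-RNN; here they are random placeholders.
X = rng.normal(size=(n, d))                                   # reviewer representations
A_seq = [(rng.random((n, n)) > 0.5).astype(float) for _ in range(T)]

# Step 2 (stand-in): predict the next window's relations from the sequence.
# FGDT uses an RNN for this; the sketch simply thresholds the mean adjacency.
A_pred = (np.mean(A_seq, axis=0) > 0.5).astype(float)

# Step 3: one GCN layer refining representations with the predicted relations,
# using the standard symmetric normalization H = ReLU(D^{-1/2}(A+I)D^{-1/2} X W).
A_hat = A_pred + np.eye(n)
D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
W = rng.normal(size=(d, d))
H = np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ X @ W, 0.0)

# Step 4: drop outlier reviewers; here, those far from the group centroid
# (an illustrative proxy, not the paper's criterion).
dists = np.linalg.norm(H - H.mean(axis=0), axis=1)
keep = dists <= dists.mean() + dists.std()

# Step 5: average the remaining representations, then a single fully
# connected layer with a sigmoid yields a fraudster-group probability.
group_vec = H[keep].mean(axis=0)
w_fc, b_fc = rng.normal(size=d), 0.0
prob = 1.0 / (1.0 + np.exp(-(group_vec @ w_fc + b_fc)))
print(prob)
```

The sketch only fixes the data flow between the five stages; each learned component would be a trained network in practice.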
Number of pages: 15
Journal: IEEE Transactions on Neural Networks and Learning Systems
Publication status: E-pub ahead of print - 25 Oct 2022