1 Introduction
The prosperity of online media has attracted people and organizations to join these platforms and use them for several purposes, such as creating networks among people with common interests, building and broadening their business, and promoting/demoting e-commerce products. This has led users to choose artificial ways of gaining social reputation to obtain benefits within a short time. The main reason behind choosing artificial boosting is that the legitimate effort of gaining appraisals (followers, retweets, likes, shares, etc.) takes a significant amount of time and may not meet the actual needs of users. Such activities pose a massive threat to social media platforms, as they violate the Terms of Service (ToS) of many of these platforms (see Section 2.1.2 for more details). Such artificial boosting of social reputation/growth is often known as
collusion [29, 48]. A formal definition of “collusive users” is presented later in the article.
According to a recent survey by HitSearch,1 98% of content creators admitted to having spotted follower fraud among online influencers on Instagram. The adversarial impact of collusive entities poses a massive threat to online media. These entities create an atmosphere in which people start trusting their information due to the popularity they receive. For example, in the 2019 UK general election, politicians approached blackmarket services2 for online political campaigning to reach out to their potential voters. It is also reported that most of the politicians currently in contention or conversation for the 2020 U.S. presidential election have a very high percentage or volume of non-human followers linked to their Twitter accounts. The preceding examples show how collusive entities boost the believability of information during events. Moreover, the limited ability of humans to distinguish between collusive and genuine entities, given the vast amount of available information, motivates the design of methods for automatic identification of these entities. Collusive entities not only deceive people but also pollute the entire social space. Using blackmarket services, collusive entities can gain artificial appraisals that improve the credibility of rumors, propagate fake news, inflate/deflate the ratings of products on e-commerce platforms, and boost popularity on video sharing platforms. Figure 1 illustrates how collusive entities gain artificial appraisals on various online media platforms. We refer the reader to Section 4 for a thorough explanation of Figure 1. Recent studies that tackle the problem of collusive entity detection state that it is hard to discern these entities, as they express a mixture of organic and inorganic activities [48].
Although artificial boosting has become a regular practice with the increasing popularity of different online media platforms, there is a lack of coherent and collective attempts from different research communities to explore the micro-dynamics controlling such malpractice and the extent of its effect in manipulating social reputation. Most of the existing studies aim to detect spam [14, 31, 91, 94, 109, 123, 129, 130, 134, 135, 136, 145, 150], fake accounts [21, 27, 32, 33, 51, 60, 61, 63, 73, 95, 107, 116, 151], and bots [22, 25, 26, 30, 31, 39, 44, 55, 58, 76, 78, 97, 128, 131, 144, 149], and to study how these accounts are used for information propagation [13, 28, 41, 83, 87, 101, 114, 133] in online media. Some studies [47, 48] have shown that collusive users are neither fake users nor bots but are normal human beings who express a mixture of organic and inorganic activities. Unlike bots, these users exhibit no synchronicity across their behaviors [48], which makes it difficult to design automated techniques to detect them. In this article, we present a comprehensive survey of the existing literature on topics related to collusion in different online media platforms. Recently, Kumar and Shah [82] surveyed three aspects of false information on the web and social media: fake reviews, hoaxes, and false news. They also noted the lack of publicly available datasets related to false information and social media rumors. In comparison to Kumar and Shah [82] and other related surveys, we outline our contributions in the remainder of this section.
1.1 Related Surveys and Our Contributions
To the best of our knowledge, this is the first survey to provide a detailed overview of collusive activities in online media. Our aim is to give readers a comprehensive account of the major studies conducted in the field of social network analysis to detect and analyze collusive activities across different online media platforms. The following four aspects distinguish our survey from other related surveys:
(1)
Existing surveys do not directly focus on the problem of “collusive activities” and are more centered around the detection and analysis of fake users, fraudsters, and spammers on the web. Previous studies have noted that collusive activities are very different in nature from these problems [8, 47, 48] due to the mixture of organic and inorganic activities they exhibit. In this article, we conduct a thorough analysis of past studies dealing with collusive activities in different online platforms.
(2)
Existing surveys mention only fraud- and spam-related datasets. Here, we describe the datasets on collusion from multiple aspects: the type of dataset and the entities present in them. This should help improve the performance and applicability of state-of-the-art collusion detection approaches.
(3)
We also outline the annotation guidelines and evaluation metrics used for collusive entity detection, as mentioned in the related studies.
(4)
We conclude the article by highlighting a set of key challenges and open problems in the area of collusion in online media.
We use the terms collusion and artificial boosting of social reputation interchangeably throughout the article. We use action/appraisal to refer to an online media activity such as a retweet, follow, review, view, or subscription, and entity to refer to a social entity such as a user, tweet, post, or review.
1.2 Survey Methodology
1.2.1 Survey Scope.
Since our scope is the investigation of collusive entities in online media platforms, we systematize the studies related to the analysis and detection of such entities. As collusion is related to a few other problems such as fake accounts, bots, and spam, we also partly cover previous work in those domains.
1.2.2 Survey Organization.
In this article, we survey the existing algorithms, methodologies, and applications for detecting and analyzing collusive activities in online media platforms. In Section 2, we provide a broad overview and background of collusion in online media. We also analyze particular cases of collusion, show how collusion relates to other closely related concepts, and examine how collusion has been evolving and who its main targets are. Next, in Section 3, we provide some preliminary concepts on blackmarket services. In Section 4, we show how collusion happens across multiple online media platforms and overview the state-of-the-art techniques for each platform. Section 5 explains two broad categories of collusive activities. To identify what has been done so far in the literature, in Section 6, we conduct a systematic literature review of the existing techniques. In Section 7, we present the annotation guidelines, related datasets, and evaluation metrics that can be used for studies in collusive entity detection. As collusion is an underexplored research problem, there are plenty of important issues that need further attention; pointers to such issues are given in Section 8. Section 9 concludes this survey with a summary of the main contributions and open problems.
3 A Note On Blackmarket Services
Blackmarket services are the major controlling authorities of collusive activities. Customers join these services and contribute directly or indirectly to boost their online profiles artificially. Based on the mode of service, blackmarket services are divided into two types [113]: premium and freemium.
Premium services. These services require customers to pay a certain amount of money to obtain the facilities (e.g., SocialShop, RedSocial). Most premium services provide a comprehensive range of social media enhancement services for all purposes. They also ensure strategic social media promotion to maintain an edge over competitors in the following ways: (i) location-specific actions, (ii) control over the time during which users want to gain social reputation, (iii) accounts gained from the services have eye-catching display pictures and filled-out profile bios, and (iv) domain-specific actions (accounts gained from the services can belong to various domains such as fashion, music, and blogging). These services sell actions in tiers (100, 1K, 10K, etc.) and provide lucrative offers to their customers as well as additional facilities such as a 100% money-back guarantee, a retention guarantee, and complete privacy. A few of these services ensure replenishment if the customer experiences any drop in the actions. We divide these services into two categories:
(1)
General premium: Here, customers have to choose a plan that suits their budget.
(2)
Auto premium: These services provide daily/weekly/monthly delivery of actions. Customers need to select one of the auto action packages in advance for a time duration, such as 100 Twitter auto retweets (2 to 4 days). The main advantage of these services is that they automate, to an extent, the process of gaining social reputation.
The basic principle of auto premium remains the same as that of general premium services, but it can be considered a faster and more effective way to boost social reputation. The main difference is the recurrence of delivery (auto premium is recurrent; general premium usually is not).
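To make the distinction concrete, the following is a minimal, purely illustrative sketch of how such packages could be modeled; the class and field names, tiers, and prices are our own assumptions and are not taken from any actual blackmarket service.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PremiumPackage:
    """Illustrative model of a blackmarket premium package (hypothetical)."""
    action: str                            # e.g., "twitter_retweet", "instagram_follower"
    tier: int                              # number of actions sold, e.g., 100, 1_000, 10_000
    price_usd: float
    recurrence_days: Optional[int] = None  # None -> one-off delivery (general premium)

    @property
    def is_auto(self) -> bool:
        # Auto premium packages deliver actions repeatedly at a fixed interval.
        return self.recurrence_days is not None

# A one-off (general premium) order and a recurring (auto premium) order.
general = PremiumPackage(action="twitter_retweet", tier=1_000, price_usd=25.0)
auto = PremiumPackage(action="twitter_retweet", tier=100, price_usd=10.0, recurrence_days=3)
print(general.is_auto, auto.is_auto)  # False True
```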
Freemium services. These services are free to use, but they also offer premium subscription plans (e.g., YouLikeHits, Like4Like). The main idea is to familiarize customers with the workflow of the services and motivate them to opt for the premium plans. More details on these service types can be found in the work of Dutta et al. [47, 48]. Freemium services operate in one of three ways [48]:
(1)
Social-share services: These services require customers to perform social actions on multiple platforms to get appraisals for their content. Some of the possible actions are share/like/follow on Facebook, follow/like/view/comment on Instagram, and like/view/share on YouTube, among others. As an example, customers in FreeFollowers6 have to perform five social actions or complete a survey in 15 to 60 seconds to get appraisals.
(2)
Auto-time freemium services: These services require customers to get access tokens from the services, after which they can request a fixed number of actions for a time duration (e.g., 10 to 50 retweets in a 10-minute window).
(3)
Credit-based services: These services operate based on a “give and take” relationship, as sketched below. Customers of these services lose credits when other customers perform actions on their submitted content. Similarly, customers gain credits when they perform actions on the content of other customers.
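As a rough illustration of this “give and take” mechanism, the toy sketch below models a credit ledger in which a customer earns credits by acting on other customers' content and spends credits when others act on theirs. The class name, starting balance, and per-action credit value are assumptions made for illustration; real services use their own accounting rules.

```python
class CreditLedger:
    """Toy model of a credit-based freemium ecosystem (illustrative only)."""

    def __init__(self, starting_credits: int = 0):
        self.balances = {}                  # customer id -> credit balance
        self.starting_credits = starting_credits

    def register(self, customer: str) -> None:
        self.balances.setdefault(customer, self.starting_credits)

    def perform_action(self, actor: str, content_owner: str, credit_value: int = 1) -> bool:
        """`actor` appraises `content_owner`'s content: the actor earns credits, the owner pays."""
        if self.balances[content_owner] < credit_value:
            return False                    # owner cannot afford the appraisal
        self.balances[content_owner] -= credit_value
        self.balances[actor] += credit_value
        return True

ledger = CreditLedger(starting_credits=5)
for user in ("alice", "bob"):
    ledger.register(user)
ledger.perform_action(actor="alice", content_owner="bob")   # alice retweets bob's post
print(ledger.balances)                                      # {'alice': 6, 'bob': 4}
```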
Figure 5(a) shows the working of premium services. Here, two types of entities are involved: customers, who ask for appraisals, and suppliers, who supply those appraisals. Figure 5(b) shows the working of freemium services. Here, customers are involved in a credit-based ecosystem; hence, customers are also the suppliers. To understand the common keywords used by the users of these services, we conducted an experiment [47] by visualizing wordclouds of the descriptions/bios present in the profiles of Twitter users from premium and freemium services (cf. Figure 4). Notice that premium users are associated with high-profile accounts having keywords such as “CEO,” “official,” “speaker,” and “Founder.” However, freemium users use advertisement-based keywords such as “like,” “agency,” “SocialMedia,” and “YouTubeMarketing.” This suggests that collusive users participating in these services use targeted keywords in their profiles to gain quick popularity. We encourage the reader to refer to the work of Stringhini et al. [124, 125, 127] for a detailed study of Twitter follower markets.
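For readers who wish to reproduce a similar visualization on their own data, the sketch below compares the bios of the two user groups using the open-source wordcloud package; it assumes the bios have already been collected into two plain-text files, whose names are placeholders.

```python
# A minimal sketch of the bio wordcloud comparison (file names are placeholders).
import matplotlib.pyplot as plt
from wordcloud import WordCloud, STOPWORDS

def bio_wordcloud(path: str, title: str, ax) -> None:
    with open(path, encoding="utf-8") as f:
        text = f.read()
    cloud = WordCloud(width=800, height=400, background_color="white",
                      stopwords=STOPWORDS).generate(text)
    ax.imshow(cloud, interpolation="bilinear")
    ax.set_title(title)
    ax.axis("off")

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(14, 5))
bio_wordcloud("premium_bios.txt", "Premium customers", ax1)
bio_wordcloud("freemium_bios.txt", "Freemium customers", ax2)
plt.tight_layout()
plt.show()
```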
4 Compromised Online Platforms
As discussed in previous sections, collusion happens across multiple online media platforms. In this section, we discuss in detail how appraisals on these platforms are artificially manipulated by collusive entities. We will look into seven types of platforms: social networks, rating/review platforms, video streaming platforms, recruitment platforms, discussion platforms, music sharing platforms, and development platforms. In addition to these platforms, we will look into the artificial manipulation of website traffic through blackmarket services. Figure 1 illustrates how different types of blackmarket services provide collusive appraisals to various online media platforms.
4.1 Social Networks
Social networks serve as a platform to build social relations among users with common interests (personal or professional). Platforms like Twitter, Facebook, and Instagram allow their users to perform actions such as post, like, retweet, follow, and share, among others. Today, these platforms also serve as real-time news delivery services and as a medium for business owners to connect with their customers and thus expand their outreach. However, attracting users organically is usually a tedious task and takes significant time. This motivates users to choose artificial ways of gaining appraisals.
Facebook. Facebook is the most popular social network for connecting and sharing with family and friends online. Facebook has four types of appraisals: shares, likes, comments, and followers. A large number of appraisals on Facebook acts as a form of social proof, signaling popularity and importance.
Twitter. Twitter is a microblogging service where users write tweets about topics such as politics, sport, cooking, and fashion. Twitter has three types of appraisals: retweets, likes, and followers. Acquiring more appraisals on Twitter helps increase the user’s social signals and attract more visitors to the profile.
Instagram. Instagram is a social networking platform that enables users to share images or videos with their audience. Users can upload photos and videos, then share them with their followers or with a group of friends. Instagram has three types of appraisals: likes, followers, and views. Higher appraisals are the key to visibility on Instagram: the more appraisals a post receives, the higher it ranks in search results and on the Explore page.
Pinterest. Pinterest is an image sharing platform that is designed as a visual discovery engine for finding ideas like recipes, home and style inspiration, and so forth. A message on Pinterest is known as a Pin. Pinterest has two types of appraisals: likes and followers.
Other than the common appraisals, we found a few blackmarket services where social networking users can request a verification badge. By verification badge, we refer to the Twitter white/blue verified badge7 and the Instagram blue verified badge.8 Verifiedbadge,9 StaticKing,10 Prime badges,11 and SocialKing12 are the most popular platforms providing such appraisals. Verification badges are coveted checkmark badges that enhance the user's social media presence and improve credibility on the platform. The minimum requirements to request verification badges through blackmarket services are as follows:
•
The user/organization should be a celebrity, journalist, popular brand, government official, or sports company.
•
The user/organization should have a Wikipedia page or media coverage in leading online news portals.
•
For social media platforms where a subscription is an appraisal, the user should have a minimum of 100K subscribers.
There is a vast literature on the detection and analysis of collusive entities in social networks. The works of Shah et al. [113] and Dutta et al. [48] are among the first studies to investigate collusive entities on Twitter registered in blackmarket services. Dutta et al. [48] trained multiple state-of-the-art supervised classifiers using a set of 64 features to distinguish collusive users from genuine users. They also divided the set of collusive users into three categories: bots, promotional customers, and normal customers. Arora et al. [8] obtained better classification performance than Dutta et al. [48] by incorporating the content-level and network-level properties of Twitter users in a multitask setting. De Cristofaro et al. [40] performed the first work on Facebook, presenting a comparative measurement study of page promotion methods. Sen et al. [110] conducted the first work on Instagram, developing an automated mechanism to detect fake likes. We discuss these studies in detail in Section 6.
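The common recipe in these detection studies is feature-based supervised classification. The sketch below illustrates that recipe in its generic form; it is not the exact 64-feature pipeline of Dutta et al. [48], and the input file, column names, and choice of classifier are assumptions made for illustration.

```python
# Generic feature-based classification sketch (not the exact pipeline of [48]).
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Placeholder file: one row per user, numeric behavioral/profile features + a binary label.
df = pd.read_csv("twitter_user_features.csv")
X = df.drop(columns=["label"])          # e.g., follower ratio, tweet rate, bio length, ...
y = df["label"]                         # 1 = collusive (blackmarket customer), 0 = genuine

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42)

clf = RandomForestClassifier(n_estimators=200, random_state=42)
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test),
                            target_names=["genuine", "collusive"]))
```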
4.2 Rating/Review Platforms
Rating/review platforms allow users to rate or share their opinion about entities, such as products, applications, food items, restaurants, and movies. Examples of these platforms are e-commerce platforms (e.g., Amazon), travel platforms (e.g., TripAdvisor), and business rating platforms (e.g., Yelp, Google reviews), among others.
Amazon. Amazon is the world’s largest e-commerce platform and is considered the ultimate hub for selling merchandise on the web. It allows two types of appraisals: reviews and ratings. High ratings and reviews for a product on Amazon attract customers and make the product more trustworthy.
Google. Google provides valuable information to businesses and its customers using two types of appraisals: reviews and ratings. Higher ratings and reviews on Google help improve the business and enhance local search rankings.
TripAdvisor. TripAdvisor is an online travel platform that offers multiple services like online hotel reservations, travel experiences, restaurant reviews, and so forth. Similar to other platforms, TripAdvisor has two types of appraisals: reviews and ratings. Positive reviews from previous occupants help the business improve its reputation and increase the customer base.
Yelp. Yelp is a local business review platform that collects crowdsourced data from its users. Yelp has two types of appraisals: reviews and ratings. Getting more reviews improves business reputation and attracts more customers through free traffic.
Today, online reviews/ratings are becoming highly relevant for customers when making purchase-related decisions. As these appraisals play a significant role in deciding the sentiment/popularity of a product/business, there is massive scope for collusion among sellers/buyers to manipulate them artificially. The works of Jindal et al. [74], Li et al. [88], and Wang et al. [137] represent a few initial studies on detecting fake reviews from review patterns that reflect unusual reviewer behaviors. Another early work by Li et al. [89] identified fake reviews on Dianping, the largest Chinese review hosting site. The authors proposed a supervised learning algorithm to identify fake reviews in a heterogeneous network of users, reviews, and IP addresses. Mukherjee et al. [99] performed one of the first attempts to detect fraudulent reviewer groups in e-commerce platforms. Recently, Kumar et al. [81] identified users in rating platforms who give fraudulent ratings for excessive monetary gains. Most of the studies on review platforms focus on e-commerce platforms. We believe that newer platforms such as Google reviews, Yelp, and TripAdvisor will open up more research questions, as these platforms contain reviews of millions of hotels, restaurants, attractions, and other tourism-related businesses.
4.3 Video Streaming Platforms
Video streaming platforms are mostly used for sharing videos and streaming live video. These platforms allow users to upload, view, rate, share, add to playlists, report, and comment on videos, and to subscribe to other users. Here, we discuss how appraisals on video streaming platforms are inflated artificially. With the increasing popularity of live streaming came the concept of astroturfing, a broader and more sophisticated term referring to the synthetic increase of appraisals in an online social network by means of blackmarket services. The consequences of such synthetic inflation are not restricted to increased monetary benefits, directory listings, and partnership benefits; they also extend to better recommendation rankings, thus doctoring the experience of viewers who are recommended this boosted material instead of genuine material produced by honest broadcasters.
YouTube. YouTube is a video sharing platform where users can create their own profile; upload videos; and watch, like, and comment on other videos. YouTube has four types of appraisals: likes, comments, subscribers, and views. Likes, comments, and views are for videos, and subscribers are for channels. The more YouTube users interact with a video/channel, the more prominently that video/channel is listed to other users. These appraisals are considered a measure of engagement; they not only make the entity popular but also help the channel with sponsorship opportunities and monetization options.
Twitch. Twitch is the most popular live streaming platform on the web. It is mostly used by gamers to stream their games while other users watch. Twitch has two types of appraisals: followers and views. The more views a channel has, the higher its popularity on Twitch and the more likely it is to be ranked on the featured list. Twitch is usually considered one of the most difficult platforms on which to earn quick popularity due to the large number of streamers using the platform.
TikTok. TikTok is a video sharing social network that is primarily used to create short videos of 3 to 15 seconds. TikTok has two types of appraisals: likes and followers. Getting more likes on TikTok posts increases the chance of maintaining presence and becoming famous among the audience.
Vimeo. Vimeo is a video hosting and sharing platform that allows users to upload and promote their videos with a high degree of customization that is not available on other competing platforms. Vimeo has two types of appraisals: likes and followers. A higher number of appraisals helps show the video in the suggested videos list created by Vimeo’s algorithm.
A few studies have addressed astroturfing in video streaming platforms. Shah [111] made the pioneering attempt to combat astroturfing on live streaming platforms, proposing Flock, an unsupervised method to identify botted broadcasts and their constituent botted views. Note that he did not disclose the name of the live streaming corporation on which the study was performed. In a recent study, Dutta et al. [50] proposed CollATe, an unsupervised method to detect collusive entities on YouTube. The method utilizes the metadata, temporal, and textual features of a video to detect whether it is collusive or not.
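Both Flock and CollATe are unsupervised. Without reproducing either algorithm, the sketch below illustrates the general idea of flagging outlying videos/broadcasts from engineered features using a standard anomaly detector (Isolation Forest); the synthetic feature matrix and the contamination rate are placeholders for real, precomputed features.

```python
# Generic unsupervised outlier-flagging sketch (not the actual Flock [111] or CollATe [50] methods).
import numpy as np
from sklearn.ensemble import IsolationForest
from sklearn.preprocessing import StandardScaler

# Assumed input: one row per video/broadcast with engineered metadata, temporal,
# and textual features (e.g., view velocity, comment burstiness).
rng = np.random.default_rng(0)
features = rng.normal(size=(1000, 8))            # stand-in for real feature vectors

X = StandardScaler().fit_transform(features)
detector = IsolationForest(contamination=0.05,   # assumed fraction of collusive items
                           random_state=0)
labels = detector.fit_predict(X)                 # -1 = flagged as anomalous, 1 = normal

suspicious = np.where(labels == -1)[0]
print(f"Flagged {len(suspicious)} of {len(X)} items for manual inspection")
```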
4.4 Recruitment Platforms
Recruitment platforms are employment-oriented platforms that also provide professional networking among users. These platforms offer the opportunity to discover new professionals, either locally or internationally, and help users with their professional endeavors. Hiring new employees is a crucial part of every organization, starting with posting new job ads and ending with recruitment. It can be thought of as a multistep process, which is normally quite time consuming and prone to human error. With the advent and rise of automated systems, the hiring process of an organization is increasingly done in the cloud with the help of tools such as the Applicant Tracking System (ATS).13 An ATS makes the hiring process faster and more accurate by preparing job ads, posting them online, collecting applicant resumes, communicating efficiently with applicants, and finding the best-fit resumes for the organization. However, the increasing use of ATSs also introduces various risks, such as spammers compromising job seekers' privacy, slandering the reputation of organizations, and hurting them financially by manipulating the normal functioning of the system, most frequently the job ad publishing process (often recognized as employment scam).
LinkedIn. LinkedIn is the most popular online recruitment platform where employers can post jobs and job seekers can post their profiles. There are four types of appraisals on LinkedIn: followers, recommendations, endorsements, and connections. A higher number of connections and followers on LinkedIn helps the user gain attention. LinkedIn endorsements help add validity to the user’s profile by backing up his or her work experience.
Adikari and Dutta [1] identified fake profiles on LinkedIn. The authors applied state-of-the-art supervised classifiers built on a set of profile-based features to detect fake profiles. Prieto et al. [102] detected spammers on LinkedIn based on a set of heuristics and their combinations using a supervised classifier. Another work by Vidros et al. [132] tackled the problem of Online Recruitment Fraud (ORF) (see Section 6 for more details). The authors also released a publicly available dataset of 17,880 annotated job ads (17,014 legitimate and 866 fraudulent) from various recruitment platforms.
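A common baseline for ORF detection on such a dataset is bag-of-words text classification over the job ad text. The sketch below illustrates this baseline; it is not the method of Vidros et al. [132], and the file name and column names are placeholders for however the annotated ads are stored.

```python
# Baseline ORF classification sketch (illustrative; not the pipeline of Vidros et al. [132]).
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

# Placeholder columns: "description" (job ad text), "fraudulent" (0/1 label).
ads = pd.read_csv("annotated_job_ads.csv")
X_train, X_test, y_train, y_test = train_test_split(
    ads["description"].fillna(""), ads["fraudulent"],
    test_size=0.2, stratify=ads["fraudulent"], random_state=42)

model = make_pipeline(
    TfidfVectorizer(max_features=50_000, ngram_range=(1, 2), stop_words="english"),
    LogisticRegression(max_iter=1000, class_weight="balanced"))  # classes are heavily imbalanced
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test),
                            target_names=["legitimate", "fraudulent"]))
```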
4.5 Discussion Platforms
Discussion platforms are content sharing platforms primarily used to host community-based discussions. The discussions can be in the form of questions and answers (Q&A) or content that other users have submitted using links, text posts, or images. Next, we explain how collusion happens on discussion platforms.
Quora. Quora is a Q&A-based discussion forum that empowers users to ask questions on any subject and connect with like-minded people who contribute unique insights and high-quality answers. Quora has four types of appraisals: followers, upvotes, downvotes, and comments. The more followers and upvotes a user gains for his or her answers, the higher the answers rank and the more often they are displayed to other users. The aim of a user on Quora is to be named a top writer in the long run. Answers written by top writers on Quora are considered expert opinions and are also displayed in the featured list.
ASKfm. ASKfm is another Q&A-based discussion forum and is mostly used by users to post questions anonymously. The platform has two types of appraisals: followers and likes. More likes on answers grow their rating on ASKfm at a faster rate.
Reddit. Reddit is a social news-based discussion forum that allows users to discuss and vote on content that other users have submitted. Reddit has five types of appraisals: subscribers, upvotes, downvotes, karma, and comments. Having more karma on Reddit allows the user to post more often on the platform and gives him or her more reputation. Having more upvotes on posts helps users gain more exposure, which eventually pushes the posts higher up in the targeted subreddit.
Studies on discussion platforms have examined the manipulation of the visibility of political threads on Reddit [23]. The authors measured the effect of upvote and downvote manipulation on article visibility and user engagement by comparing Reddit threads whose visibility was artificially increased. Another work by Shen and Rose [115] investigated polarized user responses to an update to Reddit's quarantine policy.
4.6 Music Sharing Platforms
Music sharing platforms enable users to upload, promote, and share audio. Here, we explain how collusion happens on music sharing platforms.
SoundCloud. SoundCloud is an audio sharing platform that connects a community of music creators, listeners, and curators. SoundCloud has three types of appraisals: plays, followers, and likes. A higher number of followers and likes creates a massive fan base for creators and gets more attention from the community. The popularity of a soundtrack on the platform is driven by the number of plays it receives. Plays allow users to popularize music and grow their brand on SoundCloud.
ReverbNation. ReverbNation is an independent music sharing platform where musicians, producers, and venues collaborate with each other. The platform has two types of appraisals: plays and fans. The more plays an audio or video track has, the higher the artist ranks on the platform.
Some studies have been conducted on fraudulent entity detection in music sharing platforms. Bruns et al. [20] investigated Twitter bots that help promote SoundCloud tracks. The authors also proposed a number of social media metrics that help identify bot-like behavior in the sharing of such content. Another work by Ross et al. [106] proposed a method to distinguish between two groups of SoundCloud accounts: bots and humans.
4.7 Development Platforms
Interestingly, we found a few popular development platforms where collusion happens.
GitHub. GitHub is a repository hosting service that provides distributed version control and source code management (SCM) functionality. GitHub has three types of appraisals: followers, stars, and forks. A user profile with more followers is considered more popular. Similarly, stars and forks are metrics that show the popularity of a repository. Blackmarket services help deliver GitHub followers, stars, and forks from real and active people.
Hacker News. Hacker News is the most popular discussion platform for developers. It has three types of appraisals: upvotes, karma, and comments. The higher the number of comments and upvotes on a post, the higher its popularity. User profiles with high karma can perform additional appraisals on a post, such as downvoting and making polls. Blackmarket services help deliver upvotes and karma on Hacker News simply by adding the post link and making the payment.
Medium. Medium is an online publishing platform that is commonly used by developers to share ideas, knowledge, and perspectives. It has two types of appraisals: followers and claps. A higher number of claps on a post ranks it higher in feeds and search results. Similarly, a higher number of followers increases the post's reach and views. Blackmarket services help deliver followers and claps by adding the post link and making the payment.
Most of the studies on development platforms concern measuring user influence and identifying unusual commit behavior by analyzing the attributes of the platforms. Hu et al. [67] measured user influence on GitHub using the network structure of the following, star, and fork relations, together with user activities. The authors also introduced an author-level H-index (also known as H-factor) to measure users' capability. Goyal et al. [59] identified unusual changes in the commit history of GitHub repositories. The authors designed an anomaly detection model based on the commit characteristics of a repository. One potential research direction is to examine how collusive appraisals on these platforms help inflate the reputation of repositories/users. Another is to develop collusive entity detection techniques by considering the hidden relations between the entities of the platform.
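For concreteness, assuming the author-level H-factor of Hu et al. [67] follows the standard H-index definition (a user has index h if h of their repositories each have at least h stars, which is our assumption rather than a statement of their exact formulation), it can be computed from a user's repository star counts as follows.

```python
def h_index(star_counts: list[int]) -> int:
    """Standard H-index: the largest h such that h items each have at least h stars.

    Assumption: the author-level H-factor of Hu et al. [67] is taken here to follow
    this standard definition, applied to a user's repository star counts.
    """
    counts = sorted(star_counts, reverse=True)
    h = 0
    for rank, stars in enumerate(counts, start=1):
        if stars >= rank:
            h = rank
        else:
            break
    return h

# A user whose repositories have these star counts has an H-factor of 3.
print(h_index([10, 5, 3, 1, 0]))   # 3
```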
4.8 Other Platforms
Other than online media, artificial boosting is also observed on other platforms. Website owners can use premium/freemium blackmarket services to get traffic to their websites. The idea is to delegate the responsibility of generating the necessary amount of web traffic to another company. In blackmarket terms, the appraisal is called hits or web traffic. Gaining artificial hits helps the website gain popularity faster than the slow process of growing organic visitors via Search Engine Optimization. Blackmarket services offer two types of web traffic: regional traffic, where customers can opt for traffic from a specific country or region, and niche traffic, where customers can opt for targeted traffic that focuses on a specific type of business (e.g., music based or e-commerce based). To activate traffic to a website, customers have to enter the URL of the website, the number of visitors required, and the preferred timespan for delivery.
Overall, it can clearly be seen that collusion happens on multiple online media platforms, and several studies have been conducted to detect collusive entities on these platforms. The most commonly studied platform is the social network, where collusive entities attempt to spread misinformation. One of the major challenges in conducting studies on other platforms is collecting data under the restrictions and limitations imposed by the APIs. Moreover, once the data are collected, labeling them by experts is a time-consuming task and requires significant manual effort. We now dive deeper into the studies that target the identification of collusive entities based on their mode of operation.