The Rise, Fall, and Return of Kiwi Farms After Deplatforming Efforts

by Deplatform, April 27th, 2025

Too Long; Didn't Read

Research shows deplatforming can curb online extremism, but the Kiwi Farms case highlights how resilient and adaptable toxic communities can be, despite massive industry efforts.


Authors:

(1) Anh V. Vu, University of Cambridge, Cambridge Cybercrime Centre ([email protected]);

(2) Alice Hutchings, University of Cambridge, Cambridge Cybercrime Centre ([email protected]);

(3) Ross Anderson, University of Cambridge, and University of Edinburgh ([email protected]).

Abstract and 1 Introduction

2. Deplatforming and the Impacts

2.1. Related Work

2.2. The Kiwi Farms Disruption

3. Methods, Datasets, and Ethics, and 3.1. Forum and Imageboard Discussions

3.2. Telegram Chats and 3.3. Web Traffic and Search Trends Analytics

3.4. Tweets Made by the Online Community and 3.5. Data Licensing

3.6. Ethical Considerations

4. The Impact on Forum Activity and Traffic, and 4.1. The Impact of Major Disruptions

4.2. Platform Displacement

4.3. Traffic Fragmentation

5. The Impacts on Relevant Stakeholders and 5.1. The Community that Started the Campaign

5.2. The Industry Responses

5.3. The Forum Operators

5.4. The Forum Members

6. Tensions, Challenges, and Implications and 6.1. The Efficacy of the Disruption

6.2. Censorship versus Free Speech

6.3. The Role of Industry in Content Moderation

6.4. Policy Implications

6.5. Limitations and Future Work

7. Conclusion, Acknowledgments, and References

Appendix A.

Most studies assessing the impact of deplatforming have worked with data from social networks. Deplatforming users may reduce the activity and toxicity levels of relevant actors on Twitter [28] and Reddit [29], [30], limit the spread of conspiratorial disinformation on Facebook [31], reduce the engagement of peripheral members with hateful content [44], and minimise disinformation and extreme speech on YouTube [32]. But deplatforming has often made hate groups and individuals even more extreme, toxic, and radicalised. They may view the disruption of their platform as an attack on their shared beliefs and values, and move to even more toxic places to continue spreading their message. There are many examples: the Reddit ban of r/incels in November 2017 led to the emergence of two standalone forums, incels.is and incels.net, which then grew rapidly; users banned from Twitter and Reddit exhibit higher levels of toxicity when migrating to Gab [33]; users who migrated to their own standalone websites after being banned from r/The_Donald expressed higher levels of toxicity and radicalisation, even though their posting activity on the new platform decreased [45], [46]; the ‘Great Deplatforming’ directed users to other less regulated, more extreme platforms [47]; the activity of many right-wing users who moved to Telegram increased multi-fold after they were banned from major social media [34]; users banned from Twitter are more active on Gettr [48]; communities that migrated from Reddit to Voat can be more resilient [49]; and roughly half of QAnon users moved to Poal after the Voat shutdown [50]. Blocking can also be ineffective for technical and implementation reasons: removing Facebook content after a delay appears to have had limited impact due to the short cycle of user engagement [51].


The major limitation of focusing on social networks is that these platforms are often under the control of a single tech company, so content can be permanently removed with no effective backup and recovery. We instead examine the deplatforming of a standalone website, which involved a concerted effort on a much wider scale by a series of tech companies, including some big entities that handle a large share of Internet traffic. Such standalone communities, for instance websites and forums, may be more resilient, as the admin controls all the content, facilitating easy backups and restores. While existing studies measure changes in posting activity and the behaviour of actors when their platform is disrupted, we also provide insights into other stakeholders, such as the forum operators, the community leading the campaign, and the tech firms that attempted the takedown.


Previous work has documented the impacts of law enforcement and industry interventions on online cybercrime marketplaces [20], cryptocurrency market prices [52], DDoS-for-hire services [14], [15], the Kelihos, Zeus, and Nitol botnets [53], and the well-known click-fraud network ZeroAccess [54]; yet how effective a concerted effort by several tech firms can be in deplatforming an extreme and radicalised community remains unstudied.


2.2. The Kiwi Farms Disruption

KIWI FARMS had been growing steadily over a decade (see Figure 1) and had been under Cloudflare’s DDoS protection for some years.[2] Forum activity rose by roughly 50% during the COVID-19 lockdown starting in March 2020, presumably as people were spending more time online. Prior interventions resulted in the forum being banned from Google AdSense, and from Mastercard, Visa, and PayPal in 2016; from hundreds of VPS providers between 2014 and 2019 [55]; and from selling merchandise on the print-on-demand marketplace Redbubble in 2016. XenForo, a closed-source forum platform, revoked its licence in late 2021 [56]. DreamHost dropped its domain registration in July 2021 after a software developer who had been harassed by the site’s users killed himself; this did not disrupt the forum, as it was given 14 days to find another registrar [57]. While these interventions may have hurt its profit and loss account, they did not affect its activity overall. The only significant disruption in the forum’s history was between 22 January and 9 February 2017 (19 days), when the forum’s owner suspended it himself because his family was being harassed [58].[3]


The disruption studied in this work was started by the online community in 2022. On 5 August 2022, a forum member sent a malicious alarm to the police in London, Ontario, claiming that a Canadian trans activist had committed murders and was planning more, leading to her being swatted [23]. She and her family were then repeatedly tracked, doxxed, threatened, and generally harassed. In response, she launched a campaign on Twitter on 22 August 2022 under the hashtag #dropkiwifarms and planned a protest outside Cloudflare’s headquarters to pressure the company to deplatform the site [59]. The campaign generated significant attention and mainstream headlines, which ultimately led several tech firms to try to shut down the forum. This was the first time the forum was completely inaccessible for an extended period due to an external action, with no activity anywhere online, including on the dark web. It attempted to recover twice, but even when it eventually came back online, its overall activity was roughly halved.


The majority of actions taken to disrupt the forum occurred within the first two months of the campaign. Most were widely covered in the media and can be checked against public statements made by the industry and the forum admins’ announcements (see Figure 2). The forum came under a large DDoS attack on 23 August 2022, one day after the campaign started. It was then unavailable from 27 to 28 August 2022 due to ISP blackholing. Cloudflare terminated its DDoS prevention service on 3 September 2022 – just 12 days after the Twitter campaign started – due to an “unprecedented emergency and immediate threat to human life” [24]. The forum was still supported by DDoS-Guard (a Russian competitor to Cloudflare), but that firm also suspended service on 5 September 2022 [25]. The forum remained active on the dark web, but this .onion site soon became inaccessible too. On 6 September 2022, hCaptcha dropped support; the forum was removed from the Internet Archive on the same day [60]. This left it under DiamWall’s DDoS protection and hosted on VanwaTech, a hosting provider that describes itself as neutral and non-censored [61]. On 15 September 2022, DiamWall terminated its protection [26] and the ‘.top’ domain provider also stopped support [27]. The forum was completely down from 19 to 26 September 2022 and from 23 to 29 October 2022. From 23 October 2022 onwards, several ISPs intermittently rejected announcements or blackholed routes to the forum for violations of their acceptable use policies, including Voxility and Tier-1 providers such as Lumen, Arelion, GTT, and Zayo. This is remarkable, as there are only about 15 Tier-1 ISPs in the world. The forum admin devoted extensive effort to maintaining the infrastructure, fixing bugs, and providing guidance to users in response to password breaches. Eventually, by routing through other ISPs, KIWI FARMS was able to get back online on the clearnet and remain stable, particularly following its second recovery in October 2022.


Table 1: Complete snapshots of public posts on KIWI FARMS and its primary competitor LOLCOW FARM until 31 Dec 2022.
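The route blackholing mentioned above is commonly signalled with BGP communities: RFC 7999 defines a well-known BLACKHOLE community (65535:666) that a network can attach to a prefix announcement so that peers honouring it discard traffic to that prefix rather than forward it. The sketch below is purely illustrative of that filtering decision (the function and prefix values are hypothetical, not drawn from the paper or from any router implementation):

```python
# Illustrative sketch of the RFC 7999 blackhole decision at a BGP peer.
# The function name and example prefixes are hypothetical.

BLACKHOLE = (65535, 666)  # well-known BLACKHOLE community, RFC 7999


def forwarding_action(prefix: str, communities: set[tuple[int, int]]) -> str:
    """Decide what a peer does with a received route announcement.

    If the announcement carries the BLACKHOLE community, traffic to the
    prefix is discarded (e.g. routed to a null interface); otherwise the
    route is installed and traffic is forwarded normally.
    """
    if BLACKHOLE in communities:
        return "discard"
    return "forward"


# A provider wanting a site unreachable tags its prefix with the
# blackhole community; cooperating peers then drop traffic to it.
print(forwarding_action("198.51.100.0/24", {BLACKHOLE}))  # discard
print(forwarding_action("203.0.113.0/24", set()))         # forward
```

In practice this is a cooperative mechanism: each upstream decides independently whether to honour the community, which is consistent with the intermittent, per-ISP nature of the blackholing described above.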


This paper is available on arxiv under CC BY 4.0 DEED license.


[2] Cloudflare’s service tries to detect suspicious patterns and drop malicious ones, only letting legitimate requests through.


[3] Minor suspensions observed in our forum dataset occurred on 2 Feb 2013, 24 Jan 2016, 29 Sep 2017, and 11 Jan 2021, though without any clear reason.

