Need help now? clickandstop.ch
WTF: Gülsha sparks attention – and awareness

OnlyFakes

Artificially created closeness with real consequences

Trigger warning: This page addresses digital sexualized violence.

Real or not?

Imagine this: You come across Gülsha Adilji – or even yourself – on a platform like OnlyFans. The face is familiar, yet not a single image is real.

With this deceptively realistic mock-up account, we show what happens online every single day – often without the person’s knowledge. We want to highlight how easily intimate deepfakes can be created today, and how quickly they can destroy someone’s reputation, privacy, and sense of security. At the same time, we present possible ways to protect yourself.

Digital Fake

The topless photos were created by AI artist Basil Stücheli using artificial intelligence.

Here, Basil shows how a deceptively realistic fake photo is created from the original image from the Ringier archive by Geri Born:

What does sextortion do to us?

Shock Effect

You start doubting yourself and your control over your image.

Power Imbalance

Perpetrators have something you can’t stop – and use it to threaten you.

Extortion Attempt

Many pay to have the images taken down – often in vain.

Isolation

Out of fear and shame, victims withdraw instead of seeking help.

Our goal:

Expose to Educate

Uncover fakes before they cause harm.

Raise Awareness and Empower

Support those affected and reveal perpetrators’ tactics.

Act, Don’t Look Away

Make knowledge, tools, and help accessible.

Strengthen Awareness

So that digital closeness connects rather than harms.

#OnlyFacts

One third of all young people have already encountered sextortion-like content.

(Source: clickandstop.ch, 2024, CH context)

80% of victims do not come forward due to fear/shame.

(Source: Europol, IOCTA 2023, EU context)

AI can create deceptively realistic fake nudes from a single selfie.

(Source: Sensity AI, 2023, “State of Deepfakes”)

98% of all deepfake videos on the internet are pornographic.

(Source: Sensity AI, 2023, “State of Deepfakes”)

#ClickWithCare

What you can do NOW
If it happens to you:

- Don't pay – giving in will only make the situation worse.
- Get help immediately at clickandstop.ch – confidential, anonymous, and professional.
- Preserve evidence, but don't store illegal content (document chats, user names, payments).
- Block the perpetrator – but only after securing evidence.
- Report to the police – acting quickly is crucial.

Prevent:

- Check privacy settings and set profiles to private.
- Use strong passwords and 2FA (two-factor authentication).
- Don’t send intimate images to strangers – not even to seemingly trusted people.
- Do not open forwarded links/files from unknown senders.
- Use reverse image search (e.g. Google Lens).
- Don't store/share or forward your own sensitive content.
- If unsure, talk to a trusted person or seek professional advice.


#NoShameZone
You're not alone

3 tips from blogger Anja SchuSchu:

Go to the police

Go to the police immediately and report the perpetrator.

Speak up right away

Talk about it immediately – not 11 years later, like I did.

Be mindful with photos

A single picture may seem harmless, but be aware of what other photos you publish.

🎥 Anja's personal story:

147.ch – The emergency helpline for young people, anonymous and available immediately.
Non-judgmental help & advice via phone or chat.

Welcome to WHAT THE FAKE — learn how to protect yourself on clickandstop.ch.
In cooperation with Child Protection Switzerland & clickandstop.ch

Important note: This is a staged campaign account, not a real one.
The campaign is supported by Kinderschutz Schweiz and Ringier. Gülsha Adilji is aware of and supports the initiative.