AmberCutie's Forum
An adult community for cam models and members to discuss all the things!

oh no, i was suspended… 🥀 😭

  • ** WARNING - ACF CONTAINS ADULT CONTENT **
    Only persons aged 18 or over may read or post to the forums, without regard to whether an adult actually owns the registration or parental/guardian permission. AmberCutie's Forum (ACF) is for use by adults only and contains adult content. By continuing to use this site you are confirming that you are at least 18 years of age.

faustfrost

Banhammered
Jan 22, 2026
Hello, dear @punkerbarbie

I was recently suspended from streaming (not banned), and the reason was that I was singing a HIM song called "Join Me in Death." It's a very iconic song. To make it easier to sing along, I sent it to a member, and a few seconds later I was suspended. Why is that? I emailed support, and they sent me this email.


In any case, if I broke the rule, next time I'll be more careful and won't let it happen again. I promise. I am sorry.

Please bring me back to work

with love

my nickname is Faustfrost
 

Attachments

  • IMG_2447.png
    250.1 KB · Views: 25
I have confirmed that the support team has responded to your inquiry. Ticket #25146969
 
Upvote 0
@punkerbarbie View attachment 105209 Unfortunately, CB support responded to me once (in the photos) a week ago, and no more emails have come.

Because YOU are supposed to respond to THEM that you were simply singing a song with death in the lyrics and had no intention of self-harm. Provide them a link to the YouTube video in your response so they can match it to your video feed. And then never, ever sing that song again.
 
Upvote 1
Because YOU are supposed to respond to THEM that you were simply singing a song with death in the lyrics and had no intention of self-harm. Provide them a link to the YouTube video in your response so they can match it to your video feed. And then never, ever sing that song again.
I already wrote to them 6 days ago, and they haven't responded since then. Yes, I really do want them to know that I'm fine and it's a mistake, which is why I found this forum in desperation.
 
Upvote 0
Because YOU are supposed to respond to THEM that you were simply singing a song with death in the lyrics and had no intention of self-harm. Provide them a link to the YouTube video in your response so they can match it to your video feed. And then never, ever sing that song again.
Excuse me? This is ridiculous. This song isn't about suicide, it's full of metaphors. What the hell are you talking about? This song was played on MTV in the 2000s, everyone knows this song. If the song really had a suicidal message, it would have been removed from all platforms long ago.
 
Upvote 0
...This song isn't about suicide, it's full of metaphors....
I sometimes use ChatGPT too, it's everywhere ;)...

Yes — a suspension for singing "Join Me in Death" on Chaturbate related to self-harm concerns is very likely tied to content moderation rules about harmful or distressing content, not the song itself or copyright. While there's no public Chaturbate press release about that specific incident that I could find, we can explain why a platform like Chaturbate would take action in that context based on their policies and broader industry standards.

Here’s how it works:

1. Chaturbate’s rules prohibit violent or self‑harm content

Chaturbate’s own Terms of Service and Code of Conduct prohibit content that depicts or references self‑harm, suicide, or other harmful physical acts — especially if it could be seen as encouraging, normalizing, or glamorizing those behaviors. The Terms explicitly disallow:

  • "Depictions of actual or simulated illegal and unsafe actions, including ... cutting" and other forms of physical harm. (chaturbate.com)
While the published terms don’t list every possible case, moderation guidance in industry practice shows that platforms shouldn’t expose users to content that could trigger distress or mimic encouragement of self‑harm. This applies in live chats, broadcast audio, text messages, and shared lyrics. (Samaritans)


2. "Join Me in Death" has themes tied to self-harm and suicide

The song "Join Me in Death" by HIM is widely interpreted as exploring existential despair, suicidal themes, and an invitation to die together. Even if sung casually, the lyrics contain repeated requests for the listener to "join me in death."

For many moderation systems — especially on platforms with live chat — context matters but risk is high. A phrase that can be read as encouraging death, even in a song lyric, might be flagged if it’s not clearly contextualized as music.


3. Platforms take a cautious approach to self‑harm language

Across major user‑generated platforms, anything that could be interpreted as promotion, encouragement, or acceptance of suicide/self‑harm is treated as harmful content. For instance, YouTube prohibits:

  • Content "promoting or glorifying suicide or self-harm"
  • Suicide notes or anything that could encourage self‑harm behaviors (archive.ph)
Many moderation policies — especially newer ones guided by safety best practices from organizations like Samaritans — remove or restrict content that includes wording or actions that could influence vulnerable people toward self‑injury. (Samaritans)

Chaturbate doesn’t publish all its content moderation decisions, but they use human reviewers and automated tools that flag risky emotional, violent, or self‑harm language even in live broadcasts.


4. What likely happened in this specific scenario

Here’s the most plausible sequence:

  • The model sang the song live and/or sent the lyrics in chat where viewers could see them.
  • Moderators or automated tools flagged the text because it contains wording that could be interpreted as involving suicide/self‑harm encouragement.
  • Based on the policy against depictions or references to self‑harm or unsafe actions, the account was suspended pending review or permanently.
Even if the model didn’t intend harm, moderation systems err on the side of removing harmful or ambiguous content quickly to protect users.

Because Chaturbate’s published site rules focus more on sexual safety and illegal content than on self‑harm language, the exact violation isn’t quoted in the public ToS — but industry best practices and internal moderation mandates still prohibit it.
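The flagging step in this sequence can be pictured with a toy sketch. This is purely illustrative (the phrase list, the `flag_self_harm` function, and the matching approach are assumptions, not Chaturbate's actual system), but it shows why a naive phrase matcher has no notion of artistic context:

```python
# Toy illustration of why an automated filter can flag song lyrics.
# NOT Chaturbate's real system: a minimal sketch of the keyword/phrase
# matching approach many moderation pipelines start from.
import re

# Hypothetical list of risk phrases a safety filter might watch for.
RISK_PHRASES = [
    r"\bjoin me in death\b",
    r"\bwant to die\b",
    r"\bkill (?:myself|yourself)\b",
]

def flag_self_harm(message: str) -> list[str]:
    """Return the risk phrases found in a chat message (case-insensitive)."""
    text = message.lower()
    return [p for p in RISK_PHRASES if re.search(p, text)]

# A lyric pasted into chat trips the filter even though it is just a song:
hits = flag_self_harm("Baby, join me in death")
print(hits)  # the phrase matches; the filter cannot see artistic intent
```

A real pipeline would layer classifiers and human review on top of this, but the core problem is the same: the matcher sees the words, not the song.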


5. Why they treat lyrics like this seriously

Several factors explain this:

✔ Self-harm content can "trigger" vulnerable users, especially in text form without contextual framing. (Samaritans)
✔ Live chat is unpredictable — others could interpret or react dangerously. (Samaritans)
✔ Platforms are increasingly liable for not moderating content that appears to promote self-injury. (Samaritans)

Even if the model was quoting a song, without clear contextual signals that it’s purely musical/artistic, moderation systems sometimes treat it like harmful content.


Summary

  • Chaturbate’s rules prohibit content depicting or referencing violence, self‑harm, and unsafe acts. (chaturbate.com)
  • The lyrics of "Join Me in Death" include invitations to die together, which can be interpreted as self-harm content.
  • Moderation systems don’t always parse artistic intent — especially in live chats — so such content can trigger automatic suspension.
  • Platforms prefer to avoid potential harm or legal risk, so they enforce these rules strictly.
 
Upvote 0
Excuse me? This is ridiculous. This song isn't about suicide, it's full of metaphors. What the hell are you talking about? This song was played on MTV in the 2000s, everyone knows this song. If the song really had a suicidal message, it would have been removed from all platforms long ago.
LMAO First of all, do not come at me like this, I am trying to help you. And no, a song by an emo-ass band that was popular in the early 2000s would not be removed from YouTube or other sites that stream music.

YOU, however, cannot sing those lyrics on cam. Nor would you be allowed to sing songs with the N-word.
 
Upvote 0
I sometimes use ChatGPT too, it's everywhere ;)...

HIM's "Join Me in Death" (1999), written by Ville Valo, is often misunderstood but is not literally about suicide. Valo explained it as a Romeo and Juliet story focusing on total commitment, sacrifice, and giving everything for love, heavily influenced by Blue Öyster Cult's "Don't Fear the Reaper". The lyrics use dark imagery as a metaphor for extreme devotion.
Key Details About the Song:
  • Theme: The song is a "dark love story" about escaping the cruel world together.
  • Meaning: It focuses on the idea of giving everything away for love, not the act of suicide.
  • Inspiration: Valo aimed to create a rock track inspired by Romeo and Juliet.
  • Lyrics: Despite lines like "Won't you die tonight for love?", Valo has stated it is about extreme love and passion.
 
Upvote 0
Okay, I'm willing to accept this principle, but how was I supposed to know that it's not allowed to play a song, let alone a popular song by a rock band that the whole world has heard? Where does it say that on Chaturbate?? Specifically, about playing a song and singing along?
 
Upvote 0
Oh my god, this song is played on the radio when people are driving somewhere. This music is also broadcast on TV. How was I supposed to know that this song is banned on Chaturbate? Answer me!
 
Upvote 0
LMAO First of all, do not come at me like this, I am trying to help you. And no, a song by an emo-ass band that was popular in the early 2000s would not be removed from YouTube or other sites that stream music.

YOU, however, cannot sing those lyrics on cam. Nor would you be allowed to sing songs with the N-word.
I'm glad you're trying to help, I appreciate it. But this moment that happened doesn't give them the right to tell me to go to a mental hospital because I played and sang a cult song, HIM's "Join Me In Death". For me, this is, first and foremost, an insult on their part.
 
Last edited:
Upvote 0
Okay, I'm willing to accept this principle, but how was I supposed to know that it's not allowed to play a song, let alone a popular song by a rock band that the whole world has heard? Where does it say that on Chaturbate?? Specifically, about playing a song and singing along?
Well, that's kind of the shitty thing, right? It kind of should have been obvious, given all of the laws we have to deal with, but at the same time, it's just a song, yeah? But unfortunately, that's not how the AI software works.
 
Upvote 0
I'm glad you're trying to help, I appreciate it. But this moment that happened doesn't give them the right to tell me to go to a mental hospital because I played and sang a cult song, HIM's "Join Me In Death". For me, this is, first and foremost, an insult on their part.
The thing is, a human being didn't do this to you, the AI software did. You have to be extremely careful about discussing prohibited things like death and suicide and depression and cutting and all of that. Your voice gave life to the lyrics as if you were expressing this as your own sentiment.

It doesn't matter what the band claims it means (which their statement was probably heavily PR'd to keep them on air, btw). It matters what the lyrics say.
 
  • Helpful!
Reactions: melissa_tv
Upvote 0
The thing is, a human being didn't do this to you, the AI software did. You have to be extremely careful about discussing prohibited things like death and suicide and depression and cutting and all of that. Your voice gave life to the lyrics as if you were expressing this as your own sentiment.

It doesn't matter what the band claims it means (which their statement was probably heavily PR'd to keep them on air, btw). It matters what the lyrics say.
«Your voice gave life to the lyrics as if you were expressing this as your own sentiment.» - that means I sang it well 😁

Okay, thanks for the support, but what should I do to get back?
 
  • Like
Reactions: Vixxen81
Upvote 0
«Your voice gave life to the lyrics as if you were expressing this as your own sentiment.» - that means I sang it well 😁

Okay, thanks for the support, but what should I do to get back?
Indeed it does mean that. :)

Respond to the email you received with a link to the YouTube video of the song, with a very brief explanation that you were singing the song as-is. Say that you didn't realize this was an issue and will not do it again, that you in no way want to self-harm, and that they can look through your chat logs to verify that.

Now, here's the ultra-crappy thing, it may take a long time for them to reply to you after this. You may need to work other sites in the meantime if you aren't already doing so.
 
Upvote 0
OH hey check this out, it kind of backs the point, this was in my tip notes last night.

Jan 22, 2026, 10:56 PM Tip From: were....(broken for privacy)
sexy singer
5303
 
Upvote 0
What a clownade, getting suspended for that.

And it will take them a month to check your appeal, because it's really difficult for them to read the message you got banned for.
 
  • Like
Reactions: faustfrost
Upvote 0
Huh?

@faustfrost every time you email them about this, you get sent to the back of the line. And the line is long.
I don't think this is how it works. My friend was blocked the day before yesterday; he wrote a confirmation letter about how he repented and was unblocked the same day. It's actually weird to wait for support responses in 2026. I mean, I don't like the medieval vibe, that's too much LMAO
 
Upvote 0
In any case, I want to think not only about myself; there are a lot of goths and emos on this site, and even musicians. I want them to know that they could end up in my position due to the strange AI recognition system. I will not leave this topic alone, and I will not forget about it. No one deserves such punishment. I wish the people who love music like I do would be careful, as the AI recognition system is very flawed.
 
Upvote 0
I don't think this is how it works. My friend was blocked the day before yesterday; he wrote a confirmation letter about how he repented and was unblocked the same day. It's actually weird to wait for support responses in 2026. I mean, I don't like the medieval vibe, that's too much LMAO
Yeah, that is the problem: some get replies the next day, some have to wait.
They unbanned me the next day last time; now I'm also into my second week of waiting.
 
  • Like
Reactions: faustfrost
Upvote 0
I don't think this is how it works. My friend was blocked the day before yesterday; he wrote a confirmation letter about how he repented and was unblocked the same day. It's actually weird to wait for support responses in 2026. I mean, I don't like the medieval vibe, that's too much LMAO
It is how it works. But whatever I'm done helping you.
 
Upvote 0
So, is it unfair, medieval or even ridiculous? It could be. I'm not saying it is or it is not. Like everybody else, you are entitled to your thoughts and feelings. However, I think you may be confused or maybe even angry about what you think it "should be" versus what it "actually is". From what I've read, people here talked to you about "what it actually is" and, in my opinion, they've nailed both how your suspension probably came to be and what to do exactly about it. They, just like me, probably thought that the most important matter would be to get you back on your cam making money as soon as possible. So, feelings aside, the path has been laid before you.

Regarding support, the multiple emails and the waiting time, again, that is just how it really works. Don't believe it? Take some time to read other posts in this forum or do a quick search on how support teams in any large platform handle their tickets. That should give you a better perspective. No ticket is the same as another, and there are a lot of things happening on support's side that we don't get to see.

There is no easy or kind way to say what I'm going to say next, so I'll say it as it is. Please be aware that I'm not trying to attack nor invalidate what you think and feel or trying to defend the platform. If it sounds harsh, it's because it is.

CB is a private business, not a public institution or government, so rights, fairness, injustice, etc. take on a completely different meaning. A model, any model, doesn't work "for" CB and doesn't work "with" CB (in a traditional sense at least). A model works through their platform. Regarding the latter, the moment you verify your account for streaming, you are accepting and giving consent to operate by their terms and rules. One of the most important terms is, paraphrasing, "We make the rules, we can change said rules at any time, and we reserve the right to restrict or terminate our relationship at any time, with or without cause."

Is it tough? It is. But I prefer, at any time, to be clear myself on how things really work so to know what to expect and do.

So, what's more important for you right now: lifting the suspension or making sure they know they're wrong? Just saying...
 
  • Helpful!
Reactions: SCRIPTA
Upvote 0
Okay, I'm willing to accept this principle, but how was I supposed to know that it's not allowed to include a song, let alone a popular song, and let a rock band play that song for the whole world to hear? Where does that say on Chaturbate?? Specifically, about including a song and its sing-along?
I was always curious about that, but just in case I always switch off some songs, like that Alice Cooper school one, hah. Last time it randomly started playing, I just switched it off. Or I'm freaking out when some artists sing something about sixteen or seventeen. Running faaaaaaaaaast to mute the sound.
 
  • Like
Reactions: yummybrownfox
Upvote 0
I was always curious about that, but just in case I always switch off some songs, like that Alice Cooper school one, hah. Last time it randomly started playing, I just switched it off. Or I'm freaking out when some artists sing something about sixteen or seventeen. Running faaaaaaaaaast to mute the sound.
Damn, yeah... Winger - She's Only Seventeen. I can see that potentially being a quick ban. And that was MUCH more widely popular (in the US) than anything by HIM.
 
Upvote 0