Imagine coming across a brutal video on X, showing someone's last moments. Shocking, isn't it? Now imagine that this person is someone close to you, and you have to fill in a form to request the video's deletion.
Since February 2025, this has been a reality on X. The platform run by Elon Musk has introduced a new policy known as the "Moment of Death Form". The measure has sparked lively debate and raised many questions: where does freedom of expression end? Who decides what stays online? This article explores this controversial decision, its implications for users, and the future of digital communication.
What is the Moment of Death Form on X?
The "Moment of Death Form" is not just a catchy name but a rule integrated into X's violent content policy as of February 18, 2025. It allows the relatives or legal representatives of a deceased person to report a video showing that person's last moments. The process is not automatic, however: for a request to be considered, X requires applicants to provide concrete evidence, such as a death certificate.
X's stated aim is to strike a balance between the dignity of the deceased and what the platform calls a "robust public registry". In practice, this means that if a video is deemed "of historical interest" or "of significant topical interest", it can remain online even if the family objects.
This policy raises a fundamental question: can a technology platform really reconcile ethics and total transparency?
Why has X adopted this new rule?
To understand this update, we need to look back at previous events. In 2024, X refused to remove a video of a violent attack in Sydney, despite requests from the Australian authorities. The platform cited freedom of expression as its main argument. Although the video did not show any deaths, it remained accessible, sparking a worldwide debate.
Later, it was discovered that an individual responsible for a triple murder in the UK had viewed this video before carrying out his act. Coincidence or catalyst? Hard to say, but this event certainly prompted X to adjust its rules, while maintaining its fundamental principles.
According to an excerpt from their Violent Content Policy: "X values a robust public registry, especially for landmark historical or current events." This statement shows that X prefers a systematic approach, with clear forms and criteria, rather than immediate deletion.
A complex administrative process
Filling out a form to request the deletion of a video showing the death of a loved one may seem cold and bureaucratic. Yet that is exactly what X requires. Users must prove their connection to the victim, provide official documents, and wait for the platform to accept or refuse the request. If moderators deem the content "relevant", it remains online.
For many, this complexity seems deliberate. By imposing a structured process, X avoids mass deletions and retains room for manoeuvre. But for users, it can feel like yet another ordeal in an already painful time. A startup specializing in online reputation management could have its work cut out helping these families navigate this administrative process.
Freedom of expression versus human dignity
The conflict at the heart of this measure is philosophical: on the one hand, there's the freedom of expression so dear to Elon Musk and his team, and on the other, the right to human dignity and privacy, especially after death. X seems to favor the first option, but not without certain concessions.
The broad outlines of this policy are as follows:
- Videos deemed "historic" or "of significant topical interest" are given priority.
- Only direct relatives or legal representatives may request deletion.
- The process rests on a subjective assessment by X.
Not everyone is happy with this compromise. Some accuse X of hypocrisy: claiming to defend freedom of expression while imposing barriers on bereaved families.
What's the impact for tech companies?
X's decision sends a strong signal to startups and technology companies. Managing violent content is becoming a strategic issue. Here are the potential consequences:
- Brand perception: a platform that keeps shocking videos online may put off some users.
- Regulatory control: governments may tighten their laws in response.
- Innovation: moderation AIs will have to become more sophisticated to identify such sensitive content.
Companies may need to rethink their own policies to meet these new expectations. A startup specializing in AI could, for instance, develop tools capable of detecting these "moments of death" automatically, yet with more human sensitivity.
Users at the heart of the debate
For users, the debate surrounding the Moment of Death Form is deeply divisive. Some support X, arguing that total transparency is essential in a connected world. Others feel that this approach is too cold and lacking in human sensitivity.
"If a video can inform or educate, it should stay, no matter who requests it." - Anonymous user on X.
But for every voice defending this logic, there is another decrying its indecency. The debate reflects a broader tension in digital society: how far can we expose reality, even at its most raw?
An uncertain future for X and its policies
With this update, X has opened a veritable Pandora's box. The next steps will depend on the reactions of users, regulators, and competitors. If governments get involved, as Australia's did in 2024, X may have to revise its policy. And if other platforms adopt stricter approaches, X risks losing market share.
For entrepreneurs and marketing professionals alike, it's essential to keep a close eye on these developments. This story shows how a simple rule can shake up a platform's strategy, influence public opinion and redefine digital standards.
At ValueYourNetwork, we understand the importance of reacting quickly to changes in digital policies and of adapting communication strategies to protect and enhance our clients' brand image. Since 2016, we have been supporting companies with successful influencer marketing campaigns and crisis-management expertise on social networks.