DSA Transparency Report – December 2024

Hipolink

This DSA Transparency Report covers the content moderation activities of CEBC B.V., the owner of Hipolink, under the Digital Services Act (DSA) for the period from 17 January 2024 to 31 December 2024.

Section 1. Information on the number of orders received from EU Member States' authorities

Figures are reported as totals and broken down by EU Member State (BE, DE, ES, FR, HR, IT, NL, PT, Other); all values for the reporting period are 0.

Number of orders to act against illegal content: 0
Number of orders to provide information: 0
Median time to inform the authority of the receipt of the order to act against illegal content: 0
Median time to give effect to the order to act against illegal content: 0
Moderation divided by category (unsafe and/or illegal products, violent or graphic content, illegal activities and dangerous challenges, negative effects on civic discourse or elections, scams and/or fraud): 0

Section 2. Information on the number of notices

Figures are reported as totals and for the Notice and Action mechanism (NAM), split into NAM total and Trusted Flagger notices; all values for the reporting period are 0.

Number of notices received: 0
Median time to take action on the basis of the notice: 0
Number and types of actions taken on the basis of the law: 0
Number and types of actions taken on the basis of the terms and conditions of service: 0
Number and types of alleged illegal content concerned: 0
Number of notices processed by using automated means: 0

Section 3. Information on own-initiative content moderation

Hipolink takes all possible measures to ensure that its users are satisfied with Hipolink services while remaining within the legal framework. Hipolink uses a combination of live moderation, ongoing moderation, and post-moderation, and applies two types of content moderation: automated review and human review.

In addition, any user can submit a complaint through customer support. Every complaint is followed by a detailed investigation to determine how to deal with the user in question. As soon as there is evidence of abuse, the violation of laws and regulations is stopped within 24 hours.

Hipolink also uses automated tools, including tools that detect fraud and other suspicious user activity. These tools are also deployed to identify content that may violate applicable laws and regulations, as well as Hipolink's policies. Hipolink's automated tools are continually trained and enhanced to address new and emerging threats.
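
A minimal sketch of this kind of pipeline is shown below: automated checks flag suspicious items and hand them to a human review queue, while everything else is auto-approved. It is purely illustrative; the rule names, thresholds, and the `ReviewQueue` structure are assumptions made for the example and do not describe Hipolink's actual tooling.

```python
# Illustrative only: a generic "automated detection + human review" hand-off.
# Rules, thresholds, and data structures here are hypothetical.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Item:
    item_id: str
    text: str

@dataclass
class ReviewQueue:
    pending: list = field(default_factory=list)

    def enqueue(self, item: Item, reason: str) -> None:
        # Everything placed on this queue is later reviewed by a human moderator.
        self.pending.append((item, reason))

# Hypothetical automated checks; a real system would rely on trained models and fraud signals.
RULES: list[tuple[str, Callable[[Item], bool]]] = [
    ("possible_fraud", lambda i: "free money" in i.text.lower()),
    ("possible_spam", lambda i: i.text.lower().count("http") > 5),
]

def automated_review(item: Item, queue: ReviewQueue) -> bool:
    """Return True if the item is auto-approved; otherwise route it to human review."""
    for reason, rule in RULES:
        if rule(item):
            queue.enqueue(item, reason)
            return False
    return True

queue = ReviewQueue()
print(automated_review(Item("a1", "Normal creator page"), queue))    # True: auto-approved
print(automated_review(Item("a2", "FREE MONEY! claim now"), queue))  # False: sent to moderators
```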

Persons in charge of content moderation receive comprehensive training when they join the team. This initial training is supplemented by ongoing training to keep personnel updated on all issues that require active involvement and a prompt response from Hipolink's team. A separate support team is dedicated to manual moderation and operates in accordance with our content review guidelines and policies; each member of that team is trained on our content moderation policies.

Total number of content moderation measures taken: 108134
Number of content moderation measures taken that were detected solely by automated means: 53205 (semi-automatic)

Content moderation measures taken categorised by type of restriction applied:

Type of restriction measure: number of measures taken

Permanently blocking the author page and all content: 3002
Temporary blocking of the author page: 119
Full blocking/deletion of the author's content: 515
Temporary blocking of withdrawal of funds (if suspected): 610
Temporary or permanent blocking of receiving funds: 43

Content moderation measures taken categorised by type of illegal content or violation of terms and conditions:

Type of illegal content or violation of terms and conditions: number of measures taken

6.1.8 – providing access to pornographic materials: 473
8.5, 8.6 – fraud: 290
5.2.3.8 – advertising of online casinos, casinos, slot machine halls, poker, and gambling games: 1459
5.2.3.2, 6.2.2 – copyright infringement: 118
5.2.4 – malware: 106
2.1 – the Company may make access to and use of the HIPOLINK Platform, or certain areas or features of the HIPOLINK Platform, subject to certain conditions or requirements, such as completing a verification process or meeting specific quality or eligibility criteria: 783
6.1.9 – use of the Hipolink Platform to fund escort activity: 190
5.7 – spam: 870

Section 4. Information on the number of complaints

Out-of-court dispute settlement bodies
Number of decisions submitted to out-of-court dispute settlement bodies: 0

Internal complaints mechanism
Number of complaints submitted to the internal complaints mechanism: 129
Complaints based on procedural grounds: 0
Complaints regarding the interpretation of illegality or incompatibility: 0
Number of restrictions upheld as a result of an internal complaint: 103
Number of restrictions reversed as a result of an internal complaint: 26
Median time for decisions on internal complaints: up to 12 hours

Section 5. Information on any use of automated means for the purpose of content moderation

Accuracy rate of the items processed solely by automated means: 0
Accuracy rate of the items processed partly by automated means: 53205 (semi-automatic)
Error rate of the automated means applied: 0
Safeguards applied to the use of automated means: Hipolink performs constant random sampling of the automatically approved items and sends them to moderators to ensure that the quality of automated approvals is within the acceptable range.
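
The random-sampling safeguard described above can be pictured with the short sketch below. It is an illustration only, written under assumed names and a made-up 5% sampling rate, and does not describe Hipolink's production system.

```python
# Illustrative sketch of the safeguard above: randomly sample a share of automatically
# approved items and route them to human moderators for re-checking.
# The 5% rate and all function names are assumptions made for this example.
import random

SAMPLE_RATE = 0.05  # hypothetical share of auto-approved items re-checked by humans

def sample_for_manual_review(auto_approved_ids: list[str], rate: float = SAMPLE_RATE) -> list[str]:
    """Pick a random subset of auto-approved items for human quality review."""
    if not auto_approved_ids:
        return []
    k = max(1, int(len(auto_approved_ids) * rate))
    return random.sample(auto_approved_ids, k)

def estimated_error_rate(sampled_ids: list[str], overturned_ids: set[str]) -> float:
    """Share of sampled items whose automated approval was overturned by a moderator."""
    if not sampled_ids:
        return 0.0
    return sum(1 for i in sampled_ids if i in overturned_ids) / len(sampled_ids)

# Example run with fabricated identifiers:
approved = [f"item-{n}" for n in range(1000)]
sampled = sample_for_manual_review(approved)
print(len(sampled), "items sent to moderators")
print(estimated_error_rate(sampled, overturned_ids=set()))  # 0.0 when no approvals are overturned
```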