Meta Denied Regulator’s Request to Test Rights Manager’s Effectiveness

Due to the staggering volume of content uploaded to the internet every single second, any hope that content moderation can be tackled at scale by humans alone is already dead.

The threat of so-called ‘upload filters’ under Article 17 of the EU Copyright Directive didn’t just cause citizens to be fearful of the future. Tech giants including Google, Twitter, Facebook, and Amazon all warned of a chilling effect if the law was passed. The EU passed it regardless.

The basic premise is relatively simple: content uploaded by users should not appear on online content-sharing service providers (OCSSPs) unless permission has been granted by rightsholders.

With a licensing deal in force, content uploaded by users that would otherwise be seen as infringing can be monetized, for example. When there’s no deal in place, the same content is seen in a different light; in theory, content-sharing platforms must then demonstrate that every effort was made to prevent it from appearing.

Automatic Identification of Content

To determine whether content is permitted or not, it must first be identified. French telecoms regulator Arcom, which has responsibility for suppressing piracy wherever it occurs in France, previously evaluated content identification systems used by Dailymotion, Facebook, and YouTube. To build on those 2020 findings, Arcom conducted a new consultation with rightsholders in mid-2023.

The regulator described the deployment of content recognition tools as encouraging but noted several areas for improvement. While the music and audiovisual sectors can block or monetize content with relative ease, rightsholders producing text- and image-based products (publishers and photographers) expressed dissatisfaction over the lack of tools for identifying static images.
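For illustration only (Arcom’s report doesn’t describe the platforms’ proprietary internals), a common building block for identifying static images is a perceptual hash. The sketch below uses a simple difference hash (dHash) via the Pillow library; the filenames and the 10-bit match threshold are invented for the example.

```python
# Minimal dHash sketch of how a static image can be fingerprinted and
# matched. Illustrative only; systems like Rights Manager or Content ID
# use more robust, proprietary techniques.
from PIL import Image  # pip install Pillow


def dhash(path: str, size: int = 8) -> int:
    """Shrink, grayscale, then record whether each pixel is brighter than its right neighbor."""
    img = Image.open(path).convert("L").resize((size + 1, size))
    px = list(img.getdata())
    bits = 0
    for row in range(size):
        for col in range(size):
            bits = (bits << 1) | (px[row * (size + 1) + col] > px[row * (size + 1) + col + 1])
    return bits


def distance(h1: int, h2: int) -> int:
    """Hamming distance: number of differing bits between two fingerprints."""
    return bin(h1 ^ h2).count("1")


# Hypothetical filenames: an upload matches a registered work if its
# fingerprint is close enough; the threshold is a tuning choice.
if distance(dhash("registered_work.jpg"), dhash("upload.jpg")) <= 10:
    print("Likely match: flag for the rightsholder to block or monetize")
```

A match at this stage would feed the block-or-monetize choices described above; tightening the threshold trades missed copies against false claims.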

Citing rightsholders’ reports, Arcom suggested that one of the world’s largest social platforms may not have any content detection systems at all.

“Regarding relations with service providers, the majority of respondents confirmed being aware of the tools available to them. Rights holders have noted, however, that services such as X (formerly Twitter) do not have content recognition tools or do not implement effective measures to prevent the uploading of protected works,” Arcom reported.

Commenting on Meta’s Rights Manager tool, rightsholders reported configuration issues and also “its capacity to detect content, which seems underused and not very optimal.” Both Rights Manager and YouTube Content ID are unsuitable for written content, some rightsholders complained, while TikTok’s MediaMatch arrived late and was considered ineffective.

Following that report, Arcom issued 13 recommendations which included advice for both platforms and rightsholders.

“The Authority notes with dissatisfaction the low level of involvement of the parties in implementing these recommendations. Of these, only one was followed by the rights holders,” Arcom reveals in a new report published last week.

Permission to Test Image Detection Denied

As part of its follow-up work, Arcom says that it wanted to evaluate the effectiveness of content recognition tools, particularly in respect of static images. The regulator homed in on Pinterest’s ‘Claims Portal’ tool and on Meta’s ‘Rights Manager’, which is active on both Facebook and Instagram, since both claim the ability to identify this type of content.

From the tone of its report, Arcom may have been taken a little by surprise when both Pinterest and Meta denied its request to carry out direct tests on their respective systems. Both gave the same reason for rejection: Claims Portal and Rights Manager are reserved for the use of rightsholders alone and since Arcom isn’t a rightsholder, no can do.

“Arcom regrets not having been able to benefit from immediate access to the interfaces of these tools, on the grounds that these are reserved for rights holders, a status which it does not have. Such access would, in fact, allow the Authority to carry out more objective and intensive assessments,” the regulator explains.

Under Article L331-18 of the CPI, France’s intellectual property code, Arcom has authority to assess the level of effectiveness of measures taken by OCSSPs, including their deployment and operating conditions.

“Under the assessment mission mentioned [above], the authorized and sworn agents of the Audiovisual and Digital Communication Regulatory Authority may implement proportionate methods of automated collection of publicly accessible data,” the law states.

“The Audiovisual and Digital Communication Regulatory Authority may request any useful information from service providers, rights holders and designers of protection measures.”

Arcom Resorted to Other Means

In order to evaluate Rights Manager, Arcom was forced to team up with unnamed rightsholders “benefiting from access to the tools” and conduct its technical tests through them.

“The Authority’s approach consisted of publishing, on the two services Facebook and Instagram, a selection of protected images for which it had previously obtained permission from rights holders who agreed to participate in the tests,” Arcom explains.

“Once the images were published, the Authority instructed rights holders to consult their Rights Manager interface and observe the appearance of any alerts. This process made it possible to determine whether or not the images were detected by the tool.”

Deployed on Facebook and Instagram, Rights Manager reportedly functioned differently across services and profiles.

It identified images posted in public spaces with no visibility restrictions in place, as well as images posted from private profiles on Instagram.

However, the tool did not seem to work when images were posted in public Facebook albums rather than directly to a profile’s feed. Images shared in Facebook groups, whether public or private, or posted by a Facebook user to a restricted audience (friends only), also went undetected by Rights Manager.

When restricted posts were made public, Rights Manager did not detect content immediately; in some cases, it didn’t detect content at all, even a month after the switch.

Meta informed Arcom that this was not the expected behavior of Rights Manager; the company had not previously been notified of the problem and was investigating it internally.

Resilience to Image Modification

Modifying images to circumvent detection is not uncommon, and neither is modifying images for any other reason. Arcom’s tests concluded that Rights Manager’s resilience was something of a mixed bag.

The tests showed that while the tool resists certain modifications, some simple and weak, others complex and strong, it nevertheless lets through images whose modifications barely alter human visual recognition, such as a mirror effect, saturation or posterization at moderate values, or cropping.

The failure to detect cropped images is all the more questionable since Instagram’s interface itself forces users to crop images that deviate too far from a square format before publishing. In that case, the image is no longer recognized, even when the cropping is minimal.
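To see why edits that barely change what a human perceives can still defeat detection, here is a hedged sketch building on the dHash idea above. It applies the modifications Arcom lists and measures how far each one moves a naive fingerprint; the filename and enhancement values are invented for the example.

```python
# Sketch of how the modifications Arcom tested shift a naive perceptual
# hash. Illustrative only; Rights Manager's internals are not public.
from PIL import Image, ImageEnhance, ImageOps  # pip install Pillow


def dhash(img: Image.Image, size: int = 8) -> int:
    """Difference hash: shrink, grayscale, compare adjacent pixels."""
    small = img.convert("L").resize((size + 1, size))
    px = list(small.getdata())
    return sum(
        (px[r * (size + 1) + c] > px[r * (size + 1) + c + 1]) << (r * size + c)
        for r in range(size)
        for c in range(size)
    )


original = Image.open("protected_work.jpg").convert("RGB")  # hypothetical test image
w, h = original.size

variants = {
    "mirror": ImageOps.mirror(original),                    # horizontal flip
    "saturation": ImageEnhance.Color(original).enhance(1.5),
    "posterize": ImageOps.posterize(original, 4),           # 4 bits per channel
    "square crop": original.crop((0, 0, min(w, h), min(w, h))),
}

base = dhash(original)
for name, img in variants.items():
    # A mirror flip reverses every left/right comparison, and a crop
    # shifts what each downscaled cell contains, so a naive single hash
    # can drift past any sane match threshold: one plausible reading of
    # why mirrored and cropped images slipped past Rights Manager.
    print(f"{name}: {bin(base ^ dhash(img)).count('1')}/64 bits differ")
```

Production systems typically mitigate this with multiple hashes per image, flip-invariant features, or learned embeddings; how Rights Manager handles such edits is not public.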

For those interested in the details of Arcom’s technical tests, the full report is available below (French). Results are only available for Rights Manager, however.

Pinterest’s refusal to grant access meant its system wasn’t assessed at all. Arcom couldn’t locate a rightsholder to team up with; after extensive consultation, it couldn’t find a single rightsholder actually using the tool.

Arcom’s 2024 report is available here (pdf) – (2023 report for reference here)

From: TF, for the latest news on copyright battles, piracy and more.

