France’s Influencer Law “V2”: what the Delaporte–Vojetta report recommends

A new report delivered by the two French MPs behind the first influencer marketing law in France (2023) recommends new legislation that goes well beyond the scope of brand collaborations. It covers consumer protection, minors’ safety, and platform accountability.

France was one of the first European countries to adopt a dedicated legal framework for influencer marketing. The 2023 Influencer Law marked a turning point: influencer marketing was no longer treated as a grey area, but as a regulated economic activity subject to consumer protection rules.

Lawmakers are now preparing what is widely referred to as the Influencer Law “V2” or “Version 2”, a second step designed to address gaps in the 2023 framework and respond to new risks and business models that have emerged since.

A report delivered by French MPs Arthur Delaporte and Stéphane Vojetta provides a clear view of the issues likely to shape this upcoming legislative update. The 78 (!) proposals focus on consumer protection, minors’ safety, platform accountability, and the professional structuring of the influencer industry.

Many of my European counterparts are asking me what is coming next in France, so I have summarized the main insights below around eight axes.

1- Donation-based live streams and gamification

One of the report’s strongest priorities is the regulation of donation-based live streaming, especially on platforms such as TikTok Live.

Certain live formats have developed into systems combining financial incentives, competitive rankings, emotional pressure, and continuous calls for viewers to spend money. The report highlights how these mechanisms can resemble aggressive or misleading commercial practices, particularly when minors or vulnerable audiences are involved.

The direction proposed in the report is to introduce stricter rules on live monetisation, stronger safeguards against excessive spending, and a clearer legal approach to gamified donation mechanics.

2- Stronger protection of minors, beyond gambling

The 2023 law already introduced an important principle: minors should not be targeted by influencer promotions for certain restricted products, such as gambling.

The report argues for a broader approach. It calls for extending this exclusion to all promotions involving any product or service legally prohibited to under-18s. The aim is to create a simpler, more consistent, and more enforceable standard, aligned with the reality of young audiences’ exposure online.

3- Age verification and a unified framework

A major enforcement issue remains the practical ability to identify and protect minors online. The report therefore insists on strengthening age verification mechanisms through a unified framework and tools comparable to a “Digital Age Pass”.

The objective is to move away from inconsistent, platform-specific approaches and toward a standardised system that makes legal obligations enforceable in practice.

4- Closing loopholes: coaching, health claims, fundraising, and social commerce

The report targets several areas where harmful or misleading practices have become increasingly common.

A key focus is online training and coaching programs promoted by influencers. The report supports a stricter regime, potentially including a prior authorisation system inspired by the German model, to reduce scams and abusive “business training” schemes.

It also highlights the need for tougher rules on health-related claims and the promotion of pseudo-medical content, which can expose consumers to significant risks.

Another area of concern is the growing use of influencer reach to promote online fundraising campaigns. The report calls for stronger transparency on where money goes and who ultimately benefits.

Finally, the report anticipates the rapid growth of social commerce, particularly integrated marketplaces such as TikTok Shop. This evolution is expected to raise new compliance issues, as influencer marketing becomes increasingly intertwined with direct sales and product distribution.

5- Adult-content platforms and legal ambiguity

The report addresses the promotion of adult-content platforms and supports banning influencer marketing for such platforms. It also calls for strengthening the legal framework against practices described as “digital pimping”.

This includes stronger restrictions against aggressive recruitment, misleading redirections, and monetisation practices that exploit vulnerable individuals. The report also mentions concerns linked to AI-driven sexual deepfake tools and related abuses.

6- Structuring the influencer industry as a profession

Beyond enforcement and prohibitions, the report reflects a broader shift. Influencer marketing is no longer viewed solely as a space to regulate through bans. It is also treated as an industry that needs professional structure.

The report supports the creation of a representative federation for the sector, a point open to interpretation since several professional organisations already exist, such as UMICC in France and EIMA at the European level. It also supports stronger regulation of talent agents and intermediaries, and suggests moving toward mandatory registration and clearer contractual standards to improve accountability across the ecosystem.

7- Taxation and economic transparency

The report also focuses on economic transparency and taxation, particularly around monetisation streams that remain difficult to track.

This includes affiliate revenue, gifts and benefits in kind, and income generated through donation-based systems. The report supports stronger data-sharing obligations for platforms, aligned with existing European transparency mechanisms such as DAC7.

8- Enforcement and the powers of authorities

A central message of the report is that regulation only works if it can be enforced.

The report calls for stronger operational powers for French authorities such as the DGCCRF (consumer protection), Arcom (media and online regulation), and the AMF (financial markets). This includes improved monitoring capabilities, possible web scraping powers, stronger blocking mechanisms under judicial oversight, and the creation of a single reporting portal.

Conclusion: a structural upgrade rather than a simple update

The Delaporte–Vojetta report suggests that France is moving beyond a transparency-driven approach and toward a more structural model. The focus is on emerging monetisation formats, minors’ protection, platform accountability, and effective enforcement.

For creators, brands, agencies, and platforms, the message is clear. The French regulatory approach is not slowing down. It is becoming more comprehensive, more operational, and more ambitious.

The full report (in French): https://presse.economie.gouv.fr/rapport-influence-et-reseaux-sociaux-face-aux-nouveaux-defis-structurer-la-filiere-de-la-creation-outiller-letat-et-mieux-proteger/

Below are the 78 proposals of the Delaporte–Vojetta report (grouped by the report’s structure)

1. BETTER REGULATING NEW AREAS OF INFLUENCE IN THE DIGITAL AGE

1.1. Building on the positive results of the first Influencer Law (2023)

  1. Update the Influencer Marketing Code of Conduct guide every year.
  2. Publish the reference framework mentioned in Article IV (VII) of the 9 June 2023 law.
  3. Adopt the implementing decrees of the 19 October 2020 law and remove the “main subject” condition.

1.2. Going further to curb abuses and better protect consumers

  1. Prosecute certain live or recorded content that relies on direct or indirect financial mechanisms (notably on TikTok Live) as aggressive and misleading commercial practices under French consumer law.
  2. Continue efforts at EU level to require platforms to implement user protection tools against excessive spending and to make live streams featuring financial mechanisms inaccessible to minors, including in “view-only” mode.
  3. Launch a large-scale audit operation on the declaration of user donations on platforms and the payment of tax by beneficiary influencers.
  4. Ask the DGCCRF to examine the compliance of “free-to-play” video game platforms’ commercial practices.
  5. Extend the mechanism in Article 4 of the 2023 law (forcing influencers to exclude minors from their audience) to all promotions of products/services prohibited to under-18s.
  6. Consider reforming the Digital Services Tax by adding a progressive component based on advertising revenue generated in the country.
  7. With health professional bodies, the Ministry of Health and health creators (who are licensed health professionals), create a training module and addendum to the influencer code of conduct.
  8. Strengthen controls on healthcare structures and pseudo-health professionals promoted on social networks.
  9. Launch a national prevention campaign on risks linked to illegal health claims.
  10. Following the German model, introduce a prior authorisation system for the sale or promotion of distance-learning programs by influencers covered by the 2023 law.
  11. Introduce a transparency obligation on the beneficiaries of online fundraising pools and the amounts transferred.
  12. Given serious dysfunctions on TikTok Shop, request a temporary suspension under Article 6-3 of the LCEN and demand compliance with consumer law.
  13. Map all “marketplace” platforms that may pose consumer safety risks and issue necessary injunctions.

1.3. Adult on-demand content creators: a rapidly expanding risk area

  1. Ban the promotion of profiles, content or training related to adult on-demand content platforms within influencer marketing activity.
  2. Consider creating a criminal co-liability regime for online platforms that profit from the exploitation of the prostitution system.
  3. Ban solicitation and recruitment of adult content creators, with aggravated circumstances when minors are involved.
  4. Ban link-based redirection from a website/platform to an adult content platform to prevent “harpooning” strategies.
  5. Launch a review of the social responsibility of creators and agencies employing “chatters” in low-income countries, and consider applying the EU due diligence directive to platforms where human rights are not respected.
  6. Prohibit nudification functionalities, often AI-based, enabling sexual deepfakes.

1.4. Better tackling the promotion of violent “masculinist” ideology online

  1. Extend Article 24 of the 1881 press law (incitement to hatred/violence based on sex) to the promotion of hateful and violent content covered by the same article.
  2. Consider classifying certain masculinist acts by radical groups promoting violence as terrorism and glorification of terrorism.
  3. Build on proposals from the bill on comprehensive action against sexist/sexual violence, notably on consent for sexual image/video distribution and on creating an offence of online sexist/sexual insult.

2. CONTINUE STRUCTURING THE INFLUENCE AND CONTENT CREATION SECTOR

2.1. The challenge of a sustainable economic model for content creation

  1. Continue scaling up the CNC’s support fund for content creation on social platforms.
  2. Produce an influencer charter for the public sector to ensure public influencer campaigns are transparent and aligned with ethical and republican values.
  3. Create a public transparency register for all public-sector influencer partnerships, listing both contracted influencers and posts published as part of the partnership.
  4. Reform the Digital Services Tax to expand it to all platforms and increase its amount.
  5. Encourage platforms to allocate direct funding to original, independent online content, either from their own revenues or by mutualising part of ad revenue shared with creators.

2.2. Ensuring fair tax contribution from all influencers

  1. Add a reference to Article 110-1 of the French Commercial Code for influencers covered by Article 1 of the 2023 law when the activity is habitual, to simplify and unify tax/accounting declarations.
  2. Ensure compliance of platforms connecting creators with companies, and ensure influencers and economic actors (restaurants, hotels, etc.) properly declare in-kind contributions at fair value.
  3. Audit creators’ income declarations regarding correct reporting of affiliate link profits.
  4. Include business structuring and tax obligations in existing and future influencer certifications (including ARPP’s Responsible Influence Certificate).
  5. Advocate at EU level for extending DPI-MCAA and DAC7 to influencer marketing activity and to those earning revenue from providing and sharing videos.

2.3. Strengthening self-regulation and the organisation of the profession

  1. Create an influencer federation with representative and union bodies whose rights and obligations are enshrined in law.
  2. Require talent agents to register in a public register and obtain an authorisation to operate, including criminal background checks.
  3. Require greater transparency in contracts between creators and registered agencies, fight confiscatory commission thresholds, and significantly strengthen contract/practice controls to combat abuse.
  4. Ban platform-sponsored referral or listing systems for agents to avoid any confusion in the representation of interests, in the spirit of Article 7(II) of the 2023 law.
  5. Launch targeted inspections of agencies specialised in live commercial promotions on TikTok, focusing on labour law and tax compliance (live-stream agencies).
  6. Explore integrating content-creation careers into higher education curricula.
  7. Extend the 24 August 2025 decree protecting home address information in company registers to creators performing influencer marketing activity under the 2023 law.
  8. At EU level, work with platforms to run information campaigns on creators’ rights and create an independent mediator to settle disputes between creators and platforms.
  9. Ensure platforms comply with national and EU copyright law by concluding contracts with collecting societies, particularly TikTok.
  10. Create a European, or failing that French, catalogue of creators’ copyrighted works to prevent third parties from publishing content without authorisation, inspired by prohibited-image repositories.
  11. Require content-sharing platforms to integrate creators’ files into automatic recognition systems (perceptual hashing, Content ID, etc.) to prevent unauthorised reappearance or duplication.
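
To illustrate what the last proposal above refers to, here is a minimal sketch of perceptual hashing in Python, using the widely available Pillow imaging library. It shows an “average hash”: the image is reduced to a 64-bit fingerprint, and near-duplicates are flagged by counting differing bits. This is a simplified illustration of the general technique, not the actual systems (such as Content ID) the report mentions; the file paths and the distance threshold are hypothetical examples.

# Minimal "average hash" sketch, a simplified form of perceptual hashing.
# Requires Pillow (pip install Pillow). File paths are hypothetical examples.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Reduce an image to a 64-bit fingerprint: 8x8 grayscale, thresholded at the mean."""
    img = Image.open(path).convert("L").resize((size, size))  # grayscale + downscale
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= mean else 0)  # one bit per pixel vs. the mean
    return bits

def hamming_distance(h1: int, h2: int) -> int:
    """Count differing bits; a small distance suggests a near-duplicate image."""
    return bin(h1 ^ h2).count("1")

# Usage: flag a re-upload as a likely copy of a registered creator's file.
registered = average_hash("creator_original.jpg")  # hypothetical path
candidate = average_hash("suspected_copy.jpg")     # hypothetical path
if hamming_distance(registered, candidate) <= 10:  # illustrative threshold
    print("Likely duplicate: route to review or takedown workflow")

Production systems rely on far more robust fingerprints (DCT-based image hashes, audio and video fingerprinting), but the matching principle is the same: compare compact signatures rather than the files themselves.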

2.4. Supporting information content creators in a fragmented space

  1. Encourage adapting the Journalism Trust Initiative (JTI) model, based on technical certification and independence, to enable labelling of information content creators.
  2. Review press subsidy eligibility criteria to include information content creators more broadly.
  3. Explore a fact-checking process supported by an international consortium similar to the Content Authenticity Initiative (CAI).
  4. With mental health professionals, ecosystem companies and the European Commission, create a mandatory professional and technical framework for AI companionship systems, including technical limits (max conversation duration, tone, restrictions for minors).

3. STRENGTHENING THE PUBLIC RESPONSE AND BETTER EQUIPPING THE STATE

3.1. Strengthening public authorities’ resources and improving citizens’ access to their rights

  1. Significantly increase DGCCRF resources dedicated to online consumer protection.
  2. Immediately and significantly increase ARCOM resources so it can fulfil its platform oversight mission, allocating at least 40 FTEs to these tasks.
  3. Authorise ARCOM to use scraping under strict conditions; ensure prosecutors transmit information to ARCOM; authorise ARCOM to use undercover identities.
  4. In case of serious offences on platforms not covered by the DSA or not represented in Europe, use emergency administrative court proceedings (référé-liberté) to act urgently under “public order disturbances”.
  5. Allow the AMF automated access (scraping) to publicly accessible platform content for investigations; expand undercover identity use; create an AMF administrative blocking procedure.
  6. Increase AMF resources dedicated to digital oversight.
  7. Create a single national portal for reporting “digital disorders”, directly attached to the Prime Minister’s services, and usable to route reports to trusted flaggers.
  8. Create a shared proactive monitoring system for online trends and unlawful content, attached to the Prime Minister, accessible to ministries and agencies.
  9. Entrust PEReN (or by delegation) with implementing scraping tools for authorised agencies, to mutualise efforts and resources.
  10. Improve information-sharing between public, civil society and private actors by restructuring and expanding the Permanent Contact Group (or reforming the Online Hate Observatory).
  11. Consider granting all regulators involved in monitoring illegal conduct/offers on platforms stronger administrative blocking powers under judicial oversight, similar to the gambling regulator (ANJ).
  12. Amend Article L.621-13-5 of the Monetary and Financial Code to explicitly allow referral to the judicial court to order “online platforms” to stop promoting illegal offers.

3.2. Strengthening the criminal justice response and securing the legal framework

  1. Clarify the “platform offence” in Article 323-3-2 of the Criminal Code to specify it includes dissemination of manifestly illegal content.
  2. Create a judicial public interest agreement mechanism with platforms to achieve compliance, combined with a fine.
  3. Enable joint jurisdiction referral in cybercrime cases, combining expertise from specialised Paris prosecutors with local specialised prosecutors and any judicial court.
  4. Continue strengthening staffing, training and associated resources (engineers, IT specialists) for justice, police and gendarmerie in cybercrime enforcement.
  5. Define the concept of “harm” in Article 6-3 of the LCEN and restore the possibility of interim emergency proceedings for online hate offences.

3.3. Ensuring the effectiveness of the European framework and continuing digital regulation

  1. Require platforms to overhaul moderation systems by dedicating a significant share of turnover to recruiting sufficient moderators.
  2. Specify the maximum takedown deadline after trusted flaggers report illegal content.
  3. Secure trusted flaggers’ funding through a public redistribution mechanism of a portion of platforms’ taxed revenues into a public endowment fund, to avoid conflicts of interest and ensure visibility for associations.
  4. Experimentally grant trusted flagger status to a consumer protection body and an intellectual property defence body to strengthen action against counterfeiting and consumer rights violations.
  5. Ensure the EU Digital Fairness Act becomes a shared political priority across French administrations, and build a coalition of interests to accelerate the entry into force of harmonised consumer protection standards.
  6. In the drafting of the Digital Fairness Act, ensure the text does not dilute the transparency obligation, and that contextual appreciation does not undermine mandatory explicit disclosure of commercial intent.
  7. Harmonise the French government’s position with that of the European Parliament on a strict ban on social networks for under-13s and supervised access via parental control between ages 13 and 16.
  8. Raise family awareness (prevention and addiction campaigns) on screen risks; democratise parental control tools and make them enabled by default on all devices sold in France for under-16s, including training teachers and sanctions for manufacturers who fail to preinstall them.
  9. Launch as soon as possible a “Digital Age Pass” via FranceConnect linked to the eIDAS digital wallet, allowing official recognition of “age ≥ 18” (or “age ≥ 16”) attributes.
  10. Require smartphone manufacturers to integrate age-verification interfaces into app stores at device level so that a device declared “used by a minor” automatically blocks access to risky apps and websites.
  11. Create a single mandatory age-verification framework for all digital platforms (including social networks) regardless of size; in the meantime, extend ARCOM’s October 2024 framework.
