<?xml version="1.0" encoding="utf-8"?><!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Publishing DTD v1.1 20151215//EN" "http://jats.nlm.nih.gov/publishing/1.1/JATS-journalpublishing1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:mml="http://www.w3.org/1998/Math/MathML" dtd-version="1.1" specific-use="sps-1.9" article-type="research-article" xml:lang="en">
    <front>
        <journal-meta>
            <journal-id journal-id-type="publisher-id">rdp</journal-id>
            <journal-title-group>
                <journal-title>Revista Direito Público</journal-title>
                <abbrev-journal-title abbrev-type="publisher">Rev. Dir. Publico</abbrev-journal-title>
            </journal-title-group>
            <issn pub-type="epub">2236-1766</issn>
            <publisher>
                <publisher-name>Instituto Brasileiro de Ensino, Desenvolvimento e Pesquisa</publisher-name>
            </publisher>
        </journal-meta>
        <article-meta>
            <article-id pub-id-type="doi">10.11117/rdp.v18i99.6057</article-id>
            <article-categories>
                <subj-group subj-group-type="heading">
                    <subject>Assunto Especial</subject>
                </subj-group>
            </article-categories>
            <title-group>
                <article-title>Don’t Shoot the Message: Regulating Disinformation Beyond Content<xref ref-type="fn" rid="fn01">1</xref></article-title>
                <trans-title-group xml:lang="pt">
                    <trans-title>Não Dispare a Mensagem: Regulamentando a Desinformação Além do Conteúdo</trans-title>
                </trans-title-group>
            </title-group>
            <contrib-group>
                <contrib contrib-type="author">
                    <contrib-id contrib-id-type="orcid">0000-0001-6454-2616</contrib-id>
                    <name>
                        <surname>KELLER</surname>
                        <given-names>CLARA IGLESIAS</given-names>
                    </name>
                    <xref ref-type="aff" rid="aff01"/>
                    <xref ref-type="fn" rid="fn112"/>
                    <xref ref-type="corresp" rid="c01"/>
                </contrib>
            </contrib-group>
            <aff id="aff01">
                <institution content-type="orgname">Leibniz Institute for Media Research</institution>
                <institution content-type="orgdiv1">Hans-Bredow-Institute</institution>
                <addr-line>
                    <named-content content-type="city">Hamburgo</named-content>
                </addr-line>
                <country country="DE">Alemanha</country>
                <institution content-type="original">Leibniz Institute for Media Research | Hans-Bredow-Institute (HBI). Hamburgo, Alemanha.</institution>
            </aff>
            <author-notes>
                <fn fn-type="other" id="fn112">
                    <label>Clara Iglesias Keller</label>
<p>Coordenadora do Digital Disinformation Hub no Leibniz Institute for Media Research | Hans-Bredow-Institute e Pesquisadora Sênior em Políticas da Digitalização no WZB Berlin Social Science Center. Doutora e Mestre em Direito Público pela Universidade do Estado do Rio de Janeiro – UERJ e LL.M em Direito da Tecnologia da Informação e da Mídia pela London School of Economics and Political Science. Autora dos livros “Regulação Nacional de Serviços na Internet: exceção, legitimidade e o papel do Estado” (LumenJuris, 2019) e “Media Law in Brazil” (International Encyclopaedia of Laws, Forthcoming).</p>
                </fn>
                <corresp id="c01">E-mail: <email>c.keller@leibniz-hbi.de</email>
                </corresp>
            </author-notes>
            <pub-date publication-format="electronic" date-type="pub">
                <day>0</day>
                <month>0</month>
                <year>2023</year>
            </pub-date>
            <pub-date publication-format="electronic" date-type="collection">
                <season>Jul-Sep</season>
                <year>2021</year>
            </pub-date>
            <volume>18</volume>
            <issue>99</issue>
            <fpage>496</fpage>
            <lpage>525</lpage>
            <permissions>
                <license license-type="open-access" xlink:href="http://creativecommons.org/licenses/by-nc/4.0/" xml:lang="en">
                    <license-p>This is an Open Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License which permits unrestricted non-commercial use, distribution, and reproduction in any medium provided the original work is properly cited.</license-p>
                </license>
            </permissions>
            <abstract>
                <title>ABSTRACT</title>
<p>This paper approaches regulatory strategies against disinformation with two main goals: (i) exploring the policies recently implemented in different legal contexts to provide insight into both the risks they pose to free speech and their potential to address the rationales that motivated them, and (ii) to do so by bridging policy debates and recent social and communications studies findings on disinformation. An interdisciplinary theoretical framework informs both the paper’s scope (anchored on understandings of <italic>regulatory strategies</italic> and of <italic>disinformation</italic>) and the analysis of the legitimate motivations for states to establish statutory regulation that aims at disinformation. Departing from this analysis, I suggest an organisation of recently implemented and proposed policies into three groups based on their regulatory target: content, data, and structure. Combining the analysis of these three types of policies with the theoretical framework, I will argue that, in the realm of statutory regulation, state action is better off targeted at data or structure, as aiming at content represents disproportional risks to freedom of expression. Furthermore, content-targeted regulation shows little potential to address the structural transformations in the public sphere of communications that, among other factors, influence current practices of production and spread of disinformation.</p>
            </abstract>
            <trans-abstract xml:lang="pt">
                <title>RESUMO</title>
                <p>Esse artigo aborda estratégias regulatórias contra a desinformação com dois objetivos principais. O primeiro é explorar regulações recentemente implementadas em diferentes contextos legais, com foco tanto nos riscos que representam para a liberdade de expressão, quanto no seu potencial para solucionar as questões que as motivaram. O segundo objetivo é fazer tal análise através de uma conexão entre debates regulatórios e os recentes achados das ciências sociais e das ciências da comunicação sobre desinformação. Este marco teórico interdisciplinar informa tanto o escopo do artigo (ancorado em entendimentos de “estratégias regulatórias” e de “desinformação”) quanto a análise das motivações legítimas para que os Estados estabeleçam regulação estatutária que vise o combate à desinformação. Partindo desta análise, eu sugiro uma organização de políticas recentemente implementadas e propostas em três grupos, de acordo com o alvo regulatório: conteúdo, dados e estrutura. Combinando a análise desses três tipos de políticas com o marco teórico, eu argumento que, no âmbito da regulamentação estatutária, a ação estatal é melhor direcionada a dados ou estrutura, pois o objetivo de conteúdo não apenas representa riscos desproporcionais à liberdade de expressão, mas também mostra pouco potencial para abordar as transformações estruturais nas comunicações que marcam a esfera pública atual.</p>
            </trans-abstract>
            <kwd-group xml:lang="en">
                <title>KEYWORDS</title>
                <kwd>Disinformation</kwd>
                <kwd>regulation</kwd>
                <kwd>regulatory strategies</kwd>
                <kwd>fake news</kwd>
                <kwd>digital platforms</kwd>
            </kwd-group>
            <kwd-group xml:lang="pt">
                <title>PALAVRAS-CHAVE</title>
                <kwd>Desinformação</kwd>
                <kwd>regulação</kwd>
                <kwd>estratégias regulatórias</kwd>
                <kwd>
                    <italic>fake news</italic>
                </kwd>
                <kwd>plataformas digitais</kwd>
            </kwd-group>
            <counts>
                <fig-count count="0"/>
                <table-count count="0"/>
                <equation-count count="0"/>
                <ref-count count="55"/>
                <page-count count="30"/>
            </counts>
        </article-meta>
    </front>
    <body>
        <p>SUMMARY: Introduction; 1 Definitions and scope; 2 Regulating disinformation; 2.1 Disinformation and regulatory rationales; 2.2 Regulatory strategies against disinformation; 2.2.1 Content; 2.2.2 Data; 2.2.3 Structure; 3 Analysis; Conclusions; Bibliography.</p>
        <sec sec-type="intro">
            <title>INTRODUCTION</title>
<p>Following the 2016 Brexit referendum, a series of electoral processes drew the world’s attention to the possibility of the intentional and massive spread of false information through digital means<xref ref-type="fn" rid="fn03">3</xref>. This sort of practice soon revealed itself to be a threat in and outside of electoral processes, as digital disinformation became a familiar contingency in various debates. Across public health, climate change, historical revisionism and other topics, disinformation proved to be a dynamic phenomenon that has become ever more entrenched in contemporary communications by presenting itself in different forms.</p>
<p>As a result, we have witnessed different academic research strands aimed at understanding the multiple dimensions of disinformation and assessing suitable responses in recent years. Multidisciplinary works have covered disinformation’s conceptual implications<xref ref-type="fn" rid="fn04">4</xref>, different forms<xref ref-type="fn" rid="fn05">5</xref>, and potential effects<xref ref-type="fn" rid="fn06">6</xref>, as well as possible countermeasures<xref ref-type="fn" rid="fn07">7</xref>. While some of these threads are being untangled, other aspects remain under dispute or are simply not proven, like the debated existence of political bots (whose functions would include spreading disinformation)<xref ref-type="fn" rid="fn08">8</xref> or the assertion of a correlation between such strategies and poll results<xref ref-type="fn" rid="fn09">9</xref>. Even as these and other questions are publicly raised, disinformation remains a relevant phenomenon that bears fundamental societal risks.</p>
            <p>In all the varied approaches to the topic, some common ground has been found in the idea that there is no single silver bullet. As a multi-layered phenomenon, disinformation is an expression of tensions and transformations that, despite being linked to digitalisation, result from a broader constellation of technological, social, and political factors<xref ref-type="fn" rid="fn10">10</xref>. For the policy debate, this means that fighting disinformation requires different responses from different actors. So, for instance, fact-checking and media literacy can be performed by journalistic or civil society organisations, and digital platforms can implement their own moderation and certification procedures. As I argue in this article, there is also an important role for state regulation.</p>
            <p>This is where we start to enter even more complex territory, as governmental action towards digital disinformation mingles with the ever-delicate exercise of regulating freedom of expression – i.e., establishing rules regarding what can and cannot be said, published, and distributed. Even though every right can be subject to restrictions, freedom of expression is an indisputable pillar of modern democracies, and the line that separates these restrictions from state censorship can be thin.</p>
<p>In light of this, the paper’s main goals are (i) to explore regulatory strategies against disinformation currently on the table and provide insight into both the risks they pose to free speech and their potential to address the rationales that motivated them, and (ii) to do so by bridging policy debates and recent social and communications studies findings on disinformation. To do that, I suggest organising recently implemented and proposed policies into three groups based on their regulatory target: content, data, and structure. After this analysis, I will argue that, in terms of statutory regulation, state action is better off targeted at data or structure, as aiming at content represents disproportional risks to freedom of expression. Furthermore, content-targeted regulation shows little potential to address the structural transformations in the public sphere of communications that, among other factors, condition current practices of production and spread of disinformation.</p>
<p>The paper is organised as follows. Section 1 explains the scope decisions and technical definitions that ground the key objects of study: regulatory strategies and disinformation. Section 2 approaches the regulation of disinformation, starting with an outline of the adequate regulatory rationales under consideration (item 2.1). Item 2.2 presents a set of regulatory strategies against disinformation, classified according to their regulatory target. Section 3 analyses these strategies to extract key takeaways for the policy debate, and the paper closes with the conclusions.</p>
        </sec>
        <sec>
            <title>1 DEFINITIONS AND SCOPE</title>
            <p>This paper rests on two main scope decisions. The first, regarding the concept of <italic>regulatory strategies</italic>, refers to the promulgation of rules by governments, accompanied by mechanisms for monitoring and enforcement<xref ref-type="fn" rid="fn11">11</xref>. While recognising the need for actions from different stakeholders, I will focus on “the main instruments that the state can use to regulate directly”<xref ref-type="fn" rid="fn12">12</xref>, i.e., statutory regulation proposed to or enacted by parliament, the declared goal, rationales, or formal or informal motivations of which relate to the spread of online disinformation.</p>
            <p><italic>Disinformation</italic> is understood as “false or misleading information that is intentionally spread for profit, to create harm, or to advance political or ideological goals”<xref ref-type="fn" rid="fn13">13</xref>. This definition is in tune with conceptual literature on the topic, which attempts to make sense of the differing yet blurred communication practices that are part of what has been referred to as an “information disorder”<xref ref-type="fn" rid="fn14">14</xref>. Since the expansion of practices related to the spread of false information through digital means, the terminology has evolved to allow greater accuracy and differentiation among varying phenomena.</p>
<p>Even though the term “fake news” is still used, its mixed applications restrict its theoretical and technical relevance<xref ref-type="fn" rid="fn15">15</xref>. The conceptual debate in the English language builds, in large part, on the distinction between disinformation and misinformation – the latter being commonly referred to as information that is “false by definition” but has not been disseminated with a specific purpose to cause harm. Intent is recognised as the element that differentiates mis- and disinformation<xref ref-type="fn" rid="fn16">16</xref>, as “[d]isinformation is meant to deceive, while misinformation may be inadvertent or unintentional”<xref ref-type="fn" rid="fn17">17</xref>. While these two concepts hold structural relevance, different classifications vary in the typologies they add to this list<xref ref-type="fn" rid="fn18">18</xref>. These taxonomies allow for the distinction among practices with one element – falsehood – in common but that still vary considerably in terms of the risks they entail for individual and collective rights.</p>
<p>The distinction between mis- and disinformation holds fundamental relevance for the debate about regulatory policies in liberal democracies, where counteractions against these practices mingle with the ever-delicate exercise of regulating freedom of expression. Because classifying and countering disinformation intrinsically depends on a judgement on the substance of expression – i.e., is it false? – countermeasures hold an increased risk of promoting chilling effects or even censorship. Ultimately, conceptions of truth and fact are a matter of perspective, the meaning of which, in a democracy, should be fairly and equally disputed by society<xref ref-type="fn" rid="fn19">19</xref>. If statutory policy targets cases where the circulation of inaccurate or false information is not intentional, regulatory initiatives would be pushed even further away from legitimacy and compromise the “benefits of a noisy and unruly public arena”<xref ref-type="fn" rid="fn20">20</xref>. Throughout this paper, I will show that these risks may still not be neutralised even if we narrow policy debates down to disinformation.</p>
            <p>Nonetheless, disinformation is still a phenomenon that bears fundamental societal risks, especially when it stands in the way of equal and fair participation in public debate. There are legitimate reasons for states to be concerned about and regulate disinformation, but they must be equally concerned that these interventions do not pose risks to freedom of speech.</p>
        </sec>
        <sec>
            <title>2 REGULATING DISINFORMATION</title>
            <sec>
                <title>2.1 Disinformation and regulatory rationales</title>
                <p>Choosing regulatory strategies that impose minimal restrictions on free speech is in line with recent findings in social and communications sciences on the emergence and effects of disinformation trends. Beyond the usual rationales for regulating freedom of expression (e.g., by setting remedies for abuses) or media and communications structures (e.g., to assure access to plural information), I argue that the legitimate reasons to regulate in this context relate to (i) recent transformations in social communications and (ii) the “second-order effects”<xref ref-type="fn" rid="fn21">21</xref> that disinformation potentially has on democracy.</p>
<p>The policy debate often rests on assumptions that digital disinformation, employed in and outside of electoral processes, is harming democratic institutions, the public arena of debate, and the integrity of electoral processes in particular. In this sense, disinformation is often linked to political polarisation<xref ref-type="fn" rid="fn22">22</xref>, election disruption<xref ref-type="fn" rid="fn23">23</xref>, and far-right communication strategies, to mention a few. Although it is relevant to investigate current socio-political phenomena, empirical evidence on these theories is still disputed<xref ref-type="fn" rid="fn24">24</xref>, which weakens their potential to support policy proposals.</p>
<p>Sharing Andreas Jungherr and Ralph Schroeder’s outlook, this paper approaches disinformation as “a symptom, and not a cause”<xref ref-type="fn" rid="fn25">25</xref> of structural tensions and transformations that have “impacted information flows and attention allocation”<xref ref-type="fn" rid="fn26">26</xref> in the public arena. This perspective carries far-reaching implications because if these transformations are not accounted for and consequently addressed by public policy, the dispute over truth or falsity by itself has little to no potential to remedy the structural challenges and harmful effects that accrue from the current digital communications landscape. These transformations are in great part connected to digitalisation and the emergence of digital platforms as information intermediaries. In an effort to name these transformations, the authors highlight how digital platforms are now “an integral part of the public arena as they provide complementary opportunities for distributing information and political messages in addition to those provided by news media and political organizations”<xref ref-type="fn" rid="fn27">27</xref>. These opportunities are shaped by the practices through which platforms exert influence over information flows (like content moderation and algorithmic curation), which ultimately means that “how messages are disseminated on these platforms and their internal governance processes matter now beyond the narrow confines of their businesses”<xref ref-type="fn" rid="fn28">28</xref>. Connected to the emergence of digital intermediaries are other transformations relating to forms of message amplification, impacts on legacy media business models, and lack of transparency on how these intermediaries operate<xref ref-type="fn" rid="fn29">29</xref>.</p>
<p>Although digitalised communications play an important role, the transformations in communication and socio-political practices to which disinformation refers are not connected exclusively to digitalisation. As proposed by Jeanette Hofmann, the relationship between technology and democracies is better approached through a “co-constitution” lens, according to which democracy and technology are connected “through a co-evolutionary process of mutual enabling”<xref ref-type="fn" rid="fn30">30</xref> rather than a causal link. Ultimately, this means not only that communication technologies and democracy shape each other but also that they are inserted in a “macro-level constellation of social change”, which is affected by different socio-political factors. Building on Manuel Castells, Hofmann expressly approaches the current crisis of western democracies as an example of this multisided relationship, arguing that “the decay of conventional channels of political expression” cannot be single-handedly pinned on digitalisation since “core representative institutions began losing support and stability long before the internet advanced as a medium for ‘mass self-communication’”<xref ref-type="fn" rid="fn31">31</xref>. It is worth highlighting that, for the policy debate, the adoption of this co-constitution perspective of democracy and technology does not imply that democracy is unaffected by digital technologies or even by disinformation. Rather, it means that addressing disinformation as a single negative externality overlooks other relevant factors and stifles the search for solutions. Yochai Benkler adopted a similar approach when writing about foreign interference in the 2016 American elections.
Grounded in the realisation that “evidence of action is not evidence of influence”, the author states that the eventual success of disinformation or propaganda strategies in the country must be interpreted “in the context of long-term patterns of loss of trust in institutions, including mainstream media, and the deep alienation of the past decade since the Great Recession”<xref ref-type="fn" rid="fn32">32</xref>.</p>
<p>These approaches provide two key takeaways for the policy debate. First, in terms of strategy, it means that, as highlighted before, countermeasures will come from multilevel actors, and state regulation is far from being a single effective solution. This coincides with much of the literature acknowledging that fighting disinformation requires different responses from different actors<xref ref-type="fn" rid="fn33">33</xref>. Whilst fact-checking and media literacy can be performed by journalistic or civil society organisations, and digital platforms can implement their own moderation and certification procedures, state regulation also plays an important role in the form of public policies implemented by legislation, which will be discussed in the next section. Second, in terms of regulatory legitimacy, motivations for state action against disinformation should not be founded on necessary causal links between disinformation and concrete results such as electoral outcomes or polarisation. Even though such links might exist, the available evidence indicates that disinformation and these outcomes are related in a non-structural way, rather than as cause and effect.</p>
<p>This is a crucial point because, as noted by David Karpf, “online disinformation and propaganda do not have to be particularly effective at duping voters or directly altering electoral outcomes in order to be fundamentally toxic to a well-functioning democracy”<xref ref-type="fn" rid="fn34">34</xref>. Assumptions that disinformation leads to polarisation or that it affects electoral outcomes are not only disputed but also unnecessary because there are “second-order effects” that already undermine democratic institutions and the “governing norms that stand as a bulwark against elite corruption and abuse of power”<xref ref-type="fn" rid="fn35">35</xref>. The author’s perception aligns with other theories that identify risks that disinformation poses to trust in democratic institutions<xref ref-type="fn" rid="fn36">36</xref> and even to fundamental rights. For instance, disinformation can “contribute to increased doubts in political and media institutions and [...] contribute to the destabilization of political systems”<xref ref-type="fn" rid="fn37">37</xref>. It “often targets institutions and individuals in vulnerable situations and affects a wide range of human rights, including economic, social, cultural, civil and political rights, in which cases its effects surpass communications and rhetoric to translate into discrimination and hatred against minorities, immigrants and other marginalized communities”<xref ref-type="fn" rid="fn38">38</xref>. Disinformation also undermines “public confidence in mainstream media”<xref ref-type="fn" rid="fn39">39</xref> in different ways (such as by discrediting, impersonating, or accusing) and “the very existence of online misinformation resembling a journalistic product can diminish the credibility of legitimate news”<xref ref-type="fn" rid="fn40">40</xref>.</p>
<p>This shows that appropriate regulatory rationales for disinformation cannot be narrowed down to a dispute over facts. Legitimate motivations for statutory regulation are those that address digital disinformation beyond an information quality perspective. This does not exclude speech-related counteractions from the broader debate – it is still important to dispute facts and meanings in the public sphere – but this should be done outside of statutory and abstractly applicable regulation. Leaving aside the role of courts – which, restrained by their procedural and constitutional limitations, legitimately seek the truth on an <italic>ex post</italic>, case-by-case basis – there is no legitimate locus for arbitrating truth through state coercion.</p>
            </sec>
            <sec>
                <title>2.2 Regulatory strategies against disinformation</title>
<p>Statutory regulation initiatives against disinformation have continuously grown in different national contexts since 2017, notably after Brexit and the 2016 American election campaigns<xref ref-type="fn" rid="fn41">41</xref>. Since then, several countries have enacted regulations aiming to combat disinformation in and outside of electoral processes, a trend enhanced in volume and justification by the Covid-19 pandemic (which brought the spread of public health-related disinformation to light)<xref ref-type="fn" rid="fn42">42</xref>. Previous works have accounted for these regulatory strategies against disinformation and classified them based on different criteria, such as the institutional arrangement<xref ref-type="fn" rid="fn43">43</xref>.</p>
                <p>In this paper, I propose that we separate statutory regulation (approved or under discussion in parliament) into three groups, according to their regulatory target: those aiming at individual expression, i.e., the content of the message itself; those aiming at the collection, handling, and use of personal data for disinformation ends; and those that implement structural regulation of digital intermediaries. These three categories are the result of research on global regulatory strategies against disinformation collected from pre-existing policy repositories that describe these initiatives in English<xref ref-type="fn" rid="fn44">44</xref>.</p>
<p>It should be highlighted that these three categories of regulatory targets are not static. A regulatory target is understood here as the object that regulation intends to conform<xref ref-type="fn" rid="fn45">45</xref>, not necessarily the subject affected by the regulation (even though this element is still important). So, for instance, while policies that regulate data or structure have the potential to bind corporate bodies almost exclusively, content regulation refers to individual or corporate behaviour (as will be clarified in the following section). Nevertheless, the element that the policies aim to influence – the element on which the rationale rests – is content.</p>
                <p>Also, as I will show, a single strategy can affect more than one target. Additionally, these three groups do not intend to consider all possible targets or regulatory strategies related to disinformation. Since disinformation is such an intricate phenomenon, the idea behind this taxonomy is to better understand policy options currently on the table, their rationales, and implications for fundamental rights. In the next subsections, I will describe each one of these groups of policies and their recent implementation or consideration in different experiences.</p>
                <sec>
                    <title>2.2.1 Content</title>
                    <p>Policies aimed at content deem disinformation an illegal type of speech and thus something that must be banned from circulation. This includes a variety of mechanisms with the potential to restrict freedom of expression to different degrees.</p>
<p>Some of these policies are directed at deterring individuals from producing, publishing, distributing, or spreading disinformation in any way by imposing criminal or civil liability. This can be done through a diverse set of commands, such as the creation of new criminal provisions. In this sense, Brazilian Law 13.834/2019 criminalises accusing someone, for electoral purposes, of a crime or infraction of which they are innocent, thus prompting administrative and criminal investigations<xref ref-type="fn" rid="fn46">46</xref>. Other examples have gained international attention, such as the case of Ethiopia, where Proclamation 1185/2020 provides for imprisonment sanctions against individuals who disseminate disinformation or hate speech by means of broadcasting, print, or social media<xref ref-type="fn" rid="fn47">47</xref>. Similar provisions were also approved, for instance, in Malaysia<xref ref-type="fn" rid="fn48">48</xref>, Cambodia<xref ref-type="fn" rid="fn49">49</xref> and Kenya<xref ref-type="fn" rid="fn50">50</xref>. Besides the creation of new types of crimes, pre-existing penalties for speech abuses can also be enhanced. In Denmark<xref ref-type="fn" rid="fn51">51</xref>, for instance, a 2019 amendment to the Criminal Act added activities that affect public opinion to the list of unlawful speech abuses committed on behalf of foreign governments, as a way to prevent foreign disruption of elections<xref ref-type="fn" rid="fn52">52</xref>.</p>
                    <p>Policies aimed at content can be directed not only at individuals but also at information intermediaries. These policies include legislation that imposes duties on digital platforms to remove disinformation content, which can disproportionally curtail protected speech, as it forces intermediaries “to make highly context-sensitive decisions within tight time frames and based on insufficient available information”<xref ref-type="fn" rid="fn53">53</xref>. This is the case in China, where article 47 of the Cybersecurity Law requires platforms to take appropriate measures in the face of disinformation<xref ref-type="fn" rid="fn54">54</xref>. Regulation can also target media services and digital platforms, as does French Law 2018-1202, which “enables the transmission of foreign state-controlled radio and television services that broadcast disinformation to be curtailed, or temporarily suspended, prior to elections”<xref ref-type="fn" rid="fn55">55</xref>. In the face of disinformation, an interested party can apply to a judge for an expedited order requiring that providers of online communication services take the necessary steps to prevent the continuing diffusion of false information<xref ref-type="fn" rid="fn56">56</xref>. The interruption of communication services as a remedy for disinformation can also take more authoritarian forms. In Belarus, media legislation was amended in 2018 to allow the Ministry of Information to block social media platforms and hold website owners liable for hosting content deemed false, defamatory, or harmful to the national interest (without warning or judicial oversight)<xref ref-type="fn" rid="fn57">57</xref>. In this case, the provision was accompanied by other internet- and media-targeted policies that have so far resulted in arrests of journalists and persecution of political dissent<xref ref-type="fn" rid="fn58">58</xref>. 
Similarly, in 2019, the Cambodian government announced it would revoke licenses “of print and online media outlets distributing ‘fake news’ deemed to be a danger to national security”<xref ref-type="fn" rid="fn59">59</xref>, and even though the measure did not come into force, local NGOs still report the use of cybercrime and other legislation, under the guise of combating disinformation, to crack down on political dissent<xref ref-type="fn" rid="fn60">60</xref>.</p>
                    <p>For the sake of nuance, it is important to highlight that these examples encompass policies with very different impacts on speech. For instance, subjecting proponents of a certain type of individual expression to incarceration jeopardises individual liberties more severely than generating incentives for intermediaries to over-restrict access to content would. Even though the latter still represents a threat with troubling effects that should not be taken lightly<xref ref-type="fn" rid="fn61">61</xref>, in the first case, the risk to freedom of expression is worsened by the threat of incarceration, which also jeopardises one’s right to liberty and possibly physical and mental integrity. Similarly, subjecting a possible disinformation countermeasure to judicial oversight (as in France) is within the institutional design of courts in liberal democracies<xref ref-type="fn" rid="fn62">62</xref> and does not threaten freedom of expression to the same extent as arbitrary executive branch decisions. National context is also a relevant factor. The examples of Belarus and Cambodia show us that when regulation against disinformation is part of a broader authoritarian legal framework, it becomes subterfuge for persecuting political opponents – as was reported to have happened in Egypt, China, and the United Arab Emirates as well<xref ref-type="fn" rid="fn63">63</xref>.</p>
                    <p>Nevertheless, provisions that establish countermeasures for disinformation based on content essentially rest on an understanding of truth or falsity that can either be pre-established in legislation or left to the discretion of the body responsible for enforcing them. This will ultimately lead to imposing one version of facts – whether it be judges’, executive authorities’, or platforms’ – over others. Even if to varying degrees, strategies based on a concept of disinformation will inevitably steer the dispute over truth and fact away from where it belongs – in society and public debate.</p>
                    <p>Provisions for abuses of freedom of expression are entailed in liberal democracies’ constitutional systems, which presume the possibility of restricting fundamental rights when they conflict with other guarantees. Terrorist content and hate speech, for instance, are prohibited because they notably impose severe risks to third-party rights. But disinformation rests in a greyer area. Other than the cases where it is paired with other sorts of online harms, like hate speech, defamation, or harassment (and can ultimately lead to concrete violence against vulnerable groups<xref ref-type="fn" rid="fn64">64</xref>), disinformation should not necessarily be illegal, as a great part of its forms can be considered legitimate expression. Depending on the conceptual framework, many different types of conduct can qualify as such, from information that is displaced from its original context to parodies and completely made-up facts – all of which could still be placed in the realm of legitimate speech, depending on the circumstances of each case. In fact, one could argue that even the regular adjudication of plainly illegal speech, which is usually left up to courts, will inevitably rely on the judge’s interpretation and perspective of the facts under consideration<xref ref-type="fn" rid="fn65">65</xref>. This inherent risk in arbitrating speech is considerably higher if the task is handed to administrative authorities without judicial oversight, given their unfettered discretion, opening up the possibility for abuse and arbitrary decision-making<xref ref-type="fn" rid="fn66">66</xref>. Finally, content-targeted policies do not promise efficiency, especially when they are applied to individuals. 
They overlook how orchestrated and structured disinformation campaigns operate, particularly when what differentiates their reach potential is not individual conduct but the characteristics of the digital communications they rest on – for instance, the possibility of using personal data for political microtargeting. In fact, digital platforms’ influence over information and attention fluxes rests in large part on the use of personal data, which also makes data a relevant target for regulatory policies.</p>
                </sec>
                <sec>
                    <title>2.2.2 Data</title>
                    <p>Policies that target <italic>data</italic> are represented by legal frameworks that regulate the collection, treatment, and storage of personal data for different purposes. These regulations approach data protection across sectors to protect rights holders in increasingly digitalised economies. This trend is reflected in the enactment or updating of laws dedicated to data protection, most often based on “the guarantee of a fundamental right and the realization of this right by means of a legal regime of data protection, in the form of a general law on the subject”<xref ref-type="fn" rid="fn67">67</xref>.</p>
                    <p>In digital disinformation debates, the use of data is specifically related to the use of political microtargeting techniques that are meant to distribute content to a segmented audience<xref ref-type="fn" rid="fn68">68</xref>. Microtargeting is “a form of online targeted advertising that analyses personal data to identify the interests of a specific audience or individual in order to influence their actions”<xref ref-type="fn" rid="fn69">69</xref>. This logic is embedded, for instance, in the core of social media platforms’ business models, which target users with advertisements tailored to their preferences. Similarly, political microtargeting “involves collecting and analysing people’s personal data to send them tailored political messages”<xref ref-type="fn" rid="fn70">70</xref> and thus introduces information “while targeting promising individuals or groups specifically and out of sight of the public arena”<xref ref-type="fn" rid="fn71">71</xref>. The main fuel for political microtargeting is “data gathered from citizens’ online presentation and behaviour, including from their social media use”<xref ref-type="fn" rid="fn72">72</xref>. This means that the kind of data that informs political microtargeting is likely to include ethnicity, ideologies, and political and religious beliefs, among other types of information on the basis of which users can be further discriminated against. Differentiating political microtargeting from mere microtargeting for policy purposes entails a complicated exercise of interpretation – what is political, after all? – with the potential to impose stricter rules based on content.</p>
                    <p>Even though microtargeted disinformation can be used in different contexts, it is in the realm of electoral legislation that data appears as a target for regulation against disinformation, as “online disinformation and unlawful political microtargeting represent a threat to elections around the globe”<xref ref-type="fn" rid="fn73">73</xref>. Data-based political advertisement has been at the centre of disinformation strategies in different national contexts<xref ref-type="fn" rid="fn74">74</xref>, raising concerns beyond the protection of citizens’ privacy and data protection<xref ref-type="fn" rid="fn75">75</xref>. Electoral frameworks can vary across countries, but they are commonly built on the assumption that the electoral period requires qualified protection for speech, access to information, and opinion formation guarantees<xref ref-type="fn" rid="fn76">76</xref>. This presumes that all information will circulate to everyone. In this sense, political microtargeting has a twofold potential to interfere with elections, both when it harmfully manipulates the information being distributed and when it restricts plurality by excluding people who are not targeted from the debate<xref ref-type="fn" rid="fn77">77</xref>. The distribution of disinformation in electoral contexts has been associated with “harming the political debate, excluding populations from it, and even making individual autonomy vulnerable in invisible and unexpected ways”<xref ref-type="fn" rid="fn78">78</xref>.</p>
                    <p>Thus, there is a case for “strengthening enforcement of data protection legislation in electoral contexts”<xref ref-type="fn" rid="fn79">79</xref>. In general, data protection legislation already addresses different aspects of the use of political microtargeting in electoral processes. As noted by Francisco Brito Cruz, it can limit data collection (e.g., by prohibiting unlawful surveillance or the commercialisation of voter databases), data sharing (by prohibiting international data transfer for specific purposes, possibly electoral), and data management, as in cases where there is a deviation from the purposes authorised by the agent. In several countries, these constraints have been carried out by personal data protection rules and by the authorities that enforce these rules<xref ref-type="fn" rid="fn80">80</xref>. Specific mechanisms may include restrictions on data gathering and accumulation for political microtargeting purposes – as in Japan, where “the capture of personal data on the electorate, and the communication of personalised political messaging” is understood to be “largely prohibited” by the current legislation<xref ref-type="fn" rid="fn81">81</xref>.</p>
                    <p>Depending on how they are carved into legislation, limitations or prohibitions on the use of data for “political purposes” are likely to lead to another intricate exercise of interpretation regarding what is or is not a political purpose. The electoral time frame can provide a more stable criterion – however, in a highly digitalised public sphere, electoral campaigning is undergoing transformations of its own, which means that the relevance of such time frames as the key moment for political communication is diminished. Political content standards based on the characteristics of a certain kind of message will probably be fully subject to the perception of their enforcer, and therefore provide less legal certainty.</p>
                    <p>Whether via general or disinformation-specific regulation, data is a relevant target for public policies because of its role in feeding the dynamics of information distribution on digital platforms. This means that aiming at data would, in theory, promise better results. Along with how and for what purposes these companies use data, there is a series of other aspects of their business models that are relevant for disinformation regulation, as I will show in the next section.</p>
                </sec>
                <sec>
                    <title>2.2.3 Structure</title>
                    <p>As for the third and last group of measures, policies focusing on structure regulate the digital platforms that serve as one of the means of distributing disinformation<xref ref-type="fn" rid="fn82">82</xref>. These initiatives target the business models of such actors, with a particular focus on their technological tools, practices, and the criteria on which they operate.</p>
                    <p>By “policies aimed at structure”, I mean the mechanisms meant to imbue digital platforms’ business models with “a new ethics of responsible platforms, which can provide certainty, fairness and accountability of enforcement of speech rules, but ensure that speech control is strictly autonomous from the state”<xref ref-type="fn" rid="fn83">83</xref>. Overall, they implement “incentives for the platforms to modify their operations” through different means, including “the introduction of government mandated responsibilities; data and privacy protection measures; the use of codes of practice; and measures to strengthen skills and training policies”<xref ref-type="fn" rid="fn84">84</xref>. Regulations focused on structure do not hinge on disinformation specifically but rather address aspects of platforms’ business models that exert influence on information and attention fluxes of different natures, thus impacting the distribution of a diversity of online harms. Even so, mis- and disinformation are usually perceived<xref ref-type="fn" rid="fn85">85</xref>, or even expressly listed<xref ref-type="fn" rid="fn86">86</xref>, as regulatory motivations of structure regulation proposals.</p>
                    <p>Part of the European literature identifies a regulatory trend towards structural regulation through expressions such as “a shift from liability to responsibility”<xref ref-type="fn" rid="fn87">87</xref> or “from liability to duty”<xref ref-type="fn" rid="fn88">88</xref>. These terms attempt to distinguish regulatory initiatives in recent years that entail “the need for proactive measures”<xref ref-type="fn" rid="fn89">89</xref> from dominant liability-centred models, which still characterise online content regulation frameworks by carving out conditions for the civil liability of digital platforms over user-generated content<xref ref-type="fn" rid="fn90">90</xref>. While different authors will include different types of regulatory strategies under what is understood here as the responsibility of digital platforms<xref ref-type="fn" rid="fn91">91</xref>, there are a few usual suspects – like duties of notification and due process in content moderation, obligations to set up user-centred flagging tools<xref ref-type="fn" rid="fn92">92</xref>, and the very popular obligations of transparency.</p>
                    <p>In fact, the latter is a key mechanism of structure regulation. It seeks to bring more clarity to digital platforms’ operations, such as the criteria for and effects of content moderation decisions<xref ref-type="fn" rid="fn93">93</xref> and “the black box of algorithm decision-making”<xref ref-type="fn" rid="fn94">94</xref>. Notwithstanding critiques of its ambiguity and flexibility<xref ref-type="fn" rid="fn95">95</xref>, or even of the use of transparency as a “policy panacea”<xref ref-type="fn" rid="fn96">96</xref>, recent regulatory approaches provide, for instance, for the delivery of transparency reports on content moderation decisions – a trend that came to light with the German law <italic>Netzwerkdurchsetzungsgesetz</italic> – NetzDG<xref ref-type="fn" rid="fn97">97</xref> and is now part of policy proposals under discussion in Brazil<xref ref-type="fn" rid="fn98">98</xref> and in Europe (where transparency is one of the pillars of the Digital Services Act proposals<xref ref-type="fn" rid="fn99">99</xref>). Transparency requirements for advertisers, and especially for political campaigns, are also a trend, reflected, for example, in the 2021 European Commission public consultation on improving transparency in political advertising online and offline<xref ref-type="fn" rid="fn100">100</xref>, in the Brazilian proposals mentioned before, and in French Law 2018-1202 (according to which platforms must publish details of the measures taken and report annually on progress in these areas<xref ref-type="fn" rid="fn101">101</xref>).</p>
                    <p>Also, in the realm of structure there is a diversity of measures that can generate different types of incentives and impact platforms and users’ rights to different degrees. Obligations of content removal, for instance, require attention for the incentives they generate<xref ref-type="fn" rid="fn102">102</xref>. There is a mix of regulatory targets in these cases, as monitoring obligations are aimed at platforms, but obligations of removal are very much centred on content. Here, the lines between content and structure regulation are blurry, and warnings of risks to freedom of expression are due. Obligations of content removal<xref ref-type="fn" rid="fn103">103</xref> are known to delegate the job of arbitrating the scope of freedom of expression to private platforms, raising immediate concerns about incentives for over-blocking<xref ref-type="fn" rid="fn104">104</xref>.</p>
                    <p>As a relatively recent trend, structural regulation does not yet rest on empirical results. In turn, its legitimacy can be justified in different ways, among which are social media platforms’ “unique public role” in communications<xref ref-type="fn" rid="fn105">105</xref>; their “systemic opinion power [...] to create dependences and influence other players in a democracy”<xref ref-type="fn" rid="fn106">106</xref>; or the fact that such companies decide on collective behaviour behind closed doors and therefore need to be imbued with “procedural values”, such as “the rule of law, due process and transparency”, as well as “participation in decision making”<xref ref-type="fn" rid="fn107">107</xref>. Because it aims at defining aspects of platforms’ business practices – including the ones that allow them to exert influence over online communications – structural regulation bears relevant potential to counter disinformation and other sorts of online harms. However, its potential to actually shift power imbalances is not met with the same enthusiasm.</p>
                </sec>
            </sec>
        </sec>
        <sec>
            <title>3 ANALYSIS</title>
            <p>The literature analysed in section 2.1 suggests that disinformation as a socio-political phenomenon cannot be approached as the single cause of democratic crisis or presumed shifts in opinion formation processes – either due to the lack of consensus on empirical evidence, or simply because this approach overlooks its complexity. As proposed by the concept of mediated democracy, a series of different co-existing conditions enable “possibilities of political action without determining them”<xref ref-type="fn" rid="fn108">108</xref>, from which I inferred that understanding recent forms of disinformation and its effects goes beyond looking at how technologies impact democratic institutions. Among the different conditions that deserve attention, this paper looks at what is needed from statutory regulation, notably in the realm of digital communications policies.</p>
            <p>Out of the three groups of regulatory strategies herein described, the ones aimed at content pose the most severe risks to freedom of expression and even to civil liberties. This is because strategies aimed at banning content (i) will inevitably rest on a certain understanding of truth, and when enforced, (ii) will privilege the understanding of the enforcer over others, removing the dispute over facts and perspectives from where it originally belongs – in societal debate. Further, strategies aimed at content show the least promise of addressing the transformations in communication practices introduced by digitalisation, which set the conditions for the spread of disinformation and other sorts of harms to a wide extent. Therefore, statutory regulation aiming at providing countermeasures for disinformation should not be aimed at content. This does not necessarily mean that content is not a target for statutory regulation under any circumstances. Liberal democracies that are supported, among other pillars, by a right to freedom of expression must also provide and prepare for hypotheses of abuse of this right and its collision with other fundamental guarantees. Freedom of expression is usually legitimately restricted, for instance, to avoid the circulation of illegal harmful speech, but this is not necessarily the case of disinformation.</p>
            <p>Considering the policy options approached in this paper, free speech will be restricted the least if regulation aims at data and structure, which are policies aimed at curbing disinformation or minimising its impacts, instead of plainly removing or criminalising disputed facts. Still, this does not mean these policies represent no risks at all. For instance, regulating data usage affects information circulation, and specifically in electoral contexts, it can end up limiting the circulation of legitimate communication from political parties. Furthermore, and as mentioned previously, restrictions of data usage pinned on “political content” or “political purposes” will also ultimately entail subjective interpretations. Nonetheless, data remains a relevant target for disinformation regulation. Both general data protection regulations and provisions aimed at elections or even at political purposes will present proportionally fewer risks to speech guarantees and fundamental liberties. Beyond the general importance of data protection regulation in digitalised societies, such laws implement minimum safeguards at different stages of data collection and treatment, binding digital platforms and other actors that engage in illegal uses of data for disinformation purposes. In fact, even provisions that restrict the use of data for political purposes hold at least rhetorical relevance, as they are part of a bigger societal conversation about what sort of economic, political or ideological interests microtargeting technologies should support.</p>
            <p>Similarly, depending on the kind of incentive generated by structure regulation, risks for speech will still be relevant. Incentives for removal of content by platforms deserve double attention, as they might overlap with targeting content. The shift towards structural responsibility for digital platforms is welcome, at least as a new attempt to steer digital platforms’ operations towards the public interest. Nevertheless, these policies still function very much inside the power structures in place instead of actually challenging them, and their results are yet to be seen. In fact, one could even argue that they grant even more power to digital platforms, since they recognise the legitimacy of their current operations and enable internal decision making on compliance standards<xref ref-type="fn" rid="fn109">109</xref>.</p>
            <p>Aiming at structure and data is promising because it aligns with the legitimate rationales for disinformation regulation described in section 2.1. The use of data by, and the general practices of, digital platforms are relevant factors that contribute to the transformations of the public sphere<xref ref-type="fn" rid="fn110">110</xref>, allowing for the spread of the phenomena included in the current information disorder, as well as for a series of online harms. Influence over information and attention fluxes, for instance, is ultimately promoted by the technological tools that generate engagement by determining content distribution, allowing digital platforms’ advertising-targeting business models to thrive. Moreover, the criteria upon which these fluxes operate remain opaque, and thus beyond public scrutiny. The impacts of these transformations go beyond the experience of online communications and affect legacy media and governments, which increasingly rely on platforms to communicate with their constituencies<xref ref-type="fn" rid="fn111">111</xref>.</p>
            <p>A grain of salt is due, as a reminder that regulation of data and structure is not a complete response to the scenarios of disinformation and distrust in democratic institutions. Still, it allows us to address some of these transformations without creating ineffective and illegitimate policies with the potential to become weapons of mass repression. On the other hand, the novelty of these strategies means that empirical data on their success is still missing.</p>
            <p>Lastly, while content should stay out of statutory regulation, it remains very much at the centre of countermeasures against disinformation performed by non-state actors. Independent fact-checking, media literacy, and professional journalism should take the lead in assuring dissent and democratic dispute over facts and truth.</p>
        </sec>
        <sec sec-type="conclusions">
            <title>CONCLUSIONS</title>
            <p>As regulatory efforts towards disinformation continuously increase, this paper offered an analysis of a set of policies under discussion and implementation in different legal backgrounds. Its main finding is that current disinformation practices are fuelling the proposal of policies overly restrictive of speech in several jurisdictions, without any promise of efficiency. In fact, there is relevant indication that these policies can be used as proxies to regulate undesired speech and crack down on political dissent. Privileging regulatory strategies that aim at data and structure rather than content is not only important to preserve freedom of expression and societal debate as pillars of democracy. These regulatory strategies also bear greater potential to address the transformations promoted by digital communication practices. It should be highlighted, however, that this is a small piece of the puzzle, as the spread and effects of disinformation in current scenarios are shaped by various conditions of different natures that go well beyond the ways in which states regulate digital communications.</p>
        </sec>
    </body>
    <back>
        <fn-group>
            <fn fn-type="other" id="fn01">
                <label>1</label>
                <p>I would like to thank my colleagues at the Digital Disinformation Hub of the Leibniz Institute for Media Research, Stephan Dreyer and Amélie Heldt, for the conversations that inspired my ideas and supported the findings in this paper. Furthermore, I would like to thank Stephan Dreyer and Leonard Kamps for their revisions and suggestions, as well as Lena Hinrichs and Mara Barthelmes for their help with the empirical research.</p>
            </fn>
            <fn fn-type="other" id="fn03">
                <label>3</label>
                <p><xref ref-type="bibr" rid="B07">CADWALLADR, Carole, The great British Brexit robbery: how our democracy was hijacked, <italic>The Guardian</italic>, 2017</xref>; <xref ref-type="bibr" rid="B17">EVANGELISTA, Rafael; BRUNO, Fernanda, WhatsApp and political instability in Brazil: targeted messages and political radicalisation, <italic>Internet Policy Review</italic>, v. 8, n. 4, 2019</xref>; <xref ref-type="bibr" rid="B18">FARIS, Robert M. <italic>et al</italic>, <italic>Partisanship, Propaganda, and Disinformation: Online Media and the 2016 U.S. Presidential Election</italic>, Cambridge, U.S.: Berkman Klein Center for Internet &amp; Society at Harvard University, 2017</xref>; <xref ref-type="bibr" rid="B13">DAS, Anupam; SCHROEDER, Ralph, Online disinformation in the run-up to the Indian 2019 election, <italic>Information, Communication &amp; Society</italic>, p. 1-17, 2020</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn04">
                <label>4</label>
                <p><xref ref-type="bibr" rid="B35">MARWICK, Alice; LEWIS, Rebecca, <italic>Media manipulation and disinformation online</italic>, [s.l.]: Data &amp; Society Research Institute, 2020</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn05">
                <label>5</label>
                <p><xref ref-type="bibr" rid="B44">ROSSINI, Patrícia <italic>et al</italic>, Dysfunctional information sharing on WhatsApp and Facebook: The role of political talk, cross-cutting exposure and social corrections, <italic>New Media &amp; Society</italic>, v. 23, n. 8, p. 2430-2451, 2021</xref>; <xref ref-type="bibr" rid="B12">DAN, Viorela <italic>et al</italic>, Visual Mis- and Disinformation, Social Media, and Democracy, <italic>Journalism &amp; Mass Communication Quarterly</italic>, v. 98, n. 3, p. 641-664, 2021</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn06">
                <label>6</label>
                <p><xref ref-type="bibr" rid="B18">FARIS <italic>et al</italic>, <italic>Partisanship, Propaganda, and Disinformation: Online Media and the 2016 U.S. Presidential Election</italic></xref>; <xref ref-type="bibr" rid="B28">KARPF, David, On Digital Disinformation and Democratic Myths, <italic>MediaWell, Social Science Research Council</italic>, 2019</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn07">
                <label>7</label>
                <p><xref ref-type="bibr" rid="B39">NEO, Ric, The International Discourses and Governance of Fake News, <italic>Global Policy</italic>, v. 12, n. 2, p. 214-228, 2021</xref>; <xref ref-type="bibr" rid="B46">SCHULZ, Wolfgang, <italic>Roles and Responsibilities of Information Intermediaries: Fighting Misinformation as a Test Case for Human-Rights Respecting Governance of Social Media Platforms</italic>, [s.l.]: Hoover Institution, Stanford University, 2019</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn08">
                <label>8</label>
                <p><xref ref-type="bibr" rid="B42">RAUCHFLEISCH, Adrian; KAISER, Jonas, The False positive problem of automatic bot detection in social science research, <italic>PLOS ONE</italic>, v. 15, n. 10, p. e0241045, 2020</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn09">
                <label>9</label>
                <p><xref ref-type="bibr" rid="B18">FARIS <italic>et al</italic>, <italic>Partisanship, Propaganda, and Disinformation: Online Media and the 2016 U.S. Presidential Election</italic></xref>.</p>
            </fn>
            <fn fn-type="other" id="fn10">
                <label>10</label>
                <p>This is a research perspective on technological transformation called “mediated democracy”, which will be further explored in Section 3 of this paper. In general, see <xref ref-type="bibr" rid="B25">HOFMANN, Jeanette, Mediated democracy – Linking digital technology to political agency, <italic>Internet Policy Review</italic>, v. 8, n. 2, 2019</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn11">
                <label>11</label>
                <p><xref ref-type="bibr" rid="B06">BLACK, J., Decentring Regulation: Understanding the Role of Regulation and Self-Regulation in a “Post-Regulatory” World, <italic>Current Legal Problems</italic>, v. 54, n. 1, p. 103-146, 2001</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn12">
                <label>12</label>
                <p><xref ref-type="bibr" rid="B01">BALDWIN, Robert; CAVE, Martin; LODGE, Martin, <italic>Understanding regulation: theory, strategy, and practice</italic>, 2nd ed. New York: Oxford University Press, 2012, p. 105</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn13">
                <label>13</label>
                <p><xref ref-type="bibr" rid="B34">MARWICK, Alice <italic>et al</italic>, <italic>Critical Disinformation Studies – A Syllabus</italic>, [s.l.]: Center for Information, Technology and Public Life – University of North Carolina at Chapel Hill, 2021</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn14">
                <label>14</label>
                <p><xref ref-type="bibr" rid="B53">WARDLE, Claire; DERAKHSHAN, Hossein, <italic>Information Disorder: Toward an interdisciplinary framework for research and policy making</italic>, [s.l.]: Council of Europe, 2017</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn15">
                <label>15</label>
                <p>For exceptions, <xref ref-type="bibr" rid="B12">DAN, Viorela <italic>et al</italic>, Visual Mis- and Disinformation, Social Media, and Democracy, <italic>Journalism &amp; Mass Communication Quarterly</italic>, v. 98, n. 3, p. 641-664, 2021</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn16">
                <label>16</label>
                <p>For all, see <xref ref-type="bibr" rid="B53">WARDLE; DERAKHSHAN, <italic>Information Disorder: Toward an interdisciplinary framework for research and policy making</italic></xref>.</p>
            </fn>
            <fn fn-type="other" id="fn17">
                <label>17</label>
                <p><xref ref-type="bibr" rid="B22">GUESS, Andrew M.; LYONS, Benjamin A., Misinformation, Disinformation and Online Propaganda, <italic>in</italic>: PERSILY, Nathaniel; TUCKER, Joshua A. (Eds.), <italic>Social Media and Democracy. The State of the Field, Prospects for Reform.</italic>, [s.l.]: Cambridge University Press, 2020</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn18">
                <label>18</label>
                <p><xref ref-type="bibr" rid="B53">WARDLE; DERAKHSHAN, <italic>Information Disorder: Toward an interdisciplinary framework for research and policy making</italic></xref>; <xref ref-type="bibr" rid="B22">GUESS; LYONS, Misinformation, Disinformation and Online Propaganda.</xref>; <xref ref-type="bibr" rid="B18">FARIS <italic>et al</italic>, <italic>Partisanship, Propaganda, and Disinformation: Online Media and the 2016 U.S. Presidential Election</italic></xref>; <xref ref-type="bibr" rid="B16">EGELHOFER, Jana Laura; LECHELER, Sophie, Fake news as a two-dimensional phenomenon: a framework and research agenda, <italic>Annals of the International Communication Association</italic>, v. 43, n. 2, p. 97-116, 2019</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn19">
                <label>19</label>
                <p>HABERMAS, Jürgen. O Estado Democrático de Direito: uma amarração paradoxal de princípios contraditórios? In: <italic>Era das Transições</italic>. Rio de Janeiro: Editora Tempo Brasileiro, 2003.</p>
            </fn>
            <fn fn-type="other" id="fn20">
                <label>20</label>
                <p><xref ref-type="bibr" rid="B27">JUNGHERR, Andreas; SCHROEDER, Ralph, Disinformation and the Structural Transformations of the Public Arena: Addressing the Actual Challenges to Democracy, <italic>Social Media + Society</italic>, v. 7, n. 1, 2021, p. 2</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn21">
                <label>21</label>
                <p><xref ref-type="bibr" rid="B28">KARPF, On Digital Disinformation and Democratic Myths</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn22">
                <label>22</label>
                <p><xref ref-type="bibr" rid="B02">BARBERÁ, Pablo, Social Media, Echo Chambers, and Political Polarization, <italic>in</italic>: PERSILY, Nathaniel; TUCKER, Joshua A. (Orgs.), <italic>Social media and democracy: the state of the field, prospects for reform</italic>, Cambridge: Cambridge University Press, 2020, p. 345</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn23">
                <label>23</label>
                <p><xref ref-type="bibr" rid="B18">FARIS <italic>et al</italic>, <italic>Partisanship, Propaganda, and Disinformation: Online Media and the 2016 U.S. Presidential Election</italic></xref>.</p>
            </fn>
            <fn fn-type="other" id="fn24">
                <label>24</label>
                <p>For an overview of these disputes, see <xref ref-type="bibr" rid="B02">BARBERÁ, Social Media, Echo Chambers, and Political Polarization</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn25">
                <label>25</label>
                <p><xref ref-type="bibr" rid="B27">JUNGHERR; SCHROEDER, Disinformation and the Structural Transformations of the Public Arena, p. 2</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn26">
                <label>26</label>
                <p><italic>Ibid</italic>.</p>
            </fn>
            <fn fn-type="other" id="fn27">
                <label>27</label>
                <p><xref ref-type="bibr" rid="B27">JUNGHERR; SCHROEDER, Disinformation and the Structural Transformations of the Public Arena, p. 4</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn28">
                <label>28</label>
                <p><xref ref-type="bibr" rid="B27">JUNGHERR; SCHROEDER, Disinformation and the Structural Transformations of the Public Arena, p. 4</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn29">
                <label>29</label>
                <p><xref ref-type="bibr" rid="B27">JUNGHERR; SCHROEDER, Disinformation and the Structural Transformations of the Public Arena, p. 5-8</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn30">
                <label>30</label>
                <p><xref ref-type="bibr" rid="B25">HOFMANN, Mediated democracy – Linking digital technology to political agency</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn31">
                <label>31</label>
                <p><italic>Ibid</italic>.</p>
            </fn>
            <fn fn-type="other" id="fn32">
                <label>32</label>
                <p><xref ref-type="bibr" rid="B03">BENKLER, Yochai, Cautionary Notes on Disinformation and the Origins of Distrust, <italic>MediaWell, Social Science Research Council</italic>, 2019</xref>. Available at: <ext-link ext-link-type="uri" xlink:href="https://mediawell.ssrc.org/expert-reflections/cautionary-notes-on-disinformation-benkler/">https://mediawell.ssrc.org/expert-reflections/cautionary-notes-on-disinformation-benkler/</ext-link>. Accessed on: 10 dec. 2019.</p>
            </fn>
            <fn fn-type="other" id="fn33">
                <label>33</label>
                <p><xref ref-type="bibr" rid="B51">VALENTE, Jonas C. L., Regulando desinformação e fake news: um panorama internacional das respostas ao problema, <italic>Comunicação pública</italic>, v. 14, n. 27, 2019</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn34">
                <label>34</label>
                <p><xref ref-type="bibr" rid="B28">KARPF, On Digital Disinformation and Democratic Myths</xref>. Available at: <ext-link ext-link-type="uri" xlink:href="https://mediawell.ssrc.org/expert-reflections/on-digital-disinformation-and-democratic-myths/">https://mediawell.ssrc.org/expert-reflections/on-digital-disinformation-and-democratic-myths/</ext-link>. Accessed on: 10 jan. 2020.</p>
            </fn>
            <fn fn-type="other" id="fn35">
                <label>35</label>
                <p><italic>Ibid</italic>. Available at: <ext-link ext-link-type="uri" xlink:href="https://mediawell.ssrc.org/expert-reflections/on-digital-disinformation-and-democratic-myths/">https://mediawell.ssrc.org/expert-reflections/on-digital-disinformation-and-democratic-myths/</ext-link>. Accessed on: 10 jan. 2020.</p>
            </fn>
            <fn fn-type="other" id="fn36">
                <label>36</label>
                <p><xref ref-type="bibr" rid="B40">OGNYANOVA, Katherine <italic>et al</italic>, Misinformation in action: Fake news exposure is linked to lower trust in media, higher trust in government when your side is in power, <italic>Harvard Kennedy School Misinformation Review</italic>, 2020</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn37">
                <label>37</label>
                <p><xref ref-type="bibr" rid="B27">JUNGHERR; SCHROEDER, Disinformation and the Structural Transformations of the Public Arena, p. 3</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn38">
                <label>38</label>
                <p><xref ref-type="bibr" rid="B29">KHAN, Irene, <italic>Disinformation and freedom of opinion and expression.</italic>, [s.l.]: United Nations, General Assembly, 2021</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn39">
                <label>39</label>
                <p><xref ref-type="bibr" rid="B40">OGNYANOVA <italic>et al</italic>, Misinformation in action</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn40">
                <label>40</label>
                <p><xref ref-type="bibr" rid="B40">OGNYANOVA <italic>et al</italic>, Misinformation in action</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn41">
                <label>41</label>
                <p>The role of states in disinformation counteraction is not restricted to formal legislation. It also includes other sorts of public policy beyond this paper’s scope, such as police task forces, institutional support, the encouragement of fact-checking and media literacy initiatives (<xref ref-type="bibr" rid="B33">MARSDEN, Chris; MEYER, Trisha; BROWN, Ian, Platform values and democratic elections: How can the law regulate digital disinformation?, <italic>Computer Law &amp; Security Review</italic>, v. 36, p. 105373, 2020, p. 3</xref>) and even the enhancement of cybersecurity. Further, as this paper looks exclusively at statutory legislation, it will not approach institutional solutions decentred from the state, such as the negotiation of voluntary measures, for example, the European Code of Conduct (see <xref ref-type="bibr" rid="B15">DURACH, Flavia; BÂRGĂOANU, Alina; NASTASIU, Cătălina, Tackling Disinformation: EU Regulation of the Digital Space, <italic>Romanian Journal of European Affairs</italic>, v. 20, n. 1, 2020</xref>) and the Australian Code of Practice on Disinformation and Misinformation (the Code was elaborated by digital platform providers represented by the Digital Industry Group Inc. (DIGI) upon a recommendation of the Australian Communications and Media Authority (ACMA). The Code is available at: <ext-link ext-link-type="uri" xlink:href="https://digi.org.au/disinformation-code/">https://digi.org.au/disinformation-code/</ext-link>. Accessed on: 10 jun. 2021. For more on the Australian framework, see CARSON, Andrea; FALLON, Liam, <italic>Fighting Fake News: A study of online misinformation regulation in the Asia Pacific</italic>, [s.l.]: La Trobe University, 2021).</p>
            </fn>
            <fn fn-type="other" id="fn42">
                <label>42</label>
                <p><xref ref-type="bibr" rid="B54">WISEMAN, Jamie, Rush to pass ‘fake news’ laws during Covid-19 intensifying global media freedom challenges, <italic>International Press Institute</italic>, 2020</xref>. According to Irene Khan, at least “17 states adopted legislation to address pandemic-related problematic disinformation”. <xref ref-type="bibr" rid="B29">KHAN, <italic>Disinformation and freedom of opinion and expression</italic>, p. 11</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn43">
                <label>43</label>
                <p><xref ref-type="bibr" rid="B51">VALENTE, Regulando desinformação e fake news</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn44">
                <label>44</label>
                <p>The repositories that were initially consulted are the Poynter Institute’s “A guide to anti-misinformation actions around the world”. Available at: <ext-link ext-link-type="uri" xlink:href="https://www.poynter.org/ifcn/anti-misinformation-actions/">https://www.poynter.org/ifcn/anti-misinformation-actions/</ext-link>. Accessed on: 10 jan. 2020; the Law Library of Congress Reports “Initiatives to Counter Fake News in Selected Countries”. Available at: <ext-link ext-link-type="uri" xlink:href="https://digitalcommons.unl.edu/scholcom/179/">https://digitalcommons.unl.edu/scholcom/179/</ext-link>. Accessed on: 06 jun. 2021; and “Government Responses to Disinformation on Social Media Platforms”. Available at: <ext-link ext-link-type="uri" xlink:href="https://digitalcommons.unl.edu/cgi/viewcontent.cgi?article=1180&amp;context=scholcom">https://digitalcommons.unl.edu/cgi/viewcontent.cgi?article=1180&amp;context=scholcom</ext-link>. Accessed on: 06 jun. 2021; and <xref ref-type="bibr" rid="B08">CARSON; FALLON, <italic>Fighting Fake News: A study of online misinformation regulation in the Asia Pacific</italic>, [s.l.]: La Trobe University, 2021</xref>. Available at: <ext-link ext-link-type="uri" xlink:href="https://www.latrobe.edu.au/__data/assets/pdf_file/0019/1203553/carson-fake-news.pdf">https://www.latrobe.edu.au/__data/assets/pdf_file/0019/1203553/carson-fake-news.pdf</ext-link>. Accessed on: 06 jun. 2021. Besides these repositories, further academic literature and media reports were used to provide insights and context for regulatory experiences. All these other sources are cited throughout the paper.</p>
            </fn>
            <fn fn-type="other" id="fn45">
                <label>45</label>
                <p>This is similar to the explanation used by Lyria Bennett Moses in <xref ref-type="bibr" rid="B37">MOSES, Lyria Bennett, How to Think about Law, Regulation and Technology: Problems with ‘Technology’ as a Regulatory Target, <italic>Law, Innovation and Technology</italic>, v. 5, n. 1, p. 1-20, 2013</xref>. The literature accounts for different ways of referencing regulatory targets, which can also be understood as “the individual or organization to which a regulatory instrument applies” (<xref ref-type="bibr" rid="B09">COGLIANESE, Cary, Engaging Business in the Regulation of Nanotechnology, <italic>in</italic>: BOSSO, Christopher J. (Org.), <italic>Governing uncertainty: environmental regulation in the age of nanotechnology</italic>, Washington, DC: RFF Press, 2010</xref>).</p>
            </fn>
            <fn fn-type="other" id="fn46">
                <label>46</label>
                <p>Law 13.834/2019, art. 2<sup>o</sup>. Available at: <ext-link ext-link-type="uri" xlink:href="http://www.planalto.gov.br/ccivil_03/_ato2019-2022/2019/lei/L13834.htm">http://www.planalto.gov.br/ccivil_03/_ato2019-2022/2019/lei/L13834.htm</ext-link>. Accessed on: 30 sep. 2021.</p>
            </fn>
            <fn fn-type="other" id="fn47">
                <label>47</label>
                <p>Proclamation 1185/2020, as per the translation available at: <ext-link ext-link-type="uri" xlink:href="https://chilot.me/2020/04/05/proclamation-no-1185-2020-hate-speech-and-disinformation-prevention-and-suppression/">https://chilot.me/2020/04/05/proclamation-no-1185-2020-hate-speech-and-disinformation-prevention-and-suppression/</ext-link>. Accessed on: 13 sep. 2021.</p>
            </fn>
            <fn fn-type="other" id="fn48">
                <label>48</label>
                <p><xref ref-type="bibr" rid="B45">SCHULDT, Lasse, The rebirth of Malaysia’s fake news law – and what the NetzDG has to do with it, <italic>Verfassungsblog.</italic></xref> Available at: <ext-link ext-link-type="uri" xlink:href="https://verfassungsblog.de/malaysia-fake-news/">https://verfassungsblog.de/malaysia-fake-news/</ext-link>. Accessed on: 13 sep. 2021.</p>
            </fn>
            <fn fn-type="other" id="fn49">
                <label>49</label>
                <p>Cambodian Center for Human Rights, <italic>Submission to the Special Rapporteur on the promotion and protection of the rights to freedom of opinion and expression</italic>. Available at: <ext-link ext-link-type="uri" xlink:href="https://www.ohchr.org/Documents/Issues/Expression/disinformation/2-Civil-society-organisations/Cambodia-Centre-for-human-rights.pdf">https://www.ohchr.org/Documents/Issues/Expression/disinformation/2-Civil-society-organisations/Cambodia-Centre-for-human-rights.pdf</ext-link>. Accessed on: 13 sep. 2021.</p>
            </fn>
            <fn fn-type="other" id="fn50">
                <label>50</label>
                <p>SUGOW, Abdulmalik; MUNGAI, Beatrice; WANYAMA, Jentrix, “The regulation of fake news in Kenya under the coronavirus threat”. Available at: <ext-link ext-link-type="uri" xlink:href="https://cipit.strathmore.edu/the-regulation-of-fake-news-in-kenya-under-the-coronavirus-threat/">https://cipit.strathmore.edu/the-regulation-of-fake-news-in-kenya-under-the-coronavirus-threat/</ext-link>. Accessed on: 06 jun. 2021.</p>
            </fn>
            <fn fn-type="other" id="fn51">
                <label>51</label>
                <p>Law Library of Congress Reports, “Initiatives to Counter Fake News in Selected Countries”. Available at: <ext-link ext-link-type="uri" xlink:href="https://digitalcommons.unl.edu/scholcom/179/">https://digitalcommons.unl.edu/scholcom/179/</ext-link>. Accessed on: 06 jun. 2021.</p>
            </fn>
            <fn fn-type="other" id="fn52">
                <label>52</label>
                <p>Law Library of Congress Reports, “Government Responses to Disinformation on Social Media Platforms”. Available at: <ext-link ext-link-type="uri" xlink:href="https://digitalcommons.unl.edu/cgi/viewcontent.cgi?article=1180&amp;context=scholcom">https://digitalcommons.unl.edu/cgi/viewcontent.cgi?article=1180&amp;context=scholcom</ext-link>. Accessed on: 06 jun. 2021.</p>
            </fn>
            <fn fn-type="other" id="fn53">
                <label>53</label>
                <p><xref ref-type="bibr" rid="B46">SCHULZ, Wolfgang, <italic>Roles and Responsibilities of Information Intermediaries: Fighting Misinformation as a Test Case for Human-Rights Respecting Governance of Social Media Platforms</italic>, [s.l.]: Hoover Institution, Stanford University, 2019, p. 17</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn54">
                <label>54</label>
                <p><xref ref-type="bibr" rid="B41">QI, Aimin; SHAO, Guosong; ZHENG, Wentong, Assessing China’s Cybersecurity Law, <italic>Computer Law &amp; Security Review</italic>, v. 34, n. 6, p. 1342-1354, 2018, p. 12</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn55">
                <label>55</label>
                <p><xref ref-type="bibr" rid="B10">CRAUFURD SMITH, Rachael, Fake news, French Law and democratic legitimacy: lessons for the United Kingdom?, <italic>Journal of Media Law</italic>, v. 11, n. 1, p. 52-81, 2019, p. 52</xref>. As the author highlights, the French Law adopts “a more holistic approach” (p. 58) based on three strands designed to curb foreign state disinformation: to prevent further online transmission of false information prior to elections (i.e., the case of the policy described here); to ensure greater transparency in the operation of online communication platforms; and to stimulate new educational initiatives. Some of these strategies do not encompass content regulation and will be discussed in other sections of this paper.</p>
            </fn>
            <fn fn-type="other" id="fn56">
                <label>56</label>
                <p><italic>Ibid</italic>., p. 60.</p>
            </fn>
            <fn fn-type="other" id="fn57">
                <label>57</label>
                <p>Belarus, <italic>Freedom House</italic>. Available at: <ext-link ext-link-type="uri" xlink:href="https://freedomhouse.org/country/belarus/freedom-net/2021">https://freedomhouse.org/country/belarus/freedom-net/2021</ext-link>. Accessed on: 13 sep. 2021.</p>
            </fn>
            <fn fn-type="other" id="fn58">
                <label>58</label>
                <p>Belarus, <italic>Freedom House</italic>. Available at: <ext-link ext-link-type="uri" xlink:href="https://freedomhouse.org/country/belarus/freedom-net/2021">https://freedomhouse.org/country/belarus/freedom-net/2021</ext-link>. Accessed on: 13 sep. 2021.</p>
            </fn>
            <fn fn-type="other" id="fn59">
                <label>59</label>
                <p>Cambodian Center for Human Rights, <italic>Submission to the Special Rapporteur on the promotion and protection of the rights to freedom of opinion and expression</italic>. Available at: <ext-link ext-link-type="uri" xlink:href="https://www.ohchr.org/Documents/Issues/Expression/disinformation/2-Civil-society-organisations/Cambodia-Centre-for-human-rights.pdf">https://www.ohchr.org/Documents/Issues/Expression/disinformation/2-Civil-society-organisations/Cambodia-Centre-for-human-rights.pdf</ext-link>. Accessed on: 13 sep. 2021.</p>
            </fn>
            <fn fn-type="other" id="fn60">
                <label>60</label>
                <p>Cambodian Center for Human Rights<italic>, Submission to the Special Rapporteur on the promotion and protection of the rights to freedom of opinion and expression</italic>. Available at: <ext-link ext-link-type="uri" xlink:href="https://www.ohchr.org/Documents/Issues/Expression/disinformation/2-Civil-society-organisations/Cambodia-Centre-for-human-rights.pdf">https://www.ohchr.org/Documents/Issues/Expression/disinformation/2-Civil-society-organisations/Cambodia-Centre-for-human-rights.pdf</ext-link>. Accessed on: 13 sep. 2021.</p>
            </fn>
            <fn fn-type="other" id="fn61">
                <label>61</label>
                <p><xref ref-type="bibr" rid="B29">KHAN, Irene, <italic>A/HRC/47/25 – Disinformation and freedom of opinion and expression</italic>, Report of the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, p. 3</xref>. Available at: <ext-link ext-link-type="uri" xlink:href="https://www.ohchr.org/EN/Issues/FreedomOpinion/Pages/Report-on-disinformation.aspx">https://www.ohchr.org/EN/Issues/FreedomOpinion/Pages/Report-on-disinformation.aspx</ext-link>. Accessed on: 01 jun. 2021.</p>
            </fn>
            <fn fn-type="other" id="fn62">
                <label>62</label>
                <p><xref ref-type="bibr" rid="B26">IGLESIAS KELLER, Clara, Policy by judicialisation: the institutional framework for intermediary liability in Brazil, <italic>International Review of Law, Computers &amp; Technology</italic>, p. 1-19, 2020</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn63">
                <label>63</label>
                <p>Law Library of Congress Reports, “Initiatives to Counter Fake News in Selected Countries”. Available at: <ext-link ext-link-type="uri" xlink:href="https://digitalcommons.unl.edu/scholcom/179/">https://digitalcommons.unl.edu/scholcom/179/</ext-link>. Accessed on: 06 jun. 2021.</p>
            </fn>
            <fn fn-type="other" id="fn64">
                <label>64</label>
                <p><xref ref-type="bibr" rid="B46">SCHULZ, Wolfgang, <italic>Roles and Responsibilities of Information Intermediaries: Fighting Misinformation as a Test Case for Human-Rights Respecting Governance of Social Media Platforms</italic>, [s.l.]: Hoover Institution, Stanford University, 2019, p. 17</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn65">
                <label>65</label>
                <p><xref ref-type="bibr" rid="B31">MACEDO JUNIOR, Ronaldo Porto, Freedom of Expression: what lessons should we learn from US experience?, <italic>Revista Direito GV</italic>, v. 13, n. 1, p. 274-302, 2017, p. 2</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn66">
                <label>66</label>
                <p><xref ref-type="bibr" rid="B29">KHAN, <italic>Disinformation and freedom of opinion and expression</italic>, p. 11</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn67">
                <label>67</label>
                <p><xref ref-type="bibr" rid="B36">MENDES, Laura Schertel, <italic>Privacidade, proteção de dados e direito do consumidor: linhas gerais de um novo direito fundamental</italic>, São Paulo: Saraiva, 2014, p. 47</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn68">
                <label>68</label>
                <p><xref ref-type="bibr" rid="B52">WALKER, Shawn; MERCEA, Dan; BASTOS, Marco, The disinformation landscape and the lockdown of social platforms, <italic>Information, Communication &amp; Society</italic>, v. 22, n. 11, p. 1531-1543, 2019</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn69">
                <label>69</label>
                <p><italic>UK Information Commissioner’s Office</italic>, Microtargeting, ICO website. Available at: <ext-link ext-link-type="uri" xlink:href="https://ico.org.uk/your-data-matters/be-data-aware/social-media-privacy-settings/microtargeting/">https://ico.org.uk/your-data-matters/be-data-aware/social-media-privacy-settings/microtargeting/</ext-link>. Accessed on: 10 sep. 2021.</p>
            </fn>
            <fn fn-type="other" id="fn70">
                <label>70</label>
                <p><xref ref-type="bibr" rid="B55">ZAROUALI, Brahim <italic>et al</italic>, Using a Personality-Profiling Algorithm to Investigate Political Microtargeting: Assessing the Persuasion Effects of Personality-Tailored Ads on Social Media, <italic>Communication Research</italic>, p. 009365022096196, 2020</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn71">
                <label>71</label>
                <p><xref ref-type="bibr" rid="B27">JUNGHERR; SCHROEDER, Disinformation and the Structural Transformations of the Public Arena, p. 3</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn72">
                <label>72</label>
                <p><xref ref-type="bibr" rid="B38">NENADIĆ, Iva, Unpacking the “European approach” to tackling challenges of disinformation and political manipulation, <italic>Internet Policy Review</italic>, v. 8, n. 4, 2019, p. 6</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn73">
                <label>73</label>
                <p><italic>Ibid</italic>., p. 2.</p>
            </fn>
            <fn fn-type="other" id="fn74">
                <label>74</label>
                <p><xref ref-type="bibr" rid="B07">CADWALLADR, The great British Brexit robbery: how our democracy was hijacked</xref>; EVANGELISTA; BRUNO, WhatsApp and political instability in Brazil; <xref ref-type="bibr" rid="B14">DOBBER, Tom; Ó FATHAIGH, Ronan; ZUIDERVEEN BORGESIUS, Frederik J., The regulation of online political micro-targeting in Europe, <italic>Internet Policy Review</italic>, v. 8, n. 4, 2019</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn75">
                <label>75</label>
                <p>However, guaranteeing the integrity of data-driven elections encompasses concerns that go beyond disinformation. See <xref ref-type="bibr" rid="B04">BENNETT, Colin J.; ODURO-MARFO, Smith, <italic>Privacy, Voter Surveillance and Democratic Engagement: Challenges for Data Protection Authorities</italic>, [s.l.]: University of Victoria, 2019</xref>; <xref ref-type="bibr" rid="B05">BENNETT, Colin J.; LYON, David, Data-driven elections: implications and challenges for democratic societies, <italic>Internet Policy Review</italic>, v. 8, n. 4, 2019</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn76">
                <label>76</label>
                <p><xref ref-type="bibr" rid="B11">CRUZ, Francisco Brito, <italic>Novo jogo, velhas regras: democracia e direito na era da nova propaganda política e das fake news</italic>, Belo Horizonte, MG: Grupo Editorial Letramento, Casa do Direito, 2020, p. 297</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn77">
                <label>77</label>
                <p><xref ref-type="bibr" rid="B11">CRUZ, <italic>Novo jogo, velhas regras</italic></xref>.</p>
            </fn>
            <fn fn-type="other" id="fn78">
                <label>78</label>
                <p><italic>Ibid</italic>., p. 377.</p>
            </fn>
            <fn fn-type="other" id="fn79">
                <label>79</label>
                <p><xref ref-type="bibr" rid="B38">NENADIĆ, Unpacking the “European approach” to tackling challenges of disinformation and political manipulation, p. 13</xref>; <xref ref-type="bibr" rid="B11">CRUZ, <italic>Novo jogo, velhas regras</italic>, p. 376-378</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn80">
                <label>80</label>
                <p><xref ref-type="bibr" rid="B11">CRUZ, <italic>Novo jogo, velhas regras,</italic> p. 377</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn81">
                <label>81</label>
                <p><xref ref-type="bibr" rid="B04">BENNETT; ODURO-MARFO, <italic>Privacy, Voter Surveillance and Democratic Engagement: Challenges for Data Protection Authorities</italic>, p. 6</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn82">
                <label>82</label>
                <p>In the context of digital platform regulation, regulating structure can also refer to antitrust legislation (see <italic>Tackling the Information Crisis: A Policy Framework for Media System Resilience</italic>, The Report of the LSE Commission on Truth, Trust and Technology. Available at: <ext-link ext-link-type="uri" xlink:href="https://www.lse.ac.uk/media-and-communications/truth-trust-and-technology-commission">https://www.lse.ac.uk/media-and-communications/truth-trust-and-technology-commission</ext-link>. Accessed on: 10 aug. 2021). Despite this application of the term, this article’s scope does not go so far as to encompass the analysis of antitrust legislation.</p>
            </fn>
            <fn fn-type="other" id="fn83">
                <label>83</label>
                <p><xref ref-type="bibr" rid="B50">TAMBINI, Damian, Rights and Responsibilities of Internet Intermediaries in Europe: The Need for Policy Coordination</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn84">
                <label>84</label>
                <p><xref ref-type="bibr" rid="B32">MANSELL, Robin; STEINMUELLER, W. Edward, <italic>Advanced introduction to platform economics,</italic> Cheltenham, UK; Northampton, MA: Edward Elgar Publishing, 2020, p. 101</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn85">
                <label>85</label>
                <p><xref ref-type="bibr" rid="B46">SCHULZ, Wolfgang, <italic>Roles and Responsibilities of Information Intermediaries: Fighting Misinformation as a Test Case for Human-Rights Respecting Governance of Social Media Platforms</italic></xref>.</p>
            </fn>
            <fn fn-type="other" id="fn86">
                <label>86</label>
                <p>Brazilian Federal House of Representatives Bill of Law 2.630/2020. Available at: <ext-link ext-link-type="uri" xlink:href="https://www.camara.leg.br/proposicoesWeb/fichadetramitacao?idProposicao=2256735">https://www.camara.leg.br/proposicoesWeb/fichadetramitacao?idProposicao=2256735</ext-link>. Accessed on: 10 jun. 2021.</p>
            </fn>
            <fn fn-type="other" id="fn87">
                <label>87</label>
                <p><xref ref-type="bibr" rid="B19">FROSIO, Giancarlo, Why keep a dog and bark yourself? From intermediary liability to responsibility, <italic>International Journal of Law and Information Technology</italic>, p. 1-33, 2017</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn88">
                <label>88</label>
                <p><xref ref-type="bibr" rid="B30">MAC SÍTHIGH, Daithí, The road to responsibilities: new attitudes towards Internet intermediaries, <italic>Information &amp; Communications Technology Law</italic>, v. 29, n. 1, p. 1-21, 2020</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn89">
                <label>89</label>
                <p>KUCZERAWY, Aleksandra. <italic>General Monitoring Obligations: A New Cornerstone of Internet Regulation in the EU?.</italic> Available at: <ext-link ext-link-type="uri" xlink:href="https://ssrn.com/abstract=3449170">https://ssrn.com/abstract=3449170</ext-link>. Accessed on: 06 mar. 2020.</p>
            </fn>
            <fn fn-type="other" id="fn90">
                <label>90</label>
                <p><xref ref-type="bibr" rid="B20">GASSER, Urs; SCHULZ, Wolfgang, Governance of Online Intermediaries: Observations from a Series of National Case Studies, <italic>SSRN Electronic Journal,</italic> 2015</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn91">
                <label>91</label>
                <p>See, for instance, <xref ref-type="bibr" rid="B50">Tambini, 2017</xref> and <xref ref-type="bibr" rid="B19">Frosio, 2017</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn92">
                <label>92</label>
                <p><xref ref-type="bibr" rid="B10">CRAUFURD SMITH, Fake news, French Law and democratic legitimacy, p. 62</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn93">
                <label>93</label>
                <p><xref ref-type="bibr" rid="B47">SUZOR, Nicolas P. <italic>et al</italic>, What Do We Mean When We Talk About Transparency? Toward Meaningful Transparency in Commercial Content Moderation, <italic>International Journal of Communication,</italic> v. 13, p. 1526-1543, 2019</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn94">
                <label>94</label>
                <p><xref ref-type="bibr" rid="B43">RIEDER, Bernhard; HOFMANN, Jeanette, Towards platform observability, <italic>Internet Policy Review,</italic> v. 9, n. 4, 2020</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn95">
                <label>95</label>
                <p><xref ref-type="bibr" rid="B21">GORWA, Robert; ASH, Timothy Garton, Democratic Transparency in the Platform Society, <italic>in</italic>: PERSILY, Nathaniel; TUCKER, Joshua A. (Orgs.), <italic>Social media and democracy: the state of the field, prospects for reform,</italic> Cambridge: Cambridge University Press, 2020, p. 287</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn96">
                <label>96</label>
                <p><xref ref-type="bibr" rid="B43">RIEDER; HOFMANN, Towards platform observability</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn97">
                <label>97</label>
                <p>The 2017 <italic>Netzwerkdurchsetzungsgesetz</italic> – NetzDG Act is not a disinformation-targeted law. It requires social media platforms to implement procedures that allow users to report illegal content, namely the 22 criminal offences already provided for in Germany’s Criminal Code. According to its terms, “manifestly unlawful” content must be removed within 24 hours of notification (or possibly after seven days or more, with terms to be agreed upon with the law enforcement authority). Besides removals, Section 2 requires platforms to periodically publish transparency reports on the number of complaints received and how they were handled by the platform. <xref ref-type="bibr" rid="B24">HELDT, Amélie, Reading between the lines and the numbers: an analysis of the first NetzDG reports, <italic>Internet Policy Review</italic>, v. 8, n. 2, 2019</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn98">
                <label>98</label>
                <p>Brazilian Federal House of Representatives Bill of Law 2.630/2020. Available at: <ext-link ext-link-type="uri" xlink:href="https://www.camara.leg.br/proposicoesWeb/fichadetramitacao?idProposicao=2256735">https://www.camara.leg.br/proposicoesWeb/fichadetramitacao?idProposicao=2256735</ext-link>. Accessed on: 10 jun. 2021.</p>
            </fn>
            <fn fn-type="other" id="fn99">
                <label>99</label>
                <p>European Commission, Proposal for a Regulation of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC. Available at: <ext-link ext-link-type="uri" xlink:href="https://digital-strategy.ec.europa.eu/en/library/proposal-regulation-european-parliament-and-council-single-market-digital-services-digital-services">https://digital-strategy.ec.europa.eu/en/library/proposal-regulation-european-parliament-and-council-single-market-digital-services-digital-services</ext-link>. Accessed on: 10 jun. 2021.</p>
            </fn>
            <fn fn-type="other" id="fn100">
                <label>100</label>
                <p><ext-link ext-link-type="uri" xlink:href="https://ec.europa.eu/info/law/better-regulation/have-your-say/initiatives/12826-Transparency-of-political-advertising/public-consultation_en">https://ec.europa.eu/info/law/better-regulation/have-your-say/initiatives/12826-Transparency-of-political-advertising/public-consultation_en</ext-link>.</p>
            </fn>
            <fn fn-type="other" id="fn101">
                <label>101</label>
                <p><xref ref-type="bibr" rid="B10">CRAUFURD SMITH, Fake news, French Law and democratic legitimacy, p. 62</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn102">
                <label>102</label>
                <p><xref ref-type="bibr" rid="B38">NENADIĆ, Unpacking the “European approach” to tackling challenges of disinformation and political manipulation</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn103">
                <label>103</label>
                <p>An example would be the German NetzDG mentioned above.</p>
            </fn>
            <fn fn-type="other" id="fn104">
                <label>104</label>
                <p><xref ref-type="bibr" rid="B46">SCHULZ, Wolfgang, <italic>Roles and Responsibilities of Information Intermediaries: Fighting Misinformation as a Test Case for Human-Rights Respecting Governance of Social Media Platforms</italic></xref>.</p>
            </fn>
            <fn fn-type="other" id="fn105">
                <label>105</label>
                <p><xref ref-type="bibr" rid="B49">SYLVAIN, Olivier, Internet governance and democratic legitimacy, <italic>Federal Communications Law Journal</italic>, v. 62, n. 2, p. 205-274, 2010, p. 209</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn106">
                <label>106</label>
                <p><xref ref-type="bibr" rid="B23">HELBERGER, Natali, The Political Power of Platforms: How Current Attempts to Regulate Misinformation Amplify Opinion Power, <italic>Digital Journalism</italic>, v. 8, n. 6, p. 842-854, 2020, p. 846</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn107">
                <label>107</label>
                <p><xref ref-type="bibr" rid="B48">SUZOR, Nicolas; VAN GEELEN, Tess; MYERS WEST, Sarah, Evaluating the legitimacy of platform governance: A review of research and a shared research agenda, <italic>International Communication Gazette</italic>, v. 80, n. 4, p. 385-400, 2018, p. 391-392</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn108">
                <label>108</label>
                <p><xref ref-type="bibr" rid="B25">HOFMANN, Mediated democracy – Linking digital technology to political agency</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn109">
                <label>109</label>
                <p>Natali Helberger has argued that recent attempts to “infuse some public value standards into corporations” formalise “the role of platforms as governors of online speech” and reinforce their political power. <xref ref-type="bibr" rid="B23">HELBERGER, The Political Power of Platforms, p. 848</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn110">
                <label>110</label>
                <p><xref ref-type="bibr" rid="B27">JUNGHERR; SCHROEDER, Disinformation and the Structural Transformations of the Public Arena</xref>.</p>
            </fn>
            <fn fn-type="other" id="fn111">
                <label>111</label>
                <p><xref ref-type="bibr" rid="B23">HELBERGER, The Political Power of Platforms, p. 847</xref>.</p>
            </fn>
        </fn-group>

        <ref-list>
            <title>BIBLIOGRAPHY</title>
            <ref id="B01">

                <mixed-citation>BALDWIN, Robert; CAVE, Martin; LODGE, Martin. <italic>Understanding regulation: theory, strategy, and practice</italic>. 2nd ed. New York: Oxford University Press, 2012.</mixed-citation>

                <element-citation publication-type="book">
                    <person-group person-group-type="author">
                        <name>
                            <surname>BALDWIN</surname>
                            <given-names>Robert</given-names>
                        </name>
                        <name>
                            <surname>CAVE</surname>
                            <given-names>Martin</given-names>
                        </name>
                        <name>
                            <surname>LODGE</surname>
                            <given-names>Martin</given-names>
                        </name>
                    </person-group>
                    <source>Understanding regulation: theory, strategy, and practice</source>
                    <edition>2nd ed.</edition>
                    <publisher-loc>New York</publisher-loc>
                    <publisher-name>Oxford University Press</publisher-name>
                    <year>2012</year>

                </element-citation>
            </ref>
            <ref id="B02">

                <mixed-citation>BARBERÁ, Pablo. Social Media, Echo Chambers, and Political Polarization. <italic>In</italic>: PERSILY, Nathaniel; TUCKER, Joshua A. (Orgs.). <italic>Social media and democracy: the state of the field, prospects for reform</italic>. Cambridge; New York; Port Melbourne; New Delhi; Singapore: Cambridge University Press, 2020, p. 345.</mixed-citation>

                <element-citation publication-type="book">
                    <person-group person-group-type="author">
                        <name>
                            <surname>BARBERÁ</surname>
                            <given-names>Pablo</given-names>
                        </name>
                    </person-group>
                    <chapter-title>Social Media, Echo Chambers, and Political Polarization</chapter-title>
                    <person-group person-group-type="compiler">
                        <name>
                            <surname>PERSILY</surname>
                            <given-names>Nathaniel</given-names>
                        </name>
                        <name>
                            <surname>TUCKER</surname>
                            <given-names>Joshua A.</given-names>
                        </name>
                    </person-group>
                    <source>Social media and democracy: the state of the field, prospects for reform</source>
                    <publisher-loc>Cambridge; New York; Port Melbourne; New Delhi; Singapore</publisher-loc>
                    <publisher-name>Cambridge University Press</publisher-name>
                    <year>2020</year>
                    <fpage>345</fpage>
                    <lpage>345</lpage>

                </element-citation>
            </ref>
            <ref id="B03">

                <mixed-citation>BENKLER, Yochai. Cautionary Notes on Disinformation and the Origins of Distrust. <italic>MediaWell, Social Science Research Council</italic>, 2019. Disponível em: https://mediawell.ssrc.org/expert-reflections/cautionary-notes-on-disinformation-benkler/. Acesso em: 10 dez. 2019.</mixed-citation>

                <element-citation publication-type="webpage">
                    <person-group person-group-type="author">
                        <name>
                            <surname>BENKLER</surname>
                            <given-names>Yochai</given-names>
                        </name>
                    </person-group>
                    <comment>Cautionary Notes on Disinformation and the Origins of Distrust</comment>
                    <source>MediaWell, Social Science Research Council</source>
                    <year>2019</year>
                    <comment>Disponível em: <ext-link ext-link-type="uri" xlink:href="https://mediawell.ssrc.org/expert-reflections/cautionary-notes-on-disinformation-benkler/">https://mediawell.ssrc.org/expert-reflections/cautionary-notes-on-disinformation-benkler/</ext-link></comment>
                    <date-in-citation content-type="access-date">10 dez. 2019</date-in-citation>

                </element-citation>
            </ref>
            <ref id="B04">

                <mixed-citation>BENNETT, Colin J.; ODURO-MARFO, Smith. <italic>Privacy, Voter Surveillance and Democratic Engagement: Challenges for Data Protection Authorities</italic>. [s.l.]: University of Victoria, 2019.</mixed-citation>

                <element-citation publication-type="book">
                    <person-group person-group-type="author">
                        <name>
                            <surname>BENNETT</surname>
                            <given-names>Colin J.</given-names>
                        </name>
                        <name>
                            <surname>ODURO-MARFO</surname>
                            <given-names>Smith</given-names>
                        </name>
                    </person-group>
                    <source>Privacy, Voter Surveillance and Democratic Engagement: Challenges for Data Protection Authorities</source>
                    <publisher-name>University of Victoria</publisher-name>
                    <year>2019</year>

                </element-citation>
            </ref>
            <ref id="B05">

                <mixed-citation>BENNETT, Colin J.; LYON, David. Data-driven elections: implications and challenges for democratic societies. <italic>Internet Policy Review</italic>, v. 8, n. 4, 2019. Disponível em: https://policyreview.info/node/1433. Acesso em: 10 out. 2021.</mixed-citation>

                <element-citation publication-type="journal">
                    <person-group person-group-type="author">
                        <name>
                            <surname>BENNETT</surname>
                            <given-names>Colin J.</given-names>
                        </name>
                        <name>
                            <surname>LYON</surname>
                            <given-names>David</given-names>
                        </name>
                    </person-group>
                    <article-title>Data-driven elections: implications and challenges for democratic societies</article-title>
                    <source>Internet Policy Review</source>
                    <volume>8</volume>
                    <issue>4</issue>
                    <year>2019</year>
                    <comment>Disponível em: <ext-link ext-link-type="uri" xlink:href="https://policyreview.info/node/1433">https://policyreview.info/node/1433</ext-link></comment>
                    <date-in-citation content-type="access-date">10 out. 2021</date-in-citation>

                </element-citation>
            </ref>
            <ref id="B06">

                <mixed-citation>BLACK, J. Decentring Regulation: Understanding the Role of Regulation and Self-Regulation in a “Post-Regulatory” World. <italic>Current Legal Problems</italic>, v. 54, n. 1, p. 103-146, 2001.</mixed-citation>

                <element-citation publication-type="journal">
                    <person-group person-group-type="author">
                        <name>
                            <surname>BLACK</surname>
                            <given-names>J.</given-names>
                        </name>
                    </person-group>
                    <article-title>Decentring Regulation: Understanding the Role of Regulation and Self-Regulation in a “Post-Regulatory” World</article-title>
                    <source>Current Legal Problems</source>
                    <volume>54</volume>
                    <issue>1</issue>
                    <fpage>103</fpage>
                    <lpage>146</lpage>
                    <year>2001</year>

                </element-citation>
            </ref>
            <ref id="B07">

                <mixed-citation>CADWALLADR, Carole. The great British Brexit robbery: how our democracy was hijacked. <italic>The Guardian</italic>, 2017. Disponível em: https://www.theguardian.com/technology/2017/may/07/the-great-british-brexit-robbery-hijacked-democracy. Acesso em: 10 maio 2020.</mixed-citation>

                <element-citation publication-type="webpage">
                    <person-group person-group-type="author">
                        <name>
                            <surname>CADWALLADR</surname>
                            <given-names>Carole</given-names>
                        </name>
                    </person-group>
                    <comment>The great British Brexit robbery: how our democracy was hijacked</comment>
                    <source>The Guardian</source>
                    <year>2017</year>
                    <comment>Disponível em: <ext-link ext-link-type="uri" xlink:href="https://www.theguardian.com/technology/2017/may/07/the-great-british-brexit-robbery-hijacked-democracy">https://www.theguardian.com/technology/2017/may/07/the-great-british-brexit-robbery-hijacked-democracy</ext-link></comment>
                    <date-in-citation content-type="access-date">10 maio 2020</date-in-citation>

                </element-citation>
            </ref>
            <ref id="B08">

                <mixed-citation>CARSON, Andrea; FALLON, Liam. <italic>Fighting Fake News: A study of online misinformation regulation in the Asia Pacific</italic>. [s.l.]: La Trobe University, 2021. Disponível em: https://www.latrobe.edu.au/__data/assets/pdf_file/0019/1203553/carson-fake-news.pdf. Acesso em: 10 jun. 2021.</mixed-citation>

                <element-citation publication-type="webpage">
                    <person-group person-group-type="author">
                        <name>
                            <surname>CARSON</surname>
                            <given-names>Andrea</given-names>
                        </name>
                        <name>
                            <surname>FALLON</surname>
                            <given-names>Liam</given-names>
                        </name>
                    </person-group>
                    <source>Fighting Fake News: A study of online misinformation regulation in the Asia Pacific</source>
                    <publisher-name>La Trobe University</publisher-name>
                    <year>2021</year>
                    <comment>Disponível em: <ext-link ext-link-type="uri" xlink:href="https://www.latrobe.edu.au/__data/assets/pdf_file/0019/1203553/carson-fake-news.pdf">https://www.latrobe.edu.au/__data/assets/pdf_file/0019/1203553/carson-fake-news.pdf</ext-link></comment>
                    <date-in-citation content-type="access-date">10 jun. 2021</date-in-citation>

                </element-citation>
            </ref>
            <ref id="B09">

                <mixed-citation>COGLIANESE, Cary. Engaging Business in the Regulation of Nanotechnology. <italic>In</italic>: BOSSO, Christopher J. (Org.). <italic>Governing uncertainty: environmental regulation in the age of nanotechnology</italic>. Washington, DC: RFF Press, 2010.</mixed-citation>

                <element-citation publication-type="book">
                    <person-group person-group-type="author">
                        <name>
                            <surname>COGLIANESE</surname>
                            <given-names>Cary</given-names>
                        </name>
                    </person-group>
                    <chapter-title>Engaging Business in the Regulation of Nanotechnology</chapter-title>
                    <person-group person-group-type="compiler">
                        <name>
                            <surname>BOSSO</surname>
                            <given-names>Christopher J.</given-names>
                        </name>
                    </person-group>
                    <source>Governing uncertainty: environmental regulation in the age of nanotechnology</source>
                    <publisher-loc>Washington, DC</publisher-loc>
                    <publisher-name>RFF Press</publisher-name>
                    <year>2010</year>

                </element-citation>
            </ref>
            <ref id="B10">

                <mixed-citation>CRAUFURD SMITH, Rachael. Fake news, French Law and democratic legitimacy: lessons for the United Kingdom? <italic>Journal of Media Law</italic>, v. 11, n. 1, p. 52-81, 2019.</mixed-citation>

                <element-citation publication-type="journal">
                    <person-group person-group-type="author">
                        <name>
                            <surname>CRAUFURD SMITH</surname>
                            <given-names>Rachael</given-names>
                        </name>
                    </person-group>
                    <article-title>Fake news, French Law and democratic legitimacy: lessons for the United Kingdom?</article-title>
                    <source>Journal of Media Law</source>
                    <volume>11</volume>
                    <issue>1</issue>
                    <fpage>52</fpage>
                    <lpage>81</lpage>
                    <year>2019</year>

                </element-citation>
            </ref>
            <ref id="B11">

                <mixed-citation>CRUZ, Francisco Brito. <italic>Novo jogo, velhas regras: democracia e direito na era da nova propaganda política e das fake news</italic>. Belo Horizonte, MG: Grupo Editorial Letramento, Casa do Direito, 2020.</mixed-citation>

                <element-citation publication-type="book">
                    <person-group person-group-type="author">
                        <name>
                            <surname>CRUZ</surname>
                            <given-names>Francisco Brito</given-names>
                        </name>
                    </person-group>
                    <source>Novo jogo, velhas regras: democracia e direito na era da nova propaganda política e das fake news</source>
                    <publisher-loc>Belo Horizonte, MG</publisher-loc>
                    <publisher-name>Grupo Editorial Letramento, Casa do Direito</publisher-name>
                    <year>2020</year>

                </element-citation>
            </ref>
            <ref id="B12">

                <mixed-citation>DAN, Viorela; PARIS, Britt; DONOVAN, Joan; <italic>et al</italic>. Visual Mis- and Disinformation, Social Media, and Democracy. <italic>Journalism &amp; Mass Communication Quarterly</italic>, v. 98, n. 3, p. 641-664, 2021.</mixed-citation>

                <element-citation publication-type="journal">
                    <person-group person-group-type="author">
                        <name>
                            <surname>DAN</surname>
                            <given-names>Viorela</given-names>
                        </name>
                        <name>
                            <surname>PARIS</surname>
                            <given-names>Britt</given-names>
                        </name>
                        <name>
                            <surname>DONOVAN</surname>
                            <given-names>Joan</given-names>
                        </name>
                        <etal/>
                    </person-group>
                    <article-title>Visual Mis- and Disinformation, Social Media, and Democracy</article-title>
                    <source>Journalism &amp; Mass Communication Quarterly</source>
                    <volume>98</volume>
                    <issue>3</issue>
                    <fpage>641</fpage>
                    <lpage>664</lpage>
                    <year>2021</year>

                </element-citation>
            </ref>
            <ref id="B13">

                <mixed-citation>DAS, Anupam; SCHROEDER, Ralph. Online disinformation in the run-up to the Indian 2019 election. <italic>Information, Communication &amp; Society</italic>, p. 1-17, 2020.</mixed-citation>

                <element-citation publication-type="journal">
                    <person-group person-group-type="author">
                        <name>
                            <surname>DAS</surname>
                            <given-names>Anupam</given-names>
                        </name>
                        <name>
                            <surname>SCHROEDER</surname>
                            <given-names>Ralph</given-names>
                        </name>
                    </person-group>
                    <article-title>Online disinformation in the run-up to the Indian 2019 election</article-title>
                    <source>Information, Communication &amp; Society</source>
                    <fpage>1</fpage>
                    <lpage>17</lpage>
                    <year>2020</year>

                </element-citation>
            </ref>
            <ref id="B14">

                <mixed-citation>DOBBER, Tom; Ó FATHAIGH, Ronan; ZUIDERVEEN BORGESIUS, Frederik J. The regulation of online political micro-targeting in Europe. <italic>Internet Policy Review</italic>, v. 8, n. 4, 2019. Disponível em: https://policyreview.info/node/1440. Acesso em: 16 set. 2021.</mixed-citation>

                <element-citation publication-type="journal">
                    <person-group person-group-type="author">
                        <name>
                            <surname>DOBBER</surname>
                            <given-names>Tom</given-names>
                        </name>
                        <name>
                            <surname>Ó FATHAIGH</surname>
                            <given-names>Ronan</given-names>
                        </name>
                        <name>
                            <surname>ZUIDERVEEN BORGESIUS</surname>
                            <given-names>Frederik J.</given-names>
                        </name>
                    </person-group>
                    <article-title>The regulation of online political micro-targeting in Europe</article-title>
                    <source>Internet Policy Review</source>
                    <volume>8</volume>
                    <issue>4</issue>
                    <year>2019</year>
                    <comment>Disponível em: <ext-link ext-link-type="uri" xlink:href="https://policyreview.info/node/1440">https://policyreview.info/node/1440</ext-link></comment>
                    <date-in-citation content-type="access-date">16 set. 2021</date-in-citation>

                </element-citation>
            </ref>
            <ref id="B15">

                <mixed-citation>DURACH, Flavia; BÂRGĂOANU, Alina; NASTASIU, Cătălina. Tackling Disinformation: EU Regulation of the Digital Space. <italic>Romanian Journal of European Affairs</italic>, v. 20, n. 1, 2020. Disponível em: http://rjea.ier.gov.ro/wp-content/uploads/2020/05/RJEA_vol.-20_no.1_June-2020_Full-issue.pdf#page=6.</mixed-citation>

                <element-citation publication-type="journal">
                    <person-group person-group-type="author">
                        <name>
                            <surname>DURACH</surname>
                            <given-names>Flavia</given-names>
                        </name>
                        <name>
                            <surname>BÂRGĂOANU</surname>
                            <given-names>Alina</given-names>
                        </name>
                        <name>
                            <surname>NASTASIU</surname>
                            <given-names>Cătălina</given-names>
                        </name>
                    </person-group>
                    <article-title>Tackling Disinformation: EU Regulation of the Digital Space</article-title>
                    <source>Romanian Journal of European Affairs</source>
                    <volume>20</volume>
                    <issue>1</issue>
                    <year>2020</year>
                    <comment>Disponível em: <ext-link ext-link-type="uri" xlink:href="http://rjea.ier.gov.ro/wp-content/uploads/2020/05/RJEA_vol.-20_no.1_June-2020_Full-issue.pdf#page=6">http://rjea.ier.gov.ro/wp-content/uploads/2020/05/RJEA_vol.-20_no.1_June-2020_Full-issue.pdf#page=6</ext-link></comment>

                </element-citation>
            </ref>
            <ref id="B16">

                <mixed-citation>EGELHOFER, Jana Laura; LECHELER, Sophie. Fake news as a two-dimensional phenomenon: a framework and research agenda. <italic>Annals of the International Communication Association</italic>, v. 43, n. 2, p. 97-116, 2019.</mixed-citation>

                <element-citation publication-type="journal">
                    <person-group person-group-type="author">
                        <name>
                            <surname>EGELHOFER</surname>
                            <given-names>Jana Laura</given-names>
                        </name>
                        <name>
                            <surname>LECHELER</surname>
                            <given-names>Sophie</given-names>
                        </name>
                    </person-group>
                    <article-title>Fake news as a two-dimensional phenomenon: a framework and research agenda</article-title>
                    <source>Annals of the International Communication Association</source>
                    <volume>43</volume>
                    <issue>2</issue>
                    <fpage>97</fpage>
                    <lpage>116</lpage>
                    <year>2019</year>

                </element-citation>
            </ref>
            <ref id="B17">

                <mixed-citation>EVANGELISTA, Rafael; BRUNO, Fernanda. WhatsApp and political instability in Brazil: targeted messages and political radicalisation. <italic>Internet Policy Review</italic>, v. 8, n. 4, 2019. Disponível em: https://policyreview.info/node/1434. Acesso em: 27 jun. 2021.</mixed-citation>

                <element-citation publication-type="journal">
                    <person-group person-group-type="author">
                        <name>
                            <surname>EVANGELISTA</surname>
                            <given-names>Rafael</given-names>
                        </name>
                        <name>
                            <surname>BRUNO</surname>
                            <given-names>Fernanda</given-names>
                        </name>
                    </person-group>
                    <article-title>WhatsApp and political instability in Brazil: targeted messages and political radicalisation</article-title>
                    <source>Internet Policy Review</source>
                    <volume>8</volume>
                    <issue>4</issue>
                    <year>2019</year>
                    <comment>Disponível em: <ext-link ext-link-type="uri" xlink:href="https://policyreview.info/node/1434">https://policyreview.info/node/1434</ext-link></comment>
                    <date-in-citation content-type="access-date">27 jun. 2021</date-in-citation>

                </element-citation>
            </ref>
            <ref id="B18">

                <mixed-citation>FARIS, Robert M.; ROBERTS, Hal; ETLING, Bruce; <italic>et al</italic>. <italic>Partisanship, Propaganda, and Disinformation: Online Media and the 2016 U.S. Presidential Election</italic>. Cambridge, U.S.: Berkman Klein Center for Internet &amp; Society at Harvard University, 2017. Disponível em: http://nrs.harvard.edu/urn-3:HUL.InstRepos:33759251.</mixed-citation>

                <element-citation publication-type="webpage">
                    <person-group person-group-type="author">
                        <name>
                            <surname>FARIS</surname>
                            <given-names>Robert M.</given-names>
                        </name>
                        <name>
                            <surname>ROBERTS</surname>
                            <given-names>Hal</given-names>
                        </name>
                        <name>
                            <surname>ETLING</surname>
                            <given-names>Bruce</given-names>
                        </name>
                        <etal/>
                    </person-group>
                    <source>Partisanship, Propaganda, and Disinformation: Online Media and the 2016 U.S. Presidential Election</source>
                    <publisher-loc>Cambridge, U.S.</publisher-loc>
                    <publisher-name>Berkman Klein Center for Internet &amp; Society at Harvard University</publisher-name>
                    <year>2017</year>
                    <comment>Disponível em: <ext-link ext-link-type="uri" xlink:href="http://nrs.harvard.edu/urn-3:HUL.InstRepos:33759251">http://nrs.harvard.edu/urn-3:HUL.InstRepos:33759251</ext-link></comment>

                </element-citation>
            </ref>
            <ref id="B19">

                <mixed-citation>FROSIO, Giancarlo. Why keep a dog and bark yourself? From intermediary liability to responsibility. <italic>International Journal of Law and Information Technology</italic>, p. 1-33, 2017.</mixed-citation>

                <element-citation publication-type="journal">
                    <person-group person-group-type="author">
                        <name>
                            <surname>FROSIO</surname>
                            <given-names>Giancarlo</given-names>
                        </name>
                    </person-group>
                    <article-title>Why keep a dog and bark yourself? From intermediary liability to responsibility</article-title>
                    <source>International Journal of Law and Information Technology</source>
                    <fpage>1</fpage>
                    <lpage>33</lpage>
                    <year>2017</year>

                </element-citation>
            </ref>
            <ref id="B20">

                <mixed-citation>GASSER, Urs; SCHULZ, Wolfgang. Governance of Online Intermediaries: Observations from a Series of National Case Studies. <italic>SSRN Electronic Journal</italic>, 2015. Disponível em: http://www.ssrn.com/abstract=2566364. Acesso em: 6 out. 2021.</mixed-citation>

                <element-citation publication-type="journal">
                    <person-group person-group-type="author">
                        <name>
                            <surname>GASSER</surname>
                            <given-names>Urs</given-names>
                        </name>
                        <name>
                            <surname>SCHULZ</surname>
                            <given-names>Wolfgang</given-names>
                        </name>
                    </person-group>
                    <article-title>Governance of Online Intermediaries: Observations from a Series of National Case Studies</article-title>
                    <source>SSRN Electronic Journal</source>
                    <year>2015</year>
                    <comment>Disponível em: <ext-link ext-link-type="uri" xlink:href="http://www.ssrn.com/abstract=2566364">http://www.ssrn.com/abstract=2566364</ext-link></comment>
                    <date-in-citation content-type="access-date">6 out. 2021</date-in-citation>

                </element-citation>
            </ref>
            <ref id="B21">

                <mixed-citation>GORWA, Robert; ASH, Timothy Garton. Democratic Transparency in the Platform Society. <italic>In</italic>: PERSILY, Nathaniel; TUCKER, Joshua A. (Orgs.). <italic>Social media and democracy: the state of the field, prospects for reform</italic>. Cambridge New York Port Melbourne New Delhi Singapore: Cambridge University Press, 2020.</mixed-citation>

                <element-citation publication-type="book">
                    <person-group person-group-type="author">
                        <name>
                            <surname>GORWA</surname>
                            <given-names>Robert</given-names>
                        </name>
                        <name>
                            <surname>ASH</surname>
                            <given-names>Timothy Garton</given-names>
                        </name>
                    </person-group>
                    <chapter-title>Democratic Transparency in the Platform Society</chapter-title>
                    <person-group person-group-type="compiler">
                        <name>
                            <surname>PERSILY</surname>
                            <given-names>Nathaniel</given-names>
                        </name>
                        <name>
                            <surname>TUCKER</surname>
                            <given-names>Joshua A.</given-names>
                        </name>
                    </person-group>
                    <source>Social media and democracy: the state of the field, prospects for reform</source>
                    <publisher-loc>Cambridge New York Port Melbourne New Delhi Singapore</publisher-loc>
                    <publisher-name>Cambridge University Press</publisher-name>
                    <year>2020</year>

                </element-citation>
            </ref>
            <ref id="B22">

                <mixed-citation>GUESS, Andrew M.; LYONS, Benjamin A. Misinformation, Disinformation and Online Propaganda. <italic>In</italic>: PERSILY, Nathaniel; TUCKER, Joshua A. (Eds.). <italic>Social Media and Democracy. The State of the Field, Prospects for Reform</italic>. [s.l.]: Cambridge University Press, 2020.</mixed-citation>

                <element-citation publication-type="book">
                    <person-group person-group-type="author">
                        <name>
                            <surname>GUESS</surname>
                            <given-names>Andrew M.</given-names>
                        </name>
                        <name>
                            <surname>LYONS</surname>
                            <given-names>Benjamin A.</given-names>
                        </name>
                    </person-group>
                    <chapter-title>Misinformation, Disinformation and Online Propaganda</chapter-title>
                    <person-group person-group-type="editor">
                        <name>
                            <surname>PERSILY</surname>
                            <given-names>Nathaniel</given-names>
                        </name>
                        <name>
                            <surname>TUCKER</surname>
                            <given-names>Joshua A.</given-names>
                        </name>
                    </person-group>
                    <source>Social Media and Democracy. The State of the Field, Prospects for Reform</source>
                    <publisher-name>Cambridge University Press</publisher-name>
                    <year>2020</year>

                </element-citation>
            </ref>
            <ref id="B23">

                <mixed-citation>HELBERGER, Natali. The Political Power of Platforms: How Current Attempts to Regulate Misinformation Amplify Opinion Power. <italic>Digital Journalism</italic>, v. 8, n. 6, p. 842-854, 2020.</mixed-citation>

                <element-citation publication-type="journal">
                    <person-group person-group-type="author">
                        <name>
                            <surname>HELBERGER</surname>
                            <given-names>Natali</given-names>
                        </name>
                    </person-group>
                    <article-title>The Political Power of Platforms: How Current Attempts to Regulate Misinformation Amplify Opinion Power</article-title>
                    <source>Digital Journalism</source>
                    <volume>8</volume>
                    <issue>6</issue>
                    <fpage>842</fpage>
                    <lpage>854</lpage>
                    <year>2020</year>

                </element-citation>
            </ref>
            <ref id="B24">

                <mixed-citation>HELDT, Amélie. Reading between the lines and the numbers: an analysis of the first NetzDG reports. <italic>Internet Policy Review</italic>, v. 8, n. 2, 2019. Disponível em: https://policyreview.info/node/1398. Acesso em: 10 out. 2021.</mixed-citation>

                <element-citation publication-type="journal">
                    <person-group person-group-type="author">
                        <name>
                            <surname>HELDT</surname>
                            <given-names>Amélie</given-names>
                        </name>
                    </person-group>
                    <article-title>Reading between the lines and the numbers: an analysis of the first NetzDG reports</article-title>
                    <source>Internet Policy Review</source>
                    <volume>8</volume>
                    <issue>2</issue>
                    <year>2019</year>
                    <comment>Disponível em: <ext-link ext-link-type="uri" xlink:href="https://policyreview.info/node/1398">https://policyreview.info/node/1398</ext-link></comment>
                    <date-in-citation content-type="access-date">10 out. 2021</date-in-citation>

                </element-citation>
            </ref>
            <ref id="B25">

                <mixed-citation>HOFMANN, Jeanette. Mediated democracy – Linking digital technology to political agency. <italic>Internet Policy Review</italic>, v. 8, n. 2, 2019. Disponível em: https://policyreview.info/node/1416. Acesso em: 23 ago. 2021.</mixed-citation>

                <element-citation publication-type="journal">
                    <person-group person-group-type="author">
                        <name>
                            <surname>HOFMANN</surname>
                            <given-names>Jeanette</given-names>
                        </name>
                    </person-group>
                    <article-title>Mediated democracy – Linking digital technology to political agency</article-title>
                    <source>Internet Policy Review</source>
                    <volume>8</volume>
                    <issue>2</issue>
                    <year>2019</year>
                    <comment>Disponível em: <ext-link ext-link-type="uri" xlink:href="https://policyreview.info/node/1416">https://policyreview.info/node/1416</ext-link></comment>
                    <date-in-citation content-type="access-date">23 ago. 2021</date-in-citation>

                </element-citation>
            </ref>
            <ref id="B26">

                <mixed-citation>IGLESIAS KELLER, Clara. Policy by judicialisation: the institutional framework for intermediary liability in Brazil. <italic>International Review of Law, Computers &amp; Technology</italic>, p. 1-19, 2020.</mixed-citation>

                <element-citation publication-type="journal">
                    <person-group person-group-type="author">
                        <name>
                            <surname>IGLESIAS KELLER</surname>
                            <given-names>Clara</given-names>
                        </name>
                    </person-group>
                    <article-title>Policy by judicialisation: the institutional framework for intermediary liability in Brazil</article-title>
                    <source>International Review of Law, Computers &amp; Technology</source>
                    <fpage>1</fpage>
                    <lpage>19</lpage>
                    <year>2020</year>

                </element-citation>
            </ref>
            <ref id="B27">

                <mixed-citation>JUNGHERR, Andreas; SCHROEDER, Ralph. Disinformation and the Structural Transformations of the Public Arena: Addressing the Actual Challenges to Democracy. <italic>Social Media + Society</italic>, v. 7, n. 1, p. 205630512198892, 2021.</mixed-citation>

                <element-citation publication-type="journal">
                    <person-group person-group-type="author">
                        <name>
                            <surname>JUNGHERR</surname>
                            <given-names>Andreas</given-names>
                        </name>
                        <name>
                            <surname>SCHROEDER</surname>
                            <given-names>Ralph</given-names>
                        </name>
                    </person-group>
                    <article-title>Disinformation and the Structural Transformations of the Public Arena: Addressing the Actual Challenges to Democracy</article-title>
                    <source>Social Media + Society</source>
                    <volume>7</volume>
                    <issue>1</issue>
                    <year>2021</year>

                </element-citation>
            </ref>
            <ref id="B28">

                <mixed-citation>KARPF, David. On Digital Disinformation and Democratic Myths. <italic>MediaWell, Social Science Research Council</italic>, 2019. Disponível em: https://mediawell.ssrc.org/expert-reflections/on-digital-disinformation-and-democratic-myths/. Acesso em: 10 jan. 2020.</mixed-citation>

                <element-citation publication-type="webpage">
                    <person-group person-group-type="author">
                        <name>
                            <surname>KARPF</surname>
                            <given-names>David</given-names>
                        </name>
                    </person-group>
                    <comment>On Digital Disinformation and Democratic Myths</comment>
                    <source>MediaWell, Social Science Research Council</source>
                    <year>2019</year>
                    <comment>Disponível em: <ext-link ext-link-type="uri" xlink:href="https://mediawell.ssrc.org/expert-reflections/on-digital-disinformation-and-democratic-myths/">https://mediawell.ssrc.org/expert-reflections/on-digital-disinformation-and-democratic-myths/</ext-link></comment>
                    <date-in-citation content-type="access-date">10 jan. 2020</date-in-citation>

                </element-citation>
            </ref>
            <ref id="B29">

                <mixed-citation>KHAN, Irene. <italic>Disinformation and freedom of opinion and expression</italic>. [s.l.]: United Nations, General Assembly, 2021.</mixed-citation>

                <element-citation publication-type="book">
                    <person-group person-group-type="author">
                        <name>
                            <surname>KHAN</surname>
                            <given-names>Irene</given-names>
                        </name>
                    </person-group>
                    <source>Disinformation and freedom of opinion and expression</source>
                    <publisher-name>United Nations, General Assembly</publisher-name>
                    <year>2021</year>

                </element-citation>
            </ref>
            <ref id="B30">

                <mixed-citation>MAC SÍTHIGH, Daithí. The road to responsibilities: new attitudes towards Internet intermediaries. <italic>Information &amp; Communications Technology Law</italic>, v. 29, n. 1, p. 1-21, 2020.</mixed-citation>

                <element-citation publication-type="journal">
                    <person-group person-group-type="author">
                        <name>
                            <surname>MAC SÍTHIGH</surname>
                            <given-names>Daithí</given-names>
                        </name>
                    </person-group>
                    <article-title>The road to responsibilities: new attitudes towards Internet intermediaries</article-title>
                    <source>Information &amp; Communications Technology Law</source>
                    <volume>29</volume>
                    <issue>1</issue>
                    <fpage>1</fpage>
                    <lpage>21</lpage>
                    <year>2020</year>

                </element-citation>
            </ref>
            <ref id="B31">

                <mixed-citation>MACEDO JUNIOR, Ronaldo Porto. Freedom of Expression: what lessons should we learn from US experience? <italic>Revista Direito GV</italic>, v. 13, n. 1, p. 274-302, 2017.</mixed-citation>

                <element-citation publication-type="journal">
                    <person-group person-group-type="author">
                        <name>
                            <surname>MACEDO</surname>
                            <given-names>Ronaldo Porto</given-names>
                            <suffix>JUNIOR</suffix>
                        </name>
                    </person-group>
                    <article-title>Freedom of Expression: what lessons should we learn from US experience?</article-title>
                    <source>Revista Direito GV</source>
                    <volume>13</volume>
                    <issue>1</issue>
                    <fpage>274</fpage>
                    <lpage>302</lpage>
                    <year>2017</year>

                </element-citation>
            </ref>
            <ref id="B32">

                <mixed-citation>MANSELL, Robin; STEINMUELLER, W. Edward. <italic>Advanced introduction to platform economics</italic>. Cheltenham, UK; Northampton, MA: Edward Elgar Publishing, 2020. (Elgar advanced introductions).</mixed-citation>

                <element-citation publication-type="book">
                    <person-group person-group-type="author">
                        <name>
                            <surname>MANSELL</surname>
                            <given-names>Robin</given-names>
                        </name>
                        <name>
                            <surname>STEINMUELLER</surname>
                            <given-names>W. Edward</given-names>
                        </name>
                    </person-group>
                    <source>Advanced introduction to platform economics</source>
                    <publisher-loc>Cheltenham, UK</publisher-loc>
                    <publisher-loc>Northampton, MA</publisher-loc>
                    <publisher-name>Edward Elgar Publishing</publisher-name>
                    <year>2020</year>
                    <comment>Elgar advanced introductions</comment>

                </element-citation>
            </ref>
            <ref id="B33">

                <mixed-citation>MARSDEN, Chris; MEYER, Trisha; BROWN, Ian. Platform values and democratic elections: How can the law regulate digital disinformation? <italic>Computer Law &amp; Security Review</italic>, v. 36, p. 105373, 2020.</mixed-citation>

                <element-citation publication-type="journal">
                    <person-group person-group-type="author">
                        <name>
                            <surname>MARSDEN</surname>
                            <given-names>Chris</given-names>
                        </name>
                        <name>
                            <surname>MEYER</surname>
                            <given-names>Trisha</given-names>
                        </name>
                        <name>
                            <surname>BROWN</surname>
                            <given-names>Ian</given-names>
                        </name>
                    </person-group>
                    <article-title>Platform values and democratic elections: How can the law regulate digital disinformation?</article-title>
                    <source>Computer Law &amp; Security Review</source>
                    <volume>36</volume>
                    <elocation-id>105373</elocation-id>
                    <year>2020</year>

                </element-citation>
            </ref>
            <ref id="B34">

                <mixed-citation>MARWICK, Alice; KUO, Rachel; CAMERON, Shanice Jones; <italic>et al</italic>. <italic>Critical Disinformation Studies – A Syllabus</italic>. [s.l.]: Center for Information, Technology and Public Life – University of North Carolina at Chapel Hill, 2021. Disponível em: https://citap.unc.edu/research/critical-disinfo/. Acesso em: 5 ago. 2021.</mixed-citation>

                <element-citation publication-type="webpage">
                    <person-group person-group-type="author">
                        <name>
                            <surname>MARWICK</surname>
                            <given-names>Alice</given-names>
                        </name>
                        <name>
                            <surname>KUO</surname>
                            <given-names>Rachel</given-names>
                        </name>
                        <name>
                            <surname>CAMERON</surname>
                            <given-names>Shanice Jones</given-names>
                        </name>
                        <etal/>
                    </person-group>
                    <source>Critical Disinformation Studies – A Syllabus</source>
                    <publisher-name>Center for Information, Technology and Public Life – University of North Carolina at Chapel Hill</publisher-name>
                    <year>2021</year>
                    <comment>Disponível em: <ext-link ext-link-type="uri" xlink:href="https://citap.unc.edu/research/critical-disinfo/">https://citap.unc.edu/research/critical-disinfo/</ext-link></comment>
                    <date-in-citation content-type="access-date">5 ago. 2021</date-in-citation>

                </element-citation>
            </ref>
            <ref id="B35">

                <mixed-citation>MARWICK, Alice; LEWIS, Rebecca. <italic>Media manipulation and disinformation online</italic>. [s.l.]: Data &amp; Society Research Institute, 2020. Disponível em: https://datasociety.net/wp-content/uploads/2017/05/DataAndSociety_MediaManipulationAndDisinformationOnline-1.pdf. Acesso em: 10 nov. 2020.</mixed-citation>

                <element-citation publication-type="webpage">
                    <person-group person-group-type="author">
                        <name>
                            <surname>MARWICK</surname>
                            <given-names>Alice</given-names>
                        </name>
                        <name>
                            <surname>LEWIS</surname>
                            <given-names>Rebecca</given-names>
                        </name>
                    </person-group>
                    <source>Media manipulation and disinformation online</source>
                    <publisher-name>Data &amp; Society Research Institute</publisher-name>
                    <year>2020</year>
                    <comment>Disponível em: <ext-link ext-link-type="uri" xlink:href="https://datasociety.net/wp-content/uploads/2017/05/DataAndSociety_MediaManipulationAndDisinformationOnline-1.pdf">https://datasociety.net/wp-content/uploads/2017/05/DataAndSociety_MediaManipulationAndDisinformationOnline-1.pdf</ext-link></comment>
                    <date-in-citation content-type="access-date">10 nov. 2020</date-in-citation>

                </element-citation>
            </ref>
            <ref id="B36">

                <mixed-citation>MENDES, Laura Schertel. <italic>Privacidade, proteção de dados e direito do consumidor: linhas gerais de um novo direito fundamental</italic>. São Paulo: Saraiva, 2014.</mixed-citation>

                <element-citation publication-type="book">
                    <person-group person-group-type="author">
                        <name>
                            <surname>MENDES</surname>
                            <given-names>Laura Schertel</given-names>
                        </name>
                    </person-group>
                    <source>Privacidade, proteção de dados e direito do consumidor: linhas gerais de um novo direito fundamental</source>
                    <publisher-loc>São Paulo</publisher-loc>
                    <publisher-name>Saraiva</publisher-name>
                    <year>2014</year>

                </element-citation>
            </ref>
            <ref id="B37">

                <mixed-citation>MOSES, Lyria Bennett. How to Think about Law, Regulation and Technology: Problems with ‘Technology’ as a Regulatory Target. <italic>Law, Innovation and Technology</italic>, v. 5, n. 1, p. 1-20, 2013.</mixed-citation>

                <element-citation publication-type="journal">
                    <person-group person-group-type="author">
                        <name>
                            <surname>MOSES</surname>
                            <given-names>Lyria Bennett</given-names>
                        </name>
                    </person-group>
                    <article-title>How to Think about Law, Regulation and Technology: Problems with ‘Technology’ as a Regulatory Target</article-title>
                    <source>Law, Innovation and Technology</source>
                    <volume>5</volume>
                    <issue>1</issue>
                    <fpage>1</fpage>
                    <lpage>20</lpage>
                    <year>2013</year>

                </element-citation>
            </ref>
            <ref id="B38">

                <mixed-citation>NENADIĆ, Iva. Unpacking the “European approach” to tackling challenges of disinformation and political manipulation. <italic>Internet Policy Review</italic>, v. 8, n. 4, 2019. Disponível em: https://policyreview.info/node/1436. Acesso em: 27 jun. 2021.</mixed-citation>

                <element-citation publication-type="journal">
                    <person-group person-group-type="author">
                        <name>
                            <surname>NENADIĆ</surname>
                            <given-names>Iva</given-names>
                        </name>
                    </person-group>
                    <article-title>Unpacking the “European approach” to tackling challenges of disinformation and political manipulation</article-title>
                    <source>Internet Policy Review</source>
                    <volume>8</volume>
                    <issue>4</issue>
                    <year>2019</year>
                    <comment>Disponível em: <ext-link ext-link-type="uri" xlink:href="https://policyreview.info/node/1436">https://policyreview.info/node/1436</ext-link></comment>
                    <date-in-citation content-type="access-date">27 jun. 2021</date-in-citation>

                </element-citation>
            </ref>
            <ref id="B39">

                <mixed-citation>NEO, Ric. The International Discourses and Governance of Fake News. <italic>Global Policy</italic>, v. 12, n. 2, p. 214-228, 2021.</mixed-citation>

                <element-citation publication-type="journal">
                    <person-group person-group-type="author">
                        <name>
                            <surname>NEO</surname>
                            <given-names>Ric</given-names>
                        </name>
                    </person-group>
                    <article-title>The International Discourses and Governance of Fake News</article-title>
                    <source>Global Policy</source>
                    <volume>12</volume>
                    <issue>2</issue>
                    <fpage>214</fpage>
                    <lpage>228</lpage>
                    <year>2021</year>

                </element-citation>
            </ref>
            <ref id="B40">

                <mixed-citation>OGNYANOVA, Katherine; LAZER, David; ROBERTSON, Ronald E.; <italic>et al</italic>. Misinformation in action: Fake news exposure is linked to lower trust in media, higher trust in government when your side is in power. <italic>Harvard Kennedy School Misinformation Review</italic>, 2020. Disponível em: https://misinforeview.hks.harvard.edu/?p=1689. Acesso em: 30 ago. 2021.</mixed-citation>

                <element-citation publication-type="journal">
                    <person-group person-group-type="author">
                        <name>
                            <surname>OGNYANOVA</surname>
                            <given-names>Katherine</given-names>
                        </name>
                        <name>
                            <surname>LAZER</surname>
                            <given-names>David</given-names>
                        </name>
                        <name>
                            <surname>ROBERTSON</surname>
                            <given-names>Ronald E.</given-names>
                        </name>
                        <etal/>
                    </person-group>
                    <article-title>Misinformation in action: Fake news exposure is linked to lower trust in media, higher trust in government when your side is in power</article-title>
                    <source>Harvard Kennedy School Misinformation Review</source>
                    <year>2020</year>
                    <comment>Disponível em: <ext-link ext-link-type="uri" xlink:href="https://misinforeview.hks.harvard.edu/?p=1689">https://misinforeview.hks.harvard.edu/?p=1689</ext-link></comment>
                    <date-in-citation content-type="access-date">30 ago. 2021</date-in-citation>

                </element-citation>
            </ref>
            <ref id="B41">

                <mixed-citation>QI, Aimin; SHAO, Guosong; ZHENG, Wentong. Assessing China’s Cybersecurity Law. <italic>Computer Law &amp; Security Review</italic>, v. 34, n. 6, p. 1342-1354, 2018.</mixed-citation>

                <element-citation publication-type="journal">
                    <person-group person-group-type="author">
                        <name>
                            <surname>QI</surname>
                            <given-names>Aimin</given-names>
                        </name>
                        <name>
                            <surname>SHAO</surname>
                            <given-names>Guosong</given-names>
                        </name>
                        <name>
                            <surname>ZHENG</surname>
                            <given-names>Wentong</given-names>
                        </name>
                    </person-group>
                    <article-title>Assessing China’s Cybersecurity Law</article-title>
                    <source>Computer Law &amp; Security Review</source>
                    <volume>34</volume>
                    <issue>6</issue>
                    <fpage>1342</fpage>
                    <lpage>1354</lpage>
                    <year>2018</year>

                </element-citation>
            </ref>
            <ref id="B42">

                <mixed-citation>RAUCHFLEISCH, Adrian; KAISER, Jonas. The False positive problem of automatic bot detection in social science research. <italic>PLOS ONE</italic>, v. 15, n. 10, p. e0241045, 2020.</mixed-citation>

                <element-citation publication-type="journal">
                    <person-group person-group-type="author">
                        <name>
                            <surname>RAUCHFLEISCH</surname>
                            <given-names>Adrian</given-names>
                        </name>
                        <name>
                            <surname>KAISER</surname>
                            <given-names>Jonas</given-names>
                        </name>
                    </person-group>
                    <article-title>The False positive problem of automatic bot detection in social science research</article-title>
                    <source>PLOS ONE</source>
                    <volume>15</volume>
                    <issue>10</issue>
                    <elocation-id>e0241045</elocation-id>
                    <year>2020</year>

                </element-citation>
            </ref>
            <ref id="B43">

                <mixed-citation>RIEDER, Bernhard; HOFMANN, Jeanette. Towards platform observability. <italic>Internet Policy Review</italic>, v. 9, n. 4, 2020. Disponível em: https://policyreview.info/articles/analysis/towards-platform-observability. Acesso em: 6 out. 2021.</mixed-citation>

                <element-citation publication-type="journal">
                    <person-group person-group-type="author">
                        <name>
                            <surname>RIEDER</surname>
                            <given-names>Bernhard</given-names>
                        </name>
                        <name>
                            <surname>HOFMANN</surname>
                            <given-names>Jeanette</given-names>
                        </name>
                    </person-group>
                    <article-title>Towards platform observability</article-title>
                    <source>Internet Policy Review</source>
                    <volume>9</volume>
                    <issue>4</issue>
                    <year>2020</year>
                    <comment>Disponível em: <ext-link ext-link-type="uri" xlink:href="https://policyreview.info/articles/analysis/towards-platform-observability">https://policyreview.info/articles/analysis/towards-platform-observability</ext-link></comment>
                    <date-in-citation content-type="access-date">6 out. 2021</date-in-citation>

                </element-citation>
            </ref>
            <ref id="B44">

                <mixed-citation>ROSSINI, Patrícia; STROMER-GALLEY, Jennifer; BAPTISTA, Erica Anita; <italic>et al</italic>. Dysfunctional information sharing on WhatsApp and Facebook: The role of political talk, cross-cutting exposure and social corrections. <italic>New Media &amp; Society</italic>, v. 23, n. 8, p. 2430-2451, 2021.</mixed-citation>

                <element-citation publication-type="journal">
                    <person-group person-group-type="author">
                        <name>
                            <surname>ROSSINI</surname>
                            <given-names>Patrícia</given-names>
                        </name>
                        <name>
                            <surname>STROMER-GALLEY</surname>
                            <given-names>Jennifer</given-names>
                        </name>
                        <name>
                            <surname>BAPTISTA</surname>
                            <given-names>Erica Anita</given-names>
                        </name>
                        <etal/>
                    </person-group>
                    <article-title>Dysfunctional information sharing on WhatsApp and Facebook: The role of political talk, cross-cutting exposure and social corrections</article-title>
                    <source>New Media &amp; Society</source>
                    <volume>23</volume>
                    <issue>8</issue>
                    <fpage>2430</fpage>
                    <lpage>2451</lpage>
                    <year>2021</year>

                </element-citation>
            </ref>
            <ref id="B45">

                <mixed-citation>SCHULDT, Lasse. The rebirth of Malaysia’s fake news law – and what the NetzDG has to do with it. Disponível em: https://verfassungsblog.de/malaysia-fake-news/. Acesso em: 13 set. 2021.</mixed-citation>

                <element-citation publication-type="webpage">
                    <person-group person-group-type="author">
                        <name>
                            <surname>SCHULDT</surname>
                            <given-names>Lasse</given-names>
                        </name>
                    </person-group>
                    <source>The rebirth of Malaysia’s fake news law – and what the NetzDG has to do with it</source>
                    <year>2021</year>
                    <comment>Disponível em: <ext-link ext-link-type="uri" xlink:href="https://verfassungsblog.de/malaysia-fake-news/">https://verfassungsblog.de/malaysia-fake-news/</ext-link></comment>
                    <date-in-citation content-type="access-date">13 set. 2021</date-in-citation>

                </element-citation>
            </ref>
            <ref id="B46">

                <mixed-citation>SCHULZ, Wolfgang. <italic>Roles and Responsibilities of Information Intermediaries: Fighting Misinformation as a Test Case for Human-Rights Respecting Governance of Social Media Platforms</italic>. [s.l.]: Hoover Institution, Stanford University, 2019. (Aegis Series). Disponível em: https://www.hoover.org/sites/default/files/research/docs/schulz_webreadypdf.pdf. Acesso em: 12 dez. 2019.</mixed-citation>

                <element-citation publication-type="webpage">
                    <person-group person-group-type="author">
                        <name>
                            <surname>SCHULZ</surname>
                            <given-names>Wolfgang</given-names>
                        </name>
                    </person-group>
                    <source>Roles and Responsibilities of Information Intermediaries: Fighting Misinformation as a Test Case for Human-Rights Respecting Governance of Social Media Platforms</source>
                    <publisher-name>Hoover Institution, Stanford University</publisher-name>
                    <year>2019</year>
                    <comment>Disponível em: <ext-link ext-link-type="uri" xlink:href="https://www.hoover.org/sites/default/files/research/docs/schulz_webreadypdf.pdf">https://www.hoover.org/sites/default/files/research/docs/schulz_webreadypdf.pdf</ext-link></comment>
                    <date-in-citation content-type="access-date">12 dez. 2019</date-in-citation>

                </element-citation>
            </ref>
            <ref id="B47">

                <mixed-citation>SUZOR, Nicolas P.; WEST, Sarah Meyers; QUODLING, Andrew; <italic>et al</italic>. What Do We Mean When We Talk About Transparency? Toward Meaningful Transparency in Commercial Content Moderation. <italic>International Journal of Communication</italic>, v. 13, p. 1526-1543, 2019.</mixed-citation>

                <element-citation publication-type="journal">
                    <person-group person-group-type="author">
                        <name>
                            <surname>SUZOR</surname>
                            <given-names>Nicolas P.</given-names>
                        </name>
                        <name>
                            <surname>WEST</surname>
                            <given-names>Sarah Meyers</given-names>
                        </name>
                        <name>
                            <surname>QUODLING</surname>
                            <given-names>Andrew</given-names>
                        </name>
                        <etal/>
                    </person-group>
                    <article-title>What Do We Mean When We Talk About Transparency? Toward Meaningful Transparency in Commercial Content Moderation</article-title>
                    <source>International Journal of Communication</source>
                    <volume>13</volume>
                    <fpage>1526</fpage>
                    <lpage>1543</lpage>
                    <year>2019</year>

                </element-citation>
            </ref>
            <ref id="B48">

                <mixed-citation>SUZOR, Nicolas; VAN GEELEN, Tess; MYERS WEST, Sarah. Evaluating the legitimacy of platform governance: A review of research and a shared research agenda. <italic>International Communication Gazette</italic>, v. 80, n. 4, p. 385-400, 2018.</mixed-citation>

                <element-citation publication-type="journal">
                    <person-group person-group-type="author">
                        <name>
                            <surname>SUZOR</surname>
                            <given-names>Nicolas</given-names>
                        </name>
                        <name>
                            <surname>VAN GEELEN</surname>
                            <given-names>Tess</given-names>
                        </name>
                        <name>
                            <surname>MYERS WEST</surname>
                            <given-names>Sarah</given-names>
                        </name>
                    </person-group>
                    <article-title>Evaluating the legitimacy of platform governance: A review of research and a shared research agenda</article-title>
                    <source>International Communication Gazette</source>
                    <volume>80</volume>
                    <issue>4</issue>
                    <fpage>385</fpage>
                    <lpage>400</lpage>
                    <year>2018</year>

                </element-citation>
            </ref>
            <ref id="B49">

                <mixed-citation>SYLVAIN, Oliver. Internet governance and democratic legitimacy. <italic>Federal Communications Law Journal</italic>, v. 62, n. 2, p. 205-274, 2010.</mixed-citation>

                <element-citation publication-type="journal">
                    <person-group person-group-type="author">
                        <name>
                            <surname>SYLVAIN</surname>
                            <given-names>Oliver</given-names>
                        </name>
                    </person-group>
                    <article-title>Internet governance and democratic legitimacy</article-title>
                    <source>Federal Communications Law Journal</source>
                    <volume>62</volume>
                    <issue>2</issue>
                    <fpage>205</fpage>
                    <lpage>274</lpage>
                    <year>2010</year>

                </element-citation>
            </ref>
            <ref id="B50">

                <mixed-citation>TAMBINI, Damian. Rights and Responsibilities of Internet Intermediaries in Europe: The Need for Policy Coordination. Disponível em: https://www.cigionline.org/articles/rights-and-responsibilities-internet-intermediaries-europe-need-policy-coordination. Acesso em: 14 abr. 2021.</mixed-citation>

                <element-citation publication-type="webpage">
                    <person-group person-group-type="author">
                        <name>
                            <surname>TAMBINI</surname>
                            <given-names>Damian</given-names>
                        </name>
                    </person-group>
                    <source>Rights and Responsibilities of Internet Intermediaries in Europe: The Need for Policy Coordination</source>
                    <year>2021</year>
                    <comment>Disponível em: <ext-link ext-link-type="uri" xlink:href="https://www.cigionline.org/articles/rights-and-responsibilities-internet-intermediaries-europe-need-policy-coordination">https://www.cigionline.org/articles/rights-and-responsibilities-internet-intermediaries-europe-need-policy-coordination</ext-link></comment>
                    <date-in-citation content-type="access-date">14 abr. 2021</date-in-citation>

                </element-citation>
            </ref>
            <ref id="B51">

                <mixed-citation>VALENTE, Jonas C. L. Regulando desinformação e fake news: um panorama internacional das respostas ao problema. <italic>Comunicação pública</italic>, v. 14, n. 27, 2019. Disponível em: http://journals.openedition.org/cp/5262. Acesso em: 27 jun. 2021.</mixed-citation>

                <element-citation publication-type="journal">
                    <person-group person-group-type="author">
                        <name>
                            <surname>VALENTE</surname>
                            <given-names>Jonas C. L.</given-names>
                        </name>
                    </person-group>
                    <article-title>Regulando desinformação e fake news: um panorama internacional das respostas ao problema</article-title>
                    <source>Comunicação pública</source>
                    <volume>14</volume>
                    <issue>27</issue>
                    <year>2019</year>
                    <comment>Disponível em: <ext-link ext-link-type="uri" xlink:href="http://journals.openedition.org/cp/5262">http://journals.openedition.org/cp/5262</ext-link></comment>
                    <date-in-citation content-type="access-date">27 jun. 2021</date-in-citation>

                </element-citation>
            </ref>
            <ref id="B52">

                <mixed-citation>WALKER, Shawn; MERCEA, Dan; BASTOS, Marco. The disinformation landscape and the lockdown of social platforms. <italic>Information, Communication &amp; Society</italic>, v. 22, n. 11, p. 1531-1543, 2019.</mixed-citation>

                <element-citation publication-type="journal">
                    <person-group person-group-type="author">
                        <name>
                            <surname>WALKER</surname>
                            <given-names>Shawn</given-names>
                        </name>
                        <name>
                            <surname>MERCEA</surname>
                            <given-names>Dan</given-names>
                        </name>
                        <name>
                            <surname>BASTOS</surname>
                            <given-names>Marco</given-names>
                        </name>
                    </person-group>
                    <article-title>The disinformation landscape and the lockdown of social platforms</article-title>
                    <source>Information, Communication &amp; Society</source>
                    <volume>22</volume>
                    <issue>11</issue>
                    <fpage>1531</fpage>
                    <lpage>1543</lpage>
                    <year>2019</year>

                </element-citation>
            </ref>
            <ref id="B53">

                <mixed-citation>WARDLE, Claire; DERAKHSHAN, Hossein. <italic>Information Disorder: Toward an interdisciplinary framework for research and policy making</italic>. [s.l.]: Council of Europe, 2017. (Council of Europe Report).</mixed-citation>

                <element-citation publication-type="book">
                    <person-group person-group-type="author">
                        <name>
                            <surname>WARDLE</surname>
                            <given-names>Claire</given-names>
                        </name>
                        <name>
                            <surname>DERAKHSHAN</surname>
                            <given-names>Hossein</given-names>
                        </name>
                    </person-group>
                    <source>Information Disorder: Toward an interdisciplinary framework for research and policy making</source>
                    <publisher-name>Council of Europe</publisher-name>
                    <year>2017</year>
                    <comment>Council of Europe Report</comment>

                </element-citation>
            </ref>
            <ref id="B54">

                <mixed-citation>WISEMAN, Jamie. Rush to pass ‘fake news’ laws during Covid-19 intensifying global media freedom challenges. <italic>International Press Institute</italic>, 2020. Disponível em: https://ipi.media/rush-to-pass-fake-news-laws-during-covid-19-intensifying-global-media-freedom-challenges/.</mixed-citation>

                <element-citation publication-type="webpage">
                    <person-group person-group-type="author">
                        <name>
                            <surname>WISEMAN</surname>
                            <given-names>Jamie</given-names>
                        </name>
                    </person-group>
                    <article-title>Rush to pass ‘fake news’ laws during Covid-19 intensifying global media freedom challenges</article-title>
                    <source>International Press Institute</source>
                    <year>2020</year>
                    <comment>Disponível em: <ext-link ext-link-type="uri" xlink:href="https://ipi.media/rush-to-pass-fake-news-laws-during-covid-19-intensifying-global-media-freedom-challenges/">https://ipi.media/rush-to-pass-fake-news-laws-during-covid-19-intensifying-global-media-freedom-challenges/</ext-link></comment>

                </element-citation>
            </ref>
            <ref id="B55">

                <mixed-citation>ZAROUALI, Brahim; DOBBER, Tom; DE PAUW, Guy; <italic>et al</italic>. Using a Personality-Profiling Algorithm to Investigate Political Microtargeting: Assessing the Persuasion Effects of Personality-Tailored Ads on Social Media. <italic>Communication Research</italic>, 2020.</mixed-citation>

                <element-citation publication-type="journal">
                    <person-group person-group-type="author">
                        <name>
                            <surname>ZAROUALI</surname>
                            <given-names>Brahim</given-names>
                        </name>
                        <name>
                            <surname>DOBBER</surname>
                            <given-names>Tom</given-names>
                        </name>
                        <name>
                            <surname>DE PAUW</surname>
                            <given-names>Guy</given-names>
                        </name>
                        <etal/>
                    </person-group>
                    <article-title>Using a Personality-Profiling Algorithm to Investigate Political Microtargeting: Assessing the Persuasion Effects of Personality-Tailored Ads on Social Media</article-title>
                    <source>Communication Research</source>
                    <year>2020</year>

                </element-citation>
            </ref>

        </ref-list>
    </back>
</article>
