Facebook Whistleblower Testifies Before Congress

Captions
okay good morning we're gonna get started here in order to provide our technical and digital staff with notice of the hearing start i'm going to count down from five before calling the hearing to order five four three two one the committee will now come to order today the subcommittee on communications and technology is holding a hearing entitled hold big tech accountable targeted reforms to tech's legal immunity due to the covid-19 public health emergency members can participate in today's hearing either in person or remotely via online video conferencing members who are not vaccinated and participating in person must wear a mask and be socially distanced such members may remove their mask when they are under recognition and speaking from a microphone staff and press who are not vaccinated and present in the committee room must wear a mask at all times and be socially distant for members participating remotely your microphones will be set on mute for the purpose of eliminating inadvertent background noise members participating remotely will need to unmute your microphone each time you wish to speak please note that once you unmute your microphone anything that is said in webex will be heard over the loudspeakers in the committee room and subject to be heard by livestream and c-span since members are participating from different locations at today's hearing all recognition of members such as for questions will be in the order of subcommittee seniority documents for the record can be sent to joe orlando at the email address we provided to staff all documents will be entered into the record at the conclusion of the hearing we're now going to have opening statements the chair now recognizes himself for five minutes for an opening statement in august 2015 wesley greer a young man who had been recovering from addiction went to a website seeking to purchase heroin this website's algorithm took users' information to steer them to groups and individuals
who had similar interests in wesley's case the website connected him to a drug dealer this dealer had been subject to multiple investigations by law enforcement due to his actions on this particular website after the website's algorithm steered wesley to this drug dealer's postings the two got into direct contact and wesley bought what he thought was heroin but in fact was a lethal dose of fentanyl wesley was found dead on august 19th in 2016 another young man matthew herrick ended an abusive relationship he soon realized that his ex had created a fake profile of him on a dating app this app's geotargeting function and algorithm allowed other users to connect with this fake profile through this app matthew's ex sent men to matthew's home and work with the expectation that they would be fulfilling his rape fantasy these traumatizing encounters matthew was followed home and into stairwells where he worked and accosted after a shift shook matthew both emotionally and professionally matthew repeatedly asked the app to remove the fake profile the app however did nothing wesley's family and matthew share something in common they were denied the basic opportunity to determine if these websites shared any legal blame along with the users who posted the content the question of whether the platform should be held liable the companies that developed the algorithms gathered the data and profited off the users was precluded by section 230.
they might not have won but they never even had a chance to get their case tried these are just two instances of section 230 locking the courthouse doors to people with real world injuries caused by online actions since i have chaired this subcommittee we have held multiple hearings on this issue we've heard from ceos of the largest tech platforms we've heard from small platforms we've heard from experts and we've heard from those most affected by these behaviors and these oversight activities didn't start with me though republicans have been investigating this issue as well they have a number of discussion drafts and bills they have introduced many of those ideas are worth exploring the concept of not providing immunity for platforms' algorithms for example is in both the justice against malicious algorithms act that i've introduced and mrs mcmorris rodgers' discussion draft there's a bipartisan desire to reform the court's interpretation of section 230 and the american public wants to see us get things done i urge all my colleagues republican and democratic to bring their ideas forward now and let's work together on bipartisan legislation because we can't continue to wait the largest tech companies would like nothing more than for congress to fight amongst itself while nothing happens and they welcome those complaining about process claiming that congress doesn't understand or saying that this would break the internet because these platforms don't want to be held accountable the users suffering harm deserve better from us and we will act but for the pandemic we would have some of these victims with us in the room today and while they cannot be here in person the family of wesley is watching today matthew herrick is watching today and the advocates for children and marginalized groups and victims' rights are watching today to start today we will hear from experts about the harms we are seeing online and our second expert panel will focus on proposals to reform
section 230 and in a little over a week chairwoman schakowsky will continue this series in her subcommittee reviewing legislation that can bring additional transparency and accountability for the problems we consider today i want to thank all our panelists for joining us and i look forward to their testimony and with that i yield the remainder of my time to congresswoman eshoo thank you mr chairman for yielding to me by way of background i was on the conference committee for the 1996 telecom act and i continue to strongly believe in section 230's core benefit which is to protect user speech but algorithms select what content will appear personalized for each user the platform is then more than just a conduit transferring one user's speech to others platforms should not be immune from courts examining if algorithmic amplification causes harms and that's the core idea of the two bills i've co-led so thank you mr chairman for convening this highly important hearing and i yield back the gentlelady yields back the chair now recognizes my good friend mr latta the ranking member for the subcommittee on communications and technology for five minutes for his opening statement well thank you my good friend and chairman i greatly appreciate it i also want to thank our witness panel for being here today to discuss the potential legislative reforms of section 230 of the communications decency act republicans on the energy and commerce committee are leading on ways to hold big tech companies accountable for the harms caused by their platforms in january we announced our big tech accountability platform which began our efforts to take a comprehensive look at ways to reform section 230.
republicans have been focused on and remain focused on reconsidering the extent to which big tech deserves to retain their significant liability protections every step of the way we have encouraged our democratic colleagues to join us in the quest to hold big tech accountable while evaluating how we can reform section 230. we sought input from the public on their concerns with big tech from stakeholders on ways to stop censorship while protecting small businesses and innovation and from members of congress on proposals that they have supported hearing from the public stakeholders and members of congress informed the discussion drafts that every republican on this committee released in july our discussion drafts range from amending section 230 to holding big tech accountable for taking down constitutionally protected speech and facilitating illegal drug sales to increasing transparency requirements on how social media companies moderate content section 230 became law in 1996 in response to several court cases most notably stratton oakmont versus prodigy services to allow online platforms to moderate unlawful or indecent content without fear of liability it has two main components a provision that exempts platforms from being liable for content that is posted on their site by a third-party user and a second provision that exempts platforms from being liable for content that they remove in good faith the internet has grown substantially since 1996 and it is clear big tech has abused this power granted to them they censor conservative voices and use algorithms to suppress content that does not fit their narrative they hide research that shows the negative impact their platforms have on the mental health of our children they allow the sale of illegal drugs on their platforms including fentanyl which we all know is killing americans every day while these actions are happening on big tech platforms
users have no recourse when conservatives are silenced the appeals process if it exists can be difficult to navigate big tech hides behind section 230 to avoid liability for real world harms their platforms are causing including harms to our children section 230 is supposed to protect platforms for removing content in good faith but says nothing about their liability for when they are acting as bad stewards of their platforms to address this issue i've authored a carve-out of section 230 protection for platforms that purposely promote solicit or facilitate material by another information content provider if the platform knew or had reason to know that the content would violate federal criminal law when big tech acts as bad stewards on their platforms or as bad samaritans they should no longer be entitled to protections under section 230. we will also discuss legislation noticed for today's hearing which i am concerned could lead to unintended consequences like curtailing free speech and innovation section 230 reform must be taken seriously and any legislative proposal that eventually gets enacted must be thoroughly vetted we're at a pivotal time for free speech in america as it is our generation's turn to uphold the rights on which our country was founded i look forward to hearing feedback from the witnesses on the proposals in front of us today and before i yield back mr chairman i'd ask unanimous consent that dr burgess who's not a member of the subcommittee but a distinguished member of the full committee be able to waive on to the committee without objection thank you very much and with that mr chairman i yield back the balance of my time the gentleman yields back the chair now recognizes mr pallone for five minutes for his opening statement thank you chairman doyle today's hearing is the first of two in which this committee will discuss legislative reforms to hold social media companies accountable and we have two
panels today the first will focus on the insidious problems from which some social media platforms online are profiting and the second will consider how reforms to section 230 of the communications decency act can play a part in addressing those problems and then next week in a consumer protection and commerce subcommittee hearing we'll discuss how consumer protection-focused proposals can increase these companies' accountability to the public now these two legislative hearings come after years of repeated bipartisan calls for online platforms to change their ways since 2018 we've held six hearings examining tech platforms' accountability and our members have sent countless letters the most prominent online platforms have repeatedly feigned ignorance before this committee but our suspicions unfortunately have been repeatedly confirmed the latest coming from former facebook employee frances haugen we learned how the platforms downplayed research that teen girls were especially vulnerable and suffering online we've learned how executives knew their algorithms amplify harmful and divisive content and rejected proposals to fix the issue we've seen a pattern of platforms highlighting covid-19 misinformation conspiracy theories and divisiveness we've learned that during a civil rights audit one platform failed to disclose that its algorithms disproportionately harm minority groups for years now these platforms have acted above the law and outside the reach of regulators and the public and it's time for change in my opinion the legal protections provided by section 230 of the communications decency act have played a role in that lack of accountability by stopping victims from having their cases heard in one recently filed suit a video chatting platform that is commonly used to engage in online sex between users paired a young girl with a middle-aged man he convinced her to send nude photos and videos of herself including by blackmailing her this man
forced her to engage in sexual performances for himself and his friends and even to recruit others and based on court precedent section 230 may very well threaten justice for this young girl and i hope it does not because the platform was responsible for pairing the young girl with the middle-aged man now judges and a whole host of diverse interests including many of our witnesses have suggested that courts may have interpreted section 230 more broadly than congress intended and merits reform to be clear section 230 is critically important to promoting a vibrant and free internet but i agree with those who suggest the courts have allowed it to stray too far judge katzmann the late chief judge of the second circuit brought some clarity to this issue in his dissent in force versus facebook he stated that section 230 does not and should not bar relief when a plaintiff brings a claim that is based not on the content of the information shown but rather on the connections that platforms' algorithms make between individuals of course that was not the court's ruling in that case and a challenge for us is to clarify the statute if the courts don't while ensuring that we balance the statute's good against the pain it inflicts so today we'll consider four proposals that would amend or clarify section 230 to protect users while promoting open and free online dialogue these bills do not impose liability on the platforms and do not directly restrict the content that platforms make available they simply limit the section 230 protections in certain circumstances including when platforms use algorithms to amplify certain content and these targeted proposals for reform are intended to balance the benefits of vibrant free expression online while ensuring that platforms cannot hide behind section 230 when their business practices meaningfully contribute to real harm now i have to say i'm disappointed that my republican colleagues chose not to introduce the discussion drafts they
released in july so that they could be included in today's hearing in order to actually pass legislation that will begin to hold these platforms accountable we must work together and i urge my colleagues not to close the door on bipartisanship for an issue that is so critical because after all i believe there is more that unites us than divides us on clarifying section 230. for example ranking member rodgers' discussion draft includes a provision similar to my justice against malicious algorithms act in that her proposal would clarify that section 230 immunity does not apply to algorithmic recommendations while the proposals aren't identical this is a place for us to start what i hope could be bipartisan work i just want to say one more thing mr chairman you know the real problem i see is that big tech's primary focus is to make money and i know we have a market economy and that's always a company's primary purpose but they give the impression to the public that they care about content values and have a social purpose that somehow they care about consumers or the first amendment and that you know they have some value to the consumer or to the public and i hope that continues to be true but if it is then they should be held accountable to achieve these goals you can't go out and say i'm not primarily focused on making money i want to help people but then not be accountable for these bad actions so i just wanted to mention that thank you mr chairman the gentleman yields back the chair now recognizes mrs rodgers for five minutes for her opening statement thank you mr chairman good morning big tech companies have not been good stewards of their platforms i've been pretty clear with all the ceos big tech has broken my trust big tech has failed to uphold the fundamental american principle of free speech and expression big tech platforms like twitter and facebook used to provide a promising platform for free speech and robust debates but they no longer operate as
public squares they do not promote the battle of ideas they actively work against it they shut down free speech and censor any viewpoint that does not fit their liberal ideology and big tech has exploited and harmed our children in our march hearing with the ceos i asked the big tech companies why they deserve the liability protections congress provided for them more than 20 years ago unfortunately their behavior has not improved and we only have more examples of them being poor stewards of their platforms big tech has abused its power by defining what is true what we should believe what we should think and controlling what we read it's wrong destroying free speech is what happens in authoritarian countries behind the great chinese firewall here in america we believe in dialogue we believe in the battle of ideas we defend the battle of ideas and we used to fight to protect our fundamental principles rather than censor and silence speech the answer should be more speech that's the american way big tech should not be the arbiters of truth not for me my community our children or any american today we should be focused on solutions that hold big tech accountable for how they censor allow and promote illegal content and knowingly endanger our children it's wrong for anyone to use this opportunity to push for more censorship more power and more control over what they determine americans should say post think and do which is why i'm deeply troubled by the path before us it's calling for more censorship one of the bills before us today the justice against malicious algorithms act is a thinly veiled attempt to pressure companies to censor more speech the proposal will put companies on the hook for any content an algorithm amplifies or recommends that contributes to quote severe emotional injury of any person how does the bill define severe emotional injury it doesn't clearly companies will have to decide between leaving up content that may offend someone
or fight it in court or censor content that reaches a user which do you think that they will choose and there's no doubt who they will silence content that does not line up with their liberal ideology while the section 230 bills before us today push for more censorship republicans are fighting for free speech in january we rolled out our big tech accountability platform that made clear we will protect free speech and robust debates on big tech platforms and we've been working hard since then today we will discuss a number of proposals that reform section 230. my proposal which i'm leading along with my good friend congressman jim jordan narrowly amends section 230 to protect free speech small businesses and startups will not be impacted by our bill we remove the largest big tech companies from existing 230 protections and put them under their own set of rules under this proposal big tech will be held accountable for censoring constitutionally protected speech big tech will no longer be able to exploit the ambiguity and discretion we see in the current law big tech will be more responsible for content that they choose to amplify promote or suggest big tech will be forced to be transparent about their content decisions and conservatives will be empowered to challenge big tech censorship decisions amending 230 alone is not enough which is why we are taking an all-of-the-above approach which includes increasing transparency and also holding big tech accountable for how they intentionally manipulate and harm children for their own bottom line while there's agreement on the need to hold big tech accountable with section 230 reforms it's clear there are drastically different approaches and solutions i look forward to hearing from the witnesses today and i yield back the gentlelady yields back the chair would like to remind members that pursuant to committee rules all members' written opening statements shall be made part of the record so now i'd like to
introduce our witnesses for today's first panel ms frances haugen former facebook employee mr james steyer founder and ceo of common sense media ms kara frederick research fellow in technology policy at the heritage foundation and mr rashad robinson president of color of change we want to thank our witnesses for joining us today we look forward to your testimony i do understand that we will lose mr steyer for about 10 minutes at 11:30 so i would encourage members to be conscious of that i understand he will be back at 11:40 and of course members may always submit questions for the record at this time the chair will recognize each witness for five minutes to provide their opening statement before we begin i would like to explain the lighting system in front of our witnesses it is a series of lights the light will initially be green it will turn yellow when you have a minute remaining please begin to wrap up your testimony at that point the light will turn red when your time expires so let's get started ms haugen you are now recognized for five minutes subcommittee chairman doyle ranking member latta members of the committee thank you for the opportunity to appear before you today my name is frances haugen i used to work at facebook i joined the company because i believe facebook has the potential to bring out the best in us but i'm here today because i believe that facebook's products harm children stoke division in our communities threaten our democracy weaken our national security and much more facebook is a company that has paid for its immense profits with our safety and security i am honored to be here today to share what i know and i am grateful for the level of scrutiny these issues are getting i hope we can stay focused on the real harms to real people rather than talk in abstractions this is about the teenagers whose mental health is undermined by instagram and is about their parents and teachers who are struggling to deal with the consequences of that
harm it is about the doctors and nurses who have to cope with conspiracies about covid-19 and vaccines it is about people who have suffered harassment online it is about families at home and around the world who live in places where hate fear and conflict have been ratcheted up to a fever pitch as a result of online radicalization facebook may not be the cause of all these problems but the company has unquestionably made them worse facebook knows what is happening on the platform and they have systematically under-invested in fighting those harms they know they do far too little about it in fact they have incentives for it to be this way and that is what has to change facebook will not change until the incentives change the company's leadership knows how to make facebook and instagram safer but they repeatedly chose to ignore these options and continue to put their profits before people they can change the name of the company but unless they change the products they will continue to damage the health and safety of our communities and threaten the entire integrity of our democracies there have been many others sounding the same alarm this committee has heard from many experts in recent years they have done the painstaking work of documenting these harms and have been repeatedly gaslit by facebook about what they found my disclosures back up their findings we have long known that facebook's business model is problematic now we have the evidence to prove it the documents i have shared with congress speak for themselves what i have to say about these documents is grounded in far more than my experience at facebook i have worked as a product manager at large tech companies since 2006 including google pinterest yelp and facebook my job has largely focused on algorithmic products like google plus search and recommendation systems like the one that powers the facebook news feed i know my way around these products and i've watched them evolve over the many years
working at four major tech companies that operate different types of social networks has given me the perspective to compare and contrast how each company approaches and deals with different challenges the choices being made by facebook's leadership are a huge problem for our children for our communities and for our democracy that's why i came forward and let's be clear it doesn't have to be this way they can make different choices we are here today because of deliberate choices facebook has made during my time at the company first working as the lead product manager for civic misinformation and later on counter-espionage i saw that facebook repeatedly encountered conflicts between its own profits and our safety management consistently resolved those conflicts in favor of its own profits i want to be extremely clear this is not about good ideas or bad ideas or good people and bad people facebook has hidden from you the countless ways to make the platform itself safer ways that don't require anyone to pick and choose what ideas are good facebook hid these options from you because the status quo made them more money we're having conflicts over things that we could solve in other ways that don't compromise speech facebook wants you to have analysis paralysis to get stuck in false choices and not act here facebook does not have safety by design and it chooses every day to run the system hot because it maximizes their profit the result is a system that amplifies division extremism and polarization facebook is running the show whether we know it or not facebook's choices have led to disasters in too many cases facebook's amplification promotes violence that harms and even kills people in other cases facebook's profit optimizing machine is generating self-harm and self-hate especially for vulnerable groups like teenage girls the socially isolated and the recently widowed no one is held accountable these problems have been confirmed
repeatedly by facebook's own internal research secrets that do not see the light of day this is not simply a matter of some social media users being angry or unstable facebook has become a one trillion dollar company by paying for its profits with our safety including the safety of our children and that is unacceptable this committee's attention and this congress's action are critical the public deserves further investigation and action to protect consumers on several fronts first given that platforms like facebook have become the new cyber security attack surface on the united states our national security demands more oversight second we should be concerned about how facebook's products are used to influence vulnerable populations third we must correct the broken incentive system that perpetuates consistent misalignment between facebook's decisions ms haugen you need to wrap up your statement okay i'll skip forward as you consider reforms to section 230 i encourage you to move forward with your eyes open to the consequences of reform congress has instituted carve-outs of section 230 in recent years i encourage you to talk to human rights advocates who can help provide context on how the last reform of 230 had dramatic impacts on the safety of some of the most vulnerable people in our society but has been rarely used for its original purpose last thing you should consult with the international human rights community who have seen firsthand how authoritarian governments around the world can weaponize such reductions and silence dissent there is a lot at stake here you have a once in a generation opportunity to create new rules for our online world i came forward at great personal risk because i believe we still have time to act but we must act now thank you thank you we're going to try to adhere to the five-minute rule this is a very important topic and so i wanted to
give the speakers some leeway and we'll have time to ask questions but thank you very much mr steyer you are recognized for five minutes do we have mr steyer remotely there you go thank you very much chairman pallone chairman doyle ranking member rodgers and ranking member latta and all the distinguished subcommittee members this is really a privilege and an honor to testify in front of you today i am james p steyer i'm the founder and ceo of common sense media the nation's leading children's media non-partisan advocacy organization as many of you know we have well over 100 million unique users over 110,000 member schools definitely in all of your districts and we are a non-partisan powerful voice for kids and families here in this country and the fact that you're having this hearing is actually remarkable and important the other thing i would say is i'm the father of four kids so i've lived through over the past 20 years the evolution of this extraordinary tech society that we've all lived through and over the last nearly two years the pandemic where my kids have been going to school online and distance learning so as a parent i see these issues and i'd also mention because i know the first amendment will come up that i've been a professor at stanford for over 30 years teaching first amendment law so i'd be happy to speak to some of those issues as well as they intersect with some of the 230 issues 10 years ago i wrote a book called talking back to facebook the heads of the company at that point that ms haugen just spoke about literally threatened to block the publication of the book part of the reason was there was a chapter in there about girls' and boys' body image and the impact of social media platforms on body image and obviously 10 years ago the heads of that company who i have met with repeatedly knew that there were issues and so when frances haugen came forward recently to talk about additional
research that they knew about it merely shows you that not just facebook but all of the major tech companies are aware of the impact of their platforms on our society the key is we are now at a watershed moment and you have mentioned this in your opening statements but it is true we have literally been over a decade without major reforms for these companies and we've assumed that in some cases they would self-police or self-regulate well that's not true and the record is clear so the bipartisan leadership of this committee could not be more important and could not come at a more important time and i would argue that in the next three to six months the most important legislation including some of the legislation that this subcommittee is considering today will move forward and will finally put the guardrails on that america's children and families deserve we all know that kids and teens are uniquely vulnerable online because their brains are still developing they are prone to oversharing they're not equipped to think through all the consequences of what they do and they're spending more time online than ever before so even though kids get a tremendous amount of benefits from the internet and from social media platforms it's absolutely clear that we have to regulate them thoughtfully and carefully and the moment is now and congress has a responsibility to kids and families in this country to act my written testimony will give you more examples but just a handful of details that i think we should all remember when we think about the impact of social media platforms on kids and families and therefore the relevance of section 230 and other laws first platforms drag kids down rabbit holes they have led to issues like eating disorders body dysmorphia suicide ideation and more we could tell you stories as some of you said in your opening statements of individual kids who've committed suicide or gone through extraordinary
challenges as a result of these platforms and their harmful content they literally feed off kids and teens desire to be accepted through their likes and their follows and they enable sometimes harmful content to spread virally so the bottom line is you have this bipartisan consensus with well over 100 million members common sense is out there in the field every day talking to families this is not a republican issue this is not a democratic issue this is an american family issue and you have the opportunity to do something very very important now and this is the time to act look ms haugen talked about the ways in which facebook has acted with impunity for decades reforming section 230 is clearly one big piece of the puzzle but i would add that there must be a more comprehensive approach you cannot just deal with section 230 we also have to deal with privacy issues and other related issues they're all one big comprehensive package so the hearing next week will also be critically important and passing a revised coppa and the kids act among other things will matter the bottom line is our kids and our families well-being is at stake you have the power to improve that and change that the moment is here bless you for taking this on and let's move forward together on a bipartisan basis thank you very much thank you mr steyer the chair now recognizes miss frederick for five minutes chairs doyle and pallone ranking members latta and mcmorris rodgers distinguished members thank you for the opportunity to testify today i too used to work at facebook i joined the company after three tours in afghanistan helping special operations forces target al qaeda because i believed in facebook's mission as well the democratization of information but i was wrong it's 2021 and the verdict is in big tech is an enemy of the people it is time all independently minded citizens recognize this so what makes this moment different traditional gatekeepers of information corporate media the
academy various organs of the culture are captured by the left as the past year has borne out big tech companies like google facebook twitter and amazon are not afraid to exercise their power in the service of this ideology big tech companies tell us not to believe our lying eyes that viewpoint censorship is all in our heads tell that to the gold star mom who criticized biden's afghanistan withdrawal and was deleted by facebook after the death of her son a u.s marine tell that to allie beth stuckey who had the temerity to say that biological men should not compete in women's sports before being suspended by twitter tell that to clarence thomas whose documentary on amazon was deleted without explanation beyond these examples which are legion the confluence of evidence is irrefutable twitter and facebook censor republican members of congress at a rate of 53 to 1 compared to democrats twitter suspends conservatives 21 times more often than liberals facebook created two internal tools in the aftermath of trump's 2016 victory that suppressed right-wing content media traffic and reach on its platform google stifled conservative leaning outlets like the daily caller breitbart and the federalist during the 2020 election season with breitbart search visibility shrinking by 99 percent compared to the 2016 election cycle apple dumped the conservative-friendly parler app from its app store google and amazon web services did so as well and these practices have distinct political effects the media research center found in 2020 that one in six biden voters claimed they would have modified their vote had they been aware of information that was actively suppressed by tech companies 52 percent of americans believe social media suppression of the hunter biden laptop story constituted election interference these practices erode our culture of free speech chill open discourse and engender self-censorship all while the taliban the chinese communist party and iranian
officials spew their bile and genocidal rhetoric on american-owned platforms big tech is also working hand in glove with the government to do its bidding jen psaki admitted from the white house podium that the government is communicating with facebook to single out accounts and posts for censorship and that's just what she admitted out loud the outlook is grim a lack of accountability and the sweeping immunity conferred on big tech by broad interpretations of section 230 has emboldened these companies to abuse their concentrations of power constrict the digital lives of those who express specific political views and sharpen digital surveillance on ordinary americans just look at apple's now paused plans to scan the content directly on your personal device starting with icloud photos put simply big tech companies are not afraid of the american people and they're not afraid of meaningful checks on their abuse of power and it shows yet we should be wary of calls to further suppress content based on politically expedient definitions of misinformation clearly this definition is in the eye of the beholder the wuhan lab leak theory comes to mind so let the whistleblower docs speak for themselves holding big tech accountable should result in less censorship not more in fact the first amendment should be the standard from which all section 230 reforms flow despite what the new twitter ceo might think american lawmakers have a duty to protect and defend the rights given to us by god and enshrined in our constitution by the founders rights that specific tech companies in conjunction with the government are actively and deliberately eroding the argument that private companies do not bear free speech responsibilities ignores overt collaboration between the government and big tech companies working together to stifle free expression most importantly section 230 reform is not a silver bullet we have to look outside of dc for answers states civil societies and tech founders all have a role
to play here we cannot let tech totalitarians shape a digital world where one set of thinkers are second-class citizens the window of opportunity to do something is closing thank you miss frederick the chair now recognizes mr robinson for five minutes chair pallone chair doyle ranking member mcmorris rodgers ranking member latta thank you for having me here today i am rashad robinson president of color of change the nation's largest online racial justice organization i also co-chaired the aspen institute's commission on information disorder which just released our comprehensive set of recommendations for effectively tackling misinformation and disinformation i want to thank this committee and its leaders for your work introducing the justice against malicious algorithms act the safe tech act the civil rights modernization act and the protecting americans from dangerous algorithms act each one is essential for reducing the tech industry's harmful effects on our lives congress is rightly called to major action when an industry's business model is at odds with the public interest when it generates its greatest profits only by causing the greatest harms big tech corporations like facebook amazon and google maintain near total control over all three areas of online life online commerce online content and online social connection to keep control they lie about the effects of their products just like big tobacco lied about the deaths their products cause they lie to the public they lie to regulators and they lie to you mark zuckerberg lied to me personally more than once it's time to make the truth louder than their lies but skip the part where we wait 40 years to do it the most important first step is something we have more control over than we think and that is drawing a bright clear line between fake solutions and real solutions big tech would love for congress to pass laws that mimic their own corporate policies fake solutions that are ineffective designed to protect
nothing more than their profits and their power and we can't let that happen we know it's a fake solution if we're letting them blame the victims by shifting the burden of solving these problems to consumers because consumer literacy or use of technology is not the problem the problem is corporations design of technology and that is what we need to regulate we know it's a fake solution if we're pretending that colorblind policies will solve problems that have everything to do with race because algorithms advertisers moderators and bad actors are targeting black people and we don't get closer to the solution by backing away from that problem we know it's a fake solution if we're putting trust in anything big tech corporations say because it's a lie that self-regulation is anything other than complete non-regulation and it's a lie that this is about free speech when the real issue is regulating deceptive and manipulative content consumer exploitation calls to violence and discriminatory products section 230 is not here to nullify 60 years of civil rights and consumer safety law no matter what any billionaire from silicon valley comes here to tell you there are three ways to know we are heading towards real solutions laws and regulations must be crystal clear that big tech corporations are responsible and liable for the damages and violations of people's rights that they not only enable but outright encourage that requires well-vetted and targeted amendments to section 230.
you're responsible for what you sell big tech corporations sell content that is their main product congress must allow judges juries regulators and government enforcers to do their jobs to determine what's hurting people and stop it and hold the responsible parties liable responsibility without accountability isn't responsibility at all congress must enable proper enforcement i want to applaud this committee for ensuring that the build back better legislation includes funding for the ftc the next step is making sure the ftc hires staff with true civil rights expertise laws and regulations must be crystal clear big tech products must be subject to regulatory scrutiny and approval before they are released onto the public and hurt people just like a drug formula should be approved by the fda tech products need to pass inspection an independent auditing process that exposes what they would like to hide but regulators can't fall for shifting the burden and blame to consumers the lie that we simply need to put more control in the hands of users is like stocking our supermarket shelves with poison and expired food and then saying we are simply giving consumers more choice finally congress must take antitrust action seriously with big tech ending their massive concentration of power is a necessary condition to ending the major damage they cause the right approach is not complicated if we make the internet safe for those who are being hurt the most it automatically makes the system safe for everyone and that is why i'm here because big tech puts black people and people of color in danger more than anyone else passing and enforcing laws that guarantee freedom and safety for black people in online commerce content and social connection will create the safest internet for the largest number of people you can make technology the vehicle for progress that it should be and no longer the threat to freedom fairness and safety it has become do not allow the technology that is supposed
to take us into the future to drag us into the past thank you thank you mr robinson we've concluded our openings we now move to member questions each member will have five minutes to ask questions of our witnesses i will start by recognizing myself for five minutes ms haugen last week the washington post reported that facebook knew the structure of its algorithms was allowing hateful content targeting predominantly black muslim lgbtq and jewish communities facebook knew it could take steps with its algorithm to lessen the reach of such harmful content while still leaving the content up on their website but they declined to do so this appears to be a clear case where facebook knew its own actions would cause hateful harmful content to spread and took those actions anyway i would also note that when mr zuckerberg testified before us earlier this year he bragged about the steps his company took to reduce the spread of hateful content shamefully he left this known information out of his testimony ms haugen setting law aside do you think facebook has a moral duty to reduce this type of content on its platform and do you believe they've lived up to that moral duty i believe facebook has a moral duty to be transparent about the operation of its algorithms and the performance of those systems currently they operate in the dark because they know that with no transparency there is no accountability i also believe that once someone knows a harm exists and they know that they are causing that harm they do have a duty to address it facebook has known since 2018 that the changes they made to their algorithm in order to get people to produce more content i.e the change from optimizing for time spent to meaningful social interactions increased the amount of extreme and polarizing content on the platform i can't speak to that specific example because i don't know the exact circumstances of it but facebook knew that they were giving the most reach to the most
offensive content and i want to give you a very specific example on those let's imagine you encountered a piece of content that was actively defaming a group that you belong to it could be christians it could be muslims it could be anyone if that post causes controversy in the comments it'll get blasted out to those people's friends even if they didn't follow that group and so the most offensive content the most extreme content gets the most distribution yup turning to instagram which is owned by facebook can you tell the committee in plain words how teen girls are being harmed by the content they see on that platform and how decisions of instagram led to this harm facebook's internal research states that not only is instagram dangerous for teenagers it's actually substantially more dangerous than other social media platforms because tiktok is about performance and doing things with your friends snapchat is largely about augmented reality and faces but instagram is about bodies and social comparison teenagers are very vulnerable to social comparison they're going through a phase of their lives where there's a lot of things changing and what facebook's own research says is that when kids fall down these rabbit holes when the algorithm finds that you start from something like healthy eating and it pushes you towards anorexia content you have the perfect storm where kids are put in vulnerable environments and then given the most extreme content yeah mr robinson it's disappointing if not surprising to hear the lack of action on the part of facebook after your negotiations with mr zuckerberg and i share your concern what you discussed in your testimony that highlights how not just the advertisers but the platforms themselves can perpetuate discrimination can you discuss how you think targeted amendments to section 230 can address some of the actions of the big platforms well right now we are all in this situation where we have to go to facebook and ask
for their benevolence in dealing with the harms on their platforms going to billionaires where every single day their incentive structure is growth and profit over safety integrity and security and so we've done this before with other industries congress has done this before in this country with other industries where we create rules that actually hold them accountable and right now whether it is their product design on what they recommend and what they lead you to or it is in the paid advertisement and content facebook is completely not accountable and the other thing that i think is incredibly important is that they believe that they do not have to adhere to civil rights law they've said that before congress they've said that to us and the idea that we are going to allow silicon valley companies and their lawyers to come here and say that there are some laws that they are accountable to and some laws they are not is outrageous and i think that those targeted amendments to section 230 both allow for free speech to exist which as any civil rights leader in this country will tell you we value and believe in free speech while also having accountability for things that are absolutely not about free speech thank you i see my time has expired i will now yield to mr latta the ranking member for five minutes well thank you mr chairman and ms haugen i can start my questions with you the documents you brought forward from your time at facebook show that facebook has intentionally misled the public about the research they have conducted about the impacts of their platforms including on the mental health of children we've heard from the big tech companies including facebook talk to us about how amending section 230 will cause them to leave content up or take content down depending on who they are speaking to you've spoken in your testimony about how facebook puts its profits over people if that is the case how do you think facebook would
adapt to section 230 reform where they would be held liable for certain content on its platform facebook has tried to reduce this discussion to the idea of are we taking down enough content are we leaving up too much content that kind of thing when in reality they have lots and lots of ways to make the platform safer product choices designed in the algorithm where it's not about picking good or bad ideas it's about making sure that the most extreme polarizing ideas don't get the most reach i don't know exactly how facebook would adapt to 230 reform but i believe that in a world where they are making a series of intentional choices to prioritize growth and running the system hot over having safer options i would hope that pattern of behavior would be held accountable well thank you miss frederick you are a former facebook employee and have done significant research on how these platforms censor content including political speech with which they disagree the platforms claim they do not censor based on political viewpoint what's your response to that my response is believe your lying eyes tech companies are not neutral gatekeepers of information you can see the sourcing in my testimony the litany of examples and new research that i went over in my opening testimony testifies to exactly what they're doing and how skewed it is against viewpoints talk to senator rand paul talk to reverend call truman talk to governor ron desantis talk to stephen crowder talk to dr scott atlas talk to the gold star mom talk to jim banks talk to jenna ellis talk to allie beth stuckey talk to mike gonzalez talk to ryan t anderson all of these american citizens have been victimized by these tech companies and by viewpoint censorship so when tech companies say look away this is not actually happening i say believe your lying eyes thank you let me continue miss frederick as part of the big tech accountability platform i've
offered draft legislation which amends section 230 to narrow liability protection for platforms that promote or facilitate content that the platform knew or had reason to believe violated federal criminal law in short if a platform is acting as a bad samaritan they would not receive section 230 liability protection in those instances what do you think about the impacts that this legislation would have if it were enacted into law my thoughts are that you strip immunity when it's being abused so if the abuses of this immunity continue then you get rid of it you get rid of the freedom from civil liability when it is being abused by these tech companies it's as simple as that you know let me go back to your testimony because when you were talking i believe it was 53 to 1 when it was conservative to liberal viewpoints if this is presented to the big tech companies out there what's the response that you hear from them on that so i think people try to cover their rear ends in a lot of ways but i think americans are waking up can i ask you real quick how do they cover themselves are they covering themselves by saying that we don't do this by employing an army of lobbyists in dc that say we don't do this that it's all in your head by denying reality and what people who use these platforms actually see happening with the suppression of political viewpoints there's a high level of tech company apologists who come through these doors sit at these daises and say this is not happening don't believe it but we have the concrete information to say that yes this is actually happening you have the media research center which is acting as a lion in this regard to actually get the numbers and make sure that these viewpoint censorship instances are quantified a lot of people especially independent research organizations partisan research organizations don't want to see that actually happen and
that information get out there so they smear the source but now i think there's stuff leaking through the cracks and this is going to eventually get bigger and bigger and become a more prodigious movement and we need to ensure and support the sources that actually do that well thank you very much mr chairman my time has expired and i yield back the gentleman yields back the chair now recognizes mr pallone the full committee chairman for five minutes to ask questions thank you chairman doyle in our march hearing i asked mark zuckerberg about whether he was aware of the company's internal research showing that his company's algorithms were recommending that users join fringe extremist groups in europe and here in large numbers and reporting from the wall street journal indicated that mr zuckerberg failed to fully implement corrective measures his employees pushed for internally because it could have undermined advertising revenue back to profit again so ms haugen this seems like a pattern of behavior so in your view what are the most compelling examples of the company ignoring threats to users in the name of profits facebook has known since 2018 that the choices that they made around the design of the newsfeed algorithm were while increasing the amount of content consumed and increasing the length of sessions providing hyper amplification for the worst ideas i'll give an example groups most people think facebook is about your family and friends facebook has pushed people more and more aggressively towards large groups because it lengthens your session if we had a facebook like what we had in 2008 you know it's about your family and friends for free you would get less hate speech less nudity less violence but facebook would make less money because your family and friends don't produce enough content for you to look at 2 000 pieces of content a day facebook has implemented policies like
if you are invited to a group even if you don't accept it you will begin to receive content from that group for 30 days and if you engage with any of it it will be considered a follow in a world where the algorithms pick the most extreme content from these mega groups and distribute it that kind of behavior directly is facebook promoting their profits over our safety mr robinson the internet and social media platforms have made it easier for civil rights groups and racial justice groups like color of change to organize around vitally important issues however you clearly demonstrated in your testimony how the current practices of these platforms have harmed black and marginalized communities so my question is as we work to refine the proposals before us can you describe how my bill the justice against malicious algorithms act will help protect black and marginalized voices online great well first of all you have a bill so thank you because i think that is incredibly important in moving towards action your bill removes liability protection for content provided through personalized algorithms or algorithms that are specifically tailored to specific individuals and that essentially has been one of the problems it's doing something because you know we can't wait we've seen facebook allow advertisers to exclude black people from housing exclude women from jobs creating these sort of personalized algorithms that give people experiences that actually take us outside of hard-won and hard-fought victories we've had around laws dragging us from the 21st century back to the 1950s and your bill as well as other pieces of legislation that are before this committee hold these institutions accountable to not be immune to a whole set of laws and standards that every single other business in this country has to adhere to thank you let me ask you another question some defenders of section 230 say that changes to the law will result in a deluge
of frivolous lawsuits against platforms big and small so i wanted to ask you would reforming section 230 in your opinion even if that results in increased lawsuits hurt or harm marginalized communities and smaller or non-profit websites that do good work giving everyday people access and opportunity to hold big institutions accountable is part of this country's fabric being able to give people the opportunity to raise their voices and push back and right now what we have is big companies huge multinational companies facebook has nearly three billion users that is more followers than christianity and for us to say that we shouldn't be able to hold them accountable that we shouldn't be able to push back against them is an outrageous statement and so yes there will be more lawsuits there will be more accountability but that means that there will hopefully be changes to the structures and the way that they do business just like the toys you will be giving to the children in your family this holiday season have to be accountable before they get to the shelves because of lawsuits because of accountability we need these companies to be accountable and so there will be a trade-off but as someone who's gone back and forth in the room with mark zuckerberg with jack with sheryl sandberg and all these people i have tried for years to get them to not only move new policies to be more accountable but then to actually implement them and enforce them we cannot allow them to continue to self-regulate themselves thank you thank you mr chairman the gentleman yields back the chair now recognizes mrs mcmorris rodgers the full committee ranking member for five minutes to ask questions thank you mr chairman ms haugen i wanted to start with a yes or no question do you support big tech's censorship of constitutionally protected speech on their platforms do i support what do you define as censorship censorship them controlling what is constitutionally protected speech under
the first amendment i am a strong proponent of re-architecting these systems so that they are more focused on our family and friends this is not about good ideas or bad ideas it is about making the system safer question is yes or no do you support them censoring constitutionally protected speech under the first amendment i believe that we should have things like fact checks included along with content i think the current system i guess i take it as a no and i think there are better solutions than censorship that we should be using okay miss frederick obviously many americans have lost trust with big tech and it's because they are arbitrarily censoring speech that they don't agree with and it seems like the censorship is in one direction it's against the conservative content so as we think about solutions as to how we're going to hold big tech accountable we absolutely have to be thoughtful about bringing transparency and accountability i wanted to ask you to talk about the difference between misinformation and disinformation are we talking about these differences in a sane world because in a sane world disinformation would be the intentional propagation of misleading or false information and misinformation would just be false information that sort of spreads on these platforms but now we know that both of these terms are being conflated into a catch-all for information that the left doesn't like so a perfect example of this is the wuhan institute of virology when in the early days of the pandemic tom cotton floated this theory and people thought he's a deranged conspiracy theorist we have to suppress this information big tech actively suppressed mentions of the wuhan lab leak theory now it's part of acceptable discourse the new yorker gets to talk about it the wall street journal talks about it okay we can talk about it again when tom cotton was very much on to something in the beginning and then you look at the same thing the hunter
biden laptop story this is from the new york post incriminating hunter biden in his relationship with ukraine et cetera et cetera and joe biden as well and facebook and twitter we have proof of this actively suppressed links to that information they didn't allow people to actually click on the story so you have high level intelligence community officials i'm talking the highest level of the us intelligence community saying that the hunter biden laptop story bore all the hallmarks of russian disinformation and tech companies were completely in tandem with those decisions now hunter biden goes on tv doesn't deny that the laptop is his politico even confirmed the story yeah this information would you speak to concerns around the government regulating misinformation this is huge in july jen psaki and the surgeon general got up on the podium they spoke from the white house with the imprimatur of the state and they said we are directly communicating with facebook and we have pointed out specific posts specific accounts that we want them to take off the platform within a month all of those accounts and those users those posts 12 of them in fact were gone cnn gloated about it later so when the government works with these big tech companies to stifle speech you have a problem and you have a first amendment problem in that regard the difference between tech companies and the government policing speech is elided when that happens so i've been working on some legislation with jim jordan and what it proposes is that it would remove those section 230 protections for big tech when they are taking down constitutionally protected speech it also sunsets the new provisions in five years the goal here is for them to have to earn the liability protections so do you believe that this would be an effective way to hold them accountable and prevent the censorship i think tech always outpaces attempts to govern it the sunset clause is
a great idea we advocated for it at the heritage foundation so definitely a good idea allow time for us to redress some of the imbalance between these big tech companies and the users the american people by letting us legislate on it and we shouldn't be afraid to legislate thank you thank you i'm quickly running out of time ms haugen i do have significant concerns about the impact on our youth on the young generation on children and would you speak briefly about facebook and their internal models impact on the mental health of children and how it relates to their business model yeah facebook knows the future of growth on the platform is children that's why they're pushing into things like instagram kids even though they know that the rates of problematic use are highest in their youngest users it's because the younger you are the less your brain is formed facebook also knows that kids are suffering alone right now because their parents didn't live through this experience of addictive software when they were young and kids end up getting advice like why don't you just not use it not understanding how addictive these platforms are i think the fact that facebook knows that the kids are suffering alone and that their products are actively contributing to this is a problem and the fact that they lied to congress repeatedly about these harms is unacceptable so i hope that you guys act because our children deserve something better thank you thank you mr chairman the gentlelady's time has expired the chair now recognizes mr mcnerney for five minutes well i thank the chair i thank the witnesses for their testimony ms haugen i had to leave a company for bad policies and it was painful so i appreciate what you've gone through you've discussed a 2018 change and this has been discussed already in this committee a change the company made to its algorithms to favor meaningful social interactions also known as msi this change
was made to increase engagement based on my understanding and it continues to favor content that is more likely to be shared by others the problem is that facebook research found that msi rewarded provocative and negative content of low quality and promoted the spread of divisive content facebook executives rejected changes suggested by employees that would have countered this so how difficult is it for facebook to change its algorithms to lessen that impact facebook knows that there are individual factors within meaningful social interactions and i want to be clear hate speech and bullying is considered a meaningful social interaction in most languages in the world and facebook knows that there are individual terms within that algorithm that if you remove them you instantly get substantially less misinformation substantially less nudity and facebook has intentionally chosen not to remove those factors because it would decrease their profits so yes they could make a change tomorrow that would give us 25 percent less misinformation so that was going to be my next question why wouldn't they do that but it's obviously because they want to actually i want to slightly tweak that they claim they did it because they wanted people to engage more that they wanted it to be more meaningful but when they checked six months later people said their newsfeeds were less meaningful and i want it on the record they didn't do this because they wanted us to engage they did it because it made us produce more content the only thing they found that could get us to produce more things was giving us more little hits of dopamine in the form of likes comments and reshares so are there other problematic design choices the company is making today that would increase profits and increase proliferation of harmful content facebook knows that people who are suffering from extreme loneliness or isolation are often the ones that form very intense habits
involving usage we're talking about thousands of pieces of content per day you could imagine simple things that said are you going down a rabbit hole are you spending 10 hours a day on the system often when people get depressed or experience other things they self-soothe we see this with children all the time facebook could acknowledge this pattern and put a little bit of friction in to decrease these kinds of things and it would help the most vulnerable users on the platform but it would decrease their profits thank you mr robinson your stark testimony details the harm the lack of tech platform accountability has had on marginalized communities such as some of my communities in your testimony you state that facebook is not just a tool of discrimination by businesses but that facebook's own algorithms are drivers of this discrimination can you talk more about how the algorithms created and implemented by the platforms including facebook lead to discrimination absolutely well you know facebook's algorithms especially the personalized algorithms allow for a whole set of ways that people are excluded from opportunities or over-included in opportunities and over-recommended into sharing leading people down deep rabbit holes or cutting people off from housing opportunities job opportunities and everything else and they've said you know i remember a conversation where we were trying to deal with housing and job employment discrimination on their platform there was a lawsuit against facebook that they eventually settled but never took all the way to the courts because they essentially want to be able to keep 230 protections in place and in the back and forth with sheryl sandberg and mark zuckerberg about both of those cases they said to us we deeply care about civil rights you know we care about these issues it pains us deeply that our platform is causing these harms and we're going
to work to fix it and so they settled the case and then research comes out just a couple months later that the same thing is continuing to happen after they've told us and they've told you that it's no longer happening on their platform i sat across from mark zuckerberg and specifically talked to him about voter suppression on their platform only to work with them to get policies in place then to watch them not enforce those policies we've sat in the room with them on multiple occasions and i have to just say time and time again that there is no other place that these changes are going to happen if it does not happen here thank you very good answers and i yield back the gentleman yields back the chair now recognizes mr guthrie for five minutes thank you mr chair i appreciate it very much and we're all concerned about misinformation and don't want misinformation spread on the internet the question is how do you define misinformation who gets to define it and ms frederick i'll ask the question of you but first i want to set up the question you kind of set it up earlier with the wuhan lab i'm the ranking member of the health care subcommittee and we've been really looking into the wuhan lab and i just want to set up this isn't a hypothetical scenario this is a real scenario of information getting blocked by facebook and it goes to some of the comments and i've got documentation here if you go back to the april 17th white house press briefing somebody asked the president and i want to ask dr fauci could you address the suggestions or concerns that the virus is somehow man-made possibly came out of a laboratory in china the president said want to go and so dr fauci said there was a study recently that we can make available to you where a group of highly qualified evolutionary biologists looked at the sequences there and the sequences in bats as they evolve and the mutations that it took to get to the point where
it is now totally consistent with the jump of a species from an animal to a human so disregarding the lab peter daszak sent an email the next day to dr fauci as the pi of the r01 grant publicly targeted by fox news now this grant was where ecohealth alliance was being paid by taxpayer dollars to go to caves in china and harvest viruses from bats bats that may never see a human being and then take them into a city of 11 million people wuhan but he said to dr fauci as the pi of the r01 grant publicly targeted by fox reporters at the presidential press briefing last night i just wanted to say a personal thank you on behalf of our staff and collaborators this is from public information you could foia on behalf of our staff for publicly standing up and stating that the scientific evidence supports a natural origin for covid-19 from a bat to human spillover not a lab release from the wuhan institute of virology from my perspective your comments are brave and coming from your trusted voice you helped dispel the myths being spun around the virus origins and the return email from dr fauci peter many thanks for your kind note best regards tony so i say that because who's going to determine what's misinformation or not here's the national institutes of health that we have funded tremendously over the last few years we all had a lot of faith and trust in them dismissing that it came from the wuhan lab when there was no evidence to dismiss it absolutely the evidence doesn't exist today it didn't exist at the time to say that it couldn't have come from the lab i had a conversation last spring with dr collins and brought this up and was really concerned i had a lot of faith in what these guys did and as a matter of fact i quoted dr fauci on this when people say this came from the wuhan lab all we have is virologists saying that it didn't because we've had them before our committee and had no reason to not believe what they said and when i talked with dr
collins and if somebody wants to ask him to see if this is an accurate description of the phone call i'll certainly welcome somebody to do that but essentially i said i'm disappointed in where it's coming from because i've looked at a lot of evidence and it really appears this could have come possibly more likely than not through the lab and he goes well it did originate in nature so it's not man-made it originated in nature now whether it went from a bat to a human or from a bat to a mammal to a human or bat to the lab to the human because it got leaked through the lab we can't rule that out so we're talking about the people we accuse of myths talking conspiracies whatever and the whole time they never could rule it out the reason it's relevant to this hearing is because facebook took down any comments that it came from the wuhan lab man-made in the wuhan lab and on may 26th i need to look at the dates i talked to dr collins it's pretty close facebook posted in light of the ongoing investigation into the origin of covid-19 and in consultation with public health experts we will no longer remove the claim that covid-19 is man-made or manufactured so my point is who gets to decide we've got the top scientists at nih people that a lot of us had faith in and quoted and now i regret that i quoted them to constituents who brought these things to my attention and now we know that what they were saying if you parse the words they might be saying the truth but it wasn't accurate in terms of whether the wuhan lab could somehow have been involved and moving forward now the preponderance of evidence is that it was so ms frederick the question i have is how did the social media platforms fail in this and then who do we look to for expertise well i've used all my time so you won't be able to answer but how are we going to define what misinformation is and who gets to define
that those are the questions we're going to have to address as we move forward and i yield back thank you was there a question there the gentleman yields back the chair now recognizes ms clarke for five minutes good morning and let me start by thanking chairman doyle and chairman pallone for calling this very important hearing i'd also like to thank all of our witnesses for joining us today to discuss accountability in tech examining the harm done by the current rules governing the internet and exploring targeted reforms to ensure that the rules and regulations which initially created the conditions necessary for internet use to flourish and to grow are not outdated in the face of the technological advances made this century under the leadership of chairmen pallone and doyle this committee has worked for years to better understand and limit the spread of harmful content on social media platforms and now is the time for action as many social media platforms have moved away from chronological ranking to a more targeted user experience the use of algorithmic amplification has become increasingly widespread while remaining opaque to users and policymakers alike this use of algorithmic amplification has far too often resulted in discriminatory outcomes and the promotion of harmful content the lack of transparency into how algorithms are used coupled with big tech's increasing dominance in the world of online advertising and commerce has seemingly incentivized business models that rely on discriminatory practices and the promotion of harmful content my first question is for mr robinson you touched on this a bit in your written testimony but could you expound on how this combination of a lack of transparency in an industry dominated by a few major players has been detrimental to communities of color and how my legislation the civil rights modernization act would help absolutely well your piece of legislation congresswoman takes away liability shield
claims when it comes to targeted advertising and that's incredibly important because as i've already stated what we end up having is these companies creating all sorts of loopholes and back doors to get around civil rights law and in essence creating an incredibly hostile environment when it comes to the amplification of hate you know big tech is profiting off of yelling fire in a crowded theater and so i understand that we have these conversations about the first amendment but there are limitations to what you can and cannot say and right now the incentive structures and the business models of big tech are in the recommendations what they amplify and what they choose to amplify in a conversation with mark zuckerberg about dealing with the deep impact of census disinformation on his platform we were trying to have a conversation about a set of policies they could have put in place to deal with census disinformation mark decided to bring up a young woman a young dreamer that he mentors in east palo alto and told the story of her being a daca recipient and he was afraid that if he limited in some way census disinformation that it would limit her from being able to express concern given the challenges that happened around daca my response was well what other decisions does this young woman get to make at facebook and is she putting millions of dollars behind her posts because if she's not putting millions of dollars behind her posts then in fact maybe her friends won't even see them on the platform but this is essentially what we're dealing with and this is why we're before congress because at the end of the day self-regulated companies are unregulated companies and facebook and their billionaires will continue to put their hands on the scale of injustice as long as it makes them more money and only congress can stop them from doing it and congress has done this before in the past when it comes to other companies which have harmed and hurt us and
that's why we're here thank you mr steyer your testimony focused on many of the negative impacts of algorithmic amplification and prolonged screen time on children and young people this is something i'm very concerned about particularly because we cannot yet fully understand the long-term impact as the children of today grow into leaders of tomorrow can you please explain for the committee how companies use section 230 protections to continue these dangerous practices sure and i think that ms haugen has actually also referenced that congresswoman clarke and the truth is this and we're using facebook as an example but don't forget there are other social media platforms that act similarly because they focus completely on engagement and attention this is really an arms race for attention what happens is kids basically become addicted to the screen because of the tech design techniques actually congresswoman schakowsky will have a hearing next week about it but the bottom line is the business model encourages engagement and constant attention and that is very damaging to children because it means they spend more and more time in front of a screen and that is not a healthy thing so that is the fundamental business model that leads to the focus on attention and engagement that is damaging to children thank you very much for the question thank you mr chairman i yield back the gentlelady yields back the chair now recognizes mr kinzinger for five minutes well thank you mr chairman and thank you all for being here this hearing's important and timely i find the underlying subject is growing tiring at the same time we ask social media companies nicely to change their operations for the public good we hold hearings we warn of major legislative and regulatory changes and nothing gives they nibble around the edges from time to time usually when major news stories break but things continue to get worse over time and not
better which is why i've been working for years now to find reasonable and equitable policy solutions in recent years my approach has been to avoid amending section 230 because i felt that we should be considering other options first so i introduced two bills the social media accountability and account verification act and the social media fraud mitigation act both narrow in scope and neither amending 230 they would have had the ftc undertake narrow rulemaking to require more action from social media companies to investigate complaints about deceptive accounts and fraudulent activity on their platforms and i believe they strike a good balance they would have a positive impact on consumer protection without making drastic policy changes but today given the current state of affairs and the clear dangers social media is posing to society i'm more open to the amend section 230 camp than i used to be and just to drive home my initial point about how tiresome this has become the blame can't be placed solely on social media companies despite my lengthy engagement with my colleagues before introducing my bills and even after making changes to the bills based on their feedback i could not find a partner on the other side of the aisle to lock arms with me take a stand and put something bipartisan out there to at least get the conversation going honestly there are ideas coming from both sides of the aisle that are worthy of debating but the devil's always in the details and if we're not even trying to engage in a bipartisan process we're never going to get a strong or lasting set of policy solutions i'm disappointed it's taken my colleagues nearly a year to engage with me on this issue but i hope this hearing is a first step of the many steps needed to join together and hold big tech accountable ms haugen i want to thank you directly for your recent efforts to bring about a broader conversation about the harms of social media as you may recall in the
spring of 2018 mark zuckerberg testified before us during the course of that hearing he stated that facebook has a responsibility to protect its users do you agree that facebook and other social media companies have a responsibility to protect their users and if you do do you believe that they are fulfilling that responsibility i do believe they have a duty to protect their users i want to remind everyone in this room that in the majority of languages in the world facebook is the internet you know 80 to 90 percent of all the content in that language will be on facebook in a world where facebook holds that much power they have an extra high duty to protect i do not believe they are fulfilling that duty today because of a variety of organizational incentives that are misaligned and congress must act in order to realign those incentives i agree with you ms frederick let me ask you you're also a former facebook employee as part of their global security counterterrorism analysis program so i'm going to ask you the same question do facebook and other social media companies have a responsibility to protect their users and are they fulfilling that responsibility they do have a responsibility to protect their users the days of only blaming the addict and letting the dealer get off scot-free are over i think everybody recognizes this at this point and i want to say in october of 2019 mark zuckerberg stood on the stage at georgetown university and said facebook was going to be the platform that stands up for freedom of expression he has frankly abrogated those values and the difference between what these companies say and what they actually do is now a yawning chasm it's been reported and discussed today that algorithms employed by some of the biggest companies tend to lead users to content which reinforces their existing beliefs or worse which causes anxiety fear and anger all of which have been shown to lead to increased engagement from the user regardless of their
damaging effects so ms frederick given your background can you describe the national security concerns with the ways in which social media companies design and employ those algorithms and if we have time ms haugen too yeah i went to work for facebook because i believed in the danger of foreign islamic terrorism i went to make sure that the platform was hostile to those bad actors those illegal actors and i think we can imbue technology with our values this is the whole concept behind privacy by design and the programmers who actually code these algorithms need to be transparent about what they're doing and how they operate and how they impact users as well i am extremely concerned about facebook's role in things like counterterrorism or countering other state actors that are weaponizing the platform facebook has chronically underinvested in those capacities and if you knew the size of the counterterrorism team the threat investigators you'd be shocked i'm pretty sure it's under 10 people this should be something that is publicly listed because they need to be funding hundreds of people not 10 people thank you mr chairman thank you to the witnesses and i yield back the gentleman yields back the chair now recognizes mr mceachin for five minutes thank you mr chairman and i want to say to my colleague and friend mr kinzinger sir forgive me for butchering your name that i appreciate your comments about the partisanship i share them and with the notion that if we're going to have something that's going to last congress to congress as congresses change and parties change we're going to need to have some of this be bipartisan and i invite you to take a look at the safe tech act which i think takes a unique and different approach to section 230 liability that being said colleagues i would ask all of you to rethink or really understand the message we're sending when we talk about immunity because when we say immunity what
we're really saying is that we don't trust juries think about that we don't trust a jury properly instructed to get it right that's the only reason you'd have immunity because you're afraid the jury's going to get it wrong remember my colleagues juries are composed of the same people who send us to congress they're the people who trust us to make trillion-dollar judgments to decide war and peace to decide any number of things if they're wise enough to do that why are we so arrogant to believe that they're not wise enough when properly impaneled and properly instructed as a jury to get it right and so i want you to start thinking about immunity in that context as we go forward if you would be so kind as to do so i'd like to direct my first question mr chairman to mr robinson of color of change and i note that you say that there are three ways that we know we're headed towards a real solution and the first one jumps out at me it says the laws and regulations must be crystal clear now when i came to congress i was just a small-town lawyer trying to make good and i don't know an algorithm from a rhythm and blues section quite frankly but i do know how to say that immunity is not available if you violate civil rights immunity is not available for any number of legal actions that's the approach the safe tech act takes do you see that as meeting your first criterion can you comment on how and if you believe the safe tech act will ultimately help with big tech's abuses absolutely i do believe that it meets that mark i believe that it meets the mark because the fact of the matter is that facebook twitter google amazon they've all come before you and have explained that they are not subject to civil rights law and that they can get around the laws on the books and create all sorts of harms through the choices that they're making and right now we need a set of laws that make it so that they are not
immune and to your point around juries i would add regulators i would add the infrastructure that we have in this country to hold institutions accountable which gets thrown out the window when we're dealing with tech companies out in silicon valley because somehow they exist on a completely different plane and are allowed to have a completely different set of rules than everyone else and the fact of the matter is freedom of speech is not freedom from the consequences of speech this is not about throwing someone in jail this is about ensuring that if you say things that are deeply libelous if you incite violence if you incite hate you can be held accountable for that and that if you are recommending folks to that and you are moving people through paid advertisement and your business model is amplifying that there is accountability baked into it and i don't understand why we continue to let these platforms make billions of dollars off of violating things that we have worked hard in this country to move forward on and that i think is why really understanding the difference between solutions that are real and solutions that are fake matters and i think the safe tech act gets us there as one of the pieces of legislation that we need to consider mr robinson i would ask you do you agree with me that just because the immunity is removed doesn't mean the plaintiffs are all of a sudden going to win every lawsuit no of course not and i don't think anyone here believes that but what it does mean is that we end up actually being in a place where there can be some level of accountability and you don't end up having a situation where mark zuckerberg or sheryl sandberg or jack or anyone else can come here and sit before you and lie about what their platforms are doing decide when they're going to be transparent or not and walk away and feel absolutely no accountability and this is not just impacting black
communities this is impacting evangelical communities this is impacting lgbtq communities this is impacting women it's impacting people at the intersection of so many different experiences in life that we have allowed these companies to operate in ways that are completely outside of what our rules should look like thank you mr chairman i thank you for your patience in allowing me to trespass on your time and i yield back the gentleman yields back the chair now recognizes mr bilirakis for five minutes thank you mr chairman i appreciate it i want to focus my questions on how section 230 interacts with child exploitation online in 2019 research from the national center for missing and exploited children reported that child pornography had grown to nearly 1 million detected events per month exceeding the capabilities of law enforcement that number increased to over 21 million in 2020 and is on track to grow again this year unfortunately ms frederick if a tech company knows about a particular instance of child pornography on its platform but decides to ignore it and permit its distribution would section 230 prevent the victim from suing the tech company i would say that given section 230's broad interpretation by the courts companies have historically avoided liability for hosting similar content okay as a follow-up if a brick-and-mortar store knowingly distributes child pornography can the victims sue that particular business in your opinion yes obviously thank you i don't see any reason why we should be giving special immunities mr chairman to online platforms that don't exist for other businesses when it comes to a business knowingly exploiting our children and facilitating child pornography it would be a discredit to us all to allow this to continue which is why i have a bill draft i think you can see that seeks to end this despicable protection so i request bipartisan support in this matter i think we have agreement and
in general i believe we have agreement and this is a very very informative hearing mr chairman and thank you for calling it i yield back the balance of my time i know you'll like that and by the way the pirates do have a bright future thank you i can only hope you're right about that gus let's see the chair recognizes mr veasey for five minutes mr chairman thank you very much for holding this very important hearing on 230 it's really timely and critical that we start talking about how we can move the needle on this issue and hold big tech accountable we know that recent reports demonstrate concerning trends that should put every member of this committee and certainly every member of congress on alert about the shortcomings of big tech and their repeated promises to self-regulate i'm optimistic that we can get something done because i really do think that social media platforms are no question a dominant source of news now in our lives whether it's entertainment personal connections news even local news advertising all of that happens in the social media world but we also continue to see social media platforms acting in a problematic way and in some instances even endangering the lives of very young kids and today is no different social media platforms continue to behave without an honest solution-oriented approach to stop the spread of misinformation to manipulate public opinion and we know rampant disinformation about things like voter fraud is still present in our communities and as this new variant of covid-19 lurks around the corner congress really needs to act now so we can stop the spread of misinformation around covid-19 and this new variant that's about to come through it's very disconcerting to think about some of the things that we're about to hear about this new variant that's just a bunch of bs and while social media platforms continue to
flourish in the number of users they are able to keep on their platforms the number of reported harms associated with social media is just one of many consequences we are seeing as a result of big tech business practices for instance the anti-defamation league says about 41 percent of americans experienced some form of online harassment in the past year so doing nothing is not an answer we have to do something and i think that we can do that and i wanted to ask the panel a question again there's no doubt that big tech companies have been flat-footed when it comes to getting ahead of removing harmful content and disinformation on the most popular social media platforms ms frances haugen you mentioned numerous times during your interview on 60 minutes that you wanted to show that facebook cares about profits more than public safety in november of this year facebook which has now rebranded itself as meta said it is working with the civil rights communities privacy experts and others to create a race data measurement tool given your experience and background in the field can you talk about how facebook incorporates such recommendations into these types of measurement tools and is there a criteria or a set of guidelines that facebook is considering when shaping the product while i was there i was not aware of any actions around analyzing whether or not there was a racial bias in things like ranking one of the things that could be disclosed by facebook but is not is the concentration of harm on the platform for every single integrity harm type every safety harm type a small fraction of the users are hyper-exposed to that harm it could be misinformation it could be hate speech and facebook has ways to report that data in a privacy-conscious way today that would allow you to know whether or not harms across the platform were equally borne but they don't
do it so this new tool that they're talking about creating do you see potential drawbacks to race data measurement which is supposedly meant to increase fairness when it comes to race in the us on this platform is there anything that we should be on the lookout for when i was working on narrowcast misinformation we developed a system for segmenting the us population in a privacy-conscious way we looked at the groups and pages that people interacted with and then clustered them in a non-labeled way so we're not assigning race to anyone we're not assigning any other characteristics but we're looking at whether when we look at consistent populations they experience harms in an unequal way i don't believe there'd be any harms from facebook reporting this data and i believe it is a responsibility of the company to disclose the unequal treatment on the platform because if they're not held accountable and there's no transparency they will not improve there's no business incentive for them to make this more equitable if it comes at a loss in profits thank you very much mr chairman i yield back the gentleman yields back the chair now recognizes mr johnson for five minutes thank you mr chairman you know this topic we're discussing today is certainly not a new one this committee has told big tech that they cannot claim to be simply platforms for third-party information distribution while simultaneously acting as content providers and removing lawful content based on political or ideological preferences in other words big tech cannot be both tech platform and content provider while still receiving special protections under section 230
free speech involves not only being able to say what you believe but also protecting free speech for those with whom you strongly disagree that's fundamental in america and big tech should not be granted the right to choose when this right to free speech is allowed or when they should prefer to hide edit or censor lawful speech on their platforms they are not the arbiters of the freedoms constitutionally provided to the american people this committee has brought in big tech ceos numerous times now so far they have chosen to arrogantly deflect our questions and ignore the issues we presented to them it took a whistleblower whom we're fortunate to have with us today to expose the harm that facebook and other social media platforms are causing especially to children and teens at an impressionable age that harm concerns me i have a discussion draft that would require companies to disclose the mental health impact their products and services have on children perhaps such requirements would prevent the need for a whistleblower to expose highly concerning revelations including that executives knew the content of their social media platforms are toxic for teenage girls and perhaps it would incentivize these executives to come back to us with solutions that enable a safer online experience for their users rather than attempting to debunk the evidence of their toxicity however we also must be careful when considering reforms to section 230 as over-regulation could actually lead to additional suppression of free speech it is our intent to protect consumers while simultaneously enabling american innovation to grow and thrive without burdensome government regulation i don't know who that was that wasn't me mr chairman that's not my accent as you can tell ms frederick the chinese communist party has multiple agencies dedicated to propaganda from the ministry for information industry which regulates anyone providing information to the
public via the internet to the central propaganda department which exercises censorship powers through licensing of publishers how do big tech's actions compare to those of the ccp the chinese when it comes to censoring content i'd say that a healthy republic depends on the genuine interrogation of ideas and having said that i am very troubled by what i see as an increasing symbiosis between the government and big tech companies i talked about psaki's press conference but what i didn't say is in that july press conference again from the white house podium she said if one user is banned from one private company they should be banned from all private companies' platforms that to me is harrowing what company is going to want to start up if 50 percent of their user base is automatically gone because the government says so so in my mind that increasing symbiosis between the government and tech companies is very reminiscent of what the ccp does okay ms haugen in your testimony to the uk parliament you recommended that a federal regulator should have access to platforms' internal processes and the ability to regulate their process for removing content just yesterday one of my democrat colleagues agreed with your recommendation for more government intervention i'm seriously troubled by my colleagues thinking that government involvement in private business operations to regulate content is even an option to put on the table bigger government means less innovation less production and less progress not to mention the very serious first amendment implications this is un-american ms frederick quickly can you talk about the negative impacts this approach would cause it's authoritarianism okay mr chairman i yield back i was mischaracterized the only thing i've advocated for is transparency and the government mandating that facebook must articulate what it's doing to solve problems because today they lie to us they give us false data when they rarely give any data and
they always just say we're working on it they never actually give progress so i just want to clarify my opinions the gentleman yields back i yield back mr chairman the chair recognizes mr soto for five minutes thank you mr chair lies about the vaccines lies about the 2020 election lies about the january 6 insurrection all proliferate on social media to this day it seems like as we're working on key reforms like protecting civil rights accountability for social media companies protecting our kids the main opposition by republicans today the talking point of the day is they want a license to lie the right to lie without consequence even though deliberate lies are not free speech under new york times v sullivan according to our supreme court what was scenario number one senator tom cotton referring to his wuhan lab theory he literally said quote we don't have evidence that this disease originated there to the new york times yet radical right-wing media then goes on to say the virus is part of china's bio-warfare program that's a terrible example to use and then after president trump was impeached for collusion with ukraine you want to talk about a hunter biden laptop really i'm deeply concerned about how these things are already spreading in spanish language media as well i got to speak to mr zuckerberg in march about that and he said there's too much misinformation across all these media he mentioned deterministic products like whatsapp and he also said quote there was certainly some of this content on facebook and it's our responsibility to make sure that we're building effective systems that can reduce the spread of that i think a lot of those systems performed well during this election cycle but it's an iterative process and there are always going to be new things that we'll need to do to keep up with the different threats we face then i asked him to commit to boosting spanish language moderators and systems on facebook especially during
election season to prevent this from happening again ms haugen you left facebook about two months after that hearing in may and has there been significant updates since that hearing on protecting against spanish-language misinformation in short has mark zuckerberg kept his word i do not know the progress of the company since i left i do know before i left there was a significant asymmetry in the investment in safety systems we live in a very linguistically diverse country and yet facebook overwhelmingly 87 percent of its budget for misinformation is spent exclusively on english all the rest of the world falls into that remaining 13 percent when we live in a linguistically diverse society where there aren't safety systems for non-english speakers we open up the doors to dividing our country and being pulled apart because the most extreme content is getting the most distribution for those populations and you mentioned specifically in other hearings about ethiopia and the concern there can you go into that a little more we are seeing a trend in many countries around the world where parties are arising based on implying that certain populations within their societies are subhuman right one of the warning signs for ethnic violence is when leaders begin to refer to a minority as things like insects or rodents right dehumanizing them at the same time because facebook's algorithms give the most reach to the most extreme content facebook ends up fanning the flames of this extremism around the world and in the case of ethiopia and myanmar that's resulted in people dying thank you ms haugen mr robinson we also see a lot of lies and misinformation related to the vaccines how has misinformation impacted communities of color taking the covid-19 vaccine well because of the ways in which facebook is not transparent about their algorithms transparent about how ads can be deployed we actually don't have a full understanding of what we're dealing with
because you know we are dealing with deep levels of deceptive and manipulative content content that can travel far within subsects of communities but without any clarity of what's happening until it's sometimes far too late until you can't actually deal with it the ways in which money can be put behind those for paid advertisement to sell people things that are not approved that haven't been tested in opening statements we heard about drugs being sold online and being marketed through algorithms and that's exactly what we're seeing when it comes to black communities mr robinson sorry my time's limited would you say misinformation reduces vaccination rates among communities of color it can both reduce vaccination rates and increase people going down rabbit holes of using all sorts of untested drugs thank you my time's expired the gentleman yields back the chair now recognizes mr long for five minutes thank you mr chairman and thank you all for being here today ms haugen the third pillar of the big tech accountability platform is addressing big tech's relationship with china much of the information you brought forward discusses the challenges associated with facebook's business model and how it chooses content that users see on their platform which leads to many of the harms that the platform causes today we're looking at this issue all across big tech not just on facebook one of the platforms we're paying close attention to as you know is tiktok reports suggest that tiktok's parent company bytedance coordinates with the chinese communist party to facilitate abuses against uyghur muslims and pressures united states-based employees to censor videos that the chinese communist party finds culturally problematic or critical of the chinese communist party how does tiktok's platform business model make it ripe for being censored by china that is a wonderful
question so i often get asked questions about the difference between personal social media which is what facebook is you know you're connecting with your family and friends and what i call broadcast social media where people create in order to get reach tiktok is specifically designed with no contract between the viewer and the content they receive you know you get shown things and you don't know exactly why you got shown them and the way tiktok works is they push people towards a very limited number of pieces of content you can probably cover 50 percent of everything that's viewed every day with a few thousand pieces of content per day that system was designed that way so that you could censor it like when it was in china they're intentionally set up so that humans can look at that high distribution content and choose what goes forward or not tiktok is designed to be censored okay thank you and a question for you ms frederick your testimony makes clear that holding big tech accountable means increasing transparency into their practices you also have a background in national security we don't know how tiktok monitors its platforms or censors its content they could easily be doing the bidding of the chinese communist party and we wouldn't know anything about it do you have recommendations on how we can increase big tech's transparency for example how do we know if the content viewed by americans on tiktok isn't spreading communist propaganda i would say incentivize transparency certain companies of their own volition right now give quarterly reports on how they interact with law enforcement you know how they employ their community standards but other tech companies are not doing this so there has to be some teeth when you incentivize that transparency among tech companies and when it comes to tiktok in particular as you said with a parent company headquartered in beijing you have to assume
that they are beholden to the ccp in this instance the governance atmosphere of china the 2017 cybersecurity law the national security laws basically say that whatever private companies do whatever data they ingest however they interact all of it is subject to the ccp when it comes knocking so in my estimation i don't believe any american right now should be on tiktok and there are social contagion elements there as well the algorithm their secret sauce is ravenous for user engagement and when parents of nine to eleven-year-old americans were surveyed these parents said 30 percent of them are on tiktok this is more than instagram this is more than facebook this is more than snap so when we think about the formation of our young minds in this country we have to understand that bytedance a beijing-based company has their hooks in our children and we need to act accordingly so you say that the parents are saying that 30 percent of their children are on tiktok 30 percent of parents say that their children are on tiktok nine to 11 year olds in particular so i'd say that there's probably another 30 percent that don't know what their kids are looking at i mean i think it's a lot higher number than 30 percent in my opinion tiktok's hyper-amplification algorithms also make it even more addictive than instagram because they can choose the absolute purest addictive content and spread it to the largest audience so i agree with her this is a very dangerous thing and it's affecting very young children okay mr chairman thank you for holding this hearing today and thank you all for your participation here today i really appreciate it it's a very serious subject as we all know and mr chairman i yield back the gentleman yields back the chair recognizes mr o'halleran for five minutes there we go thank you chairman doyle you know there's no doubt about the positive and negative impacts that technology
platforms have in our society today we've been talking about it all day long as a father and grandfather and like many americans i was outraged and am outraged to read about the inner workings of facebook brought to light by ms haugen facebook's reckless disregard for the well-being of children and teenagers especially given their internal research is completely unacceptable instead of using facebook and instagram to create positive social experiences for minors facebook is exploiting our children and grandchildren for clicks and ad revenue this is a particular problem for teenage girls facebook's own internal research found that using instagram made teenage girls feel worse about themselves leading to depression eating disorders and thoughts of suicide and yes even death i don't know how they can come up with these decisions that they've come up with i'm a former homicide investigator as well as a father and grandfather i've seen a lot of suicide i've witnessed a lot of death in our country and i don't know how somebody can make these decisions knowing the information they knew and the impact it was going to have on the families and children within our society today facebook thinks it's okay i think this is again an outrage this is clear evidence that something needs to change we need transparency for companies like facebook we need to know what they are showing our children and why and to identify how these algorithms come together and what impact they will have on the rest of our society we can't have facebook and their algorithms taking advantage of our children our children and families are more than just fodder for corporate greed tech corporations also have a moral responsibility to children and families in our country and the rest of the world ms haugen can you tell us more about how instagram uses demographics and a user's search history to serve up content and ads even if the content and ads are harmful
to the user facebook's systems are designed for scale one of the things that has been seen over and over again in my senate hearing they showed explicit examples of this is that facebook does vet ads before they're distributed but they do it very casually they don't do it rigorously enough and as a result in the senate hearing they demonstrated that you can send ads for drug paraphernalia to children to 13 year olds if you want to there's a lack of accountability when it comes to ads and a lack of detail the second question is around things like search like how do those interests then percolate into spirals like down rabbit holes when you engage with any content on instagram facebook learns little bits of data about you they learn what kinds of content you might like and then they try to show you more but they don't show you random content they show you the content most likely to provoke a reaction from you and facebook has demonstrated in the case of things like teenagers that you can go from a search query like healthy eating to anorexia content within less than two weeks just by engaging with the content that you're given by facebook thank you ms haugen facebook had this data that showed how harmful instagram is to teenage users did facebook executives really ignore these findings and make no meaningful changes did they really decide that their profits were more important than the well-being of our kids i'm trying to understand who works at facebook that makes these types of decisions and why they make them when they know that they're going to have a negative impact especially on our children in this society i think there are two core problems that lead to this situation the first is that facebook has an unflagging faith in the idea that creating connections is more valuable than anything else bosworth who i believe is now the cto of facebook that's an excuse that's a faith of greed that's not a faith of
moral responsibility i don't attribute intentions because they believe that connection is so magical that it's more valuable than say a kid killing themselves but he's quoted there was a piece that was leaked a couple of weeks ago where in it he says it doesn't matter if people die we are going to advance human connection the second question is how can these decisions be made over and over again facebook has a diffuse responsibility like when antigone davis appeared before the senate she couldn't name who is responsible for launching instagram kids or who would make that decision because facebook's organizational structure has no one who is responsible for anything they always say this committee made the decision we need to require them to put names on decisions because then someone would take a pause and say do i really want my name on this thing that might hurt someone thank you very much and i yield the gentleman yields back the chair now recognizes mr walberg for five minutes thank you chairman doyle thanks for having this hearing and to our panel thank you for being here ms haugen you state in your testimony quote facebook became a one trillion dollar company by paying for its profits with our safety including the safety of our children and it is unacceptable i agree wholeheartedly and would go even further to say that it's not only unacceptable it's morally and ethically wrong in the march hearing and others we heard big tech companies constantly lie to us and say that they are enhancing safety protections when in reality what they're doing is increasing censorship for more monetary gains and it's a tragedy that half of this country including many friends and family of mine feel that they need to use these platforms and they're amazing i have a love-hate relationship with the platforms i love them and i hate them it's amazing what they can do but when a family member of mine has to stay off of content areas
because of the potential of not being able to use facebook for his business that's concerning i'll also state very clearly that while i love everybody on this committee and in congress good friends and colleagues i don't want any of you censoring me i don't trust you to do it the only one i trust to do the censoring is me and you shouldn't trust that so the issue here is not so much with adults i don't want to be treated as a child i want to be accountable for what i believe what i read and what i accept and so that's an adult issue but with kids it's a different story as a parent i've now raised three kids my wife and i did creative things to try to keep them from using the tv when we were gone but now with my grandkids six of them young children it seems nearly impossible to keep kids away from harmful digital content and that's where i have my major concerns facebook knows that its platforms cause negative mental health impacts on young users and yet they continue to exploit children for profit while selling parents a bill of goods they refuse to abandon instagram for kids saying they believe building the app is the right thing to do they've said that in front of us but it's not just facebook google tiktok and snapchat have built empires on collecting and selling our children's data and have become havens for predators seeking to exploit and lure vulnerable populations as the lead sponsor of the only bipartisan bill in the house to update the children's online privacy protection act i'm very worried about the harm tiktok poses to our kids and the national security threat that its chinese communist party-backed mothership bytedance poses to our democracy recently before the senate commerce committee a tiktok executive was unable to distinguish what american data may fall into the hands of mainland china ms haugen i understand you have a background in both of these realms can you please give us a sense of the threat this entity poses to
our society and to our children there have been past scandals even in the last year or two regarding tiktok where tiktok banned all content from disabled users and from homosexual users to protect them from bullying when you have a product that can be so thoroughly controlled we must accept that if bytedance wants to control what ideas are shown on the platform the product is designed so that they can control those ideas they can block what they want to block there is nowhere near enough transparency in how tiktok operates and i worry that it is substantially more addictive than even instagram because of its hyper-amplification focus thank you ms frederick it's become abundantly clear that big tech will not enact real changes unless congress forces them to i have great concerns about that it's clear they can lie to us they'll keep doing that they have no intentions of changing so ms frederick i've led a discussion draft that would carve out section 230 liability protections for reasonably foreseeable cyberbullying of kids under 18 meaning there would need to be an established pattern of harmful behavior for this to apply do you think this approach will actually force big tech platforms to change their behaviors why or why not so i think there are a couple of benefits and challenges to something like this the benefit is that it would address genuine problems on the platform but you run into some issues when it comes to the definition so you want to make the definition linked to a standard and as tight as possible because we see what definition inflation looks like i was in a room in orlando florida talking to a bunch of grandmothers nobody under probably 60 and i asked them given facebook's rollout of a pilot program on extremism and creating that friction between extremist and potential extremist content almost every single one of them raised their hands because they got that extremism warning that
they potentially engaged with extremism or know an extremist that definition inflation is a critical problem so i think if you tighten up that definition make it as tight as possible i think it will go far in addressing some of these problems that exist on the platform that are actually genuine thank you and i yield back the gentleman yields back the chair recognizes ms rice for five minutes thank you so much mr chairman i really want to thank you for having this hearing you know i'm very happy that we're here discussing potential future legislation but i believe that the level of congressional inaction when it comes to anything social media company related is astounding and it is our greatest national moral failure i am not a parent but i'm one of 10 kids and i can tell you if my mother had had to police 10 children using tiktok and instagram and facebook i don't know what she would have done and i'm loath to i don't mean to be critical of anyone's parenting but one thing that we should listen to when it comes to all of these social media honchos and bigwigs none of them let their own children use any of these platforms none of them so why do we i really hope and i'm grateful for all the witnesses here today i really hope that we can come up with legislation that will once and for all send a message to these social media platforms that have taken over every aspect of our life not just here in america but across the planet and finally put some bite in the law i spent nine years as the elected d.a in my home county before i came to congress and i was in a unique position to understand where the law failed to address anti-social behavior we know now right and then i would go to albany and say okay we need to get this law to protect this or to do that and i understand that it takes a while to do that kind of thing to come to a consensus but i'm hearing overwhelmingly from my colleagues on both sides of the aisle
today that we all understand the urgency to do something in this instance so mr steyer i would like to start with you it was the wall street journal that published an article and i think mr chairman you might have made reference to this article that was published in september titled facebook knows instagram is toxic for teen girls company documents show we've talked a lot about this today but it was really disturbing to learn that and to read internal communications from facebook employees and managers that showed that they were fully aware of how their algorithms harm young users but that they continue to curate content in that manner anyway mr steyer can you please maybe explain a little deeper why teen girls are particularly vulnerable to this cycle of emotional and or psychological manipulation online i don't think it's fair to talk about teenage girls in kind of isolation when there are so many different groups who are impacted negatively by these social media companies and their algorithms but it's important that we help educate young girls to tell them the truth about what information is hitting them in the face and affecting their lives their very lives so if you could just expound a little bit more on that because it's important that the children the actual users who are the victims here and they are victims understand you're absolutely right congresswoman and you're absolutely right too first of all the reason instagram is such a powerful platform is it's comparative so kids and teens constantly compare themselves to each other it's the essence of their self-esteem it's the essence of how they grow up we all understand this because we all were kids and teens at one point so it's why the platform is so powerful i think the second point is you're absolutely right that this has to be a bipartisan issue and quite frankly i would like to say to this committee
you've talked about this for years but you haven't done anything right here show me a piece of legislation that you passed we had to pass the privacy law in california because congress could not on a bipartisan basis come together and pass a privacy law for the country and i would urge you to think about that and put aside some of the partisan rhetoric that occasionally seeped in today and focus on the fact that all of us care about children and teens and that there are major reforms to 230 that would change that remember freedom of speech is not freedom of reach and so it's the amplification and the algorithms that are critical so transparency as a number of you have mentioned on both sides is critical and the other thing i want to say to your very good question congresswoman rice is that 230 reform is going to be very important for protecting kids and teens on platforms like instagram and holding them accountable and liable but you also as a committee have to do privacy antitrust and design reform so in a comprehensive way this committee in a bipartisan fashion could fundamentally change the reality for kids in society and i really hope you will do that because there's been a lot of talk but until there is legislation the companies that we're referring to are going to sit there and do exactly what they're still doing so thank you very much for the question and thank you for a bipartisan approach to this issue very good advice mr steyer thank you so much and thank you to all the witnesses and also thank you mr chair i yield back the gentlelady yields back mr duncan welcome you are recognized for five minutes thank you mr chairman and you know we've had the opportunity to discuss these issues with the heads of facebook twitter and google in the past and i've asked those ceos in this hearing room do you believe you are the arbiters of absolute truth i was sitting here listening to the hearing and thinking about the hearing even before i came in
today and i kept coming back to this book 1984 the words in this book that george orwell wrote ring so true when we talk about where we are today with big tech and all the things that have been discussed here not just 230 protections some quotes from that book we know that no one ever seizes power with the intention of relinquishing it who controls the past controls the future who controls the present controls the past do you see that the whole aim of newspeak is to narrow the range of thought narrow the range of thought in the end we shall make thoughtcrime literally impossible because there will be no words in which to express it we've seen these arbiters of truth at least in their minds with big tech actually scrub words that can be used on their platforms i still think that the question before us is social media platforms need to check themselves and understand they're not gods little g they are not arbiters of truth for the past two years we've seen an unprecedented onslaught from big brother tech on conservative thought it's interesting mr chairman we don't see liberal thoughts suppressed by big tech platforms because big brother tech believes they should be the arbiters of truth and they hate conservatives so they silence us president donald j trump using the at donald j trump handle was the single most effective and most successful social media user and influencer ever twitter didn't like his politics so much so that they de-platformed him you know i think about this book there's a thing called a memory hole it's a small chute leading to a large incinerator anything that needed to be wiped from the public record was sent to the memory hole donald trump's twitter handle was sent to the memory hole they tried to wipe it they wanted to make him an unperson someone whose existence had been excised from public and private memory the legislation democrats are bringing forward is in that same spirit we know what's best for you and if you disagree then shut up you
would allow yourselves to define harm and conservative thought is harmful to the nanny state you would allow yourselves to define hurtful and conservative thought is famously hurtful to the nanny state as our friend ben shapiro said facts don't care about your feelings you would allow yourselves to define extremism and then label anyone who opposes you as extremist that's doublethink doublethink in 1984 is the act of simultaneously accepting two mutually contradictory beliefs as correct these are the tactics of the old soviet union the communists there all dissent must be silenced and i think we've seen big tech try to silence those they didn't agree with because they blame it on an algorithm or whatever but truth be known it has been exposed that these efforts were consciously put forward it wasn't just some algorithm run by ai you're holding this hearing in that spirit today the same soviet spirit as build back better that burdens taxpayers with trillions of dollars of new debt weakens our currency with inflation and harms our people in that same spirit you come today with the left-wing government's alliance with left-wing big tech to silence conservatives like you silenced donald trump think of thinkpol from 1984
Thinkpol: it's a Newspeak word describing the secret police of Oceania, who are responsible for the detection, prosecution, and elimination of unspoken beliefs and doubts that contradict the Party. I would say, contradict the liberal thought in this arena. I want to ask one question real quick, because I think you all get the gist of what I'm trying to say. Ms. Frederick, in your testimony you talk about the pitfalls of having Congress try to define harm. One of the proposals we are considering today would remove Section 230 protection from companies that use algorithms to promote harm. What are some of the consequences of taking that approach?

I think it's absolutely worth noting that when Trump was banned from 17 different platforms in two weeks, the ACLU spoke out against the ban; no friends of conservatives, right? Angela Merkel spoke out against it. Where are they today, though? Russian dissident Alexei Navalny spoke out against the ban, and López Obrador as well. So everybody recognizes the threat of censorship. It's not just Republicans, it's not just conservatives; it's independently minded people who think that our health depends on the genuine interrogation of ideas. Tech companies are not allowing that to happen. We need to strip them of immunity when they censor based on political viewpoints.

My time's up, Mr. Chairman. Censorship is bad; we need to keep hollering that from the rooftops. I yield back.

The gentleman's time has expired. The chair now recognizes Ms. Eshoo for five minutes.

Thank you, Mr. Chairman. Let me start by thanking Ms. Haugen for your courage in coming forward. What you've done is a great act of, I believe, public service. The documents you disclosed relate to "Carol's Journey," a project that Facebook researchers set up to observe the platform's recommendation engines. Now, I only have five minutes, so maybe you can do this in a minute and a quarter or whatever, but can you briefly tell us what the research found as it relates to
Facebook's algorithms leading users down rabbit holes of extremism?

Facebook has found, over and over again, on the right, on the left, and with children, that you can take a blank account, with no friends and no interests, and follow centrist interests: you can follow Donald Trump and Melania, you can follow Fox News, or you can follow Hillary and MSNBC. And just by clicking on the content Facebook suggests to you, Facebook will get more and more and more extreme. So on the left, within three weeks you go to "let's kill Republicans." It's crazy. On the right, within a couple of days you get to QAnon; within a couple of weeks you get to "white genocide." There isn't any disentangling it, as Facebook claims; there's only Facebook's system, amplifying and amplifying and amplifying. And this happens because that content is the content you are most likely to engage with, even though, when surveyed afterward, you say you don't like it.

Thank you very much. To Mr. Steyer: Jim, it's wonderful to see you again, and thank you for testifying today. Your testimony mentions how addictive design features like Snapstreaks are, and how harmful that is. Can you tell us more about how these addictive designs prey on children and teens in particular?

Absolutely. Good to see you, Congresswoman Eshoo. It's very clear that platforms like Facebook and Instagram, but also, as you just mentioned, Snapchat, YouTube, and others, have literally designed features, like the auto-replay that we're all familiar with (three, two, one, you watch the next episode), that are designed to keep you there. As I mentioned in my opening remarks, this is an arms race for attention. Attention and engagement, as Ms. Haugen has made clear to the public, are the basis of the business model for a number of the social media platforms, and what that does with younger minds, particularly children and teens, less developed minds, is constantly get you to come back, because you are being urged, in very, very creative and strategic ways, by very sophisticated
engineers to stay on that platform, because they make more money. So it's the core of the business model that's at stake here, and the point that has been made repeatedly, and it should be part of the legislation that this committee addresses, is that you have to have transparency about the design and the algorithms. Those are separate issues; I believe this committee is going to have a separate hearing next week that's going to go to design issues, Congresswoman Eshoo. They're very important, because at the end of the day, if we are able to transparently see how Facebook builds its platform and nudges you, particularly all of us, but mostly kids and teens, to stay on their platform, that will change everything. It will also change everything if their liability protection for that behavior is removed and they're held accountable. So you were onto something big. It's why I said earlier that this committee should take reforms of 230 very seriously and move forward on the legislation, but also look at this in a comprehensive way and include privacy by design and all the design issues you mentioned. That is what will protect our kids going forward, and will make this the first bipartisan effort in Congress in years to protect children.

Thank you very, very much, Jim. To Mr. Robinson: I think we're all moved by the examples of civil rights harms that you cited in your written testimony. Can you elaborate on how these are fundamentally issues of product design and business model, and not issues of user-generated content?

Well, there are all sorts of things that are connected to user-generated content, but the fact of the matter is that this is about what gets amplified, what gets moved, not the content. Another thing that I do think is important is that companies that don't hire Black people can't be trusted to create policies to protect Black communities, and these companies have, time and time again, made a choice not to hire Black people. And so the Russians knew more about Black people
during the 2016 election than the people at Facebook, and so the disinformation that was allowed to travel on their platform was a direct result of choices that they've made. The only other thing I'd like to add is that your bill, the Online Privacy Act, which creates a data protection agency, is also incredibly important to this comprehensive set of things that we need, because we need infrastructure in our government to actually be able to meet these 21st-century needs.

Thank you very much, and I yield back, Mr. Chairman.

The gentlelady yields back. The chair now recognizes Mr. Curtis for five minutes.

Thank you, Mr. Chair, the ranking member, and our witnesses. It's very interesting to be with you today. I think we understand censorship; it's an easy concept: the suppression of speech, public communication, or other information. When we think of censorship, we generally refer to the limiting of objectionable, harmful, obscene, or dangerous information. Censorship can be dangerous because its intent is to control thought. I'm not sure we understand the other side of this as well. Now, we've talked about it, and I'm not going to give you any new ideas, but I'm going to talk about it in a slightly different way today, and that is the attempt to control thought by presenting or feeding objectionable, harmful, obscene, or dangerous information. As I was preparing for this, I could not think of a word for the opposite of censorship; that's what I was trying to come up with. And then it dawned on me: there are words. Brainwashing. Propaganda. We have done this in war: we've dropped pamphlets across enemy lines to influence people's thoughts and behaviors; we've had radio stations infiltrating behind enemy lines. That's what this is, isn't it? I'd like to look at this through a slightly different lens, which is the algorithm transparency that we've talked about, and customers having the ability to tailor their social media experiences based on what they want, and not what
the social media giant wants. To me, the problem is content presented to people without their consent, with an accompanying agenda; that's the biggest problem. Ms. Frederick, can you define, in simple terms that everybody can understand back home, what an algorithm is and how social media uses it?

So algorithms are code, built by programmers, designed by programmers, that basically take information, the input, however these are designed, however the data is labeled, et cetera, and produce an output that has an effect. So, input to output; an algorithm is built by people.

I think that is a critical element: they are built by people. Companies can't hide behind the algorithms; they're not just automatons that go forward. They're built by people. I paid for a substantial amount of a PhD for my son, who's a data scientist; he has worked for a grocery store chain on predicting a rush on milk and eggs and things like that. When we talk about transparency, do we really have the ability to give the average layperson a view into these algorithms in a way that they can really understand what they are?

I think to some degree. The previous witness was just talking about privacy by design, privacy-preserving technology. So there are ways to actually design these programs, to design these machines, so that they are imbued with values like privacy. So there is a way to manipulate them, and people should know if they are being manipulated.

I think what you're saying, if I understand it right (you tell me if I'm wrong), is that these algorithms could be created to manipulate, in a harmful way, the way people think. It's happened before. Are there any examples that come to mind, just quickly, that you could share that we would all understand?

I think we're all familiar with the Facebook Files and the documents that have been released that talk about the design of these algorithms and how they were manipulated, starting in 2018, to increase engagement.

Could an
algorithm be created to influence the way a person votes?

It could contribute to their cognitive processes and the decisions that they eventually make in the voting booth.

Should somebody have protection from the law who creates an algorithm to determine how somebody votes? I'm not sure I know the answer to that myself.

I think that's why we're here today, right?

But that's what's happening, isn't it?

In questions that run up against very serious debates, I think individual liberty and individual freedom in general should always be paramount.

If there's a bad actor, not the companies themselves, whose intent is to influence how somebody votes, let's hypothetically say Russia, and a company facilitates their intent and their agenda, should they be protected from the law?

I think you run into a problem of attribution here, when the strategic intent of these nation-states blends with patriotic netizens, when they blend with hacktivists, when they blend with people who just want to be chaos agents in general.

Seven seconds. Let me try to make a point. We have a town square, right? People can post things in this town square; everybody understands that. I was a mayor; we could have that. It gets very complicated if I, as the mayor, decide I'm going to take some things down, and I'm going to take some things and duplicate them and put them back up. It's that simple, right? I think what we're talking about is: where are the boundaries in this, and how do we find the boundaries?

Mr. Chairman, may I briefly comment on something?

Yes, very briefly.

In 2018, when Facebook made that change, political parties across Europe, from a variety of different political inclinations, said: we were forced to change our positions to more extreme things, on the left and on the right, because that's what now got distributed; we saw a difference in what we could run. The idea is that the positions our political parties can now take are influenced by Facebook's algorithms, and the changes that influence
the elections, because it controls what we get to even vote on in the ballot box.

The gentleman's time has expired.

Mr. Chairman, thank you; I yield back.

The chair now recognizes Ms. Matsui for five minutes.

Thank you very much, Mr. Chairman, and first of all I want to thank you for holding this legislative hearing today. This is really not the first time that our subcommittee has met to consider needed updates to Section 230, and it certainly won't be our last. While our discussion today can and should be measured and fact-based, we cannot lose sight of what brings us here today: a crisis exacting an immense human toll and undermining our shared democratic values. The magnitude of this crisis will necessitate a comprehensive approach that has implications for privacy, antitrust, and, of course, Section 230 reform. I introduced the Algorithmic Justice and Online Platform Transparency Act with Senator Markey to bring needed transparency to the algorithms employed by online platforms and establish clear prohibitions on the most discriminatory algorithms in use today. The bill has been endorsed by 14 of the most important public interest groups, like the Anti-Defamation League, Consumer Reports, and two organizations that are testifying here today, Free Press Action and Color of Change. I'm hopeful my bill will be included on the agenda for the Consumer Protection Subcommittee hearing on the 9th. Like many parts of this country, I represent a region that is suffering from an acute shortage of affordable housing. That's why it's so alarming for me to see case after case of discrimination in housing opportunities online. Recently, the Department of Housing and Urban Development took action against Facebook over concerns that its targeted advertising platform violates the Fair Housing Act by encouraging, enabling, and causing unlawful discrimination through restricting who can view housing ads. Mr. Robinson, as a simple yes or no, to set the stage: in your experience, are big tech algorithms and
practices disproportionately impacting people of color?

Yes.

Thank you. I think it's important to reiterate that to frame our discussion. My Algorithmic Justice and Online Platform Transparency Act establishes an interagency task force, composed of a broad group of agencies including the Federal Trade Commission and Housing and Urban Development, to investigate discriminatory algorithmic processes online. Mr. Robinson, when it comes to enforcement, do you believe including sector-specific expertise, like HUD for housing, is important to effectively document and police instances of discrimination within specific industries?

Yes.

Mr. Robinson, are you aware of instances in which Facebook or other platforms have designed their products in any manner that allows advertisers or sellers to discriminate in ways that are inconsistent with the country's anti-discrimination laws?

Yes, and I've spoken to them about it directly.

Okay, fine, thank you. I'm very concerned about youth mental health. I have grandchildren, teenagers, and I really see how connected they are to their social media through their devices. Now, recent revelations from witnesses here today, Ms. Haugen, confirmed what many of us already knew to be true: that social media is harming the mental health of America's youth, that it is especially pernicious for teen girls, and that Facebook is aware of the problem. The results in these internal documents speak for themselves: teens blame Instagram for increases in anxiety and depression, and Instagram made body image issues worse for one in three teen girls. Ms. Haugen, clearly there is a potent mix of psychology and engineering at play here. Can you tell me about the backgrounds of the employees that these companies hire to help them exploit youth psychology with targeted algorithms?

Facebook employs researchers who have PhDs, who may or may not have expertise specifically in child psychology, and there are specifically advocates who work with external partners
to develop things like the interventions on self-harm. The question, though, is how much that actually reaches people. When I was there, there was a dashboard, the self-harm dashboard, which Facebook loves to promote, and it had only been shown hundreds of times per day. There's a question of what scale of intervention Facebook should be doing, and I don't believe Facebook is acting strongly enough to protect our children.

Okay, well, thank you very much, and I thank you very much for what you've done, too. I yield back.

The gentlelady's time has expired; she yields back. The chair is going to recognize Mr. Welch for five minutes.

Thank you very much. I really want to thank all three of you, and Mr. Steyer, for your testimony. The clarity with which you presented the dynamic that now exists is overwhelming, and I think shared on both sides of the aisle. We've got a business model where amplifying conflict amplifies profit, and the two casualties of that business model are our democracy and our children. I'm going to lay out my thoughts, and I want your response. A democracy ultimately depends on trust and norms, and the algorithms that are promoting engagement are about conflict versus cooperation; they're about blame versus acceptance. And I see what is happening as a profoundly threatening development for the capacity of us as citizens to engage with one another and sort out the conflicts that are legitimate disputes among us. And secondly, there is the horrendous use of a business model that attacks the self-esteem of our kids, and I don't care whether those kids come from a family that supported Donald Trump or a family that supported Joe Biden; we all love our kids, and they all have the same challenges when they're trying to find their identity. A business model that essentially erodes those prospects and those efforts is a business model that we have to challenge. My view on this, having listened to you and also heard the proposals that I'm
supportive of from my colleagues, is that we need more than one-off legislation to address what is a constantly evolving situation. In the past, our government, in order to protect the public interest and the common good, has created agencies like the Interstate Commerce Commission and the Securities and Exchange Commission: an agency that is funded, that is staffed with experts, that has capacity for rulemaking, and that can engage in the investigation, just as an example, of algorithms. So my view is that Congress needs to establish a commission, which I'm calling the Digital Markets Act. It would set up an independent commission with five commissioners; it would have civil penalty authority; it would hire technology experts to oversee technology companies; it would test algorithms and other technology to ensure that any technology is free from bias and would not amplify potentially harmful content. The commission would be authorized to engage, on an ongoing basis, in the additional research that's needed for us to have oversight of the industry. So this is the approach I think Congress needs to take. It's not about free speech, by the way, because it's not about good ideas or bad ideas (you make a good point, Ms. Frederick), and it's not about good people versus bad people. It's about recognizing that, no, Mr. Zuckerberg, you're not in charge of the community. We have a forum, a democracy, that we have to defend, and children that we want to protect. So I'll ask you, Ms. Haugen: what's your view about that as an approach to address many of the problems that you've brought to our attention?

I think one of the core dynamics that has brought us to the place we're at today is that Facebook knows that no one can see what they're doing. They can claim they're doing whatever they're doing, and they've actively gaslit investigators, researchers, and academics for years when they identified real problems. We need somebody; it can be a
commission, it could be a regulator, but we need someone who has the authority to demand real data from Facebook, someone who has investigatory responsibility.

Mr. Robinson, thank you.

We demanded for years that Facebook conduct a civil rights audit, which they eventually committed to in front of the United States Senate. They went about doing it, and now we have found out all the places in which they lied and held back.

And what's your view about my suggestion of an independent commission?

Color of Change supports the creation of a data protection agency, and believes that it needs to have civil rights expertise involved.

And Mr. Steyer, if you're still there?

Yes, I'm here, Congressman Welch. As I've said, and as I think other witnesses and members have said, this deserves a comprehensive approach, so I think your proposal deserves very serious consideration, because we have to hold the tech companies accountable, period, full stop.

Okay, thank you very much. My time is up, and I'm sorry I didn't get to you, Ms. Frederick. Appreciate it.

I would have disagreed anyway, sir.

Pardon me?

I would have disagreed anyway, sir.

Okay, duly noted. Let's see; the chair is going to recognize Mr. Schrader for five minutes.

Thank you, Mr. Chairman, and I appreciate everyone being here at this hearing. We definitely have to figure out what to do, and you all have given us a lot of food for thought. Ms. Haugen, I want to give you a lot of credit for stepping up; it's very, very difficult to do. Occasionally some of us do that here in this body, and I share your pain, frankly, in having to do that. I am deeply disturbed that Facebook has breached its duty to act responsibly when it potentially has stood to benefit from the misery and suffering of a number of its users; it's totally inappropriate. It appears that Facebook knew that its products were causing harm to the American people, particularly the mental health of young people, as we've heard here today, and Facebook has not
responded. As I've listened to the testimony from you all, this should raise concerns for every member of our committee, and it appears to be that way, and indeed for each and every American: democracy, public safety, and the health of our families, and our children in particular, coming at the cost of profit for these companies. The power to connect all people around the world could be great, but it needs to be checked by democratic norms, human rights, and the rule of law. Our part is getting to the solution. At the end of the day, how do we avoid censorship, to Ms. Frederick's point, and at the same time allow people to communicate in an honest and open way that does not advantage one side or the other? So let me just hit a couple of points. Facebook, and companies like it, promise to police themselves; you've all talked about that. Ms. Haugen, in your opinion and firsthand experience, is it particularly naive of us, or even negligent of us, to expect Facebook and other entities to police themselves for our benefit?

I believe there are at least two criteria for self-governance. The first is that Facebook must tell the truth, which they have not demonstrated. They have not earned our trust that they would actually surface dangers to us when they encounter them, and they've actively denied it when they've been asked about specific allegations. The second criterion is, when they encounter conflicts of interest between the public good and their own interests, do they resolve them in a way that is aligned with the common good? And they don't. And so, in a world where they actively lie to us and they resolve conflicts on the side of their own profits and not the common good, we have to have some mechanism, and that might be a commission or a regulator, but someone has to be able to get truth out of these companies, because they're currently lying to us.

Thank you. Ms. Frederick, you've
hit the nail on the head when it comes to viewpoint censorship. That's in the eye of the beholder, to a large degree. So how do we deal with that? Based on your experience, your extensive research, and your firsthand history, what is a way to avoid viewpoint censorship but still get the clarity that you all have spoken to on the panel here?

I think, put simply, you anchor any sort of legislative reforms to the standard of the First Amendment, so what the Constitution says. Again, these rights are given to us by God; they were just enshrined, put on paper, for Americans in the Constitution. So you make sure that any sort of reforms flow from that standard anchored to the First Amendment.

Okay. That sounds like it's easier said than done, though, I'll be honest with you. Ms. Haugen, you said, I think in your testimony, that they know how to make it safer. So how should they make it safer? What, in your opinion, are some of the reforms that you would suggest? You've alluded to some already.

We've spent a lot of time today talking about censorship. One of the things that we forget is that when we focus on content, on the language, that doesn't translate; it requires that you build the solutions place by place by place, language by language, which doesn't protect the most vulnerable people in the world. That's places like what's happening in Ethiopia right now, a country with 95 different dialects. What we need to do is make the platform safer through product choices. That's things like this: imagine Alice writes something, Aubrey shares it, and Carol reshares it; that's friends of friends. Let's imagine that when it got to that point, when it got beyond friends of friends, you had to copy and paste it to share it. Have you been censored? I don't think so. It doesn't involve content, but that action alone reduces misinformation by the same amount as the third-party fact-checking program. We need solutions like that,
friction that makes the platform safe for everyone, even if you don't speak English. But Facebook doesn't do it, because it costs them little slivers of profit every time they do.

It sounds like it's going to be complicated to do this, because you're dealing with how to affect the algorithms in a more positive way, without bias, ostensibly. And I guess I'm out of time, so I yield back.

The gentleman yields back. The chair recognizes Mr. Cárdenas for five minutes.

Thank you, Mr. Chairman, and also Ranking Member Latta, for having this very important hearing. Again, this is not the first time that we're discussing this issue on behalf of the American people, who elected us to do our job, which is to make sure that they continue to enjoy their freedoms, yet at the same time that harm that can be prevented does not come to them. I'd like to first start off by submitting for the record a letter by the National Hispanic Media Coalition in support of H.R. 5596, the Justice Against Malicious Algorithms Act of 2021.
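[Editor's note: the reshare-friction mechanism Ms. Haugen describes above can be sketched in code. The following is a purely illustrative model, not Facebook's actual implementation; the function names, the two-hop "friends of friends" threshold, and the copy-paste rate are assumptions invented for illustration.]

```python
# Sketch of reshare-depth "friction": once a post has travelled beyond
# friends of friends (two hops from the author), the one-click reshare
# button is disabled, and anyone who still wants to spread the post must
# copy and paste it manually. All numbers below are hypothetical.

MAX_ONE_CLICK_HOPS = 2  # Alice -> Aubrey (friend) -> Carol (friend of friend)

def can_one_click_reshare(hops_from_author: int) -> bool:
    """Return True while the post is still within the one-click window."""
    return hops_from_author < MAX_ONE_CLICK_HOPS

def spread(reach_per_hop: int, hops: int, copy_paste_rate: float = 0.25) -> float:
    """Crude estimate of how many users a post reaches after `hops` hops.

    Inside the one-click window, every exposed user can reshare instantly;
    past it, only the fraction willing to copy and paste (copy_paste_rate)
    keeps passing the post along, which damps viral growth.
    """
    reached = 1.0
    for hop in range(hops):
        rate = 1.0 if can_one_click_reshare(hop) else copy_paste_rate
        reached *= reach_per_hop * rate
    return reached
```

With these made-up numbers, `spread(10, 2)` gives 100 exposures, while the third hop, now behind the copy-paste barrier, grows the audience far more slowly than unrestricted resharing would; the friction targets distribution mechanics rather than judging any particular piece of content.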
And also, I'd like to say to Ms. Haugen: the information you provided to the public about Facebook's internal deliberations, and how the company has dealt with, or chosen not to deal with, some of the more pressing issues it faces, has been illuminating to all of us, so thank you so much for your brave willingness to come out and speak the truth. One issue of critical importance to me is the Spanish-language disinformation that has flooded social media platforms, including Facebook but also other social media sites as well. One example is the level of the company's resources dedicated to Spanish-language misinformation. In May, Facebook executives told Congress, and I quote them: "We conduct Spanish-language content review 24 hours per day at multiple global sites. Spanish is one of the most common languages used on our platforms and is also one of the highest-resourced languages when it comes to content review," end quote. Yet in February of 2020, a product risk assessment indicated that, quote, "we're not good at detecting misinfo in Spanish or lots of other media types," end quote. And another internal report warned that Facebook had, quote, "no policies to protect against targeted suppression," end quote. Ms. Haugen, in your testimony you note that we should be concerned about how Facebook's products are used to influence vulnerable populations. Is it your belief that Facebook has blatantly lied to Congress and the American people?

Facebook is very good at giving you data that just sidesteps the question you asked. So it's probably true that Spanish is one of the most-resourced languages at Facebook, but when overwhelmingly the misinformation budget, 87 percent, goes to English, it doesn't matter if Spanish is one of your top-funded languages, because beyond that you're giving it just tiny slivers of resources. I live in a place that is predominantly Spanish-speaking, and this is a very personal issue for me. Facebook has never been transparent with any government around the
world on how many third-party fact-checkers speak each language, or how many third-party fact checks are written in each language or locality, and as a result, Spanish-language misinformation is nowhere near as well handled as it is for English.

Okay, so basically, does Facebook have the resources to do a better job of making sure that they police that, and actually help reduce the amount of disinformation and harm that comes to the people in this country?

Facebook is on track to make 45 billion dollars of profit in the coming year. Of course they have the resources to solve these problems more effectively than they do today. Facebook loves to come back and say, "we spent five billion dollars on safety last year," or "we're going to." The question is not how much they currently spend, but whether or not they spend an adequate amount. Currently they are not keeping Spanish speakers safe at the level they do for English speakers, and that's unacceptable.

And so they do have the resources to do better or to do more, and they have the knowledge and the ability and capability to do so; they choose not to?

Yes. They have the financial resources; they have the technology; they have chosen not to invest in Spanish. They've chosen not to allocate the moderators or pay for the journalists. They are not treating Spanish speakers equally with English speakers.

Okay, thank you. Mr. Steyer, given the repeated evidence that Facebook is unable to adequately moderate content with algorithmic and human reviewers, can Section 230 reform change the approach that Facebook and other tech platforms take to moderating content?

Yes, absolutely, and in fact a couple of the bills that have been referenced here, Congressman Cárdenas, would actually make major progress on that. In addition, as I said earlier, the privacy-by-design issues that next week's hearing will cover, and other measures related to reining in Facebook and the transparency of algorithms,
will all work to fundamentally change what's currently going on. And just to echo what Ms. Haugen said: they have the resources, they have the knowledge, but unless Congress holds them accountable on a bipartisan basis, it will not happen. So the ball is really in your court, on a bipartisan basis.

The gentleman's time has expired.

Thank you very much; my time has expired. I yield back.

Thank you. The chair recognizes Mr. Carter for five minutes.

Thank you, Mr. Chairman, and thank all of you for being here; we appreciate it. This is extremely important, as you can well imagine. I want to start with you, Ms. Frederick, and it's kind of just a general question. I think you all realize that we want to keep free speech. It doesn't matter whether Democrat or Republican; if you're an American, that's one of the greatest freedoms that we have, and we value that freedom. We all want to keep it, and it is important. But I want to ask you, Ms. Frederick: we also want to ensure that one's free speech is not subject to any kind of political bias, particularly from supposed fact-checkers who have a bias against conservative thought, or any thought, whether it be conservative or liberal; we just don't want that bias. This is not easy, what we're trying to do here; it's tough. We want free speech, but holy cow, something's got to give here. I just want to ask you: why do you think it's necessary for us to reform Section 230 and to pass laws to keep big tech accountable, rather than just relying on tech companies to self-regulate? I'll be quite honest with you: this is my seventh year here, my fourth term, and I've twice had the opportunity to have the CEOs of Facebook, of Twitter, and of Google before us on a panel, and I've tried to make it as clear to them as I can: I don't want to do this, and you don't want me to do this, so please clean it up yourselves so I don't have to do this, because you
don't want me to do this but it seems to go in one ear and out the other so tell me why do you think this is necessary i think you're correct i think thus far every tactic that has been tried is not working and the proof is in the pudding and we see what this self-regulation tactic has wrought toxic practices that inordinately harm young american women you look at tiktok the one thing that instances of people coming into hospitals developing actual tics have in common according to reporting from the wall street journal is that they all follow influencers on tiktok who have some sort of tourette's tic so those toxic practices those behaviors those social issues that they're exacerbating plus rampant censorship that you talked about right now as it stands it's a veritable race to the bottom and so again i'm going to ask you the same question give you the opportunity to respond as well why do you think it's necessary to reform section 230 because to date they're not responding i've tried it i've done it twice i've had the ceos before me twice and it just ain't working we got to do something we've talked over and over again today about the nature of censorship the thing that i think we need to figure out is something to change the incentives facebook knows lots and lots of solutions that aren't about picking good or bad ideas that's what we've been arguing a lot about today them picking out ideas they have ways of changing the product to make it safer but they have no incentive right now to make those trade-offs you know this thing like i talked about with the reshares that takes a little sliver of profit away from them and they keep not doing these things or not telling us about them because their only incentive is profit we need to do something to change the incentives that these companies face is that truly their only incentive profit i
think they do face you know liability they have a fiduciary duty to their shareholders a lot of the things we're talking about here are trade-offs between long-term harms and short-term profits i think generally good reform of facebook will make it more profitable 10 years from now because fewer people will quit but when you look on a short-term basis they're unwilling to trade off these slivers of profit for a safer product okay i'm going to get to one final question i'm running out of time here by profession i'm a pharmacist and i've dealt with drug addiction and with prescription drug addiction and all of you know that in 2020 drug overdose deaths increased by 55 percent and you know how accessible these drugs are over the internet and that is disturbing and you know that many of them are laced with fentanyl and you're familiar with this but my question and i'll direct it to you miss frederick one of the proposals that republicans have put forward to address the sale of illegal drugs on these platforms is to carve out section 230 liability protection for illegal drug trafficking on a platform yes or no do you think that would work theoretically it should and it points to a broader problem on the platforms drug cartels advertisements for coyotes trafficking people across the border illegally foreign islamic terrorism yes i think theoretically it should help okay thank you and i yield back as you've heard the task before us is very difficult as we try to pursue legislative fixes to section 230.
as chair of the tech accountability caucus i believe that amending section 230 must be done carefully to ensure we are limiting the unintended consequences and driving the changes we really hope to achieve the harms that were mentioned in the testimony today and the misinformation and disinformation on many platforms cannot persist if we are to continue having a healthy democracy promoting disordered eating body dysmorphia and self-harm is sending kids and teens already struggling with their mental health down a dark path that has been shown to worsen their mental health mr steyer why are parents not able to hold these platforms accountable for pushing this type of content and how can the reform of section 230 impact platforms' amplification of harmful content thank you very much for the question congresswoman i would just tell you first of all parents aren't able to hold the platforms accountable because there's no law in place that permits that which is why reforms of section 230 will go a long way to doing that you have to remove some of the immunity for example the issues that ms haugen has talked about in terms of hiding the body image research by instagram scientists i sat with the heads of facebook a decade ago and told them the very same messages about facebook instagram wasn't as popular then they know this but they have a legal immunity so unless that immunity is removed for harmful behavior like in the case of the body image issues that we've discussed they will act with impunity as all of the witnesses have said so i think it's extremely important that this body acts our prior congressperson mentioned the idea and the answer is clearly no parents across the country and we have well over 100 million on common sense media do not believe that they have the power to do that you do as congress so please reform section 230 along the lines of some of
the proposals that you have put forward and second look at a broader comprehensive approach that includes privacy by design and some of the antitrust and other important ways that will put power in the hands of parents where it belongs thank you very much for the question thank you and mr robinson first and foremost thank you for your leadership in your testimony you talk about the particular challenges communities of color face online with regards to content moderation how do we ensure that civil rights are not circumvented online and that platforms are not facilitating discrimination through moderation thank you for that question congresswoman we remove immunity the fact of the matter is that the technology of the future is dragging us into the past because platforms have been given these laws to believe that they are immune from a whole set of laws that people died for in this country to put on the books and now we are re-arguing and re-engaging around whether or not it is okay to discriminate against people in housing employment and data these are things that should have already been settled but now the technology of the future is dragging us into the past and the other thing that i think about in listening to you as chair of the tech accountability caucus we've called them out on their lack of diversity in their boardrooms you know c-suites whatever and i can't help but think that's part of the issue too so that they've left huge swaths of communities out is deeply troubling these are choices these platforms have made year over year we end up getting all sorts of commitments from diversity and inclusion officers at these companies saying they're going to do better we have asked them to disaggregate their data sometimes they'll say oh we're at two percent or three percent black and then we ask them to disaggregate and then we'll find out that the
numbers that they'll be including bus drivers and cafeteria workers who are fighting for a living wage inside of those numbers and so the fact of the matter is these companies are making choices every single day and they're giving lip service to diversity lip service to inclusion and then creating content creating all sorts of technologies that harm us all thank you so very much my time is up can i get a tiny sliver it's even worse when we talk about international representation from the global south facebook has built the internet for the global south for the majority of languages in the world facebook is the internet 80 to 90 percent of the content in their language and they have almost no representation from the global south thank you thanks who's next i'd like to recognize mr mullin now thank you madam chair i'm gonna try to be pretty quick here we've been talking about section 230 and the protection that these companies seem to hide behind and some of the abuse and i'm not trying to play politics here i'm just bringing up you know what has happened just in the last week underneath section 230 you know it's supposed to be the town square where you can post anything and no one's held responsible for it and in those parameters you know obviously you can't make a direct threat or a death threat at somebody and these platforms take it a little bit further to show extremist views but they're becoming political platforms and we know this and so this is kind of what i wanted to bring about miss frederick as you know google recently prohibited abortion pill reversal advertising that was supported by pro-life organizations then they turned around and allowed advertising to continue for medication-assisted abortion pills favoring pro-abortion groups over pro-life groups when we started talking about section 230 was this what it was designed for to limit
someone's ability to voice their opinion and allow somebody to say that it is or isn't i mean this is what this country does we have opposite views we air them out we talk about it but completely eliminating one person's view and just putting up stuff that you agree with that doesn't fall within section 230 does it miss frederick not at all and this is why fcc commissioner brendan carr says that section 230 right now amounts to a regulatory legal advantage for one set of political actors and we see the disparity between what big tech companies censor coming from the right and what they allow that cleaves to a leftist narrative that they approve of if you look at the hypocrisy it's rampant as we talked about just in the national security space iranian officials north korean officials ccp spokespeople the taliban all of these people are free to say what they want on these tech companies' platforms usually it's a vociferous or even an obstreperous right who says what is going on here this hypocrisy can't stand and then they maybe think about it they maybe say this is human error and redress those issues but that doesn't happen often and it doesn't happen unless we talk very seriously or at least flag these issues so this is not what section 230 was created for we need to realign it with congress's original intent but it's being abused right now i couldn't agree with you more on that so with that i yield back thank you okay who's next the chair recognizes miss craig for five minutes thank you so much mr chair both to you and to ranking member latta for holding this really really important hearing thank you so much for the witness testimony we've been talking about section 230 reform in various formats for many years and some of the folks who've been here for more than a decade have brought that up today and i'm glad we're finally diving into
some specific pieces of legislation whether they're perfect or not as mr steyer noted children and young adults are living much of their lives online in a world that is created by the various tech platforms that world is increasingly controlled by algorithms over which young people and their parents have absolutely no control this lack of control has a real-world impact on people and families in our communities one example is the role that as you've talked about today these platforms play in the sale of illegal drugs to young members of our communities you've talked about it a lot today but i just want to describe what i experienced a month ago back in october i joined community members in a small mississippi river town called hastings in my congressional district and we gathered to talk about the opioid and fentanyl crisis because we've had too many young people who we've lost in that community during that event i listened to the story of a woman a mother who's now become an advocate by the name of bridgette norring she lost her son devin in a tragic and accidental overdose after he bought a pill through a snapchat interaction devin thought the pill was a common painkiller that would help him with his debilitating migraines instead it was laced with fentanyl the questions that bridgette has really get right to the point for all of you how can we trust platforms to ensure the best outcomes for our society when too many young people like devin have been lost because of those algorithms that don't account for human safety and well-being how do we make smart long-lasting and constructive changes to these laws to ensure that online environments are a place where young people can learn and build communities safely not be pushed toward destructive or harmful content simply because it's the thing that is most likely to get the most clicks i believe that the answers lie somewhere in some of the bills that are before us today and i guess i'd just start with miss haugen for
my first question can you help us understand how facebook and the other tech companies you've worked for factor the impact on children and young adults into their decision making and does that real-world impact and the potential harm they cause shape their algorithm development at all at this point mr robinson mentioned the lack of diversity at these tech companies one of the groups that is never represented amongst tech company employees is children and it's important for us to also acknowledge that many of the people who found startups or who populate even large companies are people who are very young you know they're under the age of 30 and they almost always don't have children i think the role of children and acknowledging them as people and as people who have different needs is not present enough at tech companies and that means that often just as diversity is usually not designed in from the start acknowledgement of the needs of children is also usually not designed in from the start and as a result it doesn't get as much support as it needs thank you for that a follow-up maybe for mr steyer in your work at common sense you identified specific solutions to address the sale of illegal drugs on tech platforms how do you see that issue addressed or not addressed in these bills are there gaps that you think we also need to put more thought into very good question congresswoman craig so first of all i think most of the bills will remove liability protection for harmful behavior and that clearly would fall under that category so i think that several of the bills in front of you and a couple of the ones that have been mentioned by other members will actually address that i think your point is extremely well taken congresswoman because really the part of this that will move it forward and that will i believe get this committee to act in a way that will have an extraordinarily important impact for kids and families across the
country no matter what politics they have is the focus on children and you have the power to do that and if we reform section 230 and remove the liability protections around harmful behaviors like the drug sales you're talking about that will be an extraordinarily important move forward so i really urge you all to do this on a bipartisan basis the parents of america are counting on you thank you so much for that answer and you know i have four boys they range in age from 18 to 24 so it's too late to impact their lives but i have an eight-week-old grandson and it damn sure better not take us another decade to figure this out thank you mr chair and i yield back the gentlelady yields back the chair now recognizes miss fletcher thank you chairman doyle thanks to you and ranking member latta for organizing and holding this hearing today and thank you to all of the witnesses who are here today your testimony has been very useful for all of us listening to you and my colleagues today as we've addressed these issues over time so this isn't our first hearing but it is clear that the legislation we're talking about today the steps that we're taking to address 230 are important as is the broader approach that mr welch discussed and as mr schrader and miss kelly both said this is not at all easy to do because we're talking about how we balance a lot of interests here a lot of challenges we want to protect our children we want to protect the free exchange of ideas in our marketplace of ideas and that's really the foundation of a democratic society an exchange of ideas and debate and ultimately hopefully some consensus but what we have learned and are continuing to learn is that some of these addictive design features have not only the potential to sow division and extremism but the actual effect of doing so and as we've heard today what we saw in some of the wall street journal reporting and the facebook files is that facebook made some changes
to that algorithm that were meant to encourage people to interact more with friends and family through meaningful social interaction but they actually did something very different i'd like to direct my questions to ms haugen we know we've read and we've heard from your testimony that researchers within the company as well as online publishers who used facebook to drive traffic to their websites warned the company that divisive toxic and inflammatory content was being rewarded by the algorithm and pushed into more and more users' news feeds so ms haugen can you talk a little bit about how and why the algorithm had such a different and devastating result than was intended can you talk a little bit about that and then i have a follow-up after that if we have time mark zuckerberg said in 2018 that engagement-based ranking prioritizing content based on its ability to elicit a reaction from you was dangerous because people were drawn to engage with extreme content even when they asked them afterwards did you like that and they said no and he said but don't worry ai will save us ignoring the fact that the ais that they built were insufficient what happens is there's two sides to the problem one is that publishers see that if they make content that has more negative comments the more negative the comments on your content the more likely you get a click back to your site and the more likely a publisher makes money off of that interaction so there's an incentive for publishers to make more and more divisive and polarizing content the second side is that the algorithm gives more reach and distribution to content if it's more likely to elicit a reaction and so any thread that causes controversy versus one that brings reconciliation will get more distribution in the system this has been known in psychology for years that it's easier to elicit anger from someone than compassion and it's known inside the company but they don't change it
because the way the system is built today causes you to produce the most content because when it elicits that reaction from you a comment a like a reshare it encourages the other person to keep making content so this is not here for us to have more meaningful interactions it's so that we can be a tool for more content to be produced okay thank you so following up on that and i think you addressed it a little bit already in your response to mr carter and some of the discussions that we've had already today but can you talk a little bit about i mean it's one thing to have the stated goal is it possible for the platforms to change their algorithms or their other practices some of which you talked about earlier to promote healthy user engagement and reduce some of these negative outcomes and coupled with that can you just talk about the ways that you think congress can help make that happen facebook has lots of solutions that lead to less misinformation less polarization less divisiveness that don't require us picking and choosing which ideas are good i'll give you an example they have a picker that allows you to re-share not to one group but to many groups simultaneously they don't have to have that feature they have it because it makes the platform grow faster but that feature causes more misinformation and they know that because a small number of people are hyper-spreaders when we add friction to the system when we make people make intentional choices to spread information it happens that we get less violence we get less hate speech for free we don't have to pick and choose the individual things the question is how do we incentivize facebook to make these decisions because in order to make them they have to sacrifice little slivers of growth and the reality is we have to create incentives that counterbalance these profit motives if we want facebook to act in the common good okay well thank you very much for that testimony and i am
out of time so mr chairman i yield back the gentlelady yields back i think that is all the members of the subcommittee so now we're going to those members who have waived on and we will start with dr burgess i thank the chair for the recognition thank you all for your testimony and your ability to survive during a very lengthy congressional hearing and i appreciate your input and your attendance today ms frederick if i could just ask you on the issue of the fact that these platforms do use algorithms to filter content and to help identify posts or information that might violate their content moderation policies but the sheer volume of content that they have to evaluate can you give us some guidance as to how congress might incentivize fair and accurate enforcement of content moderation policies by the tech companies that have section 230 liability protection so i think there are a couple ways to do that as i said before use the first amendment as a standard to reform section 230 and then i think that companies should implement a user-friendly appeals process to provide prompt and meaningful recourse for users who think that they've been wrongfully targeted for their speech so basically give power back to the hands of the people and not the platforms themselves let them actually use the judicial system to address those issues and i really think we should examine discrepancies between what these companies say they do what they say they stand for these are us-based companies their terms of service their policies and their implementation if there is a discrepancy why not bring them up on breach of contract why not examine them as possible cases of fraud so you have to give the people some efficacy against these platforms because frankly they're not afraid they're not afraid of congress they're not afraid of you especially on the right side of the aisle they do not fear the use or the incentivization of any of these
mechanisms to cause them to fix what they've been doing wrong i have the impression that you're correct they don't fear this side of the dais so kind of what you're talking about there is a way to increase the transparency of the algorithms that are used on those platforms so is there a way to get to the transparency without jeopardizing the proprietary business nature of the information i think there's a difference between proprietary designs of algorithms and then reporting and details on how these algorithms affect users and impact users on the platform so that distinction should be made and when we're incentivizing algorithmic transparency i do think there has to be a public availability component and there has to be some sort of teeth again we have institutions that exist for a reason the ftc exists for a reason there are enforcement mechanisms that already exist we don't have to expand government power we don't have to weaponize it but we do need to give this some teeth well let me just ask you ms haugen do you think that transparency that insight exists within the company say a company like facebook are they aware that this occurs that there's not recourse for over-enforcement so the algorithms that are developed for content moderation are they aware of the effect that has on the end user they're very aware that people have a very strong emotional response when their content is moderated and they're very aware that the amount of content that has to be moderated is so high that they take i don't want to describe them as shortcuts but they make many optimizations that lead potentially to inaccurate enforcement sure it gets back to the sheer volume argument exactly well let me ask you something because when your testimony before the senate came out the wall street journal did their series of articles on facebook and i
heard an interview with dr sanjay gupta on cnn talking about teen suicide interesting comments that he had and then he went further and said it's far in excess in teenage girls in adolescent girls and apparently if you look at the studies that is the case and some of it does seem to be related to screen time and usage is this something that is known internally within the company facebook has done proactive investigations they have things called proactive incident responses so these are things where they hear a rumor and they go check for it they know that you can follow very neutral interests like healthy eating and just by clicking on the content provided be led to anorexia content that's what the algorithms do they lead to amplification they know that children sometimes self-soothe that as they get more depressed as they get more anxious they consume more and more and more content and when the content itself is the driving factor in the problem that leads to tragedy and facebook is aware of all those things you know one of the things that strikes me and i'm a physician in my former life is to be able to have that information available to caregivers so that they're aware of the clues or cues that should be sought you know we're all trained to ask about whether someone is depressed whether someone is worried about hurting themselves or someone else but here it seems so specific and it seems like the information that the company could make available to doctors nurses caregivers in general it seems like that should be something that is just done but i get the impression that it's not yeah i've been told by governmental officials in other countries that they have asked facebook things like how many children are overexposed to self-harm content and facebook says we don't track what content is self-harm content so we don't know i think facebook has some willful ignorance with regard to the harms
against children where they have intentionally not investigated or invested resources in understanding these problems because they are afraid that they would have to do something if they could concretely know what was going on the gentleman's time has expired mr chairman if i may it seems like we have an obligation to inform the provider community that this is important and this is something that should be actively sought in taking a history with a patient thank you i'll yield back the chair recognizes ms schakowsky for five minutes thank you mr chairman and thank you for allowing me to waive on to this really extraordinary hearing and i want to thank all of the panelists this has been so important and i would especially like to thank ms haugen for her testimony thank you for your courage and your strength in testifying today and really clarifying for the committee and for the public the incredible harms that can be created online so yesterday i introduced a bill called the ftc whistleblower act with my colleague representative trahan and this legislation would protect whistleblowers who provide information to the federal trade commission from retaliation for their brave and courageous activities in disclosing the kinds of things that we think need to be disclosed so here's my question for you can you explain to us why you brought your evidence to the securities and exchange commission and how that decision got made my lawyers have advised me that by disclosing to the sec i would receive federal whistleblower protections i think it's extremely important for us to expand those protections to private companies as well because if i had been at tiktok i would not have been eligible for those protections so in your view are whistleblowers who want to expose wrongdoing or help to defend consumers by reporting to the
federal trade commission protected under that law was the question whether they are protected under the ftc or whether they should be well i mean my question is that under current law what you did would not protect them oh yes if they were revealing something to the federal trade commission i think this is an issue that both the right and the left can get behind when the right worries about over-enforcement that's a thing that we should be able to know about if it's happening inside of companies or on the left if we want to have democratic control of these institutions no one knows what's going on at these platforms except the employees so we need to have whistleblower protections in more parts of the government and i strongly encourage having protections for former employees also because that clarity in the law is vitally important so then you do believe that establishing some sort of legal protection against retaliation you know whistleblower protections at the federal trade commission would be important but i hear you also saying that the fact that it is agency by agency that we don't have any kind of umbrella protection for consumer whistleblowers is a problem as you see it it's a huge huge problem we are living in a time when technology is accelerating governance has always lagged behind technology and as technology gets faster and faster and more opaque it becomes more and more important for us to have systemic protections for whistleblowers if we want to remain with the government in control of these things technology needs to live in democracy's house thank you really that was the only question that i had i just wanted to raise the issue that you needed to go there on the advice of your attorneys because that was a place that you would have protection but the fact that ordinary people who have legitimate claims who know things that need to be shared do not
have that protection right now i actually didn't know that it also did not apply to ex-federal employees and i think that they should be covered as well the point i want to really emphasize again is that concept of private versus public companies so if i had worked at a private company like tiktok i would not have received protections from the sec and a thing that is not necessarily obvious to people is that companies are going public later and later and later they're huge companies by the time they go public and so we need to have laws that protect whistleblowers across the federal government and they need to be at private and public companies so you're saying that only those corporations that have gone public right now would be included my understanding i'm not a lawyer my understanding is if i had been at a private company i would not have gotten sec protections because the whistleblower program at the sec only covers employees of public companies the gentlelady's time has expired thank you i yield back the chair now recognizes mr pence for five minutes thank you chairman doyle and ranking member latta for allowing me to join today and thank the witnesses for their testimony and answering the questions while i'm encouraged that this hearing represents a positive step towards reforming section 230 i hope we can create bipartisan bills for consideration republicans on this committee have put forth thoughtful reforms to section 230 that would greatly rein in big tech's unchecked authority to silence hard-working hoosiers and all americans i encourage my colleagues in the majority to continue to include proposals from this side of the aisle on issues affecting all of our constituents as i stated during our hearing earlier this year with big tech ceos these platforms have become reminiscent of all-encompassing monopolies whether it was standard oil or ma bell most of the country had no choice but to rely on their services likewise social media
platforms connect every aspect of our lives, from family photos to political opinions. Even representatives in Congress are all but required to have a Facebook and Twitter account to reach our constituents, which is very bothersome to a 65-year-old congressman. Big tech claims to understand the gravity of their influence, but their actions say otherwise. Twitter allows the Supreme Leader of Iran a megaphone to proclaim derogatory statements against Jewish culture and endorse violence against the U.S. and the Western world, which I called out in an earlier committee hearing. They continue to allow the Chinese Communist Party to peddle propaganda here at home. Google allegedly tried to use their own advertising monopoly to financially harm a conservative news outlet, The Federalist, as one of the witnesses today talked about, and other companies as well. When Jack Dorsey announced his departure from Twitter on Monday, he ended his message wishing they would be the most transparent company in the world. I hope this commitment reverberates across the entire industry. Hoosiers and all Americans should know exactly how these companies are profiting off the personal information of their users, how IP has been stolen by adversarial countries like China, and how social media platforms give megaphones to dictators and terrorists while manipulating the addictive qualities of posts, likes, and comments to hook our children into their services. We should have a better understanding of big tech's decisions to moderate content under their Section 230 shield. Ms. Haugen, I'm hoping you can comment on a suggested reform to Section 230 that I don't necessarily agree with or disagree with; I just want to get your thoughts on it. It's been suggested that a revised version of Section 230's treatment of a publisher or speaker would read, and I quote, no provider or user of an interactive computer service shall be treated as the publisher or speaker of any speech protected by the First Amendment wholly
provided by another information content provider, unless such provider or user intentionally encourages, solicits, or generates revenue from that speech. If this language was signed into law, how would it affect social media platforms' ability to monetize higher engagement from harmful rhetoric? I'm not a lawyer, so I don't necessarily understand the nuances. Is the difference between the current version and that version that if you profit from the content, then you're liable? I'm not sure what the current wording of the law is. If you're promoting it, you would no longer have protection if you were monetizing it. If you're promoting, yes. Monetizing it? Correct. I do not support removing 230 protections from individual pieces of content, because it is functionally impossible to do that and have products like we have today. In a place like Facebook, it's actually quite hard to say which piece of content led to monetization. If you look at a feed of 30 posts... But if they're shooting it out all over the place because it's negativity or anger or hatred... And let me ask, in the time remaining, Ms. Frederick, could you answer that real quick? I'm also not a lawyer, but I do like money, so that gives me a little bit of pause when we think about people's ability to monetize their livings on these platforms. Because part of the problem, we know, is that normal people who just want to have a business, and maybe have some skepticism about what public health officials say, when they question that dogma or that orthodoxy or that leftist narrative, they're suspended or banned from the platform. So I want to protect the individual and individual rights more than anything. The gentleman's time has expired. Thank you. The chair now recognizes Ms. Castor for five minutes. Well, thank you, Chairman Doyle, for calling this very important hearing, and
thank you to our witnesses. And Ms. Haugen, you're courageous, and I think we all owe you a debt of gratitude for blowing the whistle on Facebook's harmful corporate operation, the harmful design of their platform. They know the damage they're causing, and yet they look the other way and fatten their wallets at the same time. And Mr. Steyer, thank you for your years of commitment to keeping our children safe online, and thank you for your advice as we drafted the Kids PRIVACY Act, the update to COPPA. Hopefully we'll get to privacy as we move design reform and Section 230 reform along as well. And Mr. Robinson, thank you. Let's get into Section 230 a little bit. You say we should not nullify consumer safety or civil rights laws; we shouldn't encourage illegal, harmful behavior. We don't allow this to happen in the real world; we shouldn't allow it to happen in the online world. Section 230, and remember, this was adopted in 1996, a world away from where we are now online, but the courts have interpreted Section 230 as almost complete immunity from liability for what happens on their platforms, no matter how illegal or harmful. It's so flagrantly bad that judges now are asking the Congress to please weigh in and reform Section 230.
So that's why I filed the SAFE TECH Act with Congressman McEachin, who was on earlier. The SAFE TECH Act would remove the Section 230 liability shield for violations of civil rights laws, antitrust laws, stalking, harassment, and intimidation laws, international human rights laws, and wrongful death actions. Some of the other bills on the agenda today focus on algorithmic amplification or targeting that leads to certain harms. Do we need to blend these approaches, or would you highlight one over the other? I'll start with you, Mr. Robinson. I think we need multiple approaches, and I think we need to start by removing all the immunity that these companies have when it comes to violating existing law, both in terms of amplification and in terms of what they allow on their platform. The fact of the matter is that this has to go hand in hand with transparency, because what we end up with is these companies determining when they let us know, or when we get to know. We just got a whole new set of documents through the Washington Post that let us know they had done all sorts of internal research. And I know there's been a lot of conversation here today about this idea of conservative bias, but in fact black people were much more likely to have their content pulled down than white people on the platform for similar levels of violations, time and time again. This was Facebook's own internal research. They got the research, then they squashed that research. So we end up with these conversations about this idea of conservative bias when their own research tells them something different, and then they refuse to do anything about it. So you'd take that blended approach. Ms. Haugen, what's your view? I agree that we need multiple approaches. Just removing immunity will not be sufficient. We need to have ways of being able to get information out of these companies,
because one of the things that is lacking for Facebook, and not lacking for any similarly powerful industry, is that they've hid the data, they've hid the knowledge. You can't get a master's degree in the things that drive Facebook right now, or any of the other social media companies; you have to learn them inside the company. We lack the public muscle to approach these problems and develop our own solutions, and until we have something more systematic, we will not be able to hold these companies accountable. Mr. Steyer... Mr. Steyer had to leave early; I'm sorry, I should have made that announcement. We thank him for being on the panel, but he's not with us anymore. Ms. Frederick, do you want to weigh in on design and algorithmic amplification in Section 230 reform? What's your view? So, Section 230 reform generally, I think, again, starts with that First Amendment standard. Then you allow people to have recourse in courts, and then you make sure that companies report their content moderation methodology, their practices, to some sort of mechanism like the FTC, with that public availability component, and then you add algorithmic transparency into that as well. So it's the public availability component that I think helps give people power back when it comes to standing up against these companies and their concentrations of power. Thank you very much. My time has expired. Let's see, Mr. Crenshaw, you're recognized for five minutes. Thank you, Mr. Chairman, and thank you, everyone, for being here. Ms. Haugen, I'd like to start with you, please. You were a lead product manager at the civic misinformation department at Facebook, or civic integrity as it's sometimes called. I want you to help us understand what standards are used to decide what is misinformation and what isn't, and I know that could be an hour-long answer, so if you could do a short one. So, just for clarification, people have sometimes said that my team took down, I think, the
Hunter Biden story. There are two teams at Facebook, or more than two teams, that deal with misinformation. The main misinformation team, which was under community integrity, uses third-party fact checkers, who are independent journalists. They are allowed to make any choice they want within the queue of stories, and then they write their own journalism, and that is how things are decided to be true or false. My team worked on... Do you see any problem with outsourcing the fact checking to people who really don't check facts but instead check opinions? I mean, I'm a victim of that many times by these so-called journalists who are so-called fact checkers. Is there any concern about that at Facebook? It is a very complicated and nuanced issue. I did not work on the third-party fact-checking program, though, so I am not aware of it. But that's one standard. Are there any other principles we might point to that would lead us to understanding what the standard is and what misinformation is? Facebook's policy is very clear: they are not the arbiters of truth. So I think there is an open opportunity for public discussion on how third-party fact checks should be conducted, but that is outside the scope of the things I worked on. Okay. Mr. Robinson, in your testimony you say that we must take racism head on and finally eliminate the racially ignorant, exploitative, and harmful components of big tech, and that we would do so by supporting legislation that removes liability if they do not remove content that causes irreparable harm. Now, in principle I already have objections to that, just because it's too vague, but that's not actually what I want you to address. I want you to address whether it would really be applied neutrally across the board, that general principle of irreparable harm. Well, I don't know if we can absolutely get to neutrality, but we don't get to consequences when companies have blanket immunity, and right now
these companies have blanket immunity, and so as a result we don't allow regulators, enforcers, judges, and juries to be able to... I'm more asking about the intent of your proposals, as opposed to my intent. The intent of our proposals is to stop allowing Silicon Valley companies to skirt civil rights, and to stop allowing them to decide when and where civil rights are enforced. Right. I mean, on the one hand, I am sympathetic to it, because I hate racism, and we recently had six people die in Wisconsin, possibly because of racism, because of posts that were on Facebook: 2015 racist posts, violent posts, 2020 again, and now six people are dead. Would these proposals address that as well? The proposals would remove the profit and growth incentive over safety, integrity, and security, and so it places a set of consequences on these platforms and gets us to a place where there are actually consequences. Right now there are not consequences. They can come here and lie to you about transparency; they can come here and lie to you about what they're doing to keep these platforms safe, and they have, and you all have no recourse, and this has been happening for years. I understand. Thank you, Mr. Robinson, I appreciate your answers. And I just want to say a few things. One of the concerns we have is that it seems the advocates of censorship, or content management, or whatever we want to call it, tend to want to censor in only one direction. They don't want to be neutral in their application of community standards. Second, this brings to light a fundamental question: whose fault is it that human beings are horrible to one another? Whose fault is it that a bad person spreads lies or hate? Is it the medium of communication, or is it the person spreading it? This is a very fundamental question, because free speech is very messy. Our founders knew that when they wrote the First Amendment. It can result in all sorts of chaos and pain and hurt feelings, because the human race
is indeed what it is. But let's be clear: that's a heck of a lot better than the alternative, this independent oversight committee being discussed, with an elite, unaccountable few regulating what we see and what we don't. I don't want us to go down that path. And I want to be clear about something else: Republicans and Democrats do not agree on this issue. I've observed a clever strategy by the media and some of my colleagues, implying that we all agree, that we're all moving in the right direction towards the same thing, that we're all mad at big tech. This is not really true. We have very different views of the problem, and as the ranking member pointed out, one of the bills being considered today puts companies on the hook for any content that causes severe emotional injury, which remains undefined and open to interpretation. It's fundamentally un-American that your hurt feelings should dictate my free speech, and I think the Democratic Party wants to censor based on vague interpretations of harmful speech and misinformation, which invariably means things they just disagree with. They can't legally infringe on the First Amendment, so they bully big tech into doing it for them. Don't go down this path. Thank you, I yield back. The chair now recognizes Ms. Trahan for five minutes. Thank you, Mr. Chairman, and thank you to all our witnesses. Ms. Haugen, let me echo what all my colleagues have said: thanks for your bravery in bringing to light so many important issues. I worked in tech, and I can't imagine that this has been easy for you. The papers you provided have shown that when executives at Facebook and companies like it make decisions about content moderation processes and algorithmic design, the harms caused to users are real, and in many cases devastating. It's especially true for our young users already on services like Instagram, and it's true for young girls like my seven- and eleven-year-old daughters, who Facebook's internal plans identified as the
company's next growth frontier. The fact that these companies view our children as expendable in their pursuit of profitability shows just how flawed the status quo is. Yet while these companies run ads pleading for updated internet regulations, everyone on this panel is aware that the goal of their multi-million-dollar lobbying efforts is the exact opposite. I recognize that bipartisanship can seem to be in short supply these days, like my colleague Mr. Crenshaw pointed out, but if protecting our children cannot garner the support of Republicans and Democrats alike, I truly fear for our future. There are a number of pieces of legislation, either introduced already or currently in the works, that all of us should be able to get behind, especially when it comes to requiring transparency. To that end, I am the author of the Social Media DATA Act, which would direct the FTC to issue guidance on how internal research, much like the research published in the Facebook Papers, along with a range of other internal company data, can be shared with academics in a way that protects privacy. That way we can be informed by independent analysis of the full extent of harm that users like our children face when they open an app like Instagram. So, in your experience, Ms. Haugen, what types of internal studies are already regularly performed? Do platforms mostly perform surveys and interviews, like we saw in the Facebook Papers, or do they employ other forms of study as well? I want to encourage you, when you talk about having data, to encourage that in cases of aggregate data, so it's not individually identifiable, it can be made public. Because for other companies like Twitter, they have a firehose that's one-tenth of all the tweets, and there are probably ten thousand researchers in the world who hold Twitter accountable. So if you just send it to academics, you won't reach independent consultants like myself, and you'll miss out on a huge opportunity. The second thing is, what kinds of resources
exist internally? You have presentations, you have large quantitative studies, and these might be based on user data or they might be literally surveys sent out to 50,000 people, and they do small group studies as well. Terrific, I appreciate that, and so many of your comments have actually made some of our existing bills stronger. Similarly, I'm working on legislation right now that would create a new bureau at the FTC focused on platform oversight and include an office of independent research facilitation. Researchers have several methods for proving causation, but the quote-unquote gold standard is randomized controlled trials, which is well understood for product safety across multiple industries. At Facebook, were you aware of whether internal researchers were doing randomized controlled trials, and if so, when in the product life cycle was that most likely to happen? Randomized trials happen all the time; they're usually called A/B trials. For example, in the case of removing likes off of Instagram, they ran a real A/B trial where they randomly chose a number of users and removed the likes, and then surveyed them afterwards and asked, you know, did this decrease social comparison, or did this decrease a variety of mental health harms? So they have the infrastructure to run those trials; they just haven't run them on as many things as the public would want to know about. So what do you think is the likelihood, in the future, of platforms regularly collaborating with independent researchers, using institutional review boards and ethical best practices, to design and run controlled trials? Unless you legally mandate it, you will not get those. You just won't get them. Researchers have begged and begged and begged for very basic data. For example, a couple months ago, after begging for years for a very small amount of data on the most popular links on Facebook, researchers accidentally caught that Facebook had
pulled different data and given it to them, which invalidated the PhDs of probably countless students. So we need legally mandated ways to get data out of these companies, which becomes very important when these companies talk about creating things like Instagram for kids. So, I appreciate that. I don't know how much time I'm going to get for this next line of questioning; if I run out, I will submit my questions for the record, because I'm so interested in your responses. But Mr. Robinson, you were one of the leaders of the Aspen Institute's Commission on Information Disorder, which recently issued a report that included suggestions for policymakers. One suggestion was that Congress require that platforms provide high-reach content disclosures, or lists of popular content, and my office is currently working on text to do just that, and we would love to connect with you. But for now, can you just explain why this type of disclosure is important, and how it complements several of the proposals we're discussing today, which aim to limit Section 230 immunity when recommendation algorithms are involved? The Aspen Institute commission's proposals should be taken together, because we can't actually get to policy recommendations or new policies if we don't have more transparency, and this gets to transparency around how these algorithms are functioning, how they are moving content, and getting much more clear about all those things. So that is one of the pieces of transparency that I think is really clear and essential to getting towards the next steps. The gentlelady's time has expired. Thank you, sir. Okay, last but certainly not least, the gentleman from Pennsylvania, Mr. Joyce, you have five minutes. Thank you, Chairman Doyle and Ranking Member Latta, for holding this important hearing on holding big tech accountable. In light of what's happened over the past year, it's abundantly clear that this body needs to act on reforming Section 230 and reining in big tech.
Recent reports have shown how far social media companies will go in order to maximize profit at the expense of consumers' well-being. It is disturbing to see this callous and harmful behavior from some of our largest companies, and personally it worries me that it took a whistleblower coming forward for us to learn about the harmful effects these products potentially, and often do, have. To take on the unchecked power of big tech and Silicon Valley, my colleagues and I have proposed a comprehensive package that will hold big tech accountable and work to protect consumers and, most importantly, our children. I implore the majority to take up these crucial pieces of legislation, and to do it now. Ms. Frederick, conservatives, especially in my district, feel as though their voices are being silenced by content regulators in Silicon Valley. How can we broadly ensure that this doesn't happen? So, what really hasn't been talked about much here is the fact that it's not even just about individual users, or individual accounts, or individual pieces of content. We are talking about market dominance that translates to Americans' ability to access information. You look at something like Amazon Web Services: Google, Apple, they took down Parler, okay, whatever, you could still get it on the desktop, and people weren't extremely fussed about that. But then, within 24 hours, when Amazon Web Services pulled the plug on Parler entirely at the cloud hosting infrastructure level, a whole litany of conservative users were silenced, lights out, at the snap of a finger. Insane. So in my mind, we absolutely need to use that First Amendment standard when it comes to the content moderation issue. We need to make sure we increase transparency, like we talked about. Let's have some legislative teeth here. Let's incentivize those quarterly, or even biannual, reports where these companies report on what they're actually doing, their content moderation decisions and the inconsistent, not even-handed
application of them. And then, frankly, remove liability protections when these companies censor based on political views; again, strip that immunity when it's abused. And then, finally, I think there are reforms that exist outside of Section 230. Civil society, grassroots: we need to get invigorated about this. Let's use that anti-critical-race-theory model to gin up the population when these abuses harm our children, which they are, which has been proven. So civil society is huge, and states can wield power here as well. I think a lot of good ideas have been put forward in those labs of democracy, and we should amplify those ideas and promote them as conservatives as well. And I agree with you that First Amendment rights must be amplified and must be maintained. Additionally, we see the harmful impact social media is having on children, and you recognize this as a significant concern of mine and my colleagues: the potential lasting psychological impacts that come with endless content so readily accessible to so many users. Ms. Frederick, can you talk about the information that was exposed, and how you feel we as members of Congress must be able to further utilize it? So, I wasn't the one who exposed any of this information; I just read it in the paper, like most people. However, what I learned from working at this company was that they are concerned about growth at all costs, which translates to bottom line at all costs, which translates to PR problems and brand and reputation concerns. So they should focus on the brand and reputation concern, and recognize that these children, when they have these devices in their hands, do not yet have fully formed consciences to deal with the effects that device is emitting. So I think people need to rethink the way these devices impact children. We need to rethink whether or not children can even have these devices. As was mentioned earlier, famously, tech oligarchs
don't give their kids these devices. There's a reason for that, and that should be all you need to know. Ms. Haugen, can you, as the individual who did this, comment on how we can move forward in protecting our children from the effects she's described? Yes, exactly what was just described by Ms. Frederick. We have huge opportunities to protect our children in more effective ways. We need more transparency on children who are exposed to these harms. We need to know what Facebook is actually doing to protect kids. The efforts they've made so far, like the help center that comes up occasionally, they've promoted as if they were a huge intervention, but only hundreds of kids see it per day. So we need transparency, we need something like a parent board that can weigh in on these decisions, and we need independent academic researchers to have enough access that we can know what the effects are on our kids. Until we have those things, we're not going to be able to protect children adequately. The gentleman's time has expired. Thank you, Mr. Chair. So this concludes the witness testimony and questions for our first panel. I want to thank all of our witnesses. Ms. Haugen, when Congress finally acts, and I won't say if Congress finally acts, I'll say when, you will be chiefly responsible for whatever happens here, through the brave step you took to come forward, open up the door, and shine a light on what was really happening. So I thank you for being here. Mr. Robinson, Ms. Frederick, Mr. Steyer, all of you, thank you so much. Your testimony and your answers to our questions have been very helpful. We are committed to working in a bipartisan fashion to get some legislation done. So with that, I will dismiss you with our thanks and gratitude, and we're going to bring the second panel in. Thank you. Thank you for having us. [Music] See you soon.
Info
Channel: Forbes Breaking News
Views: 1,237,538
Keywords: Frances Haugen, Facebook, House
Id: xN3MvmUhSIk
Length: 238min 51sec (14331 seconds)
Published: Sun Dec 05 2021