GSA launches public campaign to battle bots, fake comments from online rulemaking forums

The General Services Administration kicked off a public effort this week to modernize the federal e-rulemaking process, with a special emphasis on how agencies should respond to an ever-growing number of fake and mass comment campaigns.

But based on the initial possibilities that agencies, research institutions and other interested stakeholders described at the first of two public meetings on the e-rulemaking process, the debate that lies ahead is multi-faceted and complex.

Take the public comments the Federal Communications Commission received on net neutrality regulations in 2017. FCC received nearly 22 million comments, and just more than half of them were identified as being from stolen identities, according to a Pew Research Center study.

Just 6% of the 22 million comments were unique, while the remaining 94% of comments were submitted multiple times. Experts say a combination of mass comment campaigns and bots likely contributed to the vast majority of these submissions on net neutrality, which agencies and other stakeholders say is troubling.
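
Spotting that only a small share of submissions is unique is, at bottom, a de-duplication exercise. The sketch below shows one illustrative way to group near-identical comments by normalizing the text and hashing it; the normalization rules and function names are assumptions for the example, not the methodology Pew or the FCC actually used.

```python
import hashlib
import re
from collections import Counter

def normalize(comment: str) -> str:
    """Lowercase, strip punctuation and collapse whitespace so that
    trivially edited copies of the same form letter hash identically."""
    text = comment.lower()
    text = re.sub(r"[^a-z0-9\s]", "", text)
    return re.sub(r"\s+", " ", text).strip()

def duplicate_report(comments: list[str]) -> tuple[int, int]:
    """Return (unique_count, duplicate_count) for a batch of comments."""
    counts = Counter(hashlib.sha256(normalize(c).encode()).hexdigest()
                     for c in comments)
    unique = sum(1 for n in counts.values() if n == 1)
    return unique, len(comments) - unique

# Example: two copies of the same form letter and one original comment.
batch = [
    "Please protect net neutrality!",
    "please protect net neutrality",
    "I support the proposed rule because of its effect on small ISPs.",
]
print(duplicate_report(batch))  # (1, 2)
```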

Under the Administrative Procedure Act, agencies are supposed to go through a public notice and comment period to solicit feedback and relevant information and data to inform their policy and rulemaking. The E-Government Act of 2002 established an e-rulemaking program, which opened up online channels for public comment on Regulations.gov and other sites.

But — as the FCC example shows — these sites are often vulnerable to the kinds of phishing emails, social media scams and toxic comment boards that can plague the rest of the internet.

“The perception problem is huge,” Michael Fitzpatrick, head of global regulatory affairs for Google, said Thursday at GSA’s first public meeting on the topic. “We live in a time where there is decreasing trust in all of our institutions, not the least of which are government institutions. The last thing we need is a common view that essentially the entire rulemaking process is being gamed by a variety of machines and shadowy players. In some instances, that’s true at a substantial scale.”

Fitzpatrick previously served as the Office of Information and Regulatory Affairs associate administrator during the Obama administration.

At this point, it’s unclear exactly how many comments on federal regulations are fake, said Tobias Schroeder, director of GSA’s eRulemaking Program.

Determining that proportion will be part of GSA’s “rigorous” and “phased” process to learn more about agencies’ challenges with the existing e-rulemaking procedures, he added.

Agencies currently can choose whether they accept comments anonymously on Regulations.gov or whether they require the public to enter a name in a designated field. But agencies don’t take the extra step to verify that commenters are who they say they are.

“You could say that you’re Mickey Mouse and not who you actually say that you are,” Schroeder said.

This scenario was a reality for the FCC when it sorted through the submissions on net neutrality regulations. Commenters identified themselves as high-profile politicians and even Elvis Presley, said Sanith Wijesinghe, an information systems engineer with MITRE.

The problem, the experts said, will only get worse.

“Is it a real good use of taxpayer money to process comments that probably originate in foreign countries? Why are we spending this time accommodating that?” Wijesinghe said. “Given that we now have the Evidence-Based Policy Act, the overhead associated with tracking down all of the assertions made in these comments is a non-trivial effort, and we really need to make sure our policies are truly evidence-based and not fake-evidence-based.”

Agencies shouldn’t completely discount bot-generated comments or fraudulent submissions just because they’re fake, said Reeve Bull, research director for the Administrative Conference of the United States.

“If the comment, even if it is a fraudulent comment, has factual information in it that’s relevant and that information is verifiable, then in theory the agency should take it into account, even if the person filing the comment is not the same person who signs his name on the bottom of it,” he said.

Still, Bull acknowledged the risks associated with evaluating and accepting fraudulent comments, especially if they come from an impostor pretending to be a real person or advocacy organization that could, in theory, reasonably submit a comment.

He urged agencies to consider a combination of other methods, such as an early public comment period, surveys and listening sessions, to gather relevant feedback on a potential policy.

Emerging technology could also help GSA and other agencies detect and prevent the spread of bot-generated or fake comments, Fitzpatrick said.

Tools like reCAPTCHA, which prompts a would-be commenter to identify and click on a named item within a series of images, may help agencies weed out the bots from the real human commenters — at least for now.
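
For context, tools like reCAPTCHA work by having the browser complete a challenge and then having the site verify the resulting token on the server before accepting the submission. The snippet below is a minimal sketch of that verification step against Google's public reCAPTCHA siteverify endpoint; the secret key, token and handler shown are placeholders, and a production system would add error handling.

```python
import requests  # third-party HTTP library

RECAPTCHA_VERIFY_URL = "https://www.google.com/recaptcha/api/siteverify"

def token_is_human(secret_key: str, client_token: str) -> bool:
    """Ask the reCAPTCHA service whether the token submitted with a
    comment form came from a completed challenge."""
    resp = requests.post(
        RECAPTCHA_VERIFY_URL,
        data={"secret": secret_key, "response": client_token},
        timeout=10,
    )
    return resp.json().get("success", False)

# Hypothetical usage inside a comment-submission handler:
# if not token_is_human(RECAPTCHA_SECRET, form["g-recaptcha-response"]):
#     reject_submission("Failed CAPTCHA check")
```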

Other tools could track click speeds and patterns to help agencies determine whether the submission came from a human being or a bot.
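
As a rough illustration of that kind of behavioral check, the sketch below flags a submission when the form was completed implausibly fast or when one source address submits at machine-like rates. The field names and thresholds are invented for the example and are not drawn from any agency system.

```python
from dataclasses import dataclass

@dataclass
class Submission:
    source_ip: str
    seconds_on_form: float      # time between page load and submit
    recent_from_same_ip: int    # submissions from this IP in the last minute

def looks_automated(sub: Submission,
                    min_seconds: float = 5.0,
                    max_per_minute: int = 10) -> bool:
    """Simple timing heuristic: humans rarely read a proposed rule and write
    a comment in a few seconds, or submit dozens of comments per minute
    from one address."""
    too_fast = sub.seconds_on_form < min_seconds
    too_frequent = sub.recent_from_same_ip > max_per_minute
    return too_fast or too_frequent

print(looks_automated(Submission("203.0.113.7", 1.2, 45)))    # True
print(looks_automated(Submission("198.51.100.4", 180.0, 1)))  # False
```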

“You could also segregate any red-flagged submissions and put them in a separate category to be dealt with later,” Fitzpatrick said. “That’s a decision that could be made by the agency. The machine learning tool will continue to learn over time and perfect its precision so that what you end up doing at the end of the day is reducing false positives.”
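
A hedged sketch of the workflow Fitzpatrick describes: a text classifier scores each incoming comment, anything above a configurable threshold is routed to a separate review queue rather than discarded, and raising that threshold trades some recall for fewer false positives. The training examples, labels and threshold below are placeholders, not real rulemaking data.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Placeholder training data: 1 = suspected automated/fraudulent, 0 = genuine.
train_comments = [
    "I strongly oppose this rule it harms freedom",   # form-letter style
    "I strongly oppose this rule it harms freedom!",
    "As a rural ISP customer, the proposal affects my service in three ways...",
    "The draft ignores the attached usage data from our co-op's members.",
]
train_labels = [1, 1, 0, 0]

vectorizer = TfidfVectorizer()
model = LogisticRegression().fit(vectorizer.fit_transform(train_comments),
                                 train_labels)

def triage(comment: str, threshold: float = 0.8) -> str:
    """Route high-scoring comments to a separate review queue instead of
    deleting them; a higher threshold means fewer false positives."""
    score = model.predict_proba(vectorizer.transform([comment]))[0][1]
    return "flagged-for-review" if score >= threshold else "standard-queue"

print(triage("I strongly oppose this rule it harms freedom"))
```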

Still, Fitzpatrick warned of the risks associated with deploying too many tools that make it too difficult for the public to share relevant information.

“This is the great balance for agencies and for the rulemaking process,” Fitzpatrick said. “They want to protect and we want to protect against bad actors, but we don’t want to add a level of friction that deters democratic participation in the process, particularly participation by people with fewer resources and less sophistication.”
