Special Reports

How AI is poised to impact government – and vice versa

The future is now – and the public and private sectors are both working together and at cross-purposes to meet the moment


Government officials are scrambling to draft legislation and regulations to rein in the risks, balanced against AI’s transformative potential to make work more efficient and improve our way of life. / Photo credit: kasezo/Getty Images

Since OpenAI launched its chatbot in November 2022, artificial intelligence has seemed to shift from a far-fetched sci-fi premise – think HAL 9000 in “2001: A Space Odyssey” or the childlike android in “A.I. Artificial Intelligence” – to an inescapable, everyday reality. ChatGPT and other AI-powered tools have sprouted up overnight, spreading to search engines, social media platforms, virtual assistants and beyond.

The advent of the AI age holds plenty of promise, even as it threatens to disrupt entire professions, undermine and exploit the creative work of artists and writers, and unleash a flood of deceptive audio, video and imagery. Government officials are scrambling to draft legislation and regulations to rein in the risks, balanced against AI’s transformative potential to make work more efficient and improve our way of life.

Of course, some AI tools have been in use for years – and not just by tech startups, but just as often in the public sector. State and local governments have already been employing AI to complete a range of tasks more efficiently and effectively: sifting through vast amounts of data, automating bureaucratic processes, and identifying and responding more quickly to public safety threats.

Now, as the underlying technology continues to improve and its applications multiply, the question isn’t just how exactly governments will regulate artificial intelligence – it’s also how exactly governments will deploy it.


Governmental bodies use AI to help prevent cyberattacks and detect public health threats. AI is deployed by law enforcement to identify potential offenders and by local and state agencies as a screening and predictive tool – although there’s sharp debate about whether such efforts do more harm than good. And even as they work out their kinks, chatbots are being used to serve constituents, companies and other stakeholders that interact directly with government offices.

The domain where government use of artificial intelligence has perhaps generated the most alarm is in law enforcement, amid fears of ever-expanding surveillance capabilities. The maker of the New York City Police Department’s “Digidog” robots touts its AI capabilities, and the Metropolitan Transportation Authority recently began using AI-powered surveillance to track fare evasion in the city’s subways. Florida recently contracted with a California tech company to transcribe inmates’ phone conversations in the state’s prison system.

New York City Police Department’s “Digidog” will assist the NYPD Technical Assistance and Response Unit. / Photo credit: Lev Radin/Pacific Press/LightRocket via Getty Images

Police use of facial recognition has also prompted an outcry in many jurisdictions. In Pittsburgh, police used facial recognition technology during the 2020 Black Lives Matter protests without the knowledge or permission of police command staff, PublicSource reported in 2021. The episode was controversial for a variety of reasons. In May 2020, the Pittsburgh City Council voted to regulate the use of facial recognition and predictive policing technologies by city entities, mandating that council members sign off on any use of facial recognition technologies. Critics also argued that the software violated people’s privacy because it builds its database from images collected without subjects’ consent, and that it can surface photos of people that they never posted online themselves.

Today, facial recognition technology remains restricted in Pittsburgh. In July 2022, Carnegie Mellon University circulated a draft video surveillance policy that would have allowed the university to use facial recognition technology during investigations, but it spurred a backlash from students and alumni alike, who argued it would invade students’ privacy and normalize surveillance. Ultimately, the university nixed the policy in response to community concerns and said that the Carnegie Mellon University Police Department had never used facial recognition technology in the past.

“Based on feedback received from the community to the first draft of a video security policy that would have allowed for the potential use of facial recognition tools in criminal investigations, we have decided not to move forward with further consideration of this policy document,” the school said in a statement at the time.

One application that remains accessible to law enforcement throughout Pennsylvania is JNET, a database with facial recognition capabilities. In Allegheny County, the Office of the District Attorney announced in August 2022 that it would use NICE Justice, an AI-powered tool that facilitates the management, sharing and use of digital evidence, with photos, videos, PDFs and documents uploaded to a single portal.

According to the district attorney’s office, its caseload of 35,000 cases a year is beyond the capacity of the 127 attorneys it has on staff, and the software can improve efficiency and reduce delays. “We have a sizable number of cases that were postponed due to delays in processing digital evidence,” Rebecca D. Spangler, first assistant district attorney for the Allegheny County District Attorney’s Office, said in a statement. “With NICE Justice, we’ll be able to streamline the entire process of managing digital evidence, from intake to discovery. When we’re able to eliminate postponements by making the system more efficient as a whole, that’s good for everyone.”

NICE General Manager for Public Safety John Rennie said one consequence of the coronavirus pandemic was heightened demand for expediency. “COVID helped a lot of people realize that doing things slowly in an office was also not always the best way of doing things,” Rennie said. “The more technology-oriented DAs and the more progressive DAs in particular very much recognize that their staff needs these tools to be able to self-serve instead of having to send off to other people or do in different programs.”


Meanwhile, elected officials have been advancing measures to regulate artificial intelligence. New York City’s AI bias law, which requires employers using AI tools for hiring to perform annual audits of their technology, went into effect this summer, and the Adams administration’s new AI action plan envisions the creation of a “framework for AI governance” and the formation of an advisory group.

In Pennsylvania, Gov. Josh Shapiro issued an executive order in September to establish standards and a governance framework for generative artificial intelligence use by state agencies and employees. The order created an AI governing board and laid out core values for generative AI use, among them accuracy, employee empowerment and equity.

“We can’t ignore new technology – we have to educate ourselves and be proactive to minimize the risks and maximize the benefits of innovation and that’s the approach my administration is taking here in Pennsylvania,” Shapiro said.

The Adams administration’s new AI action plan envisions the creation of a “framework for AI governance” and the formation of an advisory group. / Photo credit: Office of the Mayor

In July, U.S. Sen. Bob Casey of Pennsylvania introduced a bill that would restrict employers’ use of AI in hiring. Pennsylvania state Reps. Robert Merski and Chris Pielli have introduced legislation to make it a misdemeanor to use AI to impersonate a loved one without consent. In May, U.S. Rep. Joe Morelle of New York introduced legislation banning the nonconsensual sharing of intimate deepfake images, something domestic violence advocates have voiced concerns about.

State Sen. Jay Costa of Pennsylvania, who has been a vocal advocate of AI and the technology’s integration into society, has sponsored legislation to spur more study of the technology. He has teamed up with state Sen. Jimmy Dillon and other lawmakers on legislation establishing an advisory committee to conduct a study on AI, including how to distinguish between AI- and human-generated content.

“AI is not just about the technology; it’s going to impact aspects of our daily lives that we can’t even imagine right now,” Dillon said. “We need to really concentrate on and address the ethics and the regulations and the workforce concerns and take a comprehensive approach toward it.”

Yet advocates for responsible AI use said transparency is often easier said than done. In the Pittsburgh area, the Allegheny County Department of Human Services has used the Allegheny Family Screening Tool since 2016 to help social workers predict when children may face harm, ideally as a check on the individual biases of staffers. Yet the practice has drawn scrutiny from the U.S. Department of Justice following complaints that the technology could unfairly target parents with disabilities or mental health disorders. The Associated Press reported that the tool draws on Supplemental Security Income data as well as records of diagnoses such as schizophrenia or mood disorders. Disability rights advocates argued that this unfairly targets parents with disabilities and mental illness and punishes them for accessing county resources.

Children work on a classroom exercise using Khanmigo, an AI-powered guide developed by Khan Academy, during a math and sciences class at Khan Lab School. / Photo credit: Constanza Hevia H. for The Washington Post via Getty Images

The parents at the center of a case that resulted in a Justice Department investigation weren’t allowed to see their risk score after their daughter was removed from their care. An Allegheny County spokesperson said via email that the county doesn’t consider the tool to be AI – but not everyone sees it that way.

Experts said being able to access information about what kind of data goes into training an AI model is crucial. But Julia Stoyanovich, director of the Center for Responsible AI at New York University, said regulating AI can be tricky because entities – including local and state governments – can be cagey about disclosing when they’re using it.

“We do need to think about how we can create an environment where people can have a positive conversation where it’s not all just sticks, but also carrots,” Stoyanovich said, where “government representatives are able to come to the table and actually use the expertise of external folks to try and help them do things better rather than being worried about negative publicity.”

U.S. Senate Majority Leader Chuck Schumer has made headlines for exploring increased oversight of AI and recently met with tech leaders to discuss how to regulate it. Stoyanovich, however, lamented that those first meetings were with industry leaders rather than with a wider range of constituents and stakeholder groups.

“Essentially, we’re going to be relying on industry to say that they’re not going to be evil,” Stoyanovich said. “I think that’s really a mistake … because industry incentives do not align very often with societal incentives. And we do absolutely need legal and regulatory instruments here that are going to protect citizens in terms of both what data about them is being used and how it’s collected, and also the decisions they’re subjected to.”


Atiya Irvin-Mitchell is the lead reporter in Pittsburgh for Technical.ly, a news organization for technologists and entrepreneurs, by way of Report for America.
