By AI Trends Staff
The VA is planning to expand its use of AI, and the IRS is moving to employ more AI to help with tax compliance. As the federal government moves toward wider adoption of AI, we highlight selected implementations.
How best to apply curated data sets to new use cases, and how to navigate the proper consents for use of proprietary medical data, are among the top challenges of implementing AI at the Department of Veterans Affairs (VA), according to a recent account in governmentCIO.
“Data is collected for maybe one reason, and it may be used for analyzing and finding results for that one particular reason. But there may be other uses for that data as well. So when you get to secondary uses you have to examine a number of challenges,” stated Gil Alterovitz, Director of AI for the VA, at AFCEA’s Automation Transformation conference, held in January 2020.
Alterovitz has proposed releasing broader ecosystems of data sets that can be chosen and applied depending on the demands of specific AI projects.
“Rather than release one data set, consider releasing an ecosystem of data sets that are related,” he stated. “Imagine, for example, someone is searching for a trial you have information about. Consider the patient looking for the trial, the physician, the demographics, pieces of information about the trial itself, where it’s located. Having all that put together makes for an efficient use case and allows us to better work together.”
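The ecosystem idea can be illustrated with a toy sketch. All data set names and fields below are hypothetical, chosen only to mirror the example Alterovitz gives: a clinical trial becomes far more useful when the trial record, its site, and its eligibility criteria are released as linked data sets rather than in isolation.

```python
# Hypothetical linked data sets; names and fields are illustrative,
# not drawn from any actual VA release.
trials = [
    {"trial_id": "T1", "condition": "diabetes", "site_id": "S1"},
    {"trial_id": "T2", "condition": "ptsd", "site_id": "S2"},
]
sites = {
    "S1": {"city": "Boston", "state": "MA"},
    "S2": {"city": "Palo Alto", "state": "CA"},
}
eligibility = {
    "T1": {"min_age": 18, "max_age": 75},
    "T2": {"min_age": 21, "max_age": 65},
}

def describe_trial(trial):
    """Assemble one use-case view by joining the related data sets."""
    site = sites[trial["site_id"]]
    rules = eligibility[trial["trial_id"]]
    return {
        "condition": trial["condition"],
        "location": f'{site["city"]}, {site["state"]}',
        "ages": f'{rules["min_age"]}-{rules["max_age"]}',
    }

# A patient (or physician) searching for a trial gets the whole
# picture in one lookup instead of three separate data requests.
print(describe_trial(trials[0]))
```

The design point is simply that the three tables share keys (`trial_id`, `site_id`), so releasing them together lets downstream AI projects join them as needed instead of receiving one pre-flattened extract.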
On the issue of consent, the VA is exploring the use of synthetic datasets that mimic the statistical parameters and overall metrics of a dataset while obscuring personally identifying information.
“How do you make data available considering privacy and other concerns?” Alterovitz said. “One area is synthetic data, essentially looking at the statistics of the underlying data and creating a new data set that has the same statistics, but can’t be identified because it generates at the individual level a completely different data set that has similar statistics.”
The revised dataset may only have to be 20% different to protect identities, he suggested.
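A minimal sketch of the approach Alterovitz describes, using NumPy: estimate the summary statistics of a (here simulated, entirely hypothetical) patient dataset, then draw a brand-new dataset from those statistics. No synthetic row corresponds to any real individual, yet aggregate analyses come out similar. Real synthetic-data systems are far more sophisticated than this two-column Gaussian toy.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Stand-in for a real dataset: 1,000 records with two numeric
# fields (say, age and a lab value). Purely illustrative.
real = rng.multivariate_normal(
    mean=[62.0, 5.4],
    cov=[[120.0, 8.0], [8.0, 2.3]],
    size=1000,
)

# Step 1: compute the statistics of the underlying data.
mean = real.mean(axis=0)
cov = np.cov(real, rowvar=False)

# Step 2: generate a completely different dataset at the
# individual level that shares those statistics.
synthetic = rng.multivariate_normal(mean, cov, size=1000)

# Aggregate statistics match closely; individual rows do not.
print("mean gap:", np.abs(real.mean(axis=0) - synthetic.mean(axis=0)))
```

The privacy intuition is that analysts can study means, correlations, and model fits on the synthetic records while the generated individuals are fictitious; in practice, additional safeguards (e.g., differential privacy) are layered on, since naive statistical mimicry alone does not guarantee protection.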
IRS Using AI To Monitor Tax Compliance
Meanwhile, the IRS is increasingly relying on machine learning and data analytics to study a rich supply of data in efforts to detect tax evasion, respond to taxpayer questions and in general become more efficient, according to a recent account in the Wall Street Journal.
The IRS criminal investigations unit is using systems from Palantir Technologies, which offers data mining services, to identify potential fraud cases for further inquiry. “You’d be shocked [at] how much information Palantir can provide,” stated IRS Commissioner Charles Rettig at a conference on AI and taxes held in late February 2020 at the University of California, Irvine law school.
The information can include first name and cell phone number. And this data does not come from people who filed tax returns. “These are non-filers. There is a heat map that says where there are concentrations of these people. We have sufficient data,” Rettig stated.
Palantir has drawn attention from the community concerned with ethical use of AI. The company was founded in 2004 by Peter Thiel and others from PayPal. Initial work was for the Pentagon and the CIA in Afghanistan and Iraq, with some success helping soldiers avoid roadside bombs and track insurgents, according to a 2018 account in Bloomberg.
The Department of Health and Human Services has used Palantir to detect Medicare frauds. The FBI uses it for criminal investigations. The Department of Homeland Security deployed it to screen travelers and keep tabs on immigrants. Police and sheriff’s departments in New York, New Orleans, Chicago and Los Angeles have also used it, such as for digital dragnets of people. More famously, Cambridge Analytica, the political consulting firm that worked for Donald Trump’s 2016 presidential campaign, testified to the British Parliament that a Palantir employee had helped in use of the personal data of up to 87 million Facebook users. Cambridge Analytica used the data to develop psychographic profiles of individual voters. Palantir said the employee was working with Cambridge Analytica on his own time; the company has a policy against working on political campaigns.
Today the company is winning deals connected to fighting the coronavirus. Palantir has sold its technology to the Centers for Disease Control and Prevention (CDC) to help monitor what masks, ventilators, and staff hospitals require to fight the coronavirus pandemic, according to a recent account in Forbes.
The US Coast Guard recently acquired a Palantir system under a procurement for a “Readiness System in response to the COVID-19 pandemic.” It was not made clear what the mission was or how Palantir would be employed.
IRS efforts are still in an early stage, as the agency struggles with its legacy technology.
Federal Government AI Community of Practice Growing
The commitment to AI by the entire federal government was given a boost just over a year ago with the signing of an Executive Order by President Donald Trump calling for the US to maintain its leadership in AI. This would be accomplished in part with a government-wide strategy to collaborate and engage with the private sector, academia, the public and international partners.
This led the Office of Management and Budget and the General Services Administration to create an AI community of practice, to help agencies take advantage of current and foreseen AI and machine learning technologies. The community of practice, which started in November, is now 400 members strong from 26 agencies, according to a recent account in Federal News Network.
Among the goals of the AI community of practice is to create a searchable “use case repository” providing examples of where agencies have successfully deployed AI for customer experience, human resources, advanced cybersecurity and business processes.
Agencies should create a roadmap that focuses on AI projects combining the highest value, good timing and available necessary data, suggested Eric Forseter, general manager for the public sector at DataRobot, which offers an automated machine learning system. “The data is never going to be perfect,” he stated. “When you use software that can automate machine learning, you can enable people who are your mission-critical staff but not necessarily data scientists.”
Read the source articles in governmentCIO, Wall Street Journal, Bloomberg, Forbes and the Federal News Network.