When the Biden Administration launched an AI task force earlier this month to create a path to “democratize access to research tools to promote AI,” the goal of access was paramount.
“The task force consists of some of the top experts in academia and industry,” said Dinesh Manocha, a professor of computer science and electrical and computer engineering at the University of Maryland, on Federal Monthly Insights – Repurposing Manpower through Automation. “They recognize the importance and they’re pushing for more development in the field by making good data available. So data is a very key component of AI and machine learning-based methods.”
Manocha said AI is as old as computing itself, pointing to “Founding Father” Alan Turing, who he said laid the foundations in the 1950s.
“Machine learning is one sub-area in the broader field of AI,” said Manocha on Federal Drive with Tom Temin. “All the recent developments in AI, all the penetration in the real world, have primarily been driven by the excitement in the last five to 10 years from machine learning.”
The breadth of AI and machine learning is evident simply from the classes offered at the University of Maryland and what students want to study.
“A lot of computer science majors want to take AI,” Manocha said. “Machine learning, by itself, has become such an important subtopic that we even offer multiple classes in it at the undergraduate and graduate levels.”
Focusing on data and algorithms, AI and machine learning imitate the way humans learn. “So you know one of the grand challenges in AI is how can we emulate human-like intelligence, which is still a big open problem,” Manocha said. “There have been a lot of approaches pursued and proposed by wonderful researchers over the last 50 to 60 years.”
Manocha pointed to great advances in AI and machine learning by talking about the sub-branch of “deep learning,” which imitates human knowledge and thinking. But there is still a ways to go.
“If you have some data, you can easily get 50%-to-70% accuracy,” Manocha said. “To go from 50-to-70%, to 90%, you need 100x more data. The complexity seems to grow exponentially… So for every 10% relative improvement, we need 100x more data, and that part of getting 100x more data, with all possible situations, is very difficult.”
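Manocha's rule of thumb implies that data requirements grow exponentially with accuracy, not linearly. A minimal back-of-the-envelope sketch (the function name and the starting sample count are illustrative assumptions, not figures from the article) makes the arithmetic concrete:

```python
def data_needed(base_samples: int, relative_improvements: int) -> int:
    """Samples required after a number of 10% relative accuracy gains,
    assuming (per Manocha's rule of thumb) each gain multiplies the
    data requirement by 100x."""
    return base_samples * 100 ** relative_improvements

# Starting from a hypothetical 10,000-sample dataset, two successive
# 10% relative improvements (e.g., roughly 70% -> 90% accuracy) would
# require 10,000 * 100 * 100 samples:
print(data_needed(10_000, 2))  # 100,000,000 samples
```

The exponent is what makes the last few points of accuracy so expensive: each step multiplies, rather than adds to, the data bill.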