Andrew Ng proposes bringing AI from the top 1% to the masses • The Register


In 2015, modern AI pioneer Andrew Ng’s recipe for success was to go big: neural networks, massive data, and monolithic systems. Now that recipe has created a problem: the technology is dominated by a handful of wealthy companies with the money and headcount to build such immense systems.

However, Ng, the Baidu and Google Brain alumnus and current CEO of software maker Landing AI, argues the field doesn’t have to stay that concentrated. Speaking at Nvidia’s GPU Technology Conference last week, he proposed an approach to make machine learning more inclusive and open.

Ng proposed developing better analytical AI tools and deeper domain knowledge, with the goal of doing more with less. The key to AI accessibility, he argued, is being able to extract patterns and trends from smaller datasets.

“We know that in consumer internet companies, you might have a billion users and a massive dataset. But when you go into other industries, the sizes are often much smaller,” Ng said.

Ng was referring to building AI systems in locations like hospitals, schools or factories that lack the resources and data sets to develop and train AI models.

“AI should change all industries. We’re not seeing this at the pace we’d like yet, and we need data-centric AI tools and principles to make AI useful for everyone…not just big consumer internet companies,” Ng said.

As an example, he cited the thousands of $1m-$5m projects in places like hospitals, which are typically tight on budget and could adopt smaller custom AI systems to improve their analytics.

Ng said he had seen shop floors with only 50 images available for building a computer-vision inspection system to root out defective parts.

“The only way for the AI community to build these essentially very large numbers of systems is to build vertical platforms that aggregate all of these use cases. That enables the end customer to build the custom AI system,” said Ng.

One such step is better “data preparation” – as opposed to wholesale data cleaning – to iteratively improve the machine-learning system. The idea is not to improve all of the data in a large dataset, but to apply an error analysis technique that identifies the subset of the data worth improving.

“Rather than trying to improve all the data, which is just too much, you might know you want to improve that part of the data, but let’s leave the others. They can be much more targeted,” Ng said.

For example, if a model misclassifies a particular type of image defect, error analysis can direct the collection of more targeted, specific data to better train the system on that case. This small-data approach is more efficient than indiscriminately collecting ever more data, which is expensive and resource-intensive.
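The slicing idea described above can be sketched in a few lines. This is a minimal illustration, not anything Ng or Landing AI prescribes: the metadata tags, field names, and `worst_slices` helper are all hypothetical, standing in for whatever error-analysis tooling a team actually uses.

```python
# Hypothetical sketch: rank validation-set "slices" by error rate so data
# collection can target only the worst-performing subset.
from collections import defaultdict

def worst_slices(examples, top_k=2):
    """Group validation errors by metadata tag and rank slices by error rate.

    Each example is a dict: {"tag": str, "correct": bool}.
    Returns the top_k tags with the highest error rate.
    """
    totals = defaultdict(int)
    errors = defaultdict(int)
    for ex in examples:
        totals[ex["tag"]] += 1
        if not ex["correct"]:
            errors[ex["tag"]] += 1
    rates = {tag: errors[tag] / totals[tag] for tag in totals}
    return sorted(rates, key=rates.get, reverse=True)[:top_k]

# Toy run: "scratch" images fail far more often than "dent" images,
# so the targeted fix is to collect more scratch examples.
val = (
    [{"tag": "scratch", "correct": False}] * 6
    + [{"tag": "scratch", "correct": True}] * 4
    + [{"tag": "dent", "correct": True}] * 9
    + [{"tag": "dent", "correct": False}] * 1
)
print(worst_slices(val, top_k=1))  # -> ['scratch']
```

The point of the sketch is that the output names a slice, not a model change: the next action is a targeted data-collection trip rather than another round of training on everything.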

“This allows you to go to a much more targeted data collection process where you’re like, ‘Hey, let’s go to the manufacturing facility and take a lot more pictures,'” Ng said, adding that consistent and efficient labeling is a big part of the process.

Ng gave a concrete example of error analysis in speech recognition: filtering car noise out of human speech in a sound clip. Engineers might be tempted to build a dedicated system to detect the car noise and then eliminate it.

A more efficient approach, he said, is to generate more data of human speech over in-car background noise, use error analysis to isolate the problematic car-noise segment, and then apply targeted data augmentation and generation to improve performance on that slice.
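The data-generation step above is commonly done by mixing recorded noise into clean speech at a chosen signal-to-noise ratio. The sketch below is an assumption about how such augmentation might look, not Ng's actual pipeline; the function name and the choice of 5 dB are illustrative.

```python
# Hypothetical sketch: create a harder training example by mixing a noise
# clip into clean speech at a target SNR (standard audio-augmentation math).
import math
import random

def mix_at_snr(speech, noise, snr_db):
    """Mix a noise clip into clean speech at a target signal-to-noise ratio.

    speech, noise: equal-length lists of float samples.
    snr_db: desired SNR in decibels (lower = noisier example).
    """
    p_speech = sum(s * s for s in speech) / len(speech)
    p_noise = sum(n * n for n in noise) / len(noise)
    # Scale noise so that p_speech / (scale^2 * p_noise) == 10^(snr_db / 10).
    scale = math.sqrt(p_speech / (p_noise * 10 ** (snr_db / 10)))
    return [s + scale * n for s, n in zip(speech, noise)]

random.seed(0)
# One second of a 440 Hz tone at 16 kHz stands in for clean speech;
# uniform noise stands in for a recorded car-noise clip.
speech = [math.sin(2 * math.pi * 440 * t / 16000) for t in range(16000)]
noise = [random.uniform(-1.0, 1.0) for _ in range(16000)]
noisy = mix_at_snr(speech, noise, snr_db=5)  # a deliberately hard example
```

Generating many such clips at a range of SNRs for only the failing slice is cheaper than recording new field data across the board.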

Traditional big-data approaches to AI still work, but error analysis is better suited to limited datasets. “You decide what you want to improve and weigh the cost of more data collection – [whether] it is reasonable in relation to the potential benefit,” said Ng.

It could take a decade and thousands of research papers to craft a consistent data-centric model for deep learning, he said.

“We’re still in the early stages of figuring out the principles and tools for entering the data in a systematic way,” Ng said, adding, “I’m excited to see a lot of people doing this and celebrating their work too.” ®
