New York -- Two of the opening keynotes at the Artificial Intelligence Conference, hosted by O'Reilly Media and Intel AI, highlighted efforts by two well-established technology companies to use artificial intelligence for social good.
The SAS Institute's Mary Beth Ainsworth talked about how the analytics company is partnering with a nonprofit organization to study cheetah populations in southwest Africa by using computer vision. And Microsoft Corp.'s Jennifer Marsman talked about the company's efforts to use machine learning, the internet of things and new networking ideas to reduce world hunger.
They are part of a growing collection of tech companies investing in projects that use artificial intelligence for social good. The list includes IBM and its IBM Watson AI XPrize, as well as Amazon and its "artificial intelligence for good manager." The latter partners with organizations attempting to tackle global challenges. Whether using artificial intelligence for social good becomes a movement with legs remains to be seen. But at the O'Reilly event, Ainsworth and Marsman outlined their companies' respective efforts to a packed audience of technology experts.
Automating footprint analysis
Traditional methods for monitoring wildlife can be invasive, alter animal behavior and even harm the health of the species. Chemicals used to sedate animals so that radio trackers can be affixed have been linked to infertility in females.
SAS partnered with Sky Alibhai and Zoe Jewell, co-founders of the nonprofit organization WildTrack, who sought to find noninvasive ways to monitor a species' movements.
Taking a cue from the indigenous trackers who can identify the species, sex and even the individual from footprints alone, Alibhai and Jewell developed the Footprint Identification Technique, or FIT. The idea is to digitally capture images of footprints, upload them to a laptop and begin documenting telling features, such as the distance between two toes, for identification and analysis.
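The article doesn't publish WildTrack's actual code, but the kind of feature FIT records can be sketched in a few lines of Python. The landmark names and coordinates below are hypothetical, standing in for points a researcher would mark on a digitized footprint image:

```python
from itertools import combinations
from math import dist

# Hypothetical landmarks digitized from one footprint photo,
# in centimeters relative to the image origin.
landmarks = {
    "toe_1": (2.1, 9.4),
    "toe_2": (4.0, 10.2),
    "toe_3": (5.9, 9.6),
    "heel":  (4.1, 1.3),
}

def footprint_features(points):
    """Pairwise distances between landmarks -- the sort of telling
    feature FIT documents, such as the distance between two toes."""
    return {
        f"{a}-{b}": round(dist(points[a], points[b]), 2)
        for a, b in combinations(sorted(points), 2)
    }

features = footprint_features(landmarks)
print(features["toe_1-toe_2"])  # distance between the first two toes
```

A set of such distances forms a feature vector that can be compared across prints for identification and analysis.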
This data is critical to mapping the density and distribution of a species, which Ainsworth described as the "cornerstone of a conservation plan."
But FIT is laborious, spread across multiple steps and multiple programs. Today, computer vision and deep learning are automating the process, accomplishing in seconds tasks that took researchers 20 to 30 minutes to complete, according to Ainsworth.
The automation effort included training a convolutional neural network on images in the database to identify species. Data also had to be labeled against a "golden data set" provided by researchers as the standard for building a database of other pertinent identification information. Researchers are hoping the tables will soon turn: Rather than teaching the technology, they want the technology to teach them.
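SAS's pipeline uses a convolutional network; as a much simpler illustration of how a labeled "golden data set" anchors identification, the sketch below matches a new footprint's feature vector to its nearest labeled example. The individual names and feature values are invented, and nearest-neighbor matching is a stand-in for the deep learning SAS actually applies:

```python
from math import dist

# Hypothetical golden data set: researcher-labeled feature vectors
# (e.g. inter-toe distances in cm) for known individuals.
golden = {
    "cheetah_A": (2.1, 3.4, 5.0),
    "cheetah_B": (2.6, 3.1, 4.2),
    "cheetah_C": (1.8, 3.9, 5.5),
}

def identify(features):
    """Label a new print with the closest golden example.
    Nearest neighbor is the simplest possible matcher; the real
    system learns its own features with a convolutional network."""
    return min(golden, key=lambda name: dist(golden[name], features))

print(identify((2.0, 3.5, 5.1)))
```

The same labeled examples that teach the model are what later let researchers ask which features the model found that humans had missed.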
"Given more data, we can start to look at what data points are being surfaced by computer vision that can identify an individual that maybe aren't obvious to the human eye," Ainsworth said during a session she gave later that day.
AI in agriculture
Microsoft Research also has staked a claim in the artificial intelligence for social good movement. It is investigating ways to enable data-driven farming techniques such as "precision agriculture," where crops are watered or fertilized with exactly the amount they need and no more.
The mission is to use artificial intelligence to deal with the looming global food crisis. Farmers are going to have to double food production to sustain the projected nine billion people who will inhabit the planet by 2050.
But building the AI infrastructure to do this isn't cheap. Data-driven farming requires sensors, which can cost as much as $1,000 apiece. Yet the big expense is neither the sensors themselves nor the energy needed to power internet of things devices. Instead, "it is the connectivity -- it's getting data from the actual devices into the cloud where we can do something useful with it," said Marsman, who is part of Microsoft's AI for Earth group.
Cellular and Wi-Fi connectivity aren't robust enough in rural areas to get the job done, so Microsoft is experimenting with "television white spaces," or unused broadcast channels. Because television stations broadcast at lower frequencies than Wi-Fi, their signals travel farther, so coverage of an area can be more complete.
"Now, of course, you can't just start sending data packets over unused television channels," Marsman said. "You have to work with the local governments and do it properly. But this is something that we've pioneered and it actually works quite well." And the data generated can be used to feed machine learning models to predict the best time to plant crops or when to fertilize.
Microsoft researchers are also attempting to reduce the number of sensors needed by deploying drones or helium balloons outfitted with cameras to capture aerial photos of the crops. The images are combined with machine learning models to build maps that can predict sensor readings in areas that don't have any.