Plant science students at the University of Nebraska-Lincoln are still taught to phenotype by hand, wading into muddy fields to record the physical differences among corn hybrids with a handful of tools and a pen and notebook.
But like the rest of 21st century life, technology is on track to render humans obsolete.
A team of UNL plant scientists and biological systems engineers has built an automated system capable of detecting an individual corn leaf and grasping it with robotic precision to screen its temperature, chlorophyll and water content in less than a minute.
At a time when driverless combines can harvest around the clock, drones can nimbly identify problem spots in a field, and cattle herds are fitted with wearable devices to monitor their individual health, the Plant Phenotyping Robot System marks another leap forward for precision agriculture.
James Schnable, an associate professor of plant science who specializes in computational biology, said before hybrid corn “really took off” after World War II, there was little need to measure how a specific breed of corn physically manifested itself in the field.
“When we started doing hybrid breeding, there became this mechanism where there was a lot of investment in breeding and suddenly you have this whole discipline of how you collect data to make breeding decisions,” he said.
UNL has trained aspiring plant scientists in labs and research fields to phenotype various corn hybrids in the search for breeds with higher heat and drought tolerance that produce better yields.
Phenotyping took a leap with the opening of the Greenhouse Innovation Center at Nebraska Innovation Campus, where an automated conveyor belt and camera system can record minuscule changes to individual plants in a controlled environment.
That technology has been taken out of the greenhouse and into the field at various places in the state, including the Eastern Nebraska Research and Extension Center near Mead, where a Spidercam outfitted with special cameras, like those found at colossal stadiums, records the physical characteristics of plants in the field.
“The challenge is we need to score bigger and bigger populations as we do more and more complicated breeding tasks,” Schnable said. “And the total population of students interested in spending their summers in cornfields as steam is coming out of the mud is not getting any bigger.
“That’s why we need more complicated technology to look at much bigger experiments across more environments,” he added.
Schnable, who partnered with a trio of biological systems engineers at UNL to develop the robot, said the goal is to verify the data collected from the expensive camera systems at ground level, and once perfected, do it faster and with more certainty than an army of human scientists can.
Building a robot to automate those processes took years, as the team had to build a system that could identify the leaf from a corn or soybean plant, and then direct a robotic arm where to reach and how to grab it for a battery of measurements, said Abbas Atefi, a Ph.D. candidate in biological systems engineering.
“For our task, we needed to find the 3D coordinates of each grasping point on the leaf,” Atefi said. Using a “time-of-flight” camera, rather than a standard color camera, the robot could be given hundreds of options for points to grab on an X, Y and Z axis.
Atefi spent two years writing an algorithm that narrowed the hundreds of candidate sites into a single representative point before converting the coordinates into directions for the robotic arm to bend and twist toward a leaf, finally grabbing it without damaging it.
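The narrowing step Atefi describes can be pictured as a selection over candidate 3D points. The article does not spell out the team's actual criterion, so the short Python sketch below uses a hypothetical heuristic — pick the candidate nearest the centroid of the cloud, which tends to land on the broad middle of the leaf — purely to illustrate the idea:

```python
import math

def pick_grasp_point(candidates):
    """Choose one representative 3D grasp point (x, y, z) from many candidates.

    Hypothetical heuristic: return the candidate nearest the centroid of
    the cloud. The UNL team's real algorithm is more involved and is not
    detailed in this article.
    """
    n = len(candidates)
    # Centroid of all candidate points reported by the time-of-flight camera
    cx = sum(p[0] for p in candidates) / n
    cy = sum(p[1] for p in candidates) / n
    cz = sum(p[2] for p in candidates) / n
    # Candidate closest to the centroid becomes the single grasping target
    return min(candidates, key=lambda p: math.dist(p, (cx, cy, cz)))

# A small cloud of candidate grasp points (in meters), as a depth camera
# might report them
points = [(0.10, 0.50, 0.30), (0.12, 0.52, 0.31), (0.30, 0.80, 0.45)]
grasp = pick_grasp_point(points)
```

In practice the chosen coordinates would then be handed to the arm's motion planner, the step Atefi describes as converting the point into directions for the robot to bend and twist toward the leaf.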
Once that was accomplished, the team worked on refining the fiber-optic cable and temperature sensor installed on the gripper, which measure the biochemical and physiological processes of the plant through indicators such as chlorophyll and water content.
After hundreds of tweaks to the algorithm, the team ran the robot through a series of tests in the Greenhouse Innovation Center, comparing the results to a series of benchmarks taken by both human researchers and the powerful imaging system already in operation.
The results, which will be published in the academic journal Computers and Electronics in Agriculture next month, show that the robot is able to determine the temperature of a leaf with comparable accuracy to a human, while tests measuring chlorophyll, water content, and nitrogen still need more fine-tuning.
But early results feed into the optimism the team has about the robot’s future, both in the greenhouse and in the field.
“Integration is something we think about a lot,” said Yufeng Ge, an associate professor of biological systems engineering at UNL.
The robot needs to be able to operate in a wide array of environments — wheeled around on a moveable platform in more conventional greenhouses, or woven seamlessly into the existing conveyor belt system at Innovation Campus — while providing a known degree of accuracy in each, Ge said.
“If you have a graduate student, they are human, they will make errors,” he added. “But we can have a system where hundreds of robots are scattered throughout the greenhouse, each carrying a specific sensor to look at one aspect of the plants with great accuracy.”
Schnable said automating certain phenotyping processes also frees graduate students to crunch the massive volumes of data looking for patterns that emerge in various hybrids.
“(Graduate students) may not want to stay in grad school if the first two years they are just walking around taking measurements,” he said. “Longer-term, there is a vision that this would be able to go out into the field, where I still do have grad students running around.”
Santosh Pitla, an associate professor of biological systems engineering, said the robot will soon be tested on a robotic platform — essentially a small, driverless tractor with a 5-foot clearance and a toolbox full of scientific instruments that can navigate a cornfield on its own — that would allow for automated data collection in hundreds of field acres.
While drone imaging or the on-board computers in a combine can already flag higher plant temperatures across wide swaths of a field, Pitla said robotics is the next leap in precision agriculture, allowing farmers to pinpoint water, fertilizer or pesticide inputs down to the individual plant.
“Right now, we still talk about things in pounds per acre or gallons per acre,” he said. “I think we could get to the point where we’re talking about per plant, and every plant has a name and each robot can go and tend to each plant’s needs. That’s the end goal, treating each plant separately.”