The sound of plants dancing
Virginia Tech researchers use machine learning to identify plant health based on the sound of their movement.
August 31, 2020
Close your eyes. Listen to what’s around you. What do you hear?
Sure, you hear the sound of wind chimes or leaves fluttering in the wind, but can you hear the sounds of plants moving as they grow?
Now imagine putting on a pair of headphones and being able to determine the health of plants across the globe purely by what you hear. You’d be able to make remote adjustments to soil, water, or fertilizer, keeping the plants healthy.
That’s exactly what Bingyu Zhao, associate professor in the College of Agriculture and Life Sciences’ School of Plant and Environmental Sciences, is researching.
Zhao’s team is studying the microscopic movements that plants grown in a hydroponic environment make, and the sounds those movements produce, depending on the nutrients the water provides or lacks. Their work could bolster the global food supply by increasing the number of food-producing farms in previously challenging growing locations, such as urban environments. The research also aims to improve plant health, enabling plants to produce more, higher-quality food.
“Plants that are in a healthy condition have a specific movement pattern,” Zhao said. “When a plant is stressed, such as lacking nutrients, the pH of the water dramatically changes, or lacking light, the movement pattern changes. We want to understand the plant movement pattern or use the plant movement pattern to reflect plant health.”
Plants move during the day and as they grow, creating their own movement patterns. That dance creates a sound that changes based on a variety of environmental factors, including sun, soil quality, and nutrients.
Indoor, urban agriculture, gaining popularity because of a changing climate, allows farmers to grow plants indoors in a highly scalable environment. Hydroponic agriculture, a popular form of indoor growing, raises plants in water, largely without soil.
Above: A time-lapse video shows microscopic plant movements that happen during the day. Image credit: Daniel Pillis
These systems range from simple deep-water systems to complex ebb and flow systems, each containing nutrient-rich water that plants’ roots absorb.
These are the smartfarms of the future: highly integrated with technology, sensors, and machine learning. The university is embarking on the SmartFarm Innovation Network, a decentralized network of interconnected centers across the commonwealth. Through it, Virginia Tech’s interdisciplinary researchers and Virginia Cooperative Extension specialists can partner with industries to develop and deploy innovative technologies that increase the efficiency, resilience, and sustainability of food, agricultural, and natural resources production systems.
Hydroponic agriculture involves many factors: heating, lighting, the pH of the water, and the nutrients in the hydroponic solution, to name a few. Sensors can capture each of these factors, and more.
Carefully walking the tightrope of these factors is critical to growing healthy, nutritious plants.
Plants are always moving. Obviously, they can’t walk, but plants do shake.
In a collaboration with Ivica Ico Bukvic, a professor in the School of Performing Arts and the director of the Creativity + Innovation transdisciplinary community, this movement is captured with high-resolution cameras, and a time-lapse is constructed so that the plants’ micro-movements can be observed with the human eye. The movement data is then sonified, or converted into sound, so it can be studied by leveraging the unique affordances of human hearing. The resulting findings will drive the choice of stimuli, including sounds that can be fed back to the plants to promote particular behaviors.
“This project immediately attracted my attention because its transdisciplinary potential resonates both with my administrative and research interests,” Bukvic said. “Sonifying data, such as plant movement, allows us to uncover new patterns and empower humans to perceive the imperceivable. This project completes the sonification loop by also offering a way to feed the sound back to the plants with the goal of improving the plant wellbeing.
“It also connects across disciplines in the most unexpected ways and it is at these newfound and largely underexplored intersections that we find a lot of new and potentially transformative ideas,” Bukvic continued. “We at Virginia Tech are uniquely positioned to do this given our comprehensive scholarly depth.”
Above: Microscopic plant movements are captured using high-resolution cameras so the human eye can see how they react to inputs, such as water and fertilizer. Image credit: Daniel Pillis
Using peppers grown in this environment as the test subjects, Zhao and his team are using the high-resolution cameras to capture movement and convert it to sound, so that it can be studied, interpreted, and used to provide targeted stimuli back to the plants.
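As a rough illustration of the movement-to-sound step, the sketch below maps a series of plant-displacement measurements to tone pitches, so larger movements sound higher. This is a minimal assumed mapping for illustration only; the function name, units, and parameters are hypothetical, and the team’s actual sonification pipeline is not described in the article.

```python
import math

def sonify(displacements, sample_rate=8000, note_seconds=0.05,
           f_min=220.0, f_max=880.0):
    """Map plant-displacement measurements (hypothetical units) to audio.

    Each measurement becomes a short sine tone whose pitch rises with
    larger movement. Returns a list of float samples in [-1, 1].
    """
    lo, hi = min(displacements), max(displacements)
    span = (hi - lo) or 1.0  # avoid divide-by-zero on perfectly flat data
    samples = []
    for d in displacements:
        norm = (d - lo) / span                 # scale displacement to 0..1
        freq = f_min + norm * (f_max - f_min)  # linear pitch mapping
        n = int(sample_rate * note_seconds)    # samples per tone
        samples.extend(math.sin(2 * math.pi * freq * i / sample_rate)
                       for i in range(n))
    return samples

# Example: a leaf that drifts slowly, then twitches
movement = [0.1, 0.12, 0.11, 0.6, 0.9, 0.3]
audio = sonify(movement)
```

Written to a WAV file, a sequence like this lets a listener hear a sudden twitch as a jump in pitch, which is the kind of pattern the sonification approach makes perceivable.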
Machine learning can analyze these sounds, along with the raw video footage, to correlate plant health with sound. To accomplish this, Zhao is drawing on the machine learning and artificial intelligence expertise of Jia-Bin Huang, an assistant professor of electrical engineering at Virginia Tech, and Daniel Pillis, a Princeton University researcher and artist.
Listening to the plant movement pattern can reveal whether the plants are in a healthy condition or in a stressed condition.
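To give a sense of how sound could be correlated with health labels, here is a minimal nearest-centroid classifier over two crude audio features, RMS energy and zero-crossing rate. This is an illustrative sketch only, not the team’s actual machine learning approach; the function names and feature choices are assumptions.

```python
import math

def features(signal):
    """Two crude audio features: RMS energy and zero-crossing rate."""
    rms = math.sqrt(sum(s * s for s in signal) / len(signal))
    zcr = sum(1 for a, b in zip(signal, signal[1:]) if a * b < 0) / len(signal)
    return (rms, zcr)

def train_centroids(labeled_clips):
    """Average the feature vectors per health label (nearest-centroid model)."""
    sums, counts = {}, {}
    for label, clip in labeled_clips:
        f = features(clip)
        s = sums.setdefault(label, [0.0, 0.0])
        s[0] += f[0]
        s[1] += f[1]
        counts[label] = counts.get(label, 0) + 1
    return {label: (s[0] / counts[label], s[1] / counts[label])
            for label, s in sums.items()}

def classify(clip, centroids):
    """Assign the label whose feature centroid is closest to this clip."""
    f = features(clip)
    return min(centroids,
               key=lambda lbl: (f[0] - centroids[lbl][0]) ** 2 +
                               (f[1] - centroids[lbl][1]) ** 2)
```

In practice, a model like the one the researchers describe would use far richer features and labeled recordings of healthy and stressed plants, but the principle is the same: learn which sound patterns go with which growing conditions.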
“In the future, you could catalogue different patterns and understand if a specific sound means the plants need more nitrogen, more phosphate, or more light,” Zhao said. “When people are working in indoor agriculture, they are gaining another tool to monitor plant health. They don’t even need to be in the building to understand that the plant growth conditions need to be adjusted.”
That’s the smartfarm of the future.
Written by Max Esterhuizen