This is a demonstration of how to achieve a seamless workflow for in situ root phenotyping using an automated minirhizotron (MR) system and an image analysis model (Figure 1).
Figure 1. The workflow of automated minirhizotron image acquisition and analysis.
Step 1: Automated Image Acquisition
Images used in this study were collected in a greenhouse at the Arava Research and Development Center in Israel, where the soil texture is loamy sand. Sweet bell pepper (Capsicum annuum L.) plants, Canon 7158 (Zeraim Gedera, Syngenta), were measured from August 2021 to May 2022. The automated MR system RootCam© (Crystal Vision, Samar, Israel) was used to capture images daily. RootCam consists of the camera itself and software that moves the camera along a rail, acquiring an image every 19 mm along the MR observation tube. Lighting was supplied by LED strips. The images were saved to a Raspberry Pi, which was accessible via a network cable and allowed remote control. Each MR observation tube was 60 cm long and installed 10 cm from the plant (Figure 2).
Figure 2. RootCam installed in the field before painting observation tubes and transplanting plants.
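For illustration, the daily acquisition loop can be sketched as follows. The move_to_mm and capture_jpeg helpers are hypothetical stand-ins; the actual RootCam control software is proprietary and its interface may differ.

```python
"""Minimal sketch of a daily RootCam-style acquisition scan.
The rail-motor and camera helpers are hypothetical placeholders."""
import datetime
import pathlib
import time

STEP_MM = 19           # imaging interval along the observation tube (from the protocol)
TUBE_LENGTH_MM = 600   # 60 cm observation tube

def move_to_mm(position_mm: int) -> None:
    """Hypothetical call that drives the camera carriage to a rail position."""
    ...

def capture_jpeg(path: pathlib.Path) -> None:
    """Hypothetical call that triggers the camera and writes a JPEG."""
    ...

def acquire_daily_scan(out_root: pathlib.Path) -> None:
    day_dir = out_root / datetime.date.today().isoformat()
    day_dir.mkdir(parents=True, exist_ok=True)
    for pos in range(0, TUBE_LENGTH_MM + 1, STEP_MM):
        move_to_mm(pos)
        time.sleep(0.5)                                   # let vibration settle before exposure
        capture_jpeg(day_dir / f"tube1_{pos:03d}mm.jpg")  # file name encodes position along tube

if __name__ == "__main__":
    acquire_daily_scan(pathlib.Path("/home/pi/rootcam_images"))  # placeholder path on the Raspberry Pi
```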
Compared with images taken by the most commonly used MR system (Bartz Technology Co., Carpinteria, California, USA), those taken by RootCam were larger and of higher resolution, allowing finer details of plant roots, such as root hairs and mycorrhizal fungi, to be seen (Figure 3a-b).
Figure 3. Comparison of root images taken by the Bartz manual MR system (a) and by RootCam (b). Images in the second row are enlarged views of the marked regions in the original root images in the first row.
Step 2: Automated Image Transfer and Analysis
Previously, users either downloaded images remotely through a remote desktop application such as AnyDesk (Figure 4a) or copied them in the field onto a hard drive (Figure 4b). These data transfer approaches are either time-consuming, because high-resolution images are large, or prevent up-to-date data acquisition, because researchers rarely visit distant fields.
Figure 4. Traditional approaches to data transfer: downloading data remotely through AnyDesk (a) or copying data in the field onto a hard disk (b).
To overcome these limitations, Dropbox is used to achieve real-time data transfer. The Raspberry Pi embedded in RootCam sends each day's root images to Dropbox, and users can then access them through the synchronized Dropbox folder on an office computer (Figure 5).
Figure 5. Data transfer from the field to the office through Dropbox.
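One way to push each day's images from the Raspberry Pi is the official Dropbox Python SDK, as in the minimal sketch below; the access token, local folder, and remote folder are placeholders, and the actual setup may instead rely on the Dropbox sync client.

```python
"""Sketch of uploading one day's images to Dropbox with the Dropbox Python SDK
(pip install dropbox). Token and folder layout are placeholders."""
import datetime
import pathlib

import dropbox

ACCESS_TOKEN = "YOUR_DROPBOX_ACCESS_TOKEN"             # placeholder token
LOCAL_ROOT = pathlib.Path("/home/pi/rootcam_images")   # placeholder local folder

def upload_todays_images() -> None:
    dbx = dropbox.Dropbox(ACCESS_TOKEN)
    day = datetime.date.today().isoformat()
    for img in sorted((LOCAL_ROOT / day).glob("*.jpg")):
        remote_path = f"/rootcam/{day}/{img.name}"     # placeholder Dropbox folder
        with img.open("rb") as f:
            # overwrite so rerunning the script does not create duplicates
            dbx.files_upload(f.read(), remote_path,
                             mode=dropbox.files.WriteMode.overwrite)

if __name__ == "__main__":
    upload_todays_images()
```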
A convolutional neural network (CNN)-based model was developed to estimate total root length (TRL) from MR images directly, without segmentation. As shown in Figure 6, the TRL values were split into K ranges, and the model was trained both to classify an image into the correct TRL range and to regress the TRL value per class. The model consists of a Feature Extraction Module (FEM) that extracts image features, which are fed to a Classification Module (CM) that outputs the class probabilities and to an ensemble of Regression Modules (RM). The model outputs the expectation of the per-class TRL estimates, weighted by the class probabilities.
Figure 6. The architecture of the root length estimation model based on Convolutional Neural Networks.
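A minimal PyTorch sketch of this architecture is given below; the ResNet-18 backbone, layer sizes, and number of classes are illustrative assumptions, not the exact configuration of the published model.

```python
"""Sketch of the FEM + CM + per-class RM architecture: the output is the
expectation of per-class TRL estimates under the predicted class distribution.
Backbone and layer sizes are illustrative assumptions."""
import torch
import torch.nn as nn
import torchvision.models as models

class TRLEstimator(nn.Module):
    def __init__(self, num_classes: int = 5):
        super().__init__()
        backbone = models.resnet18(weights=None)       # Feature Extraction Module (FEM)
        feat_dim = backbone.fc.in_features
        backbone.fc = nn.Identity()
        self.fem = backbone
        self.cm = nn.Linear(feat_dim, num_classes)     # Classification Module (CM) over K TRL ranges
        # one Regression Module (RM) per TRL range
        self.rms = nn.ModuleList(
            [nn.Sequential(nn.Linear(feat_dim, 64), nn.ReLU(), nn.Linear(64, 1))
             for _ in range(num_classes)]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = self.fem(x)                             # (B, feat_dim) image features
        probs = torch.softmax(self.cm(feats), dim=1)    # (B, K) class probabilities
        per_class = torch.cat([rm(feats) for rm in self.rms], dim=1)  # (B, K) TRL estimate per class
        # expectation of the per-class TRL estimates under the class distribution
        return (probs * per_class).sum(dim=1)

# e.g. trl = TRLEstimator()(torch.randn(2, 3, 224, 224))  # -> tensor of shape (2,)
```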
A task scheduler was set up on the office computer to automatically run the analysis model on the images inside the local Dropbox folder.
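The scheduled job could invoke a script along the lines of the sketch below, which scans the newest date folder in the synced Dropbox, runs the TRL model on each image, and writes the estimates to a CSV. The folder layout, weights file, and trl_model module are assumptions carried over from the sketch above; on Windows the script can be registered with Task Scheduler, or with cron on Linux.

```python
"""Sketch of a daily analysis run over the synced Dropbox folder.
Paths, weights file, and the trl_model module are placeholders."""
import csv
import datetime
import pathlib

import torch
from PIL import Image
from torchvision import transforms

from trl_model import TRLEstimator   # hypothetical module holding the model sketch above

DROPBOX_ROOT = pathlib.Path.home() / "Dropbox" / "rootcam"     # placeholder synced folder
WEIGHTS_PATH = DROPBOX_ROOT / "trl_model_weights.pt"           # placeholder trained weights

def analyse_today() -> None:
    day = datetime.date.today().isoformat()
    model = TRLEstimator()
    model.load_state_dict(torch.load(WEIGHTS_PATH, map_location="cpu"))
    model.eval()
    to_tensor = transforms.Compose([transforms.Resize((224, 224)),
                                    transforms.ToTensor()])
    out_csv = DROPBOX_ROOT / "results" / f"trl_{day}.csv"
    out_csv.parent.mkdir(parents=True, exist_ok=True)
    with out_csv.open("w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["image", "TRL_estimate"])
        for img_path in sorted((DROPBOX_ROOT / day).glob("*.jpg")):
            x = to_tensor(Image.open(img_path).convert("RGB")).unsqueeze(0)
            with torch.no_grad():
                trl = model(x).item()                  # TRL estimate for this frame
            writer.writerow([img_path.name, f"{trl:.2f}"])

if __name__ == "__main__":
    analyse_today()
```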
Step 3: Up-to-date Root Growth Report
Task schedulers were also set on the office computer to automatically generate an up-to-date root growth report from the Step 2 results via pre-written scripts, which are shared on GitHub (https://github.com/kainingzhou/autoworkflow). The following is an example of the report:
1. Root length density estimated by the model
The following graph shows root length density (RLD) at different soil depths changing over time (Figure 7).
Figure 7. Root length density distribution with soil depth at different dates.
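A plot like Figure 7 can be produced with a short script such as the sketch below, assuming the model results have been aggregated into a CSV with columns date, depth_cm, and RLD; the column names and file paths are placeholders rather than the exact ones used in the shared scripts.

```python
"""Sketch of plotting RLD against soil depth for several dates.
Input CSV layout and output paths are placeholders."""
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("results/rld_by_depth.csv")            # placeholder aggregated results
fig, ax = plt.subplots(figsize=(4, 5))
for date, grp in df.groupby("date"):
    grp = grp.sort_values("depth_cm")
    ax.plot(grp["RLD"], grp["depth_cm"], marker="o", label=date)
ax.invert_yaxis()                                       # soil surface at the top of the plot
ax.set_xlabel("Root length density (cm cm$^{-3}$)")
ax.set_ylabel("Soil depth (cm)")
ax.legend(title="Date", fontsize="small")
fig.tight_layout()
fig.savefig("report/rld_profile.png", dpi=200)          # placeholder report output
```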
2. Root distribution along the soil profile
Stitched root images show root distribution along the soil profile. Figure 8 displays the soil profile on January 6, 2022.
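A simple way to produce such a stitched profile is to stack one day's frames top to bottom in depth order, as in the Pillow sketch below; the folder layout is a placeholder, and no overlap correction between adjacent frames is applied.

```python
"""Sketch of stitching one day's MR frames into a single soil-profile strip.
Frames are stacked vertically in file-name (depth) order; paths are placeholders."""
import pathlib
from PIL import Image

def stitch_profile(day_dir: pathlib.Path, out_path: pathlib.Path) -> None:
    frames = [Image.open(p) for p in sorted(day_dir.glob("*.jpg"))]
    width = max(f.width for f in frames)
    height = sum(f.height for f in frames)
    profile = Image.new("RGB", (width, height))
    y = 0
    for f in frames:
        profile.paste(f, (0, y))   # place each frame below the previous one
        y += f.height
    profile.save(out_path)

stitch_profile(pathlib.Path("Dropbox/rootcam/2022-01-06"),
               pathlib.Path("report/profile_2022-01-06.png"))
```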