classification; these features are specific to certain datasets, and if the same features are used to analyze different datasets, the results may differ considerably, which is a problem inherent in these technologies.

LeafNet achieved its best recognition performance on bird's-eye spot, which may be due to the disease's distinct pathological symptoms and the strong recognition ability of the LeafNet algorithm. White spot disease ranked second, while the accuracies for the remaining diseases ranged from 84% to 93%. Because gray blight, red leaf spot, and brown blight share similar pathological characteristics, the classification accuracies for these three diseases were the lowest. The symptoms of gray blight and brown blight are especially similar: both exhibit annulations in their late stages and can hardly be distinguished. In addition, the symptoms of white spot and bird's-eye spot both include reddish-brown spots at early stages, and both anthracnose and brown blight are typified by waterlogged leaves during the early disease stages, although their symptoms differ in later stages. Some diseases occur in tea plants throughout the year, whereas others occur only at distinct times; consequently, diagnoses made at different times may affect the accuracy of disease identification. Another factor affecting accuracy is that a tea leaf can be infected with two or more diseases at the same time: once a leaf is infected by one pathogen, it is physiologically weakened, and a second pathogen can infect it more easily. These factors are the main reasons for the lower accuracy of the model on some diseases. In addition, the performance of LeafNet was compared with the method of Reference [40], which covers 10 diseases of 3 crops with a maximum accuracy of 97.3%. The performance of LeafNet is therefore slightly lower than that of Reference [40], which used the currently popular transfer learning approach. The main advantages of that approach are that the network converges quickly when the dataset is small, it is easy to implement, and the training time is shorter. In the future, we will therefore continue to research and apply transfer learning algorithms to identify more plant diseases.
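The transfer-learning advantage noted above, namely fast convergence when only a small classifier head is trained on top of frozen, pretrained features, can be illustrated with a minimal sketch. The feature extractor below is a stand-in (a fixed random projection rather than a real pretrained CNN), and all sizes, data, and the learning rate are illustrative assumptions, not values from this study.

```python
import numpy as np

rng = np.random.default_rng(0)

NUM_CLASSES = 7          # seven tea leaf diseases
FEAT_DIM = 32            # dimensionality of the frozen features

# "Pretrained" extractor: its weights are fixed and never updated.
W_frozen = rng.standard_normal((64, FEAT_DIM))

def extract_features(images):
    """Frozen feature extractor (stand-in for a pretrained CNN backbone)."""
    return np.maximum(images @ W_frozen, 0.0)   # ReLU features

# Trainable head: the only parameters updated during fine-tuning,
# which is why training converges quickly on a small dataset.
W_head = np.zeros((FEAT_DIM, NUM_CLASSES))

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Tiny synthetic "dataset": 70 flattened images, 10 per class.
X = rng.standard_normal((70, 64))
y = np.repeat(np.arange(NUM_CLASSES), 10)

for _ in range(200):                      # gradient descent on the head only
    F = extract_features(X)
    P = softmax(F @ W_head)
    P[np.arange(len(y)), y] -= 1.0        # softmax cross-entropy gradient
    W_head -= 0.01 * (F.T @ P) / len(y)

acc = (softmax(extract_features(X) @ W_head).argmax(axis=1) == y).mean()
```

Because only `W_head` (32 × 7 parameters) is updated, training converges in a few hundred steps even on a tiny dataset; fine-tuning a real pretrained CNN follows the same pattern, with the backbone weights frozen and only the replaced classification layer trained.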

**7. Conclusion**

CNNs have developed into mature techniques that are increasingly applied in image recognition. Compared with other algorithms, they considerably reduce the computational complexity of the analysis while significantly improving precision. At the same time, the high fault tolerance of CNNs allows the use of incomplete images or images with fuzzy backgrounds, thereby effectively enhancing the precision of image recognition.

Feature extraction is an important step in image classification and directly affects classification accuracy. Two feature extraction methods and three classifiers were therefore compared in the present manuscript on their ability to identify seven tea leaf diseases. These analyses revealed that LeafNet yielded higher accuracies than the SVM and MLP classification algorithms, so CNNs have clear advantages for identifying tea leaf diseases. Importantly, the results of the present study highlight the feasibility of applying CNNs to the identification of tea leaf diseases, which would significantly improve disease recognition in tea plant agriculture. Although the classification accuracy of LeafNet was not 100%, the method can be improved in future studies to provide more efficient and accurate guidance for the control of tea leaf diseases.

In this manuscript, the expansion of the sample data was a time-consuming process, but with the continuous growth of network information resources, the

**Acknowledgements**

We acknowledge funding support from the Key R&D Projects of Ningxia Hui Autonomous Region (2017BY080), the National Natural Science Foundation of China (M1942001), and the Natural Science Foundation of Inner Mongolia Autonomous Region (2019MS08168).

### **Conflict of interest**

The authors declare no conflict of interest.
