Document Type: Research Paper

Authors

1 Faculty Member, Department of Agricultural Engineering, Faculty of Shahriar, Technical and Vocational University

2 Specialist, Department of Agricultural Engineering, Faculty of Shahriar, Technical and Vocational University

DOI: 10.22059/jap.2022.342545.623690

Abstract

This study was performed to provide an intelligent and rapid assessment of colony status in terms of honey production efficiency during the foraging period, and to present a method based on a machine vision system. Using a deep learning approach, first the comb frame and then the geometric, textural, and color patterns of honey were identified, after which the percentage of honey area was calculated. To do this, an imaging test of bee colonies was designed and performed with a digital camera in such a way that different cell states were present on the combs. In the image analysis stage, a convolutional neural network with the YOLOv5 algorithm and a semantic segmentation method were used. The results showed that the proposed intelligent system is able to detect the comb frame against the surrounding environment of the image with an accuracy of more than 88%. Honey-related areas in each comb were also identified with approximately 83% accuracy and about 240 times faster than an expert beekeeper; these results were confirmed against manual counting by a skilled beekeeper. Owing to the increase in estimation speed, the reduction of human error, and the consequent reduction of disruption to colony activity, the proposed method can be a suitable alternative to the traditional framing technique used for regular inspections and evaluation of honey production efficiency.
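The two-stage pipeline summarized above (YOLOv5 detection of the comb frame, followed by segmentation of honey areas and computation of the honey-area percentage) can be illustrated with a minimal Python sketch. This is not the authors' code: the checkpoint name comb_frame.pt, the example image path, and the HSV color threshold standing in for the trained segmentation network are illustrative assumptions only.

```python
# Illustrative sketch (not the authors' implementation): comb-frame detection
# with a custom-trained YOLOv5 model, then a honey-area percentage estimate
# from a segmentation mask of the cropped frame.
import cv2
import numpy as np
import torch

# Assumption: "comb_frame.pt" is a YOLOv5 checkpoint trained to localize the
# comb frame, loaded through the standard ultralytics/yolov5 torch.hub entry.
model = torch.hub.load("ultralytics/yolov5", "custom", path="comb_frame.pt")

def honey_area_percentage(image_path: str) -> float:
    """Return the estimated share of the comb frame covered by honey (%)."""
    img = cv2.imread(image_path)                  # BGR photo of the hive frame
    detections = model(img[..., ::-1]).xyxy[0]    # RGB in; rows: x1,y1,x2,y2,conf,cls
    if len(detections) == 0:
        raise ValueError("No comb frame detected in the image")

    # Keep the highest-confidence comb-frame box and crop it out of the image.
    x1, y1, x2, y2 = detections[detections[:, 4].argmax(), :4].int().tolist()
    crop = img[y1:y2, x1:x2]

    # Stand-in for the paper's semantic-segmentation step: a simple HSV
    # threshold marks honey-colored pixels here, whereas the study used a
    # trained segmentation network whose details are not given in the abstract.
    hsv = cv2.cvtColor(crop, cv2.COLOR_BGR2HSV)
    honey_mask = cv2.inRange(hsv, (10, 60, 60), (35, 255, 255))

    return 100.0 * float(np.count_nonzero(honey_mask)) / honey_mask.size

print(f"Honey area: {honey_area_percentage('frame_photo.jpg'):.1f}%")
```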

Keywords
