World Scientific News
EISSN 2392-2192
A comprehensive review of smart animal husbandry: Its data, applications, techniques, challenges and opportunities

Authors: Rotimi-Williams Bello, Ahmad Sufril Azlan Mohamed, Abdullah Zawawi Talib

Published: 2024-01-02

ABSTRACT

Smart systems have profoundly changed the traditional methods of animal husbandry, the practice of raising livestock within agriculture. Modern technologies such as machine learning models make productive and competitive animal husbandry possible. These technologies enable the collection of extensive amounts of smart animal husbandry data, which can be employed for day-to-day animal measures such as morphological, physiological, phenological and other related measures. This paper focuses on three of the most important aspects of modern-day smart animal husbandry. First, it emphasizes animal measures as big data. Second, it presents comprehensive practical applications of animal measures in smart animal husbandry. Third, it discusses the mainstream machine learning techniques employed in smart animal husbandry analysis. In doing so, it identifies some of the prevailing challenges and prospective opportunities. To the best of our knowledge, no existing paper has reviewed smart animal husbandry as comprehensively, covering its applications and techniques together with the prevailing challenges and prospective opportunities. The animal varieties considered in this survey are cattle, goats and pigs.
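As a minimal illustration of how collected animal measures can feed a mainstream machine learning technique, the sketch below classifies animals from toy feature vectors using a k-nearest-neighbour rule. The feature names and all data values are invented for illustration only; they are not drawn from the paper, which surveys far richer sensor- and image-based measures.

```python
import math

def euclidean(a, b):
    # Straight-line distance between two feature vectors.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def knn_predict(train, query, k=3):
    # train: list of (feature_vector, label) pairs; query: a feature vector.
    # Pick the k closest training examples and return the majority label.
    neighbours = sorted(train, key=lambda item: euclidean(item[0], query))[:k]
    labels = [label for _, label in neighbours]
    return max(set(labels), key=labels.count)

# Hypothetical morphological measures: (body length m, hip height m, heart girth m).
train = [
    ((2.4, 1.4, 2.0), "cattle"),
    ((2.5, 1.5, 2.1), "cattle"),
    ((1.1, 0.7, 0.9), "goat"),
    ((1.2, 0.8, 1.0), "goat"),
    ((1.5, 0.8, 1.3), "pig"),
    ((1.6, 0.9, 1.4), "pig"),
]

print(knn_predict(train, (2.45, 1.45, 2.05)))  # prints: cattle
```

The same pattern scales up in practice: the hand-typed tuples are replaced by measures streamed from sensors or extracted from images, and the nearest-neighbour rule by the deep models discussed in the survey.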

 



WSN 181 (2023) 68-98


 

Tags: agriculture, Animal measures, Deep learning, Sensor, Smart animal husbandry