A Review on Computer Vision Technology for Monitoring Poultry Farm—Application, Hardware, and Software
ABSTRACT The productivity and profitability of poultry farming are crucial to keeping poultry affordable and supporting food security. Productivity is measured through criteria such as the Feed Conversion Ratio (FCR), while sound economic management underpins profitability. Best management practices therefore need to be implemented throughout the growth period to optimize poultry performance. This review provides a comprehensive overview of computer vision technology for poultry industry research. It relies on several online databases to identify key works in the area of computer vision in poultry farms. We narrowed our search to four keywords, 'computer vision' and 'poultry' or 'chicken' or 'broiler', for papers published between 2010 and early 2020 with open access provided through Universiti Teknologi Malaysia. All selected papers were manually examined and sorted to determine their relevance to computer vision in poultry farms. We focus on the latest developments in the hardware and software used to analyze poultry data, with examples from representative studies on poultry farming. Notably, the hardware can be classified into camera types, lighting units, and camera position, whereas the software can be categorized into data acquisition and data analysis software, together with the data processing and analysis methods implemented in them. The paper concludes by highlighting future work and the key challenges that must be addressed to assure the quality of this technology prior to its successful deployment in the poultry industry.
productivity. Meanwhile, FCR is the ratio of feed intake (feed usage) to live weight and provides a benchmark of management performance (productivity), as well as of the profitability of a given feed cost [7], [8]. A productivity improvement will consistently increase profit through its effect on the way inputs are transformed into outputs: more output (revenue) is produced from the same inputs (same costs).

FCR is the current benchmark for poultry productivity: a lower FCR indicates improved animal performance and welfare and a reduced impact on the environment, and suggests that broilers may have improved digestion or metabolism of nutrients and may utilize absorbed nutrients efficiently [9]. The current FCR benchmark in Malaysia is 1.67, which is a competitive ratio in the poultry industry [3]. A typical modern bird weighs about 2.5 kg at day 39 with a feed conversion ratio of 1.6 [3], meaning 1.6 kg of feed per kilogram of broiler body weight gain. The formula used to calculate FCR [10] is shown below:

FCR = Feed intake (g) / Body weight (g)    (1)

The equation divides the total feed intake of the flock by the live weight measured at the broiler house to determine the on-farm FCR [10]. Any factor that leads to an over-estimation or artificial increase in feed consumption, or in the estimated weight, will result in an unrealistically high FCR.
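As a concrete illustration of (1), the on-farm calculation can be scripted in a few lines; the flock totals below are hypothetical, and the 1.67 benchmark is the Malaysian figure cited above.

```python
def flock_fcr(total_feed_kg: float, total_live_weight_kg: float) -> float:
    """On-farm FCR: total feed intake divided by total live weight, Eq. (1)."""
    return total_feed_kg / total_live_weight_kg

# Hypothetical flock: 4,000 kg of feed consumed, 2,500 kg of live weight produced.
fcr = flock_fcr(4000.0, 2500.0)
print(f"FCR = {fcr:.2f}")  # 1.60, i.e. 1.6 kg of feed per kg of body weight
print("meets benchmark" if fcr <= 1.67 else "above the Malaysian benchmark (1.67)")
```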
The conversion of feed to live weight is a complex process, and the cause of a low or high FCR is often multifactorial, because small changes in FCR can have a substantial impact on overall performance efficiency [11]. The key to preventing FCR problems is to ensure that good management practices have been implemented throughout the growth period to optimize broiler performance.

The drive towards a reduced FCR motivates farmers to monitor performance better and to understand the development of their animals. Over the past decades, a variety of classification and detection methods have been developed in poultry farming, including acoustic resonance [12]–[16], robotics [17], remote sensing [18], Wireless Sensor Networks (WSNs) [19]–[26], and computer vision [27]–[65]. It should be noted that this review highlights the computer vision component in poultry farming; research on other automation technologies without image sensing or computer vision is therefore not discussed in depth.

A review on the application of computer vision in poultry farms has been published [66], but it focused only on image analysis for the imaging technologies that exist. Unlike our review, it offers no in-depth discussion of machine learning or deep learning techniques and no summary of the hardware and software used in computer vision systems. Table 1 lists the comparisons between previous review papers and our review. Based on Table 1, it can be seen that previous reviews focus more on the data processing and analysis methods used in poultry farms than on the hardware and software. The contributions of this review paper can be listed as follows:

1) The applications of computer vision, as well as the hardware and software that have been used in poultry farms, are critically reviewed.
2) The future potential and limitations of implementing computer vision techniques in poultry farms are also briefly discussed.
3) Comparisons of our review paper with previous reviews are made based on hardware and software parts, applications, challenges, and future enhancements.
4) The goal of this review is to help readers understand, and remind them of, the shortcomings of poultry farming in light of the more advanced development of computer vision in recent years, so that they can consider the possible applications and trends of using computer vision techniques in poultry farming.

This review paper is organized as follows. Section II provides an overview of computer vision technology for poultry farms, focusing on representative studies. Sections III and IV explain in detail the hardware and software tools used in computer vision in poultry farms. Next, Section V discusses the challenges and future research needs thoroughly. Finally, Section VI concludes the paper.

II. OVERVIEW OF COMPUTER VISION IN POULTRY FARM

Computer vision has been widely used in various processes of different poultry production systems. These include automation of house management, behavior, and welfare [11], [29]–[40], disease detection [28], [41]–[47], weight measurement [27], [48], [49], the slaughtering process [50], [51], carcass quality [52]–[55], and egg examination [56]–[65]. Computer vision is also popular in monitoring other livestock, such as pigs [72]–[79], sheep or cattle [80]–[83], and fish [84].

Computer vision can be defined as an interdisciplinary scientific field that deals with how computational models can be made to gain a high-level understanding from digital images or videos, so as to build autonomous systems that do what the human visual system can do [87]. Combining machine learning or deep learning with computer vision has enabled computers to better understand what they see and, as a result, has bolstered developments in computer vision. To simplify detection, pattern recognition, and prediction, computer vision methods have been developed that automatically extract complex features not designed by human engineers, learning end-to-end from large amounts of training data.

Fig. 1 illustrates computer vision in poultry farming comprehensively. Computer vision is mainly composed of two components, the hardware and the software part. The hardware part can be narrowed down to the camera and the light source (Fig. 1a), while the software part is further classified into data acquisition and data analysis software (Fig. 1b). Furthermore, the data processing and analysis methods (Fig. 1c and 1d) are the various algorithms applied in the data analysis software.
Generally, computer vision works in three basic steps: (1) acquiring an image, (2) processing the image, and (3) understanding the image. Hardware plays a crucial part in obtaining a high-quality image: the choices of camera, additional lighting units, and high-specification computer or mobile station all affect the image features achieved. Next, data acquisition software is needed to amass or store the images captured by the camera for the analysis step. The raw images are then processed using selected data pre-processing and segmentation algorithms. Finally, data analysis algorithms are adopted to evaluate the image data according to the purposes of the study. Both the data processing and the analysis methods can be conducted using the data analysis software. The details of each part are discussed further in the next section.
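The three steps can be sketched as a minimal OpenCV loop. This is an illustrative skeleton, not any reviewed system: the camera index and the placeholder analyse() step are assumptions.

```python
import cv2

# (1) Acquire an image: grab a single frame from camera index 0.
cap = cv2.VideoCapture(0)
ok, frame = cap.read()
cap.release()
if not ok:
    raise RuntimeError("no frame captured")

# (2) Process the image: grayscale conversion and noise suppression.
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
smooth = cv2.GaussianBlur(gray, (5, 5), 0)

# (3) Understand the image: hand the cleaned frame to the segmentation and
# analysis algorithms described in Section IV (placeholder shown here).
def analyse(image):
    return {"mean_intensity": float(image.mean())}

print(analyse(smooth))
```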
III. HARDWARE

In the poultry industry, the rapid development of computer vision is inseparable from the hardware aspect of computer vision. In general, a traditional computer vision system consists of a camera with a lens, a lighting unit, a motor-driven mobile station, and a computer packed with image acquisition and analysis software [53], as shown in Fig. 1. Commonly, the camera and lighting units operate at the early stages of computer vision for image acquisition, while the computer or other analysis hardware is used throughout the analysis process to generate the desired findings. Table 2 summarizes the hardware used in computer vision for poultry farms; it can be seen that the uses of the individual hardware parts differ according to the research purposes.

This section on hardware can best be treated under three headings: (1) the various types of cameras and lenses, (2) lighting units, and (3) camera position in the poultry farm.

A. CAMERA AND LENS

The camera is one of the core components of computer vision systems. Various types of cameras have been used in poultry farms, depending on the purpose. The particular imaging methods used massively in poultry farms include the visible light camera [11], [27]–[29], [34]–[39], [41], [44], [51], infrared [34], [36], thermographic [36], depth [41], [48], [59], and hyperspectral cameras [61], [62]. While an ordinary visible-light camera captures light across three wavelength bands in the visible spectrum, namely red, green, and blue (RGB), infrared, thermographic, depth, and hyperspectral imaging encompass a wide variety of spectra beyond RGB. The types of imaging methods used in poultry farms can be detailed as follows:
1) VISIBLE LIGHT DIGITAL CAMERA

A visible-light digital camera is a standard digital camera used for taking photos or videos in visible light [86]. It is a very cost-effective solution, particularly because it allows wide areas of the farm to be covered, and hence a lower number of cameras is necessary [36]. These benefits make most researchers favor this type of camera over other types [11], [27], [28], [30], [34]–[39], [41], [44], [51]. Several studies have revealed that this type of camera is capable of capturing images for various purposes. For example, it has been operated to detect sick broilers [28], [41], to monitor broiler behavior and movement [11], [30], [34]–[39], [41], [44], and to predict the live weight of broilers [27].

However, a few major drawbacks of this camera are lens compatibility and resolution. The evidence of this drawback can be clearly seen in the case of biomechanical analysis during feeding [11], [38]. For this type of experiment, a high-speed camera with a special lens (50 mm/F1.4) capturing around 250-300 frames per second (fps) is needed, owing to the high speed of mandibulation, which consists of a cycle of opening and closing the beak during feed grasping. In contrast, a standard digital camera suffices to detect sick broilers or to assess the welfare of the whole poultry house. For instance, the authors in [42] adopted an affordable Logitech webcam with 30 fps to detect sick broilers. Surprisingly, they could still achieve a high accuracy rate for differentiating sick and healthy broilers. Therefore, it seems that in order to achieve the best image quality for ease of further analysis, a suitable camera must be chosen according to the research purpose.

In addition, the low contrast between the animals and the bedding, as well as the lack of results under low light intensity conditions, makes this type of camera unsuitable in principle for the target application [36]. Hence, extra lighting units are sometimes needed to increase or maintain the light intensity in a poultry farm.

2) INFRARED (IR) IMAGING

IR imaging is a relatively new technology that is growing in popularity. The IR camera uses a technology that gathers and measures a beam of IR light waves through the radiant heat emitted from an object and then converts it into an image [87]. Three regions of the IR electromagnetic spectrum are defined: (1) near-IR (780-2,500 nm), (2) mid-IR (2,500-25,000 nm), and (3) far-IR (25,000-1,000,000 nm) [88]. Previous studies evaluating IR cameras observed either poultry activity monitoring or tracking [34]–[36]. The IR camera shares a shortcoming with the visible light digital camera in that it still needs enough light and contrast to create usable images [36]. Therefore, additional IR light is needed to overcome the contrast limitation. Furthermore, it has been stated by [36] that IR light sources are non-invasive towards poultry eyes.

3) THERMOGRAPHIC CAMERA

The thermographic camera usually detects radiation in the long-infrared range of the electromagnetic spectrum, roughly 9,000 to 14,000 nanometers [89]. The image produced from that radiation is called a thermogram. The amount of radiation emitted by an object increases with temperature; hence, thermography allows one to see variations in temperature. Thermal cameras measure the absolute temperature of the object. The advantage of this type of imaging is its ability to work in complete darkness, as its operation does not depend on the presence of light [36]. Hence, this camera is better suited to differentiating between the broiler body and the background images. When viewed through a thermal imaging camera, warm objects stand out well against the environment, day or night. This camera is usually a non-invasive, non-contact technology and uses no harmful radiation [89], [90]. However, the major drawbacks of thermographic cameras are that they are costly and cover a relatively small area [36]. In general, a typical poultry farm is around 100 m long and 40 m wide [36]; hence, a large number of nodes is needed, and it is therefore likely that each node must be low-cost for the resulting whole system to be affordable.
4) DEPTH CAMERA

A depth camera employs structured light or Time of Flight (ToF) techniques, judging the depth and distance of an object by measuring the round-trip time of light, in order to measure distances within a complete scene with a single shot [91]. In effect, it counts the amount of time it takes for a reflected beam of light to return to the camera sensor. In addition, this type of camera uses only a small amount of processing power, since extracting the distance information from the output signals of the ToF sensor is a direct process. Besides, this type of imaging is also suited to low light and gives a wide field of view [39]. Many researchers have utilized depth cameras to measure the volume of a targeted object. This is exemplified in the work undertaken by [41], [48], [59] to measure broiler weight, detect sick broilers, and estimate egg volume.

In addition, the depth images created can be used for features that relate to three-dimensional (3D) properties such as volume, width, and height, as used in [48]. This can provide an extra benefit for final weight prediction. Interestingly, this technique obtained high accuracy (92.2%) between the predicted weights and the reference weights. In contrast to Mortensen et al. [48], however, Okinda et al. [41] proved that volume estimation can be accurate and efficient using only two-dimensional (2D) features. It is also less complicated to pre-process 2D images compared to 3D images.

Another problem with this camera is that it is limited to indoor applications [91], [92]. Structured light is easily affected by strong natural outdoor light such as sunlight, which causes the projected coded light to become submerged and unusable [91]. Therefore, this type of camera can only be used in closed poultry house systems rather than open poultry house systems.
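The ToF principle described above reduces to a one-line computation: the measured round-trip time is halved and multiplied by the speed of light. A minimal sketch, with a hypothetical 20 ns round trip:

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(round_trip_seconds: float) -> float:
    """Distance from a time-of-flight measurement: the light travels out
    and back, so the one-way distance is half the round trip."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A hypothetical 20 ns round trip corresponds to roughly 3 m, the order of
# magnitude of a camera mounted above a broiler pen.
print(f"{tof_distance(20e-9):.2f} m")  # ~3.00 m
```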
5) HYPERSPECTRAL IMAGING

Hyperspectral imaging (HSI) is a non-destructive imaging technique that combines conventional imaging and spectroscopy to obtain both spatial and spectral information about an object [62]. HSI has been widely used in the poultry industry, mainly in food safety and quality [68], [92], [93], egg inspection [61], [62], microbiological contamination [89], and food fraud [69], [96].

Having a higher level of spectral detail gives a better capability to see the unseen. Therefore, this type of imaging is suitable for very detailed, high-throughput online monitoring of poultry products. Falkovskaya and Gowen [69] have provided an in-depth analysis of HSI studies on poultry products. In their review, they identify five major areas of published HSI studies, namely fecal matter, microbiological contamination, product quality, physical defects, and food fraud, and discuss the methods of each study thoroughly.

The main limitation of HSI, however, relates to complexity and storage. Hyperspectral data can impose a very large computational load, since they are multidimensional and highly redundant [93]. Furthermore, this type of camera is usually expensive.

Fig. 2 illustrates the various camera types discussed earlier, and their applications, across the electromagnetic spectrum. The camera types used in poultry farms usually operate in the visible light and infrared wavelength spectra. The infrared wavelength spectrum can be further narrowed down into Near-Infrared (NIR), Short-wavelength Infrared (SWIR), Mid-wavelength Infrared (MWIR), Long-wavelength Infrared (LWIR), and Far Infrared (FIR). The majority of cameras used in poultry farms operate in the infrared electromagnetic spectrum at different ranges. Depth cameras and hyperspectral imaging usually detect radiation in the 0.35 µm to 1.7 µm range [96]–[98]. A thermographic camera detects radiation in the long-wavelength infrared (LWIR) between 9 µm and 14 µm [89].
B. LIGHTING UNITS

Another important core component of a computer vision system is the lighting unit. After being applied to the object to be detected, the light produced by the illumination device acts as a carrier of physical information and is then projected onto the sensor array of the camera by the beam-splitting element [50].

Light is a key aspect in creating a remarkable image; the efficiency and reliability of the lighting unit therefore directly affect the quality of the image. It determines not only the brightness or darkness of the environment, but also the tone, mood, and atmosphere. Hence, in order to obtain the best image quality, it is crucial to control and manipulate light correctly to obtain better texture, vibrancy of colours, and luminosity on the objects. If the light intensity is low, it will create weak contrast, in which the image contains only a small difference in brightness between the objects and the background areas. This phenomenon leads to noise and makes image analysis more difficult.

At present, two types of lighting units are widely used in computer vision in poultry farms: halogen lamps [70] and Light-Emitting Diodes (LEDs) [61]. Halogen lamps are incandescent lamps filled with halogen or halide gas [70]. In the wavelength range from visible light to infrared, the emitted light forms a smooth, continuous spectrum with no sharp peaks [68]. A halogen lamp also has a luminous efficiency greater than that of conventional bulbs. Furthermore, the halogen cycle ensures constant lighting and a long life, four times the life of ordinary light bulbs.

Halogen lamps, however, often have major deficiencies, such as high heat output, temperature changes that cause spectral peak shifts, and vibration sensitivity. Hence, many researchers tend to implement LEDs in research on computer vision in poultry farms. The advent of LED lighting has brought a new and far better option for illuminating poultry houses [100].

LEDs have low energy consumption, low heat output, and are robust and durable. According to particular requirements, they may also be composed in various structures, such as point sources, line sources, and ring light sources. The wavelength range of LEDs is limited, however; halogen lamps with a large wavelength range remain irreplaceable at this stage.

When managing the effectiveness of lighting units to obtain the best image quality, it is also crucial to consider the effects of the lighting unit on the poultry. It has previously been observed that blue light has a calming effect on poultry, while red light reduces cannibalism and feather pecking [100]. To date, several studies have investigated the effects of light on poultry health and welfare [100]–[102]. Although LEDs offer a great benefit for efficient lighting, resolving this trade-off is neither unique nor simple.

To determine the effects of light on the behavior, welfare, and performance of broilers, Riber [101] compared two types of LEDs with different color temperatures, measured in Kelvin (K). The 4,100 K light is known as 'neutral-white', while the 6,065 K light is known as 'cold-white'. The 'cold-white' light contains more wavelengths from the blue part of the spectrum than the 'neutral-white' light. It was observed that the broilers spent more time in the 'cold-white' light and performed more relaxed behavior in that compartment. In addition, the 'cold-white' light also improved the final weight of the broilers and the yield of breast muscle tenders without negative impacts on the measured welfare parameters such as lameness and dermatitis.
C. CAMERA POSITION

When the specification and the relative location of the camera are modified, data collection from the same sample is affected differently. Therefore, computer vision systems in poultry farms generally have two acquisition modes, namely (i) top-view and (ii) side-view camera positions.

Not all researchers take precautions over the position of the camera above the floor. This is important because the image dimensions can vary with the position of the broilers below the camera. However, a careful camera setup can overcome the impact of deviations caused by different angles between the camera and the chicken [27].

Top-view camera imaging has been known as the least disturbing for the animals, and it produces the most useful data [35]. The majority of researchers [27], [35], [38], [41], [48] used top-view camera positioning for broiler detection. The difference between the studies is the distance of the camera from the targeted broilers. Up to now, far too little attention has been paid to this issue, and the ideal distance needed to achieve the best image quality remains unclear.

For measuring broiler weight, Mollah et al. [27] emphasize that with a camera height of 1 m above the floor, covering an area of 1 m², broiler weight can be estimated to obtain more accurate mean weights and weight distributions. Moreover, the results of this study indicate a high predictive value (R² = 0.99). Meanwhile, Amraei et al. [49] achieved a lower value (R² = 0.98) with a camera height of 2 m above the floor. On the other hand, Kashiha et al. [35] and Fernández et al. [31] mounted the camera 5 m above the floor, covering an area of 70 m², to monitor the welfare status of the whole broiler flock in relation to health and management problems.
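The trade-off between mounting height and covered floor area follows simple geometry: for a fixed lens, the imaged side length grows linearly with height and the area with its square. The sketch below assumes an illustrative square field of view of about 53°, chosen so that a 1 m height covers roughly the 1 m² reported by Mollah et al. [27]; actual lenses (and the 70 m² at 5 m reported in [35], [31]) will differ.

```python
import math

def covered_area_m2(height_m: float, fov_deg: float) -> float:
    """Floor area seen by a downward-facing camera with a square field of
    view: the side length grows linearly with height, the area with its
    square."""
    side = 2.0 * height_m * math.tan(math.radians(fov_deg) / 2.0)
    return side * side

# Illustrative ~53 deg lens covering ~1 m^2 at 1 m height; at 5 m the same
# lens would already cover ~25 m^2.
for h in (1.0, 2.0, 5.0):
    print(f"{h:.0f} m -> {covered_area_m2(h, 53.13):.1f} m^2")
```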
IV. SOFTWARE

Software development has also played a key role in computer vision strategies, in addition to hardware. In particular, the software for such applications comprises data acquisition software and data analysis software. Data acquisition software plays an important role in storing and selecting the best-quality images captured by the camera, whereas data analysis software is the tool used to conduct image analysis using a selected algorithm. According to Table 3, eYeNamic (Fancom BV) is the data acquisition software most used by researchers, while MATLAB (MathWorks, Inc.) is the most used data analysis software, as shown in Table 4. The section below describes in detail the data acquisition and data analysis software, as well as the data processing and analysis methods used in these two software types.
1) IMAGE PRE-PROCESSING

Image pre-processing is the process of converting raw image data into a presentation suitable for an application through several sequences of operations. The main purpose of this process is to enhance the image quality for the segmentation step. Many image pre-processing methods have been used in research, including dilation and erosion [37], Otsu's method [37], [41], [44], image thresholding [59], [62], [57], Gaussian filtering [56], [57], [86], and binarization [41], [62].
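A minimal OpenCV sketch of the pre-processing chain named above (Gaussian filtering, Otsu thresholding/binarization, and erosion-dilation); the input file name and kernel size are assumptions.

```python
import cv2
import numpy as np

# Hypothetical top-view frame; any grayscale image works.
raw = cv2.imread("pen_frame.png", cv2.IMREAD_GRAYSCALE)
if raw is None:
    raise FileNotFoundError("pen_frame.png")

# Gaussian filter suppresses sensor noise before thresholding.
smooth = cv2.GaussianBlur(raw, (5, 5), 0)

# Otsu's method picks the binarization threshold automatically from the
# histogram, separating the brighter birds from the darker litter.
_, binary = cv2.threshold(smooth, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# Erosion followed by dilation (morphological opening) removes speckle
# while keeping the broiler blobs close to their original size.
kernel = np.ones((5, 5), np.uint8)
clean = cv2.dilate(cv2.erode(binary, kernel), kernel)
cv2.imwrite("pen_mask.png", clean)
```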
2) IMAGE SEGMENTATION

Image segmentation is the most difficult task mentioned by some of the researchers [42], [47], [48]. It is the process of forming connected objects with relatively homogeneous properties by grouping related pixels together, or of partitioning an image into multiple segments with similar attributes [42], [69], [82]. In practical implementations, the separation of the poultry from the background image is inevitable, and failure to segment the image correctly decreases the robustness and precision of the final model. Therefore, appropriate techniques are required to obtain meaningful segments. At present, common image segmentation techniques include the Watershed algorithm [48], the Ellipse model [42], Mean-shift clustering [52], and K-Means clustering [42].
The Watershed algorithm is widely used for segmenting grayscale images, as it partitions the image into segments by extracting their contours. As the images captured by [48] were 3D depth images, a height function was defined to obtain an artificial depth image with local minima at the objects of interest. A flooding technique [104] was then implemented to incrementally flood the regions surrounding the local minima until the regions meet; once they meet, a watershed line separating the two regions is created. A Gaussian kernel was used to prevent over-segmentation of the image. In this methodology, the authors of [48] preserved the local minima from the broiler bodies by using a morphological opening with a circular structuring element. To differentiate the foreground from the background (e.g., the floor), any segment located less than 2 cm above the floor was discarded.
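The flooding scheme can be illustrated with scikit-image. Note that this is a generic 2D analogue rather than the exact depth-based pipeline of [48]: here the 'height function' is the negative distance transform of a binary mask, and the 20-pixel seed spacing is a tuning assumption.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

def split_touching_birds(binary_mask: np.ndarray) -> np.ndarray:
    """Separate touching blobs: flood a height function (the negative
    distance transform) from one marker per blob and cut the mask along
    the watershed lines where the flooded regions meet."""
    distance = ndi.distance_transform_edt(binary_mask)
    # One seed per local maximum, at least 20 px apart (tuning assumption).
    peaks = peak_local_max(distance, min_distance=20, labels=binary_mask)
    markers = np.zeros_like(binary_mask, dtype=int)
    markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
    return watershed(-distance, markers, mask=binary_mask)
```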
Mean-shift clustering is a traditional density gradient estimation algorithm based on a non-parametric feature space. According to [52], the Mean-shift algorithm has made a significant contribution to verifying halal status based on slaughtering in the Islamic way, through its use of the U and V features in the LUV color space (L stands for luminance; U and V represent the chromaticity values of the color image). However, this result needs to be verified against a larger database of poultry images using different parts of the body.

The Ellipse model has been implemented in poultry studies related to poultry tracking [32], [34], [39] and early sickness warning systems [28], [42]. To estimate the target, this approach uses the current color feature prototype so that it can separate the poultry target from the background in a new frame. Zhuang et al. [42] explained that the Lab (CIE L*a*b*) color space features are clearly visible and clustered compared to the HSV (Hue, Saturation, Value) color space. After the color characteristics are collected, each pixel whose Lab color characteristics fall within the elliptical column is retained and evaluated, while the other pixels are removed. A histogram model is then developed to examine each extracted poultry color feature and the proportions of the different color features. Finally, using the Bhattacharyya distance, the newly segmented outline is compared with the original; if the gap is too large, the outline is discarded, and in this way the poultry can be segmented more precisely. However, problems arise because the segmentation of the edges is not optimal, and if the feather colors are complex, the outcome can be worse [27].

A variation of the Ellipse model with K-Means clustering was then used by Zhuang et al. [42]. K-Means is an unsupervised clustering method which uses K means to represent the data distribution. The combination of K-Means clustering with the ellipse model compensates for the limitations of the edge segmentation, making the segmentation more precise.
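The histogram-plus-Bhattacharyya verification step described above can be sketched with OpenCV; the 32-bin a*/b* histogram and the 0.4 acceptance gap are illustrative assumptions, not values from [42].

```python
import cv2
import numpy as np

def lab_histogram(bgr_patch: np.ndarray) -> np.ndarray:
    """2D histogram over the a*/b* chroma channels of CIE L*a*b*,
    normalised so that patches of different sizes compare fairly."""
    lab = cv2.cvtColor(bgr_patch, cv2.COLOR_BGR2Lab)
    hist = cv2.calcHist([lab], [1, 2], None, [32, 32], [0, 256, 0, 256])
    return cv2.normalize(hist, hist).flatten()

def outline_matches(reference_patch, candidate_patch, max_gap=0.4) -> bool:
    """Keep the newly segmented outline only if its colour distribution
    is close (small Bhattacharyya distance) to the original target's."""
    gap = cv2.compareHist(lab_histogram(reference_patch),
                          lab_histogram(candidate_patch),
                          cv2.HISTCMP_BHATTACHARYYA)
    return gap <= max_gap
```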
3) FEATURE SELECTION AND EXTRACTION

After the images have been pre-processed and segmented, the selected features of the images are extracted. An early work by [27] took morphological features such as the age, area, perimeter, and volume of the poultry. Age was used as a covariate variable in [27] to examine the relationship between manually measured body weight and the number of surface-area pixels in the image. Mortensen et al. [48] added that the food, water supplies, and circadian rhythm of the broilers were heavily controlled to obtain the optimal growth pattern and reach the target weight at the end of the rearing period, which is why age was added as a feature. Features such as area are mostly used to estimate broiler body size: the area (A) was calculated by summing the pixels within a contour constituting a broiler [29]. The perimeter, or edge detection, has been used with great success as a weight predictor for broilers: the perimeter (P) was calculated by summing the pixels, different from one, that constitute the contour [24]. Volume was used as a 3D feature in [34], where two different approaches were used for estimating the volume: (1) numerical integration and (2) the convex hull.

Recently, various features have been added to enhance the accuracy of poultry detection and prediction. Eccentricity [48] is the deviation of a curve or orbit from circularity. It was added because younger broilers tend to have a more elongated shape, which leads to high eccentricity, while older broilers have a rounder shape, which causes low eccentricity. As broilers grow, they increase in both length and width. However, the length can be affected by the broiler's head movement as it walks and pecks, and the width can be affected by the broiler flapping its wings, although the width is less affected than the length by the bobbing of the head. Hence, the width was calculated along the minor axis of the segmented broilers, since broilers are elongated in shape. The back height was defined as the difference between the average depth value of the contour of the segmented broiler and the depth value on top of its back [48].

Concavity [42] can also be added as a feature, as it can contribute to differentiating between healthy and sick broilers as well as to calculating broiler weight. The skeleton is massively used as a feature for human pose estimation; the skeleton could likewise be used as a feature to distinguish between healthy and sick broilers.
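Several of these shape features (area as a pixel sum, perimeter, eccentricity, and width from the minor axis) can be read off directly from scikit-image region properties; the sketch below assumes a segmented binary top-view mask as input.

```python
import numpy as np
from skimage.measure import label, regionprops

def broiler_shape_features(binary_mask: np.ndarray) -> list:
    """Per-bird shape features from a segmented top-view mask: area as the
    pixel count inside the contour, perimeter along the contour,
    eccentricity of the fitted ellipse, and body width from its minor
    axis."""
    features = []
    for region in regionprops(label(binary_mask)):
        features.append({
            "area_px": region.area,
            "perimeter_px": region.perimeter,
            "eccentricity": region.eccentricity,   # ~1 elongated, ~0 round
            "width_px": region.minor_axis_length,  # width along minor axis
        })
    return features
```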
Apart from body segmentation, head segmentation is crucial for poultry detection. Head segmentation features can be divided into eye location, beak, and pecking judgement. Eye location is a parameter used by [11], as it contributes to the biomechanics attributed to broiler behavior during feeding. Beak tip detection was done by applying an algorithm that searches for the beak tips starting from the bottom left of the binary image.
4) IMAGE CLASSIFICATION

Image classification is an important task in computer vision, as it is used to identify the objects that appear in an image. The task consists of labelling input images with a probability for the presence of a particular visual object class. Furthermore, the ultimate aim of computer vision here is to construct machine learning or deep learning models that accurately approximate or distinguish a sample's characteristics. A correlation between the precisely measured properties of a sample and its spectral information is established to create the machine learning or deep learning model. The samples used to construct the model are usually of two types: (i) the calibration or training set and (ii) the validation or prediction set. The calibration set comprises representative specimens and is used to determine the parameters of the model. Common machine learning and deep learning modeling methods include the Support Vector Machine (SVM) [39], [41], [61], the Artificial Neural Network (ANN) [2], [48], [61], [62], [105], and the Convolutional Neural Network (CNN) [28].
SVM is a supervised learning algorithm typically used in statistical classification and regression studies, which simultaneously minimizes the empirical classification error and maximizes the geometric margin. It uses a hyperplane to separate the classes in the data [61]. However, since linear decision boundaries are not sufficient for many tasks, SVMs often use a kernel function that maps the features into a higher-dimensional space in which more complex decision boundaries can be represented linearly. In [39], the algorithm maps input vectors non-linearly into a high-dimensional feature space and uses the theory of structural risk minimization to find the maximum margin in that space, so that the health status of broiler chickens could be graded with 99.5% accuracy. In addition, an SVM with an RBF kernel performed best during health prediction in [40], with 97.5% and 97.8% accuracy, respectively.
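A minimal scikit-learn sketch of an RBF-kernel SVM of the kind described above; the synthetic two-class 'healthy/sick' features stand in for real image measurements.

```python
import numpy as np
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-ins for image features (e.g. area, perimeter, eccentricity):
# 0 = healthy, 1 = sick, drawn from two overlapping Gaussians.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (100, 3)), rng.normal(2.0, 1.0, (100, 3))])
y = np.array([0] * 100 + [1] * 100)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# The RBF kernel maps the features into a higher-dimensional space where
# the two health classes become linearly separable, as described above.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
clf.fit(X_tr, y_tr)
print("validation accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```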
An ANN is a supervised network typically defined by four parameters: (1) the interconnection pattern between different layers of neurons, (2) the learning process for updating the weights of the interconnections, (3) the activation function that converts a neuron's weighted input to its output activation, and (4) the training strategy and data processing capability. ANN function responses are determined by independent processing neuron units connected through a weighted network. An ANN is basically composed of three neuron layers, known as the input, hidden, and output layers. ANNs can be regarded as an alternative modeling approach to traditional statistics, particularly when considering the highly unstable, noisy, incomplete, imprecise, and qualitative natures that coincide with the features of poultry activities [2]. This technique usually results in high accuracies during detection. For example, in combination with a Bayesian network it achieved 92.2% accuracy [48]. With the Levenberg-Marquardt back-propagation type, this algorithm achieved 97.5% accuracy in the evaluation of 150 egg samples [62].
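An illustrative three-layer network (input, one hidden layer, output) trained by back-propagation can be set up with scikit-learn as below; note that its adam solver stands in for the Levenberg-Marquardt variant used in [62], which scikit-learn does not provide, and the features are synthetic.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in for image-derived egg/bird features.
rng = np.random.default_rng(1)
X = rng.normal(size=(300, 8))
y = (X[:, :2].sum(axis=1) > 0).astype(int)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)

# One hidden layer between the input and output layers, matching the
# three-layer structure described above; the interconnection weights are
# updated by back-propagation during fit().
ann = MLPClassifier(hidden_layer_sizes=(16,), activation="relu",
                    solver="adam", max_iter=1000, random_state=1)
ann.fit(X_tr, y_tr)
print("validation accuracy:", ann.score(X_te, y_te))
```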
The CNN is one of the most representative deep learning algorithms in digital image processing. The authors in [28] used CNNs in a comparison between the Single Shot MultiBox Detector (SSD) and the Improved Feature Fusion Single Shot MultiBox Detector (IFSSD). SSD showed good performance in detecting the location of broilers and simultaneously obtaining their health status, but has weak recognition ability for small targets and cannot delineate many distant broilers. Meanwhile, IFSSD proved able to classify the health status more accurately, achieving 99.7% accuracy in detecting sick broilers, compared to 98.7% for SSD.
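For scale, a deliberately small Keras CNN for two-class (healthy/sick) image patches is sketched below; SSD and IFSSD as used in [28] are full object-detection networks and are far larger than this illustrative classifier.

```python
import tensorflow as tf
from tensorflow.keras import layers

# Stacked convolution + pooling blocks learn the image features; the final
# dense layer outputs P(sick) for a 64x64 RGB patch.
model = tf.keras.Sequential([
    layers.Input(shape=(64, 64, 3)),
    layers.Conv2D(16, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
model.summary()
# model.fit(train_images, train_labels, epochs=10)  # given a labelled dataset
```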
V. KEY CHALLENGES AND FUTURE RESEARCH NEEDS

Although there is a considerable body of research on poultry management, several issues remain to be addressed. The main issue in focus concerns the productivity and profitability of a poultry farm. Many researchers have improved methods of measuring poultry welfare, especially in relation to health, behavior, weight, and growth.

A. KEY CHALLENGES IN POULTRY FARM

The section below describes the issues and challenges in poultry management, including the quality of the raw data, the precision of image segmentation, and the reliability of prediction or classification.

1) QUALITY OF RAW DATA

The first problem arises in data acquisition, where the quality of the raw data is questioned. Ensuring the quality of raw data is challenging because of physical actions that change posture and orientation, and because of the diversity in body dimension measurements, especially the frequency of head position shifts. Besides, images can be poor because of hyperactive broilers dust-bathing or stretching out their wings. Image dimensions also vary with the location of the chickens below the camera, feather level, lighting, and the threshold values of the image, as well as the distance between the camera position and the broilers [27], [48]. Because of these challenges, many researchers have tried to exclude the head and tail positions during the feature extraction phase [48], [106]. This leads to an underestimation of broiler body weight and behavior relative to the actual values.
2) PRECISION OF IMAGE SEGMENTATION

The next problem concerns the segmentation process and feature extraction. This is the most crucial part of image analysis for ensuring the accuracy of the calculations. The difference between the broiler as an object and the background is crucial in image segmentation, and segmentation is more difficult when multiple broilers must be segmented. Many researchers have tried to use a dark background [22] to minimize the noise during segmentation, and some have used external lighting units to overcome the contrast issues.

3) RELIABILITY OF PREDICTION

The third problem is the reliability of the method in providing adequate accuracy for any calculation in a poultry farm. Various algorithms deliver different performance and accuracy in computer vision technology, such as the combination of K-Means clustering with SVM, which showed 99.469% accuracy in determining broiler health [42], and the Watershed algorithm with a Bayesian Artificial Neural Network, with a relative mean error of 7.8% [34]. Besides, deep learning has been proven to show excellent results in the segmentation of difficult data, giving high precision in predicting broiler health using a Convolutional Neural Network with 99.7% accuracy. Hence, further analyses and comparisons between the algorithms need to be undertaken to ensure strong recognition ability for overlapping small targets while keeping them in focus simultaneously.
B. FUTURE RESEARCH NEEDS

As the technology continues to expand, a large-scale dataset must be established in order to achieve versatility and coordination of the technology. On the other hand, to ensure accuracy and robustness in the various complex situations of poultry farming, researchers should improve the accuracy of computer vision techniques in both hardware and software. The choice of camera type, light source, and camera mounting position is equally important to reduce image distortion while preserving image dimensions and quality. In addition, maintaining the hardware during long periods of detection is another difficulty that researchers will need to overcome in the future. Next, it is crucial to study more effective data processing and analysis methods to reduce the interference of useless data. Finally, with the rapid development of computer vision technology in poultry farming, this field will involve the integration of more disciplines, and the requirements for professionals, in terms of both quality and quantity, will continue to increase.

VI. CONCLUSION

In this review article, a comprehensive review of the application of computer vision in poultry farms has been provided. We have presented the latest developments in computer vision techniques through various representative studies, with a highlight on the hardware and software parts used in the systems. Various types of hardware and software elements have been discussed, and we have illustrated all the components of computer vision in a poultry farm. The goal of this study is to help readers understand the more advanced development of computer vision in recent years and to inform them of the limitations of poultry farms, so that they can recognize the possible applications and patterns of using computer vision techniques in poultry farming. Thus, the review we present should stimulate new lines of inquiry that will contribute to improving the productivity and profitability of poultry farms.

ACKNOWLEDGMENT

The authors would like to thank Universiti Teknologi Malaysia (UTM), and specifically Advanced Informatics, Razak Faculty of Technology and Informatics, for realizing and supporting this research work.
[17] G. Ren, T. Lin, Y. Ying, G. Chowdary, and K. C. Ting, ‘‘Agricul- [34] K. Chao, M. S. Kim, and D. E. Chan, ‘‘Control interface and
tural robotics research applicable to poultry production: A review,’’ tracking control system for automated poultry inspection,’’ Comput.
Comput. Electron. Agricult., vol. 169, Feb. 2020, Art. no. 105216, doi: Standards Interface, vol. 36, no. 2, pp. 271–277, Feb. 2014, doi:
10.1016/j.compag.2020.105216. 10.1016/j.csi.2011.03.006.
[18] J. K. Othman, J. R. Mahmood, and G. Y. A. Al- Emarah, ‘‘Design and [35] M. A. C. E. S. S. Kashiha Bahr Vranken Hong and D. Berckmans,
implementation of smart relay based remote monitoring and controlling ‘‘Monitoring system to detect problems in broiler houses based on image
of ammonia in poultry houses,’’ Int. J. Comput. Appl., vol. 103, no. 8, processing,’’ in Proc. Int. Conf. Agricult. Eng., Zurich, Switzerland, 2014,
pp. 13–18, Oct. 2014, doi: 10.5120/18093-9149. pp. 1–7.
[19] G. P. Raghudathesh, D. J. Deepak, G. K. Prasad, A. B. Arun, R. Balekai, [36] C. Gonzalez, R. Pardo, J. Farina, M. D. Valdes, J. J. Rodriguez-Andina,
V. C. Yatnalli, S. H. Lata, and B. S. Kumar, ‘‘Iot based intelligent poultry and M. Portela, ‘‘Real-time monitoring of poultry activity in breeding
management system using linux embedded system,’’ in Proc. Int. Conf. farms,’’ in Proc. 43rd Annu. Conf. IEEE Ind. Electron. Soc., Oct. 2017,
Adv. Comput., Commun. Informat. (ICACCI), Udupi, India, Sep. 2017, pp. 5–10, doi: 10.1109/IECON.2017.8216605.
pp. 449–454, doi: 10.1109/ICACCI.2017.8125881. [37] R. V. Novas and F. L. Usberti, ‘‘Live monitoring in poultry houses:
[20] R. B. Mahale and S. S. Sonavane, ‘‘Smart poultry farm: An A broiler detection approach,’’ in Proc. 30th Conf. Graph. Patterns
integrated solution using WSN and GPRS based network,’’ Images, 2017, pp. 216–222, doi: 10.1109/SIBGRAPI.2017.35.
Int. J. Adv. Res. Comput. Eng. Technol., vol. 5, pp. 1984–1988, [38] D. P. Neves, S. A. Mehdizadeh, M. R. Santana, M. S. Amadori,
Jun. 2016. T. M. Banhazi, and I. de Alencar Nääs, ‘‘Young broiler feeding kinematic
[21] L. S. Handigolkar, M. L. Kavya, and P. D. Veena, ‘‘Iot based smart analysis as a function of the feed type,’’ Animals, vol. 9, no. 12, pp. 1–11,
poultry farming using commodity hardware and software,’’ Bonfring Int. 2019, doi: 10.3390/ani9121149.
J. Softw. Eng. Soft Comput., vol. 6, no. 5, pp. 171–175, Oct. 2016, doi: [39] C. Fang, J. Huang, K. Cuan, X. Zhuang, and T. Zhang, ‘‘Comparative
10.9756/BIJSESC.8269. study on poultry target tracking algorithms based on a deep regres-
[22] C. So-In, S. Poolsanguan, and C. Poonriboon, ‘‘Smart mobile poultry sion network,’’ Biosyst. Eng., vol. 190, pp. 176–183, Feb. 2020, doi:
farming systems in remote sky WSNs,’’ Int. J. Digit. Content Tech- 10.1016/j.biosystemseng.2019.12.002.
nol. Appl., vol. 7, no. 9, pp. 508–518, May 2013, doi: 10.4156/jdcta. [40] M. Ammad-uddin, M. Ayaz, E.-H. Aggoune, and M. Sajjad, ‘‘Wireless
vol7.issue9.61. sensor network: A complete solution for poultry farming,’’ in Proc. IEEE
[23] R. B. Mahale and D. S. S. Sonavane, ‘‘Food and water level control mech- 2nd Int. Symp. Telecommun. Technol. (ISTT), Nov. 2014, pp. 321–325,
anism for smart chicken poultry farming,’’ Int. J. Innov. Res. Comput. doi: 10.1109/ISTT.2014.7238228.
Commun. Eng., vol. 4, no. 8, pp. 15141–15147, 2016, doi: 10.15680/IJIR- [41] C. Okinda, M. Lu, L. Liu, I. Nyalala, C. Muneri, J. Wang, H. Zhang, and
CCE.2016. M. Shen, ‘‘A machine vision system for early detection and prediction
of sick birds: A broiler chicken model,’’ Biosystems Eng., vol. 188,
[24] O. Debauche, S. Mahmoudi, S. A. Mahmoudi, P. Manneback, J. Bindelle,
pp. 229–242, Dec. 2019, doi: 10.1016/j.biosystemseng.2019.09.015.
and F. Lebeau, ‘‘Edge computing and artificial intelligence for real-
[42] X. Zhuang, M. Bi, J. Guo, S. Wu, and T. Zhang, ‘‘Development of an early
time poultry monitoring,’’ Procedia Comput. Sci., vol. 175, pp. 534–541,
warning algorithm to detect sick broilers,’’ Comput. Electron. Agricult.,
Jan. 2020, doi: 10.1016/j.procs.2020.07.076.
vol. 144, pp. 102–113, Jan. 2018, doi: 10.1016/j.compag.2017.11.032.
[25] W. F. Pereira, L. D. S. Fonseca, F. F. Putti, B. C. Góes, and
[43] T. V. Hertem, T. Norton, D. Berckmans, and E. Vranken, ‘‘Predicting
L. D. P. Naves, ‘‘Environmental monitoring in a poultry farm using
broiler gait scores from activity monitoring and flock data,’’ Special
an instrument developed with the Internet of Things concept,’’ Com-
Eng. Adv. Precis. Livestock Farming, vol. 173, pp. 93–102, Sep. 2018,
put. Electron. Agricult., vol. 170, Mar. 2020, Art. no. 105257, doi:
doi: 10.1016/j.biosystemeng.2018.07.002.
10.1016/j.compag.2020.105257.
[44] L. Xiao, K. Ding, Y. Gao, and X. Rao, ‘‘Behavior-induced health
[26] W. Sarachai, P. Ratnapinda, and P. Khumwichai, ‘‘Smart notification
condition monitoring of caged chickens using binocular vision,’’
system for detecting fan failure in evaporative cooling system of a
Comput. Electron. Agricult., vol. 156, pp. 254–262, Jan. 2019, doi:
poultry farm,’’ in Proc. Joint Int. Conf. Digit. Arts, Media Technol.
10.1016/j.compag.2018.11.022.
Northern Sect. Conf. Electr., Electron., Comput. Telecommun. Eng. (ECTI
[45] A. A. G. Raj and J. G. Jayanthi, ‘‘IoT-based real-time poultry monitoring
DAMT-NCON), Jan. 2019, p. 296, doi: 10.1109/ECTI-NCON.2019.
and health status identification,’’ in Proc. 11th Int. Symp. Mechatronics
8692266.
Appl. (ISMA), Mar. 2018, pp. 3–9, doi: 10.1109/ISMA.2018.8330139.
[27] M. B. R. Mollah, M. A. Hasan, M. A. Salam, and M. A. Ali, ‘‘Digital [46] A. Aydin, ‘‘Using 3D vision camera system to automatically assess
image analysis to estimate the live weight of broiler,’’ Comput. Elec- the level of inactivity in broiler chickens,’’ Comput. Electron. Agricult.,
tron. Agricult., vol. 72, no. 1, pp. 48–52, Jun. 2010, doi: 10.1016/j. vol. 135, pp. 4–10, Apr. 2017, doi: 10.1016/j.compag.2017.01.024.
compag.2010.02.002. [47] A. Aydin, ‘‘Development of an early detection system for lameness of
[28] X. Zhuang and T. Zhang, ‘‘Detection of sick broilers by digital image broilers using computer vision,’’ Comput. Electron. Agricult., vol. 136,
processing and deep learning,’’ Biosyst. Eng., vol. 179, pp. 106–116, pp. 140–146, Apr. 2017, doi: 10.1016/j.compag.2017.02.019.
Mar. 2019. [48] A. K. Mortensen, P. Lisouski, and P. Ahrendt, ‘‘Weight prediction of
[29] D. F. Pereira, B. C. B. Miyamoto, G. D. N. Maia, G. Tatiana Sales, broiler chickens using 3D computer vision,’’ Comput. Electron. Agricult.,
M. M. Magalhães, and R. S. Gates, ‘‘Machine vision to identify broiler vol. 123, pp. 319–326, Apr. 2016, doi: 10.1016/j.compag.2016.03.011.
breeder behavior,’’ Comput. Electron. Agricult., vol. 99, pp. 194–199, [49] S. Amraei, S. A. Mehdizadeh, and I. D. A. Nääs, ‘‘Development of a trans-
Nov. 2013, doi: 10.1016/j.compag.2013.09.012. fer function for weight prediction of live broiler chicken using machine
[30] D. P. Neves, S. A. Mehdizadeh, M. Tscharke, I. D. A. Nääs, and vision,’’ Engenharia Agrícola, vol. 38, no. 5, pp. 776–782, Sep. 2018, doi:
T. M. Banhazi, ‘‘Detection of flock movement and behaviour of broiler 10.1590/1809-4430-Eng.Agric.v38n5p776-782/2018.
chickens at different feeders using image analysis,’’ Inf. Process. [50] C.-W. Ye, K. Yousaf, C. Qi, C. Liu, and K. jie Chen, ‘‘Broiler stunned state
Agricult., vol. 2, nos. 3–4, pp. 177–182, Oct. 2015, doi: 10.1016/j. detection based on an improved fast region-based convolutional neural
inpa.2015.08.002. network algorithm,’’ Poultry Sci., vol. 99, no. 1, pp. 637–646, Jan. 2020.
[31] A. Peña Fernández, T. Norton, E. Tullo, T. van Hertem, A. Youssef, [51] C.-W. Ye, Z.-W. Yu, R. Kang, K. Yousaf, C. Qi, K.-J. Chen, and
V. Exadaktylos, E. Vranken, M. Guarino, and D. Berckmans, ‘‘Real-time Y.-P. Huang, ‘‘An experimental study of stunned state detection for broiler
monitoring of broiler flock’s welfare status using camera-based technol- chickens using an improved convolution neural network algorithm,’’
ogy,’’ Biosyst. Eng., vol. 173, pp. 103–114, Sep. 2018, doi: 10.1016/j. Comput. Electron. Agricult., vol. 170, Mar. 2020, Art. no. 105284, doi:
biosystemseng.2018.05.008. 10.1016/j.compag.2020.105284.
[32] T. Fujii, H. Yokoi, T. Tada, K. Suzuki, and K. Tsukamoto, ‘‘Poul- [52] M. Amin Mansor, S. R. M. S. Baki, N. M. Tahir, and R. A. Rahman,
try tracking system with camera using particle filters,’’ in Proc. ‘‘An approach of halal poultry meat comparison based on mean-shift
IEEE Int. Conf. Robot. Biomimetics, Feb. 2009, pp. 1888–1893, doi: segmentation,’’ in Proc. IEEE Conf. Syst., Process Control (ICSPC),
10.1109/robio.2009.4913289. Dec. 2013, pp. 279–282, doi: 10.1109/SPC.2013.6735147.
[33] C. So-In, S. Poolsanguan, and K. Rujirakul, ‘‘A hybrid mobile envi- [53] M. Chmiel, M. Slowinski, and K. Dasiewicz, ‘‘Application of com-
ronmental and population density management system for smart poultry puter vision systems for estimation of fat content in poultry meat,’’
farms,’’ Comput. Electron. Agricult., vol. 109, pp. 287–301, Nov. 2014, Food Control, vol. 22, no. 8, pp. 1424–1427, Aug. 2011, doi: 10.1016/j.
doi: 10.1016/j.compag.2014.10.004. foodcont.2011.03.002.
[54] E. Latifa Noferita Kaswati, A. Harmoko Saputro, and C. Imawan, "Examination system of chicken meat quality based on hyperspectral imaging," J. Phys., Conf. Ser., vol. 1528, Apr. 2020, Art. no. 012045, doi: 10.1088/1742-6596/1528/1/012045.
[55] B. C. Geronimo, S. M. Mastelini, R. H. Carvalho, S. B. Júnior, D. F. Barbin, M. Shimokomaki, and E. I. Ida, "Computer vision system and near-infrared spectroscopy for identification and classification of chicken with wooden breast, and physicochemical and technological characterization," Infr. Phys. Technol., vol. 96, pp. 303–310, Jan. 2019, doi: 10.1016/j.infrared.2018.11.036.
[56] A. S. Alon, "An image processing approach of multiple eggs' quality inspection," Int. J. Adv. Trends Comput. Sci. Eng., vol. 8, no. 6, pp. 2794–2799, Dec. 2019.
[57] B. Narin, S. Buntan, N. Chumuang, and M. Ketcham, "Crack on eggshell detection system based on image processing technique," in Proc. 18th Int. Symp. Commun. Inf. Technol. (ISCIT), Sep. 2018, pp. 226–231, doi: 10.1109/ISCIT.2018.8587980.
[58] V. G. Narushin, G. Lu, J. Cugley, M. N. Romanov, and D. K. Griffin, "A 2-D imaging-assisted geometrical transformation method for non-destructive evaluation of the volume and surface area of avian eggs," Food Control, vol. 112, Jun. 2020, Art. no. 107112, doi: 10.1016/j.foodcont.2020.107112.
[59] C. Okinda, Y. Sun, I. Nyalala, T. Korohou, S. Opiyo, J. Wang, and M. Shen, "Egg volume estimation based on image processing and computer vision," J. Food Eng., vol. 283, Oct. 2020, Art. no. 110041, doi: 10.1016/j.jfoodeng.2020.110041.
[60] A. Nasiri, M. Omid, and A. Taheri-Garavand, "An automatic sorting system for unwashed eggs using deep learning," J. Food Eng., vol. 283, pp. 1–9, Feb. 2020, doi: 10.1016/j.jfoodeng.2020.110036.
[61] M. Soltani and M. Omid, "Detection of poultry egg freshness by dielectric spectroscopy and machine learning techniques," LWT-Food Sci. Technol., vol. 62, no. 2, pp. 1034–1042, Jul. 2015, doi: 10.1016/j.lwt.2015.02.019.
[62] R. Mota-Grajales, J. C. Torres-Peña, J. L. Camas-Anzueto, M. Pérez-Patricio, R. Grajales Coutiño, F. R. López-Estrada, E. N. Escobar-Gómez, and H. Guerra-Crespo, "Defect detection in eggshell using a vision system to ensure the incubation in poultry production," Measurement, vol. 135, pp. 39–46, Mar. 2019, doi: 10.1016/j.measurement.2018.09.059.
[63] M. Omid, M. Soltani, M. H. Dehrouyeh, S. S. Mohtasebi, and H. Ahmadi, "An expert egg grading system based on machine vision and artificial intelligence techniques," J. Food Eng., vol. 118, no. 1, pp. 70–77, Sep. 2013, doi: 10.1016/j.jfoodeng.2013.03.019.
[64] B. Guanjun, J. Mimi, X. Yi, C. Shibo, and Y. Qinghua, "Cracked egg recognition based on machine vision," Comput. Electron. Agricult., vol. 158, pp. 159–166, Mar. 2019, doi: 10.1016/j.compag.2019.01.005.
[65] K. Sun, L. Ma, L. Pan, and K. Tu, "Sequenced wave signal extraction and classification algorithm for duck egg crack on-line detection," Comput. Electron. Agricult., vol. 142, pp. 429–439, Nov. 2017, doi: 10.1016/j.compag.2017.09.034.
[66] J. Astill, R. A. Dara, E. D. G. Fraser, B. Roberts, and S. Sharif, "Smart poultry management: Smart sensors, big data, and the Internet of Things," Comput. Electron. Agricult., vol. 170, Mar. 2020, Art. no. 105291, doi: 10.1016/j.compag.2020.105291.
[67] A. Taheri-Garavand, S. Fatahi, M. Omid, and Y. Makino, "Meat quality evaluation based on computer vision technique: A review," Meat Sci., vol. 156, pp. 183–195, Oct. 2019, doi: 10.1016/j.meatsci.2019.06.002.
[68] J. Qin, K. Chao, M. S. Kim, R. Lu, and T. F. Burks, "Hyperspectral and multispectral imaging for evaluating food safety and quality," J. Food Eng., vol. 118, no. 2, pp. 157–171, Sep. 2013, doi: 10.1016/j.jfoodeng.2013.04.001.
[69] A. Falkovskaya and A. Gowen, "Literature review: Spectral imaging applied to poultry products," Poultry Sci., vol. 99, no. 7, pp. 3709–3722, Jul. 2020, doi: 10.1016/j.psj.2020.04.013.
[70] X. Fu and J. Chen, "A review of hyperspectral imaging for chicken meat safety and quality evaluation: Application, hardware, and software," Comprehensive Rev. Food Sci. Food Saf., vol. 18, no. 2, pp. 535–547, Mar. 2019, doi: 10.1111/1541-4337.12428.
[71] C. Okinda, I. Nyalala, T. Korohou, C. Okinda, J. Wang, T. Achieng, P. Wamalwa, T. Mang, and M. Shen, "A review on computer vision systems in monitoring of poultry: A welfare perspective," Artif. Intell. Agricult., vol. 4, pp. 184–208, 2020, doi: 10.1016/j.aiia.2020.09.002.
[72] J. Kongsro, "Estimation of pig weight using a Microsoft Kinect prototype imaging system," Comput. Electron. Agricult., vol. 109, pp. 32–35, Nov. 2014.
[73] A. Aydin, O. Cangar, S. E. Ozcan, C. Bahr, and D. Berckmans, "Application of a fully automatic analysis tool to assess the activity of broiler chickens with different gait scores," Comput. Electron. Agricult., vol. 73, no. 2, pp. 194–199, Aug. 2010, doi: 10.1016/j.compag.2010.05.004.
[74] M. Kashiha, C. Bahr, S. Ott, C. P. H. Moons, T. A. Niewold, F. O. Ödberg, and D. Berckmans, "Automatic weight estimation of individual pigs using image analysis," Comput. Electron. Agricult., vol. 107, pp. 38–44, Sep. 2014, doi: 10.1016/j.compag.2014.06.003.
[75] A. Nasirahmadi, U. Richter, O. Hensel, S. Edwards, and B. Sturm, "Using machine vision for investigation of changes in pig group lying patterns," Comput. Electron. Agricult., vol. 119, pp. 184–190, 2015, doi: 10.1016/j.compag.2015.10.023.
[76] A. Pezzuolo, V. Milani, D. H. Zhu, H. Guo, S. Guercini, and F. Marinello, "On-barn pig weight estimation based on body measurements by structure-from-motion (SfM)," Sensors, vol. 18, no. 11, pp. 29–36, 2018, doi: 10.3390/s18113603.
[77] K. Wang, H. Guo, Q. Ma, W. Su, L. Chen, and D. Zhu, "A portable and automatic Xtion-based measurement system for pig body size," Comput. Electron. Agricult., vol. 148, pp. 291–298, May 2018, doi: 10.1016/j.compag.2018.03.018.
[78] K. Jun, S. J. Kim, and H. W. Ji, "Estimating pig weights from images without constraint on posture and illumination," Comput. Electron. Agricult., vol. 153, pp. 169–176, Oct. 2018, doi: 10.1016/j.compag.2018.08.006.
[79] X. Sun, J. Young, J.-H. Liu, and D. Newman, "Prediction of pork loin quality using online computer vision system and artificial intelligence model," Meat Sci., vol. 140, pp. 72–77, Jun. 2018, doi: 10.1016/j.meatsci.2018.03.005.
[80] A. Nasirahmadi, S. A. Edwards, and B. Sturm, "Implementation of machine vision for detecting behaviour of cattle and pigs," Livestock Sci., vol. 202, pp. 25–38, Aug. 2017, doi: 10.1016/j.livsci.2017.05.014.
[81] A. Fuentes, S. Yoon, J. Park, and D. S. Park, "Deep learning-based hierarchical cattle behavior recognition with spatio-temporal information," Comput. Electron. Agricult., vol. 177, Oct. 2020, Art. no. 105627, doi: 10.1016/j.compag.2020.105627.
[82] M. F. Hansen, M. L. Smith, L. N. Smith, K. Abdul Jabbar, and D. Forbes, "Automated monitoring of dairy cow body condition, mobility and weight using a single 3D video capture device," Comput. Ind., vol. 98, pp. 14–22, Jun. 2018, doi: 10.1016/j.compind.2018.02.011.
[83] A. L. Zhang, B. P. Wu, C. T. Wuyun, D. X. Jiang, E. C. Xuan, and F. Y. Ma, "Algorithm of sheep body dimension measurement and its applications based on image analysis," Comput. Electron. Agricult., vol. 153, pp. 33–45, Oct. 2018, doi: 10.1016/j.compag.2018.07.033.
[84] C. Zhou, K. Lin, D. Xu, L. Chen, Q. Guo, C. Sun, and X. Yang, "Near infrared computer vision and neuro-fuzzy model-based feeding decision system for fish in aquaculture," Comput. Electron. Agricult., vol. 146, pp. 114–124, Mar. 2018, doi: 10.1016/j.compag.2018.02.006.
[85] T. S. Huang, "Computer vision: Evolution and promise," in Proc. 19th CERN School Comput., Egmond aan Zee, The Netherlands, 1996, pp. 21–25.
[86] W. Zhao and K. Sakurai, "Seeing elements by visible-light digital camera," Sci. Rep., vol. 7, no. 1, pp. 1–2, Jun. 2017, doi: 10.1038/srep45472.
[87] I. Choi, J. Kim, and J. Jang, "Development of marker-free night-vision displacement sensor system by using image convex hull optimization," Sensors, vol. 18, no. 12, p. 4151, Nov. 2018, doi: 10.3390/s18124151.
[88] S. Türker-Kaya and C. W. Huck, "A review of mid-infrared and near-infrared imaging: Principles, concepts and applications in plant tissue analysis," Molecules, vol. 22, no. 1, 2017, doi: 10.3390/molecules22010168.
[89] P. Rajmanova, P. Nudzikova, and D. Vala, "Application and technology of thermal imagine camera in medicine," IFAC-PapersOnLine, vol. 48, no. 4, pp. 492–497, 2015.
[90] R. Usamentiaga, P. Venegas, J. Guerediaga, L. Vega, J. Molleda, and F. Bulnes, "Infrared thermography for temperature measurement and non-destructive testing," Sensors, vol. 14, no. 7, pp. 12305–12348, Jul. 2014, doi: 10.3390/s140712305.
[91] H. Xu, J. Xu, and W. Xu, "Survey of 3D modeling using depth cameras," Virtual Reality Intell. Hardw., vol. 1, no. 5, pp. 483–499, Oct. 2019, doi: 10.1016/j.vrih.2019.09.003.
[92] C.-H. Feng, Y. Makino, S. Oshita, and J. F. García Martín, "Hyperspectral imaging and multispectral imaging as the novel techniques for detecting defects in raw and processed meat products: Current state-of-the-art research advances," Food Control, vol. 84, pp. 165–176, Feb. 2018, doi: 10.1016/j.foodcont.2017.07.013.
[93] Y. Liu, H. Pu, and D.-W. Sun, "Hyperspectral imaging technique for evaluating food quality and safety during various processes: A review of recent applications," Trends Food Sci. Technol., vol. 69, pp. 25–35, Nov. 2017, doi: 10.1016/j.tifs.2017.08.013.
[94] R. Vejarano, R. Siche, and W. Tesfaye, "Evaluation of biological contaminants in foods by hyperspectral imaging: A review," Int. J. Food Properties, vol. 20, pp. 1264–1297, Dec. 2017, doi: 10.1080/10942912.2017.1338729.
[95] C. V. Lauritsen, J. Kjeldgaard, H. Ingmer, M. Bisgaard, and H. Christensen, "Microbiota encompassing putative spoilage bacteria in retail packaged broiler meat and commercial broiler abattoir," Int. J. Food Microbiol., vol. 300, pp. 14–21, Jul. 2019, doi: 10.1016/j.ijfoodmicro.2019.04.003.
[96] R. Grau, A. J. Sánchez, J. Girón, E. Iborra, A. Fuentes, and J. M. Barat, "Nondestructive assessment of freshness in packaged sliced chicken breasts using SW-NIR spectroscopy," Food Res. Int., vol. 44, no. 1, pp. 331–337, Jan. 2011, doi: 10.1016/j.foodres.2010.10.011.
[97] M. Mahajan and S. M. Kamalapur, "Spectral imaging," Int. J. Mod. Electron. Commun. Eng., vol. 7, no. 1, pp. 144–148, 2015, doi: 10.1016/b978-0-240-80740-9.50070-2.
[98] B. Singh, R. Gautam, S. Kumar, B. N. Vinay Kumar, U. Nongthomba, D. Nandi, G. Mukherjee, V. Santosh, S. Kumaravel, and S. Umapathy, "Application of vibrational microspectroscopy to biology and medicine," Current Sci., vol. 102, no. 2, pp. 232–244, 2012.
[99] R. Horaud, M. Hansard, G. Evangelidis, and C. Ménier, "An overview of depth cameras and range scanners based on time-of-flight technologies," Mach. Vis. Appl., vol. 27, no. 7, pp. 1005–1020, Oct. 2016, doi: 10.1007/s00138-016-0784-4.
[100] P. Mashkov, B. Gyoch, R. Kandilarov, H. Beloev, M. Varbanov, and T. Pencheva, "LED lamp for poultry housing—Design and thermal management," in Proc. Int. Spring Semin. Electron. Technol., 2015, pp. 91–96, doi: 10.1109/ISSE.2015.7247969.
[101] A. B. Riber, "Effects of color of light on preferences, performance, and welfare in broilers," Poultry Sci., vol. 94, no. 8, pp. 1767–1775, Aug. 2015, doi: 10.3382/ps/pev174.
[102] H. A. Olanrewaju, J. L. Purswell, S. D. Collier, and S. L. Branton, "Effects of color temperatures (Kelvin) of LED bulbs on blood physiological variables of broilers grown to heavy weights," Poultry Sci., vol. 94, no. 8, pp. 1721–1728, Aug. 2015, doi: 10.3382/ps/pev139.
[103] M. Kashiha, A. Pluk, C. Bahr, E. Vranken, and D. Berckmans, "Development of an early warning system for a broiler house using image interpretation," in Proc. 8th Int. Conf. MDA, vol. 6, 2013, pp. 36–44.
[104] L. Chen and N. D. Georganas, "An efficient and robust algorithm for 3D mesh segmentation," Multimedia Tools Appl., vol. 29, no. 2, pp. 109–125, Jun. 2006, doi: 10.1007/s11042-006-0002-x.
[105] J. O. Rico-Contreras, A. A. Aguilar-Lasserre, J. M. Méndez-Contreras, J. J. López-Andrés, and G. Cid-Chama, "Moisture content prediction in poultry litter using artificial intelligence techniques and Monte Carlo simulation to determine the economic yield from energy use," J. Environ. Manage., vol. 202, pp. 254–267, Nov. 2017.
[106] J. Brünger, I. Traulsen, and R. Koch, "Model-based detection of pigs in images under sub-optimal conditions," Comput. Electron. Agricult., vol. 152, pp. 59–63, Sep. 2018.

SALWANI MOHD DAUD (Member, IEEE) received the B.Eng. (Hons.) degree in electronics engineering from the University of Liverpool, in 1984, and the M.Eng. and Ph.D. degrees in electrical engineering from Universiti Teknologi Malaysia (UTM), in 1989 and 2006, respectively. She has been with UTM for more than 30 years and has vast experience in teaching and research. She is currently a Professor of Advanced Informatics with the Razak Faculty of Technology and Informatics, UTM. Her research focuses on artificial intelligence, blockchain, and IoT. She currently teaches machine learning and system design for security in the postgraduate program. She also leads several research grants on related topics and has secured more than RM2 million in R&D funds. She has published more than 100 academic articles in journals, proceedings, and books. She also heads the Cyber-Physical Systems Research Group. She is a member of the IEEE Computer Society, a registered Professional Technologist with the Malaysia Board of Technologists (MBOT), and a registered Graduate Engineer with the Board of Engineers Malaysia (BEM).

RUDZIDATUL AKMAM DZIYAUDDIN (Senior Member, IEEE) received the B.Eng. degree in electrical and electronic engineering from Universiti Sains Malaysia, and the Ph.D. degree from the University of Bristol, U.K. In 2012, she completed a one-year internship at Toshiba Research Europe Limited (TREL), U.K., where she produced several patents related to intercell interference mitigation techniques. She is currently a Senior Lecturer with the Razak Faculty of Technology and Informatics, Universiti Teknologi Malaysia, Kuala Lumpur. Her research interests include radio resource management, energy efficiency, cross-layer MAC design, and smart sensor applications.

MOHAMAD ZULKEFLI ADAM received the Ph.D. degree from Loughborough University, U.K., in 2012, where he pursued research in biometrics (pattern analysis for face recognition) with the Department of Control and Electrical Engineering. He has studied pattern analysis and understanding consistently since his industrial practice in traffic control and surveillance systems in the late 1990s, and continued with master's-level research in stereoscopic-based pattern analysis in vision systems at Universiti Teknologi Malaysia in the early 2000s. He completed a two-year postdoctoral program as a Visiting Research Fellow at King's College London in 2019, with the Smart Rural project, applying pattern analysis across a wider spectrum through a multidisciplinary approach aimed at sustainable socio-economic research and development. He currently serves Universiti Teknologi Malaysia as a Senior Lecturer, lecturing on systems engineering and system testing and evaluation, and focuses his research on sustainable technological k-economic research and development.