Digital farming is the application of modern technologies such as sensors, robotics, and data analysis to shift from tedious manual operations to continuously automated processes. This paper reviews some of the latest achievements in agricultural robotics, specifically those used for autonomous weed control, field scouting, and harvesting. Object identification, task-planning algorithms, digitalization, and optimization of sensors are highlighted as some of the challenges facing digital farming. The concepts of multi-robot systems, human-robot collaboration, and environment reconstruction from aerial images and ground-based sensors for the creation of virtual farms are highlighted as some of the gateways to digital farming. It is shown that one of the trends and research focuses in agricultural field robotics is building swarms of small-scale robots and drones that collaborate to optimize farming inputs and reveal denied or concealed information. For robotic harvesting, an autonomous framework with several simple-axis manipulators can be faster and more efficient than the expensive professional manipulators currently adopted. While robots are becoming an inseparable part of modern farms, our conclusion is that it is not realistic to expect an entirely automated farming system in the future.
Choosing the Best UAV Drones for Precision Agriculture and Smart Farming (Redmond R. Shamshiri)
An agricultural drone buyer's guide for farmers and agriculture service professionals.
Agriculture 4.0: The Future of Farming Technology (Dishant James)
The World Government Summit recently came out with an agenda to improve agricultural technologies by integrating farming with Industry 4.0. The outcome would be a fourth agricultural revolution, or Agriculture 4.0.
An agricultural robot is a robot deployed for agricultural purposes. Emerging applications of robots and drones in agriculture include weed control, cloud seeding, planting seeds, harvesting, environmental monitoring, and soil analysis.
Agricultural robots automate slow, repetitive, and dull tasks for farmers, allowing them to focus more on improving overall production yields. Some of the most common robots in agriculture are used for harvesting and picking.
This technology is used to increase food production, mostly in horticulture but also in agronomic crop production, and it can overcome many problems that arise at both the farm and the storage level.
An agricultural robot, or agribot, is a robot deployed for agricultural purposes. The main area of application of robots in agriculture is the harvesting stage. Fruit-picking robots, driverless tractors/sprayers, and sheep-shearing robots are designed to replace human labor. In most cases, many factors have to be considered (e.g., the size and color of the fruit to be picked) before the commencement of a task. Robots can be used for other horticultural tasks such as pruning, weeding, spraying, and monitoring. Robots can also be used in livestock applications (livestock robotics) such as automatic milking, washing, and castrating. Robots like these have many benefits for the agricultural industry, including a higher quality of fresh produce, lower production costs, and a smaller need for manual labor.
Indian Agriculture: Mechanization to Digitization (ICRISAT)
India is characterized by small farm holdings: more than 80% of land holdings are less than 2 ha (5 acres). About 55% of India’s population is engaged in agriculture, with 40% farm mechanization. Due to the non-remunerative nature of farming, more than 50% of farmers in India are in debt. This situation has constrained farmers from investing in mechanization and other technologies.
From ICRISAT Director General Dr David Bergvinson's presentation at the CII Agribusiness and Mechanization Summit held in New Delhi, India, on 1 September 2015.
Role of IoT, Sensors, and Nanobiosensors in Agriculture (Archana Reddy)
Covers the role of the Internet of Things, sensors, and biosensors in agriculture, and the types of sensors. Artificial intelligence is the science and engineering of making intelligent machines, especially intelligent computer programs. It is related to the similar task of using computers to understand human intelligence.
Machine learning is a branch of artificial intelligence (AI) and computer science that focuses on the use of data and algorithms to imitate the way humans learn, gradually improving its accuracy.
Deep learning is a subset of ML: essentially a neural network with three or more layers that attempts to simulate the behavior of the human brain (though far from matching its ability), allowing it to “learn” from large amounts of data.
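As a hedged illustration of the "three or more layers" definition above, the following Python sketch builds a tiny fully connected network in plain Python, with no framework. The layer sizes, random weights, and the labeling of the four inputs as sensor readings are all made up for illustration; a real deep learning model would learn its weights from data rather than using random ones.

```python
import random

random.seed(0)  # reproducible illustrative weights

def relu(vec):
    """Rectified linear activation, applied elementwise."""
    return [max(0.0, v) for v in vec]

def dense(inputs, weights, biases):
    """Fully connected layer: out[j] = sum_i inputs[i] * weights[i][j] + biases[j]."""
    return [sum(inputs[i] * weights[i][j] for i in range(len(inputs))) + biases[j]
            for j in range(len(biases))]

def make_layer(n_in, n_out):
    """Random (untrained) weights for one layer; purely illustrative."""
    weights = [[random.uniform(-1, 1) for _ in range(n_out)] for _ in range(n_in)]
    return weights, [0.0] * n_out

# A three-layer network (the "deep" threshold mentioned above):
# 4 hypothetical inputs (e.g. soil moisture, temperature, humidity, sunlight)
# -> 8 hidden units -> 8 hidden units -> 1 output score.
layers = [make_layer(4, 8), make_layer(8, 8), make_layer(8, 1)]

def forward(x):
    """One forward pass; hidden layers use ReLU, the output layer is linear."""
    for i, (w, b) in enumerate(layers):
        x = dense(x, w, b)
        if i < len(layers) - 1:
            x = relu(x)
    return x

score = forward([0.4, 0.7, 0.6, 0.3])  # toy normalized sensor readings
```

Training (adjusting the weights to reduce prediction error) is the part that makes such a network useful; the sketch only shows the layered structure the definition refers to.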
Pradhan Mantri Fasal Bima Yojana (PMFBY)
This is a government-sponsored crop insurance scheme that integrates multiple stakeholders on a single platform. To improve the crop sector, the government will now use innovative technologies such as AI, remote-sensing imagery, and modelling tools to reduce the time lag in settling farmers' claims. By analysing the data collected, the scheme aims to increase crop insurance penetration in India by raising farmer awareness and reducing farmer premium rates.
PM-KISAN
By leveraging the benefits of AI, the government of India has rolled out the PM-KISAN scheme, under which every farmer receives Rs. 6,000 annually to support their farming activities. The government aims to leverage the large amount of data collected by several agri-schemes to better target the farmers who need the benefit of PM-KISAN.
The data will be used to create a proper framework for farmers, along with the right policy. It will also help in converging some government projects to achieve the targeted development of farmers and the overall sector.
To promote innovative technologies in agriculture, the government of India has also launched another initiative, AGRI-UDAAN (Food & Agribusiness Accelerator 2.0), to mentor 40 agricultural startups from cities such as Chandigarh, Ahmedabad, Pune, Bengaluru, Kolkata, and Hyderabad, and to enable them to connect with potential investors. This is a six-month programme in which shortlisted agri-startups with innovative business models are mentored and guided to improve their operations, commercialisation, product validation, business plan preparation, risk analysis, customer engagement, finance management, and fundraising. The shortlisted startups also stand a chance of receiving $40,000 in funding assistance. A separate project, launched earlier this year and based out of Maharashtra, seeks to use innovative technologies to address various cultivation risks, such as poor rains and pest attacks, and to accurately predict crop yields. The project will also use this data to inform farmers about several policy requirements, including pricing, warehousing, and crop
A transplanter is an agricultural machine used for transplanting seedlings to the field. This is very important, as it reduces the time taken to transplant seedlings compared with manual transplanting, thus allowing more time for harvesting. It also reduces the use of manual labor. Paddy transplanters come in 2-, 3-, 4-, and up to 6-row configurations for large-capacity fields.
Drones-as-a-Service for agricultural applications (by Philipp Trénel, TUS Expo)
At TUS Nordics 2017, Philipp Trénel gave the presentation ‘Drones-as-a-Service for agricultural applications’ in our Arctic track, on Thursday 12 October 2017.
This is based on a research study on the application of drone technology in India, showcasing its benefits to the agricultural sector in rendering services that were previously very tedious to execute.
Today the use of data is having a revolutionary effect: with cultivatable land in decline and demand for food increasing, farmers in developing countries who use data are capable of turning ordinary harvests into bumper crops and profits. This is the precision agriculture hub connecting the world’s biggest agricultural businesses, farmers, and suppliers using integrated software solutions.
Application of Information and Communication Tools (ICTs) in Modern Agriculture (Sreenivasareddy Kadapa)
ICT can deliver fast, reliable, and accurate information in a user-friendly manner for practical utilization by the end user. ICT includes any communication device or application, encompassing radio, television, cellular phones, computer and network hardware and software, and satellite systems, as well as the various services and applications associated with them, such as videoconferencing and digital learning.
A Proposed Model for Mobile Cloud Computing in Agriculture (ijsrd.com)
This paper presents recent developments and applications of mobile phones and cloud computing in agriculture. Basic concepts and technologies associated with mobile phones and cloud computing are highlighted. For better communication, sharing of information, and profitability in agriculture, there is a need for the collaboration of cloud computing and mobile technology. This paper presents a framework in which farmers can utilize mobile cloud computing on their handsets through various applications to assist them in relatively better cultivation and marketing. The main aim of this proposed framework is to eliminate the problems of data storage, computational processing, and information sharing.
Agriculture is essential to the prosperity of agricultural countries like India. Thus, the suggested strategy is to use automation and Internet of Things (IoT) technology to make agriculture smart. Applications enabled by the IoT include irrigation decision assistance, crop growth monitoring and selection, and more. The proposed system is an Arduino-powered technology that boosts agricultural productivity. This study's main goal is to find the minimum quantity of water necessary to grow crops: most farmers waste a lot of time in the fields rather than concentrating on giving plants access to water at the right moment. The suggested system determines the required amount of water based on the data obtained from the sensors. Two sensors provide the base station with data on soil temperature, humidity, and the amount of sunlight each day. The suggested system must then determine the amount of water required for irrigation based on these criteria. The system's main benefit is the use of precision agriculture (PA) in conjunction with cloud computing, which maximises the efficient use of water and fertilisers while maximising crop yields, and also assists in determining field weather conditions.
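The abstract above does not give the actual decision rule, so the following Python sketch shows one plausible shape such a base-station calculation could take: compute a soil-moisture deficit and scale the watering time by the evaporation drivers the sensors report. Every threshold and coefficient here is a made-up placeholder, not the paper's calibration.

```python
def irrigation_minutes(soil_moisture_pct, temp_c, humidity_pct, sunlight_hours):
    """Estimate watering time from sensor readings.

    Hypothetical rule of thumb: start from a moisture deficit against an
    assumed field-capacity target, then adjust for evapotranspiration drivers
    (heat, dry air, strong sun). All numbers are illustrative only.
    """
    TARGET_MOISTURE = 60.0  # assumed target soil moisture, in percent
    deficit = max(0.0, TARGET_MOISTURE - soil_moisture_pct)

    # Base watering time proportional to the deficit.
    minutes = deficit * 0.5

    # Hot, dry, sunny days lose more water to evapotranspiration.
    if temp_c > 30:
        minutes *= 1.3
    if humidity_pct < 40:
        minutes *= 1.2
    if sunlight_hours > 8:
        minutes *= 1.1

    return round(minutes, 1)
```

A deployed system would replace these heuristics with a calibrated crop-water model (and would skip irrigation entirely once the soil is at or above the target, as the zero-deficit case here does).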
Today, the majority of farmers depend on agriculture for their survival, but most agricultural tools and practices are outdated and yield less, because everything depends on the environment and government support. The world population is growing faster than cultivable land and crop yields, so it is essential to increase crop yields by adopting information technology; communication plays a vital role in smart farming. The objective of this research paper is to present tools and best practices for understanding the role of information and communication technologies in the agriculture sector, and to motivate and help illiterate farmers understand the insights provided by big data analytics using machine learning.
Assessing the advancement of artificial intelligence and drones’ integration ... (IJECEIAES)
Integrating artificial intelligence (AI) with drones has emerged as a promising paradigm for advancing agriculture. This bibliometric analysis investigates the current state of research in this transformative domain by comprehensively reviewing 234 pertinent articles from Scopus and Web of Science databases. The problem involves harnessing AI-driven drones' potential to address agricultural challenges effectively. To address this, we conducted a bibliometric review, looking at critical components, such as prominent journals, co-authorship patterns across countries, highly cited articles, and the co-citation network of keywords. Our findings underscore a growing interest in using AI-integrated drones to revolutionize various agricultural practices. Noteworthy applications include crop monitoring, precision agriculture, and environmental sensing, indicative of the field’s transformative capacity. This pioneering bibliometric study presents a comprehensive synthesis of the dynamic research landscape, signifying the first extensive exploration of AI and drones in agriculture. The identified knowledge gaps point to future research opportunities, fostering the adoption and implementation of these technologies for sustainable farming practices and resource optimization. Our analysis provides essential insights for researchers and practitioners, laying the groundwork for steering agricultural advancements toward an enhanced efficiency and innovation era.
Application of Big Data in Enhancing Effective Decision Making in Agricultura... (Sjaak Wolfert)
The agriculture production system increasingly becomes data-driven and data-enabled based on the cyber-physical management cycle. This paper describes several IoT-applications of the EU-funded IoF2020 project in which data and data-sharing plays a crucial role. It provides an integrative framework aiming at cross-fertilisation, co-creation and co-ownership of results. Technical integration, business support and ecosystem development are key mechanisms to realize this.
Remote Sensing (RS), UAV/drones, and Machine Learning (ML) as powerful techni... (nitinrane33)
Precision agriculture utilizes modern technology to optimize agricultural practices, resulting in increased productivity while reducing costs and environmental impact. The use of remote sensing (RS), drones or unmanned aerial vehicles (UAVs), and machine learning (ML) has significantly transformed precision agriculture. These advanced technologies provide farmers with accurate, cost-effective, and timely tools to manage crops and resources effectively. This paper evaluates the use of these techniques in precision agriculture, including their benefits, and effective applications. Remote sensing involves using satellites, aircraft, or drones to collect data on crops and the environment, such as soil moisture, temperature, and vegetation indices. With high-resolution images and three-dimensional maps of crops, UAVs enable farmers to identify and address issues like pest infestations or nutrient deficiencies. Machine learning algorithms analyze large amounts of data to predict crop yields, optimize irrigation and fertilization, and identify areas of the field that need attention. Several case studies highlight the effectiveness of these techniques in different agricultural settings. However, the paper also acknowledges the challenges associated with adopting these technologies, such as cost, data management, and regulatory issues. While the initial investment in drones and sensors may be high, the long-term benefits in terms of increased yields, reduced costs, and environmental sustainability are substantial. Farmers need to be trained in the use of these technologies to make informed decisions, and effective data management and analysis are crucial. Additionally, regulatory frameworks are still evolving, and clear guidelines are required for data privacy, safety, and ethical use. Although challenges remain, the benefits of increased productivity, reduced costs, and environmental sustainability make these technologies an attractive investment for farmers worldwide.
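One of the vegetation indices mentioned above, NDVI, can be computed directly from the red and near-infrared reflectance bands of a multispectral image. The sketch below uses made-up reflectance values and an illustrative 0.3 stress threshold; a real pipeline would operate on full rasters (e.g. with NumPy or rasterio) and choose thresholds per crop and season.

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index for one pixel: (NIR - Red) / (NIR + Red).

    Healthy vegetation reflects strongly in near-infrared and absorbs red,
    so higher values indicate more vigorous plants.
    """
    denom = nir + red
    return (nir - red) / denom if denom else 0.0

# Tiny made-up "image" of four pixels; real reflectance values would come
# from a drone-mounted multispectral camera or a satellite product.
nir_band = [0.50, 0.45, 0.10, 0.30]
red_band = [0.08, 0.10, 0.09, 0.25]

ndvi_map = [ndvi(n, r) for n, r in zip(nir_band, red_band)]

# Flag low-vigor pixels (illustrative threshold) for scouting or variable-rate input.
stressed = [i for i, v in enumerate(ndvi_map) if v < 0.3]
```

In this toy scene the first two pixels score high (healthy canopy) while the last two fall below the threshold, which is exactly the kind of per-area flag that drives the "areas of the field that need attention" described above.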
IoT Based Intelligent Management System for Agricultural Application (ijtsrd)
The growth of technology seen in other sectors is not happening in agriculture, and this is a problem for India. The government has struggled to do anything for the farming sector, which is in an exceptionally deplorable state. The pause in decision making has also contributed to India's high rate of unemployment, owing to the state of the economy. Applications in well-developed countries involve robotics, aircraft, and artificial intelligence, but these can raise the costs of operation and maintenance, and operating such drones is currently difficult. In India, only a few farmers can afford such high-tech machinery owing to financial constraints. The project is aimed at developing an affordable quadcopter for farmers to use on their crops, with the goal of growing their output. We are developing a core framework, supported by a Raspberry Pi and OpenCV, that can help predict crop yield with the help of inputs from numerous different sensor packages. Venkat. P. Patil | Umakant B. Gohatre | Anushka Mhaskar | Akash Jadhav | Prajwal Shetty | Yash Jadhav, "IoT Based Intelligent Management System for Agricultural Application", published in International Journal of Trend in Scientific Research and Development (ijtsrd), ISSN: 2456-6470, Volume-5, Issue-2, February 2021. URL: https://www.ijtsrd.com/papers/ijtsrd38578.pdf Paper URL: https://www.ijtsrd.com/engineering/electronics-and-communication-engineering/38578/iot-based-intelligent-management-system-for-agricultural-application/venkat-p-patil
Development and Field Evaluation of a Multichannel LoRa Sensor for IoT Monito... (Redmond R. Shamshiri)
Evaluation of long-range wireless transceivers with respect to their power consumption, network connectivity, and coverage under extreme field conditions is necessary prior to their deployment in large-scale commercial orchards. This paper reports on the development and field performance of an affordable multi-channel wireless data acquisition system for IoT monitoring of environmental variations in berry orchards. A connectivity board was custom-designed around a powerful dual-core 32-bit microcontroller with a WiFi antenna and LoRa modulation at 868 MHz. The objective was to verify the possibility of transmitting multiple sensor readings with lower power consumption while increasing the reliability and stability of wireless communication over long distances (more than 1.7 km). Data collected from the wireless sensor were compared with, and found to be consistent with, measurements from a data logger installed in the same locations. The paper highlights the advantages of LoRa sensors for digital agriculture and the experience of real-time monitoring of environmental parameters in berry orchards.
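A practical detail behind "transmitting multiple sensor readings" in one LoRa uplink is packing several channels into a compact binary frame, since LoRa payloads are only a few dozen bytes. The frame layout below (node id, channel count, then one signed 16-bit integer per channel holding the reading scaled by 100) is a hypothetical example for illustration, not the format used in the paper.

```python
import struct

def pack_readings(node_id, readings):
    """Pack readings into a compact frame: uint8 node id, uint8 channel count,
    then one little-endian int16 per channel (value scaled by 100)."""
    scaled = [int(round(r * 100)) for r in readings]
    return struct.pack("<BB%dh" % len(scaled), node_id, len(scaled), *scaled)

def unpack_readings(frame):
    """Inverse of pack_readings: recover node id and the scaled-down readings."""
    node_id, count = struct.unpack_from("<BB", frame)
    scaled = struct.unpack_from("<%dh" % count, frame, 2)
    return node_id, [s / 100 for s in scaled]

# One uplink carrying temperature (deg C), relative humidity (%), soil moisture (%):
frame = pack_readings(7, [21.37, 64.5, 33.02])
```

Fixed-point scaling like this keeps three channels in an 8-byte frame (2-byte header plus three 2-byte values) at 0.01 resolution, versus 12 bytes or more for the same values as floats or text.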
SunBot: Autonomous Nursing Assistant for Emission-Free Berry Production, Gene... (Redmond R. Shamshiri)
This paper presents the general concept and simulation framework used to develop an autonomous electric tractor-mower combination, enabling preliminary studies and experiments toward a functional model of an autonomous electric tractor capable of sensing the environment, navigating in shrubbery orchards, and identifying and avoiding obstacles. For this purpose, a distributed ROS-based framework was designed, mirroring the modular control architecture of the different ECUs and the main on-board controller. The applied V-REP/ROS simulation offers standard features such as hardware abstraction, low-level device control, commonly used functionality, message passing between processes, and package management. The framework reduces effort in code and application development that can be shared by all sensors and actuators. The simulation framework was applied to tune algorithms and to test and validate different sensing and control strategies.
Smart Management of Oil Palm Plantations with Autonomous UAV Imagery and Robu... (Redmond R. Shamshiri)
A Breakthrough in Oil Palm Precision Agriculture: Smart Management of Oil Palm Plantations with Autonomous UAV Imagery and Robust Machine Vision
Link to the publication
https://www.researchgate.net/publication/306348123_A_Breakthrough_in_Oil_Palm_Precision_Agriculture_Smart_Management_of_Oil_Palm_Plantations_with_Autonomous_UAV_Imagery_and_Robust_Machine_Vision
Development of a Field Robot Platform for Mechanical Weed Control in Greenhou... (Redmond R. Shamshiri)
A prototype robot that moves on a monorail along the greenhouse to eliminate weeds between cucumber plants was designed and developed. The robot benefits from three arrays of ultrasonic sensors for weed detection and a PIC18F4550-E/P microcontroller board for processing. The feedback from the sensors activates a robotic arm, which moves inside the rows of cucumber plants, cutting the weeds with rotating blades. Several experiments were carried out inside a greenhouse to find the best combination of arm motor (AM) speed, blade rotation (BR) speed, and blade design. We assigned three BR speeds of 3500, 2500, and 1500 rpm, and two AM speeds of 10 and 30 rpm, to three blade designs: S-shape, triangular, and circular. Results indicated that blade type, BR speed, and AM speed each had significant effects (P < 0.05) on the percentage of weeds cut (PWC); however, no significant interaction effects were observed. The comparison between the interaction effects of the factors (three blade designs, three BR speeds, and two AM speeds) showed that the maximum mean PWC was 78.2% (standard deviation 3.9%), achieved with the S-shape blade at a BR speed of 3500 rpm and an AM speed of 10 rpm. Using this setting, the maximum PWC the robot achieved in a random experiment was 95%. The lowest mean PWC was observed with the triangular blade (mean 50.39%, SD = 1.86), at a BR speed of 1500 rpm and an AM speed of 30 rpm. This study can contribute to the commercialization of a reliable and affordable robot for automated weed control in greenhouse cultivation of cucumber.
Fundamental Research on Unmanned Aerial Vehicles to Support Precision Agricul... (Redmond R. Shamshiri)
Unmanned aerial vehicles carrying multimodal sensors for precision agriculture (PA) applications face adaptation challenges to satisfy reliability, accuracy, and timeliness. Unlike ground platforms, UAVs/drones are subject to additional considerations such as payload, flight time, stabilization, autonomous missions, and external disturbances. For instance, in oil palm plantations (OPP), acquiring high-resolution images to generate multidimensional maps necessitates lower-altitude mission flights with greater stability. This chapter addresses various UAV-based smart farming and PA solutions for OPP, including health assessment and disease detection, pest monitoring, yield estimation, creation of virtual plantations, and dynamic web mapping. Stabilization of UAVs is discussed as one of the key factors for acquiring high-quality aerial images. For this purpose, a case study is presented on stabilizing a fixed-wing Osprey crop-surveillance drone that can be adapted as a remote sensing research platform. The objective was to design three controllers (PID, LQR with full state feedback, and LQR plus observer) to improve the automatic flight mission. Dynamic equations were decoupled into lateral and longitudinal directions, where the longitudinal dynamics were modeled as a fourth-order two-input, two-output system. State variables were defined as velocity, angle of attack, pitch rate, and pitch angle, all assumed to be available to the controller. A special case was considered in which only velocity and pitch rate were measurable. The control objective was to stabilize the system for a velocity step input of 10 m/s. The performance under noise effects, model error, and complementary sensitivity was analyzed.
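To make the stated control objective concrete (stabilizing velocity for a 10 m/s step input), here is a minimal Python sketch of a discrete PID loop acting on a crude first-order airspeed model. The chapter itself designs PID and LQR controllers on a fourth-order longitudinal model; this stand-in only illustrates the step-tracking objective, and the gains and plant time constant are invented for the sketch.

```python
def simulate_velocity_step(setpoint=10.0, dt=0.01, t_end=40.0,
                           kp=2.0, ki=0.5, kd=0.1, tau=2.0):
    """PID regulation of a crude first-order airspeed model dv/dt = (u - v) / tau.

    A stand-in for the chapter's 4th-order longitudinal dynamics: it only
    demonstrates driving velocity to a step setpoint with made-up gains.
    Returns the final velocity after t_end seconds.
    """
    v = 0.0                 # current airspeed, m/s
    integral = 0.0          # accumulated error for the I term
    prev_err = setpoint - v # seed so the first D term is zero (no derivative kick)
    for _ in range(int(t_end / dt)):
        err = setpoint - v
        integral += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integral + kd * deriv  # PID control effort
        prev_err = err
        v += dt * (u - v) / tau                    # Euler step of the plant
    return v

final_v = simulate_velocity_step()
```

With integral action present, the steady-state tracking error for a step input goes to zero, so the simulated velocity settles at the 10 m/s setpoint; swapping in the real fourth-order model would mainly change the transient, not this objective.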
An Overview of the System of Rice Intensification for Paddy Fields of Malaysia (Redmond R. Shamshiri)
Objectives: The objective of this paper was to present a general overview of rice agronomic practices and transplanting operations by considering the interactions of the soil-plant-machine relationship in line with the System of Rice Intensification (SRI) cultivation practice. Methods: Some of the problems challenging Malaysian rice growers, as well as yield increase and total rice production over the last four decades, were first addressed and discussed. The trend in world rice production between 1961 and 2014 was used to predict production in 2020 and to show that Southeast Asian countries are expected to increase their production by 27.2%. Findings: A consistently increasing pattern, from 3.1 tons/ha in 1981 to 4.1 tons/ha in 2014, was observed in the rice yield of Malaysia due to advances in technology and improved farming operations, coupled with integrated management and control of resources. The findings of various studies on best transplanting practices were reviewed and summarized to discuss how SRI contributes to higher rice yield through improved transplanting practices and a more effective root system. Our review shows that wider spacing, availability of solar radiation, moderate temperature, soil aeration, and nutrient supply promote shorter phyllochrons, which increase the number of tillers in rice. In this regard, modifying and developing a transplanter that complies with the SRI specification requires determining the optimum transplanting spacing, seed rate, and planting pattern to significantly improve yield. Improvement: It was concluded that for maximum yield, the SRI method in Malaysia should emphasize planting one seedling per hill with a spacing of 0.25 m for optimum water consumption, nutrient management, and pest management.
Robotic Harvesting of Fruiting Vegetables: A Simulation Approach in V-REP, RO...Redmond R. Shamshiri
In modern agriculture, there is a high demand to move from tedious manual harvesting to a continuously automated operation. This chapter reports on designing a simulation and control platform in V-REP, ROS and MATLAB for experimenting with sensors and manipulators in robotic harvesting of sweet pepper. The objective was to provide a completely simulated environment for improvement of visual servoing task through easy testing and debugging of control algorithms with zero damage risk to the real robot and to the actual equipment. A simulated workspace, including an exact replica of different robot manipulators, sensing mechanisms, and sweet pepper plant, and fruit system was created in V-REP. Image moment method visual servoing with eye-in-hand configuration was implemented in MATLAB, and was tested on four robotic platforms including Fanuc LR Mate 200iD, NOVABOT, multiple linear actuators, and multiple SCARA arms. Data from simulation experiments were used as inputs of the control algorithm in MATLAB, whose output were sent back to the simulated workspace and to the actual robots. ROS was used for exchanging data between the simulated environment and the real workspace via its publish and subscribe architecture. Results provided a framework for experimenting with different sensing and acting scenarios, and verified the performance functionality of the simulator.
An Introduction to Controlled Greenhouse Plant Production Systems For Tropica...Redmond R. Shamshiri
Introduction, Greenhouses in the Netherlands vs. Malaysia, Some Definitions, History, Simple fact yet ignored, Concept of Adaptive Solution applied to Greenhouse, Introduction to Greenhouse Automation and Control System Engineering, Open-field vs. Closed-field, Vegetable production in the highlands and lowlands of Malaysia Malaysian Strategy and Policy on Vegetable Production (2011-2020)
Robotic Harvesting of Fruiting Vegetables, “Acceleration by Simulation”Redmond R. Shamshiri
Robotic Harvesting of Fruiting Vegetables, “Acceleration by Simulation”
Presented at the Acceleration Workshop Robotics & Crop Sensing in Greenhouses, 11-12 September 2017. Delf University of Technology, The Netherlands
Robotic Harvesting with NOVABOT innovative manipulator
https://youtu.be/R38IoVcOVt0
Robotic Harvesting with multiple SCARA manipulators
https://youtu.be/TLLW3S-55ls
Robotic Harvesting with Array of Linear Actuators
https://youtu.be/iFu7FAxLvmg
Robotic Harvesting with fanuc lr mate 200id (Visual Servo Control in V-REP, ROS, MATLAB)
https://youtu.be/BwRBUeB812s
Robotic Harvesting, Simulation of Environment and Fruit/Plant Scan
https://youtu.be/XD3J7b0cDGM
Advanced Visual Servo Control in V-REP for Robotic harvesting of sweet pepper
https://youtu.be/VupoirQOL0Y
Robotic Harvesting of Sweet Pepper, Ubuntu, V-REP, ROS Environment Setup
https://youtu.be/tKagjNQ9FW4
Real-time, robust and rapid red-pepper fruit detection with Matlab
https://youtu.be/rFV6Y5ivLF8
Talk
https://youtu.be/QZawPeg3wEQ
Dynamic Assessment of Air Temperature for Tomato (Lycopersicon Esculentum) Cu...Redmond R. Shamshiri
Net-screen covered greenhouses operating on natural ventilation are used as a sustainable approach for closed-field cultivation of fruits and vegetables and to eliminate insect passage and subsequent production damage. The objective of this work was to develop a real-time assessment framework for evaluating air-temperature inside an insect-proof net-screen greenhouse in tropical lowlands of Malaysia prior to cultivation of tomato. Mathematical description of a growth response model was implemented and used in a computer application. A custom-designed data acquisition system was built for collecting 6 months of air-temperature data, during July to December 2014. For each measured air-temperature (T), an optimality degree, denoted by Opt(T), was calculated with respect to different light conditions (sun, cloud, night) and different growth stages. Interactive three-dimensional plots were generated to demonstrate variations in Opt(T) values due to different hours and days in a growth season. Results showed that, air temperature was never less than 25% optimal for early growth, and 51% for vegetative to mature fruiting growth stages. The average Opt(T) in the entire 6 months was between 65 and 75%. The presented framework allows tomato growers to automatically collect and process raw air temperature data and to simulate growth responses at different growth stages and light conditions. The software database can be used to track and record Opt(T) values from any greenhouses with different structure design, covering materials, cooling system and growing seasons, and to contribute to knowledge-based decision support systems and energy balance models.
Saudi Arabia stands as a titan in the global energy landscape, renowned for its abundant oil and gas resources. It's the largest exporter of petroleum and holds some of the world's most significant reserves. Let's delve into the top 10 oil and gas projects shaping Saudi Arabia's energy future in 2024.
NUMERICAL SIMULATIONS OF HEAT AND MASS TRANSFER IN CONDENSING HEAT EXCHANGERS...ssuser7dcef0
Power plants release a large amount of water vapor into the
atmosphere through the stack. The flue gas can be a potential
source for obtaining much needed cooling water for a power
plant. If a power plant could recover and reuse a portion of this
moisture, it could reduce its total cooling water intake
requirement. One of the most practical way to recover water
from flue gas is to use a condensing heat exchanger. The power
plant could also recover latent heat due to condensation as well
as sensible heat due to lowering the flue gas exit temperature.
Additionally, harmful acids released from the stack can be
reduced in a condensing heat exchanger by acid condensation. reduced in a condensing heat exchanger by acid condensation.
Condensation of vapors in flue gas is a complicated
phenomenon since heat and mass transfer of water vapor and
various acids simultaneously occur in the presence of noncondensable
gases such as nitrogen and oxygen. Design of a
condenser depends on the knowledge and understanding of the
heat and mass transfer processes. A computer program for
numerical simulations of water (H2O) and sulfuric acid (H2SO4)
condensation in a flue gas condensing heat exchanger was
developed using MATLAB. Governing equations based on
mass and energy balances for the system were derived to
predict variables such as flue gas exit temperature, cooling
water outlet temperature, mole fraction and condensation rates
of water and sulfuric acid vapors. The equations were solved
using an iterative solution technique with calculations of heat
and mass transfer coefficients and physical properties.
About
Indigenized remote control interface card suitable for MAFI system CCR equipment. Compatible for IDM8000 CCR. Backplane mounted serial and TCP/Ethernet communication module for CCR remote access. IDM 8000 CCR remote control on serial and TCP protocol.
• Remote control: Parallel or serial interface.
• Compatible with MAFI CCR system.
• Compatible with IDM8000 CCR.
• Compatible with Backplane mount serial communication.
• Compatible with commercial and Defence aviation CCR system.
• Remote control system for accessing CCR and allied system over serial or TCP.
• Indigenized local Support/presence in India.
• Easy in configuration using DIP switches.
Technical Specifications
Indigenized remote control interface card suitable for MAFI system CCR equipment. Compatible for IDM8000 CCR. Backplane mounted serial and TCP/Ethernet communication module for CCR remote access. IDM 8000 CCR remote control on serial and TCP protocol.
Key Features
Indigenized remote control interface card suitable for MAFI system CCR equipment. Compatible for IDM8000 CCR. Backplane mounted serial and TCP/Ethernet communication module for CCR remote access. IDM 8000 CCR remote control on serial and TCP protocol.
• Remote control: Parallel or serial interface
• Compatible with MAFI CCR system
• Copatiable with IDM8000 CCR
• Compatible with Backplane mount serial communication.
• Compatible with commercial and Defence aviation CCR system.
• Remote control system for accessing CCR and allied system over serial or TCP.
• Indigenized local Support/presence in India.
Application
• Remote control: Parallel or serial interface.
• Compatible with MAFI CCR system.
• Compatible with IDM8000 CCR.
• Compatible with Backplane mount serial communication.
• Compatible with commercial and Defence aviation CCR system.
• Remote control system for accessing CCR and allied system over serial or TCP.
• Indigenized local Support/presence in India.
• Easy in configuration using DIP switches.
Hierarchical Digital Twin of a Naval Power SystemKerry Sado
A hierarchical digital twin of a Naval DC power system has been developed and experimentally verified. Similar to other state-of-the-art digital twins, this technology creates a digital replica of the physical system executed in real-time or faster, which can modify hardware controls. However, its advantage stems from distributing computational efforts by utilizing a hierarchical structure composed of lower-level digital twin blocks and a higher-level system digital twin. Each digital twin block is associated with a physical subsystem of the hardware and communicates with a singular system digital twin, which creates a system-level response. By extracting information from each level of the hierarchy, power system controls of the hardware were reconfigured autonomously. This hierarchical digital twin development offers several advantages over other digital twins, particularly in the field of naval power systems. The hierarchical structure allows for greater computational efficiency and scalability while the ability to autonomously reconfigure hardware controls offers increased flexibility and responsiveness. The hierarchical decomposition and models utilized were well aligned with the physical twin, as indicated by the maximum deviations between the developed digital twin hierarchy and the hardware.
CW RADAR, FMCW RADAR, FMCW ALTIMETER, AND THEIR PARAMETERSveerababupersonal22
It consists of cw radar and fmcw radar ,range measurement,if amplifier and fmcw altimeterThe CW radar operates using continuous wave transmission, while the FMCW radar employs frequency-modulated continuous wave technology. Range measurement is a crucial aspect of radar systems, providing information about the distance to a target. The IF amplifier plays a key role in signal processing, amplifying intermediate frequency signals for further analysis. The FMCW altimeter utilizes frequency-modulated continuous wave technology to accurately measure altitude above a reference point.
Cosmetic shop management system project report.pdfKamal Acharya
Buying new cosmetic products is difficult. It can even be scary for those who have sensitive skin and are prone to skin trouble. The information needed to alleviate this problem is on the back of each product, but it's thought to interpret those ingredient lists unless you have a background in chemistry.
Instead of buying and hoping for the best, we can use data science to help us predict which products may be good fits for us. It includes various function programs to do the above mentioned tasks.
Data file handling has been effectively used in the program.
The automated cosmetic shop management system should deal with the automation of general workflow and administration process of the shop. The main processes of the system focus on customer's request where the system is able to search the most appropriate products and deliver it to the customers. It should help the employees to quickly identify the list of cosmetic product that have reached the minimum quantity and also keep a track of expired date for each cosmetic product. It should help the employees to find the rack number in which the product is placed.It is also Faster and more efficient way.
Hybrid optimization of pumped hydro system and solar- Engr. Abdul-Azeez.pdffxintegritypublishin
Advancements in technology unveil a myriad of electrical and electronic breakthroughs geared towards efficiently harnessing limited resources to meet human energy demands. The optimization of hybrid solar PV panels and pumped hydro energy supply systems plays a pivotal role in utilizing natural resources effectively. This initiative not only benefits humanity but also fosters environmental sustainability. The study investigated the design optimization of these hybrid systems, focusing on understanding solar radiation patterns, identifying geographical influences on solar radiation, formulating a mathematical model for system optimization, and determining the optimal configuration of PV panels and pumped hydro storage. Through a comparative analysis approach and eight weeks of data collection, the study addressed key research questions related to solar radiation patterns and optimal system design. The findings highlighted regions with heightened solar radiation levels, showcasing substantial potential for power generation and emphasizing the system's efficiency. Optimizing system design significantly boosted power generation, promoted renewable energy utilization, and enhanced energy storage capacity. The study underscored the benefits of optimizing hybrid solar PV panels and pumped hydro energy supply systems for sustainable energy usage. Optimizing the design of solar PV panels and pumped hydro energy supply systems as examined across diverse climatic conditions in a developing country, not only enhances power generation but also improves the integration of renewable energy sources and boosts energy storage capacities, particularly beneficial for less economically prosperous regions. Additionally, the study provides valuable insights for advancing energy research in economically viable areas. 
Recommendations included conducting site-specific assessments, utilizing advanced modeling tools, implementing regular maintenance protocols, and enhancing communication among system components.
Governing Equations for Fundamental Aerodynamics_Anderson2010.pdf
Research and development in agricultural robotics: A perspective of digital farming

July, 2018    Int J Agric & Biol Eng    Open Access at https://www.ijabe.org    Vol. 11 No.4

Redmond Ramin Shamshiri1,2,3*, Cornelia Weltzien2, Ibrahim A. Hameed3, Ian J. Yule4, Tony E. Grift5, Siva K. Balasundram1, Lenka Pitonakova6, Desa Ahmad7, Girish Chowdhary5

(1. Department of Agriculture Technology, Faculty of Agriculture, Universiti Putra Malaysia, 43400, Serdang, Selangor, Malaysia; 2. Leibniz Institute for Agricultural Engineering and Bioeconomy, Max-Eyth-Allee 100, 14469 Potsdam-Bornim, Germany; 3. Dept. of ICT and Natural Sciences, Faculty of Information Technology and Electrical Engineering, Norwegian University of Science and Technology (NTNU), Larsgårdsveien 2, NO-6009 Ålesund, Norway; 4. New Zealand Centre for Precision Agriculture (NZCPA), School of Agriculture and Environment, Massey University, Private Bag 11 222, Palmerston North 4442, New Zealand; 5. Department of Agricultural and Biological Engineering, University of Illinois at Urbana Champaign, 1304 West Pennsylvania Avenue Urbana, IL 61801, USA; 6. Department of Computer Science, University of Bristol, Bristol, United Kingdom; 7. Department of Biological and Agricultural Engineering, Faculty of Engineering, Universiti Putra Malaysia)
Abstract: Digital farming is the application of modern technologies such as sensors, robotics, and data analysis to shift from tedious manual operations to continuously automated processes. This paper reviews some of the latest achievements in agricultural robotics, specifically those used for autonomous weed control, field scouting, and harvesting. Object identification, task planning algorithms, and the digitalization and optimization of sensors are highlighted as some of the challenges facing digital farming. The concepts of multi-robots, human-robot collaboration, and environment reconstruction from aerial images and ground-based sensors for the creation of virtual farms were highlighted as some of the gateways of digital farming. It was shown that one of the trends and research focuses in agricultural field robotics is towards building swarms of small-scale robots and drones that collaborate to optimize farming inputs and reveal denied or concealed information. For the case of robotic harvesting, an autonomous framework with several simple-axis manipulators can be faster and more efficient than the currently adopted expensive professional manipulators. While robots are becoming inseparable parts of modern farms, our conclusion is that it is not realistic to expect an entirely automated farming system in the future.
Keywords: agricultural robotics, precision agriculture, virtual orchards, digital agriculture, simulation software, multi-robots
DOI: 10.25165/j.ijabe.20181104.4278
Citation: Shamshiri R R, Weltzien C, Hameed I A, Yule I J, Grift T E, Balasundram S K, et al. Research and development in agricultural robotics: A perspective of digital farming. Int J Agric & Biol Eng, 2018; 11(4): 1–14.
Received date: 2018-03-28    Accepted date: 2018-07-05
Biographies: Cornelia Weltzien, PhD, Professor, research interests: mechanical engineering, control systems and agricultural engineering. Email: CWeltzien@atb-potsdam.de; Ibrahim A. Hameed, PhD, Associate Professor, research interests: machine learning, AI, optimization and robotics. Email: ibib@ntnu.no; Ian J. Yule, PhD, Professor, President of the International Society of Precision Agriculture, research interests: digital agriculture, system modeling and optimization, remote sensing. Email: I.J.Yule@massey.ac.nz; Tony E. Grift, PhD, Associate Professor, research interests: agricultural robotics, advanced machinery for biosystems applications, automation and control. Email: grift@illinois.edu; Siva K. Balasundram, PhD, Associate Professor, research interests: precision agriculture, information system and technology. Email: siva@upm.edu.my; Lenka Pitonakova, PhD, research interests: swarm robotics, simulation of complex systems, neural networks, unsupervised learning. Email: contact@lenkaspace.net; Desa Ahmad, PhD, Professor, research interests: soil machine mechanics, agricultural machinery engineering, agricultural mechanization, controlled environment and smart farming. Email: desa@upm.edu.my; Girish Chowdhary, PhD, Assistant Professor, research interests: intelligent systems, field robotics, multiple aerial vehicles. Email: girishc@illinois.edu.
*Corresponding Author: Redmond Ramin Shamshiri, PhD, research interests: control systems and dynamics, simulation and modeling. Department of Agriculture Technology, Faculty of Agriculture, Universiti Putra Malaysia, 43400, Serdang, Selangor, Malaysia. Tel: +60-3894-68472, Fax: +60-386567099, Email: raminshamshiri@upm.edu.my.

1 Introduction

Modern farms are expected to produce higher yields with higher quality at lower expense, in a sustainable way that is less dependent on the labor force. Implementation of digital farming and site-specific precision management are some of the possible responses to this expectation, which depend not only on sensor technology but also on the continuous collection of field data that is only feasible through proper utilization of agricultural robots. Agricultural scientists, farmers, and growers also face the challenge of producing more food from less land in a sustainable way to meet the demands of the predicted 9.8 billion population in 2050[1]. That is equivalent to feeding a newly added city of 200 000 people every day. Integration of digital tools, sensors, and control technologies has accelerated the design and development of agricultural robots, demonstrating significant potential and benefits in modern farming. These evolutions range from digitizing plants and fields by collecting accurate and detailed temporal and spatial information in a timely manner, to accomplishing complicated nonlinear control tasks for robot navigation. Autonomous guided tractors and farm machinery equipped with local and global sensors for operating in row crops and orchards have already become mature. Examples include the John Deere iTEC Pro (Deere & Company, Moline, Illinois), which uses the Global Navigation Satellite System for steering control, and the Claas autonomous navigation system (Harsewinkel, Ostwestfalen-Lippe, Germany), which offers Cam Pilot steering and 3D computer vision in addition to GPS-based control to follow features on the ground. Agricultural field robots and manipulators have become an important part of different aspects of digital farming[2]
and precision agriculture[3]. With the advances in control theory, applications of these robots in digital farming have shown growing interest towards automation, changing traditional field activities into high-tech industrial tasks that are attracting investors, professional engineers, and companies. While many are still in the prototype phase, these robots are now capable of performing various farming operations, including crop scouting[4], pest and weed control[5], harvesting[6-10], targeted spraying[11,12], pruning[13,14], milking[15,16], phenotyping[17,18], and sorting[19]. Unlike the industrial case, these applications can be extremely challenging to fully automate. An agricultural robot is subjected to an extremely dynamic environment, and yet is expected to touch, sense, or manipulate the crop and its surroundings in a precise manner, which makes it necessary to have minimal impact while increasing efficiency[20]. Although industrial robotic platforms with precision accuracy and speed are available, their application in agriculture is limited due to what we refer to as unstructured environments and uncertain tasks, which impose great challenges. For example, the demand for off-season cultivation of fruits and vegetables requires different aspects of automation and robotics in closed-field plant production environments such as greenhouses[21]. A field robot with spraying, de-leafing, and harvesting manipulators and end-effectors for such tasks in a dynamic, complex, and uncertain environment should take into account the different arrangements of plant sizes and shapes, stems, branches, leaves, fruit color, texture, obstacles, and weather influences in order to operate efficiently in real-world conditions. In the case of harvesting, for example, the sensing mechanism has to identify the ripeness of fruits in the presence of various disturbances in an unpredictable heterogeneous environment, while the actuation mechanism should perform motion and path planning to navigate inside the plant system or tree canopy with minimum collisions for grasping and removing the soft fruit delicately. This is by far more challenging compared to an industrial robot in charge of picking and placing a solid bolt on an assembly line.
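The "newly added city of 200 000 people every day" figure quoted in the introduction follows from simple arithmetic on the cited 2050 projection. As a quick check, assuming a world population of roughly 7.6 billion in 2018 (a baseline not stated in the paper):

```python
# Rough check of the "feeding a newly added city of 200 000 people every
# day" claim: population growth from ~7.6 billion (2018, assumed) to the
# predicted 9.8 billion in 2050, spread over the intervening days.
population_2018 = 7.6e9   # assumed baseline, not from the paper
population_2050 = 9.8e9   # predicted figure cited in the introduction
days = (2050 - 2018) * 365
added_per_day = (population_2050 - population_2018) / days
print(round(added_per_day))  # on the order of 190 000 people per day
```

The result is close to 200 000, so the claim is consistent with the projection it cites.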
The organization of this paper is as follows: In Section 2 we provide a literature review on the research and development in agricultural robotics, followed by separate discussions focused on weed control, field scouting, and harvesting robots. Section 3 highlights the perspective of agricultural robotics and the opportunities for digital farming and virtual orchards. Section 4 extends our discussion to the challenges of digitalization, automation, and optimization of robotics for precision agriculture. A summary of findings and conclusions is presented in Section 5.
2 Research and development in agricultural robotics

Research works on agricultural robotics cover a wide range of applications, from automated harvesting using professional manipulators integrated with custom-designed mobile platforms and innovative grippers such as the one shown in Figure 1, or autonomous targeted spraying for pest control in commercial greenhouses[22], to optimum manipulator design for the autonomous de-leafing of cucumber plants[23], and simultaneous localization and mapping techniques for plant trimming[24]. Most of the published literature in this context is focused on (i) vision-based control, advanced image processing techniques, and gripper design for automated harvesting of valuable fruits (see for example the published literature on sweet pepper[7,25-28], oil palm[29-31], mango[32], cucumber[23,33-38], almond[39,40], apple[41-43], strawberry[44-46], cherry fruit[47], citrus[48-50], vineyard[51-53], and tomato[54-57]), or (ii) navigation algorithms and robust machine vision systems for the development of field robots that can be used in yield estimation[42,58,59], thinning[60], weeding and targeted spraying[61-64], seedling and transplanting[65,66], delicate handling of sensitive flowers[67,68], and multipurpose autonomous field navigation robots[18,69-75]. In addition, several virtual experimentation frameworks have been developed for agricultural robots. An example is the work of [76], in which generic high-level functionality was provided for easier and faster development of agricultural robots. In another attempt, a customized software platform called ForboMind[77] was introduced to support field robots for precision agriculture tasks, with the objective of promoting reusability of robotic components. ForboMind is open-source and supports projects of varying size and complexity, facilitating collaboration through modularity, extensibility, and scalability. In order to experiment with vision sensors and agricultural robots, [7] created a completely simulated environment in V-REP (Coppelia Robotics)[78], ROS[79], and MATLAB (Mathworks, Natick, MA, USA) for improvement of plant/fruit scanning and visual servoing tasks through easy testing and debugging of control algorithms with zero damage risk to the real robot and to the actual equipment. Example solutions addressing robotic harvesting included eye-in-hand look-and-move configurations for visual servo control[49,80-82], optimal manipulator design and control[29,38], end-effector and gripper design[8,83], stability tests for robot performance analysis in dense obstacle environments[84], motion planning algorithms[85], and orchard architecture design for an optimal harvesting robot[6]. Improvements in vision-based control systems[7,48,49,86,87] have enabled several applications of robotic manipulators for greenhouse and orchard tasks and have contributed to decreases in workload and labor fatigue while improving the efficiency and safety of the operations. These achievements were considered a challenge in the earlier agricultural robotics works[88-90].
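The eye-in-hand visual servoing cited above typically amounts to a proportional control law on the image-feature error. The following is a minimal sketch of that idea, not the implementation used in any of the cited works: it assumes a single point feature at illustrative coordinates, a constant interaction matrix near the goal, and first-order feature dynamics.

```python
import numpy as np

# Minimal image-based visual servoing (IBVS) sketch: drive the feature
# error to zero with the classic law v = -lambda * pinv(L) @ e. The
# interaction matrix L is that of a point feature at normalized image
# coordinates (x, y) with depth Z; the numbers are illustrative.
x, y, Z = 0.1, -0.2, 0.8
L = np.array([
    [-1 / Z, 0.0, x / Z, x * y, -(1 + x**2), y],
    [0.0, -1 / Z, y / Z, 1 + y**2, -x * y, -x],
])

s = np.array([x, y])            # current image feature
s_star = np.array([0.0, 0.0])   # desired feature (image center)
lam, dt = 0.5, 0.1              # control gain and time step

for _ in range(50):
    e = s - s_star
    v = -lam * np.linalg.pinv(L) @ e   # commanded 6-DOF camera twist
    s = s + (L @ v) * dt               # first-order feature model ds = L v

print(np.linalg.norm(s - s_star))  # error decays toward zero
```

Because L has full row rank, L @ pinv(L) is the identity on the feature space, so the error decays exponentially at rate lambda; real systems must additionally estimate depth Z and cope with a time-varying interaction matrix.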
Agricultural field robots[91], on the other hand, contribute to increasing the reliability of operations, improved soil health, and improved yield. They are generally equipped with two or more sensors and cameras for navigation control, simultaneous localization and mapping, and path planning algorithms[92-94]. Some of the earlier attempts at developing agricultural field robot prototypes can be found in the works of [95-98]. The automated harvesting platform shown in Figure 1 is one of the most recent achievements in the field of agricultural robotics. It was introduced by the SWEEPER EU H2020 project consortium (www.sweeper-robot.eu) on July 4, 2018. It is an assembly of an autonomous mobile platform with a Fanuc LR Mate 200iD robot manipulator (Fanuc America Corporation, Rochester Hills, MI) holding an end-effector and catching device for fruit harvesting. The ultimate goal of the SWEEPER project is to put the first working sweet pepper harvesting robot on the market. Using the camera system mounted on the end-effector, the SWEEPER scans plants looking slightly upwards to detect mature fruits (the robot observes the bottom part of the peppers to determine fruit maturity). The camera and sensor setup is completely independent of the surrounding light conditions and provides color images and distance maps that are used for fruit detection, localization, and maturity classification. The SWEEPER robot has been trained to detect obstacles such as leaves and plant stems in the images. The training process was accelerated using simulated artificial pepper plant models and deep learning network algorithms. Once the robot detects a pepper fruit, information about its location is used to perform path planning for the robotic arm trajectory. Because of the limited moving space between the planting rows, calculation of this trajectory can be very complex. The robot then employs visual servo control to reach the peduncle of the spotted peppers. The robot camera takes images from different angles so that the arm approaches the pepper in such a direction that the stem is always on the back side of the pepper. A small cutting tool is positioned just above the pepper, which cuts the peduncle while moving downward. This separates the pepper from the plant's stem and drops it into a catching device, which is moved toward the pepper bin by the robotic arm. It is notable that the SWEEPER can only harvest fruits that are located on the front side of the plants and stems. A conveyor belt is said to be added to the robot in order to convey harvested peppers to a standard pepper trolley. Multiple robots and trolleys will be parts of a fully automated post-harvest logistic management system. The robot will exploit its harvesting skill in full in a single-stem row cropping system. The most suitable existing yellow variety was used during the SWEEPER tests. According to the project website, for the single-row growing system, the performance of SWEEPER evaluated with only fruits that were on the front side of stems was respectively 62% and 31% in the modified and commercial crop. In general, SWEEPER has a success rate of 49% in harvesting ripe fruits with the modified crop, and only 20% with the commercial (current greenhouse growing) system. The average time to harvest one fruit with SWEEPER is between 18 and 25 seconds, comprising 4.73 s for platform movement, 3.71 s for fruit localization, 3.02 s for obstacle localization, 4.03 s for visual servoing, 2.22 s for fruit detaching, and 7.77 s for dropping the fruit in the container (data extracted from the SWEEPER website). The SWEEPER project team has announced on their website that they have also achieved a harvest time of less than 15 seconds (excluding platform movement) in laboratory experiments. It is expected that results of projects like this will serve as input for the development of a new fully optimized and automated fruit production system for the greenhouse horticulture sector.
Source: Sweeper EU H2020 project consortium – www.sweeper-robot.eu.
Figure 1  SWEEPER robot in action: the world's first fully automated sweet pepper harvesting platform
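As a quick consistency check on the cycle-time figures quoted above, the published stage times can simply be summed: they total roughly 25.5 s, matching the upper end of the reported 18-25 s range, with fruit dropping as the longest stage.

```python
# Stage times for one SWEEPER harvest cycle, as quoted from the project
# website (seconds). Summing them reproduces the upper end of the
# reported 18-25 s average cycle time.
stages = {
    "platform movement": 4.73,
    "fruit localization": 3.71,
    "obstacle localization": 3.02,
    "visual servoing": 4.03,
    "fruit detaching": 2.22,
    "dropping fruit in container": 7.77,
}
total = sum(stages.values())
without_platform = total - stages["platform movement"]
bottleneck = max(stages, key=stages.get)
print(f"total: {total:.2f} s, excluding platform: {without_platform:.2f} s")
print(f"longest stage: {bottleneck}")
```

Note that the laboratory figure of "less than 15 seconds excluding platform movement" quoted above implies further speed-ups beyond these field-stage times.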
For the purpose of this paper, we provide a general review of the recent advances in agricultural robotics, with focus on those that employ high-tech sensors, artificial intelligence, machine learning, and simulation environments for (i) weed control and targeted spraying, (ii) field scouting and data collection, and (iii) automated harvesting. We then extend our discussion to introduce some of the most widely used simulation software and virtual platforms that can be adapted to accelerate the design of agricultural robots, improve operational performance, and evaluate control capabilities of the actual hardware.
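The simulated V-REP/ROS/MATLAB environment discussed in Section 2 exchanges data through a publish/subscribe architecture. As a dependency-free illustration of that pattern (plain Python rather than the actual ROS API; topic names and payloads are hypothetical):

```python
from collections import defaultdict

# Minimal publish/subscribe dispatcher illustrating how a simulator and a
# controller can exchange data, in the spirit of the ROS architecture
# mentioned above. Topic names and message contents are hypothetical.
class Bus:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        for callback in self.subscribers[topic]:
            callback(message)

bus = Bus()
commands = []

# The controller listens for fruit poses from the simulator and responds
# with an arm command, mirroring the simulator-to-controller loop.
def on_fruit_pose(pose):
    commands.append(("move_arm", pose))

bus.subscribe("/sim/fruit_pose", on_fruit_pose)
bus.publish("/sim/fruit_pose", {"x": 0.42, "y": 0.10, "z": 1.3})
print(commands[0][0])  # -> move_arm
```

The decoupling shown here is what lets the same controller code drive either the simulated workspace or the real robot, since neither side holds a direct reference to the other.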
2.1 Weed control and targeted spraying robots
One of the main aspects of agricultural robotics is concerned
with the substitution of the human workforce by field robots or
mechanized systems that can handle the tasks more accurately and
uniformly at a lower cost and higher efficiency[6,99-103]
. Weed
control and precise spraying are perhaps the most demanded
applications for agricultural field robots. In this regard, targeted
spraying[104]
with robots for weed control application has shown
acceptable results and reduced herbicide use to as little as 5%-10%
compared to blanket spraying[105]
. While still not fully
commercialized, various promising technologies for weed robots
have been introduced and implemented over the past 10 years as
the results of interdisciplinary collaborative projects between
different international research groups and companies. Some of
the well-known names that are actively involved in the research
and development for various types of weed control robots are the
Wageningen University and Research Center (The Netherlands),
Queensland University of Technology, the University of Sydney,
Blue River Technologies (Sunnyvale, CA, USA), Switzerland’s
ecoRobotix (Yverdon-les-Bains, Switzerland), and France’s Naio
Technologies (Escalquens, France). For example, a flexible
multipurpose farming and weeding robot platform named
BoniRob[18,106]
(shown in Figure 2a) was developed as a joint
project between the University of Osnabrueck, the DeepField
Robotics start-up, Robert Bosch company, and the machine
manufacturer Amazonen-Werke. The available time, labor,
equipment, costs, and types of weeds and the areas infested need
to be considered when planning a weed control program. For
such a robot to be efficient, it should be able to not only substitute
the tedious manual weed removal task, but also decrease the use
of agrochemical and pesticide spraying on the field. Figure 2
shows: (a) BoniRob[18,106]
: an integrated multipurpose farming
robotic platform for row crops weed control developed by
interdisciplinary teams, which is also capable of creating detailed
maps of the field, (b) AgBot II[107]
: an innovative field robot
prototype developed by the Queensland University of Technology
for autonomous fertilizer application, weed detection and
classification, and mechanical or chemical weed control, (c)
Autonome Roboter[108]
: a research robot developed by
Osnabrück University of Applied Sciences for weed control, (d)
Tertill[109]
: a fully autonomous solar powered compact robot
developed by FranklinRobotics for weed cutting, (e) Hortibot[110]
:
a robot developed by the Faculty of Agricultural Sciences at the
University of Aarhus for transporting and attaching a variety of
weed detection and control tools such as cameras and herbicide
spraying booms, (f) Kongskilde Robotti[111]
: a robotic platform
equipped with drive belt operating based on the FroboMind
software[77]
that can be connected to different modules and
4. 4 July, 2018 Int J Agric & Biol Eng Open Access at https://www.ijabe.org Vol. 11 No.4
implements for automated and semi-automated mechanical weed
control, precision seeding, and furrow opening and cleaning, (g)
RIPPA[112]
: a solar-powered Robot for Intelligent Perception and
Precision Application developed by the Australian Centre for
Field Robotics at Sydney University, and (h) spray robot
developed by HollandGreenmachine for smart chemical
application in greenhouses. Some of these robots can reduce
weed chemical use by 80%-90%[18,107,112]
.
Figure 2 Examples of weed control and targeted spraying robots:
(a) BoniRob[18,106] (source: Deepfield Robotics), (b) AgBot II[107] (source: Queensland University of Technology),
(c) Autonome Roboter[108] (source: Osnabrück University), (d) Tertill[109] (source: franklinrobotics.com),
(e) Hortibot[110] (image credit: technologyreview.com), (f) Kongskilde Robotti[111] (image credit: conpleks.com),
(g) RIPPA[112] (source: The University of Sydney), (h) spray robot (source: Hollandgreenmachine)
In order to apply chemicals directly to the weed’s vascular
tissue, a direct chemical application end effector is required to cut
the weed’s stem and spread the chemical on the cut surface. An
example of such an application can be found in [113], where a
prototype weed control robot was developed to spray weeds in
the seed line of cotton plants. A real-time intelligent weed control
system was introduced in [114] for selective herbicide application
to in-row weeds using machine vision and chemical application.
A mini-robot to perform spraying activities based on machine
vision and fuzzy logic has been described in [115,116]. More
examples of autonomous vehicle robots for spraying weeds can
be found in [117-119], [90], and [114]. Development of an
autonomous weeding machine requires a vision system capable of
detecting and locating the position of the crop. Such a vision
system should be able to recognize the accurate position of the
plant stem and protect it during weed control[120]
. A
near-ground image capturing and processing technique to detect
broad-leaved weeds in cereal crops under actual field conditions
has been reported in the work of [121]. Here the researchers
proposed a method that uses color information to discriminate
between vegetation and background, whilst shape analysis
techniques were applied to distinguish between crop and weeds.
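The color-based discrimination step described above can be sketched with the excess-green index, a common choice in this literature (a minimal illustration, not necessarily the exact formulation used in [121]):

```python
import numpy as np

def vegetation_mask(rgb, threshold=0.1):
    """Segment vegetation from soil/background with the excess-green index.

    rgb: H x W x 3 array of floats in [0, 1].
    Returns a boolean mask where True marks likely vegetation.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    total = r + g + b + 1e-8               # avoid division by zero on black pixels
    # Chromatic coordinates make the index more robust to illumination changes.
    exg = 2 * (g / total) - (r / total) - (b / total)
    return exg > threshold

# Toy image: one green "plant" pixel and one brown "soil" pixel.
img = np.array([[[0.2, 0.8, 0.1], [0.5, 0.4, 0.3]]])
print(vegetation_mask(img))  # [[ True False]]
```

Shape analysis would then be applied to the connected components of such a mask to separate crop from weeds, as the cited studies do.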
Shape features of the radish plant and weeds were investigated by
[122]. They proposed a machine vision system using a
charge-coupled device camera for weed detection in a radish farm,
resulting in a recognition success rate of 92% for radish and 98% for
weeds. A combined method of color and shape features for sugar
beet weed segmentation was proposed by [98] with a 90% success
rate in classification. This rate increased to 96% by adding two
shape features. Another approach extracted a correlation between
the three main color components (R, G, and B) of the weed
and sugar beet color classes by means of discriminant
analysis[123]
. Their method resulted in classification
success rates between 77% and 98%. The segmentation of weeds
and soybean seedlings by CCD images in the field was studied by
[124]. Texture features have been applied for
distinguishing weed species, with grass and broadleaf classification
accuracies of 93% and 85%, respectively[125]
. Textural image
analysis was used to detect weeds in the grass[126]
. Gabor wavelet
features of NIR images of apples were extracted for quality
inspection and used as input to kernel PCA[127]
. Kernel PCA first
maps the nonlinear features to a linear space, and then PCA is
applied to separate the images. Gabor wavelet features (5 scales
and 8 orientations) combined with kernel PCA had the highest
recognition rate (90.5%). Spray robots for weed control have been developed with
vertical spray booms that increase the deposition in the
canopy[128-130]
. Some of the emerging technologies are the
self-propelled vehicles such as Fumimatic® (IDM S.L, Almería,
Spain) and Tizona (Carretillas Amate S.L., Almería, Spain), or
autonomous field robots such as Fitorobot (Universidad de Almería,
Cadia S.L., Almería, Spain) that have been designed specifically to
navigate inside fields that have loose soil and operate in situations
where a large number of obstacles are present[130]
. Some of these
robots are based on inductive sensors for following metal pipes that
are buried in the soil. Studies that report autonomous robot
navigation inside greenhouse environments are scarce[116,118,131,132]
.
A fixed-position weed robot was presented by [133], which is
interfaced to a standard belt-conveyor displacement system and
provides the robot with pallets containing the crops. Artificial
neural networks have also been used by many researchers to
discriminate weeds with machine vision[134,135]
. For example,
BoniRob[18,106]
uses AI to differentiate between weeds and plants
and then mechanically destroys the detected weeds using a
custom-built mechanism called “ramming death rod”. Different
control modules of BoniRob are connected by Ethernet and
communicate using TCP/IP. This platform has 16 degrees of
freedom (DOF) realized by different electro-motors and hydraulic
cylinder actuation. Each of the 4 wheels is driven by separate
motors and can be steered independently (motor controllers are
connected by a CAN bus). These reviews indicate that a fully
commercial robotic platform for the elimination of weeds has not
been realized yet. In addition, most of the research works in the
5. July, 2018 Shamshiri R R, et al. Research and development in agricultural robotics: A perspective of digital farming Vol. 11 No.4 5
area of robotic weed control are applicable prior to plant growth,
or in some cases when the main plant height is between 0.2 and 0.3 m.
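Most of the feature-based weed/crop discrimination methods surveyed in this section share the same pattern: extract a few color and shape features per segmented plant region, then apply a discriminative model. A minimal sketch, with a nearest-centroid classifier standing in for the cited neural-network and discriminant-analysis models (all feature values are hypothetical):

```python
import numpy as np

# Hypothetical per-region features: [area_cm2, elongation, mean excess-green].
crop_samples = np.array([[12.0, 1.2, 0.35], [14.0, 1.1, 0.40], [11.0, 1.3, 0.38]])
weed_samples = np.array([[3.0, 2.8, 0.20], [2.5, 3.1, 0.18], [4.0, 2.5, 0.22]])

# Nearest-centroid classification: the simplest stand-in for the
# neural-network and discriminant-analysis models cited in this section.
centroids = {"crop": crop_samples.mean(axis=0), "weed": weed_samples.mean(axis=0)}

def classify(features):
    """Label a segmented plant region by its closest class centroid."""
    return min(centroids, key=lambda c: np.linalg.norm(features - centroids[c]))

print(classify(np.array([13.0, 1.2, 0.37])))  # crop
print(classify(np.array([3.2, 2.9, 0.19])))   # weed
```

The reported 77%-98% success rates come from far richer feature sets and classifiers, but the pipeline structure is the same.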
2.2 Field scouting and data collection robots
Field scouting robots face various interdisciplinary challenges
for providing reliable data and measurements that can be used and
processed by precision agriculture and crop models. Other than
the challenges of inherent physical and biological variability
involved with farm fields and orchards, scouting robot platforms
are expected to be flexible, multipurpose, and affordable in order to be
considered viable for use at commercial scale. If successfully
integrated and implemented, these robots can play a key role in
reducing production costs, increasing productivity and quality, and
enabling customized plant and crop treatments. Development of
scouting robots for the purpose of data collection and modern
farming incorporates extensive use of advanced sensors for
precision agriculture[136,137]
in order to generate valuable results
while performing automatic and accurate navigation control,
manipulator control, obstacle avoidance, and three-dimensional
environment reconstructions. For example, an autonomous field
survey mobile robot platform with custom manipulator and gripper
was proposed[138]
to carry imaging sensors and GPS devices for
autonomous navigation and data collection inside greenhouses and
open-field cultivation environments (Figure 3a-3c). Various
multi-spectral imaging devices and LiDAR sensors are reported to
have been installed and used with modified mobile robot platforms
for automated monitoring and building reconstructed 3D point
clouds for generating computer images of trees and plants[69,139-141]
such as those shown in Figure 3d.
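Reconstruction pipelines like those above typically downsample the raw point cloud before building plant models; a minimal numpy sketch of standard voxel-grid downsampling (a generic step, not the specific method of the cited works):

```python
import numpy as np

def voxel_downsample(points, voxel_size):
    """Reduce a point cloud by averaging all points that fall into the same voxel.

    points: N x 3 array of (x, y, z) coordinates in meters.
    Returns an M x 3 array with one averaged point per occupied voxel.
    """
    voxel_ids = np.floor(points / voxel_size).astype(np.int64)
    # Map each point to the index of its (unique) voxel.
    _, inverse = np.unique(voxel_ids, axis=0, return_inverse=True)
    inverse = inverse.ravel()
    counts = np.bincount(inverse)
    out = np.zeros((counts.size, 3))
    for dim in range(3):
        out[:, dim] = np.bincount(inverse, weights=points[:, dim]) / counts
    return out

# 1000 random points spread over a simulated 2.0 m x 0.5 m x 1.5 m plant row.
rng = np.random.default_rng(0)
cloud = rng.random((1000, 3)) * np.array([2.0, 0.5, 1.5])
down = voxel_downsample(cloud, voxel_size=0.05)
print(cloud.shape[0], "->", down.shape[0])
```

Downsampled clouds of this kind are the usual input to the tree- and plant-detection steps reported in [69,139-141].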
Figure 3 Examples of prototype and professional field robots for scanning and 3D reconstruction of plants and environment:
(a) a prototype surveillance field robot[138] (AdaptiveAgroTech.com), shown actual and simulated;
(b) OSCAR field survey robot (Inspectorbots.com); (c) Husky UGV for field scouting and 3D mapping
(Clearpathrobotics.com), shown actual and simulated; (d) point cloud and detected maize plants[69], and 3D
point clouds of a vineyard created by VinBot/Robotnik Automation (www.robotnik.eu)[140]
Some of the most advanced robotic technologies for automated
field scouting and data collection are shown in Figure 4 including
(a) Trimbot2020[142]
, an outdoor robot based on a commercial
Bosch Indigo lawn mower platform and Kinova robotic arm for
automatic bush trimming and rose pruning, (b) Wall-Ye[143]
, a
prototype vineyard robot for mapping, pruning, and possibly
harvesting the grapes (wall-ye.com), (c) Ladybird[144,145]
, an
autonomous multipurpose farm robot for surveillance, mapping,
classification and detection for different vegetables, (d)
MARS[146,147]
: the mobile agricultural robot swarms are small and
streamlined mobile robot units that have minimal soil
compaction and energy consumption and aim at optimizing
plant-specific precision agriculture, (e) SMP S4: a surveillance robot for
bird and pest control developed by SMP Robotics
(smprobotics.com), (f) Vine agent, a robot equipped with advanced
sensors and artificial intelligence to monitor the field for plant’s
health assessment developed at the Universitat Politècnica de
València, (g) HV-100 Nursery Bot, a lightweight robot developed
by Harvest Automation (harvestai.com) for moving plants and potted trees in
greenhouses and small orchards, (h) VinBot[59,148]
: an all-terrain mobile robot with
advanced sensors for autonomous image acquisition and 3D data
collection from vineyards for yield estimation and information
sharing, (i) Mantis, a flexible general purpose robotic data
collection platform equipped with RADAR, LiDAR, panospheric,
stereovision, and thermal cameras[32]
, and (j) GRAPE, a Ground
Robot for vineyard monitoring and ProtEction, funded by the
European Union, for smart autonomous navigation, plant
detection and health monitoring, and manipulation of small
objects[149]
.
Figure 4 Examples of general purpose robots for field scouting and data collection:
(a) TrimBot[142] (trimbot2020.org), (b) Wall-Ye vineyard robot (wall-ye.com), (c) Ladybird[144] (Univ. of Sydney),
(d) MARS[146,147] (echord.eu/mars), (e) SMP S4 (smprobotics.com), (f) VineRobot (vinerobot.eu),
(g) HV-100 Nursery Bot (harvestai.com), (h) VinBot[59,148] (vinbot.eu), (i) Mantis[32] (Univ. of Sydney),
(j) GRAPE (grape-project.eu)
2.3 Harvesting robots
Traditional harvesting of fruits and vegetables for fresh market
is a labor-intensive task that demands shifting from tedious manual
operation to continuously automated harvesting. Increasing the
efficiency and reducing the labor dependency of harvesting will ensure
high-tech food production yield and competitiveness. In spite of
the advances in agricultural robotics, millions of tons of fruits and
vegetables are still hand-picked every year in open fields and
greenhouses. Other than the high labor cost, the limited availability of a
skilled workforce that accepts repetitive tasks in harsh field
conditions imposes uncertainties and timeliness costs. For robotic
harvesting to be cost-effective, fruit yield needs to be maximized to
compensate for the additional automation costs. This leads to
growing the plants at higher densities, which makes it even harder
for an autonomous robot to simultaneously detect, localize,
and harvest the fruit. In the case of sweet pepper fruit, with an estimated
yield of 1.9 million tons/year in Europe, reports indicate that while
an average time of 6 seconds per fruit is required for automated
harvesting, the available technology has only achieved a success
rate of 33% with an average picking time of 94 seconds per fruit[25]
.
For cucumber harvesting, a cycle time of 10 seconds was proven to
be economically feasible[38]
. In Washington State alone, 15-
18 billion apples are harvested manually every year. An
estimated 3 million tons of apples are reported to have been
produced in Poland in 2015[150]
, out of which one-third are delicate
fruits and are less resistant to bruising from mass harvester
machines. Also in Florida, where the current marketable yield of
sweet pepper fruits in open-field cultivation is 1.6 to 3.0 lb/ft2, with a
potential yield of 4 lb/ft2
in passively ventilated greenhouses[151]
,
manual harvesting is still the only solution. Therefore,
development of automated robotic harvesting should be
considered as an alternative method to address the associated labor
shortages, costs, and timeliness. A fully automated robotic harvester
will contribute to solving some of today’s major grower issues,
such as labor costs, labor availability, food safety and quality. It
also plays an essential role in improving the interactions between
human, machine, and plants[131]
. For example, the prevention of
musculoskeletal disorders in manual harvesting operations in Dutch
greenhouses has motivated various researchers to replace
human labor with autonomous robots for picking cucumber[33]
and
sweet pepper[7]
fruits. A functional model was then introduced[23]
in the field test of an autonomous robot for de-leafing cucumber
plants grown in a high-wire cultivation system. Field results
showed that the de-leafing robot spent an average time of
140 seconds per plant to remove two leaves, which was 35 times longer than
manual leaf picking[23]
.
Research and development in robotic harvesting date back to
the 1980s, with Japan, The Netherlands, and the USA as the
pioneer countries. The first studies used simple monochrome
cameras for fruit detection inside the canopy[152]
. Other than the
visible light RGB cameras[41,153]
and the ultrasonic radar sensors
that are commonly used for object detection due to their affordable
cost[154]
, advances in the sensing and imaging technology have led
to the employment of sophisticated devices such as infrared[47]
,
thermal[155]
, hyperspectral cameras[156]
, LiDAR[32,39,40,157]
, or
combination of multi-sensors[158]
that are adopted with novel
vision-based techniques for extracting spatial information from the
images for fruit detection, recognition, localization, and tracking.
A common approach in fruit detection and counting[153]
is by using
a single viewpoint, as in the case of a cucumber harvesting robot[23]
,
or multiple viewpoints [32]
with additional sensing from one or
multiple vision sensors that are not located on the robot[159]
.
Examples of the recent achievements include automatic fruit
recognition from multiple images[160]
or based on the fusion of
color and 3D feature[161]
, multi-template matching algorithm[162]
,
symmetry analysis[163]
, combined color distance method and
RGB-D data analysis for apples[164]
and sweet-peppers[41]
, stereo
vision for apple detection[165,166]
, and the use of convolutional
neural networks[167]
and deep learning algorithms for fruit detection
and obstacle avoidance in extremely dense foliage[54,168]
. Some of
the challenges to be addressed in designing a complete robotic
harvester are the simultaneous localization of fruits and
environment mapping, path planning algorithms, and the number of
detectable and harvestable fruits in different plant density
conditions. Significant contributions have been made by various
research groups to address these challenges; however, there is
currently no report of a commercial robotic harvester for the fresh
fruit market[169]
, mainly due to the extremely variable and
heterogeneous working conditions and the complex and unpredictable
tasks involved with different fruit and plant scenarios. The
function of a harvesting robot can be separated into three main
sections: sensing (i.e., fruit recognition), planning (i.e.,
hand-eye coordination), and acting (i.e., end-effector
mechanism for fruit grasping)[170]
.
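This sensing-planning-acting decomposition can be sketched as a single harvest cycle (all class and function names here are illustrative, not taken from any cited system):

```python
from dataclasses import dataclass

@dataclass
class Fruit:
    x: float
    y: float
    z: float      # position in the robot frame, meters
    ripe: bool

def sense(camera_frame):
    """Sensing: detect and localize ripe fruits in the current view."""
    return [f for f in camera_frame if f.ripe]

def plan(fruits, gripper_pos):
    """Planning: choose the nearest detected fruit (hand-eye coordination)."""
    if not fruits:
        return None
    return min(fruits, key=lambda f: (f.x - gripper_pos[0]) ** 2 +
                                     (f.y - gripper_pos[1]) ** 2 +
                                     (f.z - gripper_pos[2]) ** 2)

def act(target):
    """Acting: move the end-effector and trigger the grasp/cut mechanism."""
    return f"harvested fruit at ({target.x:.2f}, {target.y:.2f}, {target.z:.2f})"

# One harvest cycle on a simulated camera frame with one ripe, one unripe fruit.
frame = [Fruit(0.4, 0.1, 1.2, True), Fruit(0.9, -0.2, 1.0, False)]
target = plan(sense(frame), gripper_pos=(0.0, 0.0, 1.0))
if target is not None:
    print(act(target))  # harvested fruit at (0.40, 0.10, 1.20)
```

In a real system each stage is of course far more involved (occlusion handling, collision-free trajectories, grasp force control), but the loop structure is the one described in [170].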
The body of theoretical and applied research on robotic harvesting of fruits
and vegetables is huge. Figure 5 shows some of the efforts that
resulted in building actual robotic harvesting platforms, including
(a) Harvey[28]
: an autonomous mobile robot platform with UR5
manipulator for harvesting sweet peppers grown in greenhouses
and other protected cultivation systems, (b) the CROPS harvesting
platform for sweet pepper[27,171]
, (c) the SWEEPER platform
(developed by the Sweeper EU H2020 project consortium,
www.sweeper-robot.eu) with a Fanuc LRMate 200iD robot
manipulator (Fanuc America Corporation, Rochester Hills, MI) and
a custom-built gripper and catching mechanism for sweet pepper
harvesting, (d) the Energid robotic citrus picking system (Bedford,
MA), (e) the citrus harvesting robot[48,49,155]
developed at the
University of Florida which uses a custom built gripper mounted
on the Robotics Research manipulator model 1207 (Cincinnati,
Ohio), (f) the DogTooth strawberry robot (Great Shelford,
Cambridge, UK), (g) the Shibuya Seiki robot that can harvest
strawberry fruits every 8 seconds, (h) a tomato harvesting robot
from Suzhou Botian Automation Technology Co., Ltd (Jiangsu,
Suzhou, China), (i) a cucumber harvesting robot developed at the
Wageningen University and Research Center[35,38]
, (j) an apple
harvesting robot[172]
with custom built manipulator mounted on top
of a modified crawler mobile robot, (k) one of the first
manipulators developed for the CROPS project[171]
and modified
for apple harvesting, (l) a linear actuator robotic system for apple
picking developed by ffrobotics (Gesher HaEts 12, Israel), (m) a
vacuum mechanism robot for apple picking from
AbundantRobotics (Hayward, CA, USA), (n) the UR5 manipulator
with a soft robotic universal gripper for apple harvesting developed
at the University of Sydney, and (o) an apple catching prototype
robot[173-175]
developed at Washington State University. Most
of these projects have used an eye-in-hand look-and-move
configuration in their visual servo control. Other than the issues
with frame transformation, this solution is not promising if the fruit
is heavily occluded by the high-density plant leaves[176]
.
Obviously, the final robot prototype needs to be considerably faster for
mass harvesting, with an affordable cost for greenhouse growers.
Swarms of simple robots with multiple low-cost cameras and
innovative soft robotic grippers[177]
, or human-robot collaboration,
are the research topics aimed at solving the challenges in robotic
harvesting that current technology cannot overcome. These
approaches can significantly improve the processing time of
multiple fruit detection in high-density plants, and provide
ground-truth results over time for machine learning algorithms
based on human operators’ experience. In fact, a promising
solution to efficient robotic harvesting is not through a single robot
manipulator. Results of simulation studies have revealed that
single-arm robots for rapid harvesting are still far from
realization, and have failed mainly due to the “sensing and moving”
action in high vegetation density. In this approach, even if the
fruit localization is accurate and the robot control calculates an
optimum trajectory to reach the fruit without receiving additional
sensing feedback from the camera, the moment the arm enters the
dense plant canopy it disturbs the canopy and displaces the target fruit from its measured location.
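This failure mode of one-shot localization can be illustrated with a toy one-dimensional simulation (all numbers hypothetical):

```python
# Open-loop "look-and-move": the fruit is localized once, then the arm
# follows the planned trajectory with no further visual feedback.
fruit_position = 0.50        # measured fruit position (m) before reaching in
grasp_tolerance = 0.02       # the gripper tolerates up to 2 cm of error

# Entering the dense canopy pushes foliage aside and displaces the fruit.
canopy_displacement = 0.06   # hypothetical 6 cm shift caused by the arm itself
actual_position = fruit_position + canopy_displacement

gripper_position = fruit_position   # the arm arrives at the *old* estimate
error = abs(gripper_position - actual_position)
print(f"grasp error: {error * 100:.0f} cm ->",
      "success" if error <= grasp_tolerance else "miss")
# grasp error: 6 cm -> miss
```

Closed-loop visual servoing keeps re-measuring the target during the approach, which is why camera occlusion inside the canopy is so damaging to the cited systems.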
Figure 5 Examples of harvesting robots for different fruits:
(a) Harvey[28] (Queensland Univ. of Technology), (b) CROPS[27,171] (crops-robots.eu), (c) SWEEPER (sweeper-robot.eu),
(d) Energid citrus picking system (Energid Technologies), (e) citrus robot[48,49,155] (University of Florida),
(f) DogTooth (www.dogtooth.tech), (g) Shibuya Seiki (shibuya-sss.co.jp), (h) tomato harvesting robot (szbotian.com.cn),
(i) cucumber robot[35,38] (Wageningen UR), (j) apple harvesting robot[172], (k) apple harvesting[178] (crops-robots.eu),
(l) apple picker (FFRobotics.com), (m) apple picking vacuum (abundantrobotics.com), (n) UR5 apple robot (Univ. of Sydney),
(o) apple catching[173-175] (Washington State University)
3 Agricultural robotics and digital farming
Agricultural robotics is a promising solution for digital
farming and for handling the problems of workforce shortage and
declining profitability. Initial tests with one of the most recent
technologies available for automated harvesting (the Harvey[28]
robot) have already shown a success rate of 65% and a detachment
rate of 90% for sweet pepper harvesting in a real planting scenario
where no leaves or occluded fruits were trimmed or removed.
Field agent robots that autonomously monitor and collect data
empower growers with real-time detailed information about their
crops and farms, revealing up-to-date imagery for making
data-driven decisions. Agricultural robotics is taking farming
practices to a new phase: robots are becoming smarter, detecting sources
of variability in the field, consuming less energy, and adapting
their performance to more flexible tasks. They have become an
integral part of the big picture in the future production of
vegetables and crops, e.g., growing plants in space or developing
robotized plant factories for producing vegetables in Antarctica.
The trend in food production is towards automated farming
techniques, compact Agri-cubes, and cultivation systems that have
minimal human involvement, where the skilled workforce is being
replaced with robotic arms and mobile platforms. In this context,
digital farming has integrated new concepts and advanced
technologies into a single framework for providing farmers and
stakeholders with a fast and reliable method of real-time
observations at the plant level (i.e., field data collection and crop
monitoring) and acting at a more precise scale (i.e., diagnostics,
strategic decision-making, and implementing). Digital farming is
about collecting high-resolution field and weather data using
ground-based or aerial-based sensors, transmitting these data to a
central advisory unit, interpreting and extracting information, and
providing decisions and actions to the farmers, field robots, or
agro-industries. Examples include thermal-RGB imaging
systems[179]
for monitoring plant and soil health,
creation of information maps (i.e., yield and density maps), and
data sharing. Implementation of digital farming practices results
in a sustainable, efficient, and stable production with a significant
increase in yield. Some of the technologies involved in digital
farming include the Internet of Things[180]
, big data analysis[181]
,
smart sensors[182]
, GPS and GIS, ICT[183]
, wireless sensor
networks[184,185]
, UAV[186-188]
, cloud computing[189-191]
, simulation
software[192-195]
, mapping applications[196,197]
, virtual farms[198-200]
,
mobile devices[201-204]
, and robotics. A conceptual illustration of
digital farming and its relationship with agricultural robotics is
provided in Figure 6, showing that the collected data by the robot
agents are sent to a cloud advisory center for decision making.
The actions are then implemented quickly and accurately by the
use of robots or other automated machinery, sending operational
updates and notification feedback to the farmers and
agro-industry sections. This system of computer-to-robot
communication combined with the sophisticated simulation
software, analytics applications, and data sharing platforms offers
a much smoother control over farming operations. In addition, it
provides farmers with details of historical field data for improving
their performance and optimizing crop yields for specific plots, or
even developing new business models.
Source: www.AdaptiveAgroTech.com
Figure 6 A conceptual illustration of digital farming and virtual orchards with emphasis on the role of agricultural robotics
A key consideration before utilization of field robots in
large-scale farms is to take into account the cost-benefit analysis
for selecting the number and size of the robots that are required for
a specific operation. Digital farming can address this by having
simulated robots work in the 3D reconstructed field, orchards, or
plantations that have been created using UAV imagery and
photogrammetry software. With this approach, even people with
limited experience or knowledge about crop production, data
collection, and analytical methods could connect their farms to a
network, share their field information, and receive prescriptions.
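In its simplest form, the cost-benefit question raised above (how many robots a given operation needs) reduces to a coverage calculation; all figures below are hypothetical:

```python
import math

def robots_required(field_area_ha, coverage_ha_per_h, hours_per_day, window_days):
    """Minimum number of robots needed to cover a field within an agronomic window."""
    capacity_per_robot = coverage_ha_per_h * hours_per_day * window_days
    return math.ceil(field_area_ha / capacity_per_robot)

# Hypothetical scenario: 120 ha of row crops, a small weeding robot covering
# 0.5 ha/h, 10 working hours per day, and a 5-day agronomic window.
n = robots_required(field_area_ha=120, coverage_ha_per_h=0.5,
                    hours_per_day=10, window_days=5)
print(n)  # 5
```

Simulated robots in a reconstructed virtual farm let such coverage and cost figures be estimated before any hardware is purchased.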
Using digital farming, growers can collect valuable information
about their fields that were previously ignored or
measured with traditional scouting methods. For example,
detailed measurements about height and size of each tree in a citrus
orchard, nutrient stress, the required time for robot navigation
inside the orchard, estimated time for robotic harvesting of a tree,
and much more can be extracted from a virtual orchard. While
this integration might still seem too ambitious to be widespread in
many regions, it offers new insights that enhance the
capabilities of a modern farming system. Development of a
new generation of agricultural robots that can easily and safely
cooperate to accomplish agricultural tasks has become necessary. Heavy tractors
and machinery used today compact the soil, which over time
severely deteriorates its fertility. This is a significant
threat to soils in Europe. Compacted soils require more than a
decade of expensive treatment to recover their fertility. The
problem can be solved by replacing heavy tractors with a number
of smaller vehicles that can treat crop fields just as well and
without compacting the soil. However, that scenario requires a
human supervisor/operator for each vehicle, which is very expensive.
A technology is then required to enable a single farmer to
supervise and operate a team of these automated vehicles. This
includes the development of a mission control center and
intelligent coverage path planning algorithms to enable team
members to communicate and cooperate, and solve a range of
agricultural tasks in a safe and efficient way. One of the topics
that has been proposed by many researchers for a long time is
the exotic concept of multiple robots that can work together to
accomplish a specific farming task. The idea is to employ
artificial intelligence and genetic algorithms where multiple
robots are programmed to collaborate with each other and form
an ecosystem. This approach becomes even more useful when
robots begin learning from each other and improve their
performance over time. For example, a swarm of robots can
contribute to the creation of nutrient maps by collecting soil
samples and communicating with a cloud advisory center for
executing proper action on the go. The efficiency of this
process may not be great in the beginning, but the performance
can be improved over time by deep learning algorithms
that reward the so-called good behavior and punish the
bad behavior of each robot. These robots offer great
advantages for digital farming. For example, a heterogeneous
Multi-Robot system comprising a ground mobile platform and
an aerial vehicle (Figure 7) for mapping environmental variables
of greenhouses has been simulated in the Unity3D 5.2.1 game
engine[205]
. This system can measure temperature, humidity,
luminosity, and carbon dioxide concentration at ground level and at
different heights. Some of the relevant challenges in developing
multi-robot sensory systems are mission planning and task
allocation, obstacle avoidance, and the guidance, navigation, and
control of robots in different farming scenarios.
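The task-allocation challenge mentioned above can be sketched with a greedy baseline that assigns sampling waypoints to the nearest available robot (a simple illustration, not a method from [205]; all positions hypothetical):

```python
# Greedy task allocation: each sampling waypoint is assigned to the robot
# currently closest to it, and that robot then "moves" to the waypoint.
robots = {"ugv_1": (0.0, 0.0), "ugv_2": (50.0, 0.0), "uav_1": (25.0, 40.0)}
waypoints = [(5.0, 5.0), (48.0, 3.0), (26.0, 38.0), (10.0, 2.0)]

def dist2(a, b):
    """Squared Euclidean distance (sufficient for nearest-neighbor comparison)."""
    return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2

assignments = {}
for wp in waypoints:
    nearest = min(robots, key=lambda name: dist2(robots[name], wp))
    assignments.setdefault(nearest, []).append(wp)
    robots[nearest] = wp   # the robot relocates to the waypoint it just served

print(assignments)
```

Real mission planners add battery constraints, heterogeneous capabilities (UAV vs. UGV), and global optimization, but greedy nearest-robot assignment is the usual baseline they are compared against.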
Figure 7 Examples of Multi-Robot systems for digital farming: (a) air-ground Multi-Robot platform and its
simulation in Unity3D[205], (b) Project MARS[146,147], (c) a swarm of nano quadrotors (source: TheDmel YouTube channel)
4 Challenges of robotics for precision agriculture:
Digitalization, Automation, and Optimization
After 20 years of research in precision agriculture there are
nowadays many types of sensors for recording agronomically
relevant parameters, as well as many farm management systems.
Electronically controlled machines and robots are state of the art.
In fact, technology is now capable of automating cyber-physical
systems by networking between different machines. This is what
we call “agriculture 4.0”. However, it still cannot be claimed that
precision agriculture has been widely established in crop
production. Why not? Data alone are not enough. Automatic
data recording only helps farm results where the analysis of the
collected material takes less time and allows more profit to be
made compared with good management decisions based on gut
feeling and experience. Today, the largest portion of added value
deriving from the new technology lies with the machinery and not
the agricultural products. For instance, futures trading on the
commodity market has a much faster and more direct influence on
the value development of agricultural products than does, e.g.,
increasing product quality or yield by single-figure percentages
through site-specific management techniques. There remains,
however, the advantage of time savings.
Thus, the task of agricultural robotic engineering is
the creation of intelligent, simple-to-operate, so-called “smart”
systems. We call smart products those that appear cleverer than
the user in that they deliver answers even before the question has
been asked. An example: so-called fitness bracelets that record
and analyze the wearer’s movements. The smartness of the
equipment lies in the analysis of the values: if step count and heart
rate are below average, the finding leads to a treatment
recommendation: exercise more! But the user still has to carry out
this recommendation. A second example comes from
precision agriculture. Easily the most successful crop plant
sensor systems are those that analyze, recommend, and then apply
a treatment in one go, such as the so-called N-Sensors. Although
the analytical procedure within the system is highly complex, such
sensors are very easy to operate. In contrast, yield mapping,
for instance, is an off-line approach that requires additional
processing steps to analyze the data on a PC. On top of
everything else, the yield information gained from one harvest can
only be put to use in the next growing season, representing a
long-term investment with much manual input and benefits that are
difficult to assess.
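The one-pass analyze-recommend-apply loop of such on-the-go sensors can be sketched as follows; the target index, gain, and dose cap are illustrative assumptions, not parameters of any commercial N-Sensor:

```python
def recommend_n_rate(reflectance_index, target_index=0.7, gain=120.0):
    """Map a crop reflectance reading to a nitrogen rate (kg/ha):
    the paler the canopy relative to the target, the higher the dose."""
    deficit = max(0.0, target_index - reflectance_index)
    return round(min(gain * deficit, 60.0), 1)  # cap the single-pass dose

def on_the_go_pass(readings):
    """Analyze, recommend, and apply in one go for each sensor
    reading taken while the applicator drives through the field."""
    return [(r, recommend_n_rate(r)) for r in readings]

for reading, rate in on_the_go_pass([0.72, 0.55, 0.40]):
    print(f"index={reading:.2f} -> apply {rate} kg N/ha")
```

The point of the sketch is the architecture, not the agronomy: analysis and actuation happen in the same pass over the field, which is what makes the system easy to operate despite its internal complexity.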
Challenges for sensor development and agricultural robotic
technology lie in the required high temporal and spatial resolution
of data for parameters that are highly diverse and difficult to
measure under mostly unfavorable conditions. The aim of new analysis
methods is to combine the data and to fuse the different
information layers in order to derive new knowledge.
Additionally, automation of the data collection tasks is a
requirement in the development of “smart sensor” systems for
agricultural application in the sense that the decision making is
embedded in the sensor so that the results are directly applicable to
the robot for carrying out precise management actions. But what
exactly does the fitness bracelet mentioned above have to do with the
N-Sensor as far as content is concerned? Both sensors analyze data
and process the material up to the point where recommendations
for direct action can be deduced from the result. Additionally,
both analytics are based on indicators not directly related to the
actual target values. The “fitness” of the plant can be efficiently
assessed through foliage chlorophyll content or green color. But
where the cause of the problem is not poor nitrogen supply but
instead lack of moisture, then the system must have this additional
information available. In this respect, intuitive interaction
between man and robot is necessary, a point that also represents a
great development challenge for sensors and automation
technology in crop production. The more we appreciate the
importance of understanding detailed agronomic relationships, the
greater the need for information to understand these relationships
better. The more information is available, the deeper the
understanding, which in turn demands more data collection. The
situation is therefore a loop within which, especially in recent
years, more and more data have been collected and increasingly
detailed agronomic knowledge has been developed.
However, the practical application of directly
usable agronomic knowledge has stagnated. Nowadays it still
requires a considerable mass of statistics and software expertise for
comprehensive application of precision agricultural technology.
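A minimal sketch of the multi-causal reasoning described above, using hypothetical thresholds for a chlorophyll index and volumetric soil moisture (the numbers are illustrative, not calibrated values):

```python
def diagnose(chlorophyll_index, soil_moisture):
    """Separate nitrogen deficiency from drought stress: a pale
    canopy alone is ambiguous until moisture data is fused in."""
    if chlorophyll_index >= 0.6:
        return "healthy: no action"
    if soil_moisture < 0.15:  # m3/m3, dry soil
        return "drought stress: irrigate, do not fertilize"
    return "nitrogen deficiency: apply N"

print(diagnose(0.45, 0.10))  # pale canopy + dry soil
print(diagnose(0.45, 0.25))  # pale canopy + moist soil
```

The same pale-canopy reading triggers two different treatments depending on the second information layer, which is exactly why single-indicator sensors must be extended into multi-causal decision systems.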
For further development of smart sensors, relevant information
must be integrated into multi-causal decision-making systems in
order to generate knowledge. The targets are complex systems
packaged as easy-to-operate solutions: systemic, comprehensive,
and transparent concepts with good usability and simple application.
There must also be a way for practical experience to flow into these
integrated systems so that farmers, with the help of the technology,
can develop their expertise further. A core theme in the
development of Ag-robotics decision support systems is the step
from data storage through information retrieval to knowledge
management involving large amounts of data. Currently,
possibilities for the analysis of agricultural data and sensor data
fusion are being expanded through the application of multivariate
statistical methods and machine learning techniques. The system
limits are thereby increasingly expanded; holistic concepts for
complete added-value networks are already in focus, with the mobile
transmission of data serving as an enabling technology for fully
integrated systems that perform real-time data fusion from
different sources.
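As one textbook example of such fusion (not a method drawn from the reviewed literature), inverse-variance weighting combines the same field variable reported by several sources, trusting each in proportion to its certainty; the source values and variances below are hypothetical:

```python
def fuse(estimates):
    """Inverse-variance weighted fusion of the same field variable
    reported by several sources, e.g. probe, satellite, and model.
    Each estimate is a (value, variance) pair."""
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, estimates)) / total
    return value, 1.0 / total  # fused value and its (smaller) variance

# Soil moisture (m3/m3) from three sources with differing uncertainty.
sources = [(0.21, 0.0004), (0.25, 0.0025), (0.18, 0.0100)]
value, var = fuse(sources)
print(f"fused: {value:.3f} +/- {var ** 0.5:.3f}")
```

The fused variance is always smaller than that of the best single source, which is the statistical payoff of combining information layers rather than picking one sensor.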
What we have discussed here are knowledge management and
intelligent systems. But, with all this “high-tech”, are we able to
concentrate clearly on our target? The crux of all the technical
developments is creating more efficient crop production.
Automation and networking should serve the systemic control of
the agronomic processes, not vice versa. This is the environment
in which the Leibniz Research Alliance’s “Sustainable Food
Production and Healthy Nutrition” innovation initiative “Food and
Agriculture 4.0” focuses on the agricultural production process,
intelligently connected, of course. The aim of the initiative is the
interdisciplinary development of process technology basics for
Agriculture 4.0, where knowledge-based decision making shall
ensure the satisfaction of social demands as well as individual
producers’ and consumers’ requirements, in terms of yields and
profits, while still taking into account local, spatial, and
environmental heterogeneities as well as global climate phenomena.
For this purpose, the research goal is to develop, on the one hand,
models of the agricultural production processes, adjusted to meet the
specific conditions, and, on the other hand, automation technologies
with which the processes can be controlled such that natural
resources are retained or even improved while product quality is
maintained. The interoperability and
digital networking of agriculture will enable new process control
systems and new sales models such as online slurry sales points,
exchange platforms where data is traded for advice or online direct
marketing. However, even with Agriculture 4.0, only what is
sown can be driven home from the field. For instance, the
weather risk will not be any less, although the harvesting
window might be better positioned by putting information
technology to use. We finish with the summary that even
Agriculture 4.0 will show only modest results if we do not take
care that some of the value added through the new technologies is
actually being associated with the agricultural products.
5 Conclusions
Research efforts for development of agricultural robots that
can effectively perform tedious field tasks have grown significantly
in the past decade. With the exception of milking robots that were
invented in the Netherlands, robotics has not reached a commercial
scale for agricultural applications. With the decreasing
workforce and increasing production costs, research areas on
robotic weeding and harvesting have received more and more
attention in recent years; however, even the fastest available
prototype robots for weeding and harvesting are not close to
competing with a human operator. For the case of
picking valuable fruits using robots, the technology is now
approaching a commercial product with the emergence of the
SWEEPER. For other fruits, such as citrus and apples, that can be
mass harvested for juice industry, modifications of the existing
mechanical harvesting systems with some robot functionalities may
be more promising than using a single-robot system. Increasing the
speed and accuracy of robots for farming applications is among the
main issues to be addressed for the generalization of robotic
systems; however, compared with industrial and military cases, the
lack of abundant research funding and budgets in agriculture has
decelerated this process. For the case of robot harvesting,
improving sensing (fruit detection), acting (manipulator movement,
fruit attachment, detachment, and collection), and the growing
system (leaf pruning and plant reshaping) are suggested to increase
efficiency. It should be noted that development of an affordable
and effective agriculture robot requires a multidisciplinary
collaboration in several areas such as horticultural engineering,
computer science, mechatronics, dynamic control, deep learning
and intelligent systems, sensors and instrumentation, software
design, system integration, and crop management. We
highlighted some of the challenges faced in the context of utilizing
sensors and robotics for precision agriculture and digital farming:
object identification, task planning algorithms, digitalization, and
optimization of sensors. It was also mentioned that for an
autonomous framework to successfully execute farming tasks,
research focus should be toward developing simple manipulators
and multi-robot systems. This is in fact one of the academic
trends and research focuses in agricultural robotics: building a
swarm of small-scale robots and drones that collaborate to
optimize farming inputs and reveal denied or concealed
information. In conclusion, some forms of human-robot
collaboration as well as modification of the crop breeding and
planting systems in fields and greenhouses might be necessary to
solve the challenges of agricultural robotics that cannot yet be
automated. For example, in a collaborative harvesting system
using humans and robots, any fruit missed by the robot’s vision
system can be spotted by the human on a touchscreen interface.
Alternatively, the entire robot sensing and acting sequence can be
performed by a human operator in a virtual environment.
Nevertheless, an agricultural robot must be economically viable,
which means it must sense fast, calculate fast, and act fast to
respond to the variability of the environment.
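The collaborative scheme described above can be sketched as a simple merge of robot detections with human-flagged positions from a touchscreen; the coordinates and matching tolerance are hypothetical:

```python
def merge_targets(robot_detections, human_flags, tol=0.05):
    """Combine robot vision detections with fruits flagged by a human
    on a touchscreen; a human flag within `tol` (m) of an existing
    detection is treated as the same fruit and not duplicated."""
    targets = list(robot_detections)
    for hx, hy in human_flags:
        if all((hx - x) ** 2 + (hy - y) ** 2 > tol ** 2 for x, y in targets):
            targets.append((hx, hy))  # a fruit the robot vision missed
    return targets

robot = [(0.10, 0.40), (0.55, 0.32)]
human = [(0.11, 0.41), (0.80, 0.25)]  # first duplicates a detection, second is new
print(merge_targets(robot, human))
```

The merged list then feeds the manipulator's task queue, so the human supplements rather than replaces the vision system, one plausible division of labor in the human-robot collaboration discussed here.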
Acknowledgements
The first author would like to express his appreciation to
Professor Salah Sukkarieh at the University of Sydney, Professor
Cornelia Weltzien and Professor Manuela Zude at the Leibniz
Institute for Agricultural Engineering and Bioeconomy, and Dr.
Jochen Hemming at the Wageningen UR for their insightful
meetings, lab demonstrations, and group discussions during his
visits. We also extend our deep appreciation to Dr. Wang
Yingkuan of the Chinese Academy of Agricultural Engineering and
his professional team at the International Journal of Agricultural
and Biological Engineering for reviewing the manuscript draft and
the editorial work. The consultation support and assistance on
the economic and viability assessment of agricultural robotics
provided by Dr. Mary Sh, Ms. Mojgan, and Dr. Fatima Kalantari at
AdaptiveAgroTech are duly acknowledged.
Disclaimer
Mention of commercial products, services, trade or brand
names, organizations, or research studies in this publication does
not imply endorsement by the authors, nor discrimination against
similar products, services, trade or brand names, organizations, or
research studies not mentioned.
[References]
[1] King A. Technology: The Future of Agriculture. Nature, 2017; 544:
S21.
[2] Wolfert S, Ge L, Verdouw C, Bogaardt M J. Big data in smart farming –
A review. Agric. Syst., 2017; 153: 69–80.
[3] Chlingaryan A, Sukkarieh S, Whelan B. Machine learning approaches
for crop yield prediction and nitrogen status estimation in precision
agriculture: A review. Comput. Electron. Agric, 2018; 151: 61–69.
[4] Bechar A & Vigneault C. Agricultural robots for field operations:
Concepts and components. Biosystems Engineering, 2016; 149: 94-111.
[5] Oberti R, Marchi M, Tirelli P, Calcante A, Iriti M, Tona E, et al.
Selective spraying of grapevines for disease control using a modular
agricultural robot. Biosyst. Eng, 2016; 146: 203–215.
[6] Bloch V, Degani A, Bechar A, A methodology of orchard architecture
design for an optimal harvesting robot. Biosyst. Eng. 2018; 166:
126–137.
[7] Shamshiri R R, Hameed I A, Karkee M, Weltzien C. Robotic harvesting
of fruiting vegetables: A simulation approach in V-REP, ROS and
MATLAB. Proceedings in Automation in Agriculture-Securing Food
Supplies for Future Generations, 2018, InTech.
[8] Eizicovits D, van Tuijl B, Berman S, Edan Y. Integration of perception
capabilities in gripper design using graspability maps. Biosyst. Eng.,
2016; 146: 98–113.
[9] Longo D, Muscato G. Design and simulation of two robotic systems for
automatic artichoke harvesting. Robotics, 2013; 2(4): 217–230.
[10] Barth R, Hemming J, van Henten E J. Design of an eye-in-hand sensing
and servo control framework for harvesting robotics in dense vegetation.
Biosyst. Eng. 2016; 146: 71–84.
[11] Gonzalez-de-Soto M, Emmi L, Perez-Ruiz M, Aguera J,
Gonzalez-de-Santos P. Autonomous systems for precise spraying –
Evaluation of a robotised patch sprayer. Biosyst. Eng., 2016; 146:
165–182.
[12] Adamides G, Katsanos C, Parmet Y, Christou G, Xenos M, Hadzilacos T,
et al. HRI usability evaluation of interaction modes for a teleoperated
agricultural robotic sprayer. Appl. Ergon. 2017; 62: 237–246.
[13] Ishigure Y, Hirai K, Kawasaki H. A pruning robot with a power-saving
chainsaw drive. in Mechatronics and Automation (ICMA), 2013 IEEE
International Conference on, 2013; pp.1223–1228.
[14] Kawasaki H, Murakami S, Kachi H, Ueki S. Novel climbing method of
pruning robot. in SICE Annual Conference, 2008; pp.160–163.
[15] Drach U, Halachmi I, Pnini T, Izhaki I. Degani A. Automatic herding
reduces labour and increases milking frequency in robotic milking.
Biosyst. Eng., 2017; 155: 134–141.
[16] Bach A and Cabrera V. Robotic milking: Feeding strategies and
economic returns. J. Dairy Sci., 2017; 100(9): 7720–7728.
[17] Zhang C, Gao H, Zhou J, Cousins A, Pumphrey M. O, Sankaran S. 3D
robotic system development for high-throughput crop phenotyping.
IFAC-PapersOnLine, 2016; 49(16): 242–247.
[18] Ruckelshausen A, Biber P, Dorna M, Gremmes H, Klose R, Linz A, et al.
BoniRob–an autonomous field robot platform for individual plant
phenotyping. Precis. Agric. 2009; 9(841): 1.
[19] Comba L, Gay P, Ricauda Aimonino D. Robot ensembles for grafting
herbaceous crops. Biosyst. Eng., 2016; 146: 227–239.
[20] De Baerdemaeker J, Munack A, Ramon H, Speckmann H. Mechatronic
systems, communication, and control in precision agriculture. IEEE
Control Syst, 2001; 21(5): 48–70.
[21] Shamshiri R R, Kalantari F, Ting K C, Thorp K R, Hameed I A, Weltzien
C, et al. Advances in greenhouse automation and controlled environment
agriculture: A transition to plant factories and urban agriculture. Int J
Agric & Biol Eng, 2018; 11(1): 1–22.
[22] Sammons P J, Furukawa T, Bulgin A. Autonomous pesticide spraying
robot for use in a greenhouse. Proceedings in Australian Conference on
Robotics and Automation, 2005; pp.1–9.
[23] Van Henten E J, Van Tuijl B A J, Hoogakker G J, Van Der Weerd M J,
Hemming J, Kornet J G, Bontsema, J. An autonomous robot for
de-leafing cucumber plants grown in a high-wire cultivation system.
Biosyst. Eng., 2006; 94 (3): 317–323.
[24] Billingsley J, Visala A, Dunn M. Robotics in agriculture and forestry. in
Springer handbook of robotics, Springer, 2008; 1065–1077.
[25] Hemming J, Bac W, van Tuijl B, Barth R, Bontsema J, Pekkeriet E, van
Henten E. A robot for harvesting sweet-pepper in greenhouses. Proc.
Int. Conf. Agric. Eng. 2014; 6–10.
[26] Hemming J, Bontsema J, Bac W, Edan Y, van Tuijl B, Barth R, Pekkeriet
E. Final report: Sweet-pepper harvesting robot. Report to the European
Commission in the 7th Framework Programme. 2014.
[27] Bac C W, Hemming J, Van Henten E J. Robust pixel-based classification
of obstacles for robotic harvesting of sweet-pepper. Comput. Electron.
Agric., 2013; 96: 148–162.
[28] Lehnert C, English A, McCool C, Tow A W, Perez T. Autonomous
sweet pepper harvesting for protected cropping systems. IEEE Robot.
Autom. Lett., 2017; 2(2): 872–879.
[29] Shamshiri R, Ishak W, Ismail W. Nonlinear tracking control of a two
link oil palm harvesting robot manipulator. Int J Agric & Biol Eng, 2012;
5(2): 1–11.
[30] Ishak W, Ismail W. Research and Development of Oil Palm Harvester
Robot at Universiti Putra Malaysia. Int J Eng Technol, 2010; 7(2):
87–94.
[31] Jayaselan H A J, Ismail W I W. Kinematics analysis for five DOF Fresh
Fruit Bunch harvester. Int J Agric & Biol Eng, 2010; 3(3): 1–7.
[32] Stein M, Bargoti S, Underwood J. Image based mango fruit detection,
localisation and yield estimation using multiple view geometry. Sensors,
2016; 16(11): 1915.
[33] Van Henten E J, Hemming J, Van Tuijl B A J, Kornet J G, Meuleman J,
Bontsema J, Van Os E A. An autonomous robot for harvesting cucumbers
in greenhouses. Autonomous Robots, 2002; 13(3): 241-258.
[34] Tang X, Zhang T, Liu L, Xiao D, Chen Y. A new robot system for
harvesting cucumber. Proceedings in American Society of Agricultural
and Biological Engineers Annual International Meeting, 2009;
pp.3873–3885.
[35] van Henten E J, van Tuijl B A J, Hemming J, Kornet J G, Bontsema J, Van
Os E A. Field test of an autonomous cucumber picking robot. Biosyst.
Eng., 2003; 86(3): 305–313.
[36] Van Henten E J, Hemming J, Van Tuijl B A J, Kornet J G, Bontsema J.
Collision-free motion planning for a cucumber picking robot. Biosyst.
Eng., 2003; 86(2): 135–144.
[37] van Henten E J, Schenk E J, van Willigenburg L G, Meuleman J, Barreiro
P. Collision-free inverse kinematics of the redundant seven-link
manipulator used in a cucumber picking robot. Biosyst. Eng., 2010;
106(2): 112–124.
[38] Van Henten E J, Van’t Slot D A, Hol C W J, Van Willigenburg L G.
Optimal manipulator design for a cucumber harvesting robot. Comput.
Electron. Agric., 2009; 65(2): 247–257.
[39] Underwood J P, Hung C, Whelan B, Sukkarieh S. Mapping almond
orchard canopy volume, flowers, fruit and yield using LiDAR and vision
sensors. Comput. Electron. Agric., 2016; 130: 83–96.
[40] Underwood J P, Jagbrant G, Nieto J I, Sukkarieh S. Lidar-based tree
recognition and platform localization in orchards. J. F. Robot., 2015;
32(8): 1056–1074.
[41] Thanh N T, Vandevoorde K, Wouters N, Kayacan E, de Baerdemaeker J G,
Saeys W. Detection of red and bicoloured apples on tree with an RGB-D
camera. Biosyst. Eng., 2016; 146: 33–44.
[42] Bargoti S, Underwood J P. Image segmentation for fruit detection and
yield estimation in apple orchards. J. F. Robot, 2017; 34(6): 1039–1060.
[43] Jia W, Zheng Y, Zhao D, Yin X, Liu X, Du R. Preprocessing method of
night vision image application in apple harvesting robot. Int J Agric &
Biol Eng, 2018; 11(2): 158–163.
[44] Han K S, Kim S C, Lee Y B, Kim S C, Im D H, Choi H K, et al.
Strawberry harvesting robot for bench-type cultivation. Biosyst. Eng.
2012; 37(1): 65–74.
[45] Feng Q, Wang X, Zheng W, Qiu Q, Jiang K. New strawberry harvesting
robot for elevated-trough culture. Int J Agric & Biol Eng, 2012; 5(2):
1–8.
[46] Hayashi S, Shigematsu K, Yamamoto S, Kobayashi K, Kohno Y, Kamata
J, et al. Evaluation of a strawberry-harvesting robot in a field test.
Biosyst. Eng., 2010; 105(2): 160–171.
[47] Tanigaki K, Fujiura T, Akase A, Imagawa J. Cherry-harvesting robot.
Comput. Electron. Agric., 2008; 63(1): 65–72.
[48] Mehta S S, Burks T F. Vision-based control of robotic manipulator for
citrus harvesting. Comput. Electron. Agric., 2014; 102: 146–158.
[49] Mehta S S, MacKunis W, Burks T F. Robust visual servo control in the
presence of fruit motion for robotic citrus harvesting. Comput. Electron.
Agric., 2016; 123: 362–375.
[50] Lu Q, Cai J R, Liu B, Lie D, Zhang Y J. Identification of fruit and
branch in natural scenes for citrus harvesting robot using machine vision
and support vector machine. Int. J. Agric. Biol. Eng. 2014; 7(2):
115–121.
[51] Zaidner G, Shapiro A. A novel data fusion algorithm for low-cost
localisation and navigation of autonomous vineyard sprayer robots.
Biosyst. Eng., 2016; 146: 133–148.
[52] Nuske S, Wilshusen K, Achar S, Yoder L, Narasimhan S, Singh S.
Automated visual yield estimation in vineyards. J. F. Robot, 2014; 31(5)
837–860.
[53] Nuske S, Gupta K, Narasimhan S, Singh S. Modeling and calibrating
visual yield estimates in vineyards. In Field and Service Robotics. Springer,
Berlin, Heidelberg, 2014; pp.343–356.
[54] Senthilnath J, Dokania A, Kandukuri M, Ramesh K N, Anand G, Omkar S
N. Detection of tomatoes using spectral-spatial methods in remotely
sensed RGB images captured by UAV. Biosyst. Eng., 2016; 146: 16–32.
[55] Feng Q C, Zou W, Fan P F, Zhang C F, Wang X. Design and test of
robotic harvesting system for cherry tomato. Int J Agric & Biol Eng,
2018; 11(1): 96–100.
[56] Feng Q C, Cheng W, Zhou J J, Wang X. Design of structured-light
vision system for tomato harvesting robot. Int J Agric & Biol Eng, 2014;
7(2): 19–26.
[57] Wang L L, Zhao B, Fan J W, Hu X A, Wei S, Li Y S, et al. Development
of a tomato harvesting robot used in greenhouse. Int J Agric & Biol Eng,
2017; 10(4): 140–149.
[58] Hung C, Underwood J, Nieto J, Sukkarieh S. A feature learning based
approach for automated fruit yield estimation. In Field and Service
Robotics. Springer, Cham, 2015; pp.485–498.
[59] Lopes C M, Graça J, Sastre J, Reyes M, Guzmán R, Braga R, et al.
Vineyard yield estimation by VINBOT robot: preliminary results with the
white variety Viosinho. Proceedings of 11th Int. Terroir Congress. Jones,
G. and Doran, N. (eds.), Southern Oregon University, Ashland, USA. 2016;
pp.458-463.
[60] Lee Y J, Kwon T B, Song J B. SLAM of a mobile robot using
thinning-based topological information. Int. J. Control. Autom. Syst.,
2007; 5(5): 577–583.
[61] Li N, Zhang C L, Chen Z W, Ma Z H, Sun Z, Yuan T, et al. Crop
positioning for robotic intra-row weeding based on machine vision. Int J
Agric & Biol Eng, 2015; 8(6): 20–29.
[62] Ghazali K. H, Razali S, Mustafa M M, Hussain A. Machine Vision
System for Automatic Weeding Strategy in Oil Palm Plantation using
Image Filtering Technique. 2008 3rd Int. Conf. Inf. Commun. Technol.
From Theory to Appl, 2008; pp.1–5.
[63] Chen B, Tojo S, Watanabe K. Machine vision for a micro weeding robot
in a paddy field. Biosyst. Eng., 2003; 85(4): 393–404.
[64] Bakker T, Bontsema J, Müller J. Systematic design of an autonomous
platform for robotic weeding. J. Terramechanics, 2010; 47(2): 63–73.
[65] Hu J, Yan X, Ma J, Qi C, Francis K, Mao H. Dimensional synthesis and
kinematics simulation of a high-speed plug seedling transplanting robot.
Comput. Electron. Agric., 2014; 107: 64–72.
[66] Ryu K H, Kim G, Han J S. AE—Automation and emerging technologies:
development of a robotic transplanter for bedding plants. J. Agric. Eng.
Res., 2001; 78(2): 141–146.
[67] Huang Y J, Lee F F. An automatic machine vision-guided grasping
system for Phalaenopsis tissue culture plantlets. Comput. Electron.
Agric., 2010; 70(1): 42–51.
[68] Yang Q, Chang C, Bao G, Fan J, Xun Y. Recognition and localization
system of the robot for harvesting Hangzhou White Chrysanthemums.
Int J Agric & Biol Eng, 2018; 11(1): 88–95.
[69] Weiss U and Biber P. Plant detection and mapping for agricultural robots
using a 3D LIDAR sensor. Rob. Auton. Syst. 2011; 59(5): 265–273.
[70] Shamshiri R and Wan Ismail W I. Design and Simulation of Control
Systems for a Field Survey Mobile Robot Platform. Res. J. Appl. Sci.
Eng. Technol., 2013; 6(13): 2307–2315.
[71] Cariou C, Lenain R, Thuilot B, Berducat M. Automatic guidance of a
four‐wheel‐steering mobile robot for accurate field operations. J. F.
Robot. 2009; 26(6-7): 504–518.
[72] English A, Ross P, Ball D, Corke P. Vision based guidance for robot
navigation in agriculture. in Robotics and Automation (ICRA), 2014 IEEE
International Conference on, 2014; 1693–1698.
[73] Bak T and Jakobsen H. Agricultural robotic platform with four wheel
steering for weed detection. Biosyst. Eng. 2004; 87(2): 125–136.
[74] Yin X, Noguchi N. Development and evaluation of a general-purpose
electric off-road robot based on agricultural navigation. Int J Agric &
Biol Eng, 2014; 7(5): 14–21.
[75] Qi H X, Banhazi T M, Zhang Z G, Low T, Brookshaw I J. Preliminary
laboratory test on navigation accuracy of an autonomous robot for
measuring air quality in livestock buildings. Int J Agric & Biol Eng,
2016; 9(2): 29–39.
[76] Hellström T, Ringdahl O. A software framework for agricultural and
forestry robots. Industrial Robot: An International Journal, 2013; 40(1):
20–26.
[77] Jensen K, Larsen M, Nielsen S H, Larsen L B, Olsen K S, Jørgensen R N.
Towards an open software platform for field robots in precision agriculture.
Robotics, 2014; 3(2): 207–234.
[78] Rohmer E, Singh S P N, Freese M. V-REP: A versatile and scalable
robot simulation framework. IEEE/RSJ International Conference on
Intelligent Robots and Systems (IROS), 2013; pp.1321–1326.
[79] Quigley M, Conley K, Gerkey B, Faust J, Foote T, Leibs J, et al. ROS:
an open-source Robot Operating System. In ICRA workshop on open
source software, 2009; 3(2): 5.
[80] Lippiello V. Multi-object and multi-camera robotic visual servoing.
PhD thesis, Scuola di Dottorato di ricerca in Ingegneria dell’Informazione,
Università degli studi di Napoli Federico II, 2003.
[81] Mehta S S, Jayaraman V, Burks T F, Dixon W E. Teach by zooming: A
unified approach to visual servo control. Mechatronics, 2012; 22(4):
436–443.
[82] Barth R, Hemming J, van Henten E J. Design of an eye-in-hand sensing
and servo control framework for harvesting robotics in dense vegetation.
Biosyst. Eng., 2016; 146: 71–84.
[83] Zhao D A, Lv J D, Ji W, Zhang Y, Chen Y. Design and control of an
apple harvesting robot. Biosyst. Eng., 2011; 110(2): 112–122.
[84] Li Z, Li P, Yang H, Wang Y. Stability tests of two-finger tomato
grasping for harvesting robots. Biosyst. Eng., 2013; 116(2): 163–170.
[85] Bac C W, Roorda T, Reshef R, Berman S, Hemming J, van Henten E J.
Analysis of a motion planning problem for sweet-pepper harvesting in a
dense obstacle environment. Biosyst. Eng., 2016; 146: 85–97.
[86] Zhao Y, Gong L, Huang Y, Liu C. A review of key techniques of
vision-based control for harvesting robot. Comput. Electron. Agric.,
2016; 127: 311–323.
[87] Choi D, Lee W S, Ehsani R, Roka F M. A Machine Vision System for
Quantification of Citrus Fruit Dropped on the Ground under the Canopy.
Trans. ASABE, 2015; 58(4): 933–946.
[88] Sandini G, Buemi F, Massa M, Zucchini M. Visually guided operations
in green-houses. IEEE International Workshop on Intelligent Robots and
Systems (IROS’90’), 1990; pp.279–285.
[89] Dario P, Sandini G, Allotta B, Bucci A, Buemi F, Massa M, et al. The
Agrobot project for greenhouse automation. International Symposium on
New Cultivation Systems in Greenhouse, 1993; 361: 85–92.
[90] Monta M, Kondo N, Ting K C. End-effectors for tomato harvesting robot.
in Artificial Intelligence for Biology and Agriculture, Springer, 1998;
pp.1–25.
[91] Bechar A, Vigneault C. Agricultural robots for field operations.
Part 2: Operations and systems. Biosyst. Eng., 2016.
[92] Hameed I A, la Cour-Harbo A, Osen O L. Side-to-side 3D coverage path
planning approach for agricultural robots to minimize skip/overlap areas
between swaths. Rob. Auton. Syst., 2016; 76: 36–45.
[93] Hameed I A. Intelligent coverage path planning for agricultural robots
and autonomous machines on three-dimensional terrain. J. Intell. Robot.
Syst., 2014; 74(3-4): 965–983.
[94] Hameed I A, Bochtis D D, Sorensen C G. Driving angle and track
sequence optimization for operational path planning using genetic
algorithms. Appl. Eng. Agric., 2011; 27(6): 1077–1086.
[95] Tillett N D. Automatic guidance sensors for agricultural field machines:
a review. J. Agric. Eng. Res., 1991; 50: 167–187.
[96] Noguchi N, Ishii K, Terao H. Development of an agricultural mobile
robot using a geomagnetic direction sensor and image sensors. J. Agric.
Eng. Res., 1997; 67(1): 1–15.
[97] Noguchi N, Terao H. Path planning of an agricultural mobile robot by
neural network and genetic algorithm. Comput. Electron. Agric., 1997;
18(2-3): 187–204.
[98] Åstrand B and Baerveldt A J. An agricultural mobile robot with
vision-based perception for mechanical weed control. Auton. Robots.
2002; 13(1): 21–35.
[99] Hopkins M. Automating in the 21st century career and technical
education. Greenh. Grow, 2000; pp. 4–12.
[100] Pilarski T, Happold M, Pangels H, Ollis M, Fitzpatrick K, Stentz A. The
demeter system for automated harvesting. Auton. Robots, 2002; 13(1):
9–20.
[101] Sezen B. Modeling automated guided vehicle systems in material
handling. Dogus Oniversiiesi Dergisi, 2003; 4(2): 207-216.
[102] Shamshiri R, Ismail W I W. A review of greenhouse climate control and
automation systems in tropical regions. J. Agric. Sci. Appl., 2013; 2(3):
176–183.
[103] Bac C W, Roorda T, Reshef R, Berman S, Hemming J, van Henten E J.
Analysis of a motion planning problem for sweet-pepper harvesting in a
dense obstacle environment. Biosyst. Eng., 2016; 146: 85–97.
[104] Young S L, Giles D K. Targeted and microdose chemical applications. in
automation: The future of weed control in cropping systems. Springer,
2014; pp.139–147.
[105] Midtiby H S, Mathiassen S K, Andersson K J, Jørgensen R N.
Performance evaluation of a crop/weed discriminating microsprayer.
Comput. Electron. Agric., 2011; 77(1): 35–40.
[106] Sander S. BoniRob: An Autonomous Mobile Platform for Agricultural
Applications, 2015. Available at: http://www.fieldrobot.com/ieeeras/
Downloads/20150923-Sander-Presentation.pdf.
[107] Bawden O, Ball D, Kulk J, Perez T, Russell R. A lightweight, modular
robotic vehicle for the sustainable intensification of agriculture. Australian
Robotics & Automation Association ARAA, 2014.
[108] Ruckelshausen A, Klose R, Linz A, Marquering J, Thiel M, Tölke S.
Autonome Roboter zur Unkrautbekämpfung. Zeitschrift für
Pflanzenkrankheiten und Pflanzenschutz, 2006; pp.173–180.
[109] MacKean R, Jones J L, Francis Jr J T. Weeding robot and method.
Google Patents, 24-Aug-2017.
[110] Jørgensen R N, Sørensen C G, Maagaard J, Havn I, Jensen K, Søgaard H T,
et al. Hortibot: A system design of a robotic tool carrier for high-tech
plant nursing. CIGR Ejournal, Vol. IX, No.1, Manuscript ATOE 07 006,
2007.
[111] Green O, Schmidt T, Pietrzkowski R P, Jensen K, Larsen M, Jørgensen R
N. Commercial autonomous agricultural platform: Kongskilde Robotti.
Second International Conference on Robotics and associated
High-technologies and Equipment for Agriculture and Forestry, 2014;
pp.351–356.
[112] Bogue R. Robots poised to revolutionise agriculture. Ind. Rob., 2016;
43(5): 450–456.
[113] Lamm R. D, Slaughter D C, Giles D K. Precision weed control system
for cotton. Trans. ASAE, 2002; 45(1): 231.
[114] Lee W S, Slaughter D C, Giles D K. Robotic weed control system for
tomatoes. Precis. Agric., 1999; 1(1): 95–113.
[115] Burks T F, Subramanian V, Singh S. Autonomous greenhouse sprayer
vehicle using machine vision and ladar for steering control. Proceedings
of the 2004 Conference on Automation Technology for Off-Road
Equipment, 2004; 79p.
[116] Subramanian V. Autonomous vehicle guidance using machine vision and
laser radar for agricultural applications. Doctoral dissertation. University
of Florida, 2005.
[117] Ollero A, Mandow A, Muñoz V F, De Gabriel J G. Control architecture
for mobile robot operation and navigation. Robot. Comput. Integr.
Manuf., 1994; 11(4): 259–269.
[118] Mandow A, Gomez-de-Gabriel J M, Martinez J L, Munoz V F, Ollero A,
Garcia-Cerezo A. The autonomous mobile robot AURORA for
greenhouse operation. IEEE Robot. Autom. Mag., 1996; 3(4): 18–28.
[119] Martínez J L, Mandow A, Morales J, Pedraza S, García-Cerezo A.
Approximating kinematics for tracked mobile robots. Int. J. Rob. Res.,
2005; 24(10): 867–878.
[120] Kiani S, Jafari A. Crop detection and positioning in the field using
discriminant analysis and neural networks based on shape features. J. Agr.
Sci. Tech, 2012; 14: 755-765.
[121] Perez A J, Lopez F, Benlloch J V, Christensen S. Colour and shape
analysis techniques for weed detection in cereal fields. Comput. Electron.
Agric., 2000; 25(3): 197–212.
[122] Cho S, Lee D S, Jeong J Y. AE—automation and emerging technologies:
Weed–plant discrimination by machine vision and artificial neural network.
Biosyst. Eng., 2002; 83(3): 275–280.
[123] Jafari A, Mohtasebi S S, Jahromi H E, Omid M. Weed detection in sugar
beet fields using machine vision. Int. J. Agric. Biol, 2006; 8(5): 602–605.
[124] Huang M, He Y. Crop and weed image recognition by morphological
operations and ANN model. Proceedings of IEEE Conference on
Instrumentation and Measurement Technology (IMTC 2007), 2007;
pp.1–4.
[125] Meyer G E, Mehta T, Kocher M F, Mortensen D A, Samal A. Textural
imaging and discriminant analysis for distinguishing weeds for spot
spraying. Trans. ASAE, 1998; 41(4): 1189–1197.
[126] Polder G, van Evert F K, Lamaker A, de Jong A, Van der Heijden G, Lotz
L A P, et al. Weed detection using textural image analysis. EFITA/
WCCA Conference, 2007.
[127] Zhu B, Jiang L, Luo Y, Tao Y. Gabor feature-based apple quality
inspection using kernel principal component analysis. J. Food Eng., 2007;
81(4): 741–749.
[128] Langenakens J, Vergauwe G, De Moor A. Comparing hand held spray
guns and spray booms in lettuce crops in a greenhouse. Asp. Appl. Biol.,
2002; 66: 123–128.
[129] Nuyttens D, Windey S, Sonck B. Comparison of operator exposure for
five different greenhouse spraying applications. J. Agric. Saf. Health,
2004; 10(3): 187.
[130] Sánchez-Hermosilla J, Rodríguez F, González R, Guzmán J L, Berenguel
M. A mechatronic description of an autonomous mobile robot for
agricultural tasks in greenhouses. In Mobile Robots Navigation, InTech,
2010; pp.583–608.
[131] González R, Rodríguez F, Sánchez-Hermosilla J, Donaire J G. Navigation
techniques for mobile robots in greenhouses. Appl. Eng. Agric., 2009;
25(2): 153–165.
[132] Borenstein J, Everett H R, Feng L. Navigating mobile robots: Systems
and techniques. AK Peters, Ltd., 1996.
[133] Belforte G, Deboli R, Gay P, Piccarolo P, Aimonino D R. Robot design
and testing for greenhouse applications. Biosyst. Eng., 2006; 95(3):
309–321.
[134] Burks T F, Shearer S A, Heath J R, Donohue K D. Evaluation of
neural-network classifiers for weed species discrimination. Biosyst. Eng.,
2005; 91(3): 293–304.
[135] Granitto P M, Verdes P F, Ceccatto H A. Large-scale investigation of
weed seed identification by machine vision. Comput. Electron. Agric.,
2005; 47(1): 15–24.
[136] Bogue R. Sensors key to advances in precision agriculture. Sens. Rev.,
2017; 37(1): 1–6.
[137] Mulla D J. Twenty five years of remote sensing in precision agriculture:
Key advances and remaining knowledge gaps. Biosyst. Eng., 2013;
114(4): 358–371.
[138] Shamshiri R, Ishak W, Ismail W. Design and Simulation of Control
Systems for a Field Survey Mobile Robot Platform. Res. J. Appl. Sci.
Eng. Technol., 2013; 6(13): 2307–2315.
[139] Das J, Cross G, Qu C, Makineni A, Tokekar P, Mulgaonkar Y, Kumar V.
Devices, systems, and methods for automated monitoring enabling
precision agriculture. IEEE International Conference on Automation
Science and Engineering (CASE), 2015; pp.462–469.
[140] Guzman R, Navarro R, Beneto M, Carbonell D. Robotnik: Professional
service robotics applications with ROS. In Robot Operating System (ROS),
Springer, 2016; pp.253–288.
[141] Pierzchała M, Giguère P, Astrup R. Mapping forests using an unmanned
ground vehicle with 3D LiDAR and graph-SLAM. Comput. Electron.
Agric., 2018; 145: 217–225.
[142] Strisciuglio N, Tylecek R, Petkov N, Bieber P, Hemming J, van Henten E,
et al. TrimBot2020: an outdoor robot for automatic gardening. arXiv
Prepr. arXiv1804.01792, 2018.
[143] Diago M P, Tardaguila J. A new robot for vineyard monitoring.
Wine Vitic. J., 2015; 30(3): 38.
[144] Underwood J P, Calleija M, Taylor Z, Hung C, Nieto J, Fitch R, et al.
Real-time target detection and steerable spray for vegetable crops. In
Proceedings of the International Conference on Robotics and Automation:
Robotics in Agriculture Workshop, Seattle, WA, USA, 2015; pp.26–30.
[145] Bergerman M, Billingsley J, Reid J, van Henten E. Robotics in
agriculture and forestry. In Springer Handbook of Robotics, Springer, 2016;
pp.1463–1492.
[146] Blender T, Buchner T, Fernandez B, Pichlmaier B, Schlegel C.
Managing a mobile agricultural robot swarm for a seeding task. In
Industrial Electronics Society, IECON 2016 - 42nd Annual Conference of
the IEEE, 2016; pp.6879–6886.
[147] Blender T, Schlegel C. Motion control for omni-drive service robots
under kinematic, dynamic and shape constraints. IEEE 20th
Conference on Emerging Technologies & Factory Automation (ETFA),
2015; pp.1–8.
[148] Guzmán R, Ariño J, Navarro R, Lopes C M, Graça J, Reyes M, et al.
Autonomous hybrid GPS/reactive navigation of an unmanned ground
vehicle for precision viticulture - VINBOT. In 62nd German Winegrowers
Conference, Stuttgart, 2016.
[149] Roure F, Moreno G, Soler M, Faconti D, Serrano D, Astolfi P, et al.
GRAPE: Ground Robot for vineyArd Monitoring and ProtEction. Iberian
Robotics Conference, 2017; pp.249–260.
[150] Młotek M, Kuta Ł, Stopa R, Komarnicki P. The effect of manual
harvesting of fruit on the health of workers and the quality of the obtained
produce, 2015; 3: 1712–1719.
[151] Vallad GE, Smith HA, Dittmar PJ, Freeman JH. Vegetable production
handbook of Florida. Gainesville, FL, USA: University of Florida, IFAS
Extension; 2017. Available at:
http://edis.ifas.ufl.edu/pdffiles/CV/CV29200.pdf [Last access: July 10th,
2018].
[152] Li P, Lee S H, Hsu H Y. Review on fruit harvesting method for potential
use of automatic fruit harvesting systems. Procedia Eng., 2011; 23:
351–366.
[153] Qureshi W S, Payne A, Walsh K B, Linker R, Cohen O, Dailey M N.
Machine vision for counting fruit on mango tree canopies. Precision
Agriculture, 2017; 18(2): 224–244.
[154] Weltzien C, Harms H H, Diekhans N. KFZ Radarsensor zur
Objekterkennung im landwirtschaftlichen Umfeld [Automotive radar sensor
for object detection in agricultural environments]. LANDTECHNIK–
Agricultural Eng., 2006; 61(5): 250–251.
[155] Bulanon D M, Burks T F, Alchanatis V. Study on temporal variation in
citrus canopy using thermal imaging for citrus fruit detection. Biosyst.
Eng., 2008; 101(2): 161–171.
[156] Okamoto H, Lee W S. Green citrus detection using hyperspectral
imaging. Comput. Electron. Agric., 2009; 66(2): 201–208.
[157] Westling F, Underwood J, Örn S. Light interception modelling using
unstructured LiDAR data in avocado orchards. arXiv Prepr.
arXiv1806.01118, 2018.
[158] Bulanon D M, Burks T F, Alchanatis V. Image fusion of visible and
thermal images for fruit detection. Biosyst. Eng., 2009; 103(1): 12–22.
[159] Hemming J, Ruizendaal J, Hofstee J W, van Henten E J. Fruit
detectability analysis for different camera positions in sweet-pepper.
Sensors (Basel), 2014; 14(4): 6032–6044.
[160] Song Y, Glasbey C A, Horgan G W, Polder G, Dieleman J A, van der
Heijden G W A M. Automatic fruit recognition and counting from
multiple images. Biosyst. Eng., 2014; 118(C): 203–215.
[161] Tao Y, Zhou J. Automatic apple recognition based on the fusion of color
and 3D feature for robotic fruit picking. Comput. Electron. Agric., 2017;
142(A): 388–396.
[162] Bao G, Cai S, Qi L, Xun Y, Zhang L, Yang Q. Multi-template matching
algorithm for cucumber recognition in natural environment. Comput.
Electron. Agric., 2016; 127(C): 754–762.
[163] Barnea E, Mairon R, Ben-Shahar O. Colour-agnostic shape-based 3D
fruit detection for crop harvesting robots. Biosyst. Eng., 2016; 146:
57–70.
[164] Vitzrabin E, Edan Y. Adaptive thresholding with fusion using a RGBD
sensor for red sweet-pepper detection. Biosyst. Eng., 2016; 146: 45–56.
[165] Gongal A, Amatya S, Karkee M, Zhang Q, Lewis K. Sensors and
systems for fruit detection and localization: A review. Comput. Electron.
Agric., 2015; 116: 8–19.
[166] Wang Q, Nuske S, Bergerman M, Singh S. Automated crop yield
estimation for apple orchards. In Proc. International Symposium on
Experimental Robotics, Quebec, June 2012; pp.745–758.
[167] Sa I, Ge Z, Dayoub F, Upcroft B, Perez T, McCool C. Deepfruits: A fruit
detection system using deep neural networks. Sensors, 2016; 16(8):
1222.
[168] Amatya S, Karkee M, Gongal A, Zhang Q, Whiting M D. Detection of
cherry tree branches with full foliage in planar architecture for automated
sweet-cherry harvesting. Biosyst. Eng., 2015; 146: 3–15.
[169] Bac C W, van Henten E J, Hemming J, Edan Y. Harvesting robots for
high-value crops: State-of-the-art review and challenges ahead.
J. Field Robot., 2014; 31(6): 888–911.
[170] Murphy R. Introduction to AI robotics. MIT Press, 2000.
[171] Bontsema J, Hemming J, Pekkeriet E, Saeys W, Edan Y, Shapiro A, et al.
CROPS: Clever robots for crops. Eng. Technol. Ref., 2015; 1(1): 1–11.
[172] De-An Z, Jidong L, Wei J, Ying Z, Yu C. Design and control of an apple
harvesting robot. Biosyst. Eng., 2011; 110(2): 112–122.
[173] Davidson J R, Hohimer C J, Mo C, Karkee M. Dual Robot Coordination
for Apple Harvesting. ASABE Annual International Meeting, 2017; p.1.
[174] Davidson J R, Silwal A, Hohimer C J, Karkee M, Mo C, Zhang Q.
Proof-of-concept of a robotic apple harvester. IEEE/RSJ International
Conference on Intelligent Robots and Systems (IROS), 2016; pp.634–639.
[175] He L, Fu H, Karkee M, Zhang Q. Effect of fruit location on apple
detachment with mechanical shaking. Biosyst. Eng., 2017; 157: 63–71.
[176] Bac C W. Improving obstacle awareness for robotic harvesting of
sweet-pepper. Doctoral dissertation. Wageningen University, 2015.
[177] Bao G J, Yao P F, Xu Z G, Li K, Wang Z H, Zhang L B, et al.
Pneumatic bio-soft robot module: Structure, elongation and experiment.
Int J Agric & Biol Eng, 2017; 10(2): 114–122.
[178] Nguyen T T, Kayacan E, De Baerdemaeker J, Saeys W. Task and motion
planning for apple harvesting robot. IFAC Proc., 2013; 46(18): 247–252.
[179] Osroosh Y, Khot L R, Peters R T. Economical thermal-RGB imaging
system for monitoring agricultural crops. Comput. Electron. Agric., 2018;
147: 34–43.
[180] Tzounis A, Katsoulas N, Bartzanas T, Kittas C. Internet of things in
agriculture, recent advances and future challenges. Biosyst. Eng., 2017;
164: 31–48.
[181] Kamilaris A, Kartakoullis A, Prenafeta-Boldú F X. A review on the
practice of big data analysis in agriculture. Comput. Electron. Agric.,
2017; 143: 23–37.
[182] Hornero G, Gaitán-Pitre J E, Serrano-Finetti E, Casas O, Pallas-Areny R.
A novel low-cost smart leaf wetness sensor. Comput. Electron. Agric.,
2017; 143: 286–292.
[183] Sarangi S, Umadikar J, Kar S. Automation of Agriculture Support
Systems using Wisekar: Case study of a crop-disease advisory service.
Comput. Electron. Agric., 2016; 122: 200–210.
[184] Ojha T, Misra S, Raghuwanshi N S. Wireless sensor networks for
agriculture: The state-of-the-art in practice and future challenges.
Comput. Electron. Agric., 2015; 118: 66–84.
[185] Nam W H, Kim T, Hong E M, Choi J Y, Kim J T. A Wireless Sensor
Network (WSN) application for irrigation facilities management based on
Information and Communication Technologies (ICTs). Comput. Electron.
Agric., 2017; 143: 185–192.
[186] Zhang Y, Li Y, He Y, Liu F, Cen H, Fang H. Near ground platform
development to simulate UAV aerial spraying and its spraying test under
different conditions. Comput. Electron. Agric., 2018; 148: 8–18.
[187] Senthilnath J, Kandukuri M, Dokania A, Ramesh K N. Application of
UAV imaging platform for vegetation analysis based on spectral-spatial
methods. Comput. Electron. Agric., 2017; 140: 8–24.
[188] Koc-San D, Selim S, Aslan N, San B T. Automatic citrus tree extraction
from UAV images and digital surface models using circular Hough
transform. Comput. Electron. Agric., 2018; 150: 289–301.
[189] Xia J, Yang Y, Cao H, Han C, Ge D, Zhang W. Visible-near infrared
spectrum-based classification of apple chilling injury on cloud computing
platform. Comput. Electron. Agric., 2018; 145: 27–34.
[190] Satit K, Somjit A. RESTful economic-ADS model for cost-effective
chain-wide traceability system-based cloud computing. Comput.
Electron. Agric., 2017; 139: 164–179.
[191] Ojha T, Misra S, Raghuwanshi N S. Sensing-cloud: Leveraging the
benefits for agricultural applications. Comput. Electron. Agric., 2017;
135: 96–107.
[192] Gu Z, Qi Z, Ma L, Gui D, Xu J, Fang Q, Yuan S, Feng G. Development
of an irrigation scheduling software based on model predicted crop water
stress. Comput. Electron. Agric., 2017; 143: 208–221.
[193] Burguete J, Lacasta A, García-Navarro P. SURCOS: A software tool to
simulate irrigation and fertigation in isolated furrows and furrow networks.
Comput. Electron. Agric., 2014; 103: 91–103.
[194] Cappelli G, Bregaglio S, Romani M, Feccia S, Confalonieri R. A
software component implementing a library of models for the simulation
of pre-harvest rice grain quality. Comput. Electron. Agric., 2014; 104:
18–24.
[195] Duarte L, Teodoro A C, Monteiro A T, Cunha M, Gonçalves H.
QPhenoMetrics: An open source software application to assess vegetation
phenology metrics. Comput. Electron. Agric., 2018; 148: 82–94.
[196] Gimenez J, Herrera D, Tosetti S, Carelli R. Optimization
methodology to fruit grove mapping in precision agriculture. Comput.
Electron. Agric., 2015; 116: 88–100.
[197] Chen D, Shi Y, Huang W, Zhang J, Wu K. Mapping wheat rust based on
high spatial resolution satellite imagery. Comput. Electron. Agric., 2018;
152: 109–116.
[198] Pomar J, López V, Pomar C. Agent-based simulation framework for
virtual prototyping of advanced livestock precision feeding systems.
Comput. Electron. Agric., 2011; 78(1): 88–97.
[199] Kaloxylos A, Groumas A, Sarris V, Katsikas L, Magdalinos P, Antoniou E,
Politopoulou Z, Wolfert S, Brewster C, Eigenmann R, Maestre Terol C.
A cloud-based Farm Management System: Architecture and
implementation. Comput. Electron. Agric., 2014; 100: 168–179.
[200] Luvisi A, Pagano M, Bandinelli R, Rinaldelli E, Gini B, Scartòn M,
Manzoni G, Triolo E. Virtual vineyard for grapevine management
purposes: A RFID/GPS application. Comput. Electron. Agric., 2011;
75(2): 368–371.
[201] Stojanovic V, Falconer R E, Isaacs J, Blackwood D, Gilmour D,
Kiezebrink D, Wilson J. Streaming and 3D mapping of AGRI-data on
mobile devices. Comput. Electron. Agric., 2017; 138: 188–199.
[202] Cunha C R, Peres E, Morais R, Oliveira A A, Matos S G, Fernandes M A,
Ferreira P J S G, Reis M J C S. The use of mobile devices with multi-tag
technologies for an overall contextualized vineyard management.
Comput. Electron. Agric., 2010; 73(2): 154–164.
[203] Picon A, Alvarez-Gila A, Seitz M, Ortiz-Barredo A, Echazarra J, Johannes
A. Deep convolutional neural networks for mobile capture device-based
crop disease classification in the wild. Comput. Electron. Agric., 2018.
doi: 10.1016/j.compag.2018.04.002
[204] Molina-Martínez J M, Jiménez M, Ruiz-Canales A, Fernández-Pacheco D
G. RaGPS: A software application for determining extraterrestrial
radiation in mobile devices with GPS. Comput. Electron. Agric., 2011;
78(1): 116–121.
[205] Roldán J J, Garcia-Aunon P, Garzón M, de León J, del Cerro J, Barrientos
A. Heterogeneous multi-robot system for mapping environmental
variables of greenhouses. Sensors (Basel), 2016; 16(7): 1018. doi:
10.3390/s16071018.