FARM-SIS (Field Autonomous Rover Machinery-Smart Irrigation System)
India is an agriculture-intensive country. Agriculture, in its various forms, is the primary source of livelihood for about 58% of India's population, and more than 70% of the rural population depends on it, directly or indirectly, for income. Agriculture therefore contributes significantly to the country's GDP (about 17–18%).
In a country like ours, which is self-sufficient in most food produce, this scale brings several problems of its own. The major one remains the improper use of insecticides/pesticides and the improper watering of crops by farmers. Crops worth about Rs. 50,000 crore are lost each year in India due to low or improper application of pesticides, while using them in excessive quantities has caused serious problems for both farmers and consumers, such as chronic damage to the kidneys, liver, reproductive organs and lungs from intake of the chemicals present in insecticides/pesticides. It is also well known that in water-scarce areas, farmers are often unable to plan the watering of crops properly and end up with a poor harvest.
Such problems need not only better solutions than the existing ones, but automated ones that make use of currently available technologies in both the hardware and software domains of Computer Science. The solution I am proposing for this is FARM-SIS (Field Autonomous Rover Machinery-Smart Irrigation System), presented in detail in this document. The project helps farmers not only apply insecticides/pesticides to their plants efficiently and in the requisite amounts, but also distribute water to their crops in just the right quantities.
The primary aim is to create an automated system that ensures the injection of pesticides/insecticides in requisite amounts where they are really needed, together with an automated system that delivers water right near the roots of the plants using the required sensors and algorithms, all equipped with a notification system that notifies the farmer via text message of any automated activity or error. The entire system consists of two modules. The first is FARM, an independent, autonomous, four-wheeled bot that traverses between the rows of crops in a field. This module was developed and tested successfully by me and my team (Naman Bansal, Swapnil Panwala and Bharat Ahuja). It starts from one end of the field, traverses the entire field row by row, and lets the farmer know once the whole field has been covered. The bot is equipped with the required hardware, such as cameras and sensors, and with a deep learning algorithm that scans the leaves of the plants in real time through the cameras and detects whether they have been infected by any disease. If so, the requisite amount of medicine is sprayed onto the affected parts. The second module is the Smart Irrigation System (SIS), which I implemented later to extend the capability of the earlier module. Its hardware consists of sensors, separated by a specific distance, spread across the entire field. Each sensor returns a moisture-level value to the algorithm, which autonomously releases water in the areas that need it, at the same time notifying the farmer of all updates.
We will discuss the working of the two major modules of the proposed solution one at a time.
1) FARM
The current version of the project has been tested and works for potato crops, but it can be adapted to other crops as well, provided we train the algorithm on data pertaining to them.
The primary goals we had in mind at the beginning were as follows:
a) Scan the leaves of the crops.
b) Detect infected areas on the leaves with a deep learning algorithm running in the backend.
c) Spray the requisite amount of insecticide/pesticide on the affected areas.
d) Automatically traverse the crop field without damaging any produce.
e) Notify the farmer once the job is done, or in case of any anomaly.
The following images are of our first developed prototype (Fig 3a) and the final product we developed and tested in the field (Fig 3b).
The bot is supposed to be a self-contained, independent product capable of identifying lane patterns in farms and also detecting/rectifying defects in the plants growing there.
The entire backend algorithm for the module runs on two independent CPUs (both Raspberry Pi 3B+ boards running Raspbian Stretch with PIXEL, Fig 4a). Each CPU was connected to one camera for its respective purpose: one handled lane detection and the other ran the deep learning algorithm for detecting diseases in plants. The hardware components consisted of two ultrasonic sensors at the front, two 12V heavy-duty batteries, four motor drivers (one per wheel), a lighting system, a motor connected to the pump for spraying insecticides, and a container or two filled with liquid insecticides/pesticides in which the pump motor remained submerged. All these hardware components were controlled by our algorithm via an external circuit board, an Arduino Mega (Fig 4b). All the hardware mentioned above, except the batteries and some of the wiring, was glued to the underside of the bot, and its control logic was coded in C++ and uploaded to the Arduino board.
The software backend was mostly coded in Python 3.6 on the Raspberry Pi(s).
The lane detection algorithm was developed by assigning colour codes to the plants and to the space between the rows in a field. A virtual path was then formed by drawing lines (a top-down simulation of this is shown in Fig 5).
Using this, and calculating an equal mean distance from the lines on both sides, we determined a safe traversal path for the wheels of our bot. Note that in each iteration of its traversal, the bot scans leaves while travelling along one row at a time. When the bot reaches the end of a row in the x direction, the algorithm detects this, since it also means the end of the blue line in the algorithm's view. The extent of the field is tracked at all times because the camera for lane detection is attached to the front of the bot, tilted downwards at an angle of 45 degrees.
The camera feeds each frame of the real-time video stream to the algorithm, which pre-processes it and extracts information from it. As soon as the algorithm detects the end of a row, the bot autonomously rotates and starts on a new row, moving in a zig-zag fashion (Fig 6). Fig 7 shows the lane-detection camera's view during traversal, as recorded by the bot.
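To make the idea concrete, here is a minimal sketch of this kind of colour-mask lane detection using OpenCV and NumPy. The HSV thresholds, the bottom-strip crop and the gap-finding heuristic are illustrative assumptions, not the exact values or logic we ran on the bot.

```python
import cv2
import numpy as np

# Illustrative HSV range for green crop rows; the real thresholds
# were tuned on field footage and will differ per crop and lighting.
GREEN_LOW = np.array([35, 60, 60])
GREEN_HIGH = np.array([85, 255, 255])

def steering_offset(frame):
    """Return the signed offset of the path centre from the image centre.

    Positive means the path centre lies to the right, so steer right;
    negative means steer left. Returns None when no crop rows are
    visible, which we treat as a possible end of row.
    """
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, GREEN_LOW, GREEN_HIGH)

    # Look only at the bottom strip of the frame, since the camera
    # is mounted at the front tilted ~45 degrees downwards.
    strip = mask[int(mask.shape[0] * 0.6):, :]
    columns = strip.sum(axis=0)
    if columns.max() == 0:
        return None  # no plants in view: end of row

    # Plants form green bands on both sides with free soil between
    # them; take the mean of the low-density columns as the path centre.
    free = np.where(columns < columns.max() * 0.2)[0]
    if free.size == 0:
        return None
    path_centre = int(free.mean())
    return path_centre - strip.shape[1] // 2

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
if ok:
    print("offset:", steering_offset(frame))
cap.release()
```

In a loop, an offset near zero keeps both wheels at equal speed, a non-zero offset biases the motor drivers to one side, and a run of None values signals the turn into the next row.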
The traversal was made possible by controlling each wheel's motor driver through the Arduino board according to the decisions made by the algorithm running on one of the CPUs.
The second CPU ran our deep learning algorithm for detecting infected leaves. To train our model, we had it learn from three possible classes: early blight (also known as collar rot, where the leaves form dark and/or sunken lesions at or above the soil line), late blight (where black and brown lesions appear on stems and petioles) and healthy specimens. We had about 200–300 images for each class, gathered by us on an actual farm. We developed a CNN model for the task (similar to the one shown in Fig 8).
The raw code for the various layers we added to our network is visible in Fig 9. The output layer uses the Softmax activation function, so that when we classify real-time images with the model, we get a probability for each class (early blight, late blight or healthy) for the captured image. The class with the highest probability is assigned to the image and the appropriate action is taken. For example, suppose a particular leaf is captured by the bot and, when run through the model, the algorithm returns a probability of 0.6 for early blight, 0.1 for late blight and 0.3 for healthy; the algorithm then considers the leaf infected with early blight and sprays the requisite amount of medicine on the affected area. If, instead, the probability of 0.6 had been for the healthy class, the bot would have considered it a healthy plant and resumed its traversal.
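For illustration, a comparable three-class Keras CNN with a Softmax output could be defined as below. The layer sizes, counts and input resolution are assumptions on my part; the exact architecture is the one shown in Fig 9.

```python
from tensorflow.keras import layers, models

NUM_CLASSES = 3  # early blight, late blight, healthy

# Illustrative architecture only; the actual layers are in Fig 9.
model = models.Sequential([
    layers.Conv2D(32, (3, 3), activation="relu", input_shape=(128, 128, 3)),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dropout(0.5),
    # Softmax output gives one probability per class, as described above.
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```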
After the model is trained on the three classes mentioned above, it is saved on the Raspberry Pi, and a small algorithm tests the live images the bot captures against the saved model. This is the part that actually runs in real time. The bot has a second camera mounted beneath its base that streams video live; the algorithm breaks the feed into image frames, runs the leaves (if present in a frame) through the model, and assigns each frame a suitable class. Depending on the class, the bot takes the appropriate action. One example frame of the plants as seen and recorded by this camera in real time is shown in Fig 10.
Each such frame is run through the model, and if a leaf is found to be affected, the algorithm triggers a command to the Arduino board to power up the pump motor and spray the insecticide/pesticide through a pipe attached to the motor and aimed along the camera's line of sight. If the bot encounters an unexpected obstacle, such as a rock or any other stoppage, the two ultrasonic sensors, calibrated to alert the bot to anything within a range of 20 cm, send their values to the lane detection algorithm, which halts the bot and notifies the farmer via SMS. An SMS alert is also sent when the bot's battery is about to drain or when the bot has completed its traversal.
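As a hedged sketch of what this real-time loop on the second Pi might look like: it assumes the trained Keras model saved above, pyserial for the link to the Arduino Mega, and a hypothetical one-byte "S" command standing in for whatever protocol the C++ sketch on the Arduino actually implements. The halt-on-obstacle and SMS logic would hang off the lane-detection Pi in a similar way.

```python
import cv2
import numpy as np
import serial  # pyserial
from tensorflow.keras.models import load_model

CLASSES = ["early_blight", "late_blight", "healthy"]

model = load_model("potato_leaf_model.h5")     # model saved after training
arduino = serial.Serial("/dev/ttyACM0", 9600)  # Arduino Mega over USB

cap = cv2.VideoCapture(0)  # camera mounted beneath the base
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Preprocess the frame to match the training input size.
    x = cv2.resize(frame, (128, 128)).astype("float32") / 255.0
    probs = model.predict(np.expand_dims(x, axis=0))[0]
    label = CLASSES[int(np.argmax(probs))]
    if label != "healthy":
        # Hypothetical one-byte command asking the Arduino sketch
        # to stop the wheels and run the spray pump for a short burst.
        arduino.write(b"S")
cap.release()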
2) SIS
This module was added by me later, in what seemed at the time to be a major update to the system's capabilities. Though completely independent of the FARM module, it can be considered part of the same system.
This module was relatively smaller and was tested in a small field of just 10–12 plants. Soil sensors were planted at a specific distance (10–25 cm) apart across the entire field. The sensors were connected to their circuit boards (Fig 11), each of which has four pins (A0, D0, GND and VCC). VCC is the voltage input and GND is ground. A0 and D0 are the analog and digital outputs respectively: the digital pin gives an absolute 1 or 0 depending on whether the sensor records any moisture, whereas the analog pin gives the percentage of moisture recorded in the soil.
The circuit boards were all connected to an Arduino board, onto which the code to control the sensors, read their values and take appropriate actions was uploaded. The Arduino board was connected to a CPU, a Raspberry Pi (Model 3B+), where a Python script was set to launch on each reboot. The script uses an API called 'way2sms', which enables it to send an SMS to the farmer for each action or any important notification (Fig 12).
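As an illustration of the notification piece, the snippet below wraps the unofficial 'way2sms' client behind a single helper. The import, constructor and method names vary between versions of that wrapper, so treat them, and the placeholder credentials, as assumptions rather than a fixed API.

```python
# Placeholder credentials; way2sms accounts use a mobile number + password.
W2S_USER = "9999999999"
W2S_PASS = "password"
FARMER_NUMBER = "8888888888"

def send_sms(number, message):
    """Send an SMS via the (unofficial) way2sms wrapper.

    The wrapper's interface has changed between versions, so the
    import and calls below are illustrative rather than exact.
    """
    try:
        from way2sms import Sms
        session = Sms(W2S_USER, W2S_PASS)
        session.send(number, message)
        session.logout()
    except Exception as exc:
        print("SMS failed:", exc)
```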
The threshold I have set here is 40.0: if the moisture at any particular area where a soil sensor is installed falls below 40%, an SMS notification is sent to the farmer and the script triggers the Arduino board to spray water in that area by powering on the attached motor, stopping as soon as the required moisture level is reached. Fig 13 (a and b) shows the state of the soil in an indoor test before and after the trigger.
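Building on the send_sms helper above, a minimal version of this monitoring loop could look like the following. The serial line format and the "P&lt;sensor_id&gt;" pump command are hypothetical stand-ins for the real protocol spoken by the Arduino sketch.

```python
import time
import serial  # pyserial

THRESHOLD = 40.0  # percent moisture, as set above

# Assumes the Arduino sketch prints "<sensor_id>,<moisture_percent>"
# lines over serial and accepts "P<sensor_id>" to run the pump for
# that zone; both are hypothetical, matching whatever the C++ code does.
arduino = serial.Serial("/dev/ttyACM0", 9600, timeout=2)

# send_sms and FARMER_NUMBER come from the helper sketch above.
while True:
    line = arduino.readline().decode(errors="ignore").strip()
    if not line:
        continue
    try:
        sensor_id, moisture = line.split(",")
        moisture = float(moisture)
    except ValueError:
        continue  # skip malformed lines
    if moisture < THRESHOLD:
        send_sms(FARMER_NUMBER,
                 "Moisture at sensor %s is %.1f%%; watering."
                 % (sensor_id, moisture))
        arduino.write(("P" + sensor_id).encode())  # pump until replenished
    time.sleep(1)
```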
So the proposed solution presented in this document, FARM-SIS, is a direct approach to developing a product that can help India reduce the annual losses it suffers in the agriculture sector, which range from Rs. 50,000–60,000 crore. Although not yet thoroughly tested in real-life scenarios, we did deploy the project on sample farms and obtained very promising results. Many farmers do not have sufficient knowledge of the usage patterns and spray instructions for insecticides/pesticides on crops. Fig 14 shows an estimate of the causes of crop loss that India suffers each year and their contributions. It is clear from the graph that, setting aside the damage done by weeds (unwanted plant growth among the crops), the major contributors to crop losses are insects (26%), diseases (26%) and others such as rodents and other small pests (15%). If we assume these problems could be removed entirely, that would address about two-thirds of the crop-loss problem. Though it may seem a bit far-fetched, it is high time work is done in this domain as well.
We also tested the FARM module on a real farm, and the outcome was very encouraging. The battery lasted for about 1–1.5 hours, and the bot was able to climb over rough and uneven terrain thanks to the off-road tyres we equipped it with. The bot's average speed stays between 3–5 km/h, and it stops for about 5–7 seconds whenever it classifies a leaf as unhealthy in either category (early or late blight) to spray the requisite amount of medicine on the affected area. Fig 15 shows an infected leaf as classified by the bot; here the bot has stopped to spray the required medicine over it.
When we physically rechecked our results across the farm field for the classes we trained our algorithm on, we found that the bot was able to recognise most of the infected parts, except those that were not clearly visible in the frame due to lighting issues, speed, camera angle and so on.
Human intervention was still needed several times, when the bot got stuck in potholes or a rock got jammed between its wheels, but we can at least confirm that human error in over-spraying pesticides was significantly reduced. We also developed a live platform, wirelessly connected over a local network to the camera responsible for lane detection, so that the farmer could watch the bot's live feed in real time on his own device within a certain distance.
The second module, SIS (Smart Irrigation System), was also tested on a real farm in a controlled environment. I observed the system for about 24 hours and the outcome was very encouraging. The idea was simple to execute, and there was very little scope left for human intervention. An SMS notification was sent to a pre-set mobile number each time any sensor detected the moisture level of its surrounding soil to be less than 40% (or whatever value is pre-set in the algorithm). Fig 16 shows one such SMS, received while using a single sensor in the system: as soon as the moisture level drops below the pre-set value, the Python script running on the Raspberry Pi commands the Arduino board to power up the pump and release water around that sensor until the moisture level is replenished to a certain value.
The following are all the features, their importance, and their respective functions:
Lane Detection and Autonomous movement
Description and Priority:
High priority task.
Lane detection helps the robot stay on the right track; it is implemented using the OpenCV and NumPy libraries in Python 3.6.x.
Stimulus/Response Sequences:
As soon as the bot is turned on, the action sequences in the script would be the first to be executed.
Live Streaming
Description and Priority:
Low priority task.
Live streaming is an extended feature of the F.A.R.M. which would enable the user to supervise the tasks assigned to the bot by capturing the real time footage which would enable infection detection followed by spraying of insecticide.
Stimulus/Response Sequences:
The response of live streaming would be independent of the other functionalities of the bot and hence would be a continuous process.
Infection Detection
Description and Priority:
High priority task.
The bot would use the Pi cam and software to identify the infections in the plants. It is the prime functionality of the bot.
Stimulus/Response Sequences:
The stimulus for infection detection would be the live feed received by the Pi cam and its response would be a message indicating whether the plant is infected or not.
Spraying Insecticide
Description and Priority:
Medium priority task.
The bot would spray the requisite amount of insecticide after identifying the infection/deformity.
Stimulus/Response Sequences:
The stimulus for spraying insecticide would be the message indicated by infection detection and the response would be the spraying of insecticide.
Detecting water level accurately
Description and Priority:
High priority task.
The system should be able to detect the water level in the soil accurately and should report the same to the python script running in Raspberry Pi.
Stimulus/Response Sequences:
The stimulus for this feature would be the moisture readings reported by the soil sensors; the response would be spraying water in the required areas until a certain moisture level is reached, and letting the user/admin know by SMS of any action performed.