Emesh Agriculture

Photo by Dan Meyers on Unsplash

Many global issues remain unsolved: world hunger, climate change, food waste, resource depletion, and more. To add fuel to the world-hunger fire, we will need to produce 70% more food by 2050 to feed the entire population.

The second agricultural revolution transformed farming by introducing new crop-rotation techniques and inventions to help pick cotton. This revolution helped our society become what it is today, but it is not good enough for our future. It is not sustainable.

Every year, agriculture uses 2.8 trillion cubic meters of freshwater, 189.9 million metric tonnes of fertilizer, and 2 million tonnes of pesticides, and it emits 6.3 billion tonnes of greenhouse gases.

If we want a sustainable future, we will need a third agricultural revolution: the reinvention of agriculture.

Emesh Agriculture is a company looking to create a world of sustainable agriculture by improving current data collection methods to help farmers’ decision-making.

We will do this by utilizing nanosensors and drones to collect data on the conditions of the plants and soils, which AI will then analyze.


Nanosensors operate on the nanoscale, between 1 and 100 nanometers, which is about one-thousandth the width of a human hair.

Because of their small size, they have unique properties that larger sensors lack: they respond faster, are more sensitive, and can detect multiple analytes.

How it works-

Nanosensors work by monitoring electrical changes in the sensor material: when an external interaction alters the material, the nanosensor detects the change and communicates it to the other nanocomponents.

They are passive, meaning they do not require batteries; they harvest energy themselves, making them suitable for long-term use. Their low limit of detection and high sensitivity make nanosensors well suited to smart agriculture.

Nanosensors at Emesh Agriculture-

We will place the nanosensors in the soil to measure metrics such as soil moisture, pH, and nutrient levels. We will use chemical nanosensors to analyze the soil properties, and top-down and bottom-up methods to manufacture the nanomaterials.

Chemical Nanosensors

Chemical nanosensors detect changes in the electrical conductivity of the nanomaterial. They are made up of two parts: a chemical recognition receptor and a physicochemical transducer. The receptor reacts with the analyte, altering the material's physical properties so that the transducer can measure them. Once an analyte or chemical species is detected, a precise analysis of the signal follows.
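To make the receptor-transducer idea concrete, here is a minimal sketch of the final step: converting a measured conductivity change into an analyte concentration via a linear calibration curve. The function name, units, and calibration values are all illustrative assumptions, not part of any real sensor's specification.

```python
# Hypothetical sketch: a chemical nanosensor's transducer reads a change in
# electrical conductivity, and a calibration curve maps that change to an
# analyte concentration. The slope and intercept below are made-up values.

def conductivity_to_concentration(delta_conductivity_us, slope=0.8, intercept=0.05):
    """Map a conductivity change (microsiemens) to an estimated analyte
    concentration (mg/L) using a linear calibration curve; clamp at zero."""
    return max(0.0, (delta_conductivity_us - intercept) / slope)

# A reading of 2.45 uS above baseline maps to (2.45 - 0.05) / 0.8, i.e. ~3.0 mg/L
print(conductivity_to_concentration(2.45))
```

In practice a real sensor's calibration would come from lab measurements and may be non-linear; the linear curve here only illustrates the transduction step.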

A farm equipped with nanosensors would have large quantities of them in the soil, creating a blanket of sensors that can provide detailed, seamless data of any area of the field.
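A sketch of how readings from such a sensor blanket might be aggregated per field zone is shown below. The metric names, healthy ranges, and zone data are hypothetical assumptions for illustration, not Emesh Agriculture's actual thresholds.

```python
# Hypothetical sketch: average the readings from a zone's blanket of soil
# nanosensors and flag any metric that drifts outside a healthy range.
# Metric keys and range values are illustrative assumptions.

HEALTHY_RANGES = {
    "moisture_pct": (20.0, 60.0),   # volumetric soil moisture, %
    "ph":           (6.0, 7.5),
    "nitrogen_ppm": (20.0, 50.0),
}

def flag_zone(readings):
    """Average each metric across a zone's sensors and return
    the metrics whose averages fall outside their healthy range."""
    flags = {}
    for metric, values in readings.items():
        avg = sum(values) / len(values)
        low, high = HEALTHY_RANGES[metric]
        if not (low <= avg <= high):
            flags[metric] = round(avg, 1)
    return flags

zone_a = {"moisture_pct": [72.0, 68.5, 70.1],
          "ph": [6.8, 6.9, 7.0],
          "nitrogen_ppm": [31.0, 29.5, 33.2]}
print(flag_zone(zone_a))  # moisture is too high; pH and nitrogen are in range
```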

Manufacturing Nanomaterials-

There are two ways to manufacture nanomaterials: the top-down approach (Nanolithography) and the bottom-up approach.

The top-down approach imprints, writes, or etches patterns into a bulk material to create microscopic structures, and can produce patterns with sizes down to the nanometer range.

While cost-effective, this approach tends to produce nanomaterials with defects.

The bottom-up approach, however, builds the nanomaterials up from individual atoms or molecules.

While more expensive, this approach produces nanomaterials with fewer defects.

To impact the agricultural industry on a larger scale, we have to ensure that these sensors are environmentally safe and find a low-cost and high-quality solution to manufacture them.

Drones as Sensors

Drones collect data 2.5 times more efficiently than manual inspection, and they are 25% more accurate.

At Emesh Agriculture, we will use drones — particularly fixed-wing drones — to increase the variety of available data to farmers. While the sensors are on the ground, the drones will be used to get an aerial view of the fields.

Fixed-Wing Drones

Fixed-wing drones are faster than multi-rotor drones: their endurance and high-speed cruising make land mapping about 2.6x quicker, and they can capture high-resolution photos of the field. They also withstand high winds, which suits them to mapping large areas, though they require more take-off and landing space.

Multispectral Sensor-

The drones will be equipped with multispectral sensors that detect wavelengths of light, which can be used to analyze plant health.

The light bands re-emitted by chlorophyll molecules during photosynthesis are captured by the multispectral sensors mounted on the underside of the drone, which can collect all the multispectral data needed in a single flight across the area.

The wavelengths of light emitted from chlorophyll are used to measure plant health because chlorophyll is essential to the success of photosynthesis. By analyzing these wavelengths, we can gather data about crop count, size, health, maturity, and damage. If the wavelength falls within the green band of 495–570 nanometers, the plant is healthy. If it falls within the yellow, orange, or red bands, from 580–700 nanometers, farmers should isolate the crops emitting those wavelengths.
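The classification rule above can be sketched in a few lines of code. The band boundaries come straight from the text; the function name and the handling of wavelengths outside both bands are our own assumptions.

```python
# Sketch of the rule described above: a dominant re-emitted wavelength in the
# green band (495-570 nm) indicates a healthy plant, while yellow/orange/red
# (580-700 nm) indicates a crop the farmer should isolate.

def classify_plant(wavelength_nm):
    """Classify plant health from its dominant re-emitted wavelength (nm)."""
    if 495 <= wavelength_nm <= 570:
        return "healthy"
    if 580 <= wavelength_nm <= 700:
        return "isolate"
    return "unclassified"  # outside both bands; the article does not cover this case

print(classify_plant(530))  # healthy
print(classify_plant(650))  # isolate
```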

Analyzing the data

The data collected from both the nanosensors and the drones will be analyzed by a machine learning algorithm that sorts through all the pixels. In the above map, for example, each pixel is assigned a number from -1 to 1: -1 indicates water; 0 indicates green, meaning the plant in that pixel is healthy; and 1 indicates yellow, orange, or red, meaning poor plant health. As for the nanosensors, their data will tell the farmer what is off in the field (for example, that soil moisture is too high), enabling the farmer to regulate the resources they put in while maximizing crop yield.
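A minimal sketch of that pixel-labelling scheme follows. The article only names the endpoints of the -1 to 1 scale, so the cutoffs between the water, healthy, and poor categories below are assumptions chosen for illustration.

```python
# Sketch of the pixel map described above: each pixel's index value in [-1, 1]
# is mapped to water, healthy vegetation, or poor plant health.
# The category cutoffs (-0.3 and 0.3) are illustrative assumptions.

def label_pixel(index):
    """Map a pixel's index value to a category."""
    if index < -0.3:
        return "water"
    if index <= 0.3:
        return "healthy"
    return "poor"

field = [-0.8, 0.1, 0.05, 0.7, -0.9, 0.2]
labels = [label_pixel(v) for v in field]
print(labels.count("poor"), "of", len(field), "pixels show poor plant health")
# prints: 1 of 6 pixels show poor plant health
```

A real pipeline would compute each pixel's index from the multispectral bands (for instance, with a vegetation index such as NDVI) before labelling; this sketch covers only the labelling step.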

In the next 2–5 years, we at Emesh Agriculture want this idea of a connected farm to become a reality: a world where agriculture runs at maximum efficiency with minimum impact on the earth.

Thank you for giving this article a read. Massive shoutout to Samantha Hatcher, Syona Gupta, and Jeffrey Huynh for all the research and hard work put into this project.