TIAND: TiHAN IITH Autonomous Navigation Dataset

High-fidelity, reliable perception of the environment is a prerequisite for path planning and navigation control in an autonomous vehicle. This is achieved by collecting and processing data from multiple sensors, exploiting the complementary strengths of each sensor type. Many datasets in the literature serve this overarching goal for research communities around the world. However, in contrast to the structured environments observed in various parts of the world, capturing the unstructured nature of the environment, especially within the Indian context, is a challenging task.

We propose a novel dataset to fill this gap. In contrast to existing datasets within the Indian context, hardware-synchronized data is collected using a suite of on-board sensors, including short- and long-range radars, a 128-channel LiDAR, high-definition cameras, and GPS, achieving a combined 360° FoV across all sensors.

Currently, the dataset consists of camera images at 1080p resolution covering a 180° FoV. The radar point clouds cover a 360° FoV, with one long-range radar at the front of the vehicle and five short-range radars at the four corners and the back. The LiDAR is mounted on top of the vehicle, achieving a 360° FoV and generating up to 2.4 million points per second.

Data Collection Mediums

For our research purposes, we have utilized two types of drones: quadcopters and hexacopters. The quadcopter is equipped with an RGB camera, and the hexacopter with hyperspectral and multispectral cameras.

For the autonomous vehicular dataset, vehicles are equipped with LiDAR, radars, high-definition cameras, and GNSS. The hardware-synchronized sensor data can be transferred over a 5G modem.

For more information on our data collection mediums, please visit the data collection page.

Perception and Navigation Sensors

Sensors used in the UGVs:

The ego vehicle is equipped with 6 radars (5 short-range + 1 long-range), one 360° LiDAR, a GNSS unit, and 6 cameras providing a 360° FoV for the collection of data. In addition to these sensors, a 5G SIM-based modem is used to transfer the hardware-synchronized data to cloud storage.
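Even with hardware synchronization, downstream consumers of such a dataset typically still pair frames from different sensors by nearest timestamp. A minimal sketch of that pairing (the function name and the 50 ms tolerance are illustrative assumptions, not part of the dataset specification):

```python
import bisect

def nearest_timestamp_match(ref_stamps, sensor_stamps, tolerance_s=0.05):
    """For each reference timestamp, return the closest timestamp from
    another sensor's (sorted) stream, or None if none is within tolerance."""
    matches = []
    for t in ref_stamps:
        i = bisect.bisect_left(sensor_stamps, t)
        # Only the neighbours around the insertion point can be closest.
        candidates = sensor_stamps[max(0, i - 1):i + 1]
        best = min(candidates, key=lambda s: abs(s - t), default=None)
        matches.append(best if best is not None
                       and abs(best - t) <= tolerance_s else None)
    return matches
```

Pairing camera frames against LiDAR sweeps this way associates each camera image with at most one sweep.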

1. Radar

The long-range radar, the ARS430D, is a 77 GHz FMCW sensor with a digital beam-forming scanning antenna that performs two independent scans: a far scan that can detect objects up to 200 m and a near scan up to 100 m. The short-range radar, the SRR520D, is a 77 GHz FMCW sensor that can detect objects up to 100 m. Both the short-range and long-range radar hardware are equipped with on-sensor object detection and tracking capabilities.

Example:

Radar point cloud
Radar detections
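Since the far and near scans have different rated ranges (200 m and 100 m), a consumer may want to discard detections reported beyond a sensor's rated range. A small illustrative filter (the `x`/`y` field names are an assumption about the point format, not the dataset's actual schema):

```python
def within_operating_range(detections, max_range_m):
    """Keep radar detections whose radial distance from the sensor is
    at most max_range_m (e.g. 200.0 for the far scan, 100.0 for the near scan)."""
    return [d for d in detections
            if (d["x"] ** 2 + d["y"] ** 2) ** 0.5 <= max_range_m]
```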

2. LiDAR

The high-resolution 128-channel LiDAR has a 360° horizontal FoV and a 40° vertical FoV, with frame rates configurable between 5 Hz and 20 Hz. It generates up to 2.4 million data points per second.

Example:

LiDAR point cloud
Object annotations on LiDAR point cloud
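The point throughput of a spinning LiDAR follows directly from its configuration: channels × azimuth samples per revolution × revolutions per second. A quick sanity check (the azimuth resolution and rate below are typical values for a 128-channel sensor, not necessarily this dataset's exact settings):

```python
def lidar_points_per_second(channels, azimuth_samples, frame_rate_hz):
    """Theoretical throughput of a spinning LiDAR: every revolution
    emits one point per channel at each azimuth step."""
    return channels * azimuth_samples * frame_rate_hz

# e.g. 128 channels x 2048 azimuth steps x 10 Hz = 2,621,440 points/s,
# the same order of magnitude as the ~2.4 million points/s quoted above.
```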

3. GNSS

The GNSS system in the vehicle consists of rugged high-precision antennas with ultra-durable watertight enclosures and a receiver with internal storage and INS options. The system provides precise positioning data with 1 cm accuracy at an output data rate of 10 Hz in real time. Each record contains features such as latitude, longitude, height, roll, pitch, yaw, north velocity, east velocity, and the standard deviations of all these features.
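The listed fields map naturally onto a small record type; the names below are illustrative, not the receiver's actual log schema. Ground speed, for example, follows from the north/east velocity components:

```python
from dataclasses import dataclass

@dataclass
class GnssRecord:
    """One 10 Hz GNSS/INS sample (field names are illustrative)."""
    latitude_deg: float
    longitude_deg: float
    height_m: float
    roll_deg: float
    pitch_deg: float
    yaw_deg: float
    north_velocity_mps: float
    east_velocity_mps: float

def horizontal_speed_mps(rec: GnssRecord) -> float:
    """Ground speed as the magnitude of the north/east velocity vector."""
    return (rec.north_velocity_mps ** 2 + rec.east_velocity_mps ** 2) ** 0.5
```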

4. Camera

The acA1920-40gc Basler ace GigE camera is equipped with a Sony IMX249 sensor. It has a default resolution of 1920 × 1200 pixels and a frame rate of 42 fps.

Examples:

Raw images
Annotated images
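At this resolution and frame rate the raw stream per camera is substantial, which motivates the GigE interface and the 5G uplink mentioned earlier. A back-of-the-envelope rate calculation (3 bytes/pixel assumes 8-bit RGB after debayering; the actual on-wire format may differ):

```python
def raw_stream_rate_mb_s(width, height, bytes_per_pixel, fps):
    """Uncompressed video data rate in megabytes per second."""
    return width * height * bytes_per_pixel * fps / 1e6

# 1920 x 1200 pixels x 3 bytes x 42 fps ≈ 290 MB/s per camera.
```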

Sensors used in the UAVs:

TiHAN quadcopters and hexacopters are equipped with sensors such as RGB, multispectral, hyperspectral, LiDAR, and thermal sensors, which are used for estimating plant traits and diseases in agriculture.

A key challenge is that images obtained from UAVs vary in quality with the UAV, camera, and environmental factors. Hence, we used state-of-the-art AI/ML/DL techniques to estimate the traits.

1. Hyperspectral Sensor Imaging (HSI)

Hyperspectral imaging sensor

Hyperspectral camera on UAV flying

RGB view of HS image of Sorghum

UAVs mounted with HSI offer wide coverage areas, short revisit periods, and high spectral and spatial resolution. UAV-based HSI can be employed in agriculture for canopy nutrient analysis (phosphorus, nitrogen, etc.), early detection of canopy water stress, crop biomass and yield estimation, disease/pest stress detection, etc.
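Hyperspectral cubes contain many narrow bands, so applications usually start by locating the band nearest to a wavelength of interest (e.g. red at ~670 nm, near-infrared at ~800 nm). A minimal helper, with an illustrative band list in the test rather than the camera's actual band centres:

```python
def nearest_band(wavelengths_nm, target_nm):
    """Index of the spectral band whose centre wavelength is closest
    to the requested target wavelength."""
    return min(range(len(wavelengths_nm)),
               key=lambda i: abs(wavelengths_nm[i] - target_nm))
```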

2. Multispectral Imaging Sensor (MSI)

Multispectral Camera

Light Sensor and GPS module with RGB Camera for Multispectral Camera

Processed image

Multispectral imaging involves the collection of data across the electromagnetic spectrum, usually including light both visible and invisible to the human eye. Use cases include vegetation water content, biomass density or relative biomass, chlorophyll content, and crop productivity or yield estimation.
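Several of these use cases rest on band-ratio indices. The most common is NDVI, which contrasts near-infrared and red reflectance; healthy vegetation reflects strongly in NIR, so higher values indicate denser or healthier canopy. A per-pixel sketch (the small epsilon guarding against division by zero is our addition):

```python
def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index for one pixel:
    (NIR - Red) / (NIR + Red). Values range roughly from -1 to 1."""
    return (nir - red) / (nir + red + eps)
```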

3. RGB Camera

DJI Zenmuse X5

An RGB camera delivers colored images of people and objects by capturing light in the red, green, and blue wavelengths (RGB). It utilizes visible light with wavelengths ranging from 400 to 700 nm.