Open Source Datasets

TiAND

Comprehensive multimodal datasets for autonomous navigation research and development across terrestrial, aerial, underwater, and agricultural domains

4 Categories
15+ Datasets
100% Open Source

Terrestrial Datasets

Comprehensive ground vehicle datasets including multimodal, camera, radar, LiDAR, GNSS, and V2X communication data for autonomous navigation research.

6 Datasets

Aerial Datasets

UAV-based datasets for agriculture, infrastructure, and transportation applications including marker-based landing, LiDAR navigation, and crop recognition.



3 Datasets

Underwater Datasets

Sonar sensor datasets for autonomous underwater vehicles (AUVs) enabling research in underwater navigation and object detection.

1 Dataset

Agriculture Datasets

RGB, multispectral, and hyperspectral datasets for crop health monitoring, stress detection, and precision agriculture applications.


4 Datasets

Data Collection Vehicles

The datasets contain data from both UAVs and AGVs: the UAVs comprise quadcopter and hexacopter drones, while the AGVs are sedan-class ego vehicles.

UAVs (Unmanned Aerial Vehicles)

Quadcopter Drone

Average Speed: 5 m/s
Altitude Capability: 120m
Max Takeoff Weight: 2 kg
Cameras: Multispectral & Hyperspectral

The Quadcopter Drone carries multispectral and hyperspectral cameras tailored for agricultural applications, along with an RGB camera for visual imagery. In addition to solo missions, it supports swarm operations, enabling coordinated, synchronized flights for greater efficiency and coverage.

Hexacopter Drone

Altitude Capability: 120m
Average Speed: 5 m/s
Max Takeoff Weight: 5-6 kg
Range: Up to 5km

The Hexacopter Drone offers an extended range of up to 5 km for expansive coverage and is likewise equipped with high-performance multispectral and hyperspectral cameras. Its capabilities include precision landing, marker-based landing, and BVLOS (beyond visual line of sight) operation.

AGVs (Autonomous Ground Vehicles)

Sensor data is collected and hardware-synchronized across a full suite of sensors. Our ego vehicle carries the following:

  • 6 Radars (5 short-range radars + 1 long-range radar)
  • 1 360° LiDAR
  • 6 Cameras providing a 360° FOV
  • GNSS receiver with INS

In addition to these sensors, a 5G SIM-based modem transfers the hardware-synchronized data to cloud storage.

Sensors Used in the AGVs

1. Radar

Long-Range Radar: ARS430D - 77 GHz FMCW (far scan up to 200m, short scan up to 100m)
Short-Range Radar: SRR520D - 77 GHz FMCW (up to 100m)

The long-range radar, ARS430D, is a 77 GHz FMCW sensor with a digital beam-forming scanning antenna that performs two independent scans: the far-range scan detects objects up to 200 m, and the short-range scan up to 100 m. The short-range radar, SRR520D, is a 77 GHz FMCW sensor with a detection range of up to 100 m. Both radars provide object annotation and tracking capabilities in hardware.

2. LiDAR

Type: High resolution 128 channel LiDAR
Horizontal FOV: 360°
Vertical FOV: 40°
Frame Rate: 5Hz to 20Hz (customizable)
Data Points: 2.4 million per second
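
As a quick back-of-the-envelope check on these figures, the per-frame point count follows from the point rate divided by the frame rate (a sketch only; actual counts depend on the sensor's return mode and configuration):

```python
# Approximate points per LiDAR frame from the quoted specs:
# 2.4 million points/s at frame rates between 5 Hz and 20 Hz.
POINTS_PER_SECOND = 2_400_000

def points_per_frame(frame_rate_hz: float) -> int:
    """Rough number of points in a single 360-degree sweep."""
    return round(POINTS_PER_SECOND / frame_rate_hz)

for rate in (5, 10, 20):
    print(rate, "Hz ->", points_per_frame(rate), "points/frame")
```

At the default rates this works out to roughly half a million points per sweep at 5 Hz, dropping to 120,000 at 20 Hz.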

3. GNSS

Antennas: Rugged high precision with ultra-durable watertight enclosures
Precision: 1 cm accuracy
Output Rate: 10Hz in real-time
Features: Latitude, longitude, height, roll, pitch, yaw, velocities, standard deviations

The GNSS system in the vehicle consists of rugged high-precision antennas with ultra-durable watertight enclosures and a receiver with internal storage and INS options. It provides positioning data with 1 cm accuracy at an output rate of 10 Hz in real time. The collected data includes latitude, longitude, height, roll, pitch, yaw, north and east velocities, and the standard deviations of each feature.
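
A minimal sketch of loading such a GNSS log with pandas. The column names below are hypothetical placeholders for illustration; check the dataset's own CSV header for the actual names:

```python
import io
import pandas as pd

# Illustrative two-row sample; real files come from the dataset download.
sample = io.StringIO(
    "timestamp,latitude,longitude,height,roll,pitch,yaw,north_vel,east_vel\n"
    "0.0,17.5449,78.5718,542.1,0.2,-0.1,90.3,1.2,0.4\n"
    "0.1,17.5449,78.5719,542.1,0.2,-0.1,90.4,1.2,0.5\n"
)

gnss = pd.read_csv(sample)
# At the quoted 10 Hz output rate, consecutive rows are 0.1 s apart.
dt = gnss["timestamp"].diff().dropna()
print(gnss[["latitude", "longitude", "yaw"]])
```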

4. Camera

Model: acA1920-40gc Basler ace GigE
Sensor: Sony IMX249
Resolution: 1920 x 1200 pixels
Frame Rate: 42 fps

Sensors Used in the UAVs

TiHAN quadcopters and hexacopters were equipped with RGB, multispectral, hyperspectral, LiDAR, and thermal sensors, which are used to estimate plant traits and detect diseases in agriculture. Because UAV imagery varies in quality with the UAV, camera, and environmental conditions, we applied state-of-the-art AI/ML/DL techniques to estimate the traits.

1. Hyperspectral Sensor Imaging (HSI)

Model: Pika L Hyperspectral imaging sensor
Spectral Range: 400 to 1000 nm
Spectral Resolution: 2.1 nm
Spectral Channels: 281
Spatial Channels: 900
Max Frame Rate: 249 frames per second

Complex phenotyping traits such as abiotic and biotic stress adaptation can be explored more precisely through the rich spectral information captured by Hyperspectral Imaging (HSI) sensors in hundreds of narrow spectral bands. UAV-based HSI can be employed in agriculture for nutrient analysis of canopy (phosphorous, nitrogen, etc.), early detection of canopy water stress, crop biomass and yield estimation, disease/pest stress detection etc.
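
Assuming evenly spaced band centers across the quoted 400-1000 nm range (a sketch only; verify against the band list distributed with the cubes, as the exact centers may differ), a target wavelength can be mapped to its nearest channel index:

```python
# Map a wavelength (nm) to the nearest spectral channel index for a
# 281-channel cube covering 400-1000 nm.
SPECTRAL_MIN_NM = 400.0
SPECTRAL_MAX_NM = 1000.0
NUM_CHANNELS = 281

STEP_NM = (SPECTRAL_MAX_NM - SPECTRAL_MIN_NM) / (NUM_CHANNELS - 1)

def channel_for_wavelength(nm: float) -> int:
    """Nearest 0-based channel index for a target wavelength."""
    if not SPECTRAL_MIN_NM <= nm <= SPECTRAL_MAX_NM:
        raise ValueError("wavelength outside sensor range")
    return round((nm - SPECTRAL_MIN_NM) / STEP_NM)

# e.g. the red-edge region around 720 nm:
print(channel_for_wavelength(720.0))
```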

2. Multispectral Imaging Sensor (MSI)

Model: Micasense RedEdge-MX
Bands: 5-band light sensor
Accessory: Downwelling Light Sensor (DLS)

Multispectral imaging collects data across the electromagnetic spectrum, including light both visible and invisible to the human eye. The Micasense RedEdge-MX ships with a Downwelling Light Sensor (DLS), a 5-band light sensor that measures ambient light conditions during flight for each of the camera's five spectral bands and stores the readings in the metadata of the captured images. Use cases include vegetation water content, biomass density or relative biomass, chlorophyll content, and crop productivity or yield estimation.
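
As an illustration of how such multispectral bands are typically used (a generic sketch, not code shipped with the dataset), a vegetation index such as NDVI can be computed per pixel from the red and near-infrared bands:

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """NDVI = (NIR - Red) / (NIR + Red), computed per pixel."""
    nir = nir.astype(float)
    red = red.astype(float)
    # Clamp the denominator to avoid division by zero on dark pixels.
    return (nir - red) / np.clip(nir + red, 1e-9, None)

# Toy 2x2 reflectance patches (illustrative values only).
nir_patch = np.array([[0.6, 0.5], [0.4, 0.3]])
red_patch = np.array([[0.1, 0.1], [0.2, 0.2]])
print(ndvi(nir_patch, red_patch))
```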

3. RGB Camera

Model: DJI Zenmuse X5
Wavelength Range: 400 to 700nm
Type: Visible light RGB sensor

An RGB camera delivers color images of people and objects by capturing light in the red, green, and blue wavelengths, using visible light in the 400-700 nm range. The Zenmuse X5 RGB sensor mounted on the drone is programmed to capture images, which are then aligned and geo-rectified using Agisoft PhotoScan. The software builds a dense point cloud from the raw images, from which a digital elevation model (DEM) and an orthomosaic are generated. The orthomosaic is further segmented into plot-wise images using shape files with the QGIS tool and R software. In addition to the UAV-captured data, we also collected hand-held camera data for the paddy crop.

Download Process

Step 1: Fill the Google Form

Complete the Google form to request access to the dataset.

Step 2: Receive Confirmation

We will review your request and send you a confirmation email.

Step 3: Download the Dataset

Once approved, you will receive a link to download the dataset.

Note: All datasets are subject to a data usage agreement. Please review the terms before downloading.

Terrestrial Datasets

Ground vehicle datasets for autonomous navigation research

1. Multimodal Dataset

A meticulously curated multimodal dataset supporting object detection algorithms. It features four cameras, six radars, LiDAR, GPS, and IMU. Data was collected in Hyderabad, India, in 2-4 minute scenes with synchronized data streams.

Request Access
Fill Google form to receive download link via email

2. Camera Dataset

An annotated camera dataset covering 3,000 km across diverse road types, including national highways, state highways, district roads, and rural and urban environments. The data was captured using Basler acA1920-40gc GigE cameras and forms part of the 'DriveIndia Dataset'.

Download the EULA form, fill it out, and upload it in the request form.

Fill Google form to receive download link via email

3. RADAR Dataset

Raw radar data from 1 long-range and 5 short-range radars with 120° and 150° FOVs and detection ranges of up to 200m and 100m, respectively. CSV files contain 13 channels, including distance, velocity, acceleration, and standard deviations.
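
As an illustration only — the 13 channel names and file layout are defined by the dataset itself — detections can be sanity-checked against the quoted maximum ranges like this:

```python
# Hypothetical per-detection records (radar type, distance in meters);
# real CSVs carry 13 channels with their own column names.
MAX_RANGE_M = {"long_range": 200.0, "short_range": 100.0}

def in_spec(radar_type: str, distance_m: float) -> bool:
    """True if a detection lies within the quoted range of its radar."""
    return 0.0 <= distance_m <= MAX_RANGE_M[radar_type]

detections = [
    ("long_range", 185.0),
    ("short_range", 120.0),   # beyond the 100 m short-range spec
    ("short_range", 42.5),
]
valid = [d for d in detections if in_spec(*d)]
print(valid)
```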

Request Access
Fill Google form to receive download link via email

4. LiDAR Datasets

Two LiDAR datasets: Object Detection (128-channel Ouster OS2-128) and Ground Dataset for ground point removal validation. Organized with synchronized camera images, calibration data, and labeled point clouds.

Request Access
Fill Google form to receive download link via email

5. GNSS Dataset

Raw GNSS data recorded and stored in CSV format with crucial navigation information for each data point.

Request Access
Fill Google form to receive download link via email

6. V2X Communication Dataset

V2X dataset collected in Hyderabad, India covering Vehicle-to-Vehicle (V2V), Vehicle-to-Infrastructure (V2I), Infrastructure-to-Vehicle (I2V), and Vehicle-to-Cloud (V2C) communication types.

Request Access
Fill Google form to receive download link via email

Aerial Datasets

UAV-based datasets for agriculture, infrastructure, and transportation

1. Marker Based Landing (MBL) Navigation Dataset

Comprehensive dataset with 7,517 high-resolution images captured across various altitudes and weather conditions. Features 2D annotations for custom marker class with 70/15/15 train/validation/test split.
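
The 70/15/15 split can be reproduced in spirit with a simple shuffled partition — a sketch only; use the split files shipped with the dataset for the authoritative assignment:

```python
import random

def split_70_15_15(items, seed=0):
    """Shuffle and partition a list into 70/15/15 train/val/test."""
    items = list(items)
    random.Random(seed).shuffle(items)
    n = len(items)
    n_train = int(0.70 * n)
    n_val = int(0.15 * n)
    return (items[:n_train],
            items[n_train:n_train + n_val],
            items[n_train + n_val:])

# With the 7,517 images quoted above:
train, val, test = split_70_15_15(range(7517))
print(len(train), len(val), len(test))
```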

Available Soon

2. LiDAR Navigation Dataset

LiDAR data collected at various altitudes with pre-flight calibration for each data collection session.

Available Soon

3. Paddy Growth Stage Recognition

Benchmark dataset for growth stage detection with 837 images and 4,928 annotated instances. Features 6 classes: Weed, Seedling, Tillering, Booting, Flowering, and Ripening stages.

Request Access
Fill Google form to receive download link via email

Underwater Datasets

Sonar sensor datasets for autonomous underwater vehicles

SONAR Dataset

Side scan sonar dataset captured using SSS-600K with 10-75 meter scanning range. High-quality images from Hyderabad lakes in .xtf format, compatible with JW Fisher and SonarView software for underwater terrain analysis.

Request Access
Fill Google form to receive download link via email

Agriculture Datasets

RGB, multispectral, and hyperspectral datasets for precision agriculture

1. Camera (Aerial)

RGB camera dataset of paddy and maize crops collected over three seasons (2018-2020) in collaboration with PJTSAU. Part of DSFS project for sustainable crop production under climatic change.

Available Soon

2. Multispectral (Aerial)

Multispectral camera dataset for crop head detection (Maize Tassel, Paddy Panicle) and early water stress identification in Maize. Three seasons of data collection in Kharif and Rabi seasons.

Request Access
Fill Google form to receive download link via email

3. Hyperspectral Imaging (Aerial)

UAV captured crop hyperspectral imaging datasets in 282 spectral channels across the 400-1000nm range. The imaging data has been collected in collaboration with ICRISAT Hyderabad for crop monitoring and phenotyping applications.
The datasets include:
1. Crop early water stress identification in Groundnut and Pearl Millet
2. Crop type classification

Request Access
Fill Google form to receive download link via email

4. Camera (Terrestrial)

Hand-held RGB camera data for paddy crops collected alongside UAV data for comprehensive crop monitoring and analysis.

Available Soon

Related Publications

Nitish Kumar et al. - "TIAND: A Multimodal Dataset for Autonomy on Indian Roads", IEEE Intelligent Vehicles Symposium (IV) 2024.

DOI: 10.1109/IV55156.2024.10588583

B. Anand et al. - "LiDAR-INS/GNSS-Based Real-Time Ground Removal, Segmentation, and Georeferencing Framework for Smart Transportation", IEEE Transactions on Instrumentation and Measurement, 2021.

DOI: 10.1109/TIM.2021.3117661

Parvez Alam, P. Rajalakshmi - "Deep Learning based steering angle prediction with LiDAR for Autonomous vehicle", IEEE Vehicular Technology Conference 2023.

DOI: 10.1109/VTC2023-Spring57618.2023.10201141

Bhaskar Anand et al. - "Evaluation of the quality of LiDAR data in the varying ambient light", IEEE Sensors Applications Symposium (SAS) 2022.

DOI: 10.1109/SAS54819.2022.9881373

Bhaskar Anand, P. Rajalakshmi - "Pipeline for automation of LiDAR data annotation", IEEE Sensors Applications Symposium (SAS) 2023.

DOI: 10.1109/SAS58821.2023.10254180

Bhaskar Anand, P. Rajalakshmi - "BEV Approach Based Efficient Object Detection using YoloV4 for LiDAR Point Cloud", IEEE Vehicular Technology Conference 2023.

DOI: 10.1109/VTC2023-Spring57618.2023.10200314

Bhaskar Anand, P. Rajalakshmi - "Client-Server Based Implementation of LiDAR Data Streaming System on ROS platform", IEEE International Symposium on Real-Time Distributed Computing (ISORC) 2023.

DOI: 10.1109/ISORC58943.2023.00034

Bhaskar Anand et al. - "Quantitative Comparison of LiDAR Point Cloud Segmentation for Autonomous Vehicles", IEEE Vehicular Technology Conference 2021.

DOI: 10.1109/VTC2021-Fall52928.2021.9625507

Bhaskar Anand et al. - "Comparative Run Time Analysis of LiDAR Point Cloud Processing with GPU and CPU", IEEE International Conference on Computing, Power and Communication Technologies 2020.

DOI: 10.1109/GUCON48875.2020.9231067

Bhaskar Anand et al. - "An experimental analysis of various multi-channel LiDAR systems", IEEE International Conference on Computing, Power and Communication Technologies 2020.

DOI: 10.1109/GUCON48875.2020.9231195

Bhaskar Anand et al. - "Real Time LiDAR Point Cloud Compression And Transmission For Intelligent Transportation System", IEEE Vehicular Technology Conference 2019.

DOI: 10.1109/VTCSpring.2019.8746417

Anjani Josyula et al. - "Fast Object Segmentation Pipeline for Point Clouds Using Robot Operating System", IEEE World Forum on Internet of Things 2019.

DOI: 10.1109/WF-IoT.2019.8767255

Shantanu Yadav et al. - "Vehicle Detection and Tracking using Radar for Lane Keep Assist Systems", IEEE Vehicular Technology Conference 2023.

DOI: 10.1109/VTC2023-Spring57618.2023.10199286

H. N. Srikanth et al. - "Pothole Detection for Autonomous Vehicles in Indian Scenarios using Deep Learning", IEEE International Symposium on Real-Time Distributed Computing 2023.

DOI: 10.1109/ISORC58943.2023.00033

Bhaskar Anand et al. - "A Novel Real-Time LiDAR Data Streaming Framework", IEEE Sensors Journal 2022.

DOI: 10.1109/JSEN.2022.3215189

Anjani Josyula et al. - "Coarse Object Tracking Technique for Point Clouds", IEEE Sensors Applications Symposium 2020.

DOI: 10.1109/SAS48726.2020.9220053

Bhaskar Anand et al. - "Region of Interest and Car Detection using LiDAR data for Advanced Traffic Management System", IEEE World Forum on Internet of Things 2020.

DOI: 10.1109/WFIoT48130.2020.9221354