Comprehensive multimodal datasets for autonomous navigation research and development across terrestrial, aerial, underwater, and agricultural domains
Comprehensive ground vehicle datasets including multimodal, camera, radar, LiDAR, GNSS, and V2X communication data for autonomous navigation research.
UAV-based datasets for agriculture, infrastructure, and transportation applications including marker-based landing, LiDAR navigation, and crop recognition.
Sonar sensor datasets for autonomous underwater vehicles (AUVs) enabling research in underwater navigation and object detection.
RGB, multispectral, and hyperspectral datasets for crop health monitoring, stress detection, and precision agriculture applications.
The dataset consists of data from both UAVs and AGVs. The UAVs include quadcopter and hexacopter drones, and the AGVs are sedan-class ego vehicles.
The quadcopter drone is equipped with RGB, multispectral, and hyperspectral cameras tailored for agricultural applications. In addition to solo missions, it is also capable of swarm operations, enabling coordinated, synchronized flights for greater efficiency and coverage.
The hexacopter drone offers an extended range of up to 5 km for expansive coverage. It is equipped with high-performance multispectral and hyperspectral cameras and supports precision landing, marker-based landing, and BVLOS (beyond visual line of sight) operation.
Sensor data is collected from a hardware-synchronized sensor suite. Our ego vehicle carries the following sensors:
The ego vehicle is equipped with 6 RADARs (5 short-range + 1 long-range), 1 360° LiDAR, GNSS, and 6 cameras providing a 360° FoV. In addition to the sensors, a 5G SIM-card based modem is used to transfer the hardware-synchronized data to cloud storage.
The long-range radar, the ARS430D, is a 77 GHz FMCW sensor with a digital beam-forming scanning antenna that performs two independent scans: a far-range scan detecting objects up to 200 m and a short-range scan detecting objects up to 100 m. The short-range radar, the SRR520D, is a 77 GHz FMCW sensor that detects objects up to 100 m. Both the short-range and long-range sensors provide on-board object annotation and tracking.
The GNSS system in the vehicle consists of rugged, high-precision antennas in ultra-durable watertight enclosures and a receiver with internal storage and INS options. The system provides positioning data with 1 cm accuracy at a 10 Hz output rate in real time. The collected data includes latitude, longitude, height, roll, pitch, yaw, north velocity, east velocity, and the standard deviations of each of these quantities.
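Since the GNSS logs are plain CSV, loading them needs nothing beyond the standard library. The sketch below is illustrative only: the column names (`latitude`, `yaw`, etc.) and the sample row are hypothetical, and the actual headers in the released CSVs may differ.

```python
import csv
import io

# Hypothetical header and row; the dataset's real CSV columns may differ.
sample = """latitude,longitude,height,roll,pitch,yaw,north_velocity,east_velocity
17.5449,78.5718,542.3,0.12,-0.05,92.4,1.8,0.3
"""

def load_gnss_rows(text):
    """Parse GNSS CSV text into a list of dicts with float values."""
    return [{k: float(v) for k, v in row.items()}
            for row in csv.DictReader(io.StringIO(text))]

rows = load_gnss_rows(sample)
print(rows[0]["latitude"], rows[0]["yaw"])  # 17.5449 92.4
```

In practice you would pass the contents of a dataset file to `load_gnss_rows` instead of the inline sample string.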
TiHAN quadcopters and hexacopters were equipped with RGB, multispectral, hyperspectral, LiDAR, and thermal sensors, which are used for estimating crop traits and detecting plant diseases in agriculture. A key challenge is that images obtained from UAVs vary in quality across UAV platforms, cameras, and environmental conditions. Hence, we used state-of-the-art AI/ML/DL techniques to estimate the traits.
Complex phenotyping traits such as abiotic and biotic stress adaptation can be explored more precisely through the rich spectral information captured by Hyperspectral Imaging (HSI) sensors in hundreds of narrow spectral bands. UAV-based HSI can be employed in agriculture for nutrient analysis of canopy (phosphorous, nitrogen, etc.), early detection of canopy water stress, crop biomass and yield estimation, disease/pest stress detection etc.
Multispectral imaging involves the collection of data across the electromagnetic spectrum, usually including light both visible and invisible to the human eye. The Micasense RedEdge-MX includes a Downwelling Light Sensor (DLS), a 5-band light sensor that measures the ambient light conditions during a flight for each of the camera's five spectral bands and stores this data in the metadata of the captured images. Use cases include vegetation water content, biomass density or relative biomass, chlorophyll content, and crop productivity or yield estimation.
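Many of these use cases reduce to band-ratio indices computed from the calibrated band images. As one standard example (not a pipeline from this dataset), the Normalized Difference Vegetation Index (NDVI) combines the NIR and red bands; the toy reflectance values below are made up:

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """NDVI = (NIR - red) / (NIR + red), computed element-wise."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)  # eps avoids division by zero

# Toy 2x2 reflectance patches; real values come from calibrated band images.
nir = np.array([[0.6, 0.5], [0.4, 0.3]])
red = np.array([[0.1, 0.2], [0.3, 0.1]])
print(ndvi(nir, red).round(3))
```

Dense vegetation pushes NDVI toward 1, bare soil and water toward 0 or below, which is why it serves as a quick proxy for biomass and chlorophyll content.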
An RGB camera delivers colored images of people and objects by capturing light in the red, green, and blue wavelengths, using visible light in the 400-700 nm range. The Zenmuse X5 RGB sensor mounted on the drone is programmed to capture images, which are then aligned and geo-rectified using Agisoft PhotoScan software. The software creates a dense point cloud from the raw images and generates a digital elevation model (DEM) and an orthomosaic. The orthomosaic is further segmented into plot-wise images using the QGIS tool with shape files and R software. In addition to UAV-captured data, we also collected paddy crop data using a hand-held camera.
Complete the Google form to request access to the dataset.
We will review your request and send you a confirmation email.
Once approved, you will receive a link to download the dataset.
Note: All datasets are subject to a data usage agreement. Please review the terms before downloading.
Ground vehicle datasets for autonomous navigation research
Meticulously curated multimodal dataset supporting object detection algorithms. Features four cameras, six radars, LiDAR, GPS, and IMU. Data was collected in Hyderabad, India, in 2-4 minute scenes with synchronized data streams.
An annotated camera dataset covering 3,000 km across diverse road types, including national highways, state highways, district roads, and rural and urban environments. The data was captured using Basler acA1920-40gc GigE cameras and is part of the 'DriveIndia Dataset'.
Download the EULA form, fill it out, and upload it in the request form.
Raw radar data from 1 long-range and 5 short-range radars with 120° and 150° FoVs and ranges of up to 200 m and 100 m respectively. CSV files contain 13 channels, including distance, velocity, acceleration, and standard deviations.
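A first sanity check on such CSVs is filtering detections against each sensor's specified maximum range. The sketch below assumes hypothetical channel names and rows; the released 13-channel files may use different headers:

```python
import csv
import io

# Hypothetical subset of the 13 channels; real headers may differ.
sample = """distance,velocity,acceleration
45.2,-3.1,0.2
180.5,12.4,-0.1
230.0,0.0,0.0
"""

# Maximum ranges from the sensor specifications quoted above (metres).
MAX_RANGE = {"short": 100.0, "long": 200.0}

def in_range(rows_text, sensor="long"):
    """Keep only detections within the sensor's specified maximum range."""
    rows = csv.DictReader(io.StringIO(rows_text))
    return [r for r in rows if float(r["distance"]) <= MAX_RANGE[sensor]]

print(len(in_range(sample, "long")), len(in_range(sample, "short")))  # 2 1
```

Detections beyond the rated range (like the 230 m row) are usually artifacts and worth flagging before any downstream tracking.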
Two LiDAR datasets: Object Detection (128-channel Ouster OS2-128) and Ground Dataset for ground point removal validation. Organized with synchronized camera images, calibration data, and labeled point clouds.
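To illustrate the task the Ground Dataset validates, here is a deliberately naive ground-removal baseline: a single height threshold on the point cloud. The assumed 1.5 m sensor height is hypothetical, and real pipelines (including the ground-removal work cited in the publications below) fit a ground plane rather than thresholding z:

```python
import numpy as np

def remove_ground(points, z_thresh=-1.5):
    """Naive ground removal: drop points below a height threshold.

    `points` is an (N, 3) array of x, y, z in the sensor frame; the
    threshold assumes the LiDAR sits roughly 1.5 m above the road.
    """
    points = np.asarray(points, dtype=float)
    return points[points[:, 2] > z_thresh]

# Toy cloud: two ground returns (z below -1.5) and one object return.
cloud = np.array([[5.0, 0.0, -1.6], [6.0, 1.0, -1.7], [4.0, 0.5, 0.3]])
print(remove_ground(cloud).shape)  # (1, 3)
```

The labeled point clouds in the Ground Dataset let you score such a method by comparing the removed points against the ground-truth ground labels.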
Raw GNSS data recorded and stored in CSV format with crucial navigation information for each data point.
V2X dataset collected in Hyderabad, India covering Vehicle-to-Vehicle (V2V), Vehicle-to-Infrastructure (V2I), Infrastructure-to-Vehicle (I2V), and Vehicle-to-Cloud (V2C) communication types.
UAV-based datasets for agriculture, infrastructure, and transportation
Comprehensive dataset with 7,517 high-resolution images captured across various altitudes and weather conditions. Features 2D annotations for custom marker class with 70/15/15 train/validation/test split.
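A 70/15/15 split like the one described can be reproduced with a shuffled index cut. The function and seed below are illustrative, not the dataset's official split script, so the resulting partition will not match the published one:

```python
import random

def split_indices(n, train=0.70, val=0.15, seed=0):
    """Shuffle indices 0..n-1 and cut into train/val/test partitions."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)  # fixed seed for reproducibility
    n_train = int(n * train)
    n_val = int(n * val)
    return idx[:n_train], idx[n_train:n_train + n_val], idx[n_train + n_val:]

tr, va, te = split_indices(7517)  # image count reported for this dataset
print(len(tr), len(va), len(te))  # 5261 1127 1129
```

Fixing the seed keeps the split reproducible across runs, and the test partition absorbs the rounding remainder.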
LiDAR data collected at various altitudes with pre-flight calibration for each data collection session.
Benchmark dataset for growth stage detection with 837 images and 4,928 annotated instances. Features 6 classes: Weed, Seedling, Tillering, Booting, Flowering, and Ripening stages.
Sonar sensor datasets for autonomous underwater vehicles
Side scan sonar dataset captured using SSS-600K with 10-75 meter scanning range. High-quality images from Hyderabad lakes in .xtf format, compatible with JW Fisher and SonarView software for underwater terrain analysis.
RGB, multispectral, and hyperspectral datasets for precision agriculture
RGB camera dataset of paddy and maize crops collected over three seasons (2018-2020) in collaboration with PJTSAU. Part of DSFS project for sustainable crop production under climatic change.
Multispectral camera dataset for crop head detection (maize tassel, paddy panicle) and early water stress identification in maize. Data was collected over three seasons, spanning the Kharif and Rabi seasons.
UAV-captured crop hyperspectral imaging datasets with 282 spectral channels across the 400-1000 nm range. The imaging data was collected in collaboration with ICRISAT Hyderabad for crop monitoring and phenotyping applications.
The datasets include:
1. Crop early water stress identification in Groundnut and Pearl Millet
2. Crop type classification
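When working with such a cube, a common first step is mapping a wavelength of interest to its nearest channel index. The sketch below assumes the 282 channels are evenly spaced over 400-1000 nm, which is an approximation: the sensor's actual band centres may be non-uniform.

```python
import numpy as np

# Assumption: 282 evenly spaced band centres across 400-1000 nm.
wavelengths = np.linspace(400.0, 1000.0, 282)

def band_index(target_nm):
    """Index of the channel whose centre is closest to target_nm."""
    return int(np.argmin(np.abs(wavelengths - target_nm)))

# e.g. pick the channel nearest the green reflectance peak around 550 nm
print(band_index(550))
```

With roughly 2.1 nm between band centres, narrow absorption features (such as the red-edge region used for stress detection) span many adjacent channels, which is what makes hyperspectral data richer than the 5-band multispectral imagery above.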
Hand-held RGB camera data for paddy crops collected alongside UAV data for comprehensive crop monitoring and analysis.
Nitish Kumar et al. - "TIAND: A Multimodal Dataset for Autonomy on Indian Roads", IEEE Intelligent Vehicles Symposium (IV) 2024. DOI: 10.1109/IV55156.2024.10588583
B. Anand et al. - "LiDAR-INS/GNSS-Based Real-Time Ground Removal, Segmentation, and Georeferencing Framework for Smart Transportation", IEEE Transactions on Instrumentation and Measurement, 2021. DOI: 10.1109/TIM.2021.3117661
Parvez Alam, P. Rajalakshmi - "Deep Learning based steering angle prediction with LiDAR for Autonomous vehicle", IEEE Vehicular Technology Conference 2023. DOI: 10.1109/VTC2023-Spring57618.2023.10201141
Bhaskar Anand et al. - "Evaluation of the quality of LiDAR data in the varying ambient light", IEEE Sensors Applications Symposium (SAS) 2022. DOI: 10.1109/SAS54819.2022.9881373
Bhaskar Anand, P. Rajalakshmi - "Pipeline for automation of LiDAR data annotation", IEEE Sensors Applications Symposium (SAS) 2023. DOI: 10.1109/SAS58821.2023.10254180
Bhaskar Anand, P. Rajalakshmi - "BEV Approach Based Efficient Object Detection using YoloV4 for LiDAR Point Cloud", IEEE Vehicular Technology Conference 2023. DOI: 10.1109/VTC2023-Spring57618.2023.10200314
Bhaskar Anand, P. Rajalakshmi - "Client-Server Based Implementation of LiDAR Data Streaming System on ROS platform", IEEE International Symposium on Real-Time Distributed Computing (ISORC) 2023. DOI: 10.1109/ISORC58943.2023.00034
Bhaskar Anand et al. - "Quantitative Comparison of LiDAR Point Cloud Segmentation for Autonomous Vehicles", IEEE Vehicular Technology Conference 2021. DOI: 10.1109/VTC2021-Fall52928.2021.9625507
Bhaskar Anand et al. - "Comparative Run Time Analysis of LiDAR Point Cloud Processing with GPU and CPU", IEEE International Conference on Computing, Power and Communication Technologies 2020. DOI: 10.1109/GUCON48875.2020.9231067
Bhaskar Anand et al. - "An experimental analysis of various multi-channel LiDAR systems", IEEE International Conference on Computing, Power and Communication Technologies 2020. DOI: 10.1109/GUCON48875.2020.9231195
Bhaskar Anand et al. - "Real Time LiDAR Point Cloud Compression And Transmission For Intelligent Transportation System", IEEE Vehicular Technology Conference 2019. DOI: 10.1109/VTCSpring.2019.8746417
Anjani Josyula et al. - "Fast Object Segmentation Pipeline for Point Clouds Using Robot Operating System", IEEE World Forum on Internet of Things 2019. DOI: 10.1109/WF-IoT.2019.8767255
Shantanu Yadav et al. - "Vehicle Detection and Tracking using Radar for Lane Keep Assist Systems", IEEE Vehicular Technology Conference 2023. DOI: 10.1109/VTC2023-Spring57618.2023.10199286
H. N. Srikanth et al. - "Pothole Detection for Autonomous Vehicles in Indian Scenarios using Deep Learning", IEEE International Symposium on Real-Time Distributed Computing 2023. DOI: 10.1109/ISORC58943.2023.00033
Bhaskar Anand et al. - "A Novel Real-Time LiDAR Data Streaming Framework", IEEE Sensors Journal 2022. DOI: 10.1109/JSEN.2022.3215189
Anjani Josyula et al. - "Coarse Object Tracking Technique for Point Clouds", IEEE Sensors Applications Symposium 2020. DOI: 10.1109/SAS48726.2020.9220053
Bhaskar Anand et al. - "Region of Interest and Car Detection using LiDAR data for Advanced Traffic Management System", IEEE World Forum on Internet of Things 2020. DOI: 10.1109/WFIoT48130.2020.9221354