Interpolating a Bathymetry Point Dataset Using Python

Bathymetry point data refers to a dataset containing discrete data points that represent depth measurements at specific locations in a water body, such as oceans, seas, lakes, or rivers. These points are collected during bathymetric surveys using various instruments, such as echo sounders or sonar systems. Each data point typically consists of three key components:

  1. Latitude and Longitude: The geographic coordinates (latitude and longitude) specify the location of the data point on the Earth’s surface. These coordinates allow bathymetric point data to be accurately positioned on a map or in a geographic information system (GIS).
  2. Depth (Z-Coordinate): The depth value is the measurement of the water depth at the given latitude and longitude. It indicates the distance from the water surface to the seafloor or lake bottom at that specific location. Depth values can be positive or negative, depending on whether the measured point lies above or below a reference datum.
  3. Optional Attributes: In addition to latitude, longitude, and depth, bathymetry point data may include other attributes, such as timestamps, uncertainties, or quality indicators. These attributes provide additional information about the data points and the accuracy of the depth measurements.

The dataset used for today’s tutorial is a bathymetry point dataset covering the coastal areas of Sydney. It is stored in shapefile format and contains longitude, latitude, and depth measurements at specific locations in the water body. We will interpolate this dataset.

To interpolate the bathymetry dataset and obtain a continuous representation of the underwater topography, we will use the Verde library from the Fatiando a Terra project. Verde is a powerful and versatile Python library designed for processing spatial data, such as bathymetry and geophysical surveys, and for performing interpolation tasks.

The interpolation process uses several methods provided by Verde: K-nearest neighbors (KNN), cubic, and linear interpolation. These methods estimate depth values at locations between the measured data points, producing a smoothly varying bathymetric grid.

Below is a step-by-step tutorial on how to interpolate an irregularly spaced bathymetry point dataset using Python and the Verde library. We will use the K-nearest neighbors, cubic, and linear interpolation methods and then export the interpolated data to GeoTIFF files.

To get full source code with example dataset for this tutorial, click here.

Step 1: Install Required Libraries
Make sure you have the required libraries installed.
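The tutorial relies on Verde for gridding and Geopandas for reading the shapefile, and the GeoTIFF export later on uses Rasterio and NumPy. A typical install, assuming pip:

```shell
pip install verde geopandas rasterio numpy
```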

Step 2: Import Required Libraries
Create a Python script or Jupyter Notebook and import the necessary libraries:

Step 3: Read the Bathymetry Point Data
Read the bathymetry data from a shapefile (NSWOEH.shp) using Geopandas. The shapefile should contain columns ‘X’, ‘Y’, and ‘Z’ representing the longitude, latitude, and bathymetry values, respectively:

Step 4: Define the Interpolation Function
Define a function that performs the interpolation. This function will take the spacing, data frame, interpolation method, and region as input and return the interpolated grid.
The spacing parameter acts like a resolution control: a smaller spacing produces a higher-resolution grid, and vice versa. The region is the bounding extent of the point dataset, given as (west, east, south, north) coordinates.

Step 5: Define the Export Function
Define a function to export the interpolated data to a GeoTIFF file:

Step 6: Perform Interpolation and Export to GeoTIFF
Perform the interpolation for each method (knn, cubic, and linear), and export the results to GeoTIFF files:

Complete Code:

I hope this tutorial creates a good foundation for you. If you would like tutorials on other GIS topics or have any queries, please send an email at

We also offer freelancing services. Please email us at for any queries.
