
Another Example of GeoPandas and Python Spatial Joins πŸ—Ί

Yesterday, I posted a much more complicated piece of code that pulled addresses from the SAM (State Address Management) database and did a spatial join to add Assembly District and Municipality columns to the file. That was a bit too complex, so I made a simpler version for other purposes that doesn't require the coordinates to be obtained from SAM.

This Python script takes two parameters:

  • Path to a CSV file that contains an X and Y coordinate
  • Path to a Shapefile or GeoPackage to join against

Then the code will create a new CSV file with the spatially joined attributes pulled from the Shapefile. I have only run it on a few large data sets, but I found it took roughly 1 second to join 1,000 records from invocation to end of script.

#!/usr/bin/python

import sys, os, csv

import pandas as pd
import geopandas as gpd

lines = []

# read list of addresses from parameter 1
with open(sys.argv[-2], newline='') as csvfile:
    for line in csv.DictReader(csvfile):
        lines.append(line)

# convert to pandas
locPd = pd.DataFrame(lines, columns=lines[0].keys())
locPd = locPd.convert_dtypes()

# build point geometry from the x/y columns
# (float64 preserves coordinate precision)
locPd = gpd.GeoDataFrame(locPd, geometry=gpd.points_from_xy(
    locPd.x.astype('float64'), locPd.y.astype('float64')))

# run spatial join against parameter 2, assuming the points
# share the coordinate reference system of the joined layer
ad = gpd.read_file(sys.argv[-1])
locPd = locPd.set_crs(ad.crs)
locPd = gpd.sjoin(locPd, ad, predicate="within")

# remove added geometry and index columns
del locPd['geometry']
del locPd['index_right']

# write pandas back out to CSV
locPd.to_csv(os.path.splitext(sys.argv[-2])[0] + '-output.csv', index=False, header=True)
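For example, if the script were saved as sjoin_csv.py (a hypothetical name, as are the input files), the invocation would look like:

python3 sjoin_csv.py points.csv assembly-districts.gpkg

This would write points-output.csv alongside the input CSV, with the shapefile or GeoPackage attributes appended to each row.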

Final USGS DRG Topographic Map Update

Back when I was in Boy Scouts, we often used 1:24k quadrangle topographic maps from the USGS. These maps from the pre-computer age were manually drafted; they are no longer updated in favor of GIS data sets and the computer-generated National Map product. Some of the trails and landmarks not in the GNIS dataset never made it over to the new maps, especially on state land. And the National Map topographic maps are hardly the works of art that many of the old topographic maps were.

You can download both the historical and modern maps for free here: http://prd-tnm.s3.amazonaws.com/index.html?prefix=StagedProducts/

If you are using a projected coordinate system, finding a location from a bearing and a distance is easy πŸ—Ί

newX = startX + distance * sin(bearing/180*pi)

newY = startY + distance * cos(bearing/180*pi)

Distance is in map units, so if you are working in UTM then that’s meters.

Bearing is in compass degrees, due north is 0, due east is 90, etc.
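The two formulas above can be sketched as a small Python helper (the function name and sample coordinates are illustrative, not from the original post):

```python
from math import sin, cos, radians

def project_point(start_x, start_y, bearing_deg, distance):
    """Project a new coordinate from a start point, a compass
    bearing in degrees (0 = north, 90 = east), and a distance
    in map units."""
    new_x = start_x + distance * sin(radians(bearing_deg))
    new_y = start_y + distance * cos(radians(bearing_deg))
    return new_x, new_y

# 1,000 meters due east of a UTM coordinate moves only the easting
x, y = project_point(500000.0, 4500000.0, 90.0, 1000.0)
```

Because the bearing is measured clockwise from north rather than counterclockwise from east, sine goes with X and cosine with Y, the reverse of the usual math convention.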

Creating Digital Surface Models GeoTIFF Using National Map Downloader, LiDAR Point Clouds and PDAL πŸ—Ί

Find LiDAR Point Clouds on the National Map Downloader: https://apps.nationalmap.gov/downloader/#/

Select Find Elevation Source Data (3DEP) – Lidar, IfSAR

Search for LiDAR Point Cloud (LPC)

Click Export all results as a TXT file and save to a directory.

Then run this Unix command on the text file to download point clouds:

cat data.txt | tr -d '\n' | xargs -d '\r' -I {} wget -c -nc '{}'

Next create a pipeline.txt for PDAL with the desired classification filter. For a DSM, class 1 covers unclassified points like buildings and treetops, while class 2 is ground points; if you want a DEM, you can make one the same way with a Classification range of 2:2:

{ 
    "pipeline": [
        { 
            "type": "readers.las"
        },
        {
            "type": "filters.range",
            "limits": "Classification[1:1]"
        },
        {
            "gdaldriver":"GTiff",
            "resolution": 1.0,
            "output_type": "max",
            "type":"writers.gdal"
        }
    ]
}

Next, convert the point clouds into digital surface models (GeoTIFFs). You can use this shell command with xargs to run the above pipeline over each LAZ file:

ls *.laz | xargs -I {} basename {} .laz | xargs -P3 -I {} pdal pipeline pipeline.txt --readers.las.filename={}.laz --writers.gdal.filename={}.tif

The above command can be somewhat slow depending on how many LAZ point clouds you need to go through and your selected resolution. -P3 sets the number of parallel processes (3), which can help speed things up a bit.

Now we have digital surface model rasters that can be used in QGIS for hillshades or 3D visualization.

Build a virtual raster (dsm.vrt) for easy loading into QGIS, rather than loading separate files:

gdalbuildvrt dsm.vrt *.tif
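If you want the hillshade as a raster of its own rather than a QGIS rendering style, one possible approach is GDAL's gdaldem tool run against the virtual raster (the output filename here is just an example):

gdaldem hillshade dsm.vrt hillshade.tif -compute_edges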

Digital Terrain Models

Lately I have been experimenting with digital terrain models extracted from the LiDAR point cloud. They allow me to measure the height of buildings and tree cover and can be used for better 3D modeling.
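Following the pattern of the DSM pipeline above, a terrain-model pipeline can be sketched by restricting the classification filter to ground points (2:2); the output_type of min is a judgment call, as min or mean are both reasonable for a ground surface:

```json
{
    "pipeline": [
        {
            "type": "readers.las"
        },
        {
            "type": "filters.range",
            "limits": "Classification[2:2]"
        },
        {
            "gdaldriver": "GTiff",
            "resolution": 1.0,
            "output_type": "min",
            "type": "writers.gdal"
        }
    ]
}
```

Subtracting a DTM raster from the matching DSM raster gives the height of buildings and canopy above ground.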
