netconvert: Improve --heightmap.geotiff #17473
Conversation
…ad of integers for more precision
…coordinates for lookup

This adds support for geotiff input data in UTM projection, among others, and avoids the additional step of reprojecting the geotiff data to WGS84 (with `gdalwarp`, for example), which introduces a resampling step and with it some inaccuracy.
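The core idea can be sketched roughly as follows. This is illustrative Python, not SUMO's actual C++ code, and the tile origin and resolution are made-up values: instead of resampling the raster to WGS84, the query coordinates are expressed in the geotiff's own CRS and mapped onto pixel indices via the standard GDAL affine geotransform.

```python
# Hypothetical sketch: invert a north-up GDAL geotransform
# (origin_x, pixel_w, 0, origin_y, 0, pixel_h) to find the raster
# cell containing a point given in the raster's own CRS.

def world_to_pixel(geotransform, x, y):
    """Map projected world coordinates to (row, col) pixel indices."""
    origin_x, pixel_w, _, origin_y, _, pixel_h = geotransform
    col = int((x - origin_x) / pixel_w)
    row = int((y - origin_y) / pixel_h)  # pixel_h is negative for north-up rasters
    return row, col

# Example: a 1 m resolution tile with its origin at UTM32 (500000, 5400000)
gt = (500000.0, 1.0, 0.0, 5400000.0, 0.0, -1.0)
print(world_to_pixel(gt, 500010.5, 5399990.5))  # -> (9, 10)
```

Because only the query point is transformed, the raster values themselves are read untouched, which is what avoids the resampling inaccuracy.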
Thanks! Can you check whether your ECA is in order? (https://sumo.dlr.de/docs/FAQ.html#how_do_code_contributions_work)
There was an issue while filling out the ECA; I've contacted the licensing team to resolve it. Will post here once it's done.
Force-pushed from 0e2ca7b to 04f2ef2
It seems I didn't test this well enough: there is an issue with my bilinear interpolation change that produces incorrect heights. I've removed that commit from this PR and will work on it some more, independently. The rest still works fine.
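For context, the approach the dropped commit attempted can be sketched like this. A minimal illustration of bilinear interpolation on one raster cell; the function and names here are illustrative, not taken from SUMO's code:

```python
# Bilinear interpolation within a single raster cell: z00..z11 are the
# heights at the four cell corners, fx/fy the fractional position of the
# query point within the cell, each in [0, 1].

def bilinear(z00, z10, z01, z11, fx, fy):
    top = z00 * (1 - fx) + z10 * fx     # interpolate along x at the top edge
    bottom = z01 * (1 - fx) + z11 * fx  # interpolate along x at the bottom edge
    return top * (1 - fy) + bottom * fy  # then blend the two along y

print(bilinear(100.0, 110.0, 120.0, 130.0, 0.5, 0.5))  # -> 115.0
```

A typical source of "incorrect heights" with this scheme is mixing up the corner ordering or the sign of the fractional offsets, which is easy to do when the raster's y axis points downward.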
Thanks! @opatut If you have an example we can add to our tests, it would be more than welcome.
I've been using heightmap files from the German state of Baden-Württemberg's state office for geoinformation, which are licensed rather openly, so you might be able to integrate them: https://opengeodata.lgl-bw.de/#/(sidenav:product/dgm1) They come in EPSG:25832 (UTM zone 32N) by default, so projection is necessary. Combine this with an OpenStreetMap excerpt, and you've got your test dataset. This is from my Makefile:

```shell
netconvert \
    --verbose \
    --sidewalks.guess \
    --crossings.guess \
    --heightmap.geotiff $(HEIGHTMAP) \
    --offset.x -$(ORIGIN_X) \
    --offset.y -$(ORIGIN_Y) \
    --offset.z -$(ORIGIN_Z) \
    --proj.utm=true \
    --geometry.max-segment-length=4 \
    --ptstop-output=$(OUTPUT)/osm_stops.add.xml \
    --ptline-clean-up \
    --ptline-output=$(OUTPUT)/osm_ptlines.xml \
    --railway.topology.repair \
    --output-file $(NET_FILE) \
    --osm-files $(OSM_FILE) \
    --keep-edges.in-geo-boundary $(BOUNDS)
```
This pull request adds a few improvements for people who want to use the heightmap feature to generate 3D coordinates:

- apply `--offset.z` to the value read from the heightmap input (otherwise it is unused, so this way you can offset the input data properly)
- support heightmap input in projections other than WGS84, avoiding a separate reprojection step beforehand (gdal raster warp)

I've also thrown in a change that uses bilinear interpolation instead of the previous triangle logic for interpolating the raster data. It seems to produce the same results (rounded to centimeters) on my data, is probably faster to execute, and is simpler to understand :) You can skip that commit if you like; it is refactoring only and adds no user benefit, but I prefer the cleaner logic and simpler math.
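As a toy illustration of the `--offset.z` behaviour described above (all numbers here are made up; this is not SUMO code, just the arithmetic the option implies):

```python
# With the change, --offset.z is applied to every height read from the
# heightmap, e.g. to shift absolute elevations down to a local z origin.
offset_z = -350.0                      # hypothetical --offset.z value
raw_heights = [352.4, 353.1, 351.9]    # made-up values read from the geotiff
adjusted = [round(h + offset_z, 2) for h in raw_heights]
print(adjusted)  # -> [2.4, 3.1, 1.9]
```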