BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//pretalx//pretalx.com//spathum24//speaker//L7ZM9P
BEGIN:VTIMEZONE
TZID:CET
BEGIN:STANDARD
DTSTART:20001029T040000
RRULE:FREQ=YEARLY;BYDAY=-1SU;BYMONTH=10
TZNAME:CET
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
END:STANDARD
BEGIN:DAYLIGHT
DTSTART:20000326T030000
RRULE:FREQ=YEARLY;BYDAY=-1SU;BYMONTH=3
TZNAME:CEST
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
END:DAYLIGHT
END:VTIMEZONE
BEGIN:VEVENT
UID:pretalx-spathum24-3D3J9H@pretalx.com
DTSTART;TZID=CET:20240925T170000
DTEND;TZID=CET:20240925T173000
DESCRIPTION:The proposed paper deals with the challenges encountered 
 and workflows needed for reconstructing and editing historical 
 geodata. It describes the results of an effort to reconstruct 
 territorial changes in Hessen since the first half of the 19th 
 century. The paper focuses on the implementation of an existing 
 conceptual data modeling framework using a custom plugin for QGIS 
 (https://qgis.org/). This plugin aims at facilitating the error-prone 
 process of editing historical vector data and is published under an 
 open-source license. Due to its generic design\, it can easily be 
 reused by other projects. \n\nIntroduction \n\nOver the past years\, 
 researchers interested in the domain of historical cartography have 
 been blessed with an ever-growing number of digitized maps available 
 on the internet\, provided by private and public institutions alike. 
 Some of these have been georeferenced and are hence available in 
 desktop and web-based Geographic Information Systems (GIS) for 
 comparison with historical and modern geodata. However\, these 
 digital maps are still mere images\, grids of raster cells with 
 associated numerical values. They still need to be consulted by means 
 of critical scholarly research to derive vector data. This kind of 
 geospatial data can be used for visualization and geospatial 
 analysis. The features extracted from a map may range from 
 topographical features and human settlement footprints to logistical 
 infrastructure such as canals\, roads and railways. In the course of 
 historical geographical data modeling\, special emphasis has been put 
 on the reconstruction of historical borders. While the late 1990s and 
 early 2000s mark the heyday of national Historical GIS projects in 
 Europe\, few advances have been made in this domain since\, with the 
 exception of a recent project aiming at reconstructing the 
 administrative boundaries of modern France. As a result\, scholars 
 who depend on such data for their research are confronted with 
 historical vector data of highly varying quality and quantity. 
 \n\nData Modeling \n\nWith the advent of GIS in the 1980s and 1990s\, 
 several conceptual frameworks have been developed to cope with the 
 central challenge in creating historical vector data: how to model 
 change in space and time in a manner that is manageable both by 
 existing software solutions and by researchers\, while limiting the 
 amount of duplicated data. The most prevalent concepts have been the 
 snapshot\, the time-variant and\, as a variation of the latter\, the 
 Least Common Geometry (LCG) approach. \n\nThe snapshot model aims at 
 reconstructing one or more points in time. Geometries are copied\, so 
 spatial entities whose borders have not changed are nonetheless 
 included multiple times within the same data set. While this method 
 is easily applied and allows for economical and fast initial 
 results\, it comes with significant costs in the long run: databases 
 created this way tend to be virtually non-editable\, as border 
 changes discovered later have to be added to several or all existing 
 snapshots. Despite its obvious drawbacks\, this approach is still 
 applied in recent projects. \n\nA more complex approach is to encode 
 validity by setting start and end points on the geometric features. 
 This concept can be extended by reconstructing the smallest entities 
 of territorial boundaries\, called Least Common Geometries (LCG)\, 
 typically boroughs or parishes. Using those features as puzzle 
 pieces\, one can generate larger administrative units in an automated 
 fashion by means of GIS-based union operations. Contrary to the 
 snapshot model\, border changes between features are recorded only 
 once and can be edited as soon as new evidence comes to light. This 
 is especially important where there is no single written source 
 registering all the territorial changes within a region. Indeed\, our 
 project has been significantly occupied with identifying the points 
 in time at which specific border changes occurred and thus needs to 
 be flexible enough to incorporate new findings. While this approach 
 greatly improves the maintainability of the data product\, new 
 challenges arise with regard to data management. Researchers need to 
 ensure that a) features do not overlap one another at any given point 
 in time (spatial topology) and b) a continuous succession of features 
 exists for an area (temporal topology). While a limited number of 
 edits can be managed with standard GIS software\, a growing number of 
 border changes rapidly increases the time spent on quality 
 assurance.\n\nQGIS Time Editor\n\nTo facilitate the process of 
 editing time-variant features\, we developed the Time Editor plugin 
 (source code: https://github.com/hil-mr/time-editor\, documentation: 
 https://wms.hlgl.uni-marburg.de/docs/time-editor/) for the well-known 
 and established open-source GIS QGIS. The plugin provides several 
 checks that address the challenges associated with the practical 
 application of the conceptual framework\, the most important being 
 the Temporal Integrity and Spatial Integrity checks. The Temporal 
 Integrity check ensures that the features associated with an 
 administrative unit do not overlap temporally. As historical 
 administrative units might dissolve and be reestablished\, users can 
 define exceptions for all existing integrity checks. The Spatial 
 Integrity check ensures that for any point in time there are no 
 intersections between adjacent features. All checks can be limited by 
 filter expressions and/or prior feature selections. The plugin was 
 designed to be as generic as possible and has been extensively tested 
 in different project contexts. In addition to integrity checks\, the 
 plugin provides functions to facilitate the creation of new 
 features.\n\nSummary\n\nWith the methods described in this paper\, we 
 aim at facilitating the editing of historical vector datasets. We 
 hope that the workflows and software solutions developed are 
 beneficial to other projects in this domain. In addition\, special 
 emphasis is laid on openness\, be it in the software development 
 process or in the licensing of the resulting data products.
DTSTAMP:20260315T170741Z
LOCATION:MG1/02.05
SUMMARY:Reconstructing and editing historical geodata. An open-source imple
 mentation of a conceptual framework - Niklas Alt
URL:https://pretalx.com/spathum24/talk/3D3J9H/
END:VEVENT
END:VCALENDAR
