Note beforehand: the information here is a technical programmer's view of what needs to be done :-) It might not be totally in sync with the project's terminology and plans.
Within the EU TEMA project, Nelen & Schuurmans is mainly involved with our 3Di hydrodynamic computation software, known as PDM-tech-02
in the project. Our direct aim is to improve 3Di for disaster management.
Disaster management also means collaboration, of course. For that purpose, TEMA has provided infrastructure and technology suggestions as a framework for collaboration. The implementation plan in this document shows how we want to use the infrastructure to be able to show a nice realistic demo in cooperation with partners.
I hope this document also helps other partners to get started. For that purpose, it is my intention to keep most of the code open source (and thus visible), though there's now a https://github.com/HE-TEMA github team, so some of the repositories will be visible there.
- A context broker implemented in NGSI-LD. Basically a semantic web data store. I've written a summary on my blog of a TEMA workshop on that topic that should give a pretty good introduction. The context broker is the central information exchange mechanism for the whole project. If we need weather forecasts, we'll ask the CB to send us new ones added by other project partners. If we have flood calculation results, we'll put their info into the CB for others to query.
- A minio 's3' object storage for storing (large) data.
- K3S, a simple but powerful kubernetes cluster. According to what I gathered, you can connect your own kubernetes nodes but you can also use the provided infrastructure. You can use it to run servers or scripts, provided you "dockerize" them and know how to deploy those dockers to the cluster.
The K3S cluster is (rightfully so) hidden away behind a firewall, so you need VPN access. The context broker ("CB") and minio are publicly accessible (I'm not mentioning the URL here as you have write access as an anonymous user to the CB :-) ).
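As a concrete illustration, here is a minimal Python sketch of how a partner might upload a (large) file to the minio storage using the standard `minio` client library; the endpoint, credentials, bucket and filename are placeholders, not the real project values.

```python
# Minimal sketch: upload a geotiff to the project's minio s3 storage.
# Endpoint, credentials, bucket and filename are placeholders.
from minio import Minio

client = Minio(
    "minio.example.org",        # placeholder endpoint
    access_key="YOUR_ACCESS_KEY",
    secret_key="YOUR_SECRET_KEY",
    secure=True,
)

bucket = "naples"
if not client.bucket_exists(bucket):
    client.make_bucket(bucket)

# Other partners can later fetch the file via the bucket + filename
# that we store in the context broker.
client.fput_object(bucket, "elevation-map.tif", "/tmp/elevation-map.tif")
```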
Originally, I had a more elaborate setup in mind, based on the need to shuffle data to and from our regular data center. It turns out that everything can be much simpler, luckily.
See the previous version.
The sketch below shows the suggested setup, followed by the textual explanation of the various parts. Basically we use the context broker in combination with two small and simple dockers inside the K3S cluster + the minio s3 storage.
In the context broker, we'll need to store information: information that needs terms and definitions because of its "linked data" nature. Initially, I thought a properly defined custom "JSON-LD" vocabulary was necessary, but it turns out the context broker just accepts entities with a new type, so at least initially we don't need one.
The entity types that I expect to need or to write myself (just as a starting point for further discussion):
FloodRiskEvent
id = some:urn:1234
elevation_map_id = some:urn:1234
rainfall = .... (in mm/h)
rainfall_duration = ... (in h)
(and probably some timestamp, which should be built in)
ElevationMap
id = some:urn:5678
title = "descriptive title"
# bucket and filename are in our minio.
bucket = "naples"
filename = ....
Maps4Flood (from "tech05")
id = some:urn:5678
title = "descriptive title"
# bucket and filename are in our minio.
bucket = "naples"
filename = ....
FloodCalculationResult
id = some:urn:8901
title = (the 3Di simulation name)
flood_risk_event_id = some:urn:1234
threedi_simulation = simulation ID, needed for contacting the 3Di api
threedi_schematisation_name = needed for contacting the 3Di api
maximum_waterdepth_bucket = "naples"
maximum_waterdepth_filename = filename in minio
- FloodRiskEvent: I've written a small form to fire off this event, but that might be another partner's job?
- ElevationMap: DLR probably has to upload this, I'll upload an initial one myself.
- We'll react to the flood risk event by starting a 3Di simulation and producing a FloodCalculationResult with some uploaded data in minio and parameters for getting the complete data from the simulation out of the 3Di API. Other partners can then use this for their calculations or visualisations.
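To make the exchange a bit more concrete, below is a minimal Python sketch of how a FloodRiskEvent could be created in the context broker via the standard NGSI-LD entities endpoint. The broker URL and the urn values are placeholders; the attribute names simply follow the sketches above, and without a custom @context the broker falls back to the NGSI-LD core context.

```python
# Minimal sketch: create a FloodRiskEvent entity in the context broker.
# The broker URL and urns are placeholders, not the real project values.
import requests

CB_URL = "https://context-broker.example.org"  # placeholder, not the real URL

entity = {
    "id": "urn:ngsi-ld:FloodRiskEvent:1234",
    "type": "FloodRiskEvent",
    "elevation_map_id": {
        "type": "Property",
        "value": "urn:ngsi-ld:ElevationMap:5678",
    },
    "rainfall": {"type": "Property", "value": 70},          # mm/h
    "rainfall_duration": {"type": "Property", "value": 2},  # h
}

response = requests.post(
    f"{CB_URL}/ngsi-ld/v1/entities",
    json=entity,
    headers={"Content-Type": "application/json"},
)
response.raise_for_status()
```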
See https://github.com/HE-TEMA/flood-calculation-site
We'll make a docker that contains a small Flask (Python) webserver that serves as the "target URL" for the context broker. It is based on the small example app available within the project; a minimal sketch of the notification endpoint is shown after the list below.
- CB subscription target for FloodRiskEvent and ElevationMap.
- Small form to create FloodRiskEvent (DONE).
- Some handy debug pages to browse the available entity types and entities in the CB (DONE).
- Code to trigger the "task" docker below based on the two CB subscriptions.
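Here is what such a notification endpoint could roughly look like. The `trigger_task()` helper is hypothetical and stands in for the code that kicks off the "task" docker; the route name is just an example.

```python
# Minimal sketch of the Flask "target URL" that receives context broker
# notifications for the FloodRiskEvent and ElevationMap subscriptions.
from flask import Flask, request

app = Flask(__name__)


def trigger_task(entity):
    """Hypothetical helper: kick off the matching task in the "task" docker."""
    print("Would trigger a task for", entity["id"])


@app.route("/notify", methods=["POST"])
def notify():
    # An NGSI-LD notification carries the matched entities in its "data" list.
    notification = request.get_json()
    for entity in notification.get("data", []):
        if entity.get("type") in ("FloodRiskEvent", "ElevationMap"):
            trigger_task(entity)
    return "", 204


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```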
Internally we use "Prefect", which looks a bit like Airflow. In the end, they are just simple Python scripts (a small sketch of such a task follows the list below).
- One task uploads a simple elevation map geotiff + adds the corresponding ElevationMap entity to the context broker.
- The other task reacts to an ElevationMap, which means the FloodRiskEvent can now be converted into a 3Di scenario. The scenario is uploaded to our regular 3Di data center in Amsterdam. Upon completion, we store the results in minio and in the context broker.
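As an illustration of that second task, here is a rough Prefect sketch. The task bodies are placeholders for illustration only, not the real 3Di, minio or context broker calls.

```python
# Rough sketch of the task that reacts to an ElevationMap: run a 3Di
# simulation for the FloodRiskEvent and register the result.
from prefect import flow, task


@task
def run_simulation(flood_risk_event: dict, elevation_map: dict) -> dict:
    # Placeholder: build a 3Di scenario from the event + elevation map,
    # start the simulation via the 3Di API and wait for it to finish.
    return {"simulation_id": "12345", "waterdepth_file": "/tmp/max-waterdepth.tif"}


@task
def store_result(result: dict) -> None:
    # Placeholder: upload the maximum waterdepth geotiff to minio and create
    # a FloodCalculationResult entity in the context broker.
    ...


@flow
def handle_elevation_map(flood_risk_event: dict, elevation_map: dict):
    result = run_simulation(flood_risk_event, elevation_map)
    store_result(result)
```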
I'm throwing in a possible case study just to make the data flows a bit more concrete.