This Python package is a collaborative effort by the Gravity Metrology group at the German Federal Agency for Cartography and Geodesy (BKG) and the Hydrology section at the GFZ Helmholtz Centre for Geosciences. It comprises functionality built around the relatively new instrument type of the Quantum Gravimeter (here, the AQG). A new (standardized) instrument data format, together with new measurement and processing concepts, led first to a collection of scripts and now to a complete Python package for a fully featured analysis of AQG data. This encompasses live monitoring while the instrument is measuring (with more functionality than provided by the manufacturer), data processing, visualization, and data archiving, supporting reproducible data handling in line with the FAIR principles. Many of these functionalities and concepts also apply to other gravimeter types. It is therefore planned to add access to and processing of data from other devices as well (starting in the near future with CG-6 relative gravimeters). This package is actively maintained and developed. If you are interested in contributing, please do not hesitate to contact us. Instructions for installation and usage can be found in the documentation or the git repository, linked in the left panel. gravitools is listed on the Python Package Index (PyPI). Some highlight features available in the first official stable release are:
• Read and process raw data of the Exail Absolute Quantum Gravimeter (AQG)
• Apply standardized or customized AQG data processing and outlier detection
• Read and write processed datasets with metadata to .nc files in NetCDF4 format
• Handle Earth orientation parameters (EOP) from iers.org for polar motion correction
• Visualize data with matplotlib
• CLI for standard processing of AQG raw data to a .nc file
• Dashboard for real-time processing and visualization during measurements (on the AQG laptop)
• Dashboard includes a proposed standard template for a measurement protocol
• Standardized, easy-to-read and easy-to-modify config files for processing options and reproducible data handling
• Generation of PDF reports from individual measurements
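Since the package writes processed datasets to NetCDF4 files with metadata, such a file can be inspected with standard tools. Below is a minimal, hypothetical sketch using xarray; the filename and the exact contents are assumptions, not part of gravitools' documented output.

```python
# Minimal sketch: inspecting a processed AQG dataset written by gravitools.
# The filename below is a hypothetical example.
import xarray as xr

ds = xr.open_dataset("aqg_measurement_processed.nc")  # NetCDF4 file

# Global attributes hold the measurement metadata stored during processing
print(ds.attrs)

# List the available data variables (e.g. gravity values and corrections)
print(list(ds.data_vars))

ds.close()
```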
This data publication contains airborne wind and eddy covariance data files that were recorded with the ASK-16, a motorized glider owned by the FU Berlin, Germany. These data files include a large range of meteorological variables (wind speed, direction, temperature, humidity, etc.) and positioning information; information on atmospheric chemistry (mainly methane, carbon dioxide and water vapor concentrations) as well as turbulent matter (CH4 and CO2) and energy fluxes (latent heat flux) is also available. Measurements were recorded between 2017 and 2022 to: (1) obtain three-dimensional wind vectors within the atmospheric boundary layer, (2) calibrate wind measurements, and (3) record turbulent energy and matter fluxes. Many of these data files have been used in the publication “The ASK-16 Motorized Glider: An Airborne Eddy Covariance Platform to measure Turbulence, Energy and Matter Fluxes (to be published in atmospheric measurement techniques)” by Wiekenkamp et al., 2024a. That publication also provides many additional details on the measurement system, data handling and processing.
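For readers unfamiliar with the eddy covariance method mentioned above: the turbulent flux of a scalar is estimated as the covariance between fluctuations of the vertical wind and the scalar. A minimal, self-contained numpy illustration with synthetic data (not this dataset's format):

```python
# Minimal illustration of the eddy covariance principle with synthetic data.
import numpy as np

rng = np.random.default_rng(42)
n = 20_000                                     # samples in one averaging interval
w = rng.normal(0.0, 0.5, n)                    # vertical wind fluctuations [m/s]
c = 400.0 + 0.8 * w + rng.normal(0.0, 1.0, n)  # scalar correlated with w

# Reynolds decomposition: subtract interval means, then average the product
# of the fluctuations.
w_prime = w - w.mean()
c_prime = c - c.mean()
flux = np.mean(w_prime * c_prime)              # turbulent flux ~ <w'c'>

print(f"Estimated turbulent flux: {flux:.3f}")
```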
Over the last years, a whole series of codes has been developed to process airborne wind data. Initially, the PyWingpod package was mainly built to handle data from the wingpod of the ASK-16 motorized glider of the FU Berlin. However, due to the modular structure of the package, functions within the different libraries can also be used to process data from other airborne platforms. Functions and scripts within PyWingpod have been developed to:
a. load and process airborne five-hole probe and meteorological data, including (1) five-hole probe pressure sensor data (static pressure, dynamic pressure, and the differential alpha and beta pressures), (2) INS-GNSS data, (3) temperature and humidity data, and (4) any auxiliary data that you want to add to the time series/data frame;
b. calibrate pressure sensor data from the five-hole probe (mainly to correct for effects of aircraft movement);
c. calculate a reliable wind vector from the data specified in a. and the calibration parameters obtained in step b. (see the sketch below).
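As an illustration of step c., the wind vector is typically obtained by subtracting the aircraft's air-relative velocity (derived from the probe pressures and calibrated flow angles) from its ground velocity (from INS-GNSS). The following is a simplified, hypothetical sketch: the coefficient names and the linear angle calibration are assumptions for illustration, not PyWingpod's actual API.

```python
# Simplified sketch of the wind-vector calculation from five-hole probe data.
# The calibration coefficients (k_alpha, k_beta) and the linear angle model
# are illustrative assumptions; PyWingpod's implementation may differ.
import numpy as np

def wind_vector(q, dp_alpha, dp_beta, tas, v_ground, k_alpha=0.07, k_beta=0.07):
    """Estimate the wind as ground velocity minus air-relative velocity.
    The body frame is assumed aligned with the earth frame for brevity;
    a full treatment applies the INS attitude rotation."""
    alpha = dp_alpha / (k_alpha * q)      # angle of attack [rad], linear model
    beta = dp_beta / (k_beta * q)         # sideslip angle [rad], linear model

    # Air-relative velocity components from true airspeed and flow angles
    v_air = tas * np.array([np.cos(alpha) * np.cos(beta),
                            np.sin(beta),
                            np.sin(alpha) * np.cos(beta)])

    return v_ground - v_air               # wind = ground motion - air motion

# Example with made-up numbers: q [Pa], dp [Pa], TAS [m/s], ground vel [m/s]
print(wind_vector(q=1200.0, dp_alpha=8.0, dp_beta=-3.0,
                  tas=45.0, v_ground=np.array([48.0, 2.0, 0.5])))
```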
This dataset contains measurements of viscous and viscoelastic materials that are used for analogue modelling. Proper density and viscosity scaling of ductile layers in the crust and lithosphere requires materials like polydimethylsiloxane (PDMS) to be mixed with fillers and low-viscosity silicone oils. By changing the filler content and filler material, the density, viscosity and power-law coefficient can be tuned according to the requirements. All materials contain a large amount of PDMS, and all but one a small amount of an additional silicone oil. Adding plasticine or barium sulfate leads to shear-thinning rheologies with power-law exponents of p < 0.95, while adding corundum powder has only a minor effect on the power-law exponent. Some mixtures also have an apparent yield point, but all are in the liquid state in the tested range. In general, the rheologies of the materials are very complex and in some cases strongly temperature dependent. However, in the narrow range of relevant strain rates, the behaviour is well described by a power-law relation, and the materials are thus found suitable for simulating ductile layers in crust and lithosphere.
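To make the role of the power-law exponent concrete: for a power-law fluid the shear stress scales as τ = K·γ̇^p, so the apparent viscosity η = τ/γ̇ = K·γ̇^(p−1) decreases with strain rate whenever p < 1 (shear thinning). A short numeric illustration with a made-up consistency K and the exponent regime quoted above:

```python
# Apparent viscosity of a power-law fluid: eta = K * gamma_dot**(p - 1).
# The consistency K and the strain rates are illustrative values only.
import numpy as np

K = 3.0e4                                  # consistency [Pa*s^p], made up
gamma_dot = np.array([1e-5, 1e-4, 1e-3])   # strain rates [1/s]

for p in (1.0, 0.95, 0.8):                 # Newtonian vs. shear thinning
    eta = K * gamma_dot ** (p - 1.0)       # apparent viscosity [Pa*s]
    print(f"p = {p}: eta = {eta} Pa s")
```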
This data set provides a series of experiments from ring-shear tests (RST) on various materials that are used at several laboratories worldwide. The data contain the results of slide-hold-slide tests and the processed outputs of standardized ring-shear tester data from related publications. Additionally, microscopy images of the materials under plain and polarized light are provided. The time-dependent restrengthening (healing) of the materials is quantified using slide-hold-slide tests. This restrengthening has implications for the reactivation potential of granular shear zones in analogue models. With the provided software we first analyze the experimental data and then compare the angles and stresses needed to reactivate normal faults in the materials. We find that, although healing rates are low, in the majority of samples normal faults generated through extension of an analogue model cannot be reactivated.
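In slide-hold-slide tests, the healing rate is commonly quantified as the increase in peak friction after a hold, per decade of hold time, Δμ = b·log10(t_hold) + a. A minimal sketch of this standard quantification (the numbers are illustrative, not taken from this dataset):

```python
# Healing rate from slide-hold-slide tests: fit the peak-friction increase
# after each hold against log10(hold time). Values are illustrative only.
import numpy as np

t_hold = np.array([10.0, 100.0, 1000.0, 10000.0])  # hold durations [s]
delta_mu = np.array([0.004, 0.009, 0.013, 0.018])  # peak minus steady-state friction

# Linear fit in semilog space: delta_mu = b * log10(t_hold) + a
b, a = np.polyfit(np.log10(t_hold), delta_mu, 1)
print(f"healing rate b = {b:.4f} per decade of hold time")
```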
This publication contains software that can be used to pre-process data from the Globe at Night citizen science project, and then run an analysis to determine the rate of change in sky brightness. The software requires input data, which can be obtained directly from Globe at Night. The data used for our publication "Citizen scientists report global rapid reductions in the visibility of stars from 2011 to 2022" is published here, and can be used as input to the software. The process requires access to the World Atlas of Artificial Night Sky Brightness, which is also available from GFZ Data Services.
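As a generic illustration of estimating a rate of change in sky brightness from yearly values, one can fit a line to log-brightness versus year; the sketch below uses synthetic numbers and is a hedged example, not the analysis method of the cited publication.

```python
# Generic illustration: annual rate of change from a linear fit to
# log-brightness vs. year. Synthetic data; not the published method.
import numpy as np

years = np.arange(2011, 2023)
brightness = 1.0 * 1.1 ** (years - 2011)   # synthetic: +10 % per year

slope, _ = np.polyfit(years, np.log(brightness), 1)
annual_change = np.expm1(slope)            # back-transform to a fraction
print(f"estimated rate of change: {annual_change * 100:.1f} % per year")
```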
FlotteKarte is a low-overhead plotting package using Matplotlib, NumPy, and pyproj under the hood. The conceptual idea behind this package is that a map is fully defined through the 2D Cartesian coordinates that result from applying the map projection to different geographical data. For displaying data on a two-dimensional canvas, Matplotlib is a powerful tool, and conversion between geographic and projected coordinates can easily be done using pyproj. The gap between these two powerful tools and a polished map lies in the potential difficulties of translating spherical line topology to 2D Cartesian space and in adding typical map decorations such as grids or ticks. FlotteKarte aims to fill this gap with a simple interface. FlotteKarte's philosophy is to work completely within the 2D projected coordinates, that is, very close to the projected data. If projected coordinates of data can be obtained, the data can be drawn directly on the underlying Matplotlib Axes. The Map class can then be used to add typical map decorations to those Axes, using information that it derives from the numerics of the PROJ projection.
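The underlying workflow — project with pyproj, draw with Matplotlib — can be illustrated without FlotteKarte itself (whose exact Map API is not shown here):

```python
# The conceptual workflow behind FlotteKarte, shown with pyproj + Matplotlib
# alone: project geographic coordinates, then plot them as plain 2D data.
import numpy as np
import matplotlib.pyplot as plt
from pyproj import Proj

proj = Proj("+proj=robin")             # any PROJ string works here

lon = np.linspace(-180.0, 180.0, 361)  # a parallel at 45 deg north
lat = np.full_like(lon, 45.0)
x, y = proj(lon, lat)                  # geographic -> projected coordinates

fig, ax = plt.subplots()
ax.plot(x, y)                          # projected data is just 2D data
ax.set_aspect("equal")                 # keep the projection undistorted
plt.show()
```

FlotteKarte's Map class would then add the grid, ticks and other decorations on top of such an Axes.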
This dataset provides friction data from ring-shear tests on the feldspar sand FS900S, which is used for the simulation of brittle behaviour in crust- and lithosphere-scale analogue experiments at the Tectonic Modelling Laboratory of the University of Bern (Zwaan et al. in prep; Richetti et al. in prep). The material has been characterized by means of internal friction parameters as a remote service by the Helmholtz Laboratory for Tectonic Modelling (HelTec) at the GFZ German Research Centre for Geosciences in Potsdam (Germany). According to our analysis, the feldspar sand shows a Mohr-Coulomb behaviour characterized by a linear failure envelope. Peak, dynamic and reactivation friction coefficients are μP = 0.65, μD = 0.57 and μR = 0.62, respectively, and the cohesion of the feldspar sand is on the order of 5-20 Pa. An insignificant rate weakening of less than 1% per ten-fold rate change is registered for the feldspar sand. Granular healing is also minor.
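As a worked illustration of the linear failure envelope reported above, the shear stress at failure follows τ = C + μ·σ_N. Evaluating this with the published friction coefficients, an assumed normal stress (arbitrary example value) and the cohesion taken at the midpoint of the reported 5-20 Pa range:

```python
# Mohr-Coulomb failure envelope tau = C + mu * sigma_n, evaluated with the
# friction coefficients quoted above; sigma_n and the cohesion midpoint are
# illustrative choices, not part of the dataset.
C = 12.5          # cohesion [Pa], midpoint of the reported 5-20 Pa range
sigma_n = 500.0   # normal stress [Pa], arbitrary example

for name, mu in [("peak", 0.65), ("dynamic", 0.57), ("reactivation", 0.62)]:
    tau = C + mu * sigma_n
    print(f"{name:>12} friction (mu = {mu}): tau = {tau:.1f} Pa")
```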
The success of scientific projects increasingly depends on using data analysis tools and data in distributed IT infrastructures. Scientists need to use appropriate data analysis tools and data, extract patterns from data using appropriate computational resources, and interpret the extracted patterns. Data analysis tools and data often reside on different machines, because the volume of the data frequently demands specific resources for storage and processing, and data analysis tools usually require specific computational resources and run-time environments.

The data analytics software framework DASF, developed at the GFZ German Research Centre for Geosciences (https://www.gfz-potsdam.de) and funded by the Initiative and Networking Fund of the Helmholtz Association through the Digital Earth project (https://www.digitalearth-hgf.de/), supports scientists in conducting data analysis in distributed IT infrastructures by sharing data analysis tools and data. For this purpose, DASF defines a remote procedure call (RPC) messaging protocol that uses a central message broker instance. Scientists can augment their tools and data with this protocol to share them with others. Since the protocol is implemented on top of WebSockets, DASF supports many programming languages and platforms; two ready-to-use language bindings for the messaging protocol are provided, one for Python and one for the TypeScript programming language. To share a Python method or class, users add an annotation in front of it and specify the connection parameters of the message broker. The central message broker approach allows both the method and the client calling the method to actively establish a connection, which enables using methods deployed behind firewalls. DASF uses Apache Pulsar (https://pulsar.apache.org/) as its underlying message broker.

The TypeScript bindings are primarily used in conjunction with web frontend components, which are also included in the DASF-Web library. They are designed to attach directly to the data returned by the exposed RPC methods, which supports the development of highly exploratory data analysis tools. DASF also provides a progress-reporting API that enables users to monitor long-running remote procedure calls.

One application using the framework is the Digital Earth Flood Event Explorer (https://git.geomar.de/digital-earth/flood-event-explorer), which integrates several exploratory data analysis tools and remote procedures deployed at various Helmholtz centres across Germany.
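The following sketch illustrates the annotation-based sharing pattern described above. The names "register_rpc", "BrokerConnection" and the broker URL are hypothetical placeholders, not DASF's documented API; only the overall pattern (annotate a Python function, configure the broker connection) is taken from the text.

```python
# Hypothetical sketch of sharing a Python method via a DASF-style RPC
# protocol. All names below are placeholders for illustration; consult the
# DASF documentation for the actual API.
def register_rpc(func):
    """Placeholder for the annotation that exposes a method via RPC."""
    func._rpc_exposed = True
    return func

class BrokerConnection:
    """Placeholder for the message broker connection parameters."""
    def __init__(self, url: str, topic: str):
        self.url = url
        self.topic = topic

# 1) Annotate the method that should be shared with other scientists.
@register_rpc
def detect_anomalies(values: list[float], threshold: float) -> list[int]:
    """Return indices of values exceeding the threshold."""
    return [i for i, v in enumerate(values) if v > threshold]

# 2) Specify where the central message broker (e.g. Apache Pulsar) runs.
broker = BrokerConnection(url="pulsar://broker.example.org:6650",
                          topic="demo/anomaly-detection")

print(detect_anomalies([0.1, 2.5, 0.3, 3.1], threshold=1.0))
```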
Monitoring velocity changes using ambient seismic noise: SeisMIC (Seismological Monitoring using Interferometric Concepts) is a Python software package that emerged from the miic library. SeisMIC provides functionality to apply concepts of seismic interferometry to different elastic-wave data. Its main use case is the monitoring of temporal changes in a medium's Green's function (i.e., monitoring of temporal velocity changes). SeisMIC handles the whole workflow to create velocity-change time series, including:
• Downloading raw data
• Adaptable preprocessing of the waveform data
• Computing cross- and/or autocorrelations
• Plotting tools for correlations
• Database management of ambient seismic noise correlations
• Adaptable postprocessing of correlations
• Computation of velocity-change (dv/v) time series
• Postprocessing of dv/v time series
• Plotting of dv/v time series
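One common way to estimate dv/v from noise correlations is the stretching technique: the current correlation function is compared with stretched or compressed versions of a reference, and the stretch factor maximizing the correlation coefficient gives the relative velocity change. A minimal numpy sketch of this idea (an illustration of the concept, not SeisMIC's actual implementation):

```python
# Minimal sketch of the "stretching" estimate of dv/v: a velocity increase
# shifts arrivals to earlier lapse times, equivalent to evaluating the
# reference on a stretched time axis. Not SeisMIC's implementation.
import numpy as np

def dvv_stretching(reference, current, t, trial_dvv):
    """Return the trial dv/v whose stretched reference best matches `current`."""
    return max(
        trial_dvv,
        key=lambda e: np.corrcoef(
            current,
            np.interp(t * (1.0 + e), t, reference),  # stretch the time axis
        )[0, 1],
    )

# Synthetic demo: a decaying oscillation, then the same trace with a
# 0.5 % velocity increase (arrivals shifted to earlier lapse times).
t = np.linspace(0.0, 10.0, 2001)
ref = np.exp(-0.2 * t) * np.sin(2.0 * np.pi * 2.0 * t)
true_dvv = 0.005
cur = np.interp(t * (1.0 + true_dvv), t, ref)

trials = np.linspace(-0.01, 0.01, 201)
print(f"estimated dv/v = {dvv_stretching(ref, cur, t, trials):.4f}")
```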