The role of the application experts is to transfer HPC competence and computational know-how within the respective fields to SNIC users.
For help or advice on using, developing, or tuning HPC applications, you can contact the SNIC-UPPMAX application experts directly, preferably via e-mail: email@example.com.
Theoretical Material Physics Diana Iusan
Density functional theory (DFT) is used extensively to study the properties of real materials. Electronic, structural, magnetic, optical, elastic, and several other properties can be calculated by DFT-based electronic structure codes. The accuracy of DFT-based methods allows one not only to analyze experimental data with good precision but also to make accurate predictions.
Modeling of materials in chemistry Pavlin Mitev
Development and evaluation of theoretical and computational methods for modeling static and dynamic properties of inorganic solid- and liquid-phase materials, with a focus on developing new transferable classical potentials.
In collaboration with eSSENCE
Algorithm and Code Development Marcus Lundberg
When moving code from a single-processor machine to a parallel computer, both the code and the underlying algorithms must be adapted to the parallel environment. Finding and exploiting the inherent parallelism, as well as choosing a suitable parallel programming model, become key issues when porting the code. I can help with algorithm development, optimization techniques, and also scripts to manage experiments more effectively.
Bioinformatics Douglas Scofield, Anders Sjölander, Jonas Söderberg
By combining computer science, mathematics, statistics, and information technology, bioinformaticians try to answer questions and problems that arise in modern biology. Bioinformatics tools and software are used both for the final analysis and to guide further experiments.
Modelling and analysis in Earth and Environmental Science Björn Claremar
Understanding and predicting processes on different scales in nature, such as motions in the atmosphere, water, or ice, may require complex modelling, sometimes in 3D. Parallel computing and clusters are helpful infrastructures for meeting these demands. Further, the analysis of large datasets may require large amounts of memory, which also makes them well suited for processing on a cluster.