Swedish e-Science Research Center - SeRC
The Swedish e-Science Research Centre (SeRC) is formed by the universities in Stockholm and Linköping – KTH, Linköping University (LiU), Stockholm University (SU) and Karolinska Institutet (KI) – around the two largest high-performance computing (HPC) centers in Sweden: PDC at KTH and NSC at LiU.
Research at SeRC is focused on collaboration between tool makers and tool users, and brings together a core of nationally leading IT research teams with expertise in e-Science method development and leading scientists in selected application areas. SeRC will constitute a leading visionary e-Science node with a national scope and strong international ties. Substantially increased collaboration between applied and method-oriented groups is needed, and SeRC will provide a platform for this. Our approaches to reach these goals are:
- Formation of e-Science Communities that connect application groups with relevant core e-Science groups and computer experts at PDC and NSC.
- Research in core e-Science methods such as distributed resources, database technology, numerical analysis, visualization and interaction, mathematical modeling and parallel algorithms, focusing on problems critical for several e-Science communities.
- Much closer collaboration between PDC and NSC, and a substantial increase in advanced support staff, which will turn the centers into comprehensive e-Science enablers.
SeRC is also taking national responsibility in the e-Science area by hosting a large part of the Swedish e-Science infrastructure through PDC and NSC. Already today these two high-performance computing centers play the nationally leading role, a role that SeRC will develop further, beyond the hardware aspect of e-infrastructure.
A key feature of research within SeRC is the e-Science communities, which connect application-oriented groups with relevant core e-Science groups. Each of the communities will thus comprise computer experts, e-Science method developers and scientists from application areas who jointly run e-Science projects. The projects will be characterized by strong novelty in terms of technology (high-end computing, novel architectures, grid, databases, etc.), methodology (new theories, models, methods, algorithms and software) and application (new application areas with large potential gains from e-Science tools). There will not be a fixed set of communities; instead, they will be created dynamically as the research environments evolve.
Examples of e-Science communities are:
- Waves: The Waves community will focus on numerical methods for wave propagation problems and include researchers in Electromagnetics, Aeroacoustics, Acoustics as well as Numerical Analysis.
- Fluids: The core activity within this community is the development of efficient and accurate methods for the simulation of turbulent flow.
- Climate and the Environment: The community will work to improve climate modeling through better numerical techniques for space-time discretization, code scalability and algorithms for efficient coupling between different climate model components.
- Bioinformatics & Sequence Databases: The bioinformatics community will focus on ways to combine bioinformatics with experiments, membrane protein classification and structure prediction, molecular modeling, comparative genomics, and database organization, as well as new algorithms – including distributed computing and storage – to identify correlations in whole-genome alignments and accurate methods to match spectra against databases in proteomics.
- Complex Diseases: The e-Science community of Complex Diseases will initially focus on neuroscience, cancer and cardiovascular disease, with priority on the development of tools for distributed databases and security for modeling and computation.
- Particle Simulation: The core of this e-Science community is particle-based modeling, using e.g. time-dependent molecular dynamics or Monte Carlo simulation.
- Electronic Structure: This community will focus on first-principles calculations based on e.g. density functional theory and Hartree–Fock methods, but also on multiscale simulation techniques.
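To make the particle-based techniques mentioned for the Particle Simulation community concrete, the sketch below shows a minimal Metropolis Monte Carlo sampler for a single particle in a harmonic potential. This is a hypothetical toy example for illustration only, not SeRC software; the function name and parameters are our own.

```python
import math
import random

def metropolis_harmonic(n_steps=100_000, beta=1.0, step=1.5, seed=42):
    """Toy Metropolis Monte Carlo for one particle in the harmonic
    potential U(x) = x**2 / 2. Returns the sampled mean of x**2
    (which should approach 1/beta) and the acceptance rate."""
    rng = random.Random(seed)
    x, accepted, sum_x2 = 0.0, 0, 0.0
    for _ in range(n_steps):
        x_new = x + rng.uniform(-step, step)   # symmetric trial move
        dU = 0.5 * (x_new**2 - x**2)           # energy change of the move
        # Metropolis criterion: accept with probability min(1, exp(-beta*dU))
        if dU <= 0.0 or rng.random() < math.exp(-beta * dU):
            x, accepted = x_new, accepted + 1
        sum_x2 += x * x
    return sum_x2 / n_steps, accepted / n_steps

mean_x2, acc_rate = metropolis_harmonic()
print(mean_x2, acc_rate)  # mean of x**2 should be close to 1/beta = 1.0
```

The same accept/reject structure carries over to many-particle systems; only the energy function and the trial moves change.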
e-Science tools are based on components from several IT disciplines. SeRC brings together a number of leading groups in the IT core sciences and connects them with each other as well as with strategic application areas. The core e-Science areas are:
- Methods for distributed resources and database technology: These methods and tools will be used to enhance the resource usage and data management of the e-Science communities described in the previous section. Much emphasis will be put on seamless integration of data and computation. The consortium will harmonize the security infrastructure, allowing single sign-on and easy transition between resources. Virtualization techniques and market-based algorithms for resource allocation and scheduling will also be studied, as well as distributed algorithms and design techniques to address the scalability of various grid services. Scalable self-management of distributed services will also be addressed.
- Visualization and Image Science: The human visual sense is superior to today's computers in terms of perceiving content in images. It is this human capability that visualization builds upon by generating images representing the content of large and complex data sets. In image science the goal is to translate complex spatio-temporal patterns into forms that can be understood by humans. Interactivity also plays a central role in visualization and image science workflows. To meet the challenges posed by the ever-increasing information flow, new visualization methods need to be developed and tailored to specific application demands. We also envision a convergence of visualization and image science methods, leading to sophisticated knowledge representations in visualization and image-processing pipelines that further reduce data sizes, deal with uncertainties, and highlight areas of interest. The distributed nature of future visualization resources and users also calls for research in areas such as high-quality remote rendering and collaborative visualization. Another area of high priority for the e-Science community is the integration of mathematical methods, primarily for data mining, into visual knowledge discovery environments to enable them to deal with large-scale and high-dimensional data efficiently.
- Numerical Analysis: Development of numerical algorithms is critical in successful e-Science based research, often matching or outperforming improvements of hardware in terms of speed gains, in particular in areas at the forefront of research which have only recently become amenable to computer simulations. An important research direction is the development of general software and theory for first principle computational mathematical modeling. Research will also focus on some particularly challenging areas where algorithmic progress will be essential, such as multiscale, multiphysics and stochastic problems as well as application fields like turbulent and multiphase flow and high frequency wave propagation.
- Mathematical Modeling: Computation-oriented mathematics, statistics and informatics belong to the core of e-Science. The relevant mathematical theories and tools can take many forms, including dynamical systems, stochastic models, differential equations, etc. Analytic methods as well as complex simulation-based approaches both have their place. For example, the goal of systems biology is to identify the underlying biological system, or network of interacting parts (genes, proteins, metabolites, cells, organs), that causes the observed dynamics and correlation patterns. Network inference algorithms are based on Boolean, graphical, ordinary-differential-equation or Bayesian models, and standard techniques include bifurcation analysis of dynamical systems. Mathematically challenging new approaches involve connections between algebraic graph theory and decentralized control theory, or the use of so-called "dissipativity theory" for the global analysis of underlying dynamics.
- Parallel Algorithms and Performance Optimization: With an increasing number of cores per processor come demands on code optimisation and efficient parallelisation. Development tools and languages developed during the next 5-10 years will need advances in component-based modeling, checking, parallel constructs, debugging, and development support.
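As an illustration of the ordinary-differential-equation network models mentioned under Mathematical Modeling, the sketch below integrates a classic two-gene mutual-repression model (a "toggle switch") with forward Euler and shows its bistability: two initial conditions settle into two different stable steady states. This is a textbook toy model chosen by us for illustration, not a SeRC result; all names and parameter values are assumptions.

```python
def toggle_switch(x0, y0, a=4.0, n=2, dt=0.01, t_end=50.0):
    """Forward-Euler integration of a two-gene mutual-repression model:
        dx/dt = a / (1 + y**n) - x
        dy/dt = a / (1 + x**n) - y
    Each gene product represses the other's production (Hill kinetics)
    and decays linearly. Returns the state reached at t_end."""
    x, y = x0, y0
    for _ in range(int(t_end / dt)):
        dx = a / (1.0 + y**n) - x
        dy = a / (1.0 + x**n) - y
        x, y = x + dt * dx, y + dt * dy
    return x, y

# Bistability: starting with x ahead locks x high and y low, and vice versa.
state_a = toggle_switch(2.0, 0.0)  # settles near (high x, low y)
state_b = toggle_switch(0.0, 2.0)  # settles near (low x, high y)
print(state_a, state_b)
```

Bifurcation analysis of such systems asks how these steady states appear and disappear as parameters like `a` and `n` vary; the integrator above is the simplest possible building block for such a numerical study.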
Future development of PDC and NSC
The two largest HPC centers in Sweden, NSC and PDC, both take part in SeRC. One of the most important goals of SeRC is to tighten the existing collaborations, to align efforts and to have the centers act with a single voice both in Sweden and in the international arena. Through SeRC, collaborations will be further developed and deepened, ensuring the development of complementary competences and well-aligned hardware procurements.
Transforming NSC and PDC from HPC hardware providers into e-Science enablers is another important mission. The main mechanism for this will be a substantial increase in the number of application experts and software engineers at the centers working in the SeRC e-Science communities. To ensure a strong coupling to the research areas, each of them will be closely associated with an application or core e-Science area within SeRC.