High Performance Astrophysical Fluid Simulations Using InSilicoLab Framework
Abstract. With the advent of the PL-Grid Infrastructure, Polish scientists have been equipped with substantial computational resources, forming favorable conditions for the development of all research areas that rely on numerical simulation techniques. However, modern hydrodynamical simulation codes require a properly configured software environment. Installation of the compilers, libraries and post-processing software requires technical know-how that is not common among young researchers and students of astronomy. To reduce the barriers inhibiting the start of newcomers in the field of computational astrophysics, we have implemented a web-based workspace for astrophysical simulation codes based on the InSilicoLab framework [20]. InSilicoLab for Astrophysics is a solution dedicated to computational astrophysicists intending to conduct numerical experiments using the PL-Grid Infrastructure. Currently, it serves as an interface for the multipurpose, open-source magnetohydrodynamical code PIERNIK – an MHD code that relies on a dimensionally split, second-order algorithm in space and time and accounts for various fluid components: multiple fluids, dust and cosmic rays, as well as additional physical processes, such as fluid interactions and Ohmic resistivity effects. As a result, using only a web browser, scientists can perform the full sequence of actions: copying the source code from a publicly accessible repository, compiling the code remotely, executing the numerical experiment on the PL-Grid Infrastructure and immediately visualizing the results with the yt package [29]. Additionally, the simulation results are stored as binary HDF5 files in the LFC catalogue and remain available for further analysis. InSilicoLab for Astrophysics considerably flattens the learning curve of advanced astrophysical simulations, bringing new possibilities for scientists around the world.
1 Introduction
Fast progress in computational infrastructure and the availability of professional open-source astrophysical simulation codes create new opportunities for young scientists and students. Successful utilization of complex astrophysical simulation codes depends, however, on the availability of a proper computational environment. Practically all modern hydrodynamical simulation codes require particular compilers, numerical libraries and specialized post-processing software for the analysis of simulation results. Installation and configuration of this software environment requires technical know-how that is not common among young researchers and students of astronomy. To reduce the barriers inhibiting the start of newcomers in the field of computational astrophysics, we implemented a web-based workspace for astrophysical simulation codes based on the InSilicoLab environment. InSilicoLab is a software package [15] which enables embedding simulation codes in the PL-Grid environment. The codes can then be compiled and executed from a web browser. Already accessible implementations offer the GAUSSIAN, GAMESS and TURBOMOLE packages within the framework of InSilicoLab for Chemistry [20], as well as simulations of particle cascades in the Earth's atmosphere for the Cherenkov Telescope Array (CTA) [1]. InSilicoLab for Astrophysics offers easy access to astrophysical fluid simulations on the clusters of the PL-Grid infrastructure. InSilicoLab enables the preparation of the necessary input data for the supported codes (e.g. the module generating the initial condition and other simulation parameters), submission of jobs to the queue, job execution and a preliminary analysis of simulation results. The output data are stored in network filesystems and are accessible to other experiments executed on other machines. InSilicoLab for Astrophysics thus enables the execution of numerical simulations without complicated preparation of the software environment on the clusters of the PL-Grid infrastructure. The current functionality of InSilicoLab for Astrophysics supports numerical experiments with the multipurpose magnetohydrodynamical (MHD) code PIERNIK, which is open-source software accessible from a git repository [25].
2 Piernik MHD code
2.1 Algorithms and scaling
PIERNIK is a grid-based MHD code using a simple, conservative numerical scheme known as the Relaxing TVD (RTVD) scheme [18]. The code relies on a dimensionally split algorithm that is second-order in space and time [28,24]. The Relaxing TVD scheme is easily extensible to account for additional fluid components [4,7,5,6]: multiple fluids, dust and cosmic rays, and additional physical processes, such as fluid interactions, Ohmic resistivity effects and self-gravity. The simplicity and the small number of floating-point operations of the basic algorithm are reflected in a high serial performance. A unique feature of the PIERNIK code is our original implementation of anisotropic transport of the cosmic-ray component in the fluid approximation [8]. The basic explicit CR diffusion scheme has recently been supplemented with a multigrid diffusion scheme.

We have recently implemented two new modules: Adaptive Mesh Refinement (AMR) and a Multigrid (MG) solver. The AMR algorithm makes it possible to reach much higher effective resolutions than a uniform grid. It dynamically adds regions of improved resolution (fine grids) where required by refinement criteria, and removes grids that are no longer needed to maintain a high-quality solution. The MG solver, on the other hand, is one of the fastest known methods for solving elliptic and parabolic differential equations, which in our case describe the self-gravity of the fluid and the diffusion of cosmic rays, respectively. In addition, the isolated external boundaries for self-gravity use a multipole expansion of the potential to determine proper boundary values in a fast and efficient manner. The combination of the AMR and multigrid algorithms makes PIERNIK an ideal tool for simulations of multiphysics phenomena in gaseous disks of galaxies, active galactic nuclei, and planet-forming circumstellar disks.

There are two main grid decomposition approaches in the PIERNIK code: the uniform grid (UG) and the recently developed Adaptive Mesh Refinement (AMR). The UG algorithm divides the grid into smaller pieces, not necessarily of the same size, and assigns one piece to each process. The decomposition is performed in a way that minimizes the total size of internal boundaries between the pieces. Communication between the pieces is done via non-blocking MPI calls. The current implementation of the AMR algorithm uses a hybrid-block AMR approach, which means that on each level of refinement the grid is decomposed into relatively small pieces of the same size and shape (typically 16³ cells), and each grid piece can be covered by grid pieces at the next, finer level of refinement. The finer grid pieces do not cover more than one coarse grid piece, and the coverage does not need to be complete (in contrast to standard block-AMR approaches), in order to save computational resources. The resolution difference between consecutive refinement levels is equal to 2. Typically there are many grid pieces associated with a given process. They are kept evenly distributed along a Morton or Hilbert fractal curve to decrease inter-process communication and improve load balance.

The gravitational potential of the gas components is obtained by solving the Poisson equation inside the computational domain with an iterative multigrid solver [13]. The multigrid algorithm is based on a quick and simple V-cycle with a few passes of Red-Black Gauss-Seidel relaxation (smoothing) as an approximate solver.
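For reference, the basic RTVD step can be summarized as follows (a schematic, one-dimensional sketch following [18,28]; see [24] for the full algorithm). Each conservation law $\partial_t u + \partial_x F(u) = 0$ is replaced by the relaxing system
\[
\partial_t u + \partial_x (c\,w) = 0, \qquad \partial_t w + \partial_x (c\,u) = 0, \qquad w \equiv F(u)/c ,
\]
where the freezing speed $c$ is chosen to exceed the largest local characteristic speed. The system decouples into two simple advection problems for $u^{\pm} = (u \pm w)/2$, transported with speeds $\pm c$, each of which is advanced with a second-order TVD upwind step.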
To minimize the effect of boundaries on the gravitational potential, we use a multipole expansion up to l = 16 moments to calculate the potential on the external boundaries [17]. The multigrid module creates a stack of coarse grids, each coarsened by a factor of 2 (i.e. compatible with the AMR assumptions), to accelerate approximate solutions of elliptic equations by relaxation. An extension of the multigrid solver is the recently implemented parallel multigrid-preconditioned Conjugate Gradient solver. The PIERNIK code is also equipped with a cylindrical coordinate system implemented with an angular-momentum-conserving form of the momentum equation [23], which with respect to Cartesian geometry introduces only one additional source term, minimizing possible nonphysical evolution of the system. Additionally, a new algorithm for fast Eulerian transport [22] has been implemented for simulations of astrophysical disks, to reduce the number of computational steps or to increase the resolution in the azimuthal direction.
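The multipole expansion of the boundary potential mentioned above can be written schematically as the standard exterior expansion (the actual implementation follows [17] and may differ in details such as the choice of the expansion centre):
\[
\Phi(r,\theta,\varphi) \simeq -G \sum_{l=0}^{16} \sum_{m=-l}^{l} \frac{4\pi}{2l+1}\, \frac{Y_{lm}(\theta,\varphi)}{r^{\,l+1}} \int \rho(\mathbf{r}')\, r'^{\,l}\, Y^{*}_{lm}(\theta',\varphi')\, \mathrm{d}^{3}r' ,
\]
so that only a small set of mass moments, rather than the full density field, is needed to set the potential on the isolated boundaries.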
We utilize parallel HDF5 [12] in the PIERNIK code. The current implementation allows for two I/O scenarios: 1) one process collects data via MPI and writes to a single HDF5 file, or 2) all processes write to (or read from) a single file using parallel HDF5 library routines. The checkpointing facility allows PIERNIK to split a long computation into small chunks and does not require the number of CPUs to remain constant, making the code very portable and flexible in utilizing the available resources. The output produced by PIERNIK complies with the Grid Data Format [34], which makes it possible to use the yt package [29] for analysis and visualization. As a result, scientific results obtained with PIERNIK can be directly compared to the simulation output of other popular astrophysical codes such as FLASH, Enzo, Athena, Gadget, RAMSES and many others. PIERNIK is developed in an open-source fashion under the GPL license. Each and every change in the code base is recorded in a publicly available git repository [25] and placed under scrutiny by an automated continuous integration system – Jenkins [27].
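As a minimal illustration of the output format (a sketch assuming the h5py Python package and a purely hypothetical file name), the structure of such an HDF5 file can be inspected directly:

import h5py

# Open a PIERNIK HDF5 output for reading; the file name is hypothetical.
with h5py.File("sedov_tst_0001.h5", "r") as f:
    # Print the names of all groups and datasets stored in the file.
    f.visit(print)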
2.2 Examples of Piernik applications
In the following we present two examples of complex multi-fluid simulations of astrophysical disks: protoplanetary disks and galactic disks. The main focus of the astrophysics of these objects is their evolution. The relevant questions to address with the technique of astrophysical fluid simulations are: how do planets form in disks consisting of dust and gas; how do stars form from interstellar gas and what are the consequences of stellar explosions for the dynamics of the interstellar medium; what is the origin of galactic magnetic fields and what is their impact on star formation?

The basic concern of planet formation theory is how to grow dust from 1 μm particles to km-sized rocks and planetesimals, which are the basic building blocks of planets. There is general agreement that dust particles can grow up to centimeter sizes due to collisions and interparticle interactions. However, both theoretical models [3] and laboratory experiments [2] show that coagulation via collisional sticking is completely ineffective for large dust grains (0.1–1 m). Moreover, particles of sizes 10 cm – 1 m start to drift very rapidly towards the center of the disk, which results in a complete depletion of dust grains within timescales of hundreds of years [30]. The combination of these two processes is known as the “meter barrier”. Despite these circumstances, which are unfavourable for planet formation, there is a process that begins to dominate dust evolution when the local ratio of dust to gas density approaches unity. This mechanism was first presented by [33] and named the streaming instability. It appears that the combination of dust trapping in gas pressure maxima and the aerodynamic coupling of gas and dust, which enhances the maxima even further, results in a significant dust pile-up [16]. Even without self-gravity, the dust concentration may rise by up to three orders of magnitude, which could possibly lead to gravitationally bound objects [19].

In our recent paper [21] we investigated for the first time the streaming instability in quasi-global protoplanetary disks (see Fig. 2). We found that the nonlinear evolution of the streaming instability leads to conditions suitable for the formation of gravitationally bound dust blobs. Our model extends the previous work of other authors [19] by taking into account the full dynamics of the protoplanetary disk, e.g. radial migration, which leads to significant variation in physical quantities, such as the gas pressure gradient, that were previously treated as constant. This is an important step towards understanding the mechanism responsible for planetary formation. In the currently ongoing project we incorporate self-gravity, aiming to verify the hypothesis that planets can form due to the combined action of the streaming and gravitational instabilities.
PIERNIK has been successfully used in computationally demanding simulations of the cosmic-ray-driven galactic dynamo [10,11]. In this project we focus on the case of galactic disks, which involve many physical ingredients and processes, such as magnetic field generation, cosmic-ray transport and gravitational-instability-induced star formation. In such cases we need to resolve a multiscale environment ranging from parsec-scale, gravitationally bound star-forming regions up to tens-of-kiloparsec-long CR-driven outflows. We have shown that the contribution of cosmic rays to the dynamics of the interstellar medium (ISM) on a global galactic scale leads to a very efficient magnetic field amplification on the timescale of galactic rotation. The model reveals a large-scale regular magnetic field with an apparent spiral structure in the face-on view and an X-shaped structure in the edge-on view. In the presence of spiral arms in the distribution of stars, the magnetic field reveals a well-pronounced spiral component closely corresponding to the material arms. Dynamical magnetic field structures with opposite polarities develop within the disk and are present even at the saturation phase of the dynamo. Moreover, during the coalescence phase of the two galaxies shown in Fig. 3 the magnetic field structure becomes irregular, as observed in M51. An important part of the CR-driven dynamo is the galactic wind, which reaches velocities of a few hundred km/s at galactic altitudes of a few kpc.

Recently we performed high-resolution simulations of the magnetized ISM in gas-rich, star-forming disk galaxies at high redshift [9]. In our models, Type II supernovae locally deposit cosmic rays into the ISM. Our initial work indicates that cosmic rays produced in supernova remnants contribute essentially to transporting a significant fraction of the gas in a wind perpendicular to the disk plane. The wind speeds can exceed the escape velocity of the galaxies, and the global mass loading factor, i.e. the ratio of the gas mass leaving the galactic disk in a wind to the star formation rate, is approximately 10. These values are very similar to those observed for high-redshift (z = 2–3) star-forming galaxies. Therefore, cosmic-ray-driven galactic winds provide a natural and efficient mechanism to explain the low efficiency of the conversion of gas into stars in galaxies, as well as the early enrichment of the intergalactic medium with metals. This mechanism can be at least as important as the usually considered momentum feedback from massive stars and thermal feedback from supernovae.

The aforementioned astrophysical problems pose a serious challenge from the computational point of view, as we deal with several nonlinear and mutually interacting physical processes that operate on various time and length scales. This implies that the numerical experiments require high spatial resolution and could not be tackled without a significant amount of computational power.
3 Description of the solution and results: InSilicoLab for Astrophysics
To become a user of InSilicoLab for Astrophysics one should open an account in the PL-Grid Portal and then apply for activation of the InSilicoLab for Astrophysics service. Activation of Global gLite access in the Global services section is also necessary. User manuals (Polish version only) are available on the webpage [26]. The InSilicoLab for Astrophysics service is available at [14]. Access to the service is possible with the aid of a certificate, which has to be registered in a web browser, or via the OpenID system of the PL-Grid infrastructure. The entry page of the service is shown in Fig. 4.
The next step is to configure the proxy certificate by clicking on Configure your proxy. The proxy certificate is obligatory for running simulations in the PL-Grid infrastructure and for accessing data stored in PL-Grid filesystems. After configuring the proxy certificate one can start one's own experiment. To start a new simulation one should select Piernik from the menu, as shown in Fig. 5. An empty card of a new experiment opens and the user is prompted to fill in text fields specifying a problem name and a problem description.
One of the predefined experiments can be selected from a list that includes the following test problems:
- sedov – the Sedov explosion experiment, demonstrating a spherical shock wave propagating after a supernova explosion in a uniform interstellar medium.
- otvortex – the evolution of a sinusoidal perturbation of the gas velocity and magnetic field, known as the Orszag-Tang vortex, leading to strong shock waves in a magnetized medium.
- tearing – an experiment demonstrating magnetic reconnection and the formation of magnetic islands in a plasma.
The Sedov explosion test is perhaps the most common test of astrophysical fluid simulation codes, because its results can be confronted with the corresponding analytical, self-similar blast-wave solution (recalled after the list of problem files below). To explore one of the options in more detail we select sedov from the list of test problems, as shown in Fig. 7. Our choice implies that three problem-specific files will be checked out from the repository:
- piernik.def – configures the list of physical modules selected at the compilation phase.
- initproblem.F90 – contains the Fortran 2003 module constructing the initial condition.
- problem.par – contains the numerical parameters of the experiment, organized in Fortran namelists.
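For reference, the analytical solution mentioned above is the self-similar Sedov-Taylor blast wave, in which the shock radius grows as
\[
R_{\rm s}(t) = \xi_{0} \left( \frac{E\,t^{2}}{\rho_{0}} \right)^{1/5},
\]
where $E$ is the explosion energy, $\rho_{0}$ the density of the ambient medium and $\xi_{0}$ a dimensionless constant of order unity that depends on the adiabatic index (a standard textbook relation, quoted here only for orientation).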
The three problem-specific files can be edited according to the user's needs. As an example, the user can replace the default neutral fluid with an ionized fluid and add a magnetic field by replacing, in the file piernik.def, the line:
#define NEUTRAL
by a sequence that enables the ionized fluid and the magnetic field, for example (the macro names below follow the conventions of the public PIERNIK repository [25] and may differ between code versions):
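#define IONIZED
#define MAGNETIC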
To change the runtime parameters of the simulation one should edit the problem.par file shown in Fig. 8. To change the grid resolution in the x, y and z directions, replace
n_d = 64, 64, 64
by another set of three integers, or change periodic to outflow boundary conditions by replacing
bnd_xl = 'per'
by
bnd_xl = 'out'
and similarly in the subsequent lines of the namelist 'BASE_DOMAIN'. In the same way, initproblem.F90 is an editable file, so it is possible to modify the source code of the module defining the initial condition. Alternatively, by using 'Choose file' the user can upload selected problem configuration files from his own copy of the code repository.

After completing the configuration process, the simulation starts upon the Run command. The current state of the job can be followed in the 'Job execution status' field. When the job is finished, the results are accessible in the 'Download job files' field, shown in Fig. 10. With the default settings only the last output file (HDF5 format) is stored, along with a series of standard plots (PNG format) generated by a Python script based on the yt package, displaying gas density and energy density slices through the computational domain (a sketch of such a script is given at the end of this section). To collect all the output files one should select the option Store all data in LFC before the job execution. All output files, together with the plot files, then become available in the LFC catalogue.

The procedure described so far represents a beginner's mode of using the code, offering only three simple example test problems. The full list of predefined test problems is available in the Piernik 'problems' directory that can be downloaded from the code repository (see [25]). Selecting USER in the Select problem list opens the possibility of uploading a set of problem files (problem.par, piernik.def and initproblem.F90) taken from the Piernik 'problems' directory, or files prepared by the user for his own test problem.
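A minimal sketch of the kind of yt-based plotting script mentioned above, i.e. one producing the standard density and energy density slices, might look as follows (the file and field names are illustrative assumptions, and the actual script used by the service may differ):

import yt

# Load the final PIERNIK output (an HDF5 file in the Grid Data Format);
# the file name used here is purely illustrative.
ds = yt.load("sedov_tst_0001.h5")

# Make mid-plane slices of the gas density and energy density and save them as PNG files.
# The field names are illustrative and depend on what the output file declares.
for field in [("gas", "density"), ("gas", "energy_density")]:
    plot = yt.SlicePlot(ds, "z", field)
    plot.save()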
4 Conclusions and future work
PIERNIK is a multipurpose astrophysics code. It has been successfully used in multi-fluid simulations of galactic dynamos driven by cosmic rays and in studies of instabilities occurring in circumstellar disks that may lead to planet formation. The broad spectrum of included physical processes and modern algorithms allows us to predict that it will gain new users in as yet untamed areas of computational astrophysics. Furthermore, PIERNIK was chosen as a model astrophysical code for integration with the Polish Grid Infrastructure as a web-based service that allows simulations to be easily run across all major Polish HPC centers. Any optimization performed within this project will have a great impact on the whole Polish astrophysical community. Due to the open-source nature of PIERNIK, all modifications made during the course of this project will be promptly applied to the code's repository and will immediately benefit the PIERNIK user community.
Figures:
Fig. 1. Strong scaling of Piernik for the Jeans problem.
Fig. 2. Two-fluid hydrodynamical simulation of the early stages of planet formation as a result of the combined action of streaming and gravitational instabilities in protoplanetary disks. The picture shows dust condensations that emerged due to the action of the streaming instability from an initially smooth dust distribution. The dust condensations are plausible progenitors of planetesimals, intermediate objects on the evolutionary track towards rocky planets. The 3D simulation at the highest resolution (2560 × 480 × 160) consumed 1 MCPUh on the PL-Grid infrastructure.
Fig. 3. Magnetohydrodynamical simulation of magnetic field generation in a spiral galaxy interacting with a companion dwarf galaxy [32]. The system resembles the famous M51 (Whirlpool) galaxy and its companion NGC 5195. This is an example of a hybrid simulation that includes an N-body simulation of the stars and the dark matter component, performed with the VINE code [31], and an MHD simulation including cosmic-ray dynamics, performed with the PIERNIK code.
Fig. 4. Entry webpage for the InSilicoLab for Astrophysics service.
Fig. 5. Creating a new Piernik experiment.
Fig. 6. Empty card of a new experiment is shown in the upper panel and example contents of problem description fields in the lower panel.
Fig. 7. Selecting the sedov test problem.
Fig. 8. Editing the parameter file problem.par.
Fig. 9. Selecting the number of CPU cores.
Fig. 10. Window displaying the job execution status at the running phase (upper panel) and after the job is finished (lower panel).
Fig. 11. Window displaying the contents of LFC catalogue.
References
- Barnacka, A., Bogacz, L., Gochna, M., Janiak, M., Komin, N., Lamanna, G., Moderski, R., Siudek, M.: PL-Grid e-Infrastructure for the Cherenkov Telescope Array Observatory. In: Bubak, M., Szepieniec, T., Wiatr, K. (eds.) Building a National Distributed e-Infrastructure–PL-Grid, Lecture Notes in Computer Science, vol. 7136, pp. 301–313. Springer Berlin Heidelberg (2012)
- Blum, J., Wurm, G.: The Growth Mechanisms of Macroscopic Bodies in Protoplanetary Disks. Annual Review of Astron. and Astrophys. 46, 21–56 (Sep 2008)
- Dullemond, C.P., Dominik, C.: Dust coagulation in protoplanetary disks: A rapid depletion of small grains. Astron. Astrophys. 434, 971–986 (May 2005)
- Hanasz, M., Kowalik, K., Wóltański, D., Pawłaszek, R.: The PIERNIK MHD code - a multi-fluid, non-ideal extension of the relaxing-TVD scheme (I). In: Goździewski, K., Niedzielski, A., Schneider, J. (eds.) EAS Publications Series. EAS Publications Series, vol. 42, pp. 275–280 (Apr 2010)
- Hanasz, M., Kowalik, K., Wóltański, D., Pawłaszek, R.: PIERNIK MHD code - a multi-fluid, non-ideal extension of the relaxing-TVD scheme (III). In: de Avillez, M.A. (ed.) EAS Publications Series. EAS Publications Series, vol. 56, pp. 363–366 (Sep 2012)
- Hanasz, M., Kowalik, K., Wóltański, D., Pawłaszek, R.: PIERNIK MHD code - a multi-fluid, non-ideal extension of the relaxing-TVD scheme (IV). In: de Avillez, M.A. (ed.) EAS Publications Series. EAS Publications Series, vol. 56, pp. 367–370 (Sep 2012)
- Hanasz, M., Kowalik, K., Wóltański, D., Pawłaszek, R., Kornet, K.: The PIERNIK MHD code - a multi-fluid, non-ideal extension of the relaxing-TVD scheme (II). In: Goździewski, K., Niedzielski, A., Schneider, J. (eds.) EAS Publications Series. EAS Publications Series, vol. 42, pp. 281–285 (Apr 2010)
- Hanasz, M., Lesch, H.: Incorporation of cosmic ray transport into the ZEUS MHD code. Application for studies of Parker instability in the ISM. Astron. Astrophys. 412, 331–339 (Dec 2003)
- Hanasz, M., Lesch, H., Naab, T., Gawryszczak, A., Kowalik, K., Wóltański, D.: Cosmic Rays Can Drive Strong Outflows from Gas-rich High-redshift Disk Galaxies. ApJL 777, L38 (Nov 2013)
- Hanasz, M., Wóltański, D., Kowalik, K.: Global Galactic Dynamo Driven by Cosmic Rays and Exploding Magnetized Stars. ApJL 706, L155–L159 (Nov 2009)
- Hanasz, M., Wóltanski, D., Kowalik, K., Kotarba, H.: Cosmic-ray driven dynamo in galaxies. In: Bonanno, A., de Gouveia Dal Pino, E., Kosovichev, A.G. (eds.) IAU Symposium. IAU Symposium, vol. 274, pp. 355–360 (Jun 2011)
- HDF Group: What is HDF5? (2013), http://www.hdfgroup.org/HDF5/whatishdf5.html
- Huang, J., Greengard, L.: A fast direct solver for elliptic partial differential equations on adaptively refined meshes. SIAM Journal on Scientific Computing 21(4), 1551–1566 (1999), http://epubs.siam.org/doi/abs/10.1137/S1064827598346235
- InSilicoLab Portal: InSilicoLab for Astrophysics (2013), http://insilicolab.astro.plgrid.pl
- InSilicoLab Team, ACK CYFRONET AGH: InSilicoLab (2013), http://insilicolab.cyfronet.pl
- Jacquet, E., Balbus, S., Latter, H.: On linear dust-gas streaming instabilities in protoplanetary discs. MNRAS 415, 3591–3598 (Aug 2011)
- James, R.A.: The Solution of Poisson’s Equation for Isolated Source Distributions. Journal of Computational Physics 25, 71 (Oct 1977)
- Jin, S., Xin, Z.: The relaxation schemes for systems of conservation laws in arbitrary space dimension. Comm. Pure Appl. Math. 48, 235–276 (1995)
- Johansen, A., Oishi, J.S., Mac Low, M.M., Klahr, H., Henning, T., Youdin, A.: Rapid planetesimal formation in turbulent circumstellar disks. Nature 448, 1022–1025 (Aug 2007)
- Kocot, J., Szepieniec, T., Harężlak, D., Noga, K., Sterzel, M.: InSilicoLab – Managing Complexity of Chemistry Computations. In: Bubak, M., Szepieniec, T., Wiatr, K. (eds.) Building a National Distributed e-Infrastructure–PL-Grid, Lecture Notes in Computer Science, vol. 7136, pp. 265–275. Springer Berlin Heidelberg (2012)
- Kowalik, K., Hanasz, M., Wóltański, D., Gawryszczak, A.: Streaming instability in the quasi-global protoplanetary discs. MNRAS 434, 1460–1468 (Sep 2013)
- Masset, F.: FARGO: A fast eulerian transport algorithm for differentially rotating disks. Astron. Astrophys. Suppl. Ser. 141, 165–173 (Jan 2000)
- Mignone, A., Bodo, G., Massaglia, S., Matsakos, T., Tesileanu, O., Zanni, C., Ferrari, A.: PLUTO: A Numerical Code for Computational Astrophysics. ApJS 170, 228–242 (May 2007)
- Pen, U.L., Arras, P., Wong, S.: A Free, Fast, Simple, and Efficient Total Variation Diminishing Magnetohydrodynamic Code. ApJS 149, 447–455 (Dec 2003)
- PIERNIK Development Team: PIERNIK MHD code (2013), https://github.com/piernik-dev
- Portal PL-Grid: Podręcznik użytkownika. Astrofizyka: InSilicoLab for Astrophysics (2013), https://portal.plgrid.pl/web/guest/podrecznik-pl-grid
- Jenkins Development Team: Jenkins (2011), http://jenkins-ci.org/content/about-jenkins-ci
- Trac, H., Pen, U.L.: A Primer on Eulerian Computational Fluid Dynamics for Astrophysics. PASP 115, 303–321 (Mar 2003)
- Turk, M.J., Smith, B.D., Oishi, J.S., Skory, S., Skillman, S.W., Abel, T., Norman, M.L.: yt: A Multi-code Analysis Toolkit for Astrophysical Simulation Data. ApJS 192, 9 (Jan 2011)
- Weidenschilling, S.J.: Aerodynamics of solid bodies in the solar nebula. MNRAS 180, 57–70 (Jul 1977)
- Wetzstein, M., Nelson, A.F., Naab, T., Burkert, A.: Vine — A Numerical Code for Simulating Astrophysical Systems Using Particles. I. Description of the Physics and the Numerical Methods. ApJS 184, 298–325 (Oct 2009)
- Wóltański, D., Hanasz, M., Kowalik, K.: Cosmic Ray Driven Dynamo in Spiral Galaxies. MNRAS (in prep.) (2014)
- Youdin, A.N., Goodman, J.: Streaming Instabilities in Protoplanetary Disks. ApJ 620, 459–469 (Feb 2005)
- yt Development Team: Grid data format (2011), https://bitbucket.org/yt_analysis/grid_data_format