
EXASCALE COMPUTING
Giovanni Lapenta
Katholieke Universiteit Leuven, Belgium
Stefano Markidis
KTH Royal Institute of Technology, Sweden
Stefaan Poedts
Katholieke Universiteit Leuven, Belgium
Dean Vucinic
Vrije Universiteit Brussel, Belgium
Space Weather Prediction and Exascale Computing

Space weather can have a great effect on Earth's climate. Predicting the impact of space environment disturbances on Earth presents a challenge to scientists. Here, the ExaScience Lab's efforts are presented, which use exascale computing and new visualization tools to predict the arrival and impact of space events on Earth.
The vast interplanetary space between the sun and Earth is a complex, coupled system. Important solar processes travel through the solar system and reach Earth and other planets. The study of these processes affecting people and technology in space and on the ground is called space weather. The impact can range from the beautiful and harmless displays of the auroras in the two polar regions, to severe radiation threats to astronauts in space, to magnetic disturbances inducing damaging currents in ground-level infrastructures such as power and communication lines. There are even indications that Earth's climate might be sensitive to space radiation affecting cloud cover and to the variability of the solar energy output's intensity and spectrum. These are some of the reasons so much recent effort and so many resources have been devoted to studying space weather.1,2
Here, we focus on the steps necessary for achieving a true physics-based capability to predict the potential arrival and consequences of major space weather storms. Great disturbances in the space environment are common, but their precise arrival and impact on human activities vary greatly. Simulating a system to predict these disturbances requires computing resources at the limit of what's possible today with current technology and with foreseeable future generations of so-called exascale supercomputers.
Our work is part of the Intel ExaScience Lab (www.exascience.com) in Leuven, Belgium, and focuses on the critical interplay of accurate space weather modeling with physics-motivated visualization approaches. Simulations of space events, especially when the full physics description is included, are rich sources of data that include details of the microphysics as well as overall macroscopic trends. Both are key to space weather forecasting, where the macroscopic trends (such as predicting a storm's pathway through Earth's space) are as important as the local details (such as the source of high-energy particles at shocks and reconnection regions). Extracting this valuable information in a user-friendly way that can help a forecaster is a challenge and one of the focuses of the ExaScience Lab.
Challenges of Space Weather Modeling
The actors of space weather are the sun, Earth, and the vast space in between. Like any star, the sun is made of a highly energetic and conductive gas, called plasma. In a plasma, the atoms have been broken into their nuclei and electrons, which then become free to move. The sun's hot plasma is confined by gravity and moves in complex patterns that produce large currents and magnetic fields. The gravitational confinement isn't perfect, and the sun emits a highly varying outflow of plasma, called the solar wind, which permeates the entire solar system, reaching Earth. Earth itself has a magnetic field, which makes compasses point toward the North Pole and allows many species of animals to navigate during migrations. This field protects Earth from the incoming solar wind and its disturbances. Only a small number of particles can reach Earth's surface: the so-called cosmic rays. Most of the incoming plasma is stopped and deflected, reaching only the high strata of the atmosphere at the polar regions and causing the aurora (in the form of northern or southern lights) visible to people living at high latitudes.
Space weather simulations must take into account the plasma emitted from the sun and follow its evolution in interplanetary space, focusing on its impact on Earth (see Figure 1). To describe and predict such space weather processes, both the electromagnetic fields and the plasma particles must be modeled and simulated. The plasma's nuclei (mostly protons, the nuclei of hydrogen) and electrons are loosely coupled, and each species has its own typical scales. The electromagnetic fields keep the species coupled, forming a very nonlinear and multiscale system.
Figure 1. Different phases of a space weather event. (a) An artist's view from NASA and (b) images from various space missions (SDO, September 2010; LASCO, September 2002; IMAGE, July 2000). The sun emits a constant solar wind in the form of hot ionized gas and a magnetic field. The solar wind interacts with Earth's magnetic field, forming an elongated magnetosphere that protects Earth's environment from many of the most devastating consequences. Strong perturbations and high-energy particles still penetrate, however, creating serious trouble in space and on Earth. (Diagram labels: solar environment; interplanetary plasmas, dust; sun-to-Earth effect propagation; magnetosphere; magnetotail; polar cusps; ionosphere. SDO = Solar Dynamics Observatory; LASCO = Large Angle and Spectrometric Coronagraph; IMAGE = Imager for Magnetopause-to-Aurora Global Exploration.)

Modeling space weather is a daunting task, because the system is enormous and includes a wide variety of physical processes and time and space scales. Figure 2 shows the typical physical scales observed from space exploration missions in Earth's environment. The scales are presented in the form of an hourglass, with the top part representing the macroscopic scales of evolution and the bottom the microscopic scales. Microscopic and macroscopic scales are tightly coupled.
A computer model of space weather must face this multifaceted challenge by using state-of-the-art mathematical techniques to deal with the integration of nonlinear multiscale systems. The modeling and simulation activities at the ExaScience Lab represent a codesign effort, where expertise in computer science and in the applied science of space weather is developed in close collaboration.
Kinetic Models
The most fundamental approach for modeling particle behavior in Earth's space environment is kinetic. In this approach, the evolution of the distribution function f(r, v, t), which describes the particles' statistical distribution and is related to the probability of finding a particle with velocity v at the position r, is calculated by solving the Vlasov equation:

\[
\frac{\partial f(\mathbf{r},\mathbf{v},t)}{\partial t}
+ \mathbf{v}\cdot\frac{\partial f(\mathbf{r},\mathbf{v},t)}{\partial \mathbf{r}}
+ \frac{q}{m}\left(\mathbf{E}+\mathbf{v}\times\mathbf{B}\right)\cdot\frac{\partial f(\mathbf{r},\mathbf{v},t)}{\partial \mathbf{v}} = 0,
\]
where q and m are the particle charge and mass, and E and B are the electric and magnetic fields generated by the particles in Earth's space environment. From the particle statistical distribution, it's possible to calculate the charge density (ρ_c) and current density (J), also called moments, by computing the local statistical averages:

\[
\rho_c = q\int f(\mathbf{r},\mathbf{v},t)\,d\mathbf{v}, \qquad
\mathbf{J} = q\int \mathbf{v}\,f(\mathbf{r},\mathbf{v},t)\,d\mathbf{v}.
\]
The motion of Earth's space environment plasma is driven by the electric and magnetic fields, and (at the same time) the moving plasma generates the fields. The evolution of the electric and magnetic fields, given the plasma charge and current densities (ρ_c and J), is determined by the Maxwell equations:

\[
\nabla\cdot\mathbf{E} = \frac{\rho_c}{\epsilon_0}, \qquad
\nabla\cdot\mathbf{B} = 0, \qquad
\nabla\times\mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}, \qquad
\nabla\times\mathbf{B} = \mu_0\mathbf{J} + \mu_0\epsilon_0\frac{\partial \mathbf{E}}{\partial t}.
\]
The Vlasov and Maxwell equations form a coupled system, since the Vlasov equation depends on the electric and magnetic fields, and the Maxwell equations depend on the distribution function via the moments ρ_c and J. Scientists have developed many numerical techniques to solve these equations and predict the behavior of plasma in Earth's space environment. One of the most successful approaches has been the particle-in-cell (PIC) method.3 In this numerical technique, a statistical sample of computational particles represents the initial distribution function. The distribution function's evolution is then calculated by computing the position r_p and velocity v_p of the computational particles with mass m_p and charge q_p, solving Newton's equations of motion:
\[
\frac{d\mathbf{r}_p}{dt} = \mathbf{v}_p, \qquad
\frac{d\mathbf{v}_p}{dt} = \frac{q_p}{m_p}\left(\mathbf{E} + \mathbf{v}_p\times\mathbf{B}\right).
\]
At each computational cycle, the particles are advanced, the charge and current densities are calculated at the grid points of a mesh by interpolation, and the Maxwell equations are solved numerically by standard techniques for hyperbolic partial differential equations.
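As a concrete illustration of this cycle, the following sketch implements a deliberately simplified one-dimensional electrostatic PIC loop in Python (our own toy example, not the electromagnetic scheme or production code used in this work): particles are deposited on a grid with linear weighting, a periodic Poisson solve stands in for the full Maxwell solver, and the field is gathered back to push the particles.

import numpy as np

# Simplified 1D electrostatic PIC cycle (toy example): linear charge
# deposition, a periodic FFT Poisson solve standing in for the Maxwell
# solver, a field gather, and a particle push. Normalized units with
# q/m = -1 for the electrons and a uniform neutralizing ion background.

L, ncells, npart, dt, nsteps = 2 * np.pi, 64, 20000, 0.1, 200
dx = L / ncells
rng = np.random.default_rng(0)

x = rng.uniform(0.0, L, npart)            # particle positions
v = rng.normal(0.0, 0.1, npart)           # particle velocities
v += 0.01 * np.sin(x)                     # small perturbation to excite a wave
qp = -L / npart                           # charge per macro-particle

for step in range(nsteps):
    # 1) Deposit charge on the grid with linear (cloud-in-cell) weighting.
    cell = np.floor(x / dx).astype(int) % ncells
    w = x / dx - np.floor(x / dx)         # fractional distance to the left node
    rho = np.zeros(ncells)
    np.add.at(rho, cell, qp * (1.0 - w) / dx)
    np.add.at(rho, (cell + 1) % ncells, qp * w / dx)
    rho += 1.0                            # uniform ion background (neutrality)

    # 2) Field solve: periodic Poisson equation via FFT, then E = -d(phi)/dx.
    k = 2 * np.pi * np.fft.fftfreq(ncells, d=dx)
    rho_k = np.fft.fft(rho)
    E_k = np.zeros_like(rho_k)
    E_k[1:] = -1j * rho_k[1:] / k[1:]     # E_k = -i k phi_k, with phi_k = rho_k / k^2
    E = np.real(np.fft.ifft(E_k))

    # 3) Gather the field at the particle positions and advance the particles.
    Ep = E[cell] * (1.0 - w) + E[(cell + 1) % ncells] * w
    v += -1.0 * Ep * dt                   # q/m = -1 (electrons)
    x = (x + v * dt) % L

Even this toy loop is subject to the explicit stability limits discussed next: the time step must resolve the electron plasma oscillation, and the grid spacing the Debye length.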
Figure 2. The physical scales of Earth's space environment: typical scales observed during missions of exploration in Earth's space environment, with spatial scales on the left and temporal scales on the right of an hourglass diagram. Space plasmas are made of electrons and ions (mostly protons, the nuclei of hydrogen). Electrons are much lighter, and their scales are orders of magnitude smaller. A tremendous spread is present, requiring advanced modeling techniques and the largest computing resources conceivable. (The hourglass runs from the fluid, system-level scales at the top, where the overall system spans millions of kilometers with L = 10,000 km and ion scales ρ_i = d_i = 1,000 km reachable at petascale, down to the kinetic electron scales d_e = 100 km, ρ_e = 10 km, and λ_e = 100 m, which require exascale; the temporal scales range correspondingly from hours down to sub-millisecond electron time scales.)

Although PIC simulations are highly successful in modeling microphysics phenomena, their applicability to predicting the macroscopic behavior of plasmas in space weather events is limited. If Newton's and Maxwell's equations are integrated in time by explicit techniques, numerical constraints on the simulation time step and grid spacing arise. To retain numerical stability, explicit PIC simulations must use a time step and grid spacing that are a fraction of the smallest electron scales in Earth's space environment (Figure 2). Even with future exascale computing platforms, it's clear that the explicit PIC method won't cover all the space and time scales present in space plasmas or be able to predict macroscopic space weather events. To describe such large macroscopic phenomena, encompassing this extreme range of scales, implicit modeling approaches must be adopted, as we'll describe.4
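For reference, the standard constraints on the explicit scheme (see Birdsall and Langdon3) are roughly

\[
\omega_{pe}\,\Delta t \lesssim 2, \qquad \Delta x \lesssim \text{a few}\;\lambda_D,
\]

where ω_pe is the electron plasma frequency and λ_D the Debye length; exceeding the grid limit triggers numerical heating through the finite-grid instability.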
Fluid Models
The fluid approach focuses on the macroscopic plasma dynamics, which is determined by the (moving) plasma's interaction with the geometry of the magnetic field. This continuum model considers the influence of the magnetic fields on the motion of the plasma as a whole (and vice versa) and isn't concerned with the separate electrons and ions. The theoretical tool to describe this macroscopic behavior is called magnetohydrodynamics (MHD). The macroscopic description incorporates averages over the microscopic dynamics, requiring:

• typical length scales of the investigated phenomenon that are much larger than the microscopic plasma length scales (such as the gyroradii of the electrons and ions and the electrostatic responses);

• that the plasma density be high enough that collisions between electrons and ions are frequent and the plasma is in thermodynamic equilibrium with a distribution function close to Maxwellian (for example, the studied phenomena's typical time scales are much larger than the collision times, and the typical length scales are much larger than the particles' mean free path length); and

• that the plasma temperature be high enough that the ionization degree is quite high and the single-fluid approach holds.5
The continuum (or fluid) model thus provides a macroscopic description of plasmas in terms of averaged functions f(r, t). Hence, the macroscopic equations no longer involve the details of velocity space. They can be obtained through a systematic procedure by expanding a finite number of moments of the Boltzmann equation (which consists of the aforementioned Vlasov equation, but with a collision term on the right-hand side representing the distribution function's rate of change due to short-range interparticle interactions, often somewhat arbitrarily called collisions). These moments are obtained by first multiplying the expressions with powers of v and then integrating over velocity space, and they involve the moments of the distribution function itself. Indeed, the zeroth moment is associated with the particle density, and the first moment is associated with the average velocity. The expansion thus must be truncated after a limited number of terms. The justification of this truncation is part of transport theory.5
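For concreteness, and using the kinetic notation introduced earlier, the first two of these moments take the familiar form

\[
n(\mathbf{r},t) = \int f(\mathbf{r},\mathbf{v},t)\,d\mathbf{v}, \qquad
\mathbf{u}(\mathbf{r},t) = \frac{1}{n(\mathbf{r},t)}\int \mathbf{v}\,f(\mathbf{r},\mathbf{v},t)\,d\mathbf{v},
\]

where n is the number density and u the average (fluid) velocity.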
The ideal (nondissipative) MHD equations consist of the continuity equation, the momentum equation, the induction equation (which, together with Ampère's law without the displacement current, describes the magnetic field's evolution), and a temperature, pressure, or energy evolution equation. The strength of this simple mathematical model lies in two facts:

• The equations are scale independent: for example, their dimensionless form doesn't depend on the plasma's size, the magnetic field's magnitude, or the time scale (they can thus be used to model the global dynamics of a huge variety of plasmas, including laboratory plasmas, coronal plasmas, galactic plasmas, and so on).

• They can be written in conservation form, expressing the conservation of a plasma's main macroscopic quantities (mass, momentum, energy, and magnetic flux).
In the latter formulation, the ideal MHD equations read as follows:

\[
\frac{\partial}{\partial t}
\begin{pmatrix}
\rho \\[2pt]
\rho\mathbf{v} \\[2pt]
\tfrac{1}{2}\rho v^{2} + \rho e + \tfrac{1}{2}B^{2} \\[2pt]
\mathbf{B}
\end{pmatrix}
+ \nabla\cdot
\begin{pmatrix}
\rho\mathbf{v} \\[2pt]
\rho\mathbf{v}\mathbf{v} + \left(p + \tfrac{1}{2}B^{2}\right)\mathbf{I} - \mathbf{B}\mathbf{B} \\[2pt]
\left(\tfrac{1}{2}\rho v^{2} + \rho e + p + B^{2}\right)\mathbf{v} - (\mathbf{v}\cdot\mathbf{B})\,\mathbf{B} \\[2pt]
\mathbf{v}\mathbf{B} - \mathbf{B}\mathbf{v}
\end{pmatrix}
=
\begin{pmatrix}
0 \\ 0 \\ 0 \\ 0
\end{pmatrix},
\]

where ρ denotes the plasma mass density, p the thermal pressure, v the average fluid velocity, and e = p/((γ − 1)ρ), with γ the ratio of specific heats, the internal energy per unit mass, while I is the unit tensor (the magnetic field is expressed in units in which the magnetic permeability is unity).
The conservation form of the MHD equations can be exploited numerically. In the successful finite volume method, the 3D space is discretized by splitting it up into many small (finite) volumes and then applying these conservation laws to each of the small volumes. For example, we can calculate the fluxes through each of the boundary surfaces and update the quantities according to the net flux through all boundaries of the corresponding grid cell. Scientists have used this method successfully for simulating nonlinear plasma dynamics, especially in combination with upwind shock-capturing numerical schemes. The solenoidal condition, ∇·B = 0, constitutes a physical constraint that yields a numerical complication, as special measures must be taken to ensure this additional condition everywhere and at all times.
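To make the flux-differencing idea concrete, here is a minimal one-dimensional finite-volume update for a scalar conservation law with a first-order upwind flux (a toy example of ours; the MHD solvers described above apply the same conservative update to the full vector of conserved quantities in 3D, with far more elaborate upwind, shock-capturing flux functions and a special treatment of ∇·B = 0):

import numpy as np

# Minimal 1D finite-volume update for a scalar conservation law
# u_t + (a u)_x = 0 with a first-order upwind flux and periodic boundaries.
# Each cell stores an average; the update is the net flux through its faces.

a = 1.0                                   # constant advection speed (a > 0)
ncells, L_domain, cfl = 200, 1.0, 0.8
dx = L_domain / ncells
dt = cfl * dx / abs(a)
xc = (np.arange(ncells) + 0.5) * dx       # cell-center coordinates
u = np.exp(-200.0 * (xc - 0.3) ** 2)      # initial cell averages (Gaussian pulse)

for step in range(100):
    # Upwind numerical flux: at interface i+1/2 the flux is a*u_i when a > 0.
    flux = a * u
    # Conservative update: subtract dt/dx times the net outflux of each cell.
    u = u - dt / dx * (flux - np.roll(flux, 1))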
Coupling Fluid and Kinetic Models with the Implicit Moment Method
The implicit moment method provides a mathematical framework to bring together the fluid and kinetic scales6 (see Figure 3). The fields and particles are studied together in a coupled manner. The word implicit refers to the method's ability to advance the numerical simulation of both fields and particles together without any lag between the two (compared with explicit methods, where the time lag is a typical problem). The word moment refers to the use of moments of the particle statistical distribution.

The implicit moment method has a solid track record, and its properties have been studied in theory and practice in many space applications. Recently, two of us developed the new code iPic3D, which the ExaScience Lab is using and further developing.7
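Schematically, and omitting the details of the spatial interpolation, the particle velocity update in such an implicit scheme couples to the field at the advanced time level,

\[
\mathbf{v}_p^{\,n+1} = \mathbf{v}_p^{\,n} + \frac{q_p\,\Delta t}{m_p}\left(\mathbf{E}^{\,n+\theta}(\bar{\mathbf{r}}_p) + \bar{\mathbf{v}}_p \times \mathbf{B}^{\,n}(\bar{\mathbf{r}}_p)\right),
\qquad
\bar{\mathbf{v}}_p = \tfrac{1}{2}\left(\mathbf{v}_p^{\,n} + \mathbf{v}_p^{\,n+1}\right),
\]

with E^{n+θ} = (1 − θ)E^n + θE^{n+1} and 1/2 ≤ θ ≤ 1, so that the new fields and the new particle velocities are computed consistently rather than with the one-step lag of explicit schemes.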
The implicit moment method lets the user select the local level of resolution according to the scales of the local processes. This feature enables the modeling of space weather events with minimum effort, by increasing the resolution only where it's absolutely needed. However, even with this method, current petascale computers can simulate only subsections of a typical space weather event; to perform simulations with existing resources, macroscopic models without a proper treatment of the microscales must be used. In addition, to reduce the computational burden, some of the processes are neglected or approximated with adjustable parameters and heuristic assumptions, reducing the approach's predictive quality. To overcome these drawbacks and achieve a truly predictive model of space weather events, exascale computing resources are needed to produce a more accurate solution.
The Need for Exascale
Figure 4 illustrates the current state of the art in space weather modeling. The full description based on physically sound first principles requires us to resolve the electron scales in highly localized regions of the system where energy conversion processes develop. In the figure, we report a state-of-the-art simulation based on existing macroscopic models that don't capture the full microscopic physics. To reach truly predictive capability, exascale resources will be needed.
A simple calculation provides the reason. Missions of exploration and models provide a clear indication of the scales involved (see Figures 1 and 2). The simulation box needed for modeling Earth's environment must be on the order of 100 Earth radii in each of the three spatial directions (100R_E × 100R_E × 100R_E), where Earth's radius is 6,353 kilometers. As Figure 2 shows, the smallest spatial scales are the electrostatic responses (the so-called Debye length), at about 100 meters. Clearly, simulating the whole box of more than 600,000 km with a resolution of 100 m would require impossible resources for the foreseeable future. Yet that's exactly what the most common methods in use would need to do. The explicit methods must resolve the smallest electrostatic responses, and Figure 4b reports the impractical requirements of such a calculation. Our experience leads us to assume that a single core of current design can fit at most 16 × 16 × 16 cells, with hundreds of particles for each species (electrons and ions). With that assumption, a current state-of-the-art explicit code would require 63 million billion cores. Not even exascale can provide that. Note that this estimate is pessimistic: the electron scales are nonuniform in space, and in many regions the explicit method can use larger granularity, leaving still greater areas of application for explicit methods. However, global modeling can't be based on the explicit method.

Figure 3. Agents in the implicit moment method. The plasma particles (electrons in orange and ions in yellow) evolve in a grid for the electric and magnetic fields. The Maxwell equations control the fields, and the Newton equations control the particles. The implicit moment method introduces the moments in Lagrangian form to handle the multiple scales and to allow the user the desired accuracy. (Diagram labels: particles X_p, V_p; fields E, B; moments ρ, J; coupled by the Newton, Maxwell, and Lagrange relations.)
In contrast, at the ExaScience Lab, the focus is on implicit moment methods that let us select the local accuracy by applying two approaches. First, the implicit method eliminates the need to resolve the electrostatic scales everywhere: it averages over them and can lower the resolution to the electron electromagnetic response (the electron inertial range), at about 10 km. This mathematical modeling approach saves two orders of magnitude in each direction and allows a corresponding reduction in the number of time steps needed. The user can raise the bar on Figure 2's hourglass by two notches in space and in time; keeping the bar horizontal, the implicit moment method lets us relax the resolution requirements in space and in time concurrently. A uniform grid with the implicit moment method still requires 63 billion cores. This is closer to being within reach, but it remains outside the reach of even future exascale computers.
Figure 4. Physics-based model of space weather. (a) An output of a 3D simulation with the iPic3D code. The regions requiring additional resolution are identified with arrows; only a very thin layer in each place needs to be resolved. (b) The cost of an explicit particle-in-cell (PIC) fully kinetic simulation of the same region would require much more computing power, as the following table shows.*

Method                              Cells per dimension      Cores
Explicit approach                   6,353,000                6.2600e+16
Implicit moment method              63,530                   6.2600e+10
Implicit moment method with AMR**   Coarse: 635              Coarse level: 6.2600e+04
(adaptivity on localized regions)                            Refinement: 1.9812e+06

*On petascale (order 100,000 cores) supercomputers, it isn't possible to simulate space weather events with full resolution, even with the most advanced modeling method. Exascale will instead allow a fully resolved model.
**AMR = Adaptive Mesh Refinement.

Second, using adaptive grids focuses the resolution only on regions where the simulations
truly need to resolve the electron dynamics. However, the resolved regions are highly limited subsets (see Figure 4) of reduced dimensionality: thin crusts over the surfaces dividing different types of plasma (such as the solar wind from Earth's magnetosphere). A reasonable assumption, guided by practical experience, is to use a layer of 10 cores away from the target surfaces. Additionally, we assume the impacted surfaces to be about 10 percent of a sphere encircling Earth at the distances of interaction with the solar wind. We can then compute an estimate for the total number of cores needed. For large-scale modeling of the whole system, resolving just the ion scales, a very manageable number of cores is needed: 62,000. This is within reach of petascale computing and indeed is currently the state of the art, but to resolve the electron physics and avoid ad hoc assumptions that limit the simulation's predictive value, the Adaptive Mesh Refinement (AMR) approach will need an additional 1.98 million cores. All together, the first-ever predictive simulation based on full physics modeling will require about 2 million cores and will be feasible on the first exascale computers.
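As a rough cross-check of these numbers (our own back-of-the-envelope sketch, not code from the project), the estimates in Figure 4b follow from the box size, the grid spacings of Figure 2, and the 16 × 16 × 16 cells-per-core assumption; the sphere radius used for the refined layer isn't stated in the text, but assuming roughly 10 R_E, a typical magnetopause standoff distance, reproduces the quoted 1.98 million cores.

import math

# Back-of-the-envelope reproduction of the core counts in Figure 4b.
# Assumptions from the text: a (100 R_E)^3 box with R_E = 6,353 km and at
# most 16 x 16 x 16 cells per core; grid spacings from Figure 2:
# 100 m (Debye length), 10 km (electron inertial length), 1,000 km (ion scale).

R_E_km = 6353.0
box_km = 100 * R_E_km                       # about 635,300 km per side
cells_per_core_dim = 16

def cores_for(resolution_km):
    cells_per_dim = box_km / resolution_km
    return cells_per_dim, (cells_per_dim / cells_per_core_dim) ** 3

for label, res_km in [("explicit (Debye length, 0.1 km)", 0.1),
                      ("implicit (electron inertia, 10 km)", 10.0),
                      ("coarse ion-scale grid (1,000 km)", 1000.0)]:
    cells, cores = cores_for(res_km)
    print(f"{label}: {cells:,.0f} cells/dim, ~{cores:.2e} cores")

# Refined AMR layer: electron-resolving cells only in a layer 10 cores thick
# over ~10 percent of a sphere around Earth. A radius of ~10 R_E (assumed
# here; not given explicitly in the text) reproduces Figure 4b's 1.98e+06.
r_cells = 10 * R_E_km / 10.0                # sphere radius measured in 10-km cells
surface_cells = 0.1 * 4 * math.pi * r_cells ** 2
refined_cores = 10 * surface_cells / cells_per_core_dim ** 2
print(f"AMR refinement layer: ~{refined_cores:.2e} extra cores")  # ~1.98e+06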
Visualization of Space Weather Simulations at the Exascale
As we've mentioned, the space weather physics is defined by electromagnetic continuum field quantities (at the macro scale) and by the plasma's discretized particle representation (at the micro scale), computed by the implicit moment-based simulations (Figure 3). Thus, exascale visualization is three orders of magnitude more extreme than the visualization found in current petascale simulation practice.8-10
In general, the visualization goal in the ExaScience project is to provide visual aids tangible to the end users (space weather experts), with the expectation of improving space weather forecasts. As the respective simulation datasets become extremely large, the traditional visualization workflow, which stores the data to disk and then visualizes them in a separate step, becomes completely infeasible.

Instead, the ExaScience project adopted in situ visualization (see Figure 5). It's coupled locally with the simulation, thus allowing analysis while the simulation is running and obviating the need to store data. In addition, it removes the inefficient data movement present in the traditional visualization pipeline. The proposed in situ visualization has been implemented using a ray-tracing algorithm,11,12 which leads to a scalable solution by consuming at most 10 percent of the total simulation time.13
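The pattern can be sketched as a solver loop that hands its live, in-memory fields to an analysis or rendering callback every few steps, with nothing written to disk in between. The toy Python below only illustrates that coupling; the actual ExaScience implementation links a parallel ray-tracing renderer directly into iPic3D.

import numpy as np

def insitu_hook(step, fields, every=10):
    """Toy in situ analysis: reduce/render the live solver state in memory."""
    if step % every != 0:
        return
    pressure = fields["pressure"]
    # Stand-ins for an in-memory render call: a coarse histogram and a count
    # of cells above 90 percent of the peak pressure.
    hist, _ = np.histogram(pressure, bins=32)
    n_hot = int((pressure > 0.9 * pressure.max()).sum())
    print(f"step {step}: {n_hot} cells above 90% of peak pressure")

def run_solver(nsteps=50, shape=(32, 32, 32)):
    rng = np.random.default_rng(0)
    fields = {"pressure": np.ones(shape)}
    for step in range(nsteps):
        # ... advance particles and fields here (omitted in this sketch) ...
        fields["pressure"] += 0.01 * rng.random(shape)
        insitu_hook(step, fields)   # analysis runs inside the solver loop

run_solver()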
Figure 5. Traditional visualization pipeline compared to the ExaScience in situ visualization pipeline. (a) Traditional: solver, then disk storage, then visualization. (b) In situ: the iPIC3D solver coupled directly with visualization.

Figure 6. Frame of a simulation of an erupting solar filament during a coronal mass ejection. The density isosurface and a cutting plane showing the pressure distribution (color scale roughly 0.7 to 1.0), together with the magnetic field streamlines colored by the pressure, viewed from two different viewpoints: (a) a front view and (b) a side-top view of the sun's coronal loop.
The developed visualization scenario forms a predefined set of selected visualization algorithms and simulation data, with examples shown in Figures 6-8. The approach aims to enable fast, real-time-like processing, bringing new insight into the complex physics models and their validation against acquired satellite data. As the space weather exascale model closes the gap between continuum plasma models and kinetic particle models, multiple physics descriptions must be handled by the visualization scenario. The visualization of these two respective models will need the following:

• vector field lines (frozen time) and particle trajectories (time-varying); see Figures 6 and 8;

• volume visualization (isosurface rendering being a special case) of average quantities defined in AMR cells, expected to vary dramatically in size, being highly refined near instability fronts and high-gradient regions (see Figure 7);

• particle phase space cuts: the visualization of particle density as a function of velocity and/or location components, by means of 2D or 3D histograms (see the sketch after this list); and

• virtual telescopes: volume visualization taking into account time lag, to compare the simulation results with spacecraft observations.
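As an illustration of the third item, a phase-space cut is essentially a 2D histogram over one position coordinate and one velocity component; the short Python sketch below (our own toy example with synthetic particles, not project code) shows the kind of reduction involved.

import numpy as np

# Toy particle arrays; in practice x and vx would come from the PIC
# particle storage (possibly gathered or sampled across processes).
rng = np.random.default_rng(1)
x = rng.uniform(0.0, 1.0, 500_000)                     # normalized positions
vx = rng.normal(0.0, 1.0, 500_000) + 2.0 * (x > 0.5)   # a drifting population

# Bin the particles on a (position, velocity) grid; the counts approximate
# the phase-space density f(x, vx) and can be rendered as a 2D image
# (a logarithmic color scale is common for phase space).
H, xedges, vedges = np.histogram2d(x, vx, bins=[128, 128],
                                   range=[[0.0, 1.0], [-5.0, 7.0]])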
The third and fourth items will be developed within the ExaScience project in the near future.

The project's major challenge will be to help space weather experts easily visualize the simulation during its execution, enabling interactive, on-the-fly computational steering. Multiple, linked display windows with predefined visualization scenarios will automate the visualization process.14 Remote, collaborative visualization will improve efficient use of the available exascale environment. The final ExaScience application will be equipped with a GUI for navigating through multidimensional, multivariable representations, rendered by the ray-tracing graphics engine, to control the full set of modeling parameters: (a) within the simulation, to interact with the running solver, and (b) to guide the interactive visualization and analysis processes.
Space weather simulation poses a great challenge to modelers: multiple processes developing on widely different scales, each represented by different physics models. Modeling and understanding these conditions requires a coordinated effort focusing on

• new mathematical models for coupling multiscale and multiphysics models;

• implementation of the models in scalable software that can reach high performance on future exascale machines; and

• new visualization tools to facilitate analysis, enable insight, and reveal the physics to space weather forecasters and other possible users, who aren't necessarily trained to use high-performance computing resources.
Figure 7. Alternative visualization of the same solar eruption simulation as in Figure 6. Two times are shown, (a) earlier and (b) later, where the filament is breaking up and magnetic reconnection alters the system's topology. Multiple transparent isodensity regions (with the same density data as Figure 6) are represented using volume rendering.

Figure 8. Reconnection site on Earth's magnetosphere. The region of reconnection and the ejected plasma form two areas of intensified density. Different views of the electron densities are shown, represented using volume rendering.
Here, we've discussed the first results of the ongoing efforts at the ExaScience Lab, focusing on the models we've implemented and on the visualization approach we're pursuing. This work is in progress and requires further efforts to develop an extremely scalable, fault-tolerant simulation toolkit and a hardware simulation model for the large-scale computer systems and their envisaged workloads. This codesign development will let us exploit the full potential of future exascale computer architectures that are specially adapted for space weather forecasting.
Acknowledgments
Support was provided by the European Commission Seventh Framework Programme (FP7/2007-2013) under grant agreement 263340 (the SWIFF project; www.swiff.eu) and by the Intel ExaScience Lab (www.exascience.com).
References
1. V. Bothmer and I.A. Daglis, Space Weather: Physics and Effects, Springer, 2007.
2. D.N. Baker, "How to Cope with Space Weather," Science, vol. 297, no. 5586, 2002, pp. 1486-1487.
3. C.K. Birdsall and A.B. Langdon, Plasma Physics via Computer Simulation, Taylor & Francis, 2004.
4. G. Lapenta, "Particle Simulations of Space Weather," J. Computational Physics, vol. 231, no. 3, 2011, pp. 795-821.
5. J.P.H. Goedbloed and S. Poedts, Principles of Magnetohydrodynamics, Cambridge Univ. Press, 2004.
6. J.U. Brackbill and D.W. Forslund, "An Implicit Method for Electromagnetic Plasma Simulation in Two Dimensions," J. Computational Physics, vol. 46, no. 2, 1982, pp. 271-308.
7. S. Markidis, G. Lapenta, and Rizwan-uddin, "Multi-Scale Simulations of Plasma with iPIC3D," Mathematics and Computers in Simulation, vol. 80, no. 7, 2010, pp. 1509-1519.
8. K.L. Ma et al., "Next-Generation Visualization Technologies: Enabling Discoveries at Extreme Scale," SciDAC Rev., vol. 12, 2009, pp. 12-21.
9. R.B. Ross et al., "Visualization and Parallel I/O at Extreme Scale," J. Physics: Conf. Series, vol. 125, no. 1, 2008; doi:10.1088/1742-6596/125/1/012099.
10. K.L. Ma et al., "Ultra-Scale Visualization: Research and Education," J. Physics: Conf. Series, vol. 78, no. 1, 2007; doi:10.1088/1742-6596/78/1/012088.
11. I. Wald et al., "Interactive Rendering with Coherent Ray Tracing," Computer Graphics Forum, vol. 20, no. 3, 2001, pp. 153-165.
12. L. Seiler et al., "Larrabee: A Many-Core x86 Architecture for Visual Computing," ACM Trans. Graphics, vol. 27, no. 3, 2008; doi:10.1145/1360612.1360617.
13. T. Haber et al., "Exascale In-Situ Visualization Using Raytracing," Proc. Int'l Conf. High Performance Computing, Networking, Storage and Analysis Companion, ACM, 2011, pp. 15-16.
14. D. Vucinic, Development of a Scientific Visualization System CFView: Computational Field Visualization System and Object-Oriented Software Methodology, LAP Lambert Academic, 2010.
Giovanni Lapenta is a professor at the Katholieke Universiteit Leuven, Belgium, a leader of applications at the Intel ExaScience Lab in Leuven, and a guest scientist at the Los Alamos National Laboratory and the University of Colorado. His research interests include plasma physics and radiation transport with astrophysical and industrial applications, and high-performance computing applied to engineering and astrophysical problems. Lapenta has a PhD in engineering from Politecnico di Torino. Contact him at giovanni.lapenta@wis.kuleuven.be.

Stefano Markidis is a postdoctoral fellow at the KTH Royal Institute of Technology, Sweden, and a capita selecta professor at Katholieke Universiteit Leuven, Belgium. His research interests include high-performance computing, new programming models and algorithms for exascale computing, and magnetic reconnection in space and astrophysical plasmas. Markidis has a PhD in nuclear engineering from the University of Illinois at Urbana-Champaign. Contact him at markidis@pdc.kth.se.

Stefaan Poedts is a full professor at the Katholieke Universiteit Leuven, Belgium. His research interests include solar astrophysics, space weather, solar wind modeling, and coronal mass ejections. Poedts has a PhD in mathematics from Katholieke Universiteit Leuven. Contact him at stefaan.poedts@wis.kuleuven.be.

Dean Vucinic is a professor in the Mechanical Engineering and Electronics and Informatics departments at the Vrije Universiteit Brussel, Belgium. His research interests include visualization and data analysis. Vucinic has a PhD in engineering from Vrije Universiteit Brussel. Contact him at dean.vucinic@vub.ac.be.