
COAWST


Users Manual


Version 3.0




http://woodshole.er.usgs.gov/operations/modeling/COAWST/index.html




April 18, 2014
Table of Contents.

1. Introduction
2. Obtaining Code
3. Installation
4. Setting up and running applications
5. Examples
5a) JOE_TCs = WRF-ROMS-SWAN, all 3 on the same grid.
Also can be used for WRF-ROMS, WRF-SWAN, or WRF only.
5b) JOE_TCd = WRF-ROMS-SWAN with ROMS and SWAN on the same grid,
WRF on a different grid.
5c) INLET_TEST/Coupled = ROMS+SWAN, same grid.
5d) INLET_TEST/DiffGrid = ROMS+SWAN, different grids.
5e) INLET_TEST/Refined = ROMS+SWAN, same grid + grid refinement in each.
5f) INLET_TEST/Swanonly = SWAN only, one grid or with grid refinement.
5g) Rip_current = ROMS+SWAN coupled, rip current example.
5h) Shoreface = ROMS only, driven from a SWAN forcing file.
5i) Headland = ROMS+SWAN for a promontory.
6. Running COAWST
7. Visualization Tools.
8. How to set up a WRF application.
9. How to get boundary, init, and climatology data for ROMS grids.
10. How to get wind, boundary, and init files for SWAN.
11. m files.
12. Change log.
13. Distribution text and user list.
14. List of references using COAWST.


COAWST Users Manual

1. Introduction
The Coupled Ocean-Atmosphere-Wave-Sediment Transport (COAWST)
Modeling System is an agglomeration of open-source modeling components that has
been tailored to investigate coastal processes involving the atmosphere, ocean, waves,
and sediment transport in the coastal environment. The modeling system currently
comprises:

Coupler: - Model Coupling Toolkit (MCT) v 2.6.0
Ocean: - Regional Ocean Modeling System (ROMS) svn 455
Atmosphere: - Weather Research and Forecasting Model (WRF) v 3.4
Wave(s): - Simulating Waves Nearshore (SWAN) v 40.91A
- Refraction Diffraction (RefDif)
Sediment Transport: - Community Sediment Transport Modeling Systems (CSTMS)

Here are some model-specific user forums that can provide useful information:

ROMS
https://www.myroms.org/forum/index.php

WRF
http://forum.wrfforum.com/

SWAN
http://swanmodel.sourceforge.net/
SWAN has been preprocessed with options:
perl switch.pl -unix -impi -mpi -f95 -netcdf *.ftn *.ftn90
I then changed all of the *.f and *.f90 files to *.F.

MCT
http://www.mcs.anl.gov/research/projects/mct/

The main reference is:
Warner, J.C., Armstrong, B., He, R., and Zambon, J.B., 2010, Development of a Coupled
Ocean-Atmosphere-Wave-Sediment Transport (COAWST) modeling system: Ocean
Modeling, v. 35, no. 3, p. 230-244.

The main website is:
http://woodshole.er.usgs.gov/operations/modeling/COAWST/index.html

2. Obtaining Code
The code is maintained in an svn repository and is distributed through subversion
checkout. Please contact John Warner for code access at jcwarner@usgs.gov.



A COAWST Modeling System Training was held at the USGS Woods Hole Science
Center from July 23-27, 2012. It was attended onsite by 46 scientists from more than 8
countries and was also viewed online via Webex. It was a great success and helped to
foster collaboration amongst the community. We have posted the ppts (some converted
to pdfs), the group photo, agenda, and recordings of the presentations to the svn site. To
access these, use the svn checkout command:

svn checkout https://coawstmodel.sourcerepo.com/coawstmodel/data/training_23jul2012 .

The presentations were recorded using Webex. To view them, you can extract the
nbr2player.msi to install the viewer. Alternatively, I have converted all of them to
Windows Media Player files. The files are large. If you want them, send me a request and
I will put them on an ftp site.

3. Installation
3.1 Obtain the source code via the svn repository. You should get an email with your
username and password. To check out the code I suggest you make a directory called
COAWST, cd to that dir, and use:

svn checkout --username myusrname
https://coawstmodel.sourcerepo.com/coawstmodel/COAWST .

Notice the "." at the end of the command to place the code in that location. Alternatively,
instead of the "." you can provide any path where you want the code to go.

3.2 Install required libraries
The following libraries are required to run the coupled modeling system. These are:

Netcdf
Fortran Compiler
MPI
MCT
SCRIP

These libraries/programs (SCRIP is a program, the others are libraries) only need to be
installed once, typically by an administrator, and set to be available for all users. Users
must obtain NetCDF, a Fortran compiler, and MPI themselves. Currently the system
uses netcdf v3.x. It has been tested with ifort, pgi, and some gfortran Fortran compilers.
For MPI, we have used mvapich, openmpi, and MPICH2. Set your environment
variables, for example as shown here:

setenv MCT_INCDIR /he_data/he/jbzambon/MCT/mct
setenv MCT_LIBDIR /he_data/he/jbzambon/MCT/mct
setenv NETCDF_INCDIR /usr/local/apps/netcdf-3.6.1/pgi/6.0/x86_64/include
setenv NETCDF_LIBDIR /usr/local/apps/netcdf-3.6.1/pgi/6.0/x86_64/lib
setenv NETCDF /usr/local/apps/netcdf-3.6.1/pgi/6.0/x86_64/lib

The four _INCDIR and _LIBDIR settings for MCT and NETCDF are used by ROMS.
The NETCDF setting is used by WRF.

The MCT and SCRIP packages are distributed with the COAWST package because I
modified a few parts of these codes to make them compatible with the systems we have
tested. The MCT and SCRIP installs are described below.

3.2.1 MCT installation
1) Copy the MCT package from COAWST/Lib/MCT and place into a folder that
can be shared to all users. It is recommended to only install this once, and just let
all users link to this for all their compilations.
2) ./configure
This will create a file Makefile.conf. You should edit this file and check that the paths
etc. are correct. I included the file Makefile.conf_jcw as an example of how my
setup was on a Windows machine.
3) make
This will build the package in both the mct and mpeu directories. I modified the
Makefiles in both of those directories so that they can also work in a Windows
world. The Windows compilers typically build *.obj files, instead of just *.o files.
So I modified the mct/Makefile and mpeu/Makefile for the $(F90RULECPP).
You can edit those files and see the changes.
4) make install
This will place the MCT libs and inc files into the /usr/lib and /usr/include folders.
The location can be modified, look at the MCT/README file.
5) Set your environment variables of
setenv MCT_INCDIR COAWST/Lib/MCT/include
setenv MCT_LIBDIR COAWST/Lib/MCT/lib
(or wherever you installed them)
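
Putting steps 1-5 together, a minimal install transcript might look like this (a sketch; it
assumes the install location was changed to /usr/local/MCT as described in MCT/README):

cd /usr/local/MCT        # shared copy of COAWST/Lib/MCT
./configure              # creates Makefile.conf; check the paths
make                     # builds the mct and mpeu directories
make install             # installs the libs and include files
setenv MCT_INCDIR /usr/local/MCT/include
setenv MCT_LIBDIR /usr/local/MCT/lib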

3.2.2 SCRIP Installation
The SCRIP package is used to create interpolation weights between two separate
grids.
1) Create a working folder for the SCRIP package. This can be done once by the
administrator. Let's call the location /usr/SCRIP.
2) Copy the code into that folder and go to the source dir.
cp -r COAWST/Lib/SCRIP /usr/SCRIP
cd /usr/SCRIP/source
3) Edit the build script for your particular environment.
edit makefile
Set FLAGS, LIB, and INCLUDE for the build. These are set for USGS. Values
for the HPC are in the makefile but need to be uncommented.
4) make
Type make to build the executable. The executable scrip is made in the next
level up (one up from source, i.e. /usr/SCRIP). It also makes scrip_test, which you can
try if you want.
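
The whole SCRIP build, as a sketch (the /usr/SCRIP location is illustrative):

cp -r COAWST/Lib/SCRIP /usr/SCRIP
cd /usr/SCRIP/source
# edit makefile: set FLAGS, LIB, and INCLUDE for your system
make
ls ../scrip ../scrip_test   # the executables appear one level up from source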

3.3 Configuring the system for your application
Users may need to initialize environmental settings.


4. Setting up and running applications
To run the COAWST system, users should set up a folder to hold all of their project files.
In that folder, there will be files for ROMS, SWAN, WRF, and general control files. One
of the control files will be your project header file; let's call it project.h. The project.h file
lists the cpp (C preprocessor) options that get used to compile the code. These options are
listed in section 4.1. Section 4.2 describes a control file needed to compile the model.
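
As a sketch, a project folder for a fully coupled application typically holds files like these
(the names are illustrative and follow the JOE_TCs example in section 5):

Projects/My_Project/
    my_project.h             # cpp options (section 4.1)
    ocean_my_project.in      # ROMS input
    INPUT_MY_PROJECT         # SWAN input
    namelist.input           # WRF input (also copied to the root dir)
    coupling_my_project.in   # coupling control: processors, intervals, file names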


4.1 cpp options for COAWST
Here is a list of cpp options that should be used/required with COAWST. These options
would be listed in your project.h file (described below in the test examples section).

4.1.1) One set of options allows the user to activate a single model or multiple models. It
is REQUIRED that you specify which model(s) you want to use, using:
#define ROMS_MODEL /* if you want to use the ROMS model */
#define SWAN_MODEL /* if you also want to use the SWAN model */
#define WRF_MODEL /* if you also want to use the WRF model */

As of svn version 767 (September 26, 2013) the system can now run any set of model
choices:
WRF only, ROMS only, SWAN only,
WRF+ROMS, WRF+SWAN, WRF+ROMS+SWAN, ROMS+SWAN
We can also run:
SWAN only with grid refinement,
ROMS only with grid refinement,
ROMS+SWAN with grid refinement.

4.1.2) To activate model coupling:
#define MCT_LIB /* if you have more than one model selected and you want to
couple them.*/

The following cpp options are activated internally. The user should NOT list these in
their project.h file.
ROMS_COUPLING means that roms is being used and is coupled to another model.
SWAN_COUPLING means that swan is being used and is coupled to another model.
WRF_COUPLING means that wrf is being used and is coupled to another model.
AIR_OCEAN means that wrf and roms are active (other models could also be active).
WAVES_OCEAN means that swan and roms are active (other models could also be
active).
AIR_WAVES means that swan and wrf are active (other models could also be active).
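
Putting 4.1.1 and 4.1.2 together, a minimal project.h sketch for a fully coupled
three-model run is simply (the internally activated options above are never listed by the
user):

#define ROMS_MODEL
#define SWAN_MODEL
#define WRF_MODEL
#define MCT_LIB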

4.1.3) Some new cpp options that are available for the coupling include:
#define UV_KIRBY /* compute "surface-average" current based on Hwave that will
be sent from the ocn to the wav model for coupling*/
#define UV_CONST /* send vel =0 from the ocn to wave model */
#define ZETA_CONST /* send zeta =0 from the ocn to wave model */
#define SST_CONST /* do not send sst data from roms to wrf */

4.1.4) A new option for atmosphere coupling is
#define ATM2OCN_FLUXES
This option specifies that the heat and momentum fluxes computed by the atmosphere
model will be used in the ocean model. This allows both models to use identical fluxes
at the interface. When using this option, you should also use
#undef BULK_FLUXES
because the fluxes will be computed by wrf depending on the surface scheme you select.
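
As a project.h fragment, the combination described above is:

#define ATM2OCN_FLUXES
#undef  BULK_FLUXES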

4.1.5) Methods for grid interpolation.
#define MCT_INTERP_WV2AT /* this allows grid interpolation between the wave
and atmosphere models */
#define MCT_INTERP_OC2AT /* this allows grid interpolation between the ocean
and atmosphere models */
#define MCT_INTERP_OC2WV /* this allows grid interpolation between the ocean
and wave models */
If you use different grids for the ocean, atmosphere, or wave models, then you need to
activate the appropriate option above so that the data fields are interpolated between
grids. We have updated the method for the grid interpolation to use flux conservative
weights. So now the user can have the same fluxes for both atm and ocn models and the
fluxes are transferred between models using a flux conservative approach. An example of
this is shown below for the SCRIP interpolation in the JOE_TCd (d=different) test case.

4.1.6) Methods for grid refinement.
#define REFINED_GRID /* this allows grid refinement in roms or in swan.*/
- ROMS has two-way and SWAN has one-way grid refinement. For now, if you are
running a simulation with both roms and swan, we require that ROMS and SWAN have
the same number of grids. The test case INLET_TEST_REFINED (explained below) is
an example where roms and swan are coupled, with grid refinement, and with coupling
on the refined grids. The grids for ROMS and SWAN can be different sizes, in which case
the user needs to provide interpolation weights for those grid combinations. The number
of grids is set in the coawst.bash file as "NestedGrids."
- WRF is two-way (from the WRF developers).
- If you are running a simulation using WRF +either ROMS or SWAN, then you need to
set which WRF grid will be coupled to the ROMS+SWAN grids. This feature is set in the
coupling.in file using the parameter WRF_CPL_GRID. The default is 1, to couple with
the WRF parent grid. In the future, we will have more options to couple various levels of
R/S grids to various WRF grids. But we need to start somewhere, and for now you can
only choose 1 WRF grid.
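
In the coupling.in file this is a single line (1 is the default, coupling to the WRF parent
grid):

WRF_CPL_GRID = 1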

4.1.7) SWAN wave interactions to ROMS or to WRF:
The following 3 options are available to allow exchange of wave data to ROMS for use in
bulk_fluxes for computation of ocean surface stress, and to allow exchange of wave data
to WRF for use in MYJSFC and MYNN surface layer schemes to allow increased bottom
roughness of the atmosphere over the ocean:

#define COARE_TAYLOR_YELLAND
Taylor, P. K., and M. A. Yelland, 2001: The dependence of sea surface roughness on the
height and steepness of the waves. J. Phys. Oceanogr., 31, 572-590.

#define COARE_OOST
Oost, W. A., G. J. Komen, C. M. J. Jacobs, and C. van Oort, 2002:New evidence for a
relation between wind stress and wave age from measurements during ASGAMAGE.
Bound.-Layer Meteor., 103, 409-438.

#define DRENNAN
Drennan, W.M., P.K. Taylor and M.J. Yelland, 2005: Parameterizing the sea surface
roughness. J. Phys. Oceanogr. 35, 835-848.

4.1.8) Implementation of new wave-current interaction formulation.
We added a new method based on the vortex force approach. The method is described in
detail Kumar et al (2012). The new cpp options for this are:

#define WEC_MELLOR           radiation stress terms from Mellor 08
#define WEC_VF               wave-current stresses from Uchiyama et al.
#define WDISS_THORGUZA       wave dissipation from Thornton/Guza
#define WDISS_CHURTHOR       wave dissipation from Church/Thornton
#define WDISS_WAVEMOD        wave dissipation from a wave model
#define WDISS_INWAVE         wave dissipation from the InWave model
#define ROLLER_SVENDSEN      wave roller based on Svendsen
#define ROLLER_MONO          wave roller for monochromatic waves
#define ROLLER_RENIERS       wave roller based on Reniers
#define BOTTOM_STREAMING     wave-enhanced bottom streaming
#define SURFACE_STREAMING    wave-enhanced surface streaming

Additional information is to be added soon. Interested users should read the Kumar et al
(2012) paper.

4.1.9) Drag limiter option.
#define DRAGLIM_DAVIS is a feature added to WRF and SWAN to limit the ocean
roughness drag to a maximum of 2.85E-3, as detailed in:
Davis et al., Prediction of Landfall Hurricanes with the Advanced Hurricane WRF Model,
Monthly Weather Review, 136, pp. 1990-2005.

In SWAN, this can be activated when using the Komen wind input. For WRF, it can be
activated when using myjsfc or mynn surface layer options.


4.1.10) Other model options.
The options above work for the coupled system.
- For the ROMS model, all other cpp options are listed in:
ROMS/Include/cppdefs.h
Users should scan this list of options for activating ocean physics and ocean boundary
conditions.
- For the WRF model, these options are set in the namelist.input.
- For the SWAN model, these options are set in the INPUT file.


4.2 To compile the model
To compile the model, you need to use the coawst.bash file. If this does not work, then
contact me and I will get a build script for you. You need to edit coawst.bash. Some
important parts are:

export ROMS_APPLICATION=JOE_TC
For now, the application you set will be called ROMS_APPLICATION. This is
because the *.bash file is distributed from roms. I will modify this in the future to be
more general, rather than specific to roms. But for now, set this as the name of your
application. Use capital letters. This is the name of your project.h file that contains all
your cpp options.

export NestedGrids=1
This is the number of grids for roms or for swan. The number of grids for WRF is set in
the namelist.input file.

export MY_ROOT_DIR=/raid1/jcwarner/Models/COAWST
export MY_PROJECT_DIR=${MY_ROOT_DIR}
The rootdir is the location of the source code. You can have a project dir that you work
in. I typically make a new copy of the code each time in case I change something and this
means I have multiple rootdirs. Then I can always go back and see exactly the version of
code that was used for a particular run. Therefore I set the project dir as the same location
as the rootdir.

export MY_ROMS_SRC=${MY_ROOT_DIR}/
Keep this the same as listed here.

export USE_MPI=on
export USE_MPIF90=on
The COAWST system was designed to work with mpi for parallel processor applications.
If you set the modeling system to build a coupled application, then it will always produce
an executable coawstM. If you set it to build an individual model and also set MPI=on,
then you will also get a coawstM.

export FORT=pgi
You can set this to be ifort, pgi, or gfortran. Other choices may work.

export MY_HEADER_DIR=${MY_PROJECT_DIR}/Projects/JOE_TCs
export MY_ANALYTICAL_DIR=${MY_PROJECT_DIR}/Projects/JOE_TCs
Use these to set the locations for your application. Header dir is where your project.h file
is located.

After you edit the coawst.bash file you need to run it. To (re)build everything you use:
./coawst.bash

This command will just rebuild any changes to roms or swan, and will rebuild all of wrf:
./coawst.bash noclean

This command will rebuild all of roms and swan and just the changes that have been
made to wrf:
./coawst.bash nocleanwrf

This will only rebuild roms or swan or wrf files that have been changed.
./coawst.bash noclean nocleanwrf

If you need to make modifications to the WRF configuration file, here are some tips. You
could do this:
cd WRF
./clean -a
./configure
then select the options that you want. This will create the configure.wrf file, which you
can then edit. Then cd .. (to the root dir) and use
./coawst.bash nocleanwrf
This way it will not re-create the configure.wrf file and it will still build all of the system.

Another handy tool is the script command
http://www-users.cs.umn.edu/~skim/cs1901/script.html
You could use:
script coawst.log
./coawst.bash
exit
and then save that coawst.log file to see how the system was built.


4.3 Some general helpful tips and a few new functionalities in v3.

4.3.1) ROMS is now very sensitive to the vertical coordinate system. The user needs to
make sure that the values set in the roms init file are the same as the values set in the
ocean.in file. For example the following values are checked for consistency:
Vtransform=1;
Vstretching=1;
theta_s=5.0;
theta_b=0.4;
Tcline=50.0;
N=16;
Users should check that these values are the same in their netcdf files and in the *.in file.
More info is provided at: https://www.myroms.org/wiki/index.php/Vertical_S-coordinate.
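
A quick way to check the values stored in a netcdf init file (the file name is illustrative):

ncdump -v Vtransform,Vstretching,theta_s,theta_b,Tcline joe_tc_ocean_init.nc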

4.3.2) ROMS grid files should not have fill values for mask arrays. A command to
remove these is:
ncatted -O -a _FillValue,mask_rho,d,, -a _FillValue,mask_u,d,, -a _FillValue,mask_v,d,, -a _FillValue,mask_psi,d,, USeast_grd17.nc

4.3.3) In this version you only need one project.h file. In previous versions we had to
modify a wrf.h and a swan.h file. You do not need to modify those files anymore; just
modify your project.h file.

4.3.4) For SWAN, you do not need to specify any OUTPUT commands for the coupling.
The coupling and OUTPUT are now completely separate. SWAN now offers writing out
of files in netcdf (as of SWAN 40.91A). This option is available with the COAWST
format, but you need to have netcdf4 or higher.

4.3.5) For WRF, you can have a 2-way nest in WRF and have this coupled to roms
and/or swan. For a moving nest in wrf, WRF requires that it be built without fastsse. For a
non-moving nest, it can be built with the standard options.

4.3.6) For WRF-ROMS coupling, the user needs the following 2 options set to
allow the sst from roms to be read in and used by WRF:
sst_update = 1
io_form_auxinput4 = 2
The sst_update option is set to 1 in the Registry, so users do not need to explicitly add it.
The io_form_auxinput4 setting needs to be listed in the namelist.input file. See the
JOE_TC cases as examples.
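
As a namelist.input fragment, only the io_form entry needs to be added by the user
(sst_update is already handled by the Registry):

&time_control
 io_form_auxinput4 = 2
/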

4.3.7) Some information about heat fluxes for WRF-ROMS.
If WRF_MODEL is defined:
- you still have to define EMINUSP to activate exchange of rain and evaporation.
- SOLAR_SOURCE is needed; otherwise all the heat goes into the surface layer only.
- the longwave outgoing component is estimated in Master/mct_roms_wrf.f, so there is no
need to define LONGWAVE_OUT in ROMS.


If WRF_MODEL is not defined, or you are going to use BULK_FLUXES:
BULK_FLUXES (in bulk_flux.F) computes the turbulent heat fluxes (latent and sensible
heat), momentum stress, and evaporation (used in the fresh water flux, as the salinity
surface boundary condition, if EMINUSP is also defined).
Radiative fluxes (i.e., shortwave and longwave radiation flux) are not calculated, nor is
the rain component of the EMINUSP calculation.
The surface scheme (COARE) implemented in bulk_flux.F requires:
- air temperature (usually at 2m)
- relative humidity (usually at 2m)
- mean sea level pressure
- u-wind component (positive east), usually at 10m
- v-wind component (positive north), usually at 10m
With these parameters bulk_flux will estimate latent heat, sensible heat, u-momentum
stress, v-momentum stress, and evaporation. Note that in the ocean.in you have to
specify:
BLK_ZQ (usually 2m)
BLK_ZT (usually 2m)
BLK_ZW (usually 10m)
These numbers should be consistent with the heights of the surface variables
(as said, usually wind is at 10m, air temp at 2m, humidity at 2m, but this might be
different depending on your surface forcing dataset).
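
In ocean.in these entries look like this (values illustrative, matching the usual 2m/10m
heights):

BLK_ZQ == 2.0d0
BLK_ZT == 2.0d0
BLK_ZW == 10.0d0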

Net shortwave should be provided by your meteo forcing. This is not calculated in
bulk_flux.F, but is necessary to compute the full heat flux term.
Net longwave: you have several choices:
- provide net longwave in the forcing file
- provide INCOMING longwave in the forcing file and define LONGWAVE_OUT
(ROMS will then estimate the outgoing component based on its SST)
- do not provide the longwave, but instead provide total cloud cover (in the forcing file),
and ROMS will estimate the net longwave. You do not need to define CLOUD, as it is
defined internally by ROMS if LONGWAVE is defined.

If you want the E-P flux, define EMINUSP and provide in the forcing file the variable
rain, while, as said, evaporation is estimated in bulk_flux.F.

So, in the end:
#define BULK_FLUXES
#define LONGWAVE or #define LONGWAVE_OUT, or provide the net longwave
in the forcing file
#define EMINUSP is needed; otherwise set the surface salinity flux to zero (#define
ANA_SSFLUX and set stflx(itrc==isalt) to zero in stflux.h)
#define ATM_PRESS if you want the inverted barometric effect (mean sea level
pressure must be in the forcing file)
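
As an illustrative project.h fragment for a ROMS-only bulk-flux setup that uses incoming
longwave from the forcing file (one of the three longwave choices above):

#define BULK_FLUXES
#define LONGWAVE_OUT   /* forcing file supplies incoming longwave */
#define EMINUSP        /* forcing file supplies rain; evap comes from bulk_flux */
#define ATM_PRESS      /* forcing file supplies mean sea level pressure */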


5. Examples
We will use the JOE_TC and the Inlet_test cases as examples of how to set up different
configurations. The JOE_TC application is a tropical cyclone that travels west from a
deep ocean basin onto a shelf that slopes landward to the west. The JOE_TCs (same grid)
and JOE_TCd (different grids) are good examples for a coupled WRF-ROMS-SWAN
application. The Inlet_test case is for an idealized inlet with wave and tidal driven flows
and has three cases: Coupled is the standard case with the same grid for each roms and
swan; DiffGrid has different size grids for roms and swan; Refined has the same parent
grids for roms and swan and the same child grids for roms and swan.

5a) JOE_TCs = WRF-ROMS-SWAN all 3 on same grid.
Also can be used for WRF-ROMS, WRF-SWAN, or WRF only.
5b) JOE_TCd = WRF-ROMS-SWAN with ROMS and SWAN on the same grid,
WRF on a different grid.
5c) INLET_TEST/Coupled = ROMS+SWAN same grid
5d) INLET_TEST/DiffGrid = ROMS+SWAN different grids
5e) INLET_TEST/Refined = ROMS+SWAN same grid +grid refinement in each.
5f) INLET_TEST/Swanonly = SWAN by itself, also with grid refinement.
5g) Rip_current test case = ROMS+SWAN
5h) Shoreface test case = ROMS driven by a swan forcing file.
5i) Headland test case = ROMS+SWAN for a promontory.


5a) JOE_TCs = WRF-ROMS-SWAN all 3 on same grid.

step 1: Create a folder to hold all the project files. For this application a folder is
already made and is called COAWST/Projects/JOE_TCs. This will be referred to
as the project folder.

step 2: Create WRF grid, init file, and input files:
wrfbdy_d01
wrfinput_d01
namelist.input
These files need to be placed in the root_dir.
Copies of these files have already been created and are located in the
Projects/JOE_TCs folder. Copy these 3 files from that folder to the root dir.
Edit namelist.input and, in the &domains section, set the number of processors for
WRF to use for the simulation (total WRF processors = nproc_x * nproc_y):
nproc_x = M
nproc_y = N

step 3: Create a project header file.
The application control file has already been created for this setup. For this case,
we can use the file in the project directory
COAWST/Projects/JOE_TCs/joe_tc.h
We have preselected a certain case G, which has the 3-way coupling.
Some key options in here are:
#define ROMS_MODEL
#define SWAN_MODEL
#define WRF_MODEL
If you want to create your own, the best choice is to start out by copying an
application.h file from some other project folder which is similar to your
configuration, and then tailor it to the new application.

step 4: Build the system.
Edit coawst.bash and provide appropriate file locations and run the build
command:
./coawst.bash
During the build, if wrf asks for the location of the netcdf files, you can provide
them. If you set NETCDF=___ to the correct location in your environment, then
WRF will know where the files are.
During the build, wrf will ask for the system you are using, such as:

Select option 3, PC Linux x86_64, PGI compiler 5.2 and higher (RSL_LITE)
and select 1 for normal. Compiling wrf can take up to 20 minutes or longer.

step 5: Create the ocean grid. The project folder already contains the roms grid.
This grid was made from a version of:
COAWST/Tools/mfiles/create_roms_grid.m.
Alternatively, we now distribute wrf2roms_mw.m that can be used to create a
roms grid from a wrf grid. After creating the grid, copy the grid file (in this case
joe_tc_grd.nc) to the project directory.

step 6: Create the initial file for the ocean model. The project folder already
contains the roms init file but if you want to see how it was made you can use:
COAWST/Tools/mfiles/create_roms_init.m.
After creating the init file copy the file (in this case joe_tc_ocean_init.nc) to the
project directory.

step 7: Create an ocean input file.
The application ocean input file has already been created and we can use the file
in the project directory
COAWST/Projects/JOE_TCs/ocean_joe_tc.in
Set the total number of processors to be allocated to roms as:
NtileI ==4 ! I-direction partition
NtileJ ==3 ! J-direction partition

step 8: Make the swan bathy + grid files.
When you used create_roms_grid, two other files were created:
roms_bathy.bot
grid_coord.grd
These were created by roms2swan.m and are the swan files needed. Rename
these (such as joe_tc_roms_bathy.bot and joe_tc_grid_coord.grd) and copy these
2 files to the project directory. (This has already been done.)

step 9: Create the swan input file.
The best approach here is to copy an existing file and tailor it to your application.
For now we can use:
COAWST/Projects/JOE_TCs/INPUT_JOE_TC

step 10: modify the coupling.in file
COAWST/Projects/JOE_TCs/coupling_joe_tc.in to:
- allocate processors for all 3 models
- set coupling time interval
- list input file names

NnodesATM = 20 ! atmospheric model
NnodesWAV = 12 ! wave model
NnodesOCN = 12 ! ocean model

! Time interval (seconds) between coupling of models.

TI_ATM_WAV = 600.0d0 ! atmosphere-wave coupling interval
TI_ATM_OCN = 600.0d0 ! atmosphere-ocean coupling interval
TI_WAV_OCN = 600.0d0 ! wave-ocean coupling interval

! Coupled model standard input file name.

ATM_name =namelist.input ! atmospheric model
WAV_name =Projects/JOE_TC/INPUT_JOE_TC ! wave model
OCN_name =Projects/JOE_TC/ocean_joe_tc.in ! ocean model

step 11: Run the system using a PBS run script or whatever is appropriate for your system:

/usr/local/mpi/bin/mpirun -np 44 -machinefile $PBS_NODEFILE ./coawstM
Projects/JOE_TC/coupling_joe_tc.in >joe_tc.out



Figure 1. Results at hour 40 for the JOE_TCs simulation: A) PSFC from WRF; B) SST
from ROMS in the WRF output; and C) Hwave from SWAN in the WRF output.



Additional notes for JOE_TCS:
This test case is distributed with "ExpG" defined. That case has ROMS+WRF+SWAN all
coupled. You can select to activate a different case in the Projects/JOE_TCs/joe_tc.h file,
such as "ExpA1" which is just WRF+SWAN, or "ExpA" which is just WRF+ROMS.

The test case JOE_TCw is basically just the JOE_TCs test case, but only has the WRF
model component. I set the joe_tcw case separately so that people can see what is needed
to run just wrf by itself.


5b) JOE_TCd simulation using ROMS + SWAN same grid, WRF different grid.
This application provides information on how to run the models on different
grids. Each model could be on a separate grid, but for this example roms and
swan are on the same grid, and wrf is on a different grid. This will be very close
to the setup in 5a, but now we will just create new grids for roms and swan. The
roms and swan grids will have decreased resolution, with 100 cells in the x-direction
and 75 cells in the y-direction (instead of 200x150).

step 1: Create a new projects folder,
Projects/JOE_TCd. (This has already been created.)

step 2: Create roms +swan grids. This can be accomplished using
COAWST/Tools/mfiles/create_roms_xygrid.m or wrf2roms_mw.
We renamed grid_coord.grd and roms_bathy.bot to
joe_tc_coarse_grid_coord.grd and joe_tc_coarse_roms_bathy.bot. Copy these 2
files and the joe_tc_coarse_grd.nc file to the project folder.

step 3: create roms init files using
COAWST/Tools/mfiles/create_roms_init.m

step 4: Create wrf files: we will use the same wrf files from before of:
wrfbdy_d01
wrfinput_d01
namelist.input
These files need to be placed in the root_dir (I also make a copy and put into the
project folder).

step 5: Build SCRIP.
We will use the SCRIP package to create the interpolation weights.
Create a working folder for the SCRIP package. This can be done once by the
administrator. Let's call the location /usr/SCRIP.
cp -r COAWST/Lib/SCRIP /usr/SCRIP
cd /usr/SCRIP/source
Edit the makefile and set FLAGS, LIB, and INCLUDE for the build. These are set for
USGS, and the values for HPC are in the makefile but need to be uncommented.
Type make to build the executable.
The executable scrip is made in the next level up (one up from source, i.e.
/usr/SCRIP).
It also makes scrip_test, which you can try if you want.

step 6: Convert roms grid file to scrip input format.
Use the m file
COAWST/Tools/mfiles/scrip_roms.m
(new as of July 7, 2010).
You need to edit the file and provide information for the grid and output files. The
m file is setup to run the JOE_TC test case for the coarse grid.
This creates joe_tc_coarse_roms_scrip.nc.

step 7: Convert wrf grid to scrip input format.
Use the m file
COAWST/Tools/mfiles/scrip_wrf.m
(new as of July 7, 2010).
You need to edit the file and provide information for the grid and output files. The
m file is setup to run the JOE_TC test case for the coarse grid.
This creates joe_tc_wrf_scrip.nc.

Step 8: Run SCRIP.
Previously we had set up SCRIP to compute distance-weighted remapping
weights. I am not supporting that option any more and require the use of
flux-conservative remapping. The flux-conservative method is first-order accurate
and uses the same sparse matrix interpolation method.

Edit the file scrip_in. The file looks like:

&remap_inputs
num_maps =2
grid1_file ='joe_tc_coarse_roms_scrip.nc'
grid2_file ='joe_tc_wrf_scrip.nc'
interp_file1 ='ocn2atm_weights_consv.nc'
interp_file2 ='atm2ocn_weights_consv.nc'
map1_name ='ROMS to WRF consv Mapping'
map2_name ='WRF to ROMS consv Mapping'
map_method ='conservative'
normalize_opt ='fracarea'
output_opt ='scrip'
restrict_type ='latitude'
num_srch_bins =90
luse_grid1_area =.false.
luse_grid2_area =.false.
/

The input files we just created are the grid1_file (joe_tc_coarse_roms_scrip.nc)
and grid2_file (joe_tc_wrf_scrip.nc). The output files from scrip, interp_file1
and interp_file2, are going to be called ocn2atm_weights_consv.nc and
atm2ocn_weights_consv.nc. Make sure you have the map_method set to
conservative and normalize_opt set to fracarea.

To run scrip, place the input files in the top scrip level directory and run:
./scrip
After SCRIP ends, copy the two *_weights*.nc files to the project folder.

Step 9: Create the wav-atm weights files.
Because the wave and ocean models are on the same grid, we can use the same
weights files for the wave model.
cp ocn2atm_weights_consv.nc wav2atm_weights_consv.nc
cp atm2ocn_weights_consv.nc atm2wav_weights_consv.nc

step 10: Edit the model configuration file.
Edit Projects/JOE_TCd/joe_tc.h and add the options
#define MCT_INTERP_WV2AT
#define MCT_INTERP_OC2AT
This will allow mct interpolation between the wave and atm models, and between
the ocean and atm models.
This is already set to define case H, which has those cpp options set.

step 11: Add the weights file names to Projects/JOE_TCd/coupling_joe_tc.in. Add the lines:
W2ONAME ==wav2ocn_weights_consv.nc
W2ANAME ==wav2atm_weights_consv.nc
A2ONAME ==atm2ocn_weights_consv.nc
A2WNAME ==atm2wav_weights_consv.nc
O2ANAME ==ocn2atm_weights_consv.nc
O2WNAME ==ocn2wav_weights_consv.nc

change the ocean and wave input file names to be
WAV_name =Projects/JOE_TC/INPUT_JOE_TC_COARSE ! wave model
OCN_name =Projects/JOE_TC/ocean_joe_tc_coarse.in ! ocean model

step 12: Make and run the program.
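For example, following the run command from section 5a step 11 (the processor count
and output file name are illustrative):

mpirun -np 44 -machinefile $PBS_NODEFILE ./coawstM Projects/JOE_TCd/coupling_joe_tc.in > joe_tcd.out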


5c) INLET_TEST/Coupled = ROMS+SWAN same grid
This application tests the MCT coupling between ROMS and SWAN for an idealized
inlet. The application has an enclosed basin in the southern part of the domain with a
small opening near the center of the grid. The model is driven by a sinusoidal water level
on the northern edge and is coupled to SWAN which develops waves from the north with
a 1m height propagating towards the south. The flow patterns that develop are shown in
Figure 2.

Figure 2. Results for the Inlet_test Coupled case: wave height (Hwave) and
depth-integrated current sent to the wave model (vWave).

All of the grids and input files have already been created for this application and are
available in Projects/Inlet_test/Coupled. Here is a basic set of steps required to make
these files and run this application.

step 1: Use create_roms_xygrid.m to create the roms and swan grids. These are simple
rectangular grids and can be created easily with these tools. The user sets the x/y
locations, depth, and masking. This m file calls mat2roms_mw to create the roms grid
file called inlet_test_grid.nc and calls roms2swan to create the swan grid files. The swan
grid files were renamed to inlet_test_grid_coord.grd and inlet_test_bathy.bot.

step 2: For this case, we are initializing the roms model from rest, so we are relying on
ANA_INITIAL default values. The m file create_roms_init could be used to create a
more complicated set of init conditions. We are also using ana_sediment to create a
uniform 10m thick bed with porosity = 0.5 and 1 grain size. This could have been
created with the create_roms_init file as well. The file sediment_inlet_test.in lists some
sediment properties for this application.

step 3: For this case we are initializing SWAN from rest, so there are no swan init files.
We could have run swan with some wind field, but this case has no winds.

step 4: Create the header file. Edit inlet_test.h to see the options selected for this
application. We needed to define
#define ROMS_MODEL
#define SWAN_MODEL
#define MCT_LIB
as well as several boundary conditions, wave-current interaction (WEC) options, the
wave-enhanced bottom boundary layer option (SSW_BBL), GLS_MIXING, and some
sediment options for suspended load and bed morphology updating.

step 5: Determine settings for roms and swan in ocean_inlet_test.in and
swan_inlet_test.in. These are some standard options in these files.

step 6: Create the coupling file called coupling_inlet_test.in to enter the roms and swan
input file names and determine the coupling interval.

step 7: Edit the coawst.bash file to build this application.

step 8: Compile and run.
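For example (the processor count is illustrative; set it to match the processor allocations
in the coupling file):

./coawst.bash
mpiexec -np 2 ./coawstM Projects/Inlet_test/Coupled/coupling_inlet_test.in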

5d) INLET_TEST/DiffGrid = ROMS+SWAN different grids
This test case has a grid for ROMS and a larger grid for SWAN. This is useful for
applications where the lateral shadow effects of the wave model can influence the ocean
grid. So the wave model can be simulated on a larger grid and the ocean model on a
smaller inset grid. This application is very similar to 5c inlet_test_coupled. All of the files
generated for this test case are in Projects/Inlet_test/DiffGrid.

step1: ROMS and SWAN are on different grids. You need to create a roms grid and a
swan grid. This is achieved using create_roms_xygrid.m.

step 2: Build SCRIP.
We will use the SCRIP package to create the interpolation weights. Create a working
folder for the SCRIP package. This can be done once by the administrator.
Let's call the location /usr/SCRIP.
cp -r COAWST/Lib/SCRIP /usr/SCRIP
cd /usr/SCRIP/source
Edit the makefile and set FLAGS, LIB, and INCLUDE for the build. These are set for
USGS, and the values for HPC are in the makefile but need to be uncommented.
Type make to build the executable.
The executable scrip is made in the next level up (one up from source, i.e.
/usr/SCRIP).
It also makes scrip_test, which you can try if you want.

step 3: Convert roms grid file to scrip input format.
Use the m file
COAWST/Tools/mfiles/mtools/scrip_roms.m
(new as of July 7, 2010).
You need to edit this file and provide information for the grid and output files.
Uncomment the file name for the inlet_test_grid.nc and create
inlet_test_roms_scrip.nc

step 4: Convert swan grid to scrip input format.
Use the m file
COAWST/Tools/mfiles/mtools/scrip_swan.m
You need to edit this file and provide information for the grid and output files.
This creates inlet_test_swan_scrip.nc.

Step 5: Run SCRIP.
I am now only supporting the flux-conservative remapping option. The
flux-conservative method is first-order accurate and uses the same sparse matrix
interpolation method.

Edit the file scrip_in. The file looks like:

&remap_inputs
num_maps =2
grid1_file ='inlet_test_roms_scrip.nc'
grid2_file ='inlet_test_swan_scrip.nc'
interp_file1 ='ocn2wav_weights_consv.nc'
interp_file2 ='wav2ocn_weights_consv.nc'
map1_name ='ROMS to SWAN consv Mapping'
map2_name ='SWAN to ROMS consv Mapping'
map_method ='conservative'
normalize_opt ='fracarea'
output_opt ='scrip'
restrict_type ='latitude'
num_srch_bins =90
luse_grid1_area =.false.
luse_grid2_area =.false.
/

The input files are the files we just created: grid1_file is inlet_test_roms_scrip.nc
and grid2_file is inlet_test_swan_scrip.nc. The output files from scrip,
interp_file1 and interp_file2, are going to be called ocn2wav_weights_consv.nc
and wav2ocn_weights_consv.nc. Make sure you have the map_method set to
conservative and normalize_opt set to fracarea.

To run scrip, place the input files in the top scrip level directory and run:
./scrip
After SCRIP ends, copy the two *_weights*.nc files to the project folder.

step 6: Edit the model configuration file.
Edit Projects/Inlet_test/DiffGrid/inlet_test.h and add the option
#define MCT_INTERP_OC2WV
This will allow mct interpolation between the ocean and wave models.

step 7: Add the weights file names to
Projects/Inlet_test/DiffGrid/coupling_inlet_test_diffgrid.in. Add the lines:
W2ONAME ==Projects/Inlet_test/DiffGrid/wav2ocn_weights_consv.nc
O2WNAME ==Projects/Inlet_test/DiffGrid/ocn2wav_weights_consv.nc

change the ocean and wave input file names to be
WAV_name =Projects/Inlet_test/DiffGrid/swan_inlet_test_diff.in ! wave model
OCN_name =Projects/Inlet_test/DiffGrid/ocean_inlet_test.in ! ocean model

step 8: Make and run the program.
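For example (the processor count is illustrative):

./coawst.bash
mpiexec -np 2 ./coawstM Projects/Inlet_test/DiffGrid/coupling_inlet_test_diffgrid.in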


5e) INLET_TEST/Refined = ROMS+SWAN same grid + grid refinement in each.
This test case has a coarse and fine grid for ROMS, and a coarse and fine grid for SWAN,
run coupled on both grids. All of the files generated for this test case are in
Projects/Inlet_test/Refined.

step 1: You need to create a roms grid for the coarse model, a roms grid for the fine
model, a swan grid for the coarse model, and a swan grid for the fine model. We suggest
the following:
- First create a roms grid for the coarse model. This was done using
Tools/mfiles/create_roms_grid and selecting the inlet_test case.
- To create the refined grid, use Tools/mfiles/create_nested_grid.m. For this application
we use a 5:1 refinement ratio.
- To create the two swan grids, you can use Tools/mfiles/roms2swan.m from the 2 roms
grids.

step 2: Create input files for the ocean and the swan grids. You need one ocean.in file,
with the parameter values repeated for the 2 grids. See
Projects/Inlet_test/Refined/ocean_inlet_test_ref5.in. Edit this ocean*.in and set
values for all Numgrids (i.e. dt, file names, etc.)
- For Lm and Mm, the parent is -2, the others are -7.
- If the size of h for the parent grid is [Mp,Lp], then Lm = Lp-2 and Mm = Mp-2.
- If the size of h for each child grid is [Mp,Lp], then Lm = Lp-7 and Mm = Mp-7.

You need to create two separate SWAN INPUT files. See that same folder for the 2
swan files (swan_inlet_test.in and swan_inlet_test_ref5.in).

step 3: Create init and boundary files for the largest-scale ocean and wave grids.

step 4: Create an init file for each refined ocean and wave child grid. The child
grids do not need boundary files.

step 5: Edit the Projects/Inlet_test/Refined/coupling_inlet_test_refined5.in file and list the
input files: swan has two, just one for roms.

step6: Edit coawst.bash and set
NestedGrids =2
This value of NestedGrids is the total number of roms grids (which is the same as the
total number of swan grids), including the parent and all children. So if you have 2 roms
grids and 2 swan grids, then NestedGrids =2.

step7: Build the model
./coawst.bash -j
and run it using
mpiexec -np 2 ./coawstM.exe Projects/Inlet_test/Refined/coupling_inlet_test_refined5.in


5f) INLET_TEST/Swanonly = SWAN by itself, also with grid refinement.
This test case is basically the swan files from section 5e, but allows you to run swan by
itself, and with a child grid.

To run SWAN by itself, with only 1 grid, the header file should have
#define SWAN_MODEL
#undef REFINED_GRID

Set coawst.bash to have NestedGrids = 1 and run ./coawst.bash to compile.
One note is that coawstM needs to be run with the command pointing to the swan input
file(s). So to run swan only with just one grid, run the model with (X can be any number
of processors)
mpiexec -np X ./coawstM.exe Projects/Inlet_test/Swanonly/swan_inlet_test.in


To run SWAN by itself and with a child grid, the header file should have
#define SWAN_MODEL
#define REFINED_GRID

Set coawst.bash to have NestedGrids = 2 and run ./coawst.bash to compile.
The command line needs to point to the swan input files. So to run swan only with a
parent and a child, use (X can be any number of processors)
mpiexec -np X ./coawstM.exe Projects/Inlet_test/Swanonly/swan_inlet_test.in Projects/Inlet_test/Swanonly/swan_inlet_test_refined5.in
(all of that is on one line; the text is just word-wrapping here).


5g) Rip_current test case.
This test case is provided as an example of how to setup a coupled simulation to study a
rip current. As of version 750, it supersedes the previous distribution of coupling to
RefDif, as described in Haas and Warner (2009). The current test case is coupling of
ROMS+SWAN, and the application is described in detail in Kumar et al. (2012) in
section 4.3 of that paper. Users can adapt the test case for other types of similar
investigations.

The application is run by editing the coawst.bash file to set
export ROMS_APPLICATION=RIP_CURRENT

This case requires mpi and the MCT libraries because it is a coupled application. The
run files are set as:

export MY_HEADER_DIR=${MY_PROJECT_DIR}/Projects/Rip_current
export MY_ANALYTICAL_DIR=${MY_PROJECT_DIR}/Projects/Rip_current

This is built like the other test cases, using ./coawst.bash. The user needs to set the total
number of processors for each of roms and swan in the coupling_rip_current.in file. Then
you need to set the processor allocation in ocean_rip_current.in. The model is set up
to run for 1 hour of simulation time, with swan on a 5s time step and roms on a 1s time
step. Coupling is every 5s. The model is run using:

mpiexec -np 4 ./coawstM.exe Projects/Rip_current/coupling_rip_current.in

You can certainly change the number of processors for your own system. Upon
completion of the test case, you can use the included plot_rip_current.m matlab file to
create the plot below.

Figure 3. Rip current test case bathy and currents.


5h) Shoreface test case.
This test case is distributed in the Rutgers trunk. This case is a planar beach and only uses
the ROMS model, driven from a SWAN forcing file. The application is run by editing the
coawst.bash file to set
export ROMS_APPLICATION=SHOREFACE

This case does not require mpi, but could be run with it. Here we show how to run it in
serial mode. Set the file locations as:
export MY_HEADER_DIR=${MY_PROJECT_DIR}/ROMS/Include
export MY_ANALYTICAL_DIR=${MY_PROJECT_DIR}/ROMS/Functionals

This is built like the other test cases, using ./coawst.bash. The model is run using:

./coawstS <ROMS/External/ocean_shoreface.in

To create the forcing file, we first ran swan and used that output with the m file
swanmat2roms.m to create the swan forcing file.



5i) Headland test case.
This test case was distributed in the Rutgers trunk, but we have slightly modified it to run
within the COAWST system. This case is a coastline with a promontory that extends
outward from the land. It is set up as a coupled roms + swan simulation. More details are
here:
https://www.myroms.org/wiki/index.php/TEST_HEAD_CASE
The application is run by editing the coawst.bash file to set
export ROMS_APPLICATION=HEADLAND
I changed the name from test_head to headland, and then copied all the necessary files to
its new project directory. This case is coupled and needs mpi. Set the file locations as:

export MY_HEADER_DIR=${MY_PROJECT_DIR}/Projects/Headland
export MY_ANALYTICAL_DIR=${MY_PROJECT_DIR}/Projects/Headland

This is built like the other test cases, using ./coawst.bash. The model is run using:

mpiexec -np 2 ./coawstM Projects/Headland/coupling_headland.in

Here is an example of the flow at 12 hours.

Section 6. Running COAWST.
Each user group has its own way to run jobs.

USGS:
Edit the job control file. For our cluster using PBS this is the file run_nemo:

#!/bin/bash
### Job name
#PBS -N cwstv3
### Number of nodes
#PBS -l nodes=2:ppn=8
### Mail to user
#PBS -m ae
#PBS -M jcwarner@usgs.gov
#PBS -q standard

umask 0002

echo "this job is running on:"
cat $PBS_NODEFILE

NPROCS=`wc -l < $PBS_NODEFILE`

cd /raid1/jcwarner/Models/COAWST_regress/coawst351

mpirun -np 12 -machinefile $PBS_NODEFILE ./coawstM Projects/JOE_TCs/coupling_joe_tc.in > cwstv3.out


To run the job:
qsub run_nemo

To check job status:
qstat -an
qstat -f

To kill a job:
qdel pid (where pid is the job number)


NCSU HPC:
Configuring COAWST for Use on HPC

He queues
1) Download the latest version using subversion to your directory
   svn co --username USERNAME https://svn2.hosted-projects.com/coawst/COAWST .
2) Source (or add to your .cshrc/.bashrc file):
   #COAWST
   #Myrinet PGI v. 10.5
   source /home/gwhowell/scripts/pgi64_mx-105.csh
   setenv MCT_INCDIR /he_data/he/jbzambon/MCT/mx/build/include
   setenv MCT_LIBDIR /he_data/he/jbzambon/MCT/mx/build/lib
   setenv MPEU_LIBDIR /he_data/he/jbzambon/MCT/mx/mpeu
   setenv NETCDF /he_data/he/jbzambon/netcdf/netcdf-3.6.3/pgi105/mx/build
   setenv NETCDF_LIBDIR /he_data/he/jbzambon/netcdf/netcdf-3.6.3/pgi105/mx/build/lib
   setenv NETCDF_INCDIR /he_data/he/jbzambon/netcdf/netcdf-3.6.3/pgi105/mx/build/include
   setenv JASPERLIB /he_data/he/jbzambon/jasper/jasper-1.900.1/mx/build/lib
   setenv JASPERINC /he_data/he/jbzambon/jasper/jasper-1.900.1/inf/build/include
3) Copy the files in /he_data/he/jbzambon/coawst_hpcfixes/ as described in the README
4) Follow the instructions in Tools/Docs/COAWST_User_Manual.doc to set up COAWST
   to run your individual case.
5) Run the coawst.bash install script
   a. For configure.wrf, select 3 for Linux x86_64, PGI compiler with gcc (dmpar)
   b. For nesting, select 1 for basic
6) Modify the he_coawst script as appropriate to submit your job
7) Submit your job to the queue

He2 queues
1) Download the latest version using subversion to your directory
   svn co --username USERNAME https://svn2.hosted-projects.com/coawst/COAWST .
2) Source (or add to your .cshrc/.bashrc file):
   #COAWST
   #Infiniband PGI v. 10.5
   source /home/gwhowell/scripts/pgi105_mvapich.csh
   setenv MCT_INCDIR /he_data/he/jbzambon/MCT/inf/build/include
   setenv MCT_LIBDIR /he_data/he/jbzambon/MCT/inf/build/lib
   setenv MPEU_LIBDIR /he_data/he/jbzambon/MCT/inf/mpeu
   setenv NETCDF /he_data/he/jbzambon/netcdf/netcdf-3.6.3/pgi105/inf/build
   setenv NETCDF_LIBDIR /he_data/he/jbzambon/netcdf/netcdf-3.6.3/pgi105/inf/build/lib
   setenv NETCDF_INCDIR /he_data/he/jbzambon/netcdf/netcdf-3.6.3/pgi105/inf/build/include
   setenv JASPERLIB /he_data/he/jbzambon/jasper/jasper-1.900.1/inf/build/lib
   setenv JASPERINC /he_data/he/jbzambon/jasper/jasper-1.900.1/inf/build/include
3) Copy the files in /he_data/he/jbzambon/coawst_hpcfixes/ as described in the README
4) Follow the instructions in Tools/Docs/COAWST_User_Manual.doc to set up COAWST
   to run your individual case.
5) Run the coawst.bash install script
   a. For configure.wrf, select 3 for Linux x86_64, PGI compiler with gcc (dmpar)
   b. For nesting, select 1 for basic
6) Modify the he2_coawst script as appropriate to submit your job
7) Submit your job to the queue

Processor Configuration
Use bqueues to check the available number of processors in the queue.
Determine the processor configuration.
For example, if you wish to run on 50 processors (n=3):
25 WRF (5x5) OR (n+2) x (n+2)
16 ROMS (4x4) OR (n+1) x (n+1)
9 SWAN (3x3) OR n x n
If you wish to run on fewer processors, reduce n.
Once you have decided the number of processors to run on, make the necessary
changes to the following files:

coupling_*.in:
    NnodesATM/WAV/OCN in [root]/Projects/JOE_TC/coupling_joe_tc.in (Lines 39-41)

ocean_*.in:
    NtileI/J in [root]/Projects/JOE_TC/ocean_joe_tc.in (Lines 70-71)

namelist.input:
    nproc_x/y in [root]/namelist.input (Lines 53-54)

bsub script:
    Total number of processors in #BSUB -n (Line 2)

Run Coupled Model
After all the processors have been set up, run the coupled model by typing:
bsub < run_couple
Track the status of the model by typing:
tail -f ocean56.out


Section 7. Visualization Tools.
NCVIEW
Source code at:
http://meteora.ucsd.edu/~pierce/ncview_home_page.html

Some new colormaps at:
https://www.myroms.org/forum/viewtopic.php?f=31&t=1930&p=6926#p6926
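
A quick way to browse any of the model netcdf output files (the file name is illustrative):

ncview ocean_his.nc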


VAPORGUI
This is from the WRF visualization world.
http://www.vapor.ucar.edu/docs/install/index.php?id=unixbin


Section 8. How to set up a WRF application.

This section describes how to set up a WRF (only) application. Some of this was taken
from the WRF Users manual. We use the ARW core. It is recommended that you do
the online wrf tutorial (http://www.mmm.ucar.edu/wrf/OnLineTutorial/index.htm).

Building the WRF Code
Starting with COAWST svn version 469 (June 2, 2011), you can use the distributed WRF
and WPS programs to create the WRF forcings and/or to run WRF by itself.

Make a new directory, copy the COAWST code into that dir, and cd into that dir.
You will always need a Projects header file. I recommend that you create a directory
called "My_Project" (create your own name; don't actually use 'My_Project'). The
directory could be: COAWST/Projects/My_Project. In that directory, create a header file
such as: my_project.h. For a wrf only application, you would need to have in that file:
#define WRF_MODEL
Edit coawst.bash and set the Project name, header file location, and build location.
Build the model with
./coawst.bash
o choose one of the options
o usually, option "1" is for a serial build; that is the best for an initial test (we use option
3 for pgi), then 1 again for basic nesting (if needed).
When it is done, you can list the contents: ls WRF/main/*.exe
o if you built a real-data case, you should see ndown.exe, real.exe, and
wrf.exe
o if you built an ideal-data case, you should see ideal.exe and wrf.exe


Building the WPS Code
Building WPS requires that WRF is already built. WPS is distributed with the code.
cd COAWST/WPS
./configure
o choose one of the options
o usually, option "1" is for a serial build, that is the best for an initial test (we used 5)
./compile
ls -ls *.exe
o you should see geogrid.exe, ungrib.exe, and metgrid.exe
ls -ls util/*.exe
o you should see a number of utility executables: avg_tsfc.exe, g1print.exe,
g2print.exe, mod_levs.exe, plotfmt.exe, plotgrid.exe, and
rd_intermediate.exe

geogrid: Creating a Grid
1) edit COAWST/WPS/namelist.wps

namelist.wps
&share
 wrf_core = 'ARW',
 max_dom = 1,
 start_date = '2003-12-01_00:00:00',
 end_date   = '2003-12-19_00:00:00',
 interval_seconds = 21600
 io_form_geogrid = 2,
/

&geogrid
 parent_id         = 1,
 parent_grid_ratio = 1,
 i_parent_start    = 1,
 j_parent_start    = 1,
 e_we = 395,
 e_sn = 360,
 geog_data_res = '10m',
 dx = 12000,
 dy = 12000,
 map_proj  = 'mercator',
 ref_lat   = 32.0381,
 ref_lon   = -77.45058,
 truelat1  = 31.5,
 truelat2  = 0.0,
 stand_lon = -78.4,
 geog_data_path = '/raid1/jcwarner/Models/WRF/geog'
/

&ungrib
 out_format = 'WPS',
 prefix = 'FILE',
/

&metgrid
 fg_name = 'FILE'
 io_form_metgrid = 2,
 opt_output_from_metgrid_path = './',
 opt_metgrid_tbl_path = 'metgrid/',
/

&mod_levs
 press_pa = 201300, 200100, 100000,
            95000, 90000,
            85000, 80000,
            75000, 70000,
            65000, 60000,
            55000, 50000,
            45000, 40000,
            35000, 30000,
            25000, 20000,
            15000, 10000,
            5000, 1000
/
2) Get the geogrid data from http://www.mmm.ucar.edu/wrf/src/wps_files/geog.tar.gz
and untar it to a directory called geog:
gzip -cd geog.tar.gz | tar -xf -
Set geog_data_path in the namelist.wps file to this dir.

3) Make sure there is a link to the correct geogrid table.
ls -ltr WPS/geogrid/*.TBL
should return
geogrid/GEOGRID.TBL -> GEOGRID.TBL.ARW

4) Run geogrid. From the WPS directory:
./geogrid.exe
and check that it reports successful completion.

5) ls -ltr should show a geo_em.d01.nc file. Use ncview or some other viewer to check it
out.

ungrib: Getting IC and BC data
1) Get grib data here:
http://dss.ucar.edu/datasets/ds083.2/
You will need to register first and get a username/passwd.
Then you can use wget (v1.9 or higher) to get the data. Here is an example of a wget
script:

############## file get_gfs_data #########################
#!/bin/csh -f
#
# c-shell script to download selected files from dss.ucar.edu
# using wget
# NOTE: if you want to run under a different shell, make sure you
# change the 'set' commands according to your shell's syntax
# after you save the file, don't forget to make it executable
# i.e. - "chmod 755 <name_of_script>"
#
# Experienced Wget Users: add additional command-line flags here
# Use the -r (--recursive) option with care
# Do NOT use the -b (--background) option - simultaneous file
# downloads can cause your data access to be blocked.
set opts = "-N"
#
# Replace "xxxxxx" with your password
# IMPORTANT NOTE: If your password uses a special character that has
# special meaning to csh, you should escape it with a backslash
# Example: set passwd = "my\!password"
set passwd = "MYPASSWD"
#
#set cert_opt = ""
# If you get a certificate verification error (version 1.10 or higher),
# uncomment the following line
set cert_opt = "--no-check-certificate"
#
# authenticate
wget $cert_opt -O /dev/null --save-cookies auth.dss_ucar_edu --post-data="email=jcwarner@usgs.gov&passwd=$passwd&action=login" https://dss.ucar.edu/cgi-bin/login
#
# download the file(s)
wget $cert_opt $opts --load-cookies auth.dss_ucar_edu http://dss.ucar.edu/datasets/ds083.2/data/grib1/2003/2003.09/fnl_20030909_00_00
wget $cert_opt $opts --load-cookies auth.dss_ucar_edu http://dss.ucar.edu/datasets/ds083.2/data/grib1/2003/2003.09/fnl_20030909_06_00
wget $cert_opt $opts --load-cookies auth.dss_ucar_edu http://dss.ucar.edu/datasets/ds083.2/data/grib1/2003/2003.09/fnl_20030909_12_00
wget $cert_opt $opts --load-cookies auth.dss_ucar_edu http://dss.ucar.edu/datasets/ds083.2/data/grib1/2003/2003.09/fnl_20030909_18_00
wget $cert_opt $opts --load-cookies auth.dss_ucar_edu http://dss.ucar.edu/datasets/ds083.2/data/grib1/2003/2003.09/fnl_20030910_00_00
#
# clean up
rm auth.dss_ucar_edu
############## end of file get_gfs_data #########################
Make the file executable and run it: ./get_gfs_data
Copy the data somewhere convenient; I used /raid1/jcwarner/Models/WRF/gfs_data/fnl_0309_gfs

2) The gfs data is in grib format. We want to convert it to a format that WRF
can read, so we use the ungrib.exe program. cd to COAWST/WPS and
link a Vtable to the GFS data with:
>ln -sf ungrib/Variable_Tables/Vtable.GFS Vtable

3) Link the GFS files to common names that WPS will recognize
(I moved all the fnl files to the gfs_data/fnl_0309 dir):
>./link_grib.csh /raid1/jcwarner/Models/WRF/gfs_data/fnl_0309_gfs/fnl_0309
Do an ls -ltr and see the GRIBFILE.AAA etc. links pointing to all the gfs data files.

4) Run ungrib:
>./ungrib.exe >& ungrib.out &
Check the .out file to see if there were any problems. Do an ls and you should see
intermediate files named FILE:YYYY-MM-DD_HH.
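
As a quick sanity check, count the intermediate files; for a run of N days at the
21600 s (6 h) interval you should see 4N+1 of them (a sketch):

>ls -1 FILE:* | wc -l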

5) Get SST data
Go here for SST data: ftp://polar.ncep.noaa.gov/pub/history/sst
Download the times that you want, and put them into a folder.
I used: /WRF/gfs_data/fnl_0309_sst

5a) WPS/ ln -sf ungrib/Variable_Tables/Vtable.SST Vtable

5b) WPS/ rm GRIBFILE*
WPS/ ./link_grib.csh
/raid1/jcwarner/Models/WRF/gfs_data/fnl_0309_sst/rtg_sst_grb_0.5

5c) edit COAWST/WPS/namelist.wps and change
prefix = 'FILE'
to
prefix = 'SST'

5d) COAWST/WPS/ ./ungrib.exe >& ungrib_sst.out &
Check the .out file to see that all went well.



metgrid: Interpolate ungribbed met data to the grid.
1a) Check that the metgrid table is pointing to the right place:
>cd metgrid
>ls -ltr
should show METGRID.TBL -> METGRID.TBL.ARW

1b) If you acquired sst data, then edit
COAWST/WPS/namelist.wps and change fg_name from
fg_name = 'FILE'
to
fg_name = 'FILE', 'SST'

1c) Run metgrid:
COAWST/WPS/ ./metgrid.exe
As it runs you will see messages like "Processing FILE" and "Processing SST".
When done, check that the met files were made: do an ls -ltr and look for the
met_em.d01* files.
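
One useful check on the met_em files is the number of vertical levels, since
num_metgrid_levels in namelist.input (next step) must match it. A minimal sketch,
using one of the files just created as an example:

>ncdump -h met_em.d01.2003-12-01_00:00:00.nc | grep num_metgrid_levels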

real.exe to create the init and BC files.
Now we need to run the real program.
1) >cd WRF/test/em_real
and edit the namelist file. I have:

&time_control
 run_days    = 18,
 run_hours   = 0,
 run_minutes = 0,
 run_seconds = 0,
 start_year   = 2003, 2000, 2000,
 start_month  = 12,   01,   01,
 start_day    = 01,   24,   24,
 start_hour   = 00,   12,   12,
 start_minute = 00,   00,   00,
 start_second = 00,   00,   00,
 end_year     = 2003, 2000, 2000,
 end_month    = 12,   01,   01,
 end_day      = 19,   25,   25,
 end_hour     = 00,   12,   12,
 end_minute   = 00,   00,   00,
 end_second   = 00,   00,   00,
 interval_seconds = 21600
 input_from_file = .true., .true., .true.,
 history_interval = 60, 60, 60,
 frames_per_outfile = 12, 1000, 1000,
 restart = .false.,
 restart_interval = 720,
 io_form_history = 2,
 io_form_restart = 2,
 io_form_input = 2,
 io_form_boundary = 2,
 debug_level = 0,
 auxinput4_inname = "wrflowinp_d<domain>",
 auxinput4_interval = 360,
 io_form_auxinput4 = 2
/

&domains
 time_step = 60,
 time_step_fract_num = 0,
 time_step_fract_den = 1,
 max_dom = 1,
 s_we = 1, 1, 1,
 e_we = 395, 112, 94,
 s_sn = 1, 1, 1,
 e_sn = 360, 97, 91,
 s_vert = 1, 1, 1,
 e_vert = 28, 28, 28,
 num_metgrid_levels = 27
 dx = 12000, 10000, 3333.33,
 dy = 12000, 10000, 3333.33,
 grid_id = 1, 2, 3,
 parent_id = 0, 1, 2,
 i_parent_start = 1, 31, 30,
 j_parent_start = 1, 17, 30,
 parent_grid_ratio = 1, 3, 3,
 parent_time_step_ratio = 1, 3, 3,
 feedback = 1,
 smooth_option = 0,
 nproc_x = 8,
 nproc_y = 8,
/

&physics
 mp_physics = 6, 3, 3,
 ra_lw_physics = 1, 1, 1,
 ra_sw_physics = 1, 1, 1,
 radt = 6, 30, 30,
 sf_sfclay_physics = 2, 1, 1,
 sf_surface_physics = 2, 2, 2,
 bl_pbl_physics = 2, 1, 1,
 bldt = 0, 0, 0,
 cu_physics = 1, 1, 0,
 cudt = 6, 5, 5,
 isfflx = 1,
 ifsnow = 0,
 icloud = 1,
 surface_input_source = 1,
 num_soil_layers = 4,
 ucmcall = 0,
 maxiens = 1,
 maxens = 3,
 maxens2 = 3,
 maxens3 = 16,
 ensdim = 144,
 sst_update = 1
/

&fdda
/

&dynamics
 w_damping = 0,
 diff_opt = 1,
 km_opt = 4,
 diff_6th_opt = 0,
 diff_6th_factor = 0.12,
 base_temp = 290.
 damp_opt = 0,
 zdamp = 5000., 5000., 5000.,
 dampcoef = 0.2, 0.2, 0.2
 khdif = 0, 0, 0,
 kvdif = 0, 0, 0,
 non_hydrostatic = .true., .true., .true.,
 pd_moist = .true., .true., .true.,
 pd_scalar = .true., .true., .true.,
/

&bdy_control
 spec_bdy_width = 5,
 spec_zone = 1,
 relax_zone = 4,
 specified = .true., .false., .false.,
 nested = .false., .true., .true.,
/

&grib2
/

&namelist_quilt
 nio_tasks_per_group = 0,
 nio_groups = 1,
/

Pay attention to time and e_* settings.
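
For example, one quick consistency check is to compare the e_we/e_sn values here
against what geogrid used (a sketch, assuming the standard COAWST layout where WPS
sits alongside WRF, three levels up from WRF/test/em_real):

>grep -E "e_we|e_sn" ../../../WPS/namelist.wps namelist.input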

2) cp or link the met files to the WRF/test/em_real directory:
>ln -sf /raid1/jcwarner/Models/WRF/WPS/met_em.d01.2003-09* .

3) If you do not want a bogus vortex, go to step 4.
If you want to include a bogus vortex:
This is sort of painful, but may be worth it.
3a) Reconfigure WRF to work in serial mode (configure option 1, nest option 0).
3b) Edit WRF/test/em_real/namelist.input to include

&tc
 insert_bogus_storm = .true.
 remove_storm = .false.
 latc_loc = 21.2
 lonc_loc = -52.3
 vmax_meters_per_second = 62.5
 rmax = 100000.0
 vmax_ratio = 0.85
/

Namelist var descriptions here =>
http://www.mmm.ucar.edu/wrf/users/docs/user_guide_V3.1/users_guide_chap10.htm#tc_bogus

3c) Run tc.exe with:
COAWST/WRF/test/em_real/ mpirun -np 1 -hostfile nodes.list ./tc.exe

3d) cp the output file to replace the first met_em file, for example:
cp auxinput1_d01_2003_09_11:00:00:00 met_em.d01.2003_09_11:00:00

3e) Reconfigure wrf back for mpi:
>WRF ./configure (choose 3, then 1)
>WRF ./compile em_real >& compile.out &


4) Run the real program.
4a) Edit namelist.input and set nproc_x = 1, nproc_y = 1.
4b) Make a file called nodes.list, and just list one node:
nemo3
Then run real (in mpi) with this command:
>mpirun -np 1 -hostfile nodes.list ./real.exe
When done, check that it made the wrfinput_d01 and wrfbdy_d01 netcdf files.
Check the rsl.error.* and rsl.out.* files to see that all went well.
I then removed the met* files.
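
A quick way to verify this from the command line (a sketch; the exact success
message varies with WRF version):

>grep -i success rsl.out.0000
>ls -l wrfinput_d01 wrfbdy_d01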


run WRF.
Now we need to run the wrf program. For a wrf-only application, I soft-linked coawstM
to WRF/test/em_real/wrf.exe. Edit namelist.input and reset nproc_x=NX,
nproc_y=NY.

Then submit the job, >qsub run_nemo, where run_nemo contains:

#!/bin/bash
### Job name
#PBS -N wrf1
### Number of nodes
#PBS -l nodes=5:ppn=8,walltime=120:00:00
### Mail to user
#PBS -m ae
#PBS -M jcwarner@usgs.gov
### Out files
###PBS -e isabel_105.err
###PBS -o isabel_105.out
### PBS queue
#PBS -q standard

echo "this job is running on:"
cat $PBS_NODEFILE

NPROCS=`wc -l < $PBS_NODEFILE`

cd /raid1/jcwarner/Models/COAWST

mpirun -np 40 -machinefile $PBS_NODEFILE ./coawstM > wrf_run1.out

Look at the output, for example:
ncview wrfout_d01_*

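To confirm the run covered the intended period, you can also list the time stamps in
the history file (Times is the standard WRF time variable; the file name here is
just an example):

>ncdump -v Times wrfout_d01_2003-12-01_00:00:00 | tail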

Section 9. How to get boundary, init, and climatology data for ROMS
grids.
The main driver is Tools/mfiles/roms_clm/roms_master_climatology_coawst_mw.m. The user
needs to edit this file to identify grid and directory names. This set of tools
interpolates HYCOM data to the user grid for the requested time period. Users will also
need to edit the irg.mat and jrg.mat files to set the HYCOM grid indices from which to
interpolate the data.
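
If you prefer to run the driver without the MATLAB desktop, a minimal sketch
(assuming MATLAB is on your path and you have already edited the driver as
described):

>cd Tools/mfiles/roms_clm
>matlab -nodisplay -nosplash -r "roms_master_climatology_coawst_mw; exit"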



Section 10. How to get wind, boundary, and init files for SWAN.

Wind Forcing:
If not coupled to wrf, you need to download wind data from somewhere and create an
ascii wind file, written in one of the formats described in the SWAN manual; we
typically use format number 4. We typically obtain wind data from NARR at
http://nomads.ncdc.noaa.gov/.
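
For reference, the pair of SWAN commands that read such a file looks roughly like
the sketch below; the grid numbers, dates, and file name are placeholders, so check
the INPGRID/READINP sections of the SWAN manual for the exact arguments:

INPGRID WIND REGULAR -78.0 32.0 0. 100 100 0.1 0.1 NONSTAT 20031201.000000 6 HR 20031219.000000
READINP WIND 1.0 'narr_wind.dat' 4 0 FREE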

Boundary data:
There are several sources and methods to obtain boundary data for SWAN. We typically
obtain data from Wave Watch 3 and use it to create TPAR files (see the SWAN manual).
Here are some helpful methods/tools.

1. To run in hindcast mode, you will need to download historical ww3 data in grib2
format from NOAA's ftp server. First, decide which NOAA grid you want the data from:
ftp://polar.ncep.noaa.gov/pub/history/waves/README

2. Get the data. Go here to get ftp data for the grid and time period you need.
This is the global data set:
ftp://polar.ncep.noaa.gov/pub/history/waves/
(scroll down to, for example, multi_1.glo_30m.xx.YYYYMM.grb2)
Get the three files for each month: xx = hs, tp, dp (Hsig, Period, and Direction).
Do not change the file names. (A wget sketch is given after this list.)

3. The main driver is Tools/mfiles/swan_forc/ww3_swan_input.m, which creates the
SWAN TPAR boundary forcing files. For hindcast mode, the TPAR files are converted
from the grib2 data using nctoolbox. Edit ww3_swan_input.m, enter the required user
info, and run the m file.

4. Edit the SWAN INPUT file and add the information from the Bound_spec_command
file.

5. Place the TPAR files in the ../forcings folder.
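
A minimal wget sketch for step 2 (the grid and month here are placeholders;
substitute the period you need):

>wget ftp://polar.ncep.noaa.gov/pub/history/waves/multi_1.glo_30m.hs.200911.grb2
>wget ftp://polar.ncep.noaa.gov/pub/history/waves/multi_1.glo_30m.tp.200911.grb2
>wget ftp://polar.ncep.noaa.gov/pub/history/waves/multi_1.glo_30m.dp.200911.grb2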

Init files:
Method 1: (try this first)
In the swan input file, use the command
INITIAL DEFAULT
(this requires using the linear wind growth terms)

Method 2:
To create an init file for swan, you can run SWAN in stationary mode on the same
number of processors and create hot start file(s).
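
The corresponding SWAN commands look roughly like this sketch (the file name is a
placeholder; see the HOTFILE and INITIAL HOTSTART entries in the SWAN manual):

$ in the stationary run, write the hot file:
HOTFILE 'swan_init.hot'
$ in the subsequent nonstationary run, read it:
INIT HOTSTART 'swan_init.hot'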

Section 11. m files.
- As of release 567, February 8, 2012, we have rewritten the m files to use the native
matlab netcdf interface. This allows a more uniform practice amongst users. The caveat
is that the matlab interface uses the Fortran convention, i.e. variables are read and
written as Var(x,y,z,t). Many of the previous toolboxes used the C convention of
Var(t,z,y,x). Please be careful when using the native matlab interface. You can use
Tools/mfiles/mtools/netcdf_load to load a netcdf file into the
workspace.
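
To see the two conventions side by side, compare the C-ordered ncdump listing with
what MATLAB returns; a sketch, using a hypothetical ROMS history file:

>ncdump -h ocean_his.nc | grep "float temp"
   float temp(ocean_time, s_rho, eta_rho, xi_rho) ;

In MATLAB, size(ncread('ocean_his.nc','temp')) returns the reversed order
[xi_rho eta_rho s_rho ocean_time].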

- The only other toolbox that is currently used is nctoolbox, found at:
http://code.google.com/p/nctoolbox/
Follow the instructions on that website to install it; it is relatively painless. These
nctools are used in the roms_clm routines to read HYCOM data via opendap. They are
also used in the swan_forc m files to read the grib2 WW3 data.
nctoolbox is not required to run the COAWST system itself; we are just offering it as a
method to obtain forcing, bry, and init data. Users can obtain these data by many other
methods.

mfile listing:
- inwave_tools: under development. Not supported at this time.

- landmask: not distributed yet; needs to be updated to the matlab interface.

- m_map: set of utilities used in the grid generation routines, used to convert lat/lon to
meters, and to support different projections.

- mtools: set of utilities, mostly for ROMS, to create grids, load netcdf files, convert
grids for scrip, create a wrf grid from a roms grid, etc.

- roms_clm: main driver is roms_master_climatology_coawst_mw.m, used to create
boundary, init, and climatology files for roms grids using opendap to read
HYCOM data.

- swan_forc: main driver is ww3_swan_input.m to read WW3 grib2 data and create
SWAN TPAR forcing files.


Section 12. Change log.
"svn log" gives the whole listing. Here are some key updates:

Rev # Date Description
-------------------------- COAWST V1.0 ------------------------------------
0 01Jan2009 First upload.
17 23Feb2009 Merge grid refinement with atm coupling methods.
83 16Apr2009 Update time stepping for refined grids.
99 24Apr2009 Add SWAN forcing package.
107 28Apr2009 Add InWave.
147 06Jul2009 Add Inlet tests (3).
185 07Aug2009 Update to WRFV3.1.1
196 03Sep2009 Add BLOCK_JONSSON.
203 14Sep2009 Add UV_KIRBY vel avg for wave coupling.
207 16Sept2009 Added hwave/lwave from SWAN to WRF.
-------------------------- COAWST V2.0 ------------------------------------

250-262 01May2010 Update to ROMS svn 455, Update SWAN 4072ABCDE
This was a major update. Complete build is now handled
thru coawst.bash script.

-------------------------- COAWST V3.0 ------------------------------------

300-315
330 15Jun2010 Removed old Project files.
331-338 07Jul2010 Add ATM2OCN_FLUXES to allow consistent wrf-roms
fluxes and to update scrip interpolation to use conservative
fluxes.
351 19Jul2010 Update Inlet tests to work with consv fluxes, and be more
consistent. Update manual to include Inlet Tests
descriptions, added m files to create the refined grids.
393 07Dec2010 SWAN uses urms instead of ubot to compute Madsen
bottom stress. Also pass urms to roms for bbl models
instead of ubot. Rename Ub_swan to Uwave_rms. If user
does not activate a bbl, then swan fric will be based on user
supplied value in swan.in. If user does activate bbl, then
swan fric is based on ZoNik.
426 27Feb2011 Update to WRF 3.2.1
430 05Mar2011 Incorporate wec-vortex force method.
433 09Mar2011 Update to SWAN 40.81
469 02Jun2011 Add WPS and WRF to run separately
476-477 12Jul2011 Add WRF_CPL_GRID to choose which wrf grid to couple
to. Modifications to work for ifort on cx2. Create
mct_coupler_params file to hold coupled vars. Modify WRF
to write to stdout, not to rsl files. modify coupling.in files.
500-516 27Oct2011 Update to conservative 2-way grid refinement for ROMS.
528-531 30Nov2011 Incorporate Seaice module.
561 08Feb2012 Update m files to use native matlab netcdf interface. Go
thru the distributed swan_forc, roms_clm, and mtools to
ensure functionality. Update/clean up manual.
599 11Jul2012 last distribution on hosted-projects.com Moved to Source
repo.
602 11Jul2012 Allow SWAN only simulations.
603 11Jul2012 Allow WRF only simulations.
607-635 24Jul2012 Update to WRF 3.4.
669 02Oct2012 Add #define ROMS_MODEL to all roms apps.
672 11Oct2012 Change mct_coupler_params to mct_wrf_coupler_params in
wrf side, update wrf time step counter.
673 11Oct2012 Update ROMS grid refinement to allow refined region to
vary spatially in the parent grid (interp2 w/ spline).
680 22Oct2012 Edit Compilers to not use LIBS=netcdf calls for swan only
compilations.
681 22Oct2012 Allow SWAN to create its own hot file.
682 24Oct2012 Allow SWAN to run in STAT mode.
686 01Nov2012 Update mask_io to not overwrite point sources in his files.
701-708 04Feb2013 Update SWAN to 4091A.
725 26Apr2013 Update JOE_TC to use sf_surface_physics =2.
751-752 06Sept2013 Update Rip_current test case to Kumar et al 2012.
758 16Sept2013 Update some roms input files so they all have Ltracer src
762-767 26Sept2013 Allow WRF-SWAN coupling, add DRAGLIM_DAVIS
771 23Oct2013 Add headland test case
789 19Dec2013 Update wrf-swan coupling for different grids.
801 07Mar2014 Update Compilers and mct_coupler for gfortran with wrf.
813 17Apr2014 Update Compilers and wrf makefile, -I more wrf paths,
gfortran flags for recl=4.
Section 13. Distribution text and user list.

Dear COAWST User-

Your username is listed above, and your password is your username with 12345 after it. To check
out the code I suggest you make a directory called COAWST, cd to that dir, and use:

svn checkout --username usrname
https://coawstmodel.sourcerepo.com/coawstmodel/COAWST .

Notice the . at the end of the command to place the code in that location. Alternatively
you can put a path instead of the dot.

The code now contains these versions:
ROMS svn 455
WRF v3.4
SWAN 40.91A
MCT 2.6.0

There have been many changes in this latest release. Please read the document:
COAWST\Tools\Docs\COAWST_User_Manual.doc
for information about new build methods, new cpp options, etc.
Also, please look at the new Discussion and Trac site at:
https://coawstmodel-trac.sourcerepo.com/coawstmodel_COAWST/
You need to login to see the Discussion site.

If you have any issues please respond to me. If you have questions about a particular model,
some useful forums are:

ROMS
https://www.myroms.org/forum/index.php

WRF
http://forum.wrfforum.com/

SWAN
http://swanmodel.sourceforge.net/

MCT
http://www.mcs.anl.gov/research/projects/mct/

As always, I will try my best to help you out, especially if the problem is with the coupling. But do
your part to investigate how to use each of these models.

thanks,
john
User List.
jcwarner jcwarner@usgs.gov John Warner *
maitane maitane.olabarrieta@essie.ufl.edu Maitane Olabarrieta *
Jacopo jacopo.chiggiato@ismar.cnr.it Jacopo Chiggiato #*
barmstrong barmstrong@usgs.gov Brandy Armstrong *
jbzambon jbzambon@ncsu.edu Joe Zambon *
rhe rhe@ncsu.edu Ruoying He *
khaas Kevin.haas@gtsav.gatech.edu Kevin Haas *
asingleton a.singleton@imperial.ac.uk Andrew Singleton x
ddukhovskoy ddukhovskoy@fsu.edu Dmitry Dukhovskoy #
jmralves jmralves@fc.ul.pt Jose Alves #*
tracey tracey@surflegend.co.jp Tracey Tom #*
milicak milicak@rsmas.miami.edu Mehmet Ilicak *
nganju nganju@usgs.gov Neil Ganju #*
liess liess@umn.edu Stefan Liess
dputrasa dputrasa@gmail.com Dian Putrasahan
gpassala gpassala@ucsd.edu Gino Passalacqua #*
stone lujy1985@ouc.edu.cn Stone
arango arango@marine.rutgers.edu Hernan Arango *
kumar nkumar@geol.sc.edu Nirnimesh Kumar *
ivica ivica.jan@gmail.com Ivica Janekovic *
pbudgell admin@mariclime.com Paul Budgell *
xfyang xfyang@gatech.edu Xiufeng Yang *
lrenault lrenault@atmos.ucla.edu Lionel Renault #*
agmunoz agmunoz@cmc.org.ve Ángel G. Muñoz #
zxue zxue@ncsu.edu Zuo (George) Xue #*
backkom yaozhigang.ouc@gmail.com Zhigang Yao
luciano luciano.pezzi@cptec.inpe.br Luciano Ponzi Pezzi #*
jnelson jsnelso2@ncsu.edu Jill Nelson
soupy sdalyander@usgs.gov P.Soupy Dalyander *
cdenamiel Clea.Denamiel@usm.edu Clea Denamiel *
lfowler l.fowler09@imperial.ac.uk Luke Fowler
james james.farley-nicholls05@imperial.ac.uk James Farley-Nicholls
rtoumi r.toumi@imperial.ac.uk Ralf Toumi
mdutour Mathieu.Dutour@irb.hr Mathieu Dutour *
kuzmic kuzmic@irb.hr Milivoj Kuzmic *
rftome rftome@gmail.com Ricardo Tomé
scarniel sandro.carniel@ismar.cnr.it Sandro Carniel *
sudipta sudiptab@cdac.in Sudipta Banerjee x
bsantas basantas@cdac.in Basanta Samal x
rsamelson rsamelson@coas.oregonstate.edu Roger Samelson
bin_liu bin_liu@ncsu.edu Bin Liu #*
yzheng yzheng@fsu.edu Yangxing Zheng *
khyun khyun@ncsu.edu Kyung Hoon Hyun
andres andres@dgeo.udec.cl Andres Sepulveda
cjjang cjjang@kordi.re.kr Chan Joo Jang #*
pvelissariou pvelissariou@fsu.edu Panagiotis Velissariou #
xmzeng xzeng2@ncsu.edu XM Zeng *
tomasz tomasz.dabrowski@marine.ie Tomasz Dabrowski
arusso a.russo@univpm.it Aniello Russo #
gvoulgaris gvoulgaris@geol.sc.edu George Voulgaris *
gwilson greg.wilson@dal.ca Greg Wilson
rgonzalez raul.gonzalez@plymouth.ac.uk Raúl González-Santamaría #
mcure marcel.cure@numericswarehouse.com Marcel Cure #*
bzhang bzhangys@yahoo.com Bin Zhang #
nmori mori.nobuhito.8a@kyoto-u.ac.jp Nobuhito Mori #*
pmooney priscilla.a.mooney@nuim.ie Priscilla Mooney *
rcaldeira rcaldeira@ciimar.up.pt Rui Caldeira #*
nnatoo nnatoo@marum.de Nilima Natoo #*
nguyen minh.nguyen.hus@gmail.com NGUYEN Nguyet_Minh
lifangjiang lifangjiang@scsio.ac.cn Lemon *
ishihara ishihara@dcc.co.jp Shuji Ishihara
ironling ironling@gmail.com Ironling #*
gabriel gabriel.munchow@ufrgs.br Gabriel Munchow #*
nicholls stephen.d.nicholls@nasa.gov Stephen Nicholls #*
npisciotto npisciot33@gmail.com Nick Pisciotto
daeyong dyeatmos@gmail.com Dae-Yong Eom *
jksim jksim@kesti.co.kr Jinkyoung Sim
rhetland hetland@tamu.edu Rob Hetland *
liukevin rcxyrcxy@hotmail.com LiuKevin
arturo arturo.quintanar@gmail.com Arturo Quintanar *
cdrews drews@ucar.edu Carl Drews #*
kkobchul kkobchul@naver.com Chul-Min Ko #*
mashinde samurai@tropmet.res.in Mahesh Shinde
jkala J.Kala@murdoch.edu.au Jatin Kala
wlhuang hwl1943@gmail.com Wen-Long Huang #*
akbarpour akbarpour@inio.ac.ir Mahmood Akbarpour
pnolan paul.nolan@ucd.ie Paul Nolan
apau apaul@marum.de Andre Paul *
chunsil sky@nwp2.snu.ac.kr Chun-Sil Jin
bonnland bonnland@ucar.edu Brian Bonnlander *
gill gill@ucar.edu David Gill *
bruyerec bruyerec@ucar.edu Cindy Bruyere *
zdong zdong@UDel.Edu Zhifei Dong
kirby kirby@UDel.Edu Jim Kirby
lequochuy lequochuy@imh.ac.vn Le Quoc Huy #*
hung hungdamduy@gmail.com Hung Damduy #*
wang wzq@lasg.iap.ac.cn Wang *
manel manel.grifoll@upc.edu Manel Grifoll *
torleif torleif.lunde@cih.uib.no Torleif Markussen Lunde
martin martin.king@uni.no Martin P. King #
dwang dwang@umces.edu Dakui Wang #*
ephraim ephraim.paul@ocean.tamu.edu Ephraim U. Paul #
zhaoruzhang zhaoruzhang@gmail.com Zhaoru Zhang
wenxiazhang wenxiazhang@tamu.edu Wenxia Zhang
kelly.lynne kelly.lynne.cole@gmail.com Kelly Cole #
zhangxq zhangxq@neo.tamu.edu Xiaoqian Zhang *
celso celsoferreira@tamu.edu Celso Ferreira
rmontuoro rmontuoro@tamu.edu Raffaele Montuoro
christina christina.patricola@gmail.com Christina Patricola #
jshsieh jsh@ocean.tamu.edu Jen-Shan Hsieh #*
jaison jaison@atmos.ucla.edu Jaison Kurian
mingkui mingkui.li@gmail.com Mingkui Li *
xuzhao xuzhao@neo.tamu.edu Zhao Xu
maxiaohui maxiaohui@neo.tamu.edu Xiaohui Ma
karthik Karthik.Balaguru@pnnl.gov Karthik Balaguru
sarava sarava@tamu.edu R. SARAVANAN
ping ping@tamu.edu PING CHANG
khiem khiem@imh.ac.vn Mai Van Khiem
kien kien.cbg@gmail.com BS. Truong Ba Kien #
abhisek abhisek.hit@gmail.com Abhisek Chakraborty
markus markus.gross@metoffice.gov.uk Markus Gross
johannes johannesro@met.no Johannes Röhrs
tcraig tcraig@ucar.edu Tony Craig
zhaobiao zhaobiaodeyouxiang@163.com Biao Zhao #
bbutman bbutman@usgs.gov Brad Butman
mahmood mahmoodakbarpour@gmail.com Mahmood Akbarpour
shihnan schen77@ntu.edu.tw Shih-Nan *
zafer zdefne@usgs.gov Zafer Defne *
benjamin benjamin.wong11@imperial.ac.uk Benjamin Wong *
alexis alexis.perez@insmet.cu Alexis Perez Bello
rsignell rsignell@usgs.gov Rich Signell *
jmsole jmsole@meteosim.com Josep Maria Solé *
raluca raluca.radu@imperial.ac.uk Raluca Radu *
csherwood csherwood@usgs.gov Christopher R. Sherwood *
jorge jorge.navarro@ciemat.es Jorge Navarro *
nilsmk nilsmk@met.no Nils Melsom Kristensen
isafak isafak@usgs.gov Ilgar Safak *
alvise alvise.benetazzo@ve.ismar.cnr.it Alvise Benetazzo *
nehru nehru.machineni@gmail.com Nehru Machineni *
sword shijian.ocean@qq.com Shi Jian *
francesco francesco.falcieri@ve.ismar.cnr.it Francesco Falcieri
alfredo aaretxabaleta@usgs.gov Alfredo Aretxabaleta *
huadong huadong.du@gmail.com Du Huadong *
kritanai kritanai.torsri@gmail.com Kritanai Torsri *
jwlong jwlong@usgs.gov Joe Long *
ijmoon ijmoon@jejunu.ac.kr Il-Joo Moon *
adakudlu Muralidhar.Adakudlu@gfi.uib.no Muralidhar Adakudlu *
ctang ctang@yic.ac.cn Cheng Tang *
___________________ move to source repo _______________________

jl_hdzf jl_hdzf@yahoo.com Jose L Hernandez
guoqiang Guoqiang.Liu@dfo-mpo.gc.ca Guoqiang Liu
jian jhe6@ncsu.edu Jian He
colucix a.coluccelli@univpm.it Alessandro Coluccelli
avalentini avalentini@arpa.emr.it Andrea Valentini
fjose fjose@fgcu.edu Dr. Felix Jose
paulo paulo.calil@furg.br Paulo H. R. Calil
ramos ramos.oceano@gmail.com Arthur Ramos
alexander alexander.haumann@usys.ethz.ch Alexander Haumann
nardieric nardieric@gmail.com Eric Nardi
majid majid.noranian@gmail.com Majid Noranian
lanli Lanli.Guo@dfo-mpo.gc.ca Lanli Guo
hhiester hhiester@fsu.edu Hannah Hiester
zhangchen 15054823857@163.com Zhang Chen
pilar pilar.delbarrio@unican.es Pilar Delbarrio
tjdewhurst tjdewhurst@gmail.com Toby Dewhurst
diane diane.foster@unh.edu Diane Foster
ygh2 ygh2@wildcats.unh.edu Yunus Cezarli
cenglert cenglert.nh@gmail.com Chris Englert
evan evan.gray.04@gmail.com Evan Gray
crf43 crf43@wildcats.unh.edu Charlie Watkins
donya donyafrank@gmail.com Donya Frank
will will@ccom.unh.edu Will Fessenden
jchadwick jchadwick@ccom.unh.edu Jordon Chadwick
jsrogers jsrogers@stanford.edu Justin Rogers
cdong cdong@atmos.ucla.edu Changming Charles Dong
rhg11c rhg11c@my.fsu.edu Russell Glazer
tkkim tk.kim@kiaps.org Taekyun Kim
marcelo Marcelo.HerediaGomez@anteagroup.com Heredia Gomez Marcelo
ibnu ibnusofian@bakosurtanal.go.id Ibnu Sofian
charles charles.james@sa.gov.au Charles James
ryan Ryan.Lowe@uwa.edu.au Ryan Lowe
icebearw icebearw@yahoo.com.cn Dan Wang
mhadfield Mark.Hadfield@niwa.co.nz Mark Hadfield
doristea doristea@gmail.com Doris Diana Otero Luna
djkobashi d.kobashi@tamu.edu DJ Kobashi *
shenbiao shenbiao@ouc.edu.cn Biao Shen
thomaski thomaski@hawaii.edu Thomas Kilpatrick
barcenajf barcenajf@unican.es Javi Bárcena
garciajav garciajav@unican.es Javi García
perezdb perezdb@unican.es Beatriz Pérez
cardenasm cardenasm@unican.es Mar Cárdenas
cida cida@unican.es Alba Cid
zengz zengz@unican.es Zeng Zhou
pnp29 pnp29@alumnos.unican.es Paula Núñez
requejos requejos@unican.es Soledad Requejo
gutierrezoq gutierrezoq@unican.es Omar Gutiérrez
belen belen.lopez@unican.es Belén López de San Román
barajas gabriel.barajas@unican.es Gabriel Barajas
higuerap higuerap@unican.es Pablo Higuera
diazg diazg@unican.es Gabriel Diaz
tomasan tomasan@unican.es Antonio Tomás
camus paula.camus@unican.es Paula Camus
antonio antonio.espejo@unican.es Antonio Espejo
castanedos castanedos@unican.es Sonia Castanedo
abascalaj abascalaj@unican.es Ana Abascal
ripolln ripolln@unican.es Nicolás Ripoll
rafael rafael.tinoco@unican.es Rafael Tinoco
karen0009 karen0009@163.com Karen
hychenj hychenj@mit.edu Julia Hopkins
sphibbs s.phibbs12@imperial.ac.uk Samuel Phibbs
aikenr aikenr@onid.orst.edu Rebecca Aiken
lqh lqh@mail.iap.ac.cn Qinghai Liao
jrmiro jrmiro@meteo.cat Josep Ramon Miró
kyheo kyheo21@kiost.ac Ki Young Heo
garcia garcia.leon.m@gmail.com Manuel García
p_heidary p_heidary@sina.kntu.ac.ir Pourya Heidary
shaowu shaowu.bao@noaa.gov Shaowu Bao
ahumada ahumada@angel.umar.mx Miguel Angel Ahumada
nagarajuc nagarajuc@cdac.in Nagaraju Chilukoti
adomingu adomingu@cicese.mx Alejandro Dominguez
brianstiber brianstiber@yahoo.com Brian Stiber
agronholz agronholz@marum.de Alexandra Gronholz
hatsue hatsue@lamma.ufrj.br Hatsue Takanaca de Decco
sunjiaouc sunjiaouc@sina.com Sun Jia
hslim hslim@kiost.ac Hak Soo LIM
juan juchernandezdi@unal.edu.co Juan Camilo Hernández Díaz
jhwang jhwang@gmail.com JH Wang
kieran kieran.lyons@marine.ie Kieran Lyons
ggerbi ggerbi@skidmore.edu Greg Gerbi
clchen clchen@sio.org.cn Chen Changlin
jhihying jhihyingchen.6914@gmail.com ChihYing (David) Chen
ljmontoya ljmontoya@udem.edu.co Luis Javier Montoya
gabrielv gabriel@prooceano.com.br Gabriel Vieira de Carvalho
fyshi fyshi@UDel.Edu Fengyan Shi
claudia claudia.pasquero@unimib.it Claudia Pasquero
zengrui zengrui.r@gmail.com Zengrui Rong
joonho ocean0510@gmail.com Joon Ho Lee
chenzhen 120332986@qq.com Chen Zhen
kanoksri kanoksri@haii.or.th Kanoksri Sarinnapakorn
marcello marcello.magaldi@sp.ismar.cnr.it Marcello Magaldi
robong robert.ong@postgrad.curtin.edu.au Robert Ong
montavez montavez@um.es Juan Pedro Montávez
mostafa Mostafa.Bakhoday@gfi.uib.no Mostafa Bakhoday Paskyabi
ytseng ytseng@ncar.ucar.edu Yu-heng Tseng
ggarcia ggarcia@coas.oregonstate.edu Gabriel Garcia Medina
ezber ezber@itu.edu.tr Yasemin Ezber
cakan cakan@coas.oregonstate.edu Cigdem Akan
hbzong hbzong@sklec.ecnu.edu.cn Haibo Zong
geyijun geyijun1983@163.com Ge Yijun
ozsoy ozsoy@ims.metu.edu.tr Emin Ozsoy
mhan mhan@rsmas.miami.edu MyeongHee Han
kbrrhari k.b.r.r.hari@gmail.com K.Hari Prasad
tconnolly tconnolly@whoi.edu Tom Connolly
ocehugo ocehugo@gmail.com Hugo Bastos de Oliveira
jean coldstreet@hotmail.com Jean Qin Jiang
hyunsu hyunsu@pusan.ac.kr Hyunsu Kim
mountzhang mountzhang@gmail.com Jinfeng Zhang
renhao zjhk0701@163.com Renhao Wu
dhson dhson.monre@gmail.com Duong Hong Son
barbourp barbourp@coas.oregonstate.edu Phil Barbour
oceanlgdx oceanlgdx@163.com Liu Ke
audalio audalio@lamma.ufrj.br Audalio Rebelo Torres
thanvanvan thanvanvan@gmail.com THAN Van Van
olafur or@belgingur.is Olafur Rognvaldsson
glazejimmy glazejimmy@gmail.com Chunming Chang
mthorup mthorup@g3consulting.com Marc Thorup
saji saji@u-aizu.ac.jp Saji Hameed
rcandella rcandella@ieapm.mar.mil.br Rogério Neder Candella
rhashemi r.hashemi@bangor.ac.uk Reza Hashemi
zhenhua linzhenhuaouc@gmail.com Zhenhua Lin
skastner skastner@skidmore.edu Sam Kastner
rmeynadier Remi.Meynadier@latmos.ipsl.fr Remi Meynadier
srikanthy srikanthy@cdac.in Srikanth Yalavarthi
lfelipem lfelipem@msn.com Luís Felipe Mendonça
nmhuan nmhuan61@gmail.com Minh Huan Nguyen
guoxiang 13345019651@163.com Guoxiang Wu
msd msd@ccpo.odu.edu Mike Dinniman
zhangxf zhangxf@mail.nmdis.gov.cn Xuefeng Zhang
mrmpur mrmohammadpur@yahoo.com Mohammadreza Mohammadpur
aricky aricky84@gmail.com Antonio Ricchi
fanjieyu fanjieyu@163.com Fan Wei
cnewinger c.newinger11@imperial.ac.uk Christina Newinger
geson geson@pusan.ac.kr Goeun
sima sima.hamzelo@gmail.com Sima Hamzelo
dcrino d.crino@fluidsolutions-a.com Daniele Crinò
xwu xwu@email.sc.edu Xiaodong Wu
sjlake sjlake@vims.edu Samuel J Lake
brush brush@vims.edu Mark J Brush
jpotter jpotter1@skidmore.edu Jamie Potter
linxiaxu linxiaxu@ufl.edu Linxia Xu
curbano curbano1979@gmail.com Claudia Patricia Urbano
mjlewis m.j.lewis@bangor.ac.uk Matt Lewis
meira meira@cesup.ufrgs.br Lindolfo Meira
nikhil NIKHIL003@e.ntu.edu.sg Nikhil Garg
dohnkim dohnkim@yonsei.ac.kr Dong-Hoon Kim
fereshteh fereshtehkomijani@gmail.com Fereshteh Komijani
drif13 drif13@gmail.com Zheng Zhang
romaric Romaric.Verney@ifremer.fr Romaric Verney
worachat worachat@haii.or.th Worachat Wannawong
hoangvu hoangvu.nguyen@kaust.edu.sa Hoang Vu Nguyen
msscordoba mss130430@utdallas.edu Miguel S. Solano Cordoba
chenlin jumping_er5@foxmail.com Chen Lin
gaolei 812285653@163.com gaolei cheng
georgiy georgiy.stenchikov@kaust.edu.sa Georgiy L. Stenchikov
motani motani.satoshi@gmail.com Satoshi Motani
wanzi wanzi0411@gmail.com Wenwen Kong
gssaldias gssaldias@gmail.com Gonzalo Saldias
kirang kirang@cdac.in Kiran Prakash Gajbhiye
junwei junwei@pku.edu.cn Jun Wei
haibo haibo1981nju@gmail.com Haibo HU
mwhitney Michael.Whitney@uconn.edu Michael M. Whitney
pcheng pcheng@xmu.edu.cn Peng Cheng
shuai shuai.wang13@imperial.ac.uk Shuai Wang
layeghi layeghi2001@yahoo.com Behzad Layeghi
zhouwei zhouwei@cma.gov.cn Wei Zhou
wenlong jymt_ma@yahoo.com Ma Wenlong
jumpinger jumpinger.chen5@gmail.com Chen Lin
gowardbrown a.j.gowardbrown@bangor.ac.uk Alice Jane Goward Brown
dujt dujt@fio.org.cn Jianting Du
mmoulton melissarmoulton@gmail.com Melissa Moulton
nicleona nicleona@bu.edu Nicoletta Leonardi
frocha weatherfgr@gmail.com Fernanda Rocha
kukulka kukulka@udel.edu Tobias Kukulka
rjenkins robman0618@gmail.com Robert Jenkins
efrain efrain_mateos@tlaloc.imta.mx Efraín Mateos Farfán
ztessler ztessler@ccny.cuny.edu Zachary Tessler
pooran pooran.khedri@gmail.com Pooran Khedri
dmantsis dmantsis@envsci.rutgers.edu Damianos F. Mantsis
bing bing@nuist.edu.cn Yuan Bing
xifeng feng@coastal.ufl.edu Xi Feng
spneil s.p.neill@bangor.ac.uk Simon Philip Neill
kshedstrom kshedstrom@alaska.edu Katherine Hedstrom
xiaodongwu xiaodongwu1026@gmail.com Xiaodong Wu
elger elger@mailbox.sc.edu Nathan Elger
tnmiles tnmiles@marine.rutgers.edu Travis Miles
seroka seroka@marine.rutgers.edu Greg Seroka
jbrodie jbrodie@udel.edu Joe Brodie
dveron dveron@udel.edu Dana Veron
roxana roxana.tiron@ucd.ie Roxana Tiron
bcahill b.cahill@imperial.ac.uk Bronwyn Cahill
thjung thjung@udel.edu Tae-hwa Jung
mengxia mengxia.umes@gmail.com Meng Xia
ackschmidt ackschmidt@gmail.com Andre Schmidt
sumit sumit@coral.iitkgp.ernet.in Sumit Dandapat


jcwarner@usgs.gov, maitane.olabarrieta@essie.ufl.edu, jacopo.chiggiato@ismar.cnr.it,
barmstrong@usgs.gov, jbzambon@ncsu.edu, rhe@ncsu.edu,
Kevin.haas@gtsav.gatech.edu, ddukhovskoy@fsu.edu, jmralves@fc.ul.pt,
tracey@surflegend.co.jp, milicak@rsmas.miami.edu, nganju@usgs.gov, liess@umn.edu,
dputrasa@gmail.com, gpassala@ucsd.edu, lujy1985@ouc.edu.cn,
arango@marine.rutgers.edu, nkumar@geol.sc.edu, ivica.jan@gmail.com,
admin@mariclime.com, xfyang@gatech.edu, lrenault@imedea.uib-csic.es,
agmunoz@cmc.org.ve, zxue@ncsu.edu, yaozhigang.ouc@gmail.com,
luciano.pezzi@cptec.inpe.br, jsnelso2@ncsu.edu, sdalyander@usgs.gov,
c.denamiel@imperial.ac.uk, l.fowler09@imperial.ac.uk, james.farley-
nicholls05@imperial.ac.uk, r.toumi@imperial.ac.uk, Mathieu.Dutour@irb.hr,
kuzmic@irb.hr, rftome@gmail.com, sandro.carniel@ismar.cnr.it, sudiptab@cdac.in,
basantas@cdac.in, rsamelson@coas.oregonstate.edu, bin_liu@ncsu.edu,
yzheng@fsu.edu, khyun@ncsu.edu, andres@dgeo.udec.cl, cjjang@kordi.re.kr,
pvelissariou@fsu.edu, xzeng2@ncsu.edu, tomasz.dabrowski@marine.ie,
a.russo@univpm.it, gvoulgaris@geol.sc.edu, greg.wilson@dal.ca,
raul.gonzalez@plymouth.ac.uk, marcel.cure@numericswarehouse.com,
bzhangys@yahoo.com, mori.nobuhito.8a@kyoto-u.ac.jp, priscilla.a.mooney@nuim.ie,
rcaldeira@ciimar.up.pt, nnatoo@marum.de, minh.nguyen.hus@gmail.com,
lifangjiang@scsio.ac.cn, ishihara@dcc.co.jp, ironling@gmail.com,
gabriel.munchow@ufrgs.br, stephen.d.nicholls@nasa.gov, npisciot33@gmail.com,
dyeatmos@gmail.com, jksim@kesti.co.kr, hetland@tamu.edu, rcxyrcxy@hotmail.com,
arturo.quintanar@gmail.com, drews@ucar.edu, kkobchul@naver.com,
samurai@tropmet.res.in, J.Kala@murdoch.edu.au, hwl1943@gmail.com,
akbarpour@inio.ac.ir, paul.nolan@ucd.ie, apaul@marum.de, sky@nwp2.snu.ac.kr,
bonnland@ucar.edu, gill@ucar.edu, bruyerec@ucar.edu, zdong@UDel.Edu,
kirby@UDel.Edu, lequochuy@imh.ac.vn, hungdamduy@gmail.com,
wzq@lasg.iap.ac.cn, manel.grifoll@upc.edu, torleif.lunde@cih.uib.no,
martin.king@uni.no, dwang@umces.edu, ephraim.paul@ocean.tamu.edu,
zhaoruzhang@gmail.com, wenxiazhang@tamu.edu, kelly.lynne.cole@gmail.com,
zhangxq@neo.tamu.edu, celsoferreira@tamu.edu, rmontuoro@tamu.edu,
christina.patricola@gmail.com, jsh@ocean.tamu.edu, jaison@atmos.ucla.edu,
mingkui.li@gmail.com, xuzhao@neo.tamu.edu, maxiaohui@neo.tamu.edu,
Karthik.Balaguru@pnnl.gov, sarava@tamu.edu, ping@tamu.edu, khiem@imh.ac.vn,
kien.cbg@gmail.com, abhisek.hit@gmail.com, markus.gross@metoffice.gov.uk,
johannesro@met.no, tcraig@ucar.edu, zhaobiaodeyouxiang@163.com,
bbutman@usgs.gov, mahmoodakbarpour@gmail.com, schen77@ntu.edu.tw,
zdefne@usgs.gov, benjamin.wong11@imperial.ac.uk, alexis.perez@insmet.cu,
rsignell@usgs.gov, jmsole@meteosim.com, raluca.radu@imperial.ac.uk,
csherwood@usgs.gov, jorge.navarro@ciemat.es, nilsmk@met.no, isafak@usgs.gov,
alvise.benetazzo@ve.ismar.cnr.it, nehru.machineni@gmail.com, shijian.ocean@qq.com,
francesco.falcieri@ve.ismar.cnr.it, aaretxabaleta@usgs.gov, huadong.du@gmail.com,
kritanai.torsri@gmail.com, jwlong@usgs.gov, ijmoon@jejunu.ac.kr,
Muralidhar.Adakudlu@gfi.uib.no, ctang@yic.ac.cn, jl_hdzf@yahoo.com,
Guoqiang.Liu@dfo-mpo.gc.ca, jhe6@ncsu.edu, a.coluccelli@univpm.it,
avalentini@arpa.emr.it, fjose@fgcu.edu, paulo.calil@furg.br, ramos.oceano@gmail.com,
alexander.haumann@usys.ethz.ch, nardieric@gmail.com, majid.noranian@gmail.com,
Lanli.Guo@dfo-mpo.gc.ca, hhiester@fsu.edu, 15054823857@163.com,
pilar.delbarrio@unican.es, tjdewhurst@gmail.com, diane.foster@unh.edu,
ygh2@wildcats.unh.edu, cenglert.nh@gmail.com, evan.gray.04@gmail.com,
crf43@wildcats.unh.edu, donyafrank@gmail.com, will@ccom.unh.edu,
jchadwick@ccom.unh.edu, jsrogers@stanford.edu, cdong@atmos.ucla.edu,
rhg11c@my.fsu.edu, tk.kim@kiaps.org, Marcelo.HerediaGomez@anteagroup.com,
ibnusofian@bakosurtanal.go.id, charles.james@sa.gov.au, Ryan.Lowe@uwa.edu.au,
icebearw@yahoo.com.cn, Mark.Hadfield@niwa.co.nz, doristea@gmail.com,
d.kobashi@tamu.edu, shenbiao@ouc.edu.cn, thomaski@hawaii.edu,
barcenajf@unican.es, garciajav@unican.es, perezdb@unican.es, cardenasm@unican.es,
cida@unican.es, zengz@unican.es, pnp29@alumnos.unican.es, requejos@unican.es,
gutierrezoq@unican.es, belen.lopez@unican.es, gabriel.barajas@unican.es,
higuerap@unican.es, diazg@unican.es, tomasan@unican.es, paula.camus@unican.es,
antonio.espejo@unican.es, castanedos@unican.es, abascalaj@unican.es,
ripolln@unican.es, rafael.tinoco@unican.es, karen0009@163.com, hychenj@mit.edu,
s.phibbs12@imperial.ac.uk, aikenr@onid.orst.edu, lqh@mail.iap.ac.cn,
jrmiro@meteo.cat, kyheo21@kiost.ac, garcia.leon.m@gmail.com,
p_heidary@sina.kntu.ac.ir, shaowu.bao@noaa.gov, ahumada@angel.umar.mx,
nagarajuc@cdac.in, adomingu@cicese.mx, brianstiber@yahoo.com,
agronholz@marum.de, hatsue@lamma.ufrj.br, sunjiaouc@sina.com, hslim@kiost.ac,
juchernandezdi@unal.edu.co, jhwang@gmail.com, kieran.lyons@marine.ie,
ggerbi@skidmore.edu, clchen@sio.org.cn, jhihyingchen.6914@gmail.com,
ljmontoya@udem.edu.co, gabriel@prooceano.com.br, fyshi@UDel.Edu,
claudia.pasquero@unimib.it, zengrui.r@gmail.com, ocean0510@gmail.com,
120332986@qq.com, kanoksri@haii.or.th, marcello.magaldi@sp.ismar.cnr.it,
robert.ong@postgrad.curtin.edu.au, montavez@um.es, Mostafa.Bakhoday@gfi.uib.no,
ytseng@ncar.ucar.edu, ggarcia@coas.oregonstate.edu, ezber@itu.edu.tr,
cakan@coas.oregonstate.edu, hbzong@sklec.ecnu.edu.cn, geyijun1983@163.com,
ozsoy@ims.metu.edu.tr, mhan@rsmas.miami.edu, k.b.r.r.hari@gmail.com,
tconnolly@whoi.edu, ocehugo@gmail.com, coldstreet@hotmail.com,
hyunsu@pusan.ac.kr, mountzhang@gmail.com, zjhk0701@163.com,
dhson.monre@gmail.com, barbourp@coas.oregonstate.edu, oceanlgdx@163.com,
audalio@lamma.ufrj.br, thanvanvan@gmail.com, or@belgingur.is,
glazejimmy@gmail.com, mthorup@g3consulting.com, saji@u-aizu.ac.jp,
rcandella@ieapm.mar.mil.br, r.hashemi@bangor.ac.uk, linzhenhuaouc@gmail.com,
skastner@skidmore.edu, Remi.Meynadier@latmos.ipsl.fr, srikanthy@cdac.in,
lfelipem@msn.com, nmhuan61@gmail.com, 13345019651@163.com,
msd@ccpo.odu.edu, zhangxf@mail.nmdis.gov.cn, mrmohammadpur@yahoo.com,
aricky84@gmail.com, fanjieyu@163.com, c.newinger11@imperial.ac.uk,
geson@pusan.ac.kr, sima.hamzelo@gmail.com, d.crino@fluidsolutions-a.com,
xwu@email.sc.edu, sjlake@vims.edu, brush@vims.edu, jpotter1@skidmore.edu,
linxiaxu@ufl.edu, curbano1979@gmail.com, m.j.lewis@bangor.ac.uk,
meira@cesup.ufrgs.br, NIKHIL003@e.ntu.edu.sg, dohnkim@yonsei.ac.kr,
fereshtehkomijani@gmail.com, drif13@gmail.com, Romaric.Verney@ifremer.fr,
worachat@haii.or.th, hoangvu.nguyen@kaust.edu.sa, mss130430@utdallas.edu,
jumping_er5@foxmail.com, 812285653@163.com, georgiy.stenchikov@kaust.edu.sa,
motani.satoshi@gmail.com, wanzi0411@gmail.com, gssaldias@gmail.com,
kirang@cdac.in, junwei@pku.edu.cn, haibo1981nju@gmail.com,
Michael.Whitney@uconn.edu, pcheng@xmu.edu.cn, shuai.wang13@imperial.ac.uk,
layeghi2001@yahoo.com, zhouwei@cma.gov.cn, jymt_ma@yahoo.com,
jumpinger.chen5@gmail.com, a.j.gowardbrown@bangor.ac.uk, dujt@fio.org.cn,
melissarmoulton@gmail.com, nicleona@bu.edu, weatherfgr@gmail.com,
kukulka@udel.edu, robman0618@gmail.com, efrain_mateos@tlaloc.imta.mx,
ztessler@ccny.cuny.edu, pooran.khedri@gmail.com, dmantsis@envsci.rutgers.edu,
bing@nuist.edu.cn, feng@coastal.ufl.edu, s.p.neill@bangor.ac.uk,
kshedstrom@alaska.edu, xiaodongwu1026@gmail.com, elger@mailbox.sc.edu,
tnmiles@marine.rutgers.edu, seroka@marine.rutgers.edu, jbrodie@udel.edu,
dveron@udel.edu, roxana.tiron@ucd.ie, b.cahill@imperial.ac.uk, thjung@udel.edu,
mengxia.umes@gmail.com, ackschmidt@gmail.com, sumit@coral.iitkgp.ernet.in

Section 14. List of references and Acknowledgements for COAWST.

Warner, J.C., Armstrong, B., He, R., and Zambon, J.B., (2010). Development of a
Coupled Ocean-Atmosphere-Wave-Sediment Transport (COAWST) modeling system:
Ocean Modeling, v. 35, no. 3, p. 230-244.

Kumar, N., Voulgaris, G., and Warner, J.C. (2011). Implementation and modification of a
three-dimensional radiation stress formulation for surf zone and rip-current applications,
Coastal Engineering, 58, 1097-1117, doi:10.1016/j.coastaleng.2011.06.009.

Olabarrieta, M., J. C. Warner, and N. Kumar (2011), Wave-current interaction in Willapa
Bay, J. Geophys. Res., 116, C12014, doi:10.1029/2011JC007387.

Olabarrieta, M., Warner, J.C., and Armstrong, B. (2012). Ocean-atmosphere dynamics
during Hurricane Ida and Nor'Ida: an atmosphere-ocean-wave coupled modeling system
application. Ocean Modelling, 43-44, pp 112-137.

Kumar, N., Voulgaris, G., Warner, J.C., and M., Olabarrieta (2012). Implementation of a
vortex force formalism in the coupled ocean-atmosphere-wave-sediment transport
(COAWST) modeling system for inner-shelf and surf-zone applications. Ocean Modeling
47, pp 65-95.

Renault, L., J. Chiggiato, J. C. Warner, M. Gomez, G. Vizoso, and J. Tintoré (2012),
Coupled atmosphere-ocean-wave simulations of a storm event over the Gulf of Lion and
Balearic Sea, J. Geophys. Res., 117, C09019, doi:10.1029/2012JC007924.

Benetazzo, A., Carniel, S., Sclavo, M., and Bergamasco, A. Wave-current interaction:
effect on the wave field in a semi-enclosed basin. Ocean Modeling (in press).


Forecast systems:
Hydro and Agro Informatics Institute
http://live1.haii.or.th/wrfroms_image/?page=1&userdesire=g

USGS Woods Hole
http://woodshole.er.usgs.gov/project-pages/cccp/public/COAWST.htm


Acknowledgements
We thank the developers of all the modeling and tool systems for open access to their codes, and the
Integration and Application Network (ian.umces.edu/symbols), University of Maryland
Center for Environmental Science, for the courtesy use of their symbols and diagrams.
