 Post subject: Downloading forecast data for lazy folks...
Site Admin

Joined: 20 Mar 2012, 13:32
Posts: 116
...lazy folks like me that is.

Sailing in the South West Pacific, I found the following two sources of weather forecast data very useful:

(1) MetVUW, run by the meteorology department of Victoria University of Wellington, NZ. They seem to compare US, European and UK model outputs and either pick and choose from them or blend them into their own 7-day forecasts.

(2) ECMWF, the European Centre for Medium-Range Weather Forecasts. Their model is the European counterpart to the US GFS model that delivers the data for our GRIB files, and its outputs (MSLP) are available for 10 days ahead. For longer-range forecasts I found it useful to check whether the ECMWF and GFS models disagreed significantly. If they did, I knew that anything might happen and nobody really knows; otherwise I had reasonable confidence in the basic longer-term outlook (say, beyond 72 hrs).

Unfortunately, the output from both these sources is only available as images embedded in web pages. At the typical connection speeds in this region it's pretty painful to have to look at these online, let alone to download the images manually for later off-line consumption and digestion (10 images for ECMWF, 14 for MetVUW).

The attached little shell script automates the download of these forecast images via wget - a piece of software in Navigatrix that is good and patient at downloading files from the web across slow connections. For the MetVUW data it will first look at the overview page to figure out the time stamp of the latest available data and then use this time stamp to construct the download URLs for the individual forecast images (at 12 hr intervals rather than the available 6 hr granularity, to limit total download volume). A sketch of this approach follows below the attachment.
Attachment:
wxd.zip [1.38 KiB]
Downloaded 675 times
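
For the curious, the MetVUW part works roughly like the sketch below. Note that the overview URL, the time stamp pattern and the image file name here are placeholders for illustration only - the real ones live in the attached script:
Code:
# Sketch: find the latest MetVUW model run, then fetch every 12th forecast hour.
# The retry/timeout options are what make wget patient on slow connections.
wgopts="--tries=10 --timeout=60 --continue"

# 1) get the overview page and extract the latest YYYYMMDDHH run time stamp
#    (placeholder URL and pattern - not necessarily the real ones)
wget $wgopts -O overview.html "http://www.metvuw.com/forecast/"
run=$(grep -o '[0-9]\{10\}' overview.html | head -n 1)

# 2) 7 days in 12 hr steps = 14 images (hours 12, 24, ..., 168)
for hr in $(seq 12 12 168)
do
    wget $wgopts "http://www.metvuw.com/forecast/rain-${run}-${hr}.gif"
done
rm -f overview.html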

For the ECMWF data it simply requests the images from the latest available model run (you can specify an earlier model run as an optional command line argument - see the comments in the script).

To install the script, simply save the attached ZIP file and extract the shell script into ~/bin ('~' being your home directory - in my case /home/markus/). Then press [Ctrl]+[Alt]+[T] to open a terminal and make the script executable by typing
Code:
chmod +x ~/bin/wxd [enter]

To use the script, open a terminal as above, change to the directory in which you want to save the downloaded images and, at the prompt, type
Code:
wxd [enter]

Assuming that you have an internet connection, you will then see the download progress as the script works through the images unattended.

A few comments:
  • The script will create a new directory named YYYY-MM-DD-HH under the directory from which it has been invoked (current year, month, day and hour in UTC according to system time). All downloaded files will be saved into this new directory.
  • You can run the script multiple times in case some of the downloads fail the first time around. It will skip files it has already downloaded and re-try the ones it hasn't got yet (how this works with wget is sketched after this list).
  • The region for which the files are downloaded is the South West Pacific for MetVUW and, for ECMWF, the area from [5N..55S] x [90E..170W] (centered around Australia). Both sources have other regions available (ECMWF is global) and you can probably figure out how to change the URLs in the script by looking at their websites. If you do modify the script for other regions, please post it back here so others can benefit as well.
  • The script also downloads a bunch of additional forecast data from Fiji, Australia's BOM and NOAA. I left these in, but you can easily get rid of them by deleting lines 48-73 of the script.
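
For those who want to see how the date-stamped directory and the safe-to-re-run behaviour can be done, here is a minimal sketch; the forecast URL is just a placeholder, and the attached script does something equivalent rather than exactly this:
Code:
# Sketch: create/enter a directory named after the current UTC date and hour
dir=$(date -u +%Y-%m-%d-%H)
mkdir -p "$dir" && cd "$dir"    # -p: no complaint if it already exists

# --no-clobber makes wget skip files that are already present, so a
# second run only re-tries the downloads that failed the first time
wget --no-clobber --tries=10 --timeout=60 "http://example.com/forecast.gif"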


 Post subject: Re: Downloading forecast data for lazy folks...

Joined: 03 Feb 2015, 01:40
Posts: 2
Absolutely great and so easy to use. Thanks very much. I will play with it over the next few days.


 Post subject: Re: Downloading forecast data for lazy folks...
Site Admin

Joined: 20 Mar 2012, 13:32
Posts: 116
Glad it's helpful. I haven't used it in a while (back on land...) but noticed recently that the URL for the ECMWF model images has changed - the new one is:
Code:
http://www.ecmwf.int/en/forecasts/charts/medium/mean-sea-level-pressure-wind-speed-850-hpa-and-geopotential-500-hpa?step=0&parameter=Wind%20850%20and%20mslp&area=Australia

The "step" value needs to be replaced by the forecast time [0, 24, 48, 72...240]. Getting at the actual image also requires some additional twisting: first get the HTML from the URL above, then extract the link to the image, and fetch the image in a 2nd step for each forecast time. If anyone has a chance to fix this, it would be nice to post an update here.

EDIT: deleted the 'relative_archive_date' parameter from the URL; without it the latest run will be fetched automatically.


 Post subject: Re: Downloading forecast data for lazy folks...
Site Admin

Joined: 20 Mar 2012, 13:32
Posts: 116
OK, the snippet below works for the new ECMWF site. Just replace the corresponding lines in the original script with it:
Code:
# ECMWF model, 10 days (centered around Australia):
# ($wgopts is set near the top of the original script)
for hr in {0..240..24}
do
    outfile="ecmwf_$(printf %03d $hr).gif"
    # 1st request: get the raw HTML of the chart page to extract the GIF URL
    wget $wgopts -O dummy "http://www.ecmwf.int/en/forecasts/charts/medium/mean-sea-level-pressure-wind-speed-850-hpa-and-geopotential-500-hpa?step=${hr}&parameter=Wind%20850%20and%20mslp&area=Australia"
    [ -s dummy ] || rm -f dummy    # discard empty downloads
    if [ -e dummy ]
    then
        # pull the image URL out of the 'www-chart' tag in the HTML
        gifurl=$(grep -o 'class="www-chart"[^>]*' dummy | sed 's/.*\(http:\/\/[^"]*\)".*/\1/')
        # 2nd request: fetch the image itself, then rename it
        wget $wgopts "$gifurl"
        [ -s "$(basename "$gifurl")" ] && mv "$(basename "$gifurl")" "$outfile"
        rm -f dummy    # clean up the intermediate HTML file
    fi
done

