Navigatrix.net - A Voyager's Companion http://navigatrix.net/
Downloading forecast data for lazy folks... http://navigatrix.net/viewtopic.php?f=10&t=553
Author: Markus [ 07 Nov 2013, 04:12 ]
Post subject: Downloading forecast data for lazy folks...
...lazy folks like me, that is. Sailing in the South West Pacific I found the following two sources of weather forecast data very useful:

(1) MetVUW, run by the meteorology department of Victoria University of Wellington in New Zealand. They seem to compare the US, European and UK model outputs and either pick and choose or blend them into their own 7-day forecasts.

(2) ECMWF, the European Centre for Medium-Range Weather Forecasts - the European counterpart to the US GFS model that delivers the data for our grib files. It provides model output (MSLP) for 10 days.

For longer range forecasts I found it useful to check whether the ECMWF and GFS models disagreed significantly. If so, I knew that anything might happen and nobody knows; otherwise I'd have reasonable confidence in the basic longer term outlook (say beyond 72 hrs).

Unfortunately, the output from both these sources is only available as images embedded in web pages. At the typical connection speeds in this region it's pretty painful to have to look at these online, let alone manually download the images for later off-line consumption and digestion (10 images for ECMWF, 14 for MetVUW).

The attached little shell script automates the download of these forecast images via wget - a piece of software in Navigatrix that is good and patient at downloading files from the web across slow connections. For MetVUW data it first looks at the overview page to figure out the time stamp of the latest available data and then uses this time stamp to construct the download URLs for the individual forecast images (in 12 hr intervals rather than the available 6 hr granularity, to limit the total download volume). For ECMWF it simply requests the images from the latest available model run (you can specify an earlier model run as an optional command line argument - see the comments in the script).

To install the script, simply save the attached ZIP file and extract the shell script into ~/bin ('~' being your home directory - in my case /home/markus/). Then press [ctrl]+[alt]+[T] to open a terminal and make the script executable by typing

Code:
chmod +x bin/wxd [enter]

To use the script, open a terminal as above, change to the directory that you want to save the downloaded images in and, at the prompt, type

Code:
wxd [enter]

Assuming that you have an internet connection you will then see the download progress as the script works through the images unattended.

A few comments:
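Roughly, the MetVUW part of the script works along these lines. This is only a sketch of the approach described above - the overview page URL, the timestamp pattern and the image file names here are placeholders, not the real ones used in the attached wxd script:

Code:
# sketch only: scrape the MetVUW overview page for the latest model-run
# timestamp, then fetch the forecast charts in 12 hr steps
wgopts="--tries=10 --timeout=60 --continue"     # example wget options, not wxd's own
overview="http://www.metvuw.com/forecast/"      # placeholder overview page URL
wget $wgopts -O overview.html "$overview"
# pull a 10-digit model-run timestamp (e.g. 2013110600) out of the page;
# the real pattern depends on the page's HTML
run=$(grep -o '[0-9]\{10\}' overview.html | head -n 1)
for hr in {12..168..12}                         # 7 days in 12 hr steps = 14 images
do
    # placeholder image URL / file name pattern
    wget $wgopts -O "metvuw_$(printf %03d $hr).gif" \
        "http://www.metvuw.com/forecast/chart-${run}-$(printf %03d $hr).gif"
done
rm -f overview.html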
Author: harryws [ 02 Mar 2015, 05:36 ]
Post subject: Re: Downloading forecast data for lazy folks...
Absolutely great and so easy to use. Thanks very much. I will play with it over the next few days.
Author: Markus [ 02 Mar 2015, 10:31 ]
Post subject: Re: Downloading forecast data for lazy folks...
Glad if it's helpful. I haven't used it in a while (back on land...) but noticed recently that the URL for the ECMWF model images has changed - the new one is:

Code:
http://www.ecmwf.int/en/forecasts/charts/medium/mean-sea-level-pressure-wind-speed-850-hpa-and-geopotential-500-hpa?step=0&parameter=Wind%20850%20and%20mslp&area=Australia

The "step" value needs to be replaced by the forecast time [0, 24, 48, 72...240]. It will also require some additional twisting to get to the image URL (first get the HTML from the URL above, then extract the link to the image and fetch the image in a 2nd step for each forecast time). If anyone has a chance to fix it, it would be nice to post an update here.

EDIT: deleted the 'relative_archive_date' parameter from the URL; without it the latest model run is fetched automatically.
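Just to illustrate the "step" substitution, the chart-page URLs for all ten forecast times can be generated like this (illustration only - a working download snippet follows in the next post):

Code:
# illustration: print the chart-page URL for each forecast hour
base="http://www.ecmwf.int/en/forecasts/charts/medium/mean-sea-level-pressure-wind-speed-850-hpa-and-geopotential-500-hpa"
for hr in {0..240..24}
do
    echo "${base}?step=${hr}&parameter=Wind%20850%20and%20mslp&area=Australia"
done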
Author: Markus [ 02 Mar 2015, 13:28 ]
Post subject: Re: Downloading forecast data for lazy folks...
OK, the snippet below works for the new ECMWF site. Just replace the corresponding lines in the original script with it:

Code:
# ECMWF model, 10 days (centered around Australia):
for hr in {0..240..24}
do
    outfile="ecmwf_$(printf %03d $hr).gif"
    # get raw html to extract GIF URL
    wget $wgopts -O dummy "http://www.ecmwf.int/en/forecasts/charts/medium/mean-sea-level-pressure-wind-speed-850-hpa-and-geopotential-500-hpa?step=${hr}&parameter=Wind%20850%20and%20mslp&area=Australia"
    [ -s dummy ] || rm dummy
    if [ -e dummy ]
    then
        gifurl=$(grep -o "class=\"www-chart\"[^>]*" dummy | sed 's/.*\(http:\/\/[^\"]*\)".*/\1/')
        wget $gifurl
        [ -s $(basename $gifurl) ] && mv $(basename $gifurl) $outfile
    fi
done
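If you want to test the loop on its own before splicing it into wxd, remember that it relies on $wgopts from the original script; something like this at the top of a test file will do (the options here are just an example, not the ones wxd uses):

Code:
#!/bin/bash
# example-only wget options for a stand-alone test of the loop above;
# the original wxd script sets its own $wgopts
wgopts="--tries=10 --timeout=60 --continue"
# ...followed by the 'for hr in {0..240..24}' loop from above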