AviationBanter, an aviation & planes forum (rec.aviation newsgroups » Soaring)

Off Topic - Application to save daily web data??
  #1  July 28th 07, 02:12 PM, posted to rec.aviation.soaring
Gary Emerson

Greetings,

While off topic, my application is soaring related.

Does anyone know of an application that would automatically load and
store web data?

What I'd like to have is the daily Unisys data saved so that I could go
back and review any date's forecast.

http://weather.unisys.com/gfs/6panel...es_6panel.html

For example, I go fly on Saturday and it's just awesome. I want to
better understand the conditions that led up to it. So I go back to the
forecast images from the preceding week that were automatically stored,
review them, and am better prepared in the future to anticipate great
soaring weather.

To do this, it would be great to have an application that loads URLs
automatically at a pre-set interval and saves each day's data in a
logical format.

Anyone know of something like this?

Thanks,

Gary
  #2  July 28th 07, 04:24 PM, posted to rec.aviation.soaring
Martin Gregorie

Gary Emerson wrote:
[quoted text snipped]
Anyone know of something like this?

How about wget?

http://en.wikipedia.org/wiki/Wget
http://www.gnu.org/software/wget/

The first link describes wget. The second is its home website and says
where to find downloads for various operating systems.

It is a command line utility, so it's easy to write a script (a BAT file
in WindowSpeak) that automates your task. If your computer is on 24/7
you can use a job scheduler to run the script automatically once a day:
'at' or 'cron' on Linux and OS X, or Task Scheduler on Windows.
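As a rough sketch of what such a script might look like (the image URL, archive directory, and filenames below are made-up placeholders, not the real Unisys address):

```shell
#!/bin/sh
# Daily-fetch sketch. URL is a placeholder -- substitute the real address
# of the Unisys 6-panel image. DEST is wherever you want the archive.
URL="http://weather.unisys.com/PLACEHOLDER_6panel.gif"
DEST="$HOME/wx_archive"

stamp=$(date +%Y-%m-%d)      # name each saved copy after its fetch date
mkdir -p "$DEST"
# -q quiet, -O save under our dated name, -t 1 -T 10 give up quickly on
# a dead link; '|| true' keeps a failed fetch from aborting the cron job.
command -v wget >/dev/null &&
    wget -q -t 1 -T 10 -O "$DEST/$stamp.gif" "$URL" || true
```

A crontab line such as `0 7 * * * /home/gary/fetch_wx.sh` (the path is made up) would then run it every morning at 07:00.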

HTH


--
martin@ | Martin Gregorie
gregorie. | Essex, UK
org |
  #3  July 28th 07, 11:19 PM, posted to rec.aviation.soaring
Gary Emerson

Martin Gregorie wrote:
[quoted text snipped]

Anything more geared toward the less programming-inclined, by chance?
  #4  July 29th 07, 01:17 PM, posted to rec.aviation.soaring
Martin Gregorie

Gary Emerson wrote:
[quoted text snipped]
Anything more geared for the less programming inclined by chance?

Not that I know of.

FTP might be a possibility, but that would probably need permission from
the server owner and is much harder to automate. After that you're into
real programming in C or Java.

If you use wget you only have to figure out how to do the job once. If
you write the command down you can retype it whenever you want, but
making a script is better: you simply save the command line in a file (a
BAT file for Windows) and then run that as a sort of shorthand.

wget is very powerful and tackles a complex task, which is why its
manual is daunting. For instance, you can tell it what to do if a later
run downloads a file with the same name as one you already have.
However, if you're just downloading straightforward data files from a
single directory and the filenames include the date they were made, then
it's pretty straightforward to use. Here's an example:

wget -r --random-wait --no-host-directories -P gliding_copy http://www.gregorie.org/gliding/omarama

That command should be typed as a single line, though your newsreader
will probably wrap it.

Try running it. It copies a story and pictures about a visit to Omarama
from my website, putting them in the gliding_copy/gliding/omarama
directory. That story links to another story about a visit to Boulder,
so that gets copied into gliding_copy/freeflight/october_2001 as well.

The parameters on the command line are:

-r recursively follow links from copied pages.
If you're just grabbing a set of images you
don't use this.

--random-wait is being kind to the web server by using a
small random wait between each file fetched.
You can also use --wait=n where n is the
number of seconds to wait.

--no-host-directories
               Normally wget would put the files in a directory
               named after the host (in this case
               www.gregorie.org); --no-host-directories tells
               it not to do this.

-P gliding_copy
causes the downloaded stuff to be put in a
directory called 'gliding_copy'.

http://www.gregorie.org/gliding/omarama
This is where the stuff to be downloaded is found
on the net.
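Put together for the daily-weather case, here is a sketch using the same options, with each run landing in its own dated directory (the directory name and the one-level depth limit are my assumptions, not part of the original example):

```shell
#!/bin/sh
# Each run gets its own dated directory, e.g. wx_archive/2007-07-29/
day=$(date +%Y-%m-%d)
mkdir -p "wx_archive/$day"
# --wait=2 is kind to the server; -l 1 limits recursion to one level;
# -t 1 -T 10 keep a dead link from hanging the run; '|| true' keeps a
# failed fetch from aborting a scheduled job.
command -v wget >/dev/null &&
    wget -r -l 1 --wait=2 --no-host-directories -t 1 -T 10 \
         -P "wx_archive/$day" http://www.gregorie.org/gliding/omarama || true
```

Run once a day from cron (or a scheduled BAT file on Windows), that accumulates a browsable per-date archive.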

You can contact me directly if you need more advice about wget.

 



