Other Help Topics :: directory listing on http/ftp server
gnu-utils is a big file. Although it's cool that it would solve the problem, ideally the script should use something already present in DSL. I'll try to work with friedgold's suggestion if I can.
Edit: it doesn't seem to work with http, and it requires a username/password even with the -a option. That's not acceptable.
Edit2: Just to clarify what I'm doing: I've written a flua script to update a frugal installation. It contains a list of DSL mirrors, and it needs to obtain a list of the files in the directories given in that mirror list. Some mirrors are http and some are ftp. It works fine as-is if you have gnu-utils installed, but the default wget does not have the -S option needed to grab a list of the files in a directory. The user simply clicks on a mirror and the list of ISO files is downloaded. This has to happen automatically, with no extra effort from the user, for the script to be worth using at all; if the user has to do too much work he may as well just download the file with wget or a browser, mount it, and copy the necessary files over. The main point is to make it as automatic as possible.

For the username/password, the standard thing to do is use the username anonymous and an e-mail address as the password.
For example, if I use Firefox to go to an ftp site it actually logs in with the username anonymous and the password mozilla@example.com. Similarly, wget uses anonymous and @wget.
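A quick way to try the convention explicitly is to embed the credentials in the ftp URL. This is only a sketch: the mirror host and path are placeholders, and it assumes a wget build that understands ftp URLs with embedded credentials:

# anonymous ftp, with the login spelled out in the URL
wget 'ftp://anonymous:guest@ftp.example.org/pub/dsl/'

If you want the password to be an e-mail address, the @ inside it has to be written as %40 in the URL.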
For http servers I'm not too sure what kind of output you want. What does wget -S <URL> do that wget <URL> doesn't? For me -S doesn't seem to make a difference to the downloaded file, only to the information shown on standard output (i.e. it shows the server responses).

That had crossed my mind, but only briefly. I'll try again with anonymous, but it doesn't seem to connect to http servers at all.
Showing the server response is what I need, since the server gives you a list of the files available in that directory. It's in html format, but thanks to Robert's mydslBrowser the script is able to parse the html easily enough. It's not downloading any files at this point, so wget <url> is not usable. The file is downloaded only after the user sees what is available and chooses one. In order to show the user what is available I need a way to grab a list of files, and so we come back to the original problem.

You should be able to do an "ls" from an ftp command script, just like any other ftp command.
Then you redirect the output from stdout into a file and then parse the file.

cbagger01 is right. I made a script along those lines, and it worked.
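A minimal sketch of that kind of ftp command script (the mirror host, directory, and e-mail address here are just placeholders, and it assumes the standard command-line ftp client is installed):

# log in anonymously, list the remote directory, and capture
# the listing in ./filelist
ftp -n ftp.example.org > filelist <<EOF
user anonymous someone@example.com
cd /pub/dsl
ls
quit
EOF

The -n switch stops ftp from trying to auto-login, so the user command in the script supplies the anonymous credentials itself; because stdin is not a terminal the chatty status messages are suppressed, so with the usual ftp client only the listing should end up in filelist.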
You can parse the file names out with:
awk '{print $9}' filelist
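Most ftp servers send back a Unix "ls -l" style listing, so a line in filelist looks something like this (an illustrative entry, not real mirror contents):

-rw-r--r--   1 ftp      ftp      49807360 Jun 20  2005 dsl-1.2.1.iso

The file name is the ninth whitespace-separated field, which is why the awk one-liner prints $9.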
You can't get a directory listing from an http connection as far as I know. If there is an index page, that gets shown if no specific page is used as the target.
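So for the http mirrors the best you can do is fetch that index page and pull the file names out of its links. A rough sketch (the mirror URL is a placeholder, and it assumes an Apache-style index page that puts one link per line):

# grab the index page and keep only the .iso links
wget -q -O - 'http://mirror.example.org/dsl/' | \
  sed -n 's/.*href="\([^"]*\.iso\)".*/\1/p'

That's essentially the same idea as parsing the html with the mydslBrowser routines mentioned above.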