Wget is a nice little piece of software that everyone should know. With it you can check a site for dead links, download an entire collection of files from an FTP server, or grab a whole photo gallery. Just open your terminal and follow these steps.
GNU Wget is a free software package for retrieving files using HTTP, HTTPS and FTP, the most widely used Internet protocols. It is a non-interactive command-line tool, so it can easily be called from scripts, cron jobs, terminals without X Window support, and so on. GNU Wget has many features that make retrieving large files or mirroring entire web or FTP sites easy; here are some interesting options. All of these commands are run from a Linux terminal.

Basic use: download a package knowing its HTTP (or FTP) URL

wget http://kernel.org/pub/linux/kernel/v2.6/patch-2.6.23.bz2

Using Wget for recursive downloads

wget -r http://my.site.todownload.com/

The -r option tells wget to recursively download everything from the listed URL.

Using Wget for recursive downloads, limiting the number of levels to 2

wget -r -l2 http://my.site.todownload.com/

Here -r does the same as above, and -l tells wget to limit the recursion to that number of levels, in this case 2 levels deep (otherwise the default is 5).

Using Wget for recursive downloads, limiting the type of files you want to download

wget -r -A.pdf -R.htm http://my.site.todownload.com/

This one tells wget to do a recursive get, accept all files with the .pdf extension and reject all files with the .htm extension.

Using Wget for recursive downloads from an FTP server with authentication

wget -r ftp://username:password@my.site/path/to/download

Here you tell wget to download from FTP with a username and password.

Using Wget to check for dead links on your site

wget --spider -r -o log.txt http://yourdomain.com

In this example we tell Wget to act like a web spider (it will not download the pages, just check that they are there) and put the results in the file log.txt, so you can open it and search for a list of broken links.

Using Wget to download a photo gallery

for i in `seq -w 1 100`; do wget http://www.mysite.com/images/DSCF00$i.jpg; done

In this example we run a loop that goes from 1 to 100 and downloads a different URL on each iteration, which is really useful for quickly downloading a gallery that has no index page linking the images.

Finally, I forgot to tell you that wget is also usable on Mac and on Windows (where it requires Cygwin).
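Because wget is non-interactive, the options above combine nicely into a small script you can run from cron. Here is a minimal sketch that pulls only the PDFs from a site, limited to two levels deep, and logs the run; the URL, destination directory and log path are made-up placeholders, so adjust them for your own setup.

#!/bin/sh
# Minimal sketch: fetch only PDFs from an example site, two levels deep.
# URL, DEST and LOG are placeholders -- change them for your own site.

URL="http://my.site.todownload.com/"
DEST="$HOME/downloads/pdfs"
LOG="$HOME/downloads/wget-pdfs.log"

mkdir -p "$DEST"

# -r      recursive download
# -l2     limit recursion to 2 levels
# -A.pdf  accept only files ending in .pdf
# -N      only fetch files newer than the local copy (handy when run repeatedly)
# -P      save everything under $DEST
# -o      write wget's messages to a log file instead of the terminal
wget -r -l2 -A.pdf -N -P "$DEST" -o "$LOG" "$URL"

Dropped into a crontab entry, something like this keeps a local copy of the PDFs up to date without any interaction.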