You work as a security technician at ABC.com. While doing web application testing, you might be required to look through multiple web pages online, which can take a long time. Which of the processes listed below would be a more efficient way of doing this type of validation?
A.
Use mget to download all pages locally for further inspection.
B.
Use wget to download all pages locally for further inspection.
C.
Use get* to download all pages locally for further inspection.
D.
Use get() to download all pages locally for further inspection.
Explanation:
The correct answer is B. Wget is a utility designed for mirroring websites. get* does not work: for the equivalent FTP command to function, there must be a space between get and the wildcard (i.e. get *), ruling out C. get() is simply bogus; it looks like a C function call, not a download command, ruling out D. mget is a command issued from within an FTP session itself, not a standalone tool for fetching web pages, ruling out A. That leaves B: wget is designed for mirroring and downloading files, especially web pages. Used with the -r (recursive) option (e.g. wget -r www.ABC.com), it can mirror an entire site, all except protected portions, of course.

Note: GNU Wget is a free network utility that retrieves files from the World Wide Web over HTTP and FTP. It can be used to make mirrors of archives and home pages, and it can continue working in the background after you have logged off.
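For reference, a typical recursive fetch might look like the sketch below. The target URL is the hypothetical ABC.com from the question; the flags shown are standard GNU Wget options, and the exact combination depends on how much of the site you want to pull down:

    wget -r -l 2 -np -k -p http://www.ABC.com/

    -r    recurse into links found on each page
    -l 2  limit recursion to two levels deep
    -np   never ascend to the parent directory
    -k    convert links in downloaded pages for local viewing
    -p    also fetch page requisites (images, stylesheets) needed to render each page

Once the pages are stored locally, they can be inspected offline with grep or any other tooling, which is the efficiency gain the question is after.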