While doing web application testing, you might be required to look through multiple web pages online which can take a long time. Which process below would be a more efficient way of doing this type of validation?
A.
Use get utility to download all pages locally for further inspection
B.
Use wget utility to download all pages locally for further inspection
C.
Use mget utility to download all pages locally for further inspection
D.
Use get * utility to download all pages locally for further inspection
Explanation:
Wget is a utility designed for mirroring websites. Option A, get, is an FTP command for retrieving a single file, not a standalone utility, ruling out A. Option C, mget, is a command used from "within" an FTP session itself, so it cannot mirror a website on its own, ruling out C. Option D, get *, is likewise an interactive FTP command rather than a utility, ruling out D. That leaves B: wget, which is designed for mirroring and downloading files, especially web pages. Used with the recursive option -r (e.g. wget -r www.testking.com), it can mirror a site, all except protected portions of course.
Note: GNU Wget is a free network utility to retrieve files from the World Wide Web using HTTP and FTP, and can be used to make mirrors of archives and home pages, thus enabling work in the background after having logged off.
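For illustration, a minimal command sketch (reusing the hostname from the explanation above; exact behavior may vary by wget version):

# Recursively download a site for local inspection (-r = recursive,
# -np = don't ascend to the parent directory, -k = rewrite links for local viewing)
wget -r -np -k http://www.testking.com/

# The -m (mirror) shortcut enables recursion with time-stamping; -b runs
# the download in the background so it continues after logging off
wget -m -b http://www.testking.com/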