A friend of mine is looking to archive a (vile) website that is about to be taken down. She needs it for advocacy on behalf of the people the site targets. What program can download as much of the site as possible?
@hhardy01
What parameters do I need for it? I don't need just one page, but the entire directory tree (or whatever it's called).
@eladhen
wget -r
more info:
https://www.gnu.org/software/wget/manual/wget.html#Recursive-Retrieval-Options
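If she needs the archive to be browsable offline, a fuller invocation along these lines is commonly used (a sketch only; example.com stands in for the real URL, and every flag shown is documented in the manual linked above):

wget --mirror --convert-links --adjust-extension --page-requisites --no-parent --wait=1 --random-wait https://example.com/

--mirror turns on recursion with infinite depth plus timestamping, --convert-links rewrites links so the saved copy works locally, --adjust-extension adds .html extensions where needed, --page-requisites fetches the images/CSS/JS each page depends on, --no-parent keeps wget from climbing above the starting directory, and --wait / --random-wait space out requests so the crawl is less likely to get blocked.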