I used the command below to download a website completely, but some parts, such as the images on "index.html", do not work:
Code: Select all
# wget -m https://ctf101.org/
Thank you.
Any idea?
Code: Select all
# wget -m https://ctf101.org/
Hi,

hack3rcon wrote:Any idea?
Code: Select all
# wget -m https://ctf101.org/
Code: Select all
$ vrms
No non-free or contrib packages installed on debian! rms would be proud.
Any tools?

andreathome wrote:In the past there were quite good tools that could download a large part of htm- or html-based websites.
Nowadays a lot of websites are database-driven, using CMS front ends.
In my humble opinion it is not possible to simply download such CMS-based websites, certainly not with simple tools.
So there is no word there about CMS websites, which work completely differently, with databases sometimes even located elsewhere, so I think most current websites cannot be downloaded with this tool, only fully classical htm/html-based ones.

HTTrack is an offline browser utility, allowing you to download a World Wide Web site from the Internet to a local directory, building recursively all directories, getting HTML, images, and other files from the server to your computer.
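If you want to try HTTrack from the command line, a minimal invocation looks roughly like this (the output directory name is just an example, adjust as needed):
Code: Select all
$ httrack "https://ctf101.org/" -O ./ctf101-mirror
The -O option sets the local directory the mirror is written to; everything else is left at HTTrack's defaults.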
hack3rcon wrote:Hello,
I used the command below to download a website completely, but some parts, such as the images on "index.html", do not work:
Code: Select all
# wget -m https://ctf101.org/
Any idea?
Thank you.

I just tested with wget (check here: https://www.guyrutenberg.com/2014/05/02 ... sing-wget/) and it worked quite nicely. There may be some elements that are still not downloaded (such as web fonts), but a quick test with Firefox in offline mode shows the page just fine.
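In case it helps, the recipe described at that link amounts to something along these lines (all standard wget options; the URL is the one from the first post):
Code: Select all
$ wget --mirror --convert-links --adjust-extension --page-requisites --no-parent https://ctf101.org/
--page-requisites makes wget also fetch the images, CSS and scripts each page needs, and --convert-links rewrites the references so the copy is browsable offline; plain -m downloads the pages but skips those steps, which is the usual reason images end up broken in a local mirror.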