New to Debian (Or Linux in general)? Ask your questions here!
hack3rcon
Posts: 746 Joined: 2015-02-16 09:54
Has thanked: 48 times
#1
by hack3rcon » 2020-10-23 18:01
Hello,
I want to download the entire contents of a URL. The source code of an operating system is hosted at that URL, and I used the command below:
Code:
$ wget --level=inf --recursive --page-requisites --user-agent=Mozilla --no-parent --convert-links --adjust-extension --no-clobber -e robots=off URL
The total size of all the files is not more than 100 MB, but the download takes several hours. Which parameter is wrong?
Thanks.
peter_irich
Posts: 1406 Joined: 2009-09-10 20:15
Location: Saint-Petersburg, Russian Federation
Been thanked: 11 times
#2
by peter_irich » 2020-10-23 18:28
Try running it with the messages shown,
and then again with "-o logfile" but without "-e robots=off", or start by simply dropping "-e robots=off".
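A sketch of that suggestion: capture wget's messages with "-o logfile" and then grep the log to see how many files completed and whether anything stalled. The two log lines below are a fabricated sample in wget's usual format, just so the grep has input; in real use the log would come from "wget -o wget.log ... URL".

```shell
# Fabricated sample log (stands in for a real "wget -o wget.log" run):
cat > wget.log <<'EOF'
2020-10-23 18:05:01 (12.3 KB/s) - 'index.html' saved [4096/4096]
2020-10-23 18:05:09 (10.1 KB/s) - 'main.c' saved [2048/2048]
EOF
grep -c "saved" wget.log       # how many files completed so far
```

The timestamps on consecutive "saved" lines show where the time actually goes: long gaps between small files point at the server, not at your options.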
RU55EL
Posts: 546 Joined: 2014-04-07 03:42
Location: /home/russel
#3
by RU55EL » 2020-10-23 21:20
hack3rcon wrote: Hello,
I want to download the entire contents of a URL. The source code of an operating system is hosted at that URL, and I used the command below:
Code:
$ wget --level=inf --recursive --page-requisites --user-agent=Mozilla --no-parent --convert-links --adjust-extension --no-clobber -e robots=off URL
The total size of all the files is not more than 100 MB, but the download takes several hours. Which parameter is wrong?
Thanks.
What makes you think a parameter is wrong?
hack3rcon
Posts: 746 Joined: 2015-02-16 09:54
Has thanked: 48 times
#4
by hack3rcon » 2020-10-24 17:56
Can you test this wget command and tell me how long it takes?
Code:
$ wget -c --level=inf --recursive --page-requisites --user-agent=Mozilla --no-parent --convert-links --adjust-extension --no-clobber -e robots=off https://9p.io/sources/plan9/sys/src/
Thank you.
arzgi
Posts: 1197 Joined: 2008-02-21 17:03
Location: Finland
Been thanked: 31 times
#5
by arzgi » 2020-10-25 14:39
hack3rcon wrote: Can you test this wget command and tell me how long it takes?
Can you try a different URL? The problem is that when you download over the net, there are many other possible bottlenecks that you can't fix yourself.
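One way to see whether the bottleneck is outside wget is to compare the effective throughput against your line speed. A quick sketch, using the thread's rough figures as assumptions (about 100 MB in about 3 hours; neither number is a measurement):

```shell
# Effective rate for ~100 MB in ~3 hours; anything this far below a
# normal line speed points at the server or route, not wget's options.
awk 'BEGIN { mb = 100; hours = 3
             printf "%.1f KB/s effective\n", mb * 1024 / (hours * 3600) }'
```

A single-digit KB/s effective rate on a connection capable of megabits says the delay is on the remote end (or in per-file overhead), and no combination of local wget parameters will change that.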
RU55EL
Posts: 546 Joined: 2014-04-07 03:42
Location: /home/russel
#6
by RU55EL » 2020-10-26 15:53
hack3rcon wrote: Can you test this wget command and tell me how long it takes?
Code:
$ wget -c --level=inf --recursive --page-requisites --user-agent=Mozilla --no-parent --convert-links --adjust-extension --no-clobber -e robots=off https://9p.io/sources/plan9/sys/src/
Thank you.
What output do you get with the command?
arid
Posts: 136 Joined: 2014-05-08 18:40
Location: Aridzona
Has thanked: 3 times
Been thanked: 1 time
#7
by arid » 2020-10-26 20:18
I thought it was the movie Plan9 From Outer Space so I downloaded it.
Nothing else here is worth my time.
Oh well, either it is on an extremely slow connection or everyone on the planet is downloading this ancient Bell Labs system.
It's still in progress and at the speed it's going at, maybe 6 hours total.
Look it up on Wikipedia.
There's no drama in my sid... ...
RU55EL
Posts: 546 Joined: 2014-04-07 03:42
Location: /home/russel
#8
by RU55EL » 2020-10-26 22:34
RU55EL wrote: hack3rcon wrote: Hello,
I want to download the entire contents of a URL. The source code of an operating system is hosted at that URL, and I used the command below:
Code:
$ wget --level=inf --recursive --page-requisites --user-agent=Mozilla --no-parent --convert-links --adjust-extension --no-clobber -e robots=off URL
The total size of all the files is not more than 100 MB, but the download takes several hours. Which parameter is wrong?
Thanks.
What makes you think a parameter is wrong?
That is why I asked this.