Why is this wget command so slow?

hack3rcon
Posts: 746
Joined: 2015-02-16 09:54
Has thanked: 48 times

Why is this wget command so slow?

#1 Post by hack3rcon »

Hello,
I want to download the entire contents of a URL. The source code of an operating system is hosted at that URL, and I used the command below:

Code: Select all

$ wget --level=inf --recursive --page-requisites --user-agent=Mozilla --no-parent --convert-links --adjust-extension --no-clobber -e robots=off URL
The total size of all the files is no more than 100 MB, but the download takes several hours. Which parameter is wrong?

Thanks.

peter_irich
Posts: 1406
Joined: 2009-09-10 20:15
Location: Saint-Petersburg, Russian Federation
Been thanked: 11 times

Re: Why is this wget command so slow?

#2 Post by peter_irich »

Try showing the messages with the option

Code: Select all

-o logfile
and then run it again with "-o logfile" but without "-e robots=off", or leave out "-e robots=off" from the very first run.
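
For example, a first diagnostic run might look like this (a sketch, with URL standing in for the real address, and with "-e robots=off" already dropped):

Code: Select all

$ wget -o logfile --level=inf --recursive --page-requisites --user-agent=Mozilla --no-parent --convert-links --adjust-extension --no-clobber URL
Then compare the timestamps in the logfile between the two runs to see where the time goes.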

User avatar
RU55EL
Posts: 546
Joined: 2014-04-07 03:42
Location: /home/russel

Re: Why is this wget command so slow?

#3 Post by RU55EL »

hack3rcon wrote:Hello,
I want to download the entire contents of a URL. The source code of an operating system is hosted at that URL, and I used the command below:

Code: Select all

$ wget --level=inf --recursive --page-requisites --user-agent=Mozilla --no-parent --convert-links --adjust-extension --no-clobber -e robots=off URL
The total size of all the files is no more than 100 MB, but the download takes several hours. Which parameter is wrong?

Thanks.
What makes you think a parameter is wrong?

hack3rcon
Posts: 746
Joined: 2015-02-16 09:54
Has thanked: 48 times

Re: Why is this wget command so slow?

#4 Post by hack3rcon »

Can you test this wget command and tell me how long it takes?

Code: Select all

$ wget -c --level=inf --recursive --page-requisites --user-agent=Mozilla --no-parent --convert-links --adjust-extension --no-clobber -e robots=off https://9p.io/sources/plan9/sys/src/
Thank you.

arzgi
Posts: 1197
Joined: 2008-02-21 17:03
Location: Finland
Been thanked: 31 times

Re: Why is this wget command so slow?

#5 Post by arzgi »

hack3rcon wrote:Can you test this wget command and tell me how long it takes?
Can you try a different URL? The problem is that when you download from the net there are many other possible bottlenecks, and those you cannot fix yourself.
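
As a rough check, timing a single request against the same server separates the server's speed from anything in the wget options (a sketch that only fetches the index page and discards it):

Code: Select all

$ time wget -O /dev/null https://9p.io/sources/plan9/sys/src/
If even that one page is slow, the bottleneck is the connection or the server, not a parameter.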

User avatar
RU55EL
Posts: 546
Joined: 2014-04-07 03:42
Location: /home/russel

Re: Why is this wget command so slow?

#6 Post by RU55EL »

hack3rcon wrote:Can you test this wget command and tell me how long it takes?

Code: Select all

$ wget -c --level=inf --recursive --page-requisites --user-agent=Mozilla --no-parent --convert-links --adjust-extension --no-clobber -e robots=off https://9p.io/sources/plan9/sys/src/
Thank you.
What output do you get with the command?
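
For example, the full run could be captured to a file and posted here (a sketch; wget writes its progress to stderr, so both streams are piped through tee):

Code: Select all

$ wget -c --level=inf --recursive --page-requisites --user-agent=Mozilla --no-parent --convert-links --adjust-extension --no-clobber -e robots=off https://9p.io/sources/plan9/sys/src/ 2>&1 | tee wget.log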

arid
Posts: 136
Joined: 2014-05-08 18:40
Location: Aridzona
Has thanked: 3 times
Been thanked: 1 time

Re: Why is this wget command so slow?

#7 Post by arid »

I thought it was the movie Plan 9 from Outer Space, so I downloaded it. :mrgreen:

Nothing else here is worth my time.

Oh well, either it is on an extremely slow connection or everyone on the planet is downloading this ancient Bell Labs system.

It's still in progress, and at the speed it's going, maybe 6 hours total. :shock:

Look it up on Wikipedia.
There's no drama in my sid......

User avatar
RU55EL
Posts: 546
Joined: 2014-04-07 03:42
Location: /home/russel

Re: Why is this wget command so slow?

#8 Post by RU55EL »

RU55EL wrote:
hack3rcon wrote:Hello,
I want to download the entire contents of a URL. The source code of an operating system is hosted at that URL, and I used the command below:

Code: Select all

$ wget --level=inf --recursive --page-requisites --user-agent=Mozilla --no-parent --convert-links --adjust-extension --no-clobber -e robots=off URL
The total size of all the files is no more than 100 MB, but the download takes several hours. Which parameter is wrong?

Thanks.
What makes you think a parameter is wrong?

That is why I asked.
