I'm currently working with a Debian 11 server. Because of multiple problems with cURL, wget, etc., I decided to write a small bash script that logs how long "curl google.com" takes, put it into crontab, and let it run every minute.
Code: Select all
#!/bin/bash
C="$(date +"%T")"                  # wall-clock time for the log entry
T="$(date +%s)"                    # start timestamp in seconds
curl -s -o /dev/null google.com    # discard the output; only the duration matters
T="$(($(date +%s)-T))"             # elapsed time in whole seconds
echo "TIME: ${C} || CURLTIME: ${T}s" >> /CRONMASTER/curltime.txt
This is the resulting log:
Code: Select all
TIME: 14:25:01 || CURLTIME: 0s
TIME: 14:26:01 || CURLTIME: 0s
TIME: 14:27:01 || CURLTIME: 1s
TIME: 14:28:01 || CURLTIME: 0s
TIME: 14:29:01 || CURLTIME: 0s
TIME: 14:30:01 || CURLTIME: 0s
TIME: 14:31:01 || CURLTIME: 1s
TIME: 14:32:01 || CURLTIME: 0s
TIME: 14:33:01 || CURLTIME: 0s
TIME: 14:34:01 || CURLTIME: 1s
TIME: 14:35:01 || CURLTIME: 0s
TIME: 14:36:01 || CURLTIME: 5s
TIME: 14:37:01 || CURLTIME: 5s
TIME: 14:38:01 || CURLTIME: 5s
TIME: 14:39:01 || CURLTIME: 5s
TIME: 14:40:01 || CURLTIME: 5s
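One caveat with the script above: `date +%s` only has one-second resolution, so the 0s/1s entries hide any sub-second variation. A sketch of a millisecond-resolution variant (assumes GNU coreutils `date`, which Debian ships; `%3N` truncates nanoseconds to milliseconds):

```shell
#!/bin/bash
C="$(date +"%T")"                  # wall-clock time for the log entry
T="$(date +%s%3N)"                 # start timestamp in milliseconds
curl -s -o /dev/null google.com    # discard the output; only the duration matters
T="$(($(date +%s%3N)-T))"          # elapsed time in milliseconds
echo "TIME: ${C} || CURLTIME: ${T}ms" >> /CRONMASTER/curltime.txt
```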
The problem isn't specific to cURL, because it also happens with other commands, e.g.:
Code: Select all
time curl google.com >> 5.179s
time wget google.com >> 10.136s
time ping -c 3 google.com >> 16.067s
time ping -c 3 172.217.16.206 >> 2.019s
time traceroute google.com >> 1m5.308s
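Since ping to the raw IP is fast while the hostname-based commands are slow, the time might be going into name resolution rather than the transfer itself. One way to check (a diagnostic sketch using curl's built-in `--write-out` timing variables, with google.com as above):

```shell
# Report where curl spends its time, phase by phase
curl -s -o /dev/null \
     -w 'dns:     %{time_namelookup}s\nconnect: %{time_connect}s\ntotal:   %{time_total}s\n' \
     google.com
```

If `time_namelookup` accounts for most of `time_total`, the issue is DNS, not the network path.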
Thanks in advance!