Networking with ping and wget — what they do, when to use them, and the flags you’ll actually need.
1) ping — “Is it reachable? How fast is it?”
What it does: sends ICMP Echo Request packets and waits for Echo Reply. Great for checking reachability, latency (ms), and packet loss.
Note: Some hosts block ICMP, so ping may fail even if the website works.
Quick starts
ping google.com        # continuous pings until Ctrl+C
ping -c 5 google.com   # send 5 pings then stop
ping -4 example.com    # force IPv4
ping -6 example.com    # force IPv6
Useful options (Linux iputils ping)
· -c N → send N packets (e.g., -c 5)
· -i S → interval between pings in seconds (default 1s)
· -s BYTES → payload size (e.g., -s 1400)
· -W S → per-reply timeout (seconds)
· -w S → overall deadline; exit after S seconds
· -t N → set TTL (time-to-live) value
· -q → quiet (summary only)
· -n → numeric output (skip DNS)
Examples:
ping -c 5 -i 0.2 -W 2 8.8.8.8        # 5 pings, 200 ms apart, 2 s timeout each
ping -c 3 -s 1400 -W 1 example.com   # test with a larger payload
Read the output
· Per line: time=xx.x ms is the round-trip latency.
· Summary: packet loss, min/avg/max/mdev (mean deviation ≈ jitter).
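If you only need the average RTT in a script, a minimal sketch that parses the summary (assumes the Linux iputils "rtt min/avg/max/mdev" line; the target IP is just an example):
ping -c 5 -q 8.8.8.8 | awk -F'/' '/^rtt/ {print "avg ms: " $5}'
# -q prints only the summary; with '/' as the field separator, the fifth field is the average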
Exit codes (good for scripts)
· 0 = at least one reply received
· 1 = no replies
· 2 = error (e.g., bad arguments)
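A minimal sketch of branching on these exit codes in a script (host and timeout values are just examples):
ping -c 1 -W 2 example.com > /dev/null 2>&1
status=$?                       # capture the exit code before it is overwritten
if [ "$status" -eq 0 ]; then
    echo "reachable (at least one reply)"
elif [ "$status" -eq 1 ]; then
    echo "no replies (host down or ICMP filtered)"
else
    echo "error (bad hostname or arguments)"
fi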
Common issues
· unknown host → DNS problem; try ping 8.8.8.8 (direct IP).
· 100% loss to a site that loads in a browser → ICMP is likely blocked.
· High or variable time (jitter) → congestion or Wi-Fi issues.
· High wa (I/O wait) in top while pinging? The disk is the bottleneck, not the network.
2) wget — “Download this (and maybe a whole site)”
What it does: non-interactive downloader for HTTP/HTTPS/FTP. Perfect for scripts and large/long downloads.
Quick starts
wget https://example.com/file.zip      # save using remote filename
wget -O notes.pdf https://site/notes   # save as a specific name
wget -c https://site/big.iso           # resume a partial download
Everyday options
· -O FILE → write to this filename
· -c → continue (resume) if the file exists
· --limit-rate=200k → throttle speed
· -q / -nv / -v → quiet / less chatty / verbose
· --show-progress → progress bar (when not quiet)
· --timeout=SECONDS / --tries=N → robustness on flaky networks
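A minimal sketch combining these flags for a large download on a flaky connection (URL and filename are placeholders):
wget -c --tries=5 --timeout=20 --limit-rate=500k -O big.iso https://example.com/big.iso
# resumes a partial file, retries up to 5 times, treats 20 s of silence as a failed attempt, caps bandwidth at 500 KB/s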
Website / directory downloads
· -r → recursive
· -np → no parent (don’t climb up)
· -l N → recursion depth
· -m → mirror mode (same as -r -N -l inf --no-remove-listing)
· -N → timestamping (download only if newer)
· -e robots=off → ignore robots.txt (use responsibly!)
Example: mirror a docs section (shallow):
wget -r -np -l 2 -k -p https://example.com/docs/
# -k: convert links for local viewing; -p: get page requisites (CSS/images)
Auth, headers, and cookies
wget --user=alice --password=secret https://site/private/file
wget --header="Authorization: Bearer <TOKEN>" https://api.example.com/data.json
wget --save-cookies cookies.txt --keep-session-cookies URL
wget --load-cookies cookies.txt URL
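A minimal sketch of a cookie-based session, e.g. logging in via a form and then fetching a protected file (URLs and form field names are hypothetical):
wget --save-cookies cookies.txt --keep-session-cookies \
     --post-data="user=alice&pass=secret" https://example.com/login
wget --load-cookies cookies.txt https://example.com/protected/report.pdf
# --post-data submits the login form; the saved session cookie authorizes the second request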
Handling redirects & content names
wget --content-disposition "https://example.com/download?id=123"
# use the server's suggested filename from the Content-Disposition header
Background & lists
wget -b URL        # run in background (logs to wget-log)
wget -i urls.txt   # download all URLs from a file
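The two combine naturally for unattended batch jobs; a minimal sketch (list file and target directory are examples):
wget -b -c -i urls.txt -P downloads/
tail -f wget-log   # follow the background job's progress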
Proxies (lab/campus networks)
export http_proxy=http://proxy:3128
export https_proxy=http://proxy:3128
wget https://example.com
SSL/TLS notes
· Certificate errors? Prefer fixing the CA bundle or the certificate itself.
· --no-check-certificate disables validation (avoid unless you truly must, and never in production scripts).
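If the server uses an internal/private CA, pointing wget at that CA bundle is the safer fix; a minimal sketch (path and URL are examples):
wget --ca-certificate=/etc/ssl/certs/my-org-ca.pem https://internal.example.com/file
# validates the server certificate against the given CA instead of skipping validation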
3) When to use which?
| Task                             | Use                                        |
| Check basic reachability/latency | ping -c 5 host                             |
| DNS vs network check             | ping hostname vs ping 8.8.8.8 (direct IP)  |
| Download a file reliably         | wget -c URL                                |
| Mirror docs for offline read     | wget -r -np -k -p (or wget -m)             |
| API/scripted downloads           | wget -i / wget --header=... (or curl)      |
Need to test actual HTTP behavior (status codes/headers)? curl -I https://example.com is better for protocol-level checks.
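Putting the two tools together, a minimal sketch that checks reachability before downloading (host and URL are placeholders):
if ping -c 2 -W 2 example.com > /dev/null 2>&1; then
    wget -c https://example.com/file.zip
else
    echo "no ICMP reply; the host may still serve HTTP, so try: curl -I https://example.com"
fi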
4) Mini-labs (25–35 minutes)
Lab A: Reachability & latency
ping -c 5 google.com
ping -c 5 8.8.8.8
ping -c 5 -4 example.com
ping -c 5 -6 example.com
# Compare loss/avg time; note differences between IPv4 and IPv6, if any.
Lab B: Timeouts & packet size
ping -c 5 -W 1 -s 1200 example.com
# Try increasing -s; observe whether loss increases (some paths drop large ICMP).
Lab C: Robust file downloads
wget https://speed.hetzner.de/100MB.bin -O test.bin
# interrupt (Ctrl+C), then:
wget -c https://speed.hetzner.de/100MB.bin -O test.bin   # resumes
Lab D: Batch downloads
printf "%s\n" \
  https://example.org/a.pdf \
  https://example.org/b.pdf > urls.txt
wget -i urls.txt -P ~/Downloads/papers
Lab E: Shallow mirror (offline view)
wget -r -np -l 1 -k -p https://example.com/docs/
# open the saved index.html in a browser and browse locally
5) Troubleshooting quickies
ping
· unknown host → DNS issue; check /etc/resolv.conf or try an IP directly.
· Only IPv6 fails → ISP/router may lack IPv6; use -4.
· Packet loss spikes → Wi-Fi interference; move closer or switch band.
· Consistently high latency → overloaded link or distant server.
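To catch intermittent spikes like these, a minimal sketch that logs an hour of timestamped samples (uses iputils ping's -D timestamp flag; target and filename are examples):
ping -c 720 -i 5 -D 8.8.8.8 > ping-log.txt &
# 720 samples at 5 s intervals ≈ 1 hour; -D prefixes each line with a Unix timestamp
# afterwards, grep the log for high time= values or missing icmp_seq numbers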
wget
· 404 Not Found / 403 Forbidden → wrong URL or access control.
· certificate verify failed → clock wrong or missing CA; fix the CA/clock rather than using --no-check-certificate.
· Stalls midway → add --tries=10 --timeout=15 --continue --limit-rate=500k.
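For example, a stalled ISO download can be restarted with all of those at once (URL is a placeholder):
wget --tries=10 --timeout=15 --continue --limit-rate=500k https://example.com/big.iso
# --continue picks up the partial file; the retry/timeout flags ride out brief network outages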
Exam-ready bullets
· ping tests ICMP reachability and measures latency/jitter/loss. Key flags: -c, -i, -W, -w, -4/-6, -s, -n. Exit codes: 0 = some replies, 1 = none, 2 = error.
· wget downloads via HTTP/HTTPS/FTP; resume with -c, name with -O, rate-limit with --limit-rate, mirror with -r -np -k -p or -m, use headers/auth/cookies when needed.
· ping can fail even when the web works (ICMP blocked). For HTTP behavior, prefer curl -I.