Disk Usage
"Disk full" errors are never fun. Let's learn to check disk space before it's a crisis.
df - Disk Free
Shows free space on mounted filesystems:
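A minimal invocation looks like this:

```shell
# Human-readable free space for every mounted filesystem
df -h
```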
`-h` makes it human-readable (GB, MB instead of bytes).
Important Columns
| Column | Meaning |
|---|---|
| Filesystem | Disk partition |
| Size | Total capacity |
| Used | Space used |
| Avail | Space available |
| Use% | Percentage used |
| Mounted on | Where it's accessible |
90% Warning
When Use% hits 90%+, you should investigate. At 100%, writes fail: logs stop, databases error out, and services crash.
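You can automate that check. A sketch using GNU df's `--output` option (the 90% threshold is just a common rule of thumb):

```shell
# Warn about any filesystem at or above 90% usage.
# --output=pcent,target is a GNU df feature (standard on Linux).
df --output=pcent,target | tail -n +2 | while read -r pct mount; do
  usage=${pct%\%}                # strip the trailing % sign
  if [ "$usage" -ge 90 ]; then
    echo "WARNING: $mount is at $pct"
  fi
done
```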
Check Specific Mount
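Pass a path and `df` reports only the filesystem that path lives on (paths here are just examples):

```shell
# Report only the filesystem backing a given path
df -h /          # the root filesystem
df -h /var/log   # whichever filesystem /var/log is mounted on
```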
Inodes
Files also use inodes (metadata). You can run out of inodes with many small files:
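`df -i` shows inode usage instead of byte usage:

```shell
# Inode usage per filesystem; IUse% at 100% means "disk full"
# even when df -h still shows free bytes
df -i
```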
du - Disk Usage
Shows how much space files/directories use:
`-s` = summary (total only), `-h` = human-readable.
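For example (the path is illustrative):

```shell
# Total size of one directory tree: summary + human-readable
du -sh /var/log
```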
Directory Breakdown
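To see where the space goes inside a directory, limit `du` to one level (`--max-depth` is a GNU du option; `/var` is an example path):

```shell
# Size of each immediate subdirectory of /var, one level deep
du -h --max-depth=1 /var
```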
Find Large Directories
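One way to do this is to sort the per-directory totals, largest first (`sort -h` understands human-readable sizes; it's a GNU coreutils option):

```shell
# Largest top-level directories under /, biggest first
du -h --max-depth=1 / 2>/dev/null | sort -hr | head
```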
This shows the biggest directories at the top.
Practical Scenarios
Disk Full - Find the Culprit
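A typical investigation, sketched as three steps (mount points are illustrative):

```shell
# 1. Which filesystem is full?
df -h

# 2. From its mount point, list the biggest directories
du -h --max-depth=1 / 2>/dev/null | sort -hr | head

# 3. Repeat inside the biggest directory until you find the culprit
du -h --max-depth=1 /var 2>/dev/null | sort -hr | head
```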
Find Large Files
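Sometimes the culprit is a few huge files rather than a directory. `find` can locate them (the 100 MB threshold is just an example):

```shell
# Files over 100 MB on the root filesystem
# (-xdev stays on one filesystem; errors from unreadable dirs are discarded)
find / -xdev -type f -size +100M 2>/dev/null | head
```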
Check Docker
Docker can eat disk space silently:
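Docker has its own disk-accounting commands:

```shell
# Skip gracefully when Docker isn't installed
if command -v docker >/dev/null 2>&1; then
  docker system df        # per-category usage: images, containers, volumes, build cache
  docker system prune -f  # reclaim: stopped containers, unused networks, dangling images
fi                        # (-f skips the confirmation prompt; omit it to be asked first)
```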
Common Disk Hogs
- `/var/log` - Log files
- `/var/lib/docker` - Docker images/containers
- `/home` - User data
- `/tmp` - Temporary files (should auto-clean)
Modern Alternative: ncdu
ncdu (NCurses Disk Usage) provides an interactive interface for exploring disk usage:
```shell
sudo apt install ncdu   # Debian/Ubuntu
ncdu /
```
Navigate with the arrow keys, delete files with `d`. Much easier than piping du through sort.
ncdu is a Lifesaver
When you need to quickly find what's eating disk space, ncdu is the fastest way. It shows a sorted, navigable view of disk usage. Delete files directly from the interface.
Quick Reference
| Command | Shows |
|---|---|
| `df -h` | Free space on filesystems |
| `df -i` | Inode usage |
| `du -sh path` | Size of directory |
| `du -h --max-depth=1` | Size of subdirectories |
| `du -h \| sort -hr \| head` | Largest directories |
Key Takeaways
- `df -h` shows filesystem free space
- `du -sh` shows directory size
- Sort by size to find space hogs: `du -h | sort -hr | head`
- Watch for 90%+ usage on critical filesystems
- Logs and Docker are common culprits
Next: checking memory usage.