Introduction to Pipes

This is where Linux gets its power: connecting commands.

The Pipe Operator: |

The pipe | sends stdout of one command to stdin of another:

Terminal
$ls | wc -l
15
  • ls outputs filenames
  • | pipes that output
  • wc -l counts lines

Result: number of files.
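
A detail that makes this work: when ls writes to a pipe instead of a terminal, it prints one name per line (true of GNU and BSD ls), which is exactly the format wc -l needs. You can see it by piping through cat:

Terminal
$ls | cat
(one filename per line instead of columns)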

Building Pipelines

Terminal
$cat access.log | grep 'ERROR' | wc -l
47

Read this left to right:

  1. cat access.log - read the file
  2. grep 'ERROR' - filter for error lines
  3. wc -l - count them

47 errors in the log.
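
For this specific job there is a shorter equivalent, since grep can read the file and count matching lines itself:

Terminal
$grep -c 'ERROR' access.log
47

The pipeline form is still worth knowing - it works with any pair of commands, not just grep and wc.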

Why Pipes Are Powerful

Without pipes, you'd need temp files:

bash
# Without pipes (ugly)
cat access.log > temp1.txt
grep 'ERROR' temp1.txt > temp2.txt
wc -l temp2.txt
rm temp1.txt temp2.txt

# With pipes (beautiful)
cat access.log | grep 'ERROR' | wc -l

Pipes are cleaner and faster - data flows through memory, not disk.
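
The commands in a pipeline also run at the same time, each stage reading data as the previous one produces it. A quick way to see the streaming: yes prints 'y' forever, but the pipeline stops as soon as head has its three lines:

Terminal
$yes | head -3
y
y
y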

Unix Philosophy

"Do one thing well" - Unix tools are designed to be small and focused. Pipes combine them into powerful workflows. This is the Unix philosophy in action.

Common Pipe Patterns

Filter and Count

Terminal
$grep 'pattern' file | wc -l
(count matches)

Sort and Unique

Terminal
$cut -d: -f1 /etc/passwd | sort | uniq
(sorted unique usernames)
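
Two handy variations on this pattern: sort -u collapses sort | uniq into one step, and uniq -c counts how many times each value appears (both examples assume the standard colon-separated /etc/passwd format):

Terminal
$cut -d: -f1 /etc/passwd | sort -u
(same result as sort | uniq)
$cut -d: -f7 /etc/passwd | sort | uniq -c
(how many users have each login shell)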

Find and Process

Terminal
$find . -name '*.txt' | xargs grep 'TODO'
(TODOs in all txt files)
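
One caveat: filenames containing spaces or quotes can break this pattern, because xargs splits its input on whitespace. GNU and BSD find/xargs support a null-delimited variant that handles them safely:

Terminal
$find . -name '*.txt' -print0 | xargs -0 grep 'TODO'
(TODOs in all txt files, safe for awkward filenames)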

Filter and Format

Terminal
$ps aux | grep python | awk '{print $2, $11}'
(PID and command of python processes)
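
Note that grep python usually matches the grep process itself, since 'python' appears in its own command line. The bracket trick avoids that, or use pgrep where it is available:

Terminal
$ps aux | grep '[p]ython' | awk '{print $2, $11}'
(same output, minus the grep process)
$pgrep -a python
(PID and full command line)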

Multiple Pipes

Chain as many as you need:

Terminal
$cat /var/log/syslog | grep 'error' | cut -d' ' -f1-3 | sort | uniq -c | sort -rn | head -5
     15 Jan 14 10:23
     12 Jan 14 09:45
      8 Jan 14 11:02
      5 Jan 14 08:30
      3 Jan 14 12:15

This finds the top 5 most common error timestamps.
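
Long pipelines read better split across lines; a backslash at the end of a line tells the shell the command continues. Here is the same pipeline with one stage per line (grep reads the file directly, so the leading cat is not needed):

bash
# Top 5 most common error timestamps
grep 'error' /var/log/syslog \
  | cut -d' ' -f1-3 \
  | sort \
  | uniq -c \
  | sort -rn \
  | head -5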

Stderr and Pipes

Important: pipes only transfer stdout. Stderr still goes to the terminal.

Terminal
$ls /nonexistent | wc -l
ls: cannot access '/nonexistent': No such file or directory
0

The error goes straight to the terminal, bypassing the pipe, while wc counts 0 lines because its stdin is empty.

To pipe stderr too:

Terminal
$ls /nonexistent 2>&1 | grep 'cannot'
ls: cannot access '/nonexistent': No such file or directory
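
In bash 4 and later, |& is shorthand for 2>&1 |:

Terminal
$ls /nonexistent |& grep 'cannot'
ls: cannot access '/nonexistent': No such file or directory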

Head and Tail in Pipes

Terminal
$command | head -20
(first 20 lines)
$command | tail -20
(last 20 lines)
$command | head -100 | tail -10
(lines 91-100)
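
If you prefer, sed can pull out a line range directly - equivalent to the head | tail trick above:

Terminal
$command | sed -n '91,100p'
(lines 91-100)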

Less for Long Output

Terminal
$command | less
(paginated view)

Perfect when you need to explore output interactively.
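
Many tools turn color off when writing to a pipe. With GNU grep and less, you can force color on and have less pass the escape codes through:

Terminal
$grep --color=always 'ERROR' access.log | less -R
(paginated view with highlighted matches)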

Advanced: Process Substitution

Sometimes you need to pass command output where a filename is expected:

bash
# Compare output of two commands
diff <(ls dir1) <(ls dir2)

# Read command output line by line (-r and IFS= preserve each line exactly)
while IFS= read -r line; do
  echo "$line"
done < <(cat file.txt)

<(command) makes the command's output available at a temporary path (on Linux, something like /dev/fd/63). It behaves like a pipe, but works where a filename is required.

When to Use Process Substitution

Use it when a command requires a filename, not stdin. For example, diff expects two files - process substitution lets you compare command outputs directly.
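
Another example: comm reports lines common to two sorted inputs, so process substitution lets you sort on the fly (a small sketch; dir1 and dir2 stand in for any two directories):

bash
# Filenames that appear in both directories
comm -12 <(ls dir1 | sort) <(ls dir2 | sort)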

Knowledge Check

What does `cat file.txt | grep 'error' | wc -l` do?

Key Takeaways

  • | connects stdout of one command to stdin of another
  • Build complex workflows from simple commands
  • Read pipelines left to right
  • Pipes only transfer stdout (use 2>&1 for stderr)
  • This is the Unix philosophy: small tools, combined creatively

Next: building complex pipelines with multiple commands.