To loop through the contents of a file line by line in Bash, use a `while` loop with the `read` command. Here are the most common and reliable methods:
1. Basic Loop (Line-by-Line)
```bash
while IFS= read -r line; do
  echo "Line: $line"
done < "file.txt"
```
- `IFS=`: prevents leading/trailing whitespace from being trimmed.
- `-r`: disables backslash (`\`) escaping (e.g., `\n` is treated as literal text, not a newline).
- `< "file.txt"`: redirects the file into the loop.
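As a quick check of why `IFS=` and `-r` matter, this self-contained demo (the `/tmp/demo.txt` path is only illustrative) writes two tricky lines and reads them back unchanged:

```bash
#!/usr/bin/env bash
# Demo file with leading whitespace and a literal backslash (path is arbitrary).
printf '  two leading spaces\n' >  /tmp/demo.txt
printf 'a\\b backslash kept\n'  >> /tmp/demo.txt

while IFS= read -r line; do
  printf '[%s]\n' "$line"   # brackets make preserved whitespace visible
done < /tmp/demo.txt
# Prints:
# [  two leading spaces]
# [a\b backslash kept]
```

Dropping `IFS=` would strip the leading spaces; dropping `-r` would interpret the backslash.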
2. Skip Empty Lines (and Handle a Missing Final Newline)
```bash
while IFS= read -r line || [[ -n "$line" ]]; do
  [[ -z "$line" ]] && continue  # Skip empty lines
  echo "Line: $line"
done < "file.txt"
```
- `|| [[ -n "$line" ]]`: ensures the last line is read even if it lacks a trailing newline.
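To see the difference that `|| [[ -n "$line" ]]` makes, this sketch (file path is illustrative) counts lines in a file whose final line has no newline:

```bash
#!/usr/bin/env bash
# A file whose last line has no trailing newline (illustrative path).
printf 'first\nlast-without-newline' > /tmp/nonl.txt

count=0
while IFS= read -r line || [[ -n "$line" ]]; do
  count=$((count+1))
done < /tmp/nonl.txt
echo "$count"   # 2 -- a plain `while read` loop would report 1
```

Without the clause, `read` returns a nonzero status on the final partial line and the loop body never runs for it.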
3. Process Fields (Split by Delimiter)
If lines contain delimited values (e.g., `field1,field2`):
```bash
while IFS=',' read -r field1 field2 rest; do
  echo "Field 1: $field1 | Field 2: $field2 | Rest: $rest"
done < "data.csv"
```
- `IFS=','`: splits lines by commas (adjust the delimiter as needed).
- `rest`: captures any remaining fields if there are more than two.
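The same pattern works with any single-character delimiter; for instance, colon-separated `/etc/passwd`-style records (the sample data and path below are made up):

```bash
#!/usr/bin/env bash
# Sample colon-delimited records (data and path are illustrative).
printf 'root:x:0:0\nnobody:x:65534:65534\n' > /tmp/users.txt

while IFS=':' read -r user pass uid rest; do
  echo "user=$user uid=$uid"
done < /tmp/users.txt
# Prints:
# user=root uid=0
# user=nobody uid=65534
```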
4. Read Entire File into an Array
For small files (process lines later):
```bash
mapfile -t lines < "file.txt"  # Requires Bash ≥ 4.0
for line in "${lines[@]}"; do
  echo "Line: $line"
done
```
- `mapfile -t`: reads lines into the array `lines`, stripping trailing newlines.
- Avoid for large files (loads the entire file into memory).
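On Bash 3 (e.g., the default `/bin/bash` on macOS), `mapfile` is unavailable; a portable alternative builds the array manually (the file path below is illustrative):

```bash
#!/usr/bin/env bash
# Portable alternative to mapfile for Bash < 4 (file path is illustrative).
printf 'one\ntwo\nthree\n' > /tmp/lines.txt

lines=()
while IFS= read -r line; do
  lines+=("$line")
done < /tmp/lines.txt

echo "Read ${#lines[@]} lines"   # Read 3 lines
```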
5. Filter Lines (e.g., Skip Comments)
```bash
while IFS= read -r line; do
  [[ "$line" =~ ^# ]] && continue  # Skip lines starting with #
  [[ -z "$line" ]] && continue     # Skip empty lines
  echo "$line"
done < "config.txt"
```
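A quick self-contained run of that filter (the sample config contents and path are assumptions for the demo):

```bash
#!/usr/bin/env bash
# Sample config with a comment, a blank line, and one real entry (illustrative).
printf '# a comment\n\nkey=value\n' > /tmp/config_demo.txt

while IFS= read -r line; do
  [[ "$line" =~ ^# ]] && continue   # Skip comments
  [[ -z "$line" ]] && continue      # Skip empty lines
  echo "$line"
done < /tmp/config_demo.txt
# Prints only: key=value
```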
Key Notes
- Avoid `for line in $(cat file.txt)`: it splits on whitespace, not lines (unsafe for filenames/text containing spaces).
- Subshell pitfall:

```bash
# Variables modified inside the loop won't persist outside
# (the pipe runs the loop in a subshell)
cat file.txt | while read -r line; do
  ((count++))  # `count` will reset outside the loop
done
```

Use `done < "file.txt"` instead of piping to avoid subshell issues.
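When the input comes from a command rather than a file, process substitution (`< <(cmd)`, a Bash feature) keeps the loop in the current shell, so variables survive; here `printf` stands in for any line-producing command:

```bash
#!/usr/bin/env bash
# Process substitution avoids the pipeline subshell
# (printf stands in for any command that produces lines).
count=0
while IFS= read -r line; do
  count=$((count+1))
done < <(printf 'a\nb\nc\n')

echo "Total: $count"   # Total: 3
```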
Example: Count Lines in a File
```bash
count=0
while IFS= read -r line; do
  ((count++))
done < "file.txt"
echo "Total lines: $count"
```
Handling Large Files
The `while read` loop is memory-efficient for large files since it processes lines one at a time.
Choose the method that best fits your use case!