Bash scripting basics

Bash scripting is one of the fastest ways to automate repetitive tasks on Linux and macOS. If you regularly rename files, parse logs, deploy services, back up directories, or glue command-line tools together, a small Bash script can save a surprising amount of time.

In this updated article, I want to go beyond a basic introduction. We will cover the core building blocks of Bash scripts, some best practices that make scripts safer in production, and a few interesting features that many beginners do not discover early enough.

1. What is a Bash script?

Bash stands for Bourne Again SHell. It is both an interactive shell and a scripting language. A Bash script is simply a text file that contains shell commands executed in order.

Bash is especially useful when you want to:

  • combine multiple terminal commands into one reusable script
  • automate system administration tasks
  • wrap existing tools such as grep, awk, sed, tar, find, docker, or kubectl
  • build small deployment, backup, cleanup, or monitoring utilities
  • run scheduled jobs using cron

2. The smallest possible Bash script

#!/usr/bin/env bash

echo "Hello from Bash"

The first line is called the shebang. It tells the operating system which interpreter should run the file.

To execute the script:

chmod +x hello.sh
./hello.sh

I generally prefer #!/usr/bin/env bash over #!/bin/bash because it is a bit more portable across environments.

3. Variables and command-line arguments

Variables in Bash are simple, but there is one rule beginners often forget: do not put spaces around =.

#!/usr/bin/env bash

name="Thanh"
role="DevOps Engineer"

echo "Name: $name"
echo "Role: $role"

You can also receive input from command-line arguments:

#!/usr/bin/env bash

echo "Script name: $0"
echo "First argument: $1"
echo "Second argument: $2"
echo "Argument count: $#"
echo "All arguments: $@"
echo "Current process id: $$"

Run it like this:

./demo.sh deploy production
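Positional parameters work well for one or two arguments, but for optional flags the built-in getopts is cleaner. Here is a small sketch; the flag names -e and -v and the parse_args function are just illustrative choices:

```shell
#!/usr/bin/env bash

# Sketch: parse an optional -e <env> flag and a -v switch with getopts.
parse_args() {
  local env_name="dev" verbose=0 opt OPTIND=1
  while getopts "e:v" opt; do
    case "$opt" in
      e) env_name="$OPTARG" ;;
      v) verbose=1 ;;
      *) echo "Usage: parse_args [-e env] [-v] <command>" >&2; return 1 ;;
    esac
  done
  shift $((OPTIND - 1))   # drop the parsed flags; the rest are positional
  echo "env=$env_name verbose=$verbose rest=$*"
}

parse_args -e production -v deploy
```

Resetting OPTIND inside the function lets you call it more than once in the same shell.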

4. Reading input from the user

Bash can also ask the user for input interactively:

#!/usr/bin/env bash

read -rp "Enter your username: " username
echo "Welcome, $username"

Useful options for read:

  • -r: treats backslashes in the input literally instead of as escape characters
  • -p: prints a prompt
  • -s: hides input, useful for passwords

Example:

read -rsp "Enter password: " password
echo
echo "Password received"

5. Conditions with if, elif, and else

Conditional logic is one of the most common things you will use in shell automation.

#!/usr/bin/env bash

if [[ $# -eq 0 ]]; then
  echo "Please provide at least one argument"
elif [[ $1 == "start" ]]; then
  echo "Starting the service"
else
  echo "Unknown command"
fi

In modern Bash, prefer [[ ... ]] instead of [ ... ] when possible. It is generally safer and easier to read.
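Two concrete differences make [[ ... ]] safer: it does not word-split unquoted variables, and it supports glob-style pattern matching. A quick illustration:

```shell
#!/usr/bin/env bash

file="my report.txt"

# With [ ... ], an unquoted variable containing a space breaks the test;
# [[ ... ]] does not word-split, so this works even unquoted:
if [[ -n $file ]]; then
  echo "non-empty"
fi

# [[ ... ]] also supports glob-style pattern matching on the right side:
if [[ $file == *.txt ]]; then
  echo "a text file"
fi
```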

6. Case statements are cleaner than long if chains

If you are building a script with subcommands such as start, stop, restart, or status, case is usually the better tool.

#!/usr/bin/env bash

case "$1" in
  start)
    echo "Starting service"
    ;;
  stop)
    echo "Stopping service"
    ;;
  restart)
    echo "Restarting service"
    ;;
  status)
    echo "Checking status"
    ;;
  *)
    echo "Usage: $0 {start|stop|restart|status}"
    ;;
esac

7. Loops: for and while

A for loop is good when iterating over a list:

#!/usr/bin/env bash

for file in *.log; do
  echo "Processing: $file"
done

A while loop is often better when reading a file line by line:

#!/usr/bin/env bash

while IFS= read -r line; do
  echo "Line: $line"
done < input.txt

Here IFS= preserves leading and trailing whitespace and -r keeps backslashes literal, which avoids the parsing issues of ad-hoc alternatives such as for line in $(cat input.txt).
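The same pattern extends to delimited input: set IFS to the field separator on the read itself and name the fields. A sketch using colon-separated, /etc/passwd-style lines fed from a heredoc:

```shell
#!/usr/bin/env bash

# Split each colon-separated line into named fields; "_" discards
# the fields we do not care about.
while IFS=: read -r user _ uid _; do
  echo "user=$user uid=$uid"
done <<'EOF'
root:x:0:0
daemon:x:1:1
EOF
```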

8. Functions make scripts easier to maintain

Even in small scripts, functions are worth using. They help you avoid duplication and make the script easier to read later.

#!/usr/bin/env bash

log_info() {
  echo "[INFO] $1"
}

backup_file() {
  local source_file="$1"
  cp "$source_file" "$source_file.bak"
}

log_info "Starting backup"
backup_file "config.yaml"
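One subtlety worth knowing early: Bash functions only "return" an exit status (0-255). Real data usually comes back through stdout and command substitution. A small sketch (the function names here are just examples):

```shell
#!/usr/bin/env bash

# Data comes back via stdout, captured with $(...):
archive_name() {
  local prefix="$1"
  echo "${prefix}_$(date +%Y%m%d).tar.gz"
}

# Status comes back via the exit code; here the arithmetic
# result of (( ... )) becomes the function's exit status:
is_even() {
  local n="$1"
  (( n % 2 == 0 ))
}

name=$(archive_name "config")
echo "Would create: $name"

if is_even 4; then
  echo "4 is even"
fi
```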

9. A few interesting Bash features many beginners miss

9.1 Strict mode

This is one of the most useful things you can add at the top of production scripts:

#!/usr/bin/env bash
set -euo pipefail

IFS=$'\n\t'

  • set -e: exit immediately when a command fails
  • set -u: treat unset variables as errors
  • set -o pipefail: fail a pipeline if any command inside it fails
  • IFS=$'\n\t': limits word splitting to newlines and tabs, so spaces inside filenames no longer split values apart

This is not magic, but it prevents many silent failures.

9.2 Arrays

Bash supports arrays, which are useful when working with multiple files, services, or environments:

#!/usr/bin/env bash

services=(nginx redis postgres)

for service in "${services[@]}"; do
  echo "Checking $service"
done
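Bash 4 and later also supports associative arrays (string keys) via declare -A. Note that macOS still ships Bash 3.2 by default, so this sketch assumes a newer Bash:

```shell
#!/usr/bin/env bash

# Requires Bash 4+; declare -A creates an associative array.
declare -A ports=(
  [nginx]=80
  [redis]=6379
  [postgres]=5432
)

# "${!ports[@]}" expands to the keys, "${ports[$key]}" to the values.
for service in "${!ports[@]}"; do
  echo "$service listens on ${ports[$service]}"
done
```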

9.3 Here documents

Heredocs are a neat way to generate files or multi-line output:

cat <<EOF > config.env
APP_ENV=production
APP_DEBUG=false
APP_PORT=8080
EOF

This is extremely useful for quick config generation in CI/CD or provisioning scripts.
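One detail that trips people up: with an unquoted delimiter (<<EOF), variables inside the heredoc are expanded; quoting it (<<'EOF') keeps the body literal. That matters when the file you generate should contain its own $VAR references:

```shell
#!/usr/bin/env bash

tmp=$(mktemp)

# Quoting the delimiter (<<'EOF') disables expansion, so the
# generated script keeps a literal $HOME instead of baking in
# this user's home directory at generation time.
cat <<'EOF' > "$tmp"
#!/usr/bin/env bash
echo "Home is: $HOME"
EOF

grep '\$HOME' "$tmp"
rm -f "$tmp"
```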

9.4 Traps

You can use trap to clean up temporary files even if the script exits early:

#!/usr/bin/env bash

TMP_FILE=$(mktemp)
trap 'rm -f "$TMP_FILE"' EXIT

echo "temporary data" > "$TMP_FILE"
cat "$TMP_FILE"

This is one of those small details that makes scripts much more reliable.

9.5 Debugging mode

When debugging Bash, use:

bash -x your_script.sh

Or inside the script:

set -x

This prints commands as they execute, which is very helpful when tracking down quoting or branching issues.
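The trace prefix itself is controlled by the PS4 variable, and enriching it with the source file and line number makes long traces far easier to follow:

```shell
#!/usr/bin/env bash

# PS4 is printed before each traced command; BASH_SOURCE and
# LINENO pinpoint where each command lives.
export PS4='+ ${BASH_SOURCE}:${LINENO}: '

set -x
answer=$((6 * 7))
set +x

echo "answer=$answer"
```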

10. Common Bash mistakes

Bash is powerful, but it is also easy to write fragile scripts if you rush. Here are some very common mistakes:

  • Unquoted variables: write "$file", not $file
  • Using for f in $(ls): this breaks with spaces and special characters
  • Writing var = value: spaces make it invalid in Bash
  • Ignoring exit codes: check failures when your script touches production systems
  • Parsing text carelessly: shell scripts often fail because of unexpected spaces, tabs, or newlines

A great external resource for these issues is Bash Pitfalls.
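The safe counterparts of the patterns above can be sketched in a few lines (run here inside a throwaway temp directory):

```shell
#!/usr/bin/env bash
set -euo pipefail

workdir=$(mktemp -d)
cd "$workdir"

# Quote variables so paths with spaces survive word splitting:
file="my report.txt"
touch -- "$file"

# Iterate with a glob, never $(ls); skip the literal pattern
# when nothing matches:
for f in *.txt; do
  [[ -e "$f" ]] || continue
  echo "Found: $f"
done

# Check exit codes explicitly when a step matters:
if ! rm -- "$file"; then
  echo "Cleanup failed" >&2
  exit 1
fi
```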

11. Example 1: backup a directory

Here is a practical example that creates a timestamped backup archive:

#!/usr/bin/env bash
set -euo pipefail

SOURCE_DIR="${1:-}"
BACKUP_DIR="${2:-./backup}"
TIMESTAMP=$(date +%Y%m%d_%H%M%S)

if [[ -z "$SOURCE_DIR" ]]; then
  echo "Usage: $0 <source_dir> [backup_dir]"
  exit 1
fi

if [[ ! -d "$SOURCE_DIR" ]]; then
  echo "Directory does not exist: $SOURCE_DIR"
  exit 1
fi

mkdir -p "$BACKUP_DIR"
ARCHIVE_NAME="backup_${TIMESTAMP}.tar.gz"

tar -czf "$BACKUP_DIR/$ARCHIVE_NAME" "$SOURCE_DIR"

echo "Backup created at: $BACKUP_DIR/$ARCHIVE_NAME"

Run it like this:

./backup.sh /var/log ./artifacts

12. Example 2: check if a service is running

#!/usr/bin/env bash

service_name="${1:?Usage: $0 <service_name>}"

if pgrep -x "$service_name" >/dev/null; then
  echo "$service_name is running"
else
  echo "$service_name is not running"
fi

Example:

./check-service.sh nginx
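Note that pgrep matches process names, which can misfire when an unrelated process happens to share the name. On systemd hosts, asking the service manager with systemctl is-active is usually more reliable; this sketch falls back to pgrep where systemctl is unavailable:

```shell
#!/usr/bin/env bash

# Prefer the service manager's view when systemd is present,
# otherwise fall back to matching the process name.
check_service() {
  local name="$1"
  if command -v systemctl >/dev/null 2>&1 &&
     systemctl is-active --quiet "$name" 2>/dev/null; then
    echo "$name is running"
  elif pgrep -x "$name" >/dev/null 2>&1; then
    echo "$name is running"
  else
    echo "$name is not running"
  fi
}

check_service nginx
```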

13. Example 3: batch rename files

#!/usr/bin/env bash

for file in *.txt; do
  [[ -e "$file" ]] || continue   # skip the literal "*.txt" when nothing matches
  mv -- "$file" "old_$file"
done

This is a tiny script, but it demonstrates why Bash is so productive for file operations.
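Parameter expansion can do more than prefix names. For example, ${file%.txt} strips the trailing .txt, so swapping an extension needs no external tools like sed. A sketch that sets up its own files in a temp directory:

```shell
#!/usr/bin/env bash

workdir=$(mktemp -d)
cd "$workdir"
touch report.txt notes.txt

# ${file%.txt} removes the shortest trailing ".txt" match,
# turning report.txt into report.bak.
for file in *.txt; do
  mv -- "$file" "${file%.txt}.bak"
done

ls
```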

14. Example 4: a simple deployment helper

#!/usr/bin/env bash
set -euo pipefail

APP_DIR="/opt/myapp"

echo "Pulling latest code..."
git -C "$APP_DIR" pull

echo "Installing dependencies..."
npm --prefix "$APP_DIR" install

echo "Restarting service..."
systemctl restart myapp

echo "Deployment complete"

This is the kind of script many engineers write very early in their DevOps journey.

15. When should you use Bash and when should you switch to Python?

Bash is a great choice when you are:

  • gluing command-line tools together
  • working with files, directories, processes, and environment variables
  • writing short automation scripts for CI/CD, DevOps, or local tooling

However, if your logic becomes complex, if you need data structures beyond basic arrays, or if you need better testing and maintainability, Python is often the better long-term choice.

16. Final thoughts

Bash scripting is not just about putting commands into a .sh file. The real value comes from writing scripts that are safe, readable, and useful under real operational pressure. Start small, automate something annoying, and improve your scripts over time. That is how Bash becomes genuinely powerful.
