
Script Structure

Note: Learn how to organize PowerShell scripts with proper structure, parameters, help documentation, and best practices.

Overview

A well-structured script is easier to understand, maintain, and share. This page covers how to organize your PowerShell scripts professionally.

A good script structure:

  • Is easy to read and understand
  • Has clear documentation
  • Handles parameters properly
  • Follows consistent organization
  • Is reusable and shareable

Standard Script Template

Here's the recommended structure for PowerShell scripts:

<#
.SYNOPSIS
Brief description of what this script does

.DESCRIPTION
Detailed description explaining purpose and behavior

.PARAMETER Param1
Description of first parameter

.PARAMETER Param2
Description of second parameter

.EXAMPLE
.\MyScript.ps1 -Param1 "Value"
Brief explanation of this example
#>

#Requires -Version 5.1
#Requires -Modules ActiveDirectory

[CmdletBinding()]
param(
    [Parameter(Mandatory=$true)]
    [string]$Param1,

    [string]$Param2 = "Default"
)

# --- Configuration ---
$ErrorActionPreference = "Stop"
$VerbosePreference = "Continue"

# --- Functions ---
function Get-HelperData {
    # Helper function code
}

# --- Main Script Logic ---
Write-Verbose "Starting script execution..."

try {
    # Your main code here
}
catch {
    Write-Error "Script failed: $_"
    exit 1
}
finally {
    # Cleanup code
}

Write-Verbose "Script execution complete."
exit 0

Order matters:

  1. Comment-based help (must be first or second)
  2. #Requires statements
  3. [CmdletBinding()] (if using advanced features)
  4. param() block
  5. Configuration variables
  6. Function definitions
  7. Main script logic


Script-Level Help

Basic Structure

Script help goes at the very top of the file, before the param() block:

<#
.SYNOPSIS
One-line description of what this script does

.DESCRIPTION
Detailed explanation of the script's purpose, behavior,
and any important information users should know.

.PARAMETER ParameterName
Description of this parameter, including valid values,
defaults, and any important notes.

.EXAMPLE
.\ScriptName.ps1 -ParameterName "Value"
Description of what this example does and when to use it.

.NOTES
Author: Your Name
Version: 1.0.0
Last Modified: 2025-12-16
#>

param(
    [string]$ParameterName
)

# Script code follows...

Complete Help Reference

For comprehensive documentation on all help keywords (.SYNOPSIS, .DESCRIPTION, .PARAMETER, .EXAMPLE, .NOTES, .LINK, .INPUTS, .OUTPUTS, etc.), see Comment-Based Help.

Placement Rules

✅ Correct - Help before param():

<#
.SYNOPSIS
Gets user data
#>

param([string]$Username)

# Code...

❌ Wrong - Help after param():

param([string]$Username)

<#
.SYNOPSIS
Gets user data
#>  # TOO LATE! Won't work

# Code...

Important: Help MUST come before the param() block or it won't be recognized by Get-Help.

Viewing Script Help

# View help for a script
Get-Help .\MyScript.ps1

# View detailed help
Get-Help .\MyScript.ps1 -Full

# View examples
Get-Help .\MyScript.ps1 -Examples

# View specific parameter
Get-Help .\MyScript.ps1 -Parameter SourcePath

Script Parameters

Basic Param Block

param(
    # Simple string parameter
    [string]$Name,

    # Parameter with default value
    [string]$Path = "C:\Temp",

    # Mandatory parameter
    [Parameter(Mandatory=$true)]
    [string]$ComputerName,

    # Switch parameter (true/false flag)
    [switch]$Recurse,

    # Validated parameter
    [ValidateSet("Start", "Stop", "Restart")]
    [string]$Action
)
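
A hypothetical invocation of a script declaring these parameters (the script name is assumed for illustration):

# -ComputerName is mandatory; -Recurse is an on/off flag
.\Invoke-Maintenance.ps1 -ComputerName "Server01" -Action Restart -Recurse

# -Action accepts only Start, Stop, or Restart; anything else fails validation
.\Invoke-Maintenance.ps1 -ComputerName "Server01" -Action Reboot   # Error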

Advanced Parameters with CmdletBinding

Add [CmdletBinding()] to enable common parameters like -Verbose, -Debug, etc.:

[CmdletBinding()]
param(
    # Accept pipeline input
    [Parameter(ValueFromPipeline=$true)]
    [string]$InputObject,

    # Multiple validations
    [Parameter(Mandatory=$true)]
    [ValidateNotNullOrEmpty()]
    [ValidateLength(3, 15)]
    [string]$Username,

    # Parameter sets (mutually exclusive options)
    [Parameter(ParameterSetName='ByName')]
    [string]$Name,

    [Parameter(ParameterSetName='ById')]
    [int]$Id
)
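
The two parameter sets make -Name and -Id mutually exclusive; PowerShell rejects a call that mixes them. A hypothetical invocation (script name assumed):

# Each call binds to exactly one parameter set
.\Find-Item.ps1 -Name "WebServer01"    # Uses the 'ByName' set
.\Find-Item.ps1 -Id 42                 # Uses the 'ById' set
.\Find-Item.ps1 -Name "Web" -Id 42     # Error - parameter set cannot be resolved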

Common Parameter Patterns

File path parameter with validation:

param(
    [Parameter(Mandatory=$true)]
    [ValidateScript({
        if (-not (Test-Path $_)) {
            throw "Path does not exist: $_"
        }
        $true
    })]
    [string]$FilePath
)

Credential parameter:

param(
    [Parameter(Mandatory=$true)]
    [PSCredential]
    [System.Management.Automation.Credential()]
    $Credential
)

# Usage:
# .\script.ps1 -Credential (Get-Credential)
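
Inside the script, pass the credential straight to cmdlets that accept one; for example (computer name hypothetical):

# Use the supplied credential for remote operations
Invoke-Command -ComputerName "Server01" -Credential $Credential -ScriptBlock {
    Get-Service -Name Spooler
}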

Array parameter:

param(
    [Parameter(Mandatory=$true)]
    [string[]]$ComputerNames
)

# Usage:
# .\script.ps1 -ComputerNames "Server01","Server02","Server03"

Date parameter with validation:

param(
    [ValidateScript({
        $_ -le (Get-Date)
    })]
    [datetime]$Date = (Get-Date)
)

Numeric parameter with range:

param(
    [ValidateRange(1, 100)]
    [int]$Percentage = 50
)
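
A value outside the range is rejected before any script code runs (script name hypothetical):

# Fails validation: 150 is outside the 1-100 range
.\Set-Throttle.ps1 -Percentage 150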


Script Organization

Using Functions in Scripts

Pattern 1: Helper Functions

Define helper functions at the top, then use them in main script:

<#
.SYNOPSIS
Processes log files
#>

param(
    [string]$LogPath
)

# --- Helper Functions ---
function Write-Log {
    param([string]$Message)

    $timestamp = Get-Date -Format "yyyy-MM-dd HH:mm:ss"
    $logEntry = "[$timestamp] $Message"
    Add-Content -Path "C:\Logs\script.log" -Value $logEntry
    Write-Verbose $Message
}

function Test-Prerequisites {
    if (-not (Test-Path $LogPath)) {
        throw "Log path not found: $LogPath"
    }
}

# --- Main Script ---
Write-Log "Script started"
Test-Prerequisites

# Main processing logic here...
Write-Log "Processing $LogPath"

Write-Log "Script completed"

Pattern 2: Function Library

Create a script file that's just a collection of related functions:

# DiskUtilities.ps1
<#
.SYNOPSIS
Collection of disk utility functions
#>

function Get-DiskSpace {
    param([string]$DriveLetter)
    # Implementation...
}

function Get-LargestFiles {
    param([string]$Path, [int]$Count = 10)
    # Implementation...
}

function Remove-OldFiles {
    param([string]$Path, [int]$DaysOld)
    # Implementation...
}

# Export-ModuleMember works only when this file is imported as a module
# (rename it to DiskUtilities.psm1 first); it errors if the .ps1 is dot-sourced
Export-ModuleMember -Function Get-DiskSpace, Get-LargestFiles, Remove-OldFiles
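
To consume such a library, either dot-source the .ps1 into your session or rename it to a .psm1 and import it as a module (paths hypothetical):

# Option 1: dot-source the functions (drop the Export-ModuleMember line first)
. C:\Scripts\DiskUtilities.ps1

# Option 2: rename the file to DiskUtilities.psm1 and import it
Import-Module C:\Scripts\DiskUtilities.psm1

Get-DiskSpace -DriveLetter "C"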

Code Regions (Optional)

Use regions to organize large scripts (works in VS Code, ISE):

#region Initialization
$ErrorActionPreference = "Stop"
$LogPath = "C:\Logs\script.log"
$ConfigPath = "C:\Config\settings.json"
#endregion

#region Functions
function Get-Data {
    # Function code
}

function Invoke-DataProcessing {
    # Function code
}
#endregion

#region Main Script
Write-Verbose "Starting main execution..."
$data = Get-Data
Invoke-DataProcessing -InputData $data
#endregion

#region Cleanup
Write-Verbose "Cleaning up..."
Remove-Variable -Name data
#endregion

Benefits of regions:

  • Fold/unfold sections in editor
  • Better navigation in large scripts
  • Logical grouping of related code

Dot Sourcing

Loading Functions from Other Files

Dot sourcing loads and runs another script in your current scope, making its functions available:

# Load all functions from another script
. C:\Scripts\MyFunctions.ps1

# Now you can use functions from that file
Get-MyFunction -Parameter "Value"

The leading dot followed by a space before the path is required!
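
Without the leading dot, the file runs in a child scope and its functions vanish when it finishes:

# Dot sourcing - functions persist in the current scope
. .\MyFunctions.ps1

# Plain invocation - runs in a child scope; the functions are gone afterward
.\MyFunctions.ps1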

Why Use Dot Sourcing?

  • Reuse functions across multiple scripts
  • Keep common utilities in one place
  • Organize large projects into multiple files
  • Avoid code duplication

Dot Sourcing Patterns

Pattern 1: Relative Path

Load from the same directory as your script:

# Get the directory where this script is located
$scriptPath = Split-Path -Parent $MyInvocation.MyCommand.Path

# Or use $PSScriptRoot (PowerShell 3.0+)
$scriptPath = $PSScriptRoot

# Load functions from same folder
. "$scriptPath\CommonFunctions.ps1"
. "$scriptPath\Utilities.ps1"

# Now use the functions
Write-Log "Script started"

Pattern 2: Module-Style Organization

# Project structure:
# C:\Scripts\MyProject\
#   ├── Main.ps1
#   └── lib\
#       ├── DatabaseFunctions.ps1
#       ├── EmailFunctions.ps1
#       └── LoggingFunctions.ps1

# In Main.ps1:
$libPath = Join-Path $PSScriptRoot "lib"

. "$libPath\DatabaseFunctions.ps1"
. "$libPath\EmailFunctions.ps1"
. "$libPath\LoggingFunctions.ps1"

# Now all functions from those files are available
Connect-Database -Server "SQL01"
Send-EmailReport -To "admin@contoso.com"
Write-Log "Connected to database"

Pattern 3: Load All Scripts in a Folder

# Load all .ps1 files from lib folder
$libFiles = Get-ChildItem -Path "$PSScriptRoot\lib" -Filter "*.ps1" -File

foreach ($file in $libFiles) {
    Write-Verbose "Loading: $($file.Name)"
    . $file.FullName
}

Write-Verbose "Loaded $($libFiles.Count) function libraries"

Pattern 4: Conditional Loading

# Only load if not already loaded
if (-not (Get-Command Write-Log -ErrorAction SilentlyContinue)) {
    . "$PSScriptRoot\LoggingFunctions.ps1"
}

# Load different files based on environment
if ($env:COMPUTERNAME -like "PROD*") {
    . "$PSScriptRoot\ProductionConfig.ps1"
}
else {
    . "$PSScriptRoot\DevelopmentConfig.ps1"
}

Dot Sourcing vs. Import-Module

Dot Sourcing (. .\file.ps1):

  • Quick and simple
  • No module structure needed
  • Functions loaded into current scope
  • Good for simple scripts

Import-Module:

  • More formal module system
  • Better for distributing code
  • Version management
  • Can control what's exported
  • Good for larger projects

# Dot sourcing
. .\MyFunctions.ps1

# Module
Import-Module MyModule
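
For comparison, a minimal module sketch (hypothetical MyModule.psm1) showing how a module controls what it exports:

# MyModule.psm1
function Write-Log {
    param([string]$Message)
    Write-Verbose $Message
}

function Get-InternalState {
    # Private helper - intentionally not exported
}

# Only Write-Log is visible after Import-Module; Get-InternalState stays hidden
Export-ModuleMember -Function Write-Log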

Begin/Process/End Blocks

Pipeline-Aware Scripts

When your script accepts pipeline input, use begin, process, and end blocks:

<#
.SYNOPSIS
Processes computer names from pipeline

.PARAMETER ComputerName
Computer name to process
#>

[CmdletBinding()]
param(
    [Parameter(ValueFromPipeline=$true, Mandatory=$true)]
    [string]$ComputerName
)

begin {
    # Runs ONCE before any pipeline input is processed
    Write-Verbose "Initializing..."
    $results = @()
    $count = 0
}

process {
    # Runs ONCE for EACH item from the pipeline
    Write-Verbose "Processing: $ComputerName"

    try {
        $ping = Test-Connection -ComputerName $ComputerName -Count 1 -Quiet
        $results += [PSCustomObject]@{
            ComputerName = $ComputerName
            Online = $ping
            Timestamp = Get-Date
        }
        $count++
    }
    catch {
        Write-Warning "Failed to process $ComputerName : $_"
    }
}

end {
    # Runs ONCE after all pipeline input is processed
    Write-Verbose "Processed $count computers"
    return $results
}
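
An alternative to collecting results in an array is to emit each object directly from the process block; downstream commands then receive results as soon as they are produced, rather than all at once from the end block. A sketch of that variation:

process {
    # Emitting the object sends it down the pipeline immediately
    [PSCustomObject]@{
        ComputerName = $ComputerName
        Online       = Test-Connection -ComputerName $ComputerName -Count 1 -Quiet
        Timestamp    = Get-Date
    }
}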

When to Use Begin/Process/End

Use when:

  • Script accepts pipeline input
  • Need initialization before processing items
  • Need cleanup or summary after processing
  • Working with collections via pipeline

Example usage:

# Without pipeline (runs process block once)
.\Test-Computer.ps1 -ComputerName "Server01"

# With pipeline (runs process block for each item)
"Server01", "Server02", "Server03" | .\Test-Computer.ps1

# From file
Get-Content servers.txt | .\Test-Computer.ps1

Requirements and Dependencies

#Requires Statement

Use #Requires to specify script requirements:

# Require minimum PowerShell version
#Requires -Version 5.1

# Require specific module
#Requires -Modules ActiveDirectory

# Require multiple modules
#Requires -Modules ActiveDirectory, ImportExcel

# Require administrator privileges
#Requires -RunAsAdministrator

# Combine multiple requirements
#Requires -Version 5.1
#Requires -Modules ActiveDirectory, AzureAD
#Requires -RunAsAdministrator

Placement: After help, before param() block

What happens: The script immediately exits with an error if the requirements aren't met.

Manual Prerequisite Checks

For more control, check prerequisites manually:

# Check PowerShell version
if ($PSVersionTable.PSVersion.Major -lt 5) {
    throw "This script requires PowerShell 5.0 or later. Current version: $($PSVersionTable.PSVersion)"
}

# Check if module is available
if (-not (Get-Module -ListAvailable -Name ActiveDirectory)) {
    throw "ActiveDirectory module not found. Please install RSAT tools."
}

# Import module with error handling
try {
    Import-Module ActiveDirectory -ErrorAction Stop
}
catch {
    throw "Failed to load ActiveDirectory module: $_"
}

# Check if running as administrator
$isAdmin = ([Security.Principal.WindowsPrincipal][Security.Principal.WindowsIdentity]::GetCurrent()).IsInRole([Security.Principal.WindowsBuiltInRole]::Administrator)

if (-not $isAdmin) {
    throw "This script must be run as Administrator. Please restart PowerShell as admin."
}

# Check for required file
if (-not (Test-Path "C:\Config\settings.json")) {
    throw "Configuration file not found: C:\Config\settings.json"
}

# Check network connectivity
if (-not (Test-Connection "server01.contoso.com" -Count 1 -Quiet)) {
    throw "Cannot reach required server: server01.contoso.com"
}

Error Handling in Scripts

Script-Level Error Preferences

Set error handling behavior at the start of your script:

<#
.SYNOPSIS
Backup script with error handling
#>

param([string]$SourcePath)

# Stop on any error (recommended for most scripts)
$ErrorActionPreference = "Stop"

# OR continue after errors (for monitoring/reporting scripts)
# $ErrorActionPreference = "Continue"

# Main script with try/catch
try {
    # Operations that might fail
    Copy-Item -Path $SourcePath -Destination "D:\Backup" -Recurse
    Write-Output "Backup completed successfully"
    exit 0  # Exit with success code
}
catch {
    Write-Error "Backup failed: $_"
    Write-Error $_.ScriptStackTrace
    exit 1  # Exit with error code
}

Exit Codes

Use exit codes to indicate success or failure:

# Success
exit 0

# Generic error
exit 1

# Custom error codes
if ($diskSpace -lt 10) {
    Write-Error "Insufficient disk space"
    exit 100  # Custom error code for low disk space
}

if ($serviceNotRunning) {
    Write-Error "Required service not running"
    exit 101  # Custom error code for service issue
}

Checking exit codes:

# Run script and check result
.\MyScript.ps1

if ($LASTEXITCODE -eq 0) {
    Write-Output "Script succeeded"
}
else {
    Write-Error "Script failed with code: $LASTEXITCODE"
}
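
Exit codes also propagate to external callers. A scheduled task or batch file that runs the script with powershell.exe -File receives the script's exit code as the process exit code:

# From cmd.exe or a scheduled task:
#   powershell.exe -NoProfile -File C:\Scripts\MyScript.ps1
#   echo %ERRORLEVEL%   (prints the script's exit code)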

Comprehensive Error Handling

<#
.SYNOPSIS
Production script with comprehensive error handling
#>

[CmdletBinding()]
param(
    [Parameter(Mandatory=$true)]
    [string]$SourcePath
)

# Configuration
$ErrorActionPreference = "Stop"
$logFile = "C:\Logs\backup-$(Get-Date -Format 'yyyyMMdd-HHmmss').log"

# Logging function
function Write-Log {
    param([string]$Message, [string]$Level = "INFO")

    $timestamp = Get-Date -Format "yyyy-MM-dd HH:mm:ss"
    $logEntry = "[$timestamp] [$Level] $Message"

    Add-Content -Path $logFile -Value $logEntry
    Write-Verbose $Message
}

try {
    Write-Log "Script started"

    # Validate prerequisites
    if (-not (Test-Path $SourcePath)) {
        throw "Source path not found: $SourcePath"
    }

    Write-Log "Source validated: $SourcePath"

    # Main operations
    $files = Get-ChildItem -Path $SourcePath -File
    Write-Log "Found $($files.Count) files to process"

    foreach ($file in $files) {
        try {
            # Process each file
            Write-Log "Processing: $($file.Name)"
            # ... processing logic ...
        }
        catch {
            # Handle individual file errors but continue
            Write-Log "Failed to process $($file.Name): $_" -Level "ERROR"
        }
    }

    Write-Log "Script completed successfully"
    exit 0
}
catch {
    # Handle fatal errors
    Write-Log "Script failed: $_" -Level "ERROR"
    Write-Log $_.ScriptStackTrace -Level "ERROR"
    exit 1
}
finally {
    # Cleanup code that always runs
    Write-Log "Cleanup complete"
}

Complete Example Script

Here's a complete, production-ready script demonstrating all best practices:

<#
.SYNOPSIS
Backs up files from source to destination with retention policy

.DESCRIPTION
Copies files modified within the specified number of days from a source
folder to a timestamped backup destination. Automatically removes old
backups based on retention policy.

Creates detailed logs and supports WhatIf for testing.

.PARAMETER SourcePath
Path to folder containing files to backup. Must exist.

.PARAMETER DestinationPath
Path where backup folder will be created. Must exist.

.PARAMETER DaysBack
Number of days to look back for modified files.
Files modified within this period will be backed up.
Default is 7 days.

.PARAMETER RetentionDays
Number of days to retain old backups. Backups older than this
will be automatically deleted. Default is 30 days.

.PARAMETER IncludeSubfolders
Include files from subfolders in backup.

.EXAMPLE
.\Backup-Files.ps1 -SourcePath "C:\Data" -DestinationPath "D:\Backups"
Backs up files from C:\Data modified in last 7 days to D:\Backups

.EXAMPLE
.\Backup-Files.ps1 -SourcePath "C:\Data" -DestinationPath "D:\Backups" -DaysBack 30 -IncludeSubfolders
Backs up all files modified in last 30 days, including subfolders

.EXAMPLE
.\Backup-Files.ps1 -SourcePath "C:\Data" -DestinationPath "D:\Backups" -WhatIf
Shows what would be backed up without actually copying files

.NOTES
Author: Raymond Smith
Version: 1.2.0
Last Modified: 2025-12-16
Requires: PowerShell 5.1 or later

.LINK
Copy-Item

.LINK
Get-ChildItem
#>

#Requires -Version 5.1

[CmdletBinding(SupportsShouldProcess=$true)]
param(
    [Parameter(Mandatory=$true, HelpMessage="Path to source folder")]
    [ValidateScript({
        if (-not (Test-Path $_)) {
            throw "Source path does not exist: $_"
        }
        $true
    })]
    [string]$SourcePath,

    [Parameter(Mandatory=$true, HelpMessage="Path to backup destination")]
    [ValidateScript({
        if (-not (Test-Path $_)) {
            throw "Destination path does not exist: $_"
        }
        $true
    })]
    [string]$DestinationPath,

    [ValidateRange(1, 365)]
    [int]$DaysBack = 7,

    [ValidateRange(1, 365)]
    [int]$RetentionDays = 30,

    [switch]$IncludeSubfolders
)

# --- Configuration ---
$ErrorActionPreference = "Stop"
$timestamp = Get-Date -Format "yyyy-MM-dd_HHmmss"
$backupFolder = Join-Path $DestinationPath "Backup_$timestamp"
$logFile = Join-Path $backupFolder "backup.log"

# --- Functions ---
function Write-Log {
    param(
        [string]$Message,
        [ValidateSet("INFO", "WARNING", "ERROR")]
        [string]$Level = "INFO"
    )

    $timestamp = Get-Date -Format "yyyy-MM-dd HH:mm:ss"
    $logEntry = "[$timestamp] [$Level] $Message"

    # Write to the log file once its folder exists (created in the main script)
    if (Test-Path (Split-Path $logFile)) {
        Add-Content -Path $logFile -Value $logEntry
    }

    $color = switch ($Level) {
        "INFO"    { "White" }
        "WARNING" { "Yellow" }
        "ERROR"   { "Red" }
    }

    Write-Host $logEntry -ForegroundColor $color
}

function Remove-OldBackups {
    param([string]$BackupRoot, [int]$RetentionDays)

    Write-Log "Checking for old backups to remove..."

    $cutoffDate = (Get-Date).AddDays(-$RetentionDays)
    $oldBackups = Get-ChildItem -Path $BackupRoot -Directory |
        Where-Object { $_.Name -match "^Backup_\d{4}-\d{2}-\d{2}_\d{6}$" } |
        Where-Object { $_.CreationTime -lt $cutoffDate }

    foreach ($backup in $oldBackups) {
        if ($PSCmdlet.ShouldProcess($backup.FullName, "Remove old backup")) {
            try {
                Remove-Item -Path $backup.FullName -Recurse -Force
                Write-Log "Removed old backup: $($backup.Name)"
            }
            catch {
                Write-Log "Failed to remove $($backup.Name): $_" -Level "WARNING"
            }
        }
    }
}

# --- Main Script ---
try {
    Write-Log "========================================" -Level "INFO"
    Write-Log "Backup script started" -Level "INFO"
    Write-Log "Source: $SourcePath" -Level "INFO"
    Write-Log "Destination: $backupFolder" -Level "INFO"

    # Create backup folder
    if ($PSCmdlet.ShouldProcess($backupFolder, "Create backup folder")) {
        New-Item -Path $backupFolder -ItemType Directory -Force | Out-Null
        Write-Log "Created backup folder"
    }

    # Get files to backup
    $cutoffDate = (Get-Date).AddDays(-$DaysBack)
    Write-Log "Looking for files modified since: $cutoffDate"

    $getChildItemParams = @{
        Path = $SourcePath
        File = $true
    }

    if ($IncludeSubfolders) {
        $getChildItemParams.Recurse = $true
        Write-Log "Including subfolders in search"
    }

    $files = Get-ChildItem @getChildItemParams |
        Where-Object { $_.LastWriteTime -ge $cutoffDate }

    Write-Log "Found $($files.Count) files to backup"

    # Copy files
    $copiedCount = 0
    $errorCount = 0

    foreach ($file in $files) {
        try {
            if ($PSCmdlet.ShouldProcess($file.FullName, "Copy to backup")) {
                # Preserve folder structure
                $relativePath = $file.FullName.Substring($SourcePath.Length).TrimStart('\')
                $destination = Join-Path $backupFolder $relativePath
                $destFolder = Split-Path $destination

                # Create destination folder if needed
                if (-not (Test-Path $destFolder)) {
                    New-Item -Path $destFolder -ItemType Directory -Force | Out-Null
                }

                # Copy file
                Copy-Item -Path $file.FullName -Destination $destination -Force
                $copiedCount++

                Write-Verbose "Copied: $relativePath"
            }
        }
        catch {
            Write-Log "ERROR copying $($file.Name): $_" -Level "ERROR"
            $errorCount++
        }
    }

    Write-Log "Backup complete: Copied $copiedCount of $($files.Count) files"

    if ($errorCount -gt 0) {
        Write-Log "Encountered $errorCount errors during backup" -Level "WARNING"
    }

    # Clean up old backups
    Remove-OldBackups -BackupRoot $DestinationPath -RetentionDays $RetentionDays

    Write-Log "Script completed successfully"
    Write-Log "Backup location: $backupFolder"
    Write-Log "========================================" -Level "INFO"

    exit 0
}
catch {
    Write-Log "FATAL ERROR: $_" -Level "ERROR"
    Write-Log $_.ScriptStackTrace -Level "ERROR"
    exit 1
}

Tips & Tricks

Use CmdletBinding for Professional Scripts

# Adds -Verbose, -Debug, -ErrorAction automatically
[CmdletBinding()]
param(
    [string]$Path
)

Write-Verbose "Processing: $Path"  # Only shows with -Verbose

# Now your script supports:
# .\MyScript.ps1 -Path "C:\Data" -Verbose
# .\MyScript.ps1 -Path "C:\Data" -Debug

Validate Parameters to Fail Fast

# GOOD - Catches errors before script runs
param(
    [Parameter(Mandatory=$true)]
    [ValidateScript({Test-Path $_})]
    [string]$SourcePath,

    [ValidateRange(1, 100)]
    [int]$Percentage = 50
)

# Script won't even start if validation fails!

Use Relative Paths with $PSScriptRoot

# BAD - Hardcoded absolute path
. C:\Scripts\MyProject\lib\Functions.ps1

# GOOD - Relative to script location
$libPath = Join-Path $PSScriptRoot "lib"
. "$libPath\Functions.ps1"

# Works regardless of where script is installed!
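
Note that $PSScriptRoot is only populated when the code runs from a saved script file; pasted into the console it is empty. A defensive fallback for interactive testing:

# Fall back to the current directory when running interactively
$scriptRoot = if ($PSScriptRoot) { $PSScriptRoot } else { Get-Location }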

Structure Scripts Consistently

# Recommended order (top to bottom):

# 1. Help (must be first)
<#
.SYNOPSIS
Script description
#>

# 2. Requirements
#Requires -Version 5.1

# 3. Parameters
param([string]$Path)

# 4. Configuration
$ErrorActionPreference = "Stop"

# 5. Functions
function Get-Data { }

# 6. Main script logic
try {
    # Your code here
}
catch {
    Write-Error $_
    exit 1
}

Don't Use Aliases in Scripts

# BAD - Aliases break script portability
gci C:\Temp | % { Write-Host $_.Name }

# GOOD - Full cmdlet names
Get-ChildItem -Path C:\Temp | ForEach-Object {
    Write-Host $_.Name
}

# Aliases are fine interactively, but scripts should be explicit

Don't Prompt for Input in Automated Scripts

# BAD - Breaks automation
$path = Read-Host "Enter path"
$proceed = Read-Host "Continue? (Y/N)"

# GOOD - Use parameters with defaults
param(
    [string]$Path = "C:\Default",
    [switch]$Force
)

# Now scriptable: .\script.ps1 -Path "C:\Data" -Force

Don't Ignore Errors Silently

# BAD - Errors are hidden
Get-ChildItem C:\FolderThatDoesntExist -ErrorAction SilentlyContinue
# Script continues, no idea something failed!

# GOOD - Handle errors explicitly
try {
    $files = Get-ChildItem C:\Data -ErrorAction Stop
}
catch {
    Write-Error "Failed to read directory: $_"
    # Log it, handle it, or re-throw it
    exit 1
}

Don't Mix Tabs and Spaces

# Pick one and stick with it (spaces recommended)
# Most teams use 4 spaces for indentation

# Configure your editor:
# VS Code: "editor.insertSpaces": true, "editor.tabSize": 4
# ISE: Tools → Options → Use spaces

Additional Resources