How to Export File Metadata from a Local Server
The most direct path to a metadata export. No API keys, no cloud setup, no installations. Just a terminal command on the machine where your files live.
Every operating system has built-in tools for listing file metadata. On macOS and Linux, the find and stat commands do the job. On Windows, PowerShell's Get-ChildItem is all you need. No software to install, no API keys to generate — just open a terminal on the machine (or SSH in) and run one of the scripts below. This is the fastest path to a metadata spreadsheet for on-premises file servers, NAS drives, mounted volumes, and local workstations.
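Before running any of the methods below, it can help to estimate the size of the job. A minimal Python sketch (the path and the scan_summary helper are examples, not part of the scripts below; change ~/Documents to your target directory):

```python
import os

def scan_summary(root):
    """Count files and total bytes under root, skipping unreadable entries."""
    total_files = 0
    total_bytes = 0
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            total_files += 1
            try:
                total_bytes += os.path.getsize(os.path.join(dirpath, name))
            except OSError:
                pass  # no read permission, broken link, etc.
    return total_files, total_bytes

files, size = scan_summary(os.path.expanduser("~/Documents"))  # change this path
print(f"{files} files, {size / 1e9:.2f} GB")
```

If the count runs into the millions, plan for the export to take a while and consider limiting the scan depth on a first pass.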
Method 1: Bash Script (macOS / Linux)
This is the simplest and fastest method. A short Bash script walks your file tree, reads metadata from the file system using stat, and writes it directly to a CSV file. Works on macOS, Linux, and any Unix-like system including NAS devices with SSH access.
1. Save the script below as a file (e.g., export-metadata.sh), or paste it directly into your terminal.
2. Edit the find line to point at the folder you want to scan. The script defaults to ~/Documents — change this to your target directory (e.g., /mnt/shared, /Volumes/NAS, etc.).
3. Run it with bash export-metadata.sh. The CSV file will be created in your current directory.

```bash
#!/bin/bash
echo 'Name,Size,Modified' > output.csv
find ~/Documents -print0 | while IFS= read -r -d '' file; do
    size=$(stat -f%z "$file")
    mod=$(stat -f%Sm -t '%Y-%m-%dT%H:%M:%S' "$file")
    echo "\"$file\",$size,$mod"
done >> output.csv
echo "Done!"
```
To include more fields (owner, permissions, file type), use this variant:

```bash
#!/bin/bash
echo 'Name,Size,Modified,Owner,Permissions,Type' > output.csv
find /your/path -print0 | while IFS= read -r -d '' file; do
    size=$(stat -f%z "$file")
    mod=$(stat -f%Sm -t '%Y-%m-%dT%H:%M:%S' "$file")
    owner=$(stat -f%Su "$file")
    perms=$(stat -f%Sp "$file")
    type=$([[ -d "$file" ]] && echo "directory" || echo "file")
    echo "\"$file\",$size,$mod,$owner,$perms,$type"
done >> output.csv
```

Tip: Add -maxdepth 3 after the directory path on the find line to limit the scan depth during your first test run. Remove it when you're ready for the full export. On a server with millions of files, a full recursive scan can take several minutes.

Note: The stat flags above (-f%z, -f%Sm) are macOS (BSD) syntax. On Linux, use stat --format='%s' for size and stat --format='%y' for modification time. The Python method (Method 3) avoids this difference entirely.

Method 2: PowerShell (Windows)
PowerShell is the native scripting tool on Windows and provides the richest metadata access for NTFS file systems. The Get-ChildItem cmdlet walks the file tree, and Export-Csv writes it directly to a spreadsheet-ready file.
1. Open PowerShell (press Win + X and select "Windows PowerShell" or "Terminal").
2. Update the -Path parameter in the script below to point at your target directory (e.g., D:\Shared\Marketing, \\server\share, etc.).

```powershell
# Basic export: file name, path, size, dates, extension
Get-ChildItem -Path "C:\Users\You\Documents" -Recurse -File |
    Select-Object Name, FullName, Length, LastWriteTime, CreationTime, Extension |
    Export-Csv -Path "file_metadata.csv" -NoTypeInformation
```
```powershell
# Extended export with owner and attributes
Get-ChildItem -Path "C:\Users\You\Documents" -Recurse -File | ForEach-Object {
    $owner = (Get-Acl $_.FullName).Owner
    [PSCustomObject]@{
        Name       = $_.Name
        FullPath   = $_.FullName
        SizeBytes  = $_.Length
        Extension  = $_.Extension
        Created    = $_.CreationTime
        Modified   = $_.LastWriteTime
        Accessed   = $_.LastAccessTime
        Owner      = $owner
        ReadOnly   = $_.IsReadOnly
        Attributes = $_.Attributes.ToString()
    }
} | Export-Csv -Path "file_metadata.csv" -NoTypeInformation
Write-Host "Export complete. Open file_metadata.csv in Excel."
```

Tip: Get-ChildItem accepts UNC paths (e.g., \\server\share\folder) just like local paths. This makes it easy to scan files on a network file server or NAS without mapping a drive letter first.

Method 3: Python (Cross-Platform)
Python's os module works identically on macOS, Linux, and Windows. This is the best choice when you need a single script that runs on any operating system, or when you want to add custom logic like filtering by file extension or computing checksums.
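The "custom logic" mentioned above is easy to bolt on. For example, restricting the scan to certain file types only takes a small filter; a sketch (the extension set and helper name are examples):

```python
import os

WANTED = {".docx", ".xlsx", ".pdf"}  # example filter; edit to taste

def iter_matching_files(root, extensions):
    """Yield full paths of files under root whose extension is in the given set."""
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            if os.path.splitext(name)[1].lower() in extensions:
                yield os.path.join(dirpath, name)

for path in iter_matching_files(os.path.expanduser("~/Documents"), WANTED):
    print(path)
```

Drop this generator into the script below in place of the inner filename loop to export metadata for only the matching files.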
1. Check that Python 3 is installed: run python3 --version (macOS/Linux) or python --version (Windows).
2. Save the script below as a file (e.g., export_metadata.py). Update the ROOT_DIR variable to your target directory.
3. Run it with python3 export_metadata.py. The CSV will be created in the same directory.

```python
import os
import csv
import hashlib
from datetime import datetime

ROOT_DIR = os.path.expanduser("~/Documents")  # Change this
OUTPUT = "file_metadata.csv"

def get_file_hash(path, block_size=65536):
    """Compute MD5 hash of a file (optional, can be slow for large files)."""
    h = hashlib.md5()
    try:
        with open(path, "rb") as f:
            for block in iter(lambda: f.read(block_size), b""):
                h.update(block)
        return h.hexdigest()
    except (PermissionError, OSError):
        return ""

with open(OUTPUT, "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow([
        "Name", "Path", "Extension", "Size (bytes)",
        "Created", "Modified", "Is Directory"
    ])
    for dirpath, dirnames, filenames in os.walk(ROOT_DIR):
        for name in filenames:
            filepath = os.path.join(dirpath, name)
            try:
                stat = os.stat(filepath)
                writer.writerow([
                    name,
                    filepath,
                    os.path.splitext(name)[1],
                    stat.st_size,
                    # Note: st_ctime is creation time on Windows, but
                    # metadata-change time on macOS/Linux
                    datetime.fromtimestamp(stat.st_ctime).isoformat(),
                    datetime.fromtimestamp(stat.st_mtime).isoformat(),
                    False,
                ])
            except (PermissionError, OSError):
                continue

print(f"Exported to {OUTPUT}")
```

Tip: The script defines an optional get_file_hash function you can use by adding a column and calling it for each file. This is useful for detecting duplicate files, but note that it significantly increases export time on large file systems. Only enable it if you need duplicate detection.

What metadata fields can you export?
| Field | Bash (macOS/Linux) | PowerShell (Windows) | Python |
|---|---|---|---|
| File name | ✓ | ✓ | ✓ |
| Full file path | ✓ | ✓ | ✓ |
| File extension | ✓ | ✓ | ✓ |
| File size | ✓ | ✓ | ✓ |
| Created date | ✓ | ✓ | ✓ |
| Modified date | ✓ | ✓ | ✓ |
| Last accessed date | Available | ✓ | ✓ |
| File owner | ✓ | ✓ | OS-dependent |
| Permissions / attributes | ✓ | ✓ | OS-dependent |
| Is directory | ✓ | ✓ | ✓ |
| MD5 / checksum | md5 command | Get-FileHash | ✓ |
| Symlink target | ✓ | ✓ | ✓ |
| Hard link count | ✓ | ✕ | ✓ |
| MIME type | file command | ✕ | python-magic |
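Checksums are how duplicates fall out of a metadata export: hash every file and group by digest. A sketch using the same chunked-read pattern as the Python script above (the helper names are examples):

```python
import hashlib
import os
from collections import defaultdict

def md5_of(path, block_size=65536):
    """Chunked MD5 so large files are never loaded fully into memory."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(block_size), b""):
            h.update(block)
    return h.hexdigest()

def find_duplicates(root):
    """Map MD5 digest -> file paths; any digest with 2+ paths is a duplicate set."""
    by_hash = defaultdict(list)
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                by_hash[md5_of(path)].append(path)
            except OSError:
                continue  # unreadable file; skip
    return {h: paths for h, paths in by_hash.items() if len(paths) > 1}

for digest, paths in find_duplicates(os.path.expanduser("~/Documents")).items():
    print(digest, paths)
```

For large trees, a common optimization is to group by file size first and only hash the size-collision groups.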
- Permission-denied files: Files you don't have read access to will be skipped. On shared servers, run the script as a user with sufficient privileges (or use sudo on Linux/macOS).
- Symlinks and mount points: Circular symlinks can cause infinite loops if the scan follows them. By default, find does not follow symlinks (you'd have to pass -L), and os.walk only follows them when called with followlinks=True; keep those defaults, and add -not -type l (Bash) if you also want to exclude the links themselves from the output.
- Network-mounted volumes: Scanning network file shares (NFS, SMB/CIFS) over a slow connection can be very slow. If possible, run the script directly on the file server rather than over the network.
- macOS vs. Linux stat syntax: The stat command has different flags on macOS (BSD) and Linux (GNU). The Bash script above uses macOS syntax. See the callout in Method 1 for the Linux equivalent.
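To make the symlink behavior explicit in the Python method, walk with followlinks=False (the default) and record each link's target instead of descending into it. A sketch (the helper name is an example; os.readlink raises OSError on non-links, so we check islink first):

```python
import os

def collect_symlinks(root):
    """Walk root without following symlinks, returning (link, target) pairs."""
    links = []
    for dirpath, dirnames, filenames in os.walk(root, followlinks=False):
        # Symlinked directories still appear in dirnames; they just
        # aren't descended into, which is what prevents infinite loops.
        for name in dirnames + filenames:
            path = os.path.join(dirpath, name)
            if os.path.islink(path):
                links.append((path, os.readlink(path)))
    return links

for link, target in collect_symlinks(os.path.expanduser("~/Documents")):  # change path
    print(f"symlink: {link} -> {target}")
```

Adding the target as an extra CSV column lets you audit which links point outside the scanned tree.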
You have your metadata export.
Now score it.
Upload your CSV or Excel file to MQS and get a structural metadata health score out of 100 with dimension breakdowns and actionable diagnostics.