
How to Export File Metadata from Google Drive

Google Drive has no built-in 'export file list' button. Here are three proven methods to get your file metadata into a spreadsheet.

Built-in Export: No
Best Method: Apps Script
Best Output: Google Sheet
Time to First Export: 15-30 min
The short answer

Google Drive does not have a button to export your file metadata as a spreadsheet. You cannot select a folder and download a CSV of all files with their names, sizes, owners, and dates. The three methods below are the standard approaches, ranging from no-code to developer-level.

Method 1: Google Apps Script to Google Sheet

Difficulty: Easy · Best for: Content teams, marketing ops, anyone with a Google account · Output: Google Sheet · Time: ~15 minutes

This is the simplest approach. You paste a short script into Google Apps Script, run it, and your file metadata appears directly in a Google Sheet. No installations, no API keys, no command line. Once in a Sheet, you can download it as CSV, XLSX, or keep it in Sheets.

1. Open a new Google Sheet. Go to Extensions → Apps Script. This opens the Apps Script editor in a new tab.
2. Delete any code in the editor and paste the script below. If you want to scan a specific folder instead of your entire Drive, replace DriveApp.getFiles() with DriveApp.getFolderById('YOUR_FOLDER_ID').getFiles().
3. Click the Run button (play icon). Google will ask you to authorize the script to access your Drive. Review the permissions and click Allow.
4. Go back to your Google Sheet. Your file metadata will be populated row by row. For large Drives (10,000+ files), the script may take a few minutes and might need pagination (see the tip below).
5. Download the Sheet as CSV or XLSX: File → Download → Comma-separated values (.csv) or Microsoft Excel (.xlsx).
```javascript
function listDriveFiles() {
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getActiveSheet();
  sheet.clear();
  sheet.appendRow([
    "File Name", "File ID", "URL", "Size (bytes)", "Type",
    "Created", "Last Updated", "Owner", "Description", "Sharing"
  ]);

  var files = DriveApp.getFiles();
  // To scan a specific folder instead:
  // var files = DriveApp.getFolderById("YOUR_FOLDER_ID").getFiles();

  while (files.hasNext()) {
    var file = files.next();
    sheet.appendRow([
      file.getName(),
      file.getId(),
      file.getUrl(),
      file.getSize(),
      file.getMimeType(),
      file.getDateCreated(),
      file.getLastUpdated(),
      file.getOwner() ? file.getOwner().getEmail() : "",
      file.getDescription() || "",
      file.getSharingAccess() + " / " + file.getSharingPermission()
    ]);
  }
}
```
Handling large Drives
Apps Script has a 6-minute execution time limit. If your Drive has more than ~10,000 files, the script may time out. In that case, modify the script to save a continuation token using PropertiesService and use a time-based trigger to resume. Or consider Method 2 for large-scale exports.
Method 2: Python + Google Drive API v3

Difficulty: Technical · Best for: Developers, IT admins, consultants · Output: CSV or Excel · Time: ~30-60 minutes

The Drive API gives you the most control over which metadata fields to extract and how to format the output. This method handles pagination natively and can export tens of thousands of files without the time limits of Apps Script.

1. Go to the Google Cloud Console and create a project (or use an existing one). Enable the Google Drive API under APIs & Services.
2. Create credentials: either an OAuth 2.0 Client ID (for a personal Drive) or a Service Account (for Workspace-wide access with domain-wide delegation). Download the credentials JSON file.
3. Install the Google client libraries: pip install google-api-python-client google-auth-oauthlib
4. Run the script below. It paginates through all files and writes the metadata to a CSV file. Adjust the fields parameter to include or exclude specific metadata.
```python
from googleapiclient.discovery import build
from google.oauth2 import service_account
import csv

# Authenticate (adjust path to your credentials file)
creds = service_account.Credentials.from_service_account_file(
    "credentials.json",
    scopes=["https://www.googleapis.com/auth/drive.readonly"]
)
service = build("drive", "v3", credentials=creds)

# Paginate through all files
all_files = []
page_token = None
while True:
    results = service.files().list(
        pageSize=1000,
        q="trashed = false",  # skip items sitting in the Trash
        fields="nextPageToken, files(id, name, mimeType, createdTime, "
               "modifiedTime, owners, size, description, shared, "
               "parents, webViewLink)",
        pageToken=page_token,
        supportsAllDrives=True,
        includeItemsFromAllDrives=True
    ).execute()
    all_files.extend(results.get("files", []))
    page_token = results.get("nextPageToken")
    if not page_token:
        break

# Write to CSV (UTF-8 so non-ASCII file names survive on every platform)
with open("google_drive_metadata.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow([
        "ID", "Name", "MIME Type", "Created", "Modified",
        "Owner", "Size", "Description", "Shared", "Link"
    ])
    for file in all_files:
        writer.writerow([
            file.get("id"),
            file.get("name"),
            file.get("mimeType"),
            file.get("createdTime"),
            file.get("modifiedTime"),
            file["owners"][0]["emailAddress"] if file.get("owners") else "",
            file.get("size", "N/A"),
            file.get("description", ""),
            file.get("shared"),
            file.get("webViewLink", "")
        ])

print(f"Exported {len(all_files)} files to google_drive_metadata.csv")
```
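The script above authenticates with a service account. If you created an OAuth 2.0 Client ID in step 2 instead (the usual choice for a personal Drive), only the credentials setup changes. A minimal sketch, assuming google-auth-oauthlib is installed and your downloaded credentials file is named client_secret.json (a placeholder name, not prescribed by Google):

```python
# OAuth flow for a personal Drive (alternative to the service account above).
# Assumes: pip install google-api-python-client google-auth-oauthlib
# "client_secret.json" stands in for the OAuth client file from step 2.
SCOPES = ["https://www.googleapis.com/auth/drive.readonly"]

def get_drive_service(client_secret_path="client_secret.json"):
    # Imports are local so this module can be read/tested without the
    # Google libraries installed.
    from google_auth_oauthlib.flow import InstalledAppFlow
    from googleapiclient.discovery import build

    flow = InstalledAppFlow.from_client_secrets_file(client_secret_path, SCOPES)
    creds = flow.run_local_server(port=0)  # opens a browser window to sign in
    return build("drive", "v3", credentials=creds)
```

run_local_server opens a browser for the consent screen; the service object it returns drops into the pagination loop above unchanged.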
Available fields
The Drive API v3 exposes 50+ metadata fields per file. The script above includes the most commonly needed ones. You can add fields like permissions (who has access), starred, md5Checksum, version, and labelInfo (Enterprise labels). See the Drive API Files resource documentation for the full list.
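As a sketch of how extra fields slot in: the fields mask below adds starred, md5Checksum, and permissions to the request, and a small helper (the name is my own, not part of the API) flattens the permissions list into one CSV-friendly cell:

```python
# Extended fields mask for files().list() — requests starred, md5Checksum,
# and the per-file permissions list in addition to the basics.
FIELDS = ("nextPageToken, files(id, name, mimeType, modifiedTime, "
          "owners, size, starred, md5Checksum, permissions)")

def flatten_permissions(file_meta):
    """Collapse a Drive file's permissions list into one cell,
    e.g. 'alice@example.com (writer); anyone (reader)'."""
    parts = []
    for p in file_meta.get("permissions", []):
        who = p.get("emailAddress") or p.get("type", "unknown")
        parts.append(f"{who} ({p.get('role', '?')})")
    return "; ".join(parts)

# Example with a hand-made record (shape matches the API response):
sample = {"name": "roadmap.docx",
          "permissions": [
              {"type": "user", "emailAddress": "alice@example.com", "role": "writer"},
              {"type": "anyone", "role": "reader"}]}
print(flatten_permissions(sample))  # alice@example.com (writer); anyone (reader)
```

Note that the API only returns permissions on files where you can see them, and files.list does not return permissions for Shared Drive items.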
Method 3: Rclone CLI

Difficulty: Moderate · Best for: IT admins, ops teams comfortable with the command line · Output: JSON → CSV · Time: ~20-30 minutes

Rclone is a free, open-source command-line tool that supports 40+ cloud storage providers. It can list all files in your Google Drive as structured JSON, which you then convert to CSV. No coding required beyond a one-line command.

1. Install Rclone from rclone.org/install. It is available for macOS, Windows, and Linux.
2. Configure your Google Drive remote: run rclone config and follow the prompts to authorize Rclone to access your Drive. This creates a named remote (e.g., "gdrive").
3. Run the export commands below. The first lists all files recursively and outputs JSON; the second converts the JSON to CSV.
```bash
# List all files as JSON
rclone lsjson gdrive: --recursive > drive_files.json

# Convert JSON to CSV (requires Python or jq)
python3 -c "
import json, csv, sys
with open('drive_files.json') as f:
    data = json.load(f)
w = csv.DictWriter(sys.stdout,
    fieldnames=['Path','Name','Size','MimeType','ModTime','IsDir'],
    extrasaction='ignore')  # rclone may emit extra keys (e.g. ID on Drive)
w.writeheader()
for item in data:
    w.writerow(item)
" > google_drive_metadata.csv
```
Rclone field limitations
Rclone exports core file properties (name, path, size, modification time, MIME type) but does not include Drive-specific metadata like owners, sharing permissions, or descriptions. If you need those fields, use Method 1 or Method 2.

What metadata fields can you export?

Field                      | Apps Script | Drive API | Rclone
File name                  | ✓           | ✓         | ✓
File path / parent folder  | –           | ✓         | ✓
File size                  | ✓           | ✓         | ✓
MIME type                  | ✓           | ✓         | ✓
Created date               | ✓           | ✓         | –
Modified date              | ✓           | ✓         | ✓
Owner email                | ✓           | ✓         | –
Description                | ✓           | ✓         | –
Sharing permissions        | Basic       | Detailed  | –
Web view URL               | ✓           | ✓         | –
Starred                    | ✓           | ✓         | –
MD5 / SHA checksum         | –           | ✓         | MD5 only
Labels (Enterprise)        | –           | ✓         | –
Version number             | –           | ✓         | –
Known limitations
  • Google-native files (Docs, Sheets, Slides) do not have a file size in Drive. The size field returns empty for these file types.
  • Drive Labels (custom structured metadata) require Google Workspace Enterprise and the Drive Labels API. They are not available on personal or Business Starter accounts.
  • Shared Drives require the supportsAllDrives and includeItemsFromAllDrives flags in API calls. Apps Script's DriveApp class does not support Shared Drives natively — use the Advanced Drive Service instead.
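The first limitation is easy to smooth over in post-processing. A small sketch (the helper name is my own) that substitutes a placeholder when the size field is absent, as it is for Google-native files:

```python
# Google-native files (Docs, Sheets, Slides) have no "size" in their Drive
# metadata. Normalize the field so the spreadsheet shows a placeholder
# instead of an empty cell.
GOOGLE_NATIVE_PREFIX = "application/vnd.google-apps"

def normalized_size(file_meta, placeholder="native (no size)"):
    size = file_meta.get("size")
    if size is not None:
        return int(size)  # the API returns size as a string of bytes
    if file_meta.get("mimeType", "").startswith(GOOGLE_NATIVE_PREFIX):
        return placeholder
    return ""  # size unknown for some other reason

print(normalized_size({"size": "2048", "mimeType": "application/pdf"}))  # 2048
print(normalized_size({"mimeType": "application/vnd.google-apps.document"}))
# → native (no size)
```

Run this over each row before writing the CSV (Method 2) or adapt the same check inside the Apps Script loop.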

You have your metadata export.
Now score it.

Upload your CSV or Excel file to MQS and get a structural metadata health score out of 100 with dimension breakdowns and actionable diagnostics.

Get Your Free Report · See How It Works

Exporting from another platform?

Dropbox
How to Export File Metadata from Dropbox
Box
How to Export File Metadata from Box
SharePoint
How to Export Document Metadata from SharePoint
Local Server
How to Export File Metadata from a Local Server
Amazon S3
How to Export Object Metadata from AWS S3
Adobe AEM
How to Export Asset Metadata from AEM
Salsify
How to Export Product Metadata from Salsify
Bynder
How to Export Asset Metadata from Bynder
Contentful
How to Export Content Metadata from Contentful
Airtable
How to Export Metadata from Airtable
Canto
How to Export Asset Metadata from Canto
Acquia DAM
How to Export Asset Metadata from Acquia DAM
Orange Logic
How to Export Asset Metadata from Orange Logic
PhotoShelter for Brands
How to Export Asset Metadata from PhotoShelter for Brands
PhotoShelter for Photographers
How to Export Image Metadata from PhotoShelter for Photographers