How to Upload and Schedule YouTube Videos via API (2026 Guide)


YouTube is the second-largest search engine on the planet. If you are building a tool that publishes video content — a CMS, an AI video pipeline, a client management platform, a content repurposing workflow — you will eventually need to upload and schedule videos to YouTube programmatically.
The YouTube Data API v3 can do it. But "can" and "should" are different conversations. The API has a strict quota system, resumable upload requirements for any production use case, and an OAuth consent flow that Google will audit before you can go live. This guide covers both the native route and a simpler unified alternative so you can make an informed decision.
Table of Contents
- Why Upload to YouTube via API?
- Option 1: YouTube Data API v3 (Native)
- Option 2: PostEverywhere API (Unified)
- Comparison: Native vs Unified API
- YouTube API Quota Management
- Shorts vs Long-Form via API
- Error Handling and Edge Cases
- FAQs
Why Upload to YouTube via API?
Manual uploads work fine when you are publishing one video a week. They stop working when:
- You are managing multiple YouTube channels for clients or brands
- Your pipeline produces video at scale (AI-generated content, repurposed clips, batch renders)
- You need to schedule uploads in advance with specific publish times
- Your CMS or internal tool needs to push video directly to YouTube without human intervention
- You are building a product that offers social media scheduling as a feature
In all of these cases, you need a programmatic path from "video file on disk" to "published (or scheduled) on YouTube." There are two ways to get there.
Option 1: YouTube Data API v3 (Native)
The YouTube Data API v3 is Google's official API for managing YouTube resources — videos, playlists, channels, comments, and more. The videos.insert endpoint handles uploads.
Prerequisites
Before you write a single line of code, you need:
- A Google Cloud project — Create one in the Google Cloud Console.
- YouTube Data API v3 enabled — Enable it under APIs & Services > Library.
- OAuth 2.0 credentials — Video uploads require OAuth 2.0 (API keys are not sufficient for write operations). Create credentials under APIs & Services > Credentials.
- OAuth consent screen configured — You must configure the consent screen, add the https://www.googleapis.com/auth/youtube.upload scope, and submit for verification if you are going beyond testing.
- A verified application (for production) — Google audits apps requesting sensitive YouTube scopes. This can take weeks.
If you have worked with other social media APIs, you know that authentication is always the hardest part. YouTube's OAuth flow is no different — but the Google Cloud Console adds extra layers compared to Meta or TikTok.
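To make the flow concrete, here is a minimal sketch of building the Google OAuth 2.0 consent URL with the youtube.upload scope. The endpoint and parameter names follow Google's standard OAuth 2.0 web flow; `buildConsentUrl` is our own helper, and the client ID and redirect URI are placeholders you substitute with your own values.

```javascript
// Build the Google OAuth 2.0 consent URL for the youtube.upload scope.
// The endpoint and query parameters are Google's standard OAuth 2.0 web
// flow; clientId and redirectUri are placeholders for your own values.
function buildConsentUrl(clientId, redirectUri) {
  const params = new URLSearchParams({
    client_id: clientId,
    redirect_uri: redirectUri,
    response_type: 'code',  // authorization-code flow
    scope: 'https://www.googleapis.com/auth/youtube.upload',
    access_type: 'offline',  // ask for a refresh token
    prompt: 'consent'        // force a refresh token on re-authorization
  });
  return `https://accounts.google.com/o/oauth2/v2/auth?${params.toString()}`;
}

// Send the user here; Google redirects back with ?code=..., which you
// exchange at https://oauth2.googleapis.com/token for access + refresh tokens.
console.log(buildConsentUrl('YOUR_CLIENT_ID', 'https://yourapp.example/oauth/callback'));
```

Without `access_type: 'offline'` you will not receive a refresh token, and your integration will break as soon as the short-lived access token expires.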
Understanding the Quota System
This is the part that catches most developers off guard. The YouTube Data API v3 uses a quota system rather than simple rate limits.
Every Google Cloud project gets 10,000 quota units per day by default. Different operations cost different amounts:
| Operation | Quota Cost |
|---|---|
| videos.insert (upload) | 1,600 units |
| videos.update | 50 units |
| videos.list | 1 unit |
| channels.list | 1 unit |
| search.list | 100 units |
| thumbnails.set | 50 units |
With the default 10,000 units per day, you can upload a maximum of 6 videos per day. That is not a lot if you are managing multiple channels.
Quota resets at midnight Pacific Time. There is no way to purchase additional quota directly — you must submit a quota increase request through the Google Cloud Console and justify your use case.
Resumable Uploads
For any file over a trivial size, you should use resumable uploads. This is not optional in production — network interruptions, timeouts, and large file sizes make simple uploads unreliable.
A resumable upload works in three steps:
- Initiate the upload — Send a POST request with video metadata. The response includes a resumable upload URI.
- Upload the file — Send the actual file bytes to the resumable URI in one or more chunks.
- Handle interruptions — If the upload fails mid-stream, query the URI to find out how many bytes were received, then resume from that offset.
Here is the full flow in Node.js using the Google APIs client library:
```javascript
import { google } from 'googleapis';
import fs from 'fs';

const oauth2Client = new google.auth.OAuth2(
  process.env.GOOGLE_CLIENT_ID,
  process.env.GOOGLE_CLIENT_SECRET,
  process.env.GOOGLE_REDIRECT_URI
);

oauth2Client.setCredentials({
  access_token: process.env.YOUTUBE_ACCESS_TOKEN,
  refresh_token: process.env.YOUTUBE_REFRESH_TOKEN
});

const youtube = google.youtube({ version: 'v3', auth: oauth2Client });

async function uploadVideo() {
  const response = await youtube.videos.insert({
    part: ['snippet', 'status'],
    requestBody: {
      snippet: {
        title: 'How to Build a REST API in 10 Minutes',
        description: 'Step-by-step tutorial covering Express.js setup, routing, and deployment.',
        tags: ['api', 'rest api', 'tutorial', 'express'],
        categoryId: '28', // Science & Technology
        defaultLanguage: 'en'
      },
      status: {
        privacyStatus: 'private',
        publishAt: '2026-04-20T15:00:00Z',
        selfDeclaredMadeForKids: false
      }
    },
    media: {
      body: fs.createReadStream('./video.mp4')
    }
  });

  console.log('Video uploaded:', response.data.id);
  return response.data;
}
```
A few things to note in this example:
- Scheduling requires privacyStatus: "private" and a publishAt timestamp. This is non-obvious. You cannot set publishAt on a public video. The video must be private, and YouTube will automatically make it public at the specified time.
- categoryId is a number, not a string label. You need to look up the category ID for your region using the videoCategories.list endpoint.
- selfDeclaredMadeForKids is required. If you omit it, the API may reject the request or default to the channel-level setting.
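Because category IDs vary by region, it is worth resolving them at runtime rather than hardcoding. Here is a sketch: `findCategoryId` is our own helper, but the response shape (`items[].id`, `items[].snippet.title`) matches what videoCategories.list returns.

```javascript
// Resolve a categoryId from a videoCategories.list response instead of
// hardcoding it. findCategoryId is our own helper; the response shape
// (items[].id, items[].snippet.title) follows the API's documented format.
function findCategoryId(listResponse, title) {
  const match = listResponse.items.find(
    item => item.snippet.title.toLowerCase() === title.toLowerCase()
  );
  return match ? match.id : null;
}

// Usage with the googleapis client (costs 1 quota unit):
// const res = await youtube.videoCategories.list({ part: ['snippet'], regionCode: 'US' });
// const categoryId = findCategoryId(res.data, 'Science & Technology');
```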
The googleapis library handles resumable uploads automatically under the hood when you pass a readable stream. If you are using raw HTTP requests, you will need to implement the resumable protocol yourself.
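If you do go the raw-HTTP route, here is a sketch of the resumable protocol's key moves. The endpoint, query parameters, and headers follow Google's documented resumable upload flow; the helper names (`nextOffsetFromRange`, `initiateUpload`, `queryReceivedBytes`) are our own, and error handling is omitted for brevity.

```javascript
// Raw resumable protocol sketch (no client library). Endpoint, query
// parameters, and headers follow Google's documented resumable upload
// flow; the helper names are our own.

// Parse the Range header from a 308 response, e.g. "bytes=0-999" means
// 1,000 bytes were received, so the upload resumes at offset 1000.
// No Range header means nothing was received yet.
function nextOffsetFromRange(rangeHeader) {
  const match = /bytes=0-(\d+)/.exec(rangeHeader || '');
  return match ? Number(match[1]) + 1 : 0;
}

// Step 1: initiate — POST the metadata; the resumable session URI comes
// back in the Location header. accessToken, metadata, and fileSize are
// assumed to be in scope.
async function initiateUpload(accessToken, metadata, fileSize) {
  const res = await fetch(
    'https://www.googleapis.com/upload/youtube/v3/videos?uploadType=resumable&part=snippet,status',
    {
      method: 'POST',
      headers: {
        'Authorization': `Bearer ${accessToken}`,
        'Content-Type': 'application/json',
        'X-Upload-Content-Type': 'video/mp4',
        'X-Upload-Content-Length': String(fileSize)
      },
      body: JSON.stringify(metadata)
    }
  );
  return res.headers.get('location'); // resumable session URI
}

// Step 3: after an interruption, ask the session URI how much arrived,
// then resume from that offset with a Content-Range header on the next PUT.
async function queryReceivedBytes(uploadUri, fileSize) {
  const res = await fetch(uploadUri, {
    method: 'PUT',
    headers: { 'Content-Range': `bytes */${fileSize}` }
  });
  // 308 means "resume incomplete"; the Range header says how far it got.
  return res.status === 308 ? nextOffsetFromRange(res.headers.get('range')) : fileSize;
}
```

Step 2 (the missing middle) is a PUT of the file bytes to the session URI; on resume, you send only the remaining bytes with a `Content-Range: bytes <offset>-<end>/<total>` header.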
Setting a Custom Thumbnail
YouTube does not allow you to set a custom thumbnail in the same videos.insert call. It is a separate API call:
```javascript
await youtube.thumbnails.set({
  videoId: response.data.id,
  media: {
    body: fs.createReadStream('./thumbnail.jpg')
  }
});
```
This costs an additional 50 quota units. The thumbnail must be under 2 MB and in JPG, GIF, or PNG format; YouTube recommends 1280 x 720 pixels (16:9), with a minimum width of 640 pixels.
Python Example
If you prefer Python, the google-api-python-client library follows a similar pattern:
```python
from googleapiclient.discovery import build
from googleapiclient.http import MediaFileUpload
from google.oauth2.credentials import Credentials

credentials = Credentials(
    token=access_token,
    refresh_token=refresh_token,
    token_uri='https://oauth2.googleapis.com/token',
    client_id=client_id,
    client_secret=client_secret
)

youtube = build('youtube', 'v3', credentials=credentials)

request_body = {
    'snippet': {
        'title': 'How to Build a REST API in 10 Minutes',
        'description': 'Step-by-step tutorial for Express.js.',
        'tags': ['api', 'rest api', 'tutorial'],
        'categoryId': '28'
    },
    'status': {
        'privacyStatus': 'private',
        'publishAt': '2026-04-20T15:00:00Z',
        'selfDeclaredMadeForKids': False
    }
}

media = MediaFileUpload(
    'video.mp4',
    chunksize=256 * 1024,  # 256 KB chunks
    resumable=True
)

request = youtube.videos().insert(
    part='snippet,status',
    body=request_body,
    media_body=media
)

response = None
while response is None:
    status, response = request.next_chunk()
    if status:
        print(f'Upload progress: {int(status.progress() * 100)}%')

print(f'Video uploaded: {response["id"]}')
```
The Python client makes chunked resumable uploads explicit — you call next_chunk() in a loop and can monitor progress. This is useful for large files and gives you a natural place to implement retry logic.
Option 2: PostEverywhere API (Unified)
If you have read the social media scheduling API guide, you know the pattern: instead of integrating with each platform's native API individually, a unified API gives you one endpoint for all platforms.
PostEverywhere's API handles YouTube uploads alongside Instagram, TikTok, LinkedIn, Facebook, X, and Threads — all through the same interface.
Why Use a Unified API for YouTube?
The YouTube Data API v3 is powerful, but it comes with baggage:
- Quota management is your problem. You need to track units, handle quota exhaustion errors, and potentially manage multiple Google Cloud projects.
- OAuth consent screen verification can take weeks and requires ongoing compliance.
- Resumable upload implementation adds complexity to your codebase.
- Token refresh logic needs to handle Google's specific OAuth flows.
A unified API abstracts all of this. You send a video URL and metadata, and the API handles quota management, upload chunking, token refresh, and error recovery.
Code Example
```javascript
const response = await fetch('https://api.posteverywhere.com/v1/posts', {
  method: 'POST',
  headers: {
    'Authorization': 'Bearer YOUR_API_KEY',
    'Content-Type': 'application/json'
  },
  body: JSON.stringify({
    platforms: ['youtube'],
    content: {
      title: 'How to Build a REST API in 10 Minutes',
      description: 'Step-by-step tutorial covering Express.js setup, routing, and deployment.',
      tags: ['api', 'rest api', 'tutorial', 'express'],
      videoUrl: 'https://storage.example.com/video.mp4'
    },
    scheduledAt: '2026-04-20T15:00:00Z'
  })
});

const post = await response.json();
console.log('Scheduled:', post.id);
```
That is the entire upload and scheduling flow. No OAuth token management, no quota tracking, no resumable upload protocol. You pass a URL to the video file (hosted anywhere accessible), and the API handles ingestion and publishing.
Want to publish to YouTube and 7 other platforms from one API call? Check out the PostEverywhere API docs — schedule video, images, and text to all major platforms with a single request.
Cross-Platform Scheduling
The real power of a unified API shows up when you need to publish the same video across platforms. One request can schedule to YouTube, TikTok, Instagram Reels, and LinkedIn simultaneously:
```javascript
body: JSON.stringify({
  platforms: ['youtube', 'tiktok', 'instagram', 'linkedin'],
  content: {
    title: 'How to Build a REST API in 10 Minutes',
    description: 'Step-by-step tutorial for Express.js.',
    videoUrl: 'https://storage.example.com/video.mp4'
  },
  scheduledAt: '2026-04-20T15:00:00Z'
})
```
The API handles platform-specific formatting — YouTube gets the full title and tags, TikTok gets a caption within character limits, Instagram Reels get the video cropped appropriately. This is the same approach covered in the scheduling API guide.
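To illustrate the kind of normalization a unified API performs internally, here is a sketch. YouTube's 100-character title limit is documented; the caption limits below are rough assumptions for illustration, not official platform numbers, and `adaptForPlatform` is a hypothetical helper, not part of any real SDK.

```javascript
// Illustrative sketch of per-platform metadata normalization. YouTube's
// 100-character title limit is documented; the caption limits below are
// rough assumptions, not official platform numbers.
const LIMITS = {
  youtube: { title: 100 },
  tiktok: { caption: 2200 },
  instagram: { caption: 2200 }
};

function truncate(text, max) {
  return text.length <= max ? text : text.slice(0, max - 1) + '…';
}

function adaptForPlatform(platform, content) {
  if (platform === 'youtube') {
    // YouTube keeps title, description, and tags as separate fields.
    return {
      title: truncate(content.title, LIMITS.youtube.title),
      description: content.description,
      tags: content.tags || []
    };
  }
  // Caption-based platforms get title + description folded into one caption.
  const caption = `${content.title}\n\n${content.description}`;
  return { caption: truncate(caption, LIMITS[platform].caption) };
}
```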
Comparison: Native vs Unified API
| Feature | YouTube Data API v3 | PostEverywhere API |
|---|---|---|
| Authentication | OAuth 2.0 + consent screen verification | API key |
| Quota | 10,000 units/day (6 uploads) | Based on plan tier |
| Upload method | Resumable uploads (you implement) | Video URL (API handles transfer) |
| Scheduling | publishAt + privacyStatus: private | scheduledAt field |
| Thumbnails | Separate API call (50 units) | Included in post payload |
| Shorts support | Same endpoint, vertical/square ≤ 3 min | Same endpoint, auto-detected |
| Multi-platform | YouTube only | 8 platforms in one request |
| Token refresh | Your responsibility | Managed |
| Setup time | Hours to days (consent screen) | Minutes |
| Cost | Free (with quota limits) | Starts at $19/mo |
When to use the native API: You need deep YouTube-specific features (managing playlists, comments, community posts, live streams), you only publish to YouTube, or you are building YouTube-specific tooling.
When to use a unified API: You publish to multiple platforms, you want faster integration, or you do not want to manage OAuth flows, quota tracking, and resumable uploads yourself. If you are already scheduling YouTube videos through a tool, the API is the programmatic version of that same workflow.
YouTube API Quota Management
Quota is the single biggest pain point with the YouTube Data API. Here is how to manage it effectively.
How Quota Works
- Every Google Cloud project gets 10,000 units per day by default
- Quota resets at midnight Pacific Time (not UTC)
- Each API method has a fixed cost — uploads at 1,600 units are the most expensive
- Read operations are cheap (1 unit for most list calls), but search.list is expensive at 100 units
- Failed requests still consume quota
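The API does not tell you how many units remain, so many teams count usage client-side. Here is a minimal tracker using the method costs from the table earlier; `QuotaTracker` is our own helper, not part of any Google library.

```javascript
// Minimal client-side quota tracker, using the method costs from the
// quota table above. QuotaTracker is our own helper, not part of any
// Google library — the API does not report remaining units.
const QUOTA_COSTS = {
  'videos.insert': 1600,
  'videos.update': 50,
  'videos.list': 1,
  'channels.list': 1,
  'search.list': 100,
  'thumbnails.set': 50
};

class QuotaTracker {
  constructor(dailyLimit = 10000) {
    this.dailyLimit = dailyLimit;
    this.used = 0;
  }

  // Call before each request; throws if the call would blow the budget.
  // Record before sending, not after — failed requests burn quota too.
  record(method) {
    const cost = QUOTA_COSTS[method] ?? 1;
    if (this.used + cost > this.dailyLimit) {
      throw new Error(
        `Quota budget exceeded: ${method} costs ${cost}, only ${this.dailyLimit - this.used} units left`
      );
    }
    this.used += cost;
    return this.dailyLimit - this.used; // remaining units
  }
}
```

Reset the counter at midnight Pacific Time, when Google's own counter resets.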
Requesting a Quota Increase
If 6 uploads per day is not enough (it usually is not), you can request a higher quota:
- Go to the Google Cloud Console > IAM & Admin > Quotas
- Filter for "YouTube Data API v3"
- Select the quota you want to increase and click "Edit Quotas"
- Fill out the request form — you need to explain your use case, expected usage, and why the default is insufficient
Google reviews these manually. Approval can take anywhere from a few days to several weeks. Common reasons for rejection:
- Vague use case description — Be specific about what your application does and why it needs more uploads
- No OAuth consent screen verification — Google wants to see that your app is verified before granting higher quotas
- Requesting too much too fast — Start with a moderate increase and scale up
Common Quota Pitfalls
Polling burns quota fast. If you are checking upload status or video processing state by repeatedly calling videos.list, those 1-unit calls add up. Use webhooks or YouTube's push notification system instead.
Search is expensive. A single search.list call costs 100 units. If your application searches YouTube as part of its workflow, switch to videos.list or channels.list with specific IDs where possible.
Batch requests do not save quota. Unlike some Google APIs, the YouTube Data API counts each operation in a batch separately. Batching saves HTTP connections, not quota units.
Errors still cost quota. A failed videos.insert still burns 1,600 units. Validate your metadata and file before attempting the upload.
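Since a failed insert still costs 1,600 units, it pays to catch obvious metadata problems locally first. Here is a sketch covering a few documented constraints (title required and at most 100 characters, no angle brackets in titles, publishAt requiring a private video and a future timestamp); `validateMetadata` is our own helper and is deliberately not exhaustive.

```javascript
// Pre-flight metadata validation: a failed videos.insert still burns
// 1,600 units, so catch obvious problems locally. This checks a few
// documented constraints and is deliberately not exhaustive.
function validateMetadata(requestBody, now = new Date()) {
  const errors = [];
  const { snippet = {}, status = {} } = requestBody;

  if (!snippet.title || snippet.title.length > 100) {
    errors.push('title is required and must be 100 characters or fewer');
  }
  if (snippet.title && /[<>]/.test(snippet.title)) {
    errors.push('title must not contain < or >');
  }
  if (status.publishAt) {
    if (status.privacyStatus !== 'private') {
      errors.push('publishAt requires privacyStatus: "private"');
    }
    if (new Date(status.publishAt) <= now) {
      errors.push('publishAt must be in the future');
    }
  }
  if (status.selfDeclaredMadeForKids === undefined) {
    errors.push('selfDeclaredMadeForKids should be set explicitly');
  }
  return errors; // empty array means these checks passed
}
```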
Tracking Quota Usage
Monitor your quota usage through the Google Cloud Console under APIs & Services > Dashboard, and set up alerts to notify you when you approach your daily limit. The API does not report remaining quota in response headers, so if you need programmatic tracking, count units client-side as you issue calls.
Shorts vs Long-Form via API
YouTube Shorts use the exact same videos.insert endpoint as long-form videos. There is no separate "Shorts API." The distinction is based on the video itself.
What Makes a Video a Short
YouTube classifies a video as a Short if:
- Duration is 3 minutes or less (the limit was 60 seconds until October 2024)
- Aspect ratio is vertical (9:16) or square (1:1)
That is it. You do not need to set a "Short" flag in the API. YouTube auto-detects based on the video properties.
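You can mirror that classification locally before upload, which is handy for deciding which metadata template to apply. A minimal sketch: `isShort` is our own helper, and the duration ceiling is parameterized because it has changed over time (60 seconds originally, 3 minutes since October 2024).

```javascript
// Mirror YouTube's Shorts classification locally before upload. The
// duration ceiling has changed over time (60 seconds originally, extended
// to 3 minutes in October 2024), so it is a parameter here.
function isShort(durationSeconds, width, height, maxSeconds = 180) {
  const verticalOrSquare = height >= width; // 9:16 vertical or 1:1 square
  return durationSeconds <= maxSeconds && verticalOrSquare;
}
```

Pull the duration and dimensions from your encoder or a probe tool (e.g. ffprobe) before choosing the Shorts or long-form metadata path.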
Metadata Differences
While the upload endpoint is identical, there are practical differences in how you handle metadata:
Title: Shorts titles tend to be shorter and punchier. The title appears below the video in the Shorts feed, not above it like on regular watch pages. Including #Shorts in the title is no longer necessary — YouTube detects Shorts automatically — but some creators still add it for clarity.
Description: Shorts descriptions are less prominent in the UI. Viewers rarely see them in the Shorts feed. However, descriptions still matter for search and discovery, so include relevant keywords.
Thumbnails: YouTube generates thumbnails for Shorts from the video itself. You can set a custom thumbnail via the API (thumbnails.set), but it may only appear in certain surfaces (like channel pages and search results), not in the Shorts feed itself.
Tags: Tags work the same way for Shorts and long-form. YouTube uses them as a secondary signal for categorisation.
Code Example: Uploading a Short
```javascript
const response = await youtube.videos.insert({
  part: ['snippet', 'status'],
  requestBody: {
    snippet: {
      title: 'Build an API in 60 seconds',
      description: 'Express.js speed run. Full tutorial on our channel.',
      tags: ['api', 'express', 'coding', 'tutorial'],
      categoryId: '28'
    },
    status: {
      privacyStatus: 'public',
      selfDeclaredMadeForKids: false
    }
  },
  media: {
    body: fs.createReadStream('./short-vertical.mp4') // vertical 9:16, within the Shorts duration limit
  }
});
```
The only difference from a long-form upload is the video file itself. If you want to learn more about scheduling Shorts specifically, the YouTube Shorts scheduling guide covers the workflow in detail.
Scheduling Shorts and long-form from one dashboard? PostEverywhere's YouTube scheduler handles both formats — upload, schedule, and publish without worrying about format detection.
Error Handling and Edge Cases
The YouTube Data API returns standard Google API error responses. Here are the ones you will hit most often:
Common Upload Errors
quotaExceeded (403): You have exhausted your daily quota. There is no way to continue until quota resets at midnight Pacific Time. The only options are waiting or using a different Google Cloud project.
uploadLimitExceeded (403): You have hit the per-channel upload limit. YouTube limits the number of videos a channel can upload per day (the exact number is not public and varies by channel age and standing).
invalidMetadata (400): Something in your snippet or status object is wrong. Common causes: invalid categoryId, publishAt in the past, or a title exceeding 100 characters.
forbidden (403): The authenticated user does not have permission to upload to the target channel. Verify your OAuth scopes include youtube.upload.
videoTooLong (400): The video exceeds the maximum duration for the account. Unverified accounts are limited to 15-minute videos. Verify the channel through YouTube Studio to remove this limit.
Retry Strategy
For resumable uploads, implement exponential backoff:
```javascript
async function uploadWithRetry(youtube, params, maxRetries = 5) {
  for (let attempt = 0; attempt < maxRetries; attempt++) {
    try {
      const response = await youtube.videos.insert(params);
      return response.data;
    } catch (error) {
      if (error.code === 503 || error.code === 500) {
        // Exponential backoff with jitter: 1s, 2s, 4s, ... plus up to 1s of noise
        const delay = Math.pow(2, attempt) * 1000 + Math.random() * 1000;
        console.log(`Retry ${attempt + 1} in ${delay}ms`);
        await new Promise(resolve => setTimeout(resolve, delay));
        continue;
      }
      throw error; // Non-retryable error
    }
  }
  throw new Error('Max retries exceeded');
}
```
Do not retry on quotaExceeded or invalidMetadata — those will not resolve on their own.
Processing State
After a successful upload, the video enters a processing state. It is not immediately available to viewers. You can check processing status with:
```javascript
const status = await youtube.videos.list({
  part: ['processingDetails'],
  id: [videoId]
});

const processing = status.data.items[0].processingDetails;
console.log('Status:', processing.processingStatus);
// "processing", "succeeded", or "failed"
```
Processing time depends on video length, resolution, and YouTube's current load. A 10-minute 1080p video typically takes 5-15 minutes to process.
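Because each status check costs a quota unit, poll on a widening schedule rather than in a tight loop. Here is a sketch: `pollProcessing` is our own wrapper around the googleapis client (assumed in scope), and the delay schedule is an assumption, not an official recommendation.

```javascript
// Quota-aware polling for processing status. Each videos.list call costs
// 1 unit, so poll on a widening schedule instead of a tight loop.
// pollProcessing is our own wrapper, not part of the googleapis client.

// Widening delay schedule in seconds: 30, 60, 120, 240, capped at 300.
function pollDelaySeconds(attempt, base = 30, cap = 300) {
  return Math.min(base * 2 ** attempt, cap);
}

async function pollProcessing(youtube, videoId, maxAttempts = 10) {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const res = await youtube.videos.list({
      part: ['processingDetails'],
      id: [videoId]
    });
    const state = res.data.items[0].processingDetails.processingStatus;
    if (state !== 'processing') return state; // "succeeded" or "failed"
    await new Promise(r => setTimeout(r, pollDelaySeconds(attempt) * 1000));
  }
  throw new Error(`Video ${videoId} still processing after ${maxAttempts} polls`);
}
```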
Putting It All Together
If your use case is YouTube-only and you need fine-grained control over playlists, comments, and live streaming, the native YouTube Data API v3 is the right choice. Accept the complexity of OAuth, quota management, and resumable uploads as the cost of that control.
If you are building a multi-platform publishing tool, managing content for clients, or just want to get videos on YouTube without wading through Google Cloud Console setup, a unified API approach saves significant development time. The PostEverywhere API handles the hard parts — OAuth token management, quota optimization, chunked uploads, and cross-platform formatting — so you can focus on your core product.
For more on how scheduling APIs work across all major platforms, read the complete scheduling API guide. Or if you are specifically looking for the best tools (not APIs) for scheduling YouTube content, we have covered that too. You can also explore PostEverywhere's YouTube scheduler to see the no-code version of the same workflow.
FAQs
How many videos can I upload per day with the YouTube API?
With the default quota of 10,000 units per day, you can upload 6 videos (each videos.insert call costs 1,600 units). You can request a quota increase through the Google Cloud Console, but approval takes days to weeks and is not guaranteed. Setting custom thumbnails costs an additional 50 units per video.
Do I need OAuth 2.0 to upload videos to YouTube?
Yes. YouTube video uploads require OAuth 2.0 authentication with the youtube.upload scope. API keys only work for read operations like searching and listing public videos. You also need to configure and verify your OAuth consent screen through the Google Cloud Console, which can take time for production apps.
How do I schedule a YouTube video via API?
Set privacyStatus to "private" and include a publishAt timestamp in ISO 8601 format in the video's status object. YouTube will automatically change the video to public at the specified time. You cannot set publishAt on a video that is already public — it must be private.
Is there a separate API for YouTube Shorts?
No. Shorts use the same videos.insert endpoint as long-form videos. YouTube automatically classifies a video as a Short if it is 3 minutes or less (extended from 60 seconds in October 2024) and has a vertical (9:16) or square (1:1) aspect ratio. No special flag or parameter is needed.
What happens if I exceed my YouTube API quota?
All API requests will fail with a quotaExceeded error (HTTP 403) until the quota resets at midnight Pacific Time. There is no way to pay for additional quota on demand. Your only options are waiting, distributing uploads across multiple Google Cloud projects, or using a third-party API that manages quota across its own infrastructure.
Can I upload to multiple YouTube channels via API?
Yes, but each channel requires its own OAuth authorization. Your application needs to store and manage separate access and refresh tokens for each channel. When making API calls, you use the token for the specific channel you want to upload to. A unified scheduling API simplifies this by managing all channel connections through a single dashboard.
What video formats does the YouTube API accept?
YouTube accepts MOV, MPEG-4 (MP4), AVI, WMV, FLV, 3GPP, and WebM. MP4 with H.264 encoding and AAC audio is the recommended format for best quality and fastest processing. Maximum file size is 256 GB or 12 hours, whichever is less. Unverified channels are limited to 15-minute uploads.
How long does YouTube take to process an uploaded video?
Processing time varies by video length and resolution. A 10-minute 1080p video typically takes 5-15 minutes. 4K videos take longer — sometimes 30 minutes or more. During processing, the video is not available to viewers even if set to public. You can poll the processingDetails via the API, but be mindful that each videos.list call costs 1 quota unit.

Founder & CEO of PostEverywhere. Writing about social media strategy, publishing workflows, and analytics that help brands grow faster.