File uploads are one of the most common features in modern web apps, but implementing them correctly with Supabase Storage can be tricky. Many developers struggle with permission errors, slow uploads, or insecure configurations.
In this article, I'll show you step by step how to implement file uploads with Supabase Storage securely and with good performance, from bucket configuration to production-ready frontend integration.
Supabase Storage is an S3-compatible object storage system seamlessly integrated into your Supabase project. It offers tight integration with Supabase Auth through Row Level Security (RLS) policies, CDN-backed delivery, and built-in image transformations.
Go to your Supabase Dashboard → Storage and create a new bucket:
-- Option 1: Via Dashboard (recommended for beginners)
-- Storage > Create bucket > "uploads" > Choose Public/Private
-- Option 2: Via SQL
INSERT INTO storage.buckets (id, name, public)
VALUES ('uploads', 'uploads', false);
Important: Only set public to true if all files should be publicly accessible (e.g., for product images).
This is the critical part! Without correct RLS policies, uploads won't work:
-- Policy for uploads (INSERT)
CREATE POLICY "Authenticated users can upload files"
ON storage.objects
FOR INSERT
TO authenticated
WITH CHECK (
bucket_id = 'uploads' AND
(storage.foldername(name))[1] = auth.uid()::text
);
-- Policy for reading own files (SELECT)
CREATE POLICY "Users can view own files"
ON storage.objects
FOR SELECT
TO authenticated
USING (
bucket_id = 'uploads' AND
(storage.foldername(name))[1] = auth.uid()::text
);
-- Policy for deleting own files (DELETE)
CREATE POLICY "Users can delete own files"
ON storage.objects
FOR DELETE
TO authenticated
USING (
bucket_id = 'uploads' AND
(storage.foldername(name))[1] = auth.uid()::text
);
-- Policy for updating own files (UPDATE)
CREATE POLICY "Users can update own files"
ON storage.objects
FOR UPDATE
TO authenticated
USING (
bucket_id = 'uploads' AND
(storage.foldername(name))[1] = auth.uid()::text
)
WITH CHECK (
bucket_id = 'uploads' AND
(storage.foldername(name))[1] = auth.uid()::text
);
Explanation: These policies allow users to only manage files in their own folders (folder name = user ID).
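Because every policy keys on the first folder segment of the object name, it pays to centralize the path construction on the client and fail fast before a doomed upload ever hits the API. A small plain-TypeScript sketch (helper names are my own):

```typescript
// Builds the object key the RLS policies above expect: the first
// folder segment must equal the uploading user's ID.
function buildStoragePath(userId: string, fileName: string): string {
  return `${userId}/${fileName}`
}

// Mirrors (storage.foldername(name))[1] = auth.uid()::text on the
// client, so you can reject a mismatched path before uploading.
function pathBelongsToUser(path: string, userId: string): boolean {
  return path.split('/')[0] === userId
}
```

This doesn't replace the policies (the server still enforces them); it just turns a cryptic "new row violates row-level security policy" error into an early, readable one.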
Here's a complete, production-ready upload component:
// components/FileUpload.tsx
import { useState, useRef } from 'react'
import { createClient } from '@supabase/supabase-js'
import { useUser } from '@/hooks/useUser' // your auth hook
const supabase = createClient(
process.env.NEXT_PUBLIC_SUPABASE_URL!,
process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY!
)
interface FileUploadProps {
onUploadComplete: (url: string) => void
allowedTypes?: string[]
maxSize?: number // in MB
bucket?: string
}
export default function FileUpload({
onUploadComplete,
allowedTypes = ['image/jpeg', 'image/png', 'image/webp'],
maxSize = 5,
bucket = 'uploads'
}: FileUploadProps) {
const [uploading, setUploading] = useState(false)
const [uploadProgress, setUploadProgress] = useState(0)
const [error, setError] = useState<string | null>(null)
const fileInputRef = useRef<HTMLInputElement>(null)
const { user } = useUser()
const handleFileSelect = async (event: React.ChangeEvent<HTMLInputElement>) => {
const file = event.target.files?.[0]
if (!file || !user) return
setError(null)
// Validation
if (!allowedTypes.includes(file.type)) {
setError(`File type not allowed. Allowed: ${allowedTypes.join(', ')}`)
return
}
if (file.size > maxSize * 1024 * 1024) {
setError(`File too large. Maximum: ${maxSize}MB`)
return
}
await uploadFile(file)
}
const uploadFile = async (file: File) => {
setUploading(true)
setUploadProgress(0)
try {
// Generate unique filename
const fileExt = file.name.split('.').pop()
const fileName = `${Date.now()}-${Math.random().toString(36).substring(2)}.${fileExt}`
const filePath = `${user.id}/${fileName}`
// Upload the file (note: this call does not report upload progress)
const { data, error } = await supabase.storage
.from(bucket)
.upload(filePath, file, {
cacheControl: '3600',
upsert: false
})
if (error) {
throw error
}
// Generate public URL (for a private bucket, use createSignedUrl instead)
const { data: { publicUrl } } = supabase.storage
.from(bucket)
.getPublicUrl(filePath)
onUploadComplete(publicUrl)
// Reset input
if (fileInputRef.current) {
fileInputRef.current.value = ''
}
} catch (error: any) {
console.error('Upload error:', error)
setError(error.message || 'Upload failed')
} finally {
setUploading(false)
setUploadProgress(0)
}
}
return (
<div className="space-y-4">
<div className="flex items-center justify-center w-full">
<label className="flex flex-col items-center justify-center w-full h-32 border-2 border-gray-300 border-dashed rounded-lg cursor-pointer bg-gray-50 hover:bg-gray-100">
<div className="flex flex-col items-center justify-center pt-5 pb-6">
<svg className="w-8 h-8 mb-4 text-gray-500" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M7 16a4 4 0 01-.88-7.903A5 5 0 1115.9 6L16 6a5 5 0 011 9.9M15 13l-3-3m0 0l-3 3m3-3v12" />
</svg>
<p className="mb-2 text-sm text-gray-500">
<span className="font-semibold">Click to upload</span> or drag and drop
</p>
<p className="text-xs text-gray-500">
{allowedTypes.join(', ')} (max. {maxSize}MB)
</p>
</div>
<input
ref={fileInputRef}
type="file"
className="hidden"
onChange={handleFileSelect}
accept={allowedTypes.join(',')}
disabled={uploading || !user}
/>
</label>
</div>
{uploading && (
<div className="w-full bg-gray-200 rounded-full h-2.5">
<div
className="bg-blue-600 h-2.5 rounded-full transition-all duration-300"
style={{ width: `${uploadProgress}%` }}
></div>
</div>
)}
{error && (
<div className="p-3 text-sm text-red-600 bg-red-50 rounded-md">
{error}
</div>
)}
{uploading && (
<p className="text-sm text-gray-600 text-center">
Uploading... {uploadProgress}%
</p>
)}
</div>
)
}
// pages/profile.tsx
import FileUpload from '@/components/FileUpload'
export default function ProfilePage() {
const handleAvatarUpload = (url: string) => {
console.log('Avatar uploaded:', url)
// Update user profile with new avatar URL
updateUserProfile({ avatar_url: url })
}
return (
<div>
<h1>Edit Profile</h1>
<FileUpload
onUploadComplete={handleAvatarUpload}
allowedTypes={['image/jpeg', 'image/png']}
maxSize={2}
bucket="avatars"
/>
</div>
)
}
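The component above sidesteps filename collisions and odd characters by generating random names. If you'd rather keep the user's original filename, sanitize it first — storage object keys are safest as plain ASCII without spaces or special characters. A convenience sketch (the function name is my own):

```typescript
// Slugifies the base name, keeps a lowercased extension, and falls
// back to "file" if nothing usable remains.
function sanitizeFileName(name: string): string {
  const dot = name.lastIndexOf('.')
  const base = dot > 0 ? name.slice(0, dot) : name
  const ext = dot > 0 ? name.slice(dot + 1).toLowerCase() : ''
  const slug = base
    .toLowerCase()
    .normalize('NFKD')                 // split accented characters
    .replace(/[\u0300-\u036f]/g, '')   // drop the combining accents
    .replace(/[^a-z0-9]+/g, '-')       // collapse everything else to '-'
    .replace(/^-+|-+$/g, '')           // trim leading/trailing '-'
  return ext ? `${slug || 'file'}.${ext}` : (slug || 'file')
}
```

Even with sanitized names, prefixing a timestamp (as the component does) is still a good idea to avoid overwrites.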
The standard upload call doesn't expose progress events. For real progress tracking, you can chunk the upload yourself (or use Supabase's resumable uploads via the TUS protocol):
const uploadFileWithProgress = async (file: File) => {
const chunkSize = 1024 * 1024 // 1MB chunks
const totalChunks = Math.ceil(file.size / chunkSize)
for (let chunkIndex = 0; chunkIndex < totalChunks; chunkIndex++) {
const start = chunkIndex * chunkSize
const end = Math.min(start + chunkSize, file.size)
const chunk = file.slice(start, end)
// Upload chunk...
const progress = ((chunkIndex + 1) / totalChunks) * 100
setUploadProgress(progress)
}
}
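The loop above leaves the per-chunk upload elided, but the boundary math itself is easy to get off by one, so it's worth isolating and testing. A plain-TypeScript sketch (helper names are my own):

```typescript
// Returns [start, end) byte ranges for a file of `size` bytes
// split into pieces of at most `chunkSize` bytes.
function chunkRanges(size: number, chunkSize: number): Array<[number, number]> {
  const ranges: Array<[number, number]> = []
  for (let start = 0; start < size; start += chunkSize) {
    ranges.push([start, Math.min(start + chunkSize, size)])
  }
  return ranges
}

// Progress after finishing chunk i (0-based), as a percentage.
function progressAfterChunk(i: number, totalChunks: number): number {
  return ((i + 1) / totalChunks) * 100
}
```

In the loop you would then call `file.slice(start, end)` for each range, exactly as above.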
const resizeImage = (file: File, maxWidth: number, maxHeight: number): Promise<File> => {
return new Promise((resolve) => {
const canvas = document.createElement('canvas')
const ctx = canvas.getContext('2d')!
const img = new Image()
img.onload = () => {
// Calculate aspect ratio
const ratio = Math.min(maxWidth / img.width, maxHeight / img.height)
canvas.width = img.width * ratio
canvas.height = img.height * ratio
// Draw image on canvas, then free the object URL
ctx.drawImage(img, 0, 0, canvas.width, canvas.height)
URL.revokeObjectURL(img.src)
// Convert back to File
canvas.toBlob((blob) => {
if (blob) {
const resizedFile = new File([blob], file.name, {
type: file.type,
lastModified: Date.now()
})
resolve(resizedFile)
}
}, file.type, 0.9)
}
img.src = URL.createObjectURL(file)
})
}
// Usage:
const handleFileSelect = async (event: React.ChangeEvent<HTMLInputElement>) => {
const file = event.target.files?.[0]
if (!file) return
// Resize image before upload
const resizedFile = await resizeImage(file, 1920, 1080)
await uploadFile(resizedFile)
}
const uploadMultipleFiles = async (files: FileList) => {
const uploadPromises = Array.from(files).map(async (file, index) => {
const fileExt = file.name.split('.').pop()
const fileName = `${Date.now()}-${index}.${fileExt}`
const filePath = `${user.id}/${fileName}`
return supabase.storage
.from('uploads')
.upload(filePath, file)
})
const results = await Promise.all(uploadPromises)
// Note: supabase-js resolves with { error } instead of throwing,
// so Promise.all won't reject - check each result individually
const failed = results.filter(result => result.error)
if (failed.length > 0) {
console.error('Some uploads failed:', failed)
} else {
console.log('All files uploaded:', results)
}
}
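Mapping every file straight into Promise.all fires all requests at once, which can saturate the connection for large batches. A small concurrency limiter keeps things responsive (a generic plain-TypeScript sketch — pass it your per-file upload functions):

```typescript
// Runs at most `limit` tasks concurrently; results keep input order.
async function runLimited<T>(
  tasks: Array<() => Promise<T>>,
  limit: number
): Promise<T[]> {
  const results: T[] = new Array(tasks.length)
  let next = 0
  async function worker() {
    while (next < tasks.length) {
      const i = next++           // safe: JS is single-threaded here
      results[i] = await tasks[i]()
    }
  }
  await Promise.all(
    Array.from({ length: Math.min(limit, tasks.length) }, worker)
  )
  return results
}
```

Usage: `runLimited(files.map(f => () => uploadFile(f)), 3)` uploads three files at a time.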
For sensitive uploads or server-side processing:
// pages/api/upload.ts
import { createServerSupabaseClient } from '@supabase/auth-helpers-nextjs'
import { NextApiRequest, NextApiResponse } from 'next'
import formidable from 'formidable'
import fs from 'fs'
export const config = {
api: {
bodyParser: false, // Disable body parsing for file uploads
},
}
export default async function handler(req: NextApiRequest, res: NextApiResponse) {
if (req.method !== 'POST') {
return res.status(405).json({ error: 'Method not allowed' })
}
const supabase = createServerSupabaseClient({ req, res })
// Check authentication
const { data: { user }, error: authError } = await supabase.auth.getUser()
if (authError || !user) {
return res.status(401).json({ error: 'Unauthorized' })
}
try {
const form = formidable({
maxFileSize: 10 * 1024 * 1024, // 10MB
keepExtensions: true,
})
const [fields, files] = await form.parse(req)
const file = Array.isArray(files.file) ? files.file[0] : files.file
if (!file) {
return res.status(400).json({ error: 'No file provided' })
}
// Read file
const fileBuffer = fs.readFileSync(file.filepath)
// Generate unique filename
const fileExt = file.originalFilename?.split('.').pop()
const fileName = `${Date.now()}-${Math.random().toString(36).substring(2)}.${fileExt}`
const filePath = `${user.id}/${fileName}`
// Upload to Supabase
const { data, error } = await supabase.storage
.from('uploads')
.upload(filePath, fileBuffer, {
contentType: file.mimetype || 'application/octet-stream',
cacheControl: '3600'
})
if (error) {
throw error
}
// Get public URL
const { data: { publicUrl } } = supabase.storage
.from('uploads')
.getPublicUrl(filePath)
// Clean up temp file
fs.unlinkSync(file.filepath)
res.status(200).json({
message: 'Upload successful',
url: publicUrl,
path: data.path
})
} catch (error: any) {
console.error('Upload error:', error)
res.status(500).json({ error: error.message || 'Upload failed' })
}
}
const getUserFiles = async (offset = 0) => {
const { data, error } = await supabase.storage
.from('uploads')
.list(user.id, {
limit: 100,
offset,
sortBy: { column: 'created_at', order: 'desc' }
})
if (error) {
console.error('Error listing files:', error)
return []
}
return data
}
const deleteFile = async (filePath: string) => {
const { error } = await supabase.storage
.from('uploads')
.remove([filePath])
if (error) {
console.error('Error deleting file:', error)
return false
}
return true
}
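Note that remove() (and createSignedUrl()) expect the object path, not the full URL. If all you have stored is a public URL, the path can be recovered with plain string parsing, since public URLs follow the shape `<project>/storage/v1/object/public/<bucket>/<path>`. A sketch (function name my own — double-check against the URLs your project actually emits):

```typescript
// Extracts "u1/a.png" from ".../storage/v1/object/public/uploads/u1/a.png";
// returns null if the URL doesn't belong to the given bucket.
function pathFromPublicUrl(url: string, bucket: string): string | null {
  const marker = `/storage/v1/object/public/${bucket}/`
  const i = url.indexOf(marker)
  return i === -1 ? null : decodeURIComponent(url.slice(i + marker.length))
}
```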
Supabase offers automatic image transformations:
// Generate different sizes
const getImageUrl = (path: string, options?: {
width?: number
height?: number
quality?: number
format?: 'webp' | 'jpeg' | 'png'
}) => {
const { data } = supabase.storage
.from('uploads')
.getPublicUrl(path, {
transform: {
width: options?.width,
height: options?.height,
quality: options?.quality,
format: options?.format
}
})
return data.publicUrl
}
// Usage:
const thumbnailUrl = getImageUrl('user123/image.jpg', {
width: 200,
height: 200,
quality: 80,
format: 'webp'
})
Solution: Check your RLS policies. The user must be authenticated and the policy must allow the upload:
-- Debug: Check if user is authenticated
SELECT auth.uid(), auth.role();
-- Debug: Check bucket configuration
SELECT * FROM storage.buckets WHERE id = 'uploads';
Solution: Check the public URL and CORS settings:
// getPublicUrl is synchronous and never returns an error - it only builds a URL
const { data: { publicUrl } } = supabase.storage
.from('uploads')
.getPublicUrl(filePath)
// If that URL returns 404, the bucket is private - create a signed URL instead
const { data: signed, error } = await supabase.storage
.from('uploads')
.createSignedUrl(filePath, 3600) // valid for one hour
if (error) {
console.error('Error creating signed URL:', error)
}
Solution: Validate file types and sizes on the client before uploading:
const ALLOWED_FILE_TYPES = {
images: ['image/jpeg', 'image/png', 'image/webp', 'image/gif'],
documents: ['application/pdf', 'application/msword', 'text/plain'],
videos: ['video/mp4', 'video/webm', 'video/ogg']
}
const validateFileType = (file: File, category: keyof typeof ALLOWED_FILE_TYPES) => {
return ALLOWED_FILE_TYPES[category].includes(file.type)
}
const MAX_FILE_SIZES = {
image: 5 * 1024 * 1024, // 5MB
document: 10 * 1024 * 1024, // 10MB
video: 100 * 1024 * 1024 // 100MB
}
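Keep in mind that file.type is client-supplied and trivially spoofed. Checking the file's magic bytes is a stronger (though still not bulletproof) signal. A sketch for the image types used above (helper names my own):

```typescript
// Well-known signatures at byte offset 0.
const MAGIC_BYTES: Record<string, number[]> = {
  'image/jpeg': [0xff, 0xd8, 0xff],
  'image/png': [0x89, 0x50, 0x4e, 0x47],
  'image/gif': [0x47, 0x49, 0x46, 0x38] // "GIF8"
}

// Returns the detected MIME type, or null if no signature matches.
function sniffMime(bytes: Uint8Array): string | null {
  for (const [mime, magic] of Object.entries(MAGIC_BYTES)) {
    if (magic.every((b, i) => bytes[i] === b)) return mime
  }
  return null
}
```

In the browser you'd feed it the first few bytes of the file: `sniffMime(new Uint8Array(await file.slice(0, 8).arrayBuffer()))`.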
For production apps, you should integrate a malware scanner:
// Example with ClamAV
const scanFile = async (fileBuffer: Buffer) => {
// Integration with malware scanner
// return scanResult
}
const FileList = () => {
const [files, setFiles] = useState([])
const [loading, setLoading] = useState(false)
const [hasMore, setHasMore] = useState(true)
const loadMoreFiles = async () => {
if (loading || !hasMore) return
setLoading(true)
const newFiles = await getUserFiles(files.length)
if (newFiles.length === 0) {
setHasMore(false)
} else {
setFiles(prev => [...prev, ...newFiles])
}
setLoading(false)
}
return (
<div>
{files.map(file => (
<FileItem key={file.id} file={file} />
))}
{hasMore && (
<button onClick={loadMoreFiles} disabled={loading}>
{loading ? 'Loading...' : 'Load More'}
</button>
)}
</div>
)
}
// Service Worker for aggressive caching
const cacheImages = async (urls: string[]) => {
const cache = await caches.open('supabase-images-v1')
await cache.addAll(urls)
}
const ProgressiveImage = ({ src, placeholder, alt }: {
src: string
placeholder: string
alt: string
}) => {
const [imageLoaded, setImageLoaded] = useState(false)
const [imageSrc, setImageSrc] = useState(placeholder)
useEffect(() => {
setImageLoaded(false)
setImageSrc(placeholder)
const img = new Image()
img.onload = () => {
setImageSrc(src)
setImageLoaded(true)
}
img.src = src
return () => { img.onload = null } // avoid setState after unmount
}, [src, placeholder])
return (
<img
src={imageSrc}
alt={alt}
className={`transition-opacity duration-300 ${
imageLoaded ? 'opacity-100' : 'opacity-50'
}`}
/>
)
}
# .env.local
NEXT_PUBLIC_SUPABASE_URL=your_supabase_url
NEXT_PUBLIC_SUPABASE_ANON_KEY=your_anon_key
SUPABASE_SERVICE_ROLE_KEY=your_service_role_key
-- Enable RLS on storage.objects
ALTER TABLE storage.objects ENABLE ROW LEVEL SECURITY;
-- Set appropriate bucket policies
UPDATE storage.buckets
SET public = false
WHERE id = 'uploads';
// Configure custom domain for storage
const getStorageUrl = (path: string) => {
const baseUrl = process.env.NODE_ENV === 'production'
? 'https://your-custom-domain.com/storage/v1/object/public'
: supabase.storage.from('uploads').getPublicUrl('').data.publicUrl
return `${baseUrl}/${path}`
}
// Track upload metrics
const trackUpload = (fileSize: number, fileType: string, duration: number) => {
// Analytics integration
analytics.track('file_upload', {
file_size: fileSize,
file_type: fileType,
upload_duration: duration
})
}
For very large files or special requirements:
const generatePresignedUrl = async (filename: string) => {
const response = await fetch('/api/generate-presigned-url', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ filename })
})
return response.json()
}
const uploadToS3Direct = async (file: File, presignedUrl: string) => {
await fetch(presignedUrl, {
method: 'PUT',
body: file,
headers: {
'Content-Type': file.type
}
})
}
// Trigger video processing after upload
const handleVideoUpload = async (filePath: string) => {
// Trigger Edge Function for video processing
await supabase.functions.invoke('process-video', {
body: { filePath }
})
}
// Share files with other users
const shareFile = async (filePath: string, userIds: string[]) => {
const { data, error } = await supabase
.from('file_shares')
.insert(
userIds.map(userId => ({
file_path: filePath,
shared_with: userId,
shared_by: user.id,
permissions: 'read'
}))
)
return { data, error }
}
Supabase Storage provides a powerful and flexible solution for file uploads. The key points: configure buckets deliberately (public vs. private), write the RLS policies before your first upload, validate files on both client and server, and optimize images before and after upload.
With these implementations, you have a production-ready file upload solution that's secure, performant, and user-friendly.
Having trouble with your file upload implementation? Leave a comment and I'll be happy to help!
Nikolai Fischer is the founder of Kommune3 (since 2007) and a leading expert in Drupal development and tech entrepreneurship. With 17+ years of experience, he has led hundreds of projects and achieved #1 on Hacker News. As host of the "Kommit mich" podcast and founder of skillution, he combines technical expertise with entrepreneurial thinking. His articles about Supabase, modern web development, and systematic problem-solving have influenced thousands of developers worldwide.