NikoFischer.com

Supabase Storage: How to Implement File Upload Properly


File uploads are one of the most common features in modern web apps, but implementing them correctly with Supabase Storage can be tricky. Many developers struggle with permission errors, slow uploads, or insecure configurations.

In this article, I'll show you step by step how to implement file uploads with Supabase Storage securely and with good performance, from bucket configuration to production-ready frontend integration.

What is Supabase Storage?

Supabase Storage is an S3-compatible object storage system seamlessly integrated into your Supabase project. It offers:

  • Row Level Security for secure file access
  • Automatic image optimization and transformation
  • CDN integration for fast delivery
  • Resumable uploads for large files
  • Webhook integration for automatic processing

Step 1: Setting Up Storage Bucket

Creating a Bucket

Go to your Supabase Dashboard → Storage and create a new bucket:

-- Option 1: Via Dashboard (recommended for beginners)
-- Storage > Create bucket > "uploads" > Choose Public/Private

-- Option 2: Via SQL
INSERT INTO storage.buckets (id, name, public)
VALUES ('uploads', 'uploads', false);

Important: Only set public to true if all files should be publicly accessible (e.g., for product images). Files in private buckets must be accessed through signed URLs or authenticated requests.

Configuring RLS Policies

This is the critical part! Without correct RLS policies, uploads won't work:

-- Policy for uploads (INSERT)
CREATE POLICY "Authenticated users can upload files"
ON storage.objects
FOR INSERT
TO authenticated
WITH CHECK (
  bucket_id = 'uploads' AND
  (storage.foldername(name))[1] = auth.uid()::text
);

-- Policy for reading own files (SELECT)
CREATE POLICY "Users can view own files"
ON storage.objects
FOR SELECT
TO authenticated
USING (
  bucket_id = 'uploads' AND
  (storage.foldername(name))[1] = auth.uid()::text
);

-- Policy for deleting own files (DELETE)
CREATE POLICY "Users can delete own files"
ON storage.objects
FOR DELETE
TO authenticated
USING (
  bucket_id = 'uploads' AND
  (storage.foldername(name))[1] = auth.uid()::text
);

-- Policy for updating own files (UPDATE)
CREATE POLICY "Users can update own files"
ON storage.objects
FOR UPDATE
TO authenticated
USING (
  bucket_id = 'uploads' AND
  (storage.foldername(name))[1] = auth.uid()::text
)
WITH CHECK (
  bucket_id = 'uploads' AND
  (storage.foldername(name))[1] = auth.uid()::text
);

Explanation: These policies allow users to only manage files in their own folders (folder name = user ID).
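The folder convention the policies enforce can be mirrored client-side with a small helper, so bad paths are caught before the request even reaches Postgres. A sketch; buildUploadPath and matchesOwnFolder are illustrative names, not part of the Supabase API:

```typescript
// Mirrors the (storage.foldername(name))[1] = auth.uid() check:
// the first path segment must equal the uploading user's ID.
function buildUploadPath(userId: string, fileName: string): string {
  return `${userId}/${fileName}`
}

function matchesOwnFolder(path: string, userId: string): boolean {
  return path.split('/')[0] === userId
}
```

Uploading to a path that fails this check is exactly what triggers the "new row violates row-level security policy" error covered further down.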

Step 2: Frontend Implementation

React/Next.js Upload Component

Here's a complete upload component you can adapt for production:

// components/FileUpload.tsx
import { useState, useRef } from 'react'
import { createClient } from '@supabase/supabase-js'
import { useUser } from '@/hooks/useUser' // your auth hook

const supabase = createClient(
  process.env.NEXT_PUBLIC_SUPABASE_URL!,
  process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY!
)

interface FileUploadProps {
  onUploadComplete: (url: string) => void
  allowedTypes?: string[]
  maxSize?: number // in MB
  bucket?: string
}

export default function FileUpload({
  onUploadComplete,
  allowedTypes = ['image/jpeg', 'image/png', 'image/webp'],
  maxSize = 5,
  bucket = 'uploads'
}: FileUploadProps) {
  const [uploading, setUploading] = useState(false)
  const [uploadProgress, setUploadProgress] = useState(0)
  const [error, setError] = useState<string | null>(null)
  const fileInputRef = useRef<HTMLInputElement>(null)
  const { user } = useUser()

  const handleFileSelect = async (event: React.ChangeEvent<HTMLInputElement>) => {
    const file = event.target.files?.[0]
    if (!file || !user) return

    setError(null)

    // Validation
    if (!allowedTypes.includes(file.type)) {
      setError(`File type not allowed. Allowed: ${allowedTypes.join(', ')}`)
      return
    }

    if (file.size > maxSize * 1024 * 1024) {
      setError(`File too large. Maximum: ${maxSize}MB`)
      return
    }

    await uploadFile(file)
  }

  const uploadFile = async (file: File) => {
    setUploading(true)
    setUploadProgress(0)

    try {
      // Generate unique filename
      const fileExt = file.name.split('.').pop()
      const fileName = `${Date.now()}-${Math.random().toString(36).substring(2)}.${fileExt}`
      const filePath = `${user.id}/${fileName}`

      // Upload the file (note: the basic upload() call does not report
      // progress events - see Step 3 for real progress tracking)
      const { data, error } = await supabase.storage
        .from(bucket)
        .upload(filePath, file, {
          cacheControl: '3600',
          upsert: false
        })

      if (error) {
        throw error
      }

      // Generate public URL (only works for public buckets - use
      // createSignedUrl() instead if the bucket is private)
      const { data: { publicUrl } } = supabase.storage
        .from(bucket)
        .getPublicUrl(filePath)

      onUploadComplete(publicUrl)
      
      // Reset input
      if (fileInputRef.current) {
        fileInputRef.current.value = ''
      }

    } catch (error: any) {
      console.error('Upload error:', error)
      setError(error.message || 'Upload failed')
    } finally {
      setUploading(false)
      setUploadProgress(0)
    }
  }

  return (
    <div className="space-y-4">
      <div className="flex items-center justify-center w-full">
        <label className="flex flex-col items-center justify-center w-full h-32 border-2 border-gray-300 border-dashed rounded-lg cursor-pointer bg-gray-50 hover:bg-gray-100">
          <div className="flex flex-col items-center justify-center pt-5 pb-6">
            <svg className="w-8 h-8 mb-4 text-gray-500" fill="none" stroke="currentColor" viewBox="0 0 24 24">
              <path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M7 16a4 4 0 01-.88-7.903A5 5 0 1115.9 6L16 6a5 5 0 011 9.9M15 13l-3-3m0 0l-3 3m3-3v12" />
            </svg>
            <p className="mb-2 text-sm text-gray-500">
              <span className="font-semibold">Click to upload</span> or drag and drop
            </p>
            <p className="text-xs text-gray-500">
              {allowedTypes.join(', ')} (max. {maxSize}MB)
            </p>
          </div>
          <input
            ref={fileInputRef}
            type="file"
            className="hidden"
            onChange={handleFileSelect}
            accept={allowedTypes.join(',')}
            disabled={uploading || !user}
          />
        </label>
      </div>

      {uploading && (
        <div className="w-full bg-gray-200 rounded-full h-2.5">
          <div 
            className="bg-blue-600 h-2.5 rounded-full transition-all duration-300" 
            style={{ width: `${uploadProgress}%` }}
          ></div>
        </div>
      )}

      {error && (
        <div className="p-3 text-sm text-red-600 bg-red-50 rounded-md">
          {error}
        </div>
      )}

      {uploading && (
        <p className="text-sm text-gray-600 text-center">
          Uploading... {uploadProgress}%
        </p>
      )}
    </div>
  )
}

Usage Example

// pages/profile.tsx
import FileUpload from '@/components/FileUpload'

export default function ProfilePage() {
  const handleAvatarUpload = (url: string) => {
    console.log('Avatar uploaded:', url)
    // Update user profile with new avatar URL
    updateUserProfile({ avatar_url: url })
  }

  return (
    <div>
      <h1>Edit Profile</h1>
      <FileUpload
        onUploadComplete={handleAvatarUpload}
        allowedTypes={['image/jpeg', 'image/png']}
        maxSize={2}
        bucket="avatars"
      />
    </div>
  )
}

Step 3: Advanced Features

Progress Tracking for Large Files

The standard upload() call doesn't report progress events. For real progress tracking you can chunk the upload yourself (or use Supabase's resumable uploads):

const uploadFileWithProgress = async (file: File) => {
  const chunkSize = 1024 * 1024 // 1MB chunks
  const totalChunks = Math.ceil(file.size / chunkSize)
  
  for (let chunkIndex = 0; chunkIndex < totalChunks; chunkIndex++) {
    const start = chunkIndex * chunkSize
    const end = Math.min(start + chunkSize, file.size)
    const chunk = file.slice(start, end)
    
    // Upload chunk...
    const progress = ((chunkIndex + 1) / totalChunks) * 100
    setUploadProgress(progress)
  }
}
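The loop above is only a sketch, but the boundary and progress arithmetic can be isolated into a pure, testable helper. planChunks is an illustrative name, not a Supabase API:

```typescript
interface Chunk { start: number; end: number; progress: number }

// Splits a file of `size` bytes into chunks of `chunkSize` bytes and
// records the progress percentage reached after each chunk completes.
function planChunks(size: number, chunkSize: number): Chunk[] {
  const total = Math.ceil(size / chunkSize)
  return Array.from({ length: total }, (_, i) => ({
    start: i * chunkSize,
    end: Math.min((i + 1) * chunkSize, size),
    progress: Math.round(((i + 1) / total) * 100),
  }))
}
```

For genuinely large files, Supabase's built-in resumable uploads (mentioned in the feature list above) are usually a better fit than hand-rolled chunking.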

Image Resize Before Upload

const resizeImage = (file: File, maxWidth: number, maxHeight: number): Promise<File> => {
  return new Promise((resolve) => {
    const canvas = document.createElement('canvas')
    const ctx = canvas.getContext('2d')!
    const img = new Image()
    
    img.onload = () => {
      // Calculate aspect ratio
      const ratio = Math.min(maxWidth / img.width, maxHeight / img.height)
      canvas.width = img.width * ratio
      canvas.height = img.height * ratio
      
      // Draw image on canvas
      ctx.drawImage(img, 0, 0, canvas.width, canvas.height)
      
      // Convert back to File (fall back to the original file if
      // toBlob returns null, so the promise always resolves)
      canvas.toBlob((blob) => {
        if (blob) {
          resolve(new File([blob], file.name, {
            type: file.type,
            lastModified: Date.now()
          }))
        } else {
          resolve(file)
        }
      }, file.type, 0.9)
    }
    
    img.src = URL.createObjectURL(file)
  })
}

// Usage:
const handleFileSelect = async (event: React.ChangeEvent<HTMLInputElement>) => {
  const file = event.target.files?.[0]
  if (!file) return
  
  // Resize image before upload
  const resizedFile = await resizeImage(file, 1920, 1080)
  await uploadFile(resizedFile)
}
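The aspect-ratio arithmetic inside resizeImage can be factored out and unit-tested on its own. A sketch; fitWithin is an illustrative name, and like the ratio formula above it will also upscale images smaller than the bounds:

```typescript
// Scales (width, height) to fit inside (maxWidth, maxHeight)
// while preserving the aspect ratio.
function fitWithin(
  width: number, height: number,
  maxWidth: number, maxHeight: number
): { width: number; height: number } {
  const ratio = Math.min(maxWidth / width, maxHeight / height)
  return {
    width: Math.round(width * ratio),
    height: Math.round(height * ratio),
  }
}
```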

Multiple File Upload

const uploadMultipleFiles = async (files: FileList) => {
  const uploadPromises = Array.from(files).map(async (file, index) => {
    const fileExt = file.name.split('.').pop()
    const fileName = `${Date.now()}-${index}.${fileExt}`
    const filePath = `${user.id}/${fileName}`
    
    return supabase.storage
      .from('uploads')
      .upload(filePath, file)
  })
  
  try {
    const results = await Promise.all(uploadPromises)
    console.log('All files uploaded:', results)
  } catch (error) {
    console.error('Some uploads failed:', error)
  }
}
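One caveat with Promise.all: it rejects as soon as any single upload fails, even though the others may still succeed. If partial success is acceptable, Promise.allSettled keeps every result. A sketch with an illustrative helper name:

```typescript
// Counts fulfilled and rejected upload results instead of
// failing the whole batch on the first error.
async function settleUploads<T>(uploads: Promise<T>[]) {
  const results = await Promise.allSettled(uploads)
  return {
    succeeded: results.filter(r => r.status === 'fulfilled').length,
    failed: results.filter(r => r.status === 'rejected').length,
  }
}
```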

Step 4: Server-Side Upload (Next.js API Route)

For sensitive uploads or server-side processing:

// pages/api/upload.ts
import { createServerSupabaseClient } from '@supabase/auth-helpers-nextjs'
import { NextApiRequest, NextApiResponse } from 'next'
import formidable from 'formidable'
import fs from 'fs'

export const config = {
  api: {
    bodyParser: false, // Disable body parsing for file uploads
  },
}

export default async function handler(req: NextApiRequest, res: NextApiResponse) {
  if (req.method !== 'POST') {
    return res.status(405).json({ error: 'Method not allowed' })
  }

  const supabase = createServerSupabaseClient({ req, res })
  
  // Check authentication
  const { data: { user }, error: authError } = await supabase.auth.getUser()
  if (authError || !user) {
    return res.status(401).json({ error: 'Unauthorized' })
  }

  try {
    const form = formidable({
      maxFileSize: 10 * 1024 * 1024, // 10MB
      keepExtensions: true,
    })

    const [fields, files] = await form.parse(req)
    const file = Array.isArray(files.file) ? files.file[0] : files.file

    if (!file) {
      return res.status(400).json({ error: 'No file provided' })
    }

    // Read file
    const fileBuffer = fs.readFileSync(file.filepath)
    
    // Generate unique filename
    const fileExt = file.originalFilename?.split('.').pop()
    const fileName = `${Date.now()}-${Math.random().toString(36).substring(2)}.${fileExt}`
    const filePath = `${user.id}/${fileName}`

    // Upload to Supabase
    const { data, error } = await supabase.storage
      .from('uploads')
      .upload(filePath, fileBuffer, {
        contentType: file.mimetype || 'application/octet-stream',
        cacheControl: '3600'
      })

    if (error) {
      throw error
    }

    // Get public URL
    const { data: { publicUrl } } = supabase.storage
      .from('uploads')
      .getPublicUrl(filePath)

    // Clean up temp file
    fs.unlinkSync(file.filepath)

    res.status(200).json({ 
      message: 'Upload successful', 
      url: publicUrl,
      path: data.path 
    })

  } catch (error: any) {
    console.error('Upload error:', error)
    res.status(500).json({ error: error.message || 'Upload failed' })
  }
}

Step 5: File Management

Listing Files

const getUserFiles = async () => {
  const { data, error } = await supabase.storage
    .from('uploads')
    .list(user.id, {
      limit: 100,
      offset: 0,
      sortBy: { column: 'created_at', order: 'desc' }
    })
    
  if (error) {
    console.error('Error listing files:', error)
    return []
  }
  
  return data
}

Deleting Files

const deleteFile = async (filePath: string) => {
  const { error } = await supabase.storage
    .from('uploads')
    .remove([filePath])
    
  if (error) {
    console.error('Error deleting file:', error)
    return false
  }
  
  return true
}

Image Transformations

Supabase offers automatic image transformations:

// Generate different sizes
const getImageUrl = (path: string, options?: {
  width?: number
  height?: number
  quality?: number
  format?: 'webp' | 'jpeg' | 'png'
}) => {
  const { data } = supabase.storage
    .from('uploads')
    .getPublicUrl(path, {
      transform: {
        width: options?.width,
        height: options?.height,
        quality: options?.quality,
        format: options?.format
      }
    })
    
  return data.publicUrl
}

// Usage:
const thumbnailUrl = getImageUrl('user123/image.jpg', { 
  width: 200, 
  height: 200, 
  quality: 80,
  format: 'webp'
})

Common Problems and Solutions

Problem 1: "new row violates row-level security policy"

Solution: Check your RLS policies. The user must be authenticated and the policy must allow the upload:

-- Debug: Check if user is authenticated
SELECT auth.uid(), auth.role();

-- Debug: Check bucket configuration
SELECT * FROM storage.buckets WHERE id = 'uploads';

Problem 2: Upload works but image doesn't display

Solution: Check the public URL and CORS settings:

// getPublicUrl() always returns a URL in supabase-js v2 - it does not
// return an error or check that the file exists, so verify the bucket
// is actually public
const { data: { publicUrl } } = supabase.storage
  .from('uploads')
  .getPublicUrl(filePath)

// For private buckets, create a signed URL instead:
const { data: signed, error } = await supabase.storage
  .from('uploads')
  .createSignedUrl(filePath, 3600) // valid for one hour

if (error) {
  console.error('Error creating signed URL:', error)
}

Problem 3: Slow Uploads

Solutions:

  1. Compress files before upload
  2. Chunked upload for large files
  3. Enable CDN (automatic with Supabase)
  4. Client-side image resize

Security Best Practices

1. File Type Validation

const ALLOWED_FILE_TYPES = {
  images: ['image/jpeg', 'image/png', 'image/webp', 'image/gif'],
  documents: ['application/pdf', 'application/msword', 'text/plain'],
  videos: ['video/mp4', 'video/webm', 'video/ogg']
}

const validateFileType = (file: File, category: keyof typeof ALLOWED_FILE_TYPES) => {
  return ALLOWED_FILE_TYPES[category].includes(file.type)
}
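Keep in mind that file.type is supplied by the client and can be spoofed. For anything security-relevant, it's worth sniffing the first bytes of the file server-side as well. A minimal sketch covering only the JPEG and PNG signatures; sniffMimeType is an illustrative name:

```typescript
// Magic-byte signatures: JPEG starts with FF D8 FF, PNG with 89 50 4E 47.
const SIGNATURES: Record<string, number[]> = {
  'image/jpeg': [0xff, 0xd8, 0xff],
  'image/png': [0x89, 0x50, 0x4e, 0x47],
}

// Returns the detected MIME type, or null if no signature matches.
function sniffMimeType(bytes: Uint8Array): string | null {
  for (const [mime, sig] of Object.entries(SIGNATURES)) {
    if (sig.every((b, i) => bytes[i] === b)) return mime
  }
  return null
}
```

Comparing the sniffed type against the client-reported file.type catches most trivially renamed files.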

2. File Size Limits

const MAX_FILE_SIZES = {
  image: 5 * 1024 * 1024,    // 5MB
  document: 10 * 1024 * 1024, // 10MB
  video: 100 * 1024 * 1024   // 100MB
}
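For error messages like the one in the upload component, a small formatter keeps these limits readable. An illustrative helper, not part of any library:

```typescript
// Formats a byte count using binary units, e.g. 5242880 -> "5 MB".
function formatBytes(bytes: number): string {
  const units = ['B', 'KB', 'MB', 'GB']
  let value = bytes
  let i = 0
  while (value >= 1024 && i < units.length - 1) {
    value /= 1024
    i++
  }
  return `${Math.round(value * 10) / 10} ${units[i]}`
}
```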

3. Malware Scanning

For production apps, you should integrate a malware scanner:

// Example with ClamAV
const scanFile = async (fileBuffer: Buffer) => {
  // Integration with malware scanner
  // return scanResult
}

Performance Optimization

1. Lazy Loading for File Lists

const FileList = () => {
  const [files, setFiles] = useState([])
  const [loading, setLoading] = useState(false)
  const [hasMore, setHasMore] = useState(true)

  const loadMoreFiles = async () => {
    if (loading || !hasMore) return
    
    setLoading(true)
    // assumes getUserFiles accepts an offset parameter (extend the
    // Step 5 version accordingly)
    const newFiles = await getUserFiles(files.length)
    
    if (newFiles.length === 0) {
      setHasMore(false)
    } else {
      setFiles(prev => [...prev, ...newFiles])
    }
    
    setLoading(false)
  }

  return (
    <div>
      {files.map(file => (
        <FileItem key={file.id} file={file} />
      ))}
      {hasMore && (
        <button onClick={loadMoreFiles} disabled={loading}>
          {loading ? 'Loading...' : 'Load More'}
        </button>
      )}
    </div>
  )
}

2. Image Caching Strategy

// Service Worker for aggressive caching
const cacheImages = async (urls: string[]) => {
  const cache = await caches.open('supabase-images-v1')
  await cache.addAll(urls)
}

3. Progressive Image Loading

const ProgressiveImage = ({ src, placeholder, alt }: {
  src: string
  placeholder: string
  alt: string
}) => {
  const [imageLoaded, setImageLoaded] = useState(false)
  const [imageSrc, setImageSrc] = useState(placeholder)

  useEffect(() => {
    const img = new Image()
    img.onload = () => {
      setImageSrc(src)
      setImageLoaded(true)
    }
    img.src = src
  }, [src])

  return (
    <img
      src={imageSrc}
      alt={alt}
      className={`transition-opacity duration-300 ${
        imageLoaded ? 'opacity-100' : 'opacity-50'
      }`}
    />
  )
}

Production Deployment Checklist

1. Environment Variables

# .env.local
NEXT_PUBLIC_SUPABASE_URL=your_supabase_url
NEXT_PUBLIC_SUPABASE_ANON_KEY=your_anon_key
# Server-side only - never expose this key to the browser
SUPABASE_SERVICE_ROLE_KEY=your_service_role_key

2. Bucket Configuration

-- Enable RLS on storage.objects
ALTER TABLE storage.objects ENABLE ROW LEVEL SECURITY;

-- Set appropriate bucket policies
UPDATE storage.buckets 
SET public = false 
WHERE id = 'uploads';

3. CDN Configuration

// Configure custom domain for storage
const getStorageUrl = (path: string) => {
  const baseUrl = process.env.NODE_ENV === 'production' 
    ? 'https://your-custom-domain.com/storage/v1/object/public'
    : supabase.storage.from('uploads').getPublicUrl('').data.publicUrl
  
  return `${baseUrl}/${path}`
}

4. Monitoring and Analytics

// Track upload metrics
const trackUpload = (fileSize: number, fileType: string, duration: number) => {
  // Analytics integration
  analytics.track('file_upload', {
    file_size: fileSize,
    file_type: fileType,
    upload_duration: duration
  })
}

Advanced Use Cases

1. Direct Upload to S3 (Bypass Supabase)

For very large files or special requirements:

const generatePresignedUrl = async (filename: string) => {
  const response = await fetch('/api/generate-presigned-url', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ filename })
  })
  
  return response.json()
}

const uploadToS3Direct = async (file: File, presignedUrl: string) => {
  await fetch(presignedUrl, {
    method: 'PUT',
    body: file,
    headers: {
      'Content-Type': file.type
    }
  })
}

2. Video Processing Pipeline

// Trigger video processing after upload
const handleVideoUpload = async (filePath: string) => {
  // Trigger Edge Function for video processing
  await supabase.functions.invoke('process-video', {
    body: { filePath }
  })
}

3. Collaborative File Sharing

// Share files with other users
const shareFile = async (filePath: string, userIds: string[]) => {
  const { data, error } = await supabase
    .from('file_shares')
    .insert(
      userIds.map(userId => ({
        file_path: filePath,
        shared_with: userId,
        shared_by: user.id,
        permissions: 'read'
      }))
    )
  
  return { data, error }
}

Conclusion

Supabase Storage provides a powerful and flexible solution for file uploads. The key points:

  1. RLS policies are critical - nothing works without them
  2. Client-side validation improves UX, but server-side is mandatory
  3. Image transformations save bandwidth and improve performance
  4. Chunked upload implementation for large files
  5. Security-first approach with file type and size validation

With these implementations, you have a production-ready file upload solution that's secure, performant, and user-friendly.

Additional Resources

  • Supabase Storage Documentation
  • Storage RLS Policies Guide
  • Image Transformations
  • Storage REST API Reference

Having trouble with your file upload implementation? Leave a comment and I'll be happy to help!

Tags

  • Supabase
  • Supabase Storage


About the author

Nikolai Fischer is the founder of Kommune3 (since 2007) and a leading expert in Drupal development and tech entrepreneurship. With 17+ years of experience, he has led hundreds of projects and achieved #1 on Hacker News. As host of the "Kommit mich" podcast and founder of skillution, he combines technical expertise with entrepreneurial thinking. His articles about Supabase, modern web development, and systematic problem-solving have influenced thousands of developers worldwide.
