Build a fully functional ChatGPT clone using Supabase Edge Functions, OpenAI's API, and React. This comprehensive guide covers everything from authentication to real-time streaming responses.
In this tutorial, we'll build a production-ready ChatGPT clone featuring Supabase authentication, real-time streaming responses, persistent conversation history, markdown rendering with syntax-highlighted code blocks, and per-user usage tracking.
By leveraging Supabase Edge Functions, we can create a serverless API that scales automatically and provides low-latency responses globally. Combined with OpenAI's powerful language models, we'll create an experience that rivals commercial chat applications.
Before we start, make sure you have a Supabase account, an OpenAI API key, a recent version of Node.js installed, and basic familiarity with React and TypeScript.
Head over to database.new to create a new Supabase project. Save your project URL and anon key - we'll need these later.
We'll use Vite for a fast development experience:
npm create vite@latest chatgpt-clone -- --template react-ts
cd chatgpt-clone
npm install
Install the required dependencies:
npm install @supabase/supabase-js @supabase/auth-ui-react @supabase/auth-ui-shared
npm install react-markdown remark-gfm react-syntax-highlighter
npm install @types/react-syntax-highlighter --save-dev
Create a .env file in your project root:
VITE_SUPABASE_URL=your_supabase_project_url
VITE_SUPABASE_ANON_KEY=your_supabase_anon_key
Create src/lib/supabase.ts:
import { createClient } from '@supabase/supabase-js'
const supabaseUrl = import.meta.env.VITE_SUPABASE_URL
const supabaseAnonKey = import.meta.env.VITE_SUPABASE_ANON_KEY
export const supabase = createClient(supabaseUrl, supabaseAnonKey)
Let's set up our database tables to store conversations and messages. Navigate to the SQL Editor in your Supabase dashboard and run:
-- Enable UUID extension
CREATE EXTENSION IF NOT EXISTS "uuid-ossp";
-- Create conversations table
CREATE TABLE conversations (
id UUID DEFAULT uuid_generate_v4() PRIMARY KEY,
user_id UUID REFERENCES auth.users(id) ON DELETE CASCADE,
title TEXT NOT NULL,
created_at TIMESTAMPTZ DEFAULT NOW(),
updated_at TIMESTAMPTZ DEFAULT NOW()
);
-- Create messages table
CREATE TABLE messages (
id UUID DEFAULT uuid_generate_v4() PRIMARY KEY,
conversation_id UUID REFERENCES conversations(id) ON DELETE CASCADE,
role TEXT NOT NULL CHECK (role IN ('user', 'assistant', 'system')),
content TEXT NOT NULL,
created_at TIMESTAMPTZ DEFAULT NOW()
);
-- Create usage_stats table for tracking API usage
CREATE TABLE usage_stats (
id UUID DEFAULT uuid_generate_v4() PRIMARY KEY,
user_id UUID REFERENCES auth.users(id) ON DELETE CASCADE,
tokens_used INTEGER NOT NULL,
model TEXT NOT NULL,
created_at TIMESTAMPTZ DEFAULT NOW()
);
-- Add indexes for better performance
CREATE INDEX idx_conversations_user_id ON conversations(user_id);
CREATE INDEX idx_messages_conversation_id ON messages(conversation_id);
CREATE INDEX idx_usage_stats_user_id ON usage_stats(user_id);
-- Enable Row Level Security (RLS)
ALTER TABLE conversations ENABLE ROW LEVEL SECURITY;
ALTER TABLE messages ENABLE ROW LEVEL SECURITY;
ALTER TABLE usage_stats ENABLE ROW LEVEL SECURITY;
-- RLS Policies for conversations
CREATE POLICY "Users can view own conversations" ON conversations
FOR SELECT USING (auth.uid() = user_id);
CREATE POLICY "Users can create own conversations" ON conversations
FOR INSERT WITH CHECK (auth.uid() = user_id);
CREATE POLICY "Users can update own conversations" ON conversations
FOR UPDATE USING (auth.uid() = user_id);
CREATE POLICY "Users can delete own conversations" ON conversations
FOR DELETE USING (auth.uid() = user_id);
-- RLS Policies for messages
CREATE POLICY "Users can view messages from own conversations" ON messages
FOR SELECT USING (
EXISTS (
SELECT 1 FROM conversations
WHERE conversations.id = messages.conversation_id
AND conversations.user_id = auth.uid()
)
);
CREATE POLICY "Users can create messages in own conversations" ON messages
FOR INSERT WITH CHECK (
EXISTS (
SELECT 1 FROM conversations
WHERE conversations.id = messages.conversation_id
AND conversations.user_id = auth.uid()
)
);
-- RLS Policies for usage_stats
CREATE POLICY "Users can view own usage stats" ON usage_stats
FOR SELECT USING (auth.uid() = user_id);
CREATE POLICY "Service role can insert usage stats" ON usage_stats
FOR INSERT WITH CHECK (true);
-- Function to update conversation timestamp
CREATE OR REPLACE FUNCTION update_conversation_timestamp()
RETURNS TRIGGER AS $$
BEGIN
UPDATE conversations
SET updated_at = NOW()
WHERE id = NEW.conversation_id;
RETURN NEW;
END;
$$ LANGUAGE plpgsql;
-- Trigger to update conversation timestamp on new message
CREATE TRIGGER update_conversation_on_message
AFTER INSERT ON messages
FOR EACH ROW
EXECUTE FUNCTION update_conversation_timestamp();
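Optionally, you can generate TypeScript types from this schema with the Supabase CLI and use them in your client code later (your-project-ref is a placeholder for your actual project reference):
supabase gen types typescript --project-id your-project-ref > src/lib/database.types.ts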
Now let's create our Edge Function to handle chat completions. This function will communicate with OpenAI's API and stream responses back to the client.
Install the Supabase CLI if you don't have it yet:
npm install -g supabase
In your project root:
supabase init
supabase functions new chat-completion
Edit supabase/functions/chat-completion/index.ts:
import { serve } from "https://deno.land/std@0.168.0/http/server.ts"
import { createClient } from 'https://esm.sh/@supabase/supabase-js@2'
const corsHeaders = {
'Access-Control-Allow-Origin': '*',
'Access-Control-Allow-Headers': 'authorization, x-client-info, apikey, content-type',
}
interface ChatRequest {
messages: Array<{
role: 'user' | 'assistant' | 'system'
content: string
}>
conversationId?: string
temperature?: number
model?: string
}
serve(async (req) => {
// Handle CORS
if (req.method === 'OPTIONS') {
return new Response('ok', { headers: corsHeaders })
}
try {
const supabaseClient = createClient(
Deno.env.get('SUPABASE_URL') ?? '',
Deno.env.get('SUPABASE_SERVICE_ROLE_KEY') ?? '',
{
global: {
headers: { Authorization: req.headers.get('Authorization')! }
}
}
)
// Get the current user
const {
data: { user },
} = await supabaseClient.auth.getUser()
if (!user) {
return new Response(
JSON.stringify({ error: 'Unauthorized' }),
{
status: 401,
headers: { ...corsHeaders, 'Content-Type': 'application/json' }
}
)
}
const { messages, conversationId, temperature = 0.7, model = 'gpt-3.5-turbo' } = await req.json() as ChatRequest
// Create or verify conversation
let convId = conversationId
if (!convId) {
const { data: newConversation, error: convError } = await supabaseClient
.from('conversations')
.insert({
user_id: user.id,
title: messages[0]?.content ? messages[0].content.substring(0, 50) + '...' : 'New Chat'
})
.select()
.single()
if (convError) throw convError
convId = newConversation.id
}
// Save user message
const { error: msgError } = await supabaseClient
.from('messages')
.insert({
conversation_id: convId,
role: 'user',
content: messages[messages.length - 1].content
})
if (msgError) throw msgError
// Create OpenAI API request
const openAiResponse = await fetch('https://api.openai.com/v1/chat/completions', {
method: 'POST',
headers: {
'Authorization': `Bearer ${Deno.env.get('OPENAI_API_KEY')}`,
'Content-Type': 'application/json',
},
body: JSON.stringify({
model,
messages,
temperature,
stream: true,
}),
})
if (!openAiResponse.ok) {
const error = await openAiResponse.json()
throw new Error(`OpenAI API error: ${error.error?.message || 'Unknown error'}`)
}
// Set up SSE stream
const encoder = new TextEncoder()
const decoder = new TextDecoder()
let fullResponse = ''
const stream = new ReadableStream({
async start(controller) {
const reader = openAiResponse.body?.getReader()
if (!reader) return
try {
while (true) {
const { done, value } = await reader.read()
if (done) break
const chunk = decoder.decode(value)
const lines = chunk.split('\n')
for (const line of lines) {
if (line.startsWith('data: ')) {
const data = line.slice(6)
if (data === '[DONE]') {
// Save assistant response to database
await supabaseClient
.from('messages')
.insert({
conversation_id: convId,
role: 'assistant',
content: fullResponse
})
// Track usage
const usage = {
user_id: user.id,
tokens_used: Math.ceil(fullResponse.length / 4), // Rough estimate
model
}
await supabaseClient
.from('usage_stats')
.insert(usage)
controller.enqueue(encoder.encode(`data: {"done": true, "conversationId": "${convId}"}\n\n`))
controller.close()
return
}
try {
const parsed = JSON.parse(data)
const content = parsed.choices[0]?.delta?.content || ''
fullResponse += content
if (content) {
controller.enqueue(encoder.encode(`data: ${JSON.stringify({ content })}\n\n`))
}
} catch (e) {
console.error('Error parsing stream:', e)
}
}
}
}
} catch (error) {
controller.error(error)
}
},
})
return new Response(stream, {
headers: {
...corsHeaders,
'Content-Type': 'text/event-stream',
'Cache-Control': 'no-cache',
'Connection': 'keep-alive',
},
})
} catch (error) {
console.error('Error:', error)
return new Response(
JSON.stringify({ error: error.message }),
{
status: 500,
headers: { ...corsHeaders, 'Content-Type': 'application/json' }
}
)
}
})
In your Supabase dashboard, navigate to Edge Functions > Secrets and add OPENAI_API_KEY with your OpenAI API key as its value.
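You can also set the secret from the CLI (the key below is a placeholder for your actual OpenAI key):
supabase secrets set OPENAI_API_KEY=sk-your-key-here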
Deploy the function:
supabase functions deploy chat-completion
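You can also run the function locally while developing; the CLI serves it with a local env file containing your OpenAI key (the file path is an assumption, use wherever you keep local secrets):
supabase functions serve --env-file supabase/functions/.env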
Now let's create the React components for our chat interface.
Create src/components/Auth.tsx
:
import { Auth } from '@supabase/auth-ui-react'
import { ThemeSupa } from '@supabase/auth-ui-shared'
import { supabase } from '../lib/supabase'
export default function AuthComponent() {
return (
<div className="auth-container">
<h1>ChatGPT Clone</h1>
<p>Sign in to start chatting</p>
<Auth
supabaseClient={supabase}
appearance={{ theme: ThemeSupa }}
providers={['google', 'github']}
redirectTo={window.location.origin}
/>
</div>
)
}
Create src/components/Message.tsx
:
import ReactMarkdown from 'react-markdown'
import remarkGfm from 'remark-gfm'
import { Prism as SyntaxHighlighter } from 'react-syntax-highlighter'
import { vscDarkPlus } from 'react-syntax-highlighter/dist/esm/styles/prism'
interface MessageProps {
role: 'user' | 'assistant' | 'system'
content: string
}
export default function Message({ role, content }: MessageProps) {
return (
<div className={`message ${role}`}>
<div className="message-role">
{role === 'user' ? '👤' : '🤖'} {role}
</div>
<div className="message-content">
<ReactMarkdown
remarkPlugins={[remarkGfm]}
components={{
code({ node, inline, className, children, ...props }) {
const match = /language-(\w+)/.exec(className || '')
return !inline && match ? (
<SyntaxHighlighter
style={vscDarkPlus}
language={match[1]}
PreTag="div"
{...props}
>
{String(children).replace(/\n$/, '')}
</SyntaxHighlighter>
) : (
<code className={className} {...props}>
{children}
</code>
)
}
}}
>
{content}
</ReactMarkdown>
</div>
</div>
)
}
Create src/components/Chat.tsx
:
import { useState, useEffect, useRef } from 'react'
import { supabase } from '../lib/supabase'
import Message from './Message'
import { User } from '@supabase/supabase-js'
interface ChatMessage {
role: 'user' | 'assistant' | 'system'
content: string
}
interface Conversation {
id: string
title: string
created_at: string
updated_at: string
}
interface ChatProps {
user: User
}
export default function Chat({ user }: ChatProps) {
const [messages, setMessages] = useState<ChatMessage[]>([])
const [input, setInput] = useState('')
const [isLoading, setIsLoading] = useState(false)
const [conversations, setConversations] = useState<Conversation[]>([])
const [currentConversation, setCurrentConversation] = useState<string | null>(null)
const [streamingMessage, setStreamingMessage] = useState('')
const messagesEndRef = useRef<HTMLDivElement>(null)
const abortControllerRef = useRef<AbortController | null>(null)
useEffect(() => {
loadConversations()
}, [])
useEffect(() => {
scrollToBottom()
}, [messages, streamingMessage])
useEffect(() => {
if (currentConversation) {
loadMessages(currentConversation)
}
}, [currentConversation])
const scrollToBottom = () => {
messagesEndRef.current?.scrollIntoView({ behavior: 'smooth' })
}
const loadConversations = async () => {
const { data, error } = await supabase
.from('conversations')
.select('*')
.order('updated_at', { ascending: false })
if (!error && data) {
setConversations(data)
if (data.length > 0 && !currentConversation) {
setCurrentConversation(data[0].id)
}
}
}
const loadMessages = async (conversationId: string) => {
const { data, error } = await supabase
.from('messages')
.select('*')
.eq('conversation_id', conversationId)
.order('created_at', { ascending: true })
if (!error && data) {
setMessages(data.map(msg => ({
role: msg.role,
content: msg.content
})))
}
}
const handleSubmit = async (e: React.FormEvent) => {
e.preventDefault()
if (!input.trim() || isLoading) return
const userMessage = input.trim()
setInput('')
setIsLoading(true)
setStreamingMessage('')
// Add user message to UI
const newMessages = [...messages, { role: 'user' as const, content: userMessage }]
setMessages(newMessages)
try {
// Create abort controller for this request
abortControllerRef.current = new AbortController()
const { data: { session } } = await supabase.auth.getSession()
const response = await fetch(
`${import.meta.env.VITE_SUPABASE_URL}/functions/v1/chat-completion`,
{
method: 'POST',
headers: {
'Content-Type': 'application/json',
'Authorization': `Bearer ${session?.access_token}`
},
body: JSON.stringify({
messages: newMessages,
conversationId: currentConversation,
model: 'gpt-3.5-turbo',
temperature: 0.7
}),
signal: abortControllerRef.current.signal
}
)
if (!response.ok) {
throw new Error('Failed to get response')
}
const reader = response.body?.getReader()
const decoder = new TextDecoder()
let assistantMessage = ''
if (reader) {
while (true) {
const { done, value } = await reader.read()
if (done) break
const chunk = decoder.decode(value)
const lines = chunk.split('\n')
for (const line of lines) {
if (line.startsWith('data: ')) {
try {
const data = JSON.parse(line.slice(6))
if (data.done) {
// Update conversation ID if new
if (data.conversationId && !currentConversation) {
setCurrentConversation(data.conversationId)
loadConversations()
}
// Add complete message to messages
setMessages(prev => [...prev, {
role: 'assistant' as const,
content: assistantMessage
}])
setStreamingMessage('')
} else if (data.content) {
assistantMessage += data.content
setStreamingMessage(assistantMessage)
}
} catch (e) {
console.error('Error parsing stream:', e)
}
}
}
}
}
} catch (error: any) {
if (error.name === 'AbortError') {
console.log('Request aborted')
} else {
console.error('Error:', error)
alert('Failed to get response. Please try again.')
}
} finally {
setIsLoading(false)
abortControllerRef.current = null
}
}
const handleNewChat = () => {
setCurrentConversation(null)
setMessages([])
setStreamingMessage('')
}
const handleStopGeneration = () => {
if (abortControllerRef.current) {
abortControllerRef.current.abort()
}
}
const handleSignOut = async () => {
await supabase.auth.signOut()
}
return (
<div className="chat-container">
<div className="sidebar">
<button onClick={handleNewChat} className="new-chat-btn">
+ New Chat
</button>
<div className="conversations-list">
{conversations.map(conv => (
<div
key={conv.id}
className={`conversation-item ${conv.id === currentConversation ? 'active' : ''}`}
onClick={() => setCurrentConversation(conv.id)}
>
{conv.title}
</div>
))}
</div>
<div className="user-info">
<p>{user.email}</p>
<button onClick={handleSignOut}>Sign Out</button>
</div>
</div>
<div className="chat-main">
<div className="messages-container">
{messages.map((message, index) => (
<Message key={index} {...message} />
))}
{streamingMessage && (
<Message role="assistant" content={streamingMessage} />
)}
<div ref={messagesEndRef} />
</div>
<form onSubmit={handleSubmit} className="input-form">
<input
type="text"
value={input}
onChange={(e) => setInput(e.target.value)}
placeholder="Type your message..."
disabled={isLoading}
/>
{isLoading ? (
<button type="button" onClick={handleStopGeneration}>
Stop
</button>
) : (
<button type="submit" disabled={!input.trim()}>
Send
</button>
)}
</form>
</div>
</div>
)
}
Replace src/App.tsx
:
import { useState, useEffect } from 'react'
import { supabase } from './lib/supabase'
import { User } from '@supabase/supabase-js'
import Auth from './components/Auth'
import Chat from './components/Chat'
import './App.css'
function App() {
const [user, setUser] = useState<User | null>(null)
const [loading, setLoading] = useState(true)
useEffect(() => {
// Check for an active session and set the user
supabase.auth.getSession().then(({ data: { session } }) => {
setUser(session?.user ?? null)
setLoading(false)
})
// Listen for changes on auth state
const { data: { subscription } } = supabase.auth.onAuthStateChange((_event, session) => {
setUser(session?.user ?? null)
})
return () => subscription.unsubscribe()
}, [])
if (loading) {
return <div className="loading">Loading...</div>
}
return (
<div className="app">
{user ? <Chat user={user} /> : <Auth />}
</div>
)
}
export default App
Update src/App.css
:
* {
margin: 0;
padding: 0;
box-sizing: border-box;
}
body {
font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', 'Roboto', 'Oxygen',
'Ubuntu', 'Cantarell', 'Fira Sans', 'Droid Sans', 'Helvetica Neue',
sans-serif;
background-color: #f7f7f8;
}
.app {
height: 100vh;
display: flex;
flex-direction: column;
}
.loading {
display: flex;
justify-content: center;
align-items: center;
height: 100vh;
font-size: 1.2rem;
color: #666;
}
/* Auth Styles */
.auth-container {
max-width: 400px;
margin: 100px auto;
padding: 2rem;
background: white;
border-radius: 12px;
box-shadow: 0 2px 10px rgba(0, 0, 0, 0.1);
}
.auth-container h1 {
margin-bottom: 0.5rem;
color: #333;
}
.auth-container p {
margin-bottom: 2rem;
color: #666;
}
/* Chat Container */
.chat-container {
display: flex;
height: 100vh;
}
/* Sidebar */
.sidebar {
width: 260px;
background-color: #202123;
color: white;
display: flex;
flex-direction: column;
padding: 0.5rem;
}
.new-chat-btn {
padding: 0.75rem;
margin: 0.5rem;
background-color: transparent;
border: 1px solid #565869;
border-radius: 6px;
color: white;
cursor: pointer;
transition: background-color 0.2s;
}
.new-chat-btn:hover {
background-color: #2a2b32;
}
.conversations-list {
flex: 1;
overflow-y: auto;
margin: 1rem 0;
}
.conversation-item {
padding: 0.75rem;
margin: 0.25rem 0.5rem;
border-radius: 6px;
cursor: pointer;
white-space: nowrap;
overflow: hidden;
text-overflow: ellipsis;
transition: background-color 0.2s;
}
.conversation-item:hover {
background-color: #2a2b32;
}
.conversation-item.active {
background-color: #343541;
}
.user-info {
padding: 1rem;
border-top: 1px solid #565869;
}
.user-info p {
margin-bottom: 0.5rem;
font-size: 0.9rem;
}
.user-info button {
width: 100%;
padding: 0.5rem;
background-color: transparent;
border: 1px solid #565869;
border-radius: 4px;
color: white;
cursor: pointer;
transition: background-color 0.2s;
}
.user-info button:hover {
background-color: #2a2b32;
}
/* Chat Main */
.chat-main {
flex: 1;
display: flex;
flex-direction: column;
background-color: #343541;
}
/* Messages Container */
.messages-container {
flex: 1;
overflow-y: auto;
padding: 2rem 0;
}
/* Message Styles */
.message {
padding: 1.5rem 0;
border-bottom: 1px solid #4e4f60;
}
.message.user {
background-color: #343541;
}
.message.assistant {
background-color: #444654;
}
.message-role {
max-width: 768px;
margin: 0 auto 0.5rem;
padding: 0 1rem;
font-weight: 600;
color: #d1d5db;
text-transform: capitalize;
}
.message-content {
max-width: 768px;
margin: 0 auto;
padding: 0 1rem;
color: #d1d5db;
line-height: 1.6;
}
/* Markdown Styles */
.message-content h1,
.message-content h2,
.message-content h3 {
margin-top: 1.5rem;
margin-bottom: 1rem;
}
.message-content p {
margin-bottom: 1rem;
}
.message-content ul,
.message-content ol {
margin-bottom: 1rem;
padding-left: 2rem;
}
.message-content code {
background-color: #2d2d30;
padding: 0.2rem 0.4rem;
border-radius: 3px;
font-size: 0.9em;
}
.message-content pre {
margin: 1rem 0;
border-radius: 6px;
overflow-x: auto;
}
.message-content pre > div {
padding: 1rem !important;
}
/* Input Form */
.input-form {
max-width: 768px;
width: 100%;
margin: 0 auto;
padding: 1rem;
display: flex;
gap: 0.75rem;
}
.input-form input {
flex: 1;
padding: 0.75rem 1rem;
background-color: #40414f;
border: 1px solid #565869;
border-radius: 6px;
color: white;
font-size: 1rem;
outline: none;
transition: border-color 0.2s;
}
.input-form input:focus {
border-color: #10a37f;
}
.input-form input::placeholder {
color: #8e8ea0;
}
.input-form button {
padding: 0.75rem 1.5rem;
background-color: #10a37f;
border: none;
border-radius: 6px;
color: white;
font-weight: 500;
cursor: pointer;
transition: background-color 0.2s;
}
.input-form button:hover:not(:disabled) {
background-color: #1a7f64;
}
.input-form button:disabled {
opacity: 0.5;
cursor: not-allowed;
}
/* Mobile Responsiveness */
@media (max-width: 768px) {
.chat-container {
flex-direction: column;
}
.sidebar {
width: 100%;
height: auto;
order: 2;
}
.chat-main {
order: 1;
}
.conversations-list {
max-height: 200px;
}
}
One of the key features that makes our ChatGPT clone feel responsive is streaming responses. Let's dive deeper into how this works:
Our Edge Function returns a ReadableStream with Content-Type: text/event-stream. This allows us to send chunks of data as they arrive from OpenAI, rather than waiting for the complete response.
Key benefits: the first tokens appear almost immediately instead of after the full completion, perceived latency drops sharply, and users can read or cancel long responses while they are still being generated.
The React component processes the stream using the Fetch API's ReadableStream:
const reader = response.body?.getReader()
const decoder = new TextDecoder()
if (!reader) throw new Error('Response has no body to stream')
while (true) {
const { done, value } = await reader.read()
if (done) break
// Process chunks as they arrive
const chunk = decoder.decode(value)
// Parse and display the content
}
Our implementation includes several features for managing conversation history:
When creating a new conversation, we automatically generate a title from the first message:
title: messages[0]?.content ? messages[0].content.substring(0, 50) + '...' : 'New Chat'
Users can seamlessly switch between conversations, with messages loading dynamically from the database.
All messages are stored in Supabase, ensuring conversations persist across sessions and devices.
We've added indexes on foreign keys for faster queries:
CREATE INDEX idx_conversations_user_id ON conversations(user_id);
CREATE INDEX idx_messages_conversation_id ON messages(conversation_id);
Messages are loaded only when switching conversations, reducing unnecessary database calls.
We process stream chunks efficiently, updating the UI only when new content arrives.
Database access from Edge Functions goes through Supabase's API layer via supabase-js, so connection pooling is handled by the platform rather than by your function code.
All tables have RLS enabled with policies ensuring users can only access their own data:
CREATE POLICY "Users can view own conversations" ON conversations
FOR SELECT USING (auth.uid() = user_id);
The OpenAI API key is stored securely in Edge Function secrets, never exposed to the client.
All API calls require a valid Supabase auth token:
const {
data: { user },
} = await supabaseClient.auth.getUser()
if (!user) {
return new Response('Unauthorized', { status: 401 })
}
While not implemented in this tutorial, you can add rate limiting using the usage_stats table:
// Check user's recent usage
const { data: recentUsage } = await supabaseClient
.from('usage_stats')
.select('tokens_used')
.eq('user_id', user.id)
.gte('created_at', new Date(Date.now() - 3600000).toISOString()) // Last hour
const totalTokens = recentUsage?.reduce((sum, stat) => sum + stat.tokens_used, 0) || 0
if (totalTokens > 10000) { // 10k tokens per hour limit
return new Response('Rate limit exceeded', { status: 429 })
}
First, make sure the app builds cleanly:
npm run build
Then install the Vercel CLI and deploy:
npm install -g vercel
vercel
Follow the prompts to deploy your app. Make sure to add your environment variables in the Vercel dashboard.
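You can also add them from the CLI; each command prompts for the value (a sketch, targeting the production environment):
vercel env add VITE_SUPABASE_URL production
vercel env add VITE_SUPABASE_ANON_KEY production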
Update your Edge Function to allow requests from your production domain. Note that Edge Functions run on Deno, so read configuration with Deno.env.get rather than process.env; ENVIRONMENT here is a secret you would set yourself alongside OPENAI_API_KEY:
const corsHeaders = {
'Access-Control-Allow-Origin': Deno.env.get('ENVIRONMENT') === 'production'
? 'https://your-domain.vercel.app'
: '*',
'Access-Control-Allow-Headers': 'authorization, x-client-info, apikey, content-type',
}
Use the Supabase dashboard to monitor Edge Function invocations and logs, database growth and query performance, authentication activity, and the usage_stats table for per-user token consumption; a sample query follows below.
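For example, this query against the usage_stats table summarizes token consumption over the last week:
SELECT user_id, model, SUM(tokens_used) AS total_tokens
FROM usage_stats
WHERE created_at > NOW() - INTERVAL '7 days'
GROUP BY user_id, model
ORDER BY total_tokens DESC;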
Add a dropdown to switch between GPT-3.5 and GPT-4:
<select value={model} onChange={(e) => setModel(e.target.value)}>
<option value="gpt-3.5-turbo">GPT-3.5 Turbo</option>
<option value="gpt-4">GPT-4</option>
</select>
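The dropdown assumes a model state variable in Chat.tsx; a minimal sketch of wiring it up (these lines are additions, not part of the component shown earlier):
const [model, setModel] = useState('gpt-3.5-turbo')
// In handleSubmit, send the selected model instead of the hard-coded string:
body: JSON.stringify({
  messages: newMessages,
  conversationId: currentConversation,
  model,
  temperature: 0.7
}),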
Allow users to set custom system prompts for different conversation styles.
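A minimal sketch, assuming a systemPrompt string held in component state: prepend a system message before building the request so OpenAI receives it first.
const [systemPrompt, setSystemPrompt] = useState('You are a helpful assistant.')
// Prepend the system message when building the request payload
const payloadMessages = systemPrompt
  ? [{ role: 'system' as const, content: systemPrompt }, ...newMessages]
  : newMessages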
Add functionality to export conversations as Markdown or PDF.
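A sketch of the Markdown half, using only the messages already held in state (exportToMarkdown is a hypothetical helper, not part of the code above):
const exportToMarkdown = (title: string, msgs: ChatMessage[]) => {
  // Join the messages into a single Markdown document
  const markdown = `# ${title}\n\n` + msgs
    .map(m => `**${m.role}**\n\n${m.content}`)
    .join('\n\n---\n\n')
  // Offer the result as a browser download
  const blob = new Blob([markdown], { type: 'text/markdown' })
  const url = URL.createObjectURL(blob)
  const link = document.createElement('a')
  link.href = url
  link.download = `${title}.md`
  link.click()
  URL.revokeObjectURL(url)
}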
Implement full-text search across all conversations:
-- Add search function
CREATE OR REPLACE FUNCTION search_messages(search_query TEXT, user_uuid UUID)
RETURNS TABLE (
conversation_id UUID,
message_content TEXT,
message_role TEXT,
created_at TIMESTAMPTZ
) AS $$
BEGIN
RETURN QUERY
SELECT m.conversation_id, m.content, m.role, m.created_at
FROM messages m
JOIN conversations c ON m.conversation_id = c.id
WHERE c.user_id = user_uuid
AND m.content ILIKE '%' || search_query || '%'
ORDER BY m.created_at DESC;
END;
$$ LANGUAGE plpgsql;
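On the client, you would call this function through supabase.rpc; the parameter names must match the SQL signature above (the search term is just an example):
const { data: results, error } = await supabase.rpc('search_messages', {
  search_query: 'edge functions',
  user_uuid: user.id,
})
if (!error && results) {
  console.log(`Found ${results.length} matching messages`)
}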
Integrate the Web Speech API for voice input:
// SpeechRecognition is still prefixed in Chrome; cast to any to satisfy TypeScript
const SpeechRecognition = (window as any).webkitSpeechRecognition || (window as any).SpeechRecognition
const recognition = new SpeechRecognition()
recognition.onresult = (event: any) => {
const transcript = event.results[0][0].transcript
setInput(transcript)
}
recognition.start()
If you hit CORS errors, ensure your Edge Function includes the CORS headers shown above and handles OPTIONS preflight requests.
If authentication fails, check that your environment variables are set correctly and that a valid session access token is sent in the Authorization header.
If streaming doesn't work, verify that the OPENAI_API_KEY secret is set, that the request to OpenAI uses stream: true, and that the response is returned with Content-Type: text/event-stream.
Congratulations! You've built a fully functional ChatGPT clone with real-time streaming responses, secure authentication, conversation history protected by Row Level Security, markdown rendering with syntax highlighting, and basic usage tracking.
This implementation demonstrates the power of Supabase Edge Functions for building modern AI applications. The serverless architecture ensures your app scales automatically while keeping costs low.
Happy coding! 🚀
Nikolai Fischer is the founder of Kommune3 (since 2007) and a leading expert in Drupal development and tech entrepreneurship. With 17+ years of experience, he has led hundreds of projects and achieved #1 on Hacker News. As host of the "Kommit mich" podcast and founder of skillution, he combines technical expertise with entrepreneurial thinking. His articles about Supabase, modern web development, and systematic problem-solving have influenced thousands of developers worldwide.