# AI Web Scraper – Quick-start Cheat Sheet

→ View the full tutorial here


## 0. One-time setup

```shell
# Clone & enter the repo
git clone https://github.com/pgflow-dev/ai-web-scraper.git
cd ai-web-scraper

# Copy the example environment file and add your OpenAI key (required by the tasks)
cp supabase/functions/.env.example supabase/functions/.env
# Edit the .env file and set your OpenAI API key:
# OPENAI_API_KEY=sk-...
```
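Before booting the stack, it can help to confirm the key actually landed in the env file. A minimal sketch; the `sk-` prefix check is an assumption about the usual OpenAI key format, not something the repo enforces:

```shell
# Sanity check: confirm OPENAI_API_KEY was set in the env file created above.
# The `sk-` prefix test is an assumed convention for OpenAI keys.
envfile="supabase/functions/.env"
if [ -f "$envfile" ] && grep -q '^OPENAI_API_KEY=sk-' "$envfile"; then
  msg="OPENAI_API_KEY is set"
else
  msg="OPENAI_API_KEY missing or placeholder in $envfile"
fi
echo "$msg"
```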

## 1. Boot the local Supabase stack

```shell
npx supabase@2.22.12 start
```

## 2. Run all database migrations (table + flow)

```shell
npx supabase@2.22.12 migrations up --local
```

## 3. Serve the Edge Functions (keep this terminal open)

```shell
npx supabase@2.22.12 functions serve
```

## 4. Start the worker (new terminal)

```shell
curl -X POST http://127.0.0.1:54321/functions/v1/analyze_website_worker
```

The first curl boots the worker; it stays alive and polls for jobs.
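If the worker process ever stops, issuing the same POST again restarts it. A small watchdog sketch; the attempt cap and interval here are arbitrary illustration values, not part of pgflow:

```shell
# Re-ping the worker endpoint a few times; per the step above, each POST
# (re)boots the worker if it is not already running.
# 3 attempts / 1-second interval are illustrative values only.
worker_url="http://127.0.0.1:54321/functions/v1/analyze_website_worker"
attempts=0
while [ "$attempts" -lt 3 ]; do
  curl -s -X POST --max-time 5 "$worker_url" > /dev/null \
    || echo "ping failed, is the functions server running?"
  attempts=$((attempts + 1))
  sleep 1
done
echo "sent $attempts pings"
```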


## 5. Trigger a job (SQL editor or psql)

```sql
select * from pgflow.start_flow(
  flow_slug => 'analyzeWebsite',
  input     => '{"url":"https://supabase.com"}'
);
```
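To trigger the flow for an arbitrary URL straight from the terminal, you can build the same `start_flow` call in shell and hand it to psql. A sketch; the connection string in the comment assumes Supabase local-dev defaults (run `npx supabase status` to see your actual values):

```shell
# Build the start_flow call for any URL (same SQL as above, parameterized).
url="https://supabase.com"
sql="select * from pgflow.start_flow(flow_slug => 'analyzeWebsite', input => '{\"url\":\"$url\"}');"
echo "$sql"
# To execute it (assumed local-dev defaults, verify with `npx supabase status`):
#   psql "postgresql://postgres:postgres@127.0.0.1:54322/postgres" -c "$sql"
```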

## 6. Check results

```sql
select * from websites;                 -- scraped data
select * from pgflow.runs;              -- run history
```

That’s it – scrape, summarize, tag, store!
