Workflow

Discover Luma Event Hosts

Paste a Luma discover fetch() call to paginate through ALL AI events using a loop node. For each page of events, the workflow fetches host profiles including LinkedIn, email, Twitter, website, and bio, then deduplicates hosts and saves them to Google Sheets.

Dominic Ivison

Last updated

February 23, 2026

Connectors used

Google Sheets

Tags

Luma, Event Scraping, Lead Generation, Sales, LinkedIn, Networking, Loop

Key Takeaways

  • Full pagination - Uses a loop node to fetch ALL events, not just the first page
  • Host profile extraction - Gets LinkedIn, email, Twitter, Instagram, website, and bio for each organizer
  • Deduplication built in - Hosts organizing multiple events appear once with an event count
  • Google Sheets output - AI saves structured results directly to your spreadsheet
  • No API key needed - Uses a browser fetch() call with your session cookies
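The deduplication takeaway can be sketched as keying hosts by a unique id and counting the events each one appears on. This is a minimal illustration, not the workflow's exact code; the field names (apiId, eventCount) are assumptions, not Luma's actual schema.

```javascript
// Collapse hosts who organize multiple events into one record,
// tracking how many events each host appears on.
// Field names (apiId, eventCount) are illustrative assumptions.
function dedupeHosts(hostsPerEvent) {
  const byId = new Map();
  for (const host of hostsPerEvent.flat()) {
    const existing = byId.get(host.apiId);
    if (existing) {
      existing.eventCount += 1; // same host, another event
    } else {
      byId.set(host.apiId, { ...host, eventCount: 1 });
    }
  }
  return [...byId.values()];
}
```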

What This Workflow Does

This workflow paginates through Luma's discover API for AI events in a given location, fetches the detail page for each event to extract host profiles, deduplicates hosts across events, and saves the results to Google Sheets. You paste a single fetch() request from your browser, and the loop handles the rest.
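The "paste a single fetch() request" step implies the workflow first pulls the URL and headers out of the snippet you copied from DevTools. A rough sketch of that parse, assuming the snippet's options object is JSON-like (a production code node would be more defensive):

```javascript
// Extract the request URL and headers object from a pasted
// DevTools "Copy as fetch" snippet. Naive parse for illustration:
// assumes double-quoted keys and no nested braces inside headers.
function parseFetchSnippet(snippet) {
  const urlMatch = snippet.match(/fetch\(\s*"([^"]+)"/);
  const headersMatch = snippet.match(/"headers"\s*:\s*(\{[^}]*\})/);
  return {
    url: urlMatch ? urlMatch[1] : null,
    headers: headersMatch ? JSON.parse(headersMatch[1]) : {},
  };
}
```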

Use cases:

  • Build a database of AI event organizers in your city
  • Find potential partners or speakers by their event hosting activity
  • Research the AI community landscape in any geography
  • Generate leads from event organizers with LinkedIn and email data

How It Works

  1. Manual trigger: You paste a Luma discover fetch() request from DevTools
  2. Parse fetch: Code node extracts the URL, headers, and pagination parameters
  3. Loop (pagination): Loop node iterates through pages using cursor-based pagination
  4. Fetch events page: HTTP request calls the discover API with the current cursor
  5. Extract & flatten: Code nodes extract event slugs and host details from each page
  6. Fetch host details: For each event, fetches the detail API to get organizer profiles
  7. Save to Sheets: AI node writes deduplicated host data (name, LinkedIn, email, etc.) to Sheets
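The pagination in steps 3 and 4 can be sketched as a cursor loop that keeps requesting pages until the API reports no more results. The query key and response fields here (pagination_cursor, entries, has_more, next_cursor) are assumptions about the discover API's shape; verify them against the response of your own copied request.

```javascript
// Cursor-based pagination loop: request a page, collect its events,
// then follow next_cursor until has_more is false.
// fetchImpl is injectable so the loop can be tested without a network.
async function fetchAllEvents(baseUrl, headers, fetchImpl = fetch) {
  const events = [];
  let cursor = null;
  let hasMore = true;
  while (hasMore) {
    const url = new URL(baseUrl);
    if (cursor) url.searchParams.set("pagination_cursor", cursor);
    const res = await fetchImpl(url.toString(), { headers });
    const page = await res.json();
    events.push(...(page.entries ?? []));
    hasMore = Boolean(page.has_more);
    cursor = page.next_cursor ?? null;
  }
  return events;
}
```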

Setup Instructions

  1. Click "Use template" on this page
  2. Go to lu.ma/ai (or any Luma discover page)
  3. Open DevTools (F12), go to the Network tab
  4. Scroll down to trigger the get-paginated-events request
  5. Right-click the request and select "Copy as fetch"
  6. Paste the fetch() into the Manual Trigger node
  7. Connect your Google Sheets account
  8. Set the target spreadsheet and sheet name in the save node
  9. Run the workflow

Customization

  • Location: Change the latitude/longitude in the fetch URL
  • Category: Change the slug parameter (e.g., ai, tech, design)
  • Page size: Adjust pagination_limit in the URL (default: 10)
  • Output destination: Replace the Google Sheets node with email, Slack, or database output
  • Fields extracted: Modify the extraction code to include or exclude specific fields

FAQ

Q: How many events can this scrape? A: The loop handles unlimited pagination. It continues until has_more is false.

Q: Do I need a Luma account? A: You need to be logged into Luma in your browser to copy the fetch() request with valid session cookies.

Q: What if some hosts don't have LinkedIn or email? A: Those fields will be empty. Only data the host has added to their Luma profile is available.

Q: Can I run this for multiple cities? A: Yes, run the workflow separately with different latitude/longitude values in the fetch URL.

Want to showcase your own workflows?

Become a Needle workflow partner and turn your expertise into recurring revenue.
