Workflow

Export LinkedIn Job Applicants to Google Sheets

Scrape all applicants from a LinkedIn job posting and export their profiles, contact info, and application details to Google Sheets automatically.

Needle Team

Last updated

February 9, 2026

Connectors used

Google Sheets

Tags

LinkedIn, Job Applicants, Recruiting, Google Sheets, Scraping

Key Takeaways

  • Export job applicants from LinkedIn - Scrape all applicants from any LinkedIn job posting and get their data in Google Sheets
  • Full contact details - Extracts name, LinkedIn URL, headline, location, email, phone, application date, and rating
  • Paginated scraping - Loops through all pages of applicants with 3-second delays between requests
  • Handles up to 2,500 applicants - Processes up to 100 pages with 25 applicants per page
  • Saves as you go - Each applicant is written to Google Sheets in real time using Gemini 3 Flash

What This Workflow Does

This workflow scrapes all applicants from a LinkedIn job posting and exports their profiles to Google Sheets. You copy a network request from your browser's DevTools while viewing the applicant list, paste it into the workflow's manual trigger, and the workflow handles the rest. It parses the request, loops through all applicant pages, extracts profile data, and saves each row to your spreadsheet.

Use cases:

  • Exporting job applicants for review in a spreadsheet
  • Building a centralized applicant tracker across multiple job postings
  • Backing up applicant data outside of LinkedIn

How It Works

| Step | What Happens |
| --- | --- |
| 1. Manual trigger | You paste the fetch() request copied from LinkedIn DevTools |
| 2. Parse fetch request | A code node extracts the URL, headers, CSRF token, and cookies from the pasted request |
| 3. Pagination loop | Iterates through applicant pages (up to 100 pages, 25 per page) |
| 4. Build page URL | Updates the API URL with the current page offset |
| 5. Wait 3 seconds | Pauses between requests to avoid rate limiting |
| 6. HTTP request | Fetches the current page of applicants from LinkedIn's Hiring API |
| 7. Extract applicants | Parses the API response to pull out profile data, application info, and location names |
| 8. Save to Google Sheets | Gemini 3 Flash writes each applicant as a row to your spreadsheet |
| 9. Check pagination | Determines if there are more pages and continues the loop |
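The build-URL, wait, fetch, and pagination-check steps above can be sketched as follows. This is a minimal illustration, not the workflow's actual node code: the `start`/`count` parameter names match what appears in the copied request, but the response shape (`data.paging.total`) and the `extract` callback are assumptions.

```javascript
// Sketch of the pagination loop. PAGE_SIZE and MAX_PAGES mirror the
// workflow defaults (25 applicants per page, up to 100 pages).
const PAGE_SIZE = 25;
const MAX_PAGES = 100;

// Build the API URL for a given zero-based page offset
function buildPageUrl(baseUrl, page) {
  const url = new URL(baseUrl);
  url.searchParams.set("start", String(page * PAGE_SIZE));
  url.searchParams.set("count", String(PAGE_SIZE));
  return url.toString();
}

// Decide whether the loop should fetch another page
function hasMorePages(page, totalResults) {
  return page < MAX_PAGES && (page + 1) * PAGE_SIZE < totalResults;
}

// The loop itself: wait 3 s, fetch the page, extract applicants, repeat.
// `extract` is a placeholder for the response-parsing step (not shown here).
async function scrapeAllApplicants(baseUrl, headers, extract) {
  const applicants = [];
  for (let page = 0, more = true; more; page++) {
    await new Promise((resolve) => setTimeout(resolve, 3000)); // rate-limit delay
    const res = await fetch(buildPageUrl(baseUrl, page), { headers });
    const data = await res.json();
    applicants.push(...extract(data));
    more = hasMorePages(page, data.paging?.total ?? 0);
  }
  return applicants;
}
```

With a total of 100 results, the loop stops after page 3 (pages 0-3 cover all 100 applicants), which is what step 9's pagination check decides.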

Workflow Nodes

| Node | Role |
| --- | --- |
| Manual Trigger | Accepts the pasted fetch() request from DevTools |
| Code (Parse Fetch) | Extracts URL, headers, CSRF token, cookies, and job posting URN from the fetch request |
| Loop | Iterates through applicant pages (condition: hasMore and page < 100) |
| Code (Build URL) | Constructs the API URL for the current page offset |
| Wait (3s) | Delays 3 seconds between API calls |
| HTTP Request | Sends a GET request to LinkedIn's Hiring API |
| Merge | Combines the HTTP response with the build context |
| Code (Extract Applicants) | Parses the LinkedIn response, builds lookup maps for profiles, applications, and locations, and deduplicates results |
| Code (Return Applicants) | Returns the applicant array for the AI node |
| AI Agent (Gemini 3 Flash) | Saves each applicant to Google Sheets using the upsert_row and add_multiple_rows tools (runs per applicant) |
| Code (Pagination Info) | Passes pagination state back to the loop |
| Merge | Combines AI save results with pagination data |
| Code (Extract Pagination) | Extracts hasMore, nextStart, and totalResults for the next loop iteration |
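The Code (Parse Fetch) node's job can be approximated like this. It assumes the standard Chrome "Copy as fetch" format, where the second argument to fetch() is a JSON-compatible options object; the exact parsing logic in the workflow may differ.

```javascript
// Sketch of parsing a "Copy as fetch" snippet pasted from DevTools.
// Assumes Chrome's format: fetch("URL", { "headers": {...}, "method": "GET", ... });
function parseFetchSnippet(snippet) {
  // The first quoted string after "fetch(" is the request URL
  const urlMatch = snippet.match(/fetch\(\s*"([^"]+)"/);
  if (!urlMatch) throw new Error("Could not find a URL in the pasted fetch()");
  const url = urlMatch[1];

  // The options object is the second argument; Chrome emits it as valid JSON
  const optsStart = snippet.indexOf("{", urlMatch.index);
  const optsEnd = snippet.lastIndexOf("}");
  const options = JSON.parse(snippet.slice(optsStart, optsEnd + 1));

  return {
    url,
    headers: options.headers ?? {},
    csrfToken: options.headers?.["csrf-token"] ?? null,
  };
}
```

If the snippet's headers include a `csrf-token` entry (as LinkedIn's requests do), it is pulled out separately so later nodes can reuse it.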

Data Exported Per Applicant

| Column | Field | Description |
| --- | --- | --- |
| A | name | First and last name |
| B | linkedin_url | Full LinkedIn profile URL |
| C | headline | Professional headline |
| D | location | Geographic location |
| E | email | Contact email (if available) |
| F | phone | Contact phone number (if available) |
| G | applied_at | Application date (ISO format) |
| H | rating | LinkedIn rating (UNRATED, GOOD_FIT, MAYBE, NOT_A_FIT, SHORTLISTED) |
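Flattening one applicant into the A-H row layout above might look like the sketch below. The input field names (`firstName`, `publicIdentifier`, `appliedAt`, and so on) are illustrative placeholders, not LinkedIn's actual response schema.

```javascript
// Sketch: map one applicant's profile + application data to the A-H columns.
function toRow(profile, application) {
  return [
    `${profile.firstName} ${profile.lastName}`,                // A: name
    `https://www.linkedin.com/in/${profile.publicIdentifier}`, // B: linkedin_url
    profile.headline ?? "",                                    // C: headline
    profile.locationName ?? "",                                // D: location
    application.email ?? "",                                   // E: email (may be absent)
    application.phone ?? "",                                   // F: phone (may be absent)
    new Date(application.appliedAt).toISOString(),             // G: applied_at (ISO format)
    application.rating ?? "UNRATED",                           // H: rating
  ];
}
```

Missing email and phone fields become empty cells rather than failing the row, matching the "if available" behavior described above.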

Setup Instructions

  1. Add the "Export LinkedIn Job Applicants to Google Sheets" template to Needle
  2. Go to LinkedIn and open the Applicants tab for your job posting
  3. Open browser DevTools (F12) and go to the Network tab
  4. Scroll the applicant list to trigger the API request
  5. Find the request containing start: and count: in the URL (look for responses that are 30 kB or larger)
  6. Right-click the request, then select Copy, then Copy as fetch
  7. Paste the copied fetch() into the Manual Trigger node
  8. Copy the template Google Sheet (link is in the workflow's sticky note)
  9. Connect your Google Sheets account via the Pipedream connector
  10. Run the workflow

Customization

| What You Can Change | How |
| --- | --- |
| Google Sheet columns | Update the column mapping in the AI Agent prompt (default: A=name through H=rating) |
| Target spreadsheet | Update the sheet URL in the Google Sheets connector |
| Page delay | Change the Wait node duration (default: 3 seconds) |
| Max pages | Change the loop condition (default: page < 100) |

FAQ

Q: How do I find the right network request to copy? A: Look for requests with start: and count: in the URL. They should be relatively large responses (30-100+ kB). Do not copy jobApplicationUrns:List requests or small responses under 10 kB.

Q: My workflow is returning errors. What should I do? A: The LinkedIn cookie expires after roughly 24 hours. Go back to LinkedIn, open DevTools, and copy a fresh fetch() request.

Q: Does this work with any job posting? A: Yes, as long as you have access to view the applicants for that posting on LinkedIn (typically requires being a job poster or having recruiter access).

Q: Are email and phone always available? A: No. Email and phone are only included if the applicant provided them in their application. Some applicants will have empty fields for these.
