Export LinkedIn Company People to Google Sheets
Scrape all people from a LinkedIn company page and export their profiles, headlines, and locations to Google Sheets automatically.
Key Takeaways
- Export every person from any LinkedIn company page - Scrape all employees listed on a company's People tab and get their data in Google Sheets
- Full profile data - Extracts name, LinkedIn URL, headline, location, and summary
- Paginated scraping - Loops through all pages of people with 3-second delays between requests
- Handles up to 2,400 people - Processes up to 200 pages with 12 people per page
- Saves as you go - Each person is written to Google Sheets in real time using Gemini 3 Flash
What This Workflow Does
This workflow scrapes all people listed on a LinkedIn company's People tab and exports their profiles to Google Sheets. You copy a network request from your browser's DevTools while viewing the company's People page, paste it into the workflow's manual trigger, and the workflow handles the rest. It parses the request, loops through all people pages, extracts profile data from LinkedIn's search results, and saves each row to your spreadsheet.
Use cases:
- Building targeted lead lists from specific companies
- Mapping out teams and decision-makers at target accounts
- Competitive intelligence on who works at competitor companies
- Sales prospecting by exporting employee profiles for outreach
- Recruiting by finding potential candidates at specific organizations
How It Works
| Step | What Happens |
|---|---|
| 1. Manual trigger | You paste the fetch() request copied from LinkedIn DevTools |
| 2. Parse fetch request | A code node extracts the URL, headers, CSRF token, and cookies from the pasted request |
| 3. Pagination loop | Iterates through people pages (up to 200 pages, 12 per page) |
| 4. Build page URL | Updates the API URL with the current page offset |
| 5. Wait 3 seconds | Pauses between requests to avoid rate limiting |
| 6. HTTP request | Fetches the current page of people from LinkedIn's Search API |
| 7. Extract people | Parses the API response to pull out profile data from search result entities |
| 8. Save to Google Sheets | Gemini 3 Flash writes each person as a row to your spreadsheet |
| 9. Check pagination | Determines if there are more pages and continues the loop |
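Step 2 above can be sketched in plain JavaScript. This is a hypothetical version of the "Parse Fetch" code node, not the template's actual implementation: it assumes the snippet came from Chrome's "Copy as fetch", which emits the options object with JSON-style quoted keys, so the options can usually be isolated and JSON.parsed directly.

```javascript
// Sketch of parsing a fetch() snippet copied from DevTools.
// Assumption: Chrome's "Copy as fetch" format (URL string, then a
// JSON-like options object). Real snippets may need more robust parsing.
function parseFetch(snippet) {
  const urlMatch = snippet.match(/fetch\("([^"]+)"/);
  if (!urlMatch) throw new Error("No fetch URL found in pasted snippet");

  // Isolate the options object: everything between the comma after the
  // URL and the final closing parenthesis.
  const optsStart =
    snippet.indexOf(",", urlMatch.index + urlMatch[0].length) + 1;
  const optsEnd = snippet.lastIndexOf(")");
  const options = JSON.parse(snippet.slice(optsStart, optsEnd));

  return {
    url: urlMatch[1],
    headers: options.headers || {},
    csrfToken: (options.headers || {})["csrf-token"],
  };
}
```

The CSRF token lives in the `csrf-token` header of the copied request; LinkedIn's API rejects calls without it, which is why the workflow extracts it up front.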
Workflow Nodes
| Node | Role |
|---|---|
| Manual Trigger | Accepts the pasted fetch() request from DevTools |
| Code (Parse Fetch) | Extracts URL, headers, CSRF token, cookies, and company ID from the fetch request |
| Loop | Iterates through people pages (condition: hasMore and page < 200) |
| Code (Build URL) | Constructs the API URL for the current page offset |
| Wait (3s) | Delays 3 seconds between API calls |
| HTTP Request | Sends a GET request to LinkedIn's Search API |
| Merge | Combines the HTTP response with the build context |
| Code (Extract People) | Parses the LinkedIn response, extracts EntityResultViewModel entities, deduplicates results |
| Code (Return People) | Returns the people array for the AI node |
| AI Agent (Gemini 3 Flash) | Saves each person to Google Sheets using add_multiple_rows tool (runs per person) |
| Code (Pagination Info) | Passes pagination state back to the loop |
| Merge | Combines AI save results with pagination data |
| Code (Extract Pagination) | Extracts hasMore, nextStart, and totalResults for the next loop iteration |
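The "Build URL" node's job is small enough to sketch. This is an illustrative version, assuming LinkedIn's search URLs carry a `start:N` offset inside the query (the parameter name is an assumption based on typical LinkedIn pagination, not confirmed by the template):

```javascript
// Sketch of the "Build URL" step: rewrite the start offset for the
// current page. PAGE_SIZE matches the 12-people-per-page figure above.
const PAGE_SIZE = 12;

function buildPageUrl(baseUrl, page) {
  const start = page * PAGE_SIZE;
  // Replace an existing start:N value, or append one if absent.
  if (/start:\d+/.test(baseUrl)) {
    return baseUrl.replace(/start:\d+/, `start:${start}`);
  }
  return `${baseUrl}&start=${start}`;
}
```

Because the copied URL already points at page 0, only the offset needs to change between iterations; headers and cookies stay the same for every request.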
Data Exported Per Person
| Column | Field | Description |
|---|---|---|
| A | name | Full name as shown on LinkedIn |
| B | linkedin_url | Full LinkedIn profile URL |
| C | headline | Professional headline |
| D | location | Geographic location (when available) |
| E | summary | Additional context from search |
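The columns above map onto fields the "Extract People" node pulls from each search entity. A hedged sketch of that extraction, with deduplication by profile URL; the field paths (`$type`, `navigationUrl`, `title.text`, and so on) are assumptions modeled on LinkedIn's typical Voyager response shape and may differ in practice:

```javascript
// Sketch of extracting people from a LinkedIn search response.
// Assumes entities live in an `included` array and that person results
// carry "EntityResultViewModel" in their $type.
function extractPeople(included, seen = new Set()) {
  const people = [];
  for (const item of included || []) {
    if (!item.$type || !item.$type.includes("EntityResultViewModel")) continue;
    const url = item.navigationUrl;
    if (!url || seen.has(url)) continue; // dedupe across pages
    seen.add(url);
    people.push({
      name: item.title?.text || "",
      linkedin_url: url,
      headline: item.primarySubtitle?.text || "",
      location: item.secondarySubtitle?.text || "",
      summary: item.summary?.text || "",
    });
  }
  return people;
}
```

Missing fields fall back to empty strings rather than being dropped, which is why some location and summary cells in the sheet can be blank.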
Setup Instructions
- Add the "Export LinkedIn Company People to Google Sheets" template to Needle
- Go to the target LinkedIn company page and click the "People" tab
- Open browser DevTools (F12) and go to the Network tab
- Scroll the people list to trigger the API request
- Find the request containing ORGANIZATIONS_PEOPLE_ALUMNI in the URL (look for responses that are 10 kB or larger)
- Right-click the request, then select Copy, then Copy as fetch
- Paste the copied fetch() into the Manual Trigger node
- Copy the template Google Sheet (link is in the workflow's sticky note)
- Connect your Google Sheets account via the Pipedream connector
- Run the workflow
Customization
| What You Can Change | How |
|---|---|
| Google Sheet columns | Update the column mapping in the AI Agent prompt (default: A=name through E=summary) |
| Target spreadsheet | Update the sheet URL in the AI Agent prompt |
| Page delay | Change the Wait node duration (default: 3 seconds) |
| Max pages | Change the loop condition (default: page < 200) |
FAQ
Q: How do I find the right network request to copy?
A: Look for requests with ORGANIZATIONS_PEOPLE_ALUMNI in the URL. They should be responses of 10-100+ kB. The URL also contains currentCompany,value:List(...) with the company ID.
Q: My workflow is returning errors. What should I do?
A: The LinkedIn cookie expires after roughly 24 hours. Go back to LinkedIn, open DevTools, and copy a fresh fetch() request.
Q: Does this work with any LinkedIn company?
A: Yes, it works with any company page that has a People tab visible to you. You do not need to be an employee of the company.
Q: Why are some location or summary fields empty?
A: LinkedIn's search API does not always return all fields for every person. Location and summary depend on the person's profile settings and what LinkedIn includes in the search response.
Q: Can I scrape companies with more than 2,400 people?
A: The default limit is 200 pages (2,400 people). You can increase this by editing the loop condition, but LinkedIn may limit the total results returned by their search API regardless of pagination.
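The loop condition described in the FAQ (`hasMore` and `page < 200`) boils down to a small calculation. A sketch, under the assumption that the API reports `totalResults` alongside each page:

```javascript
// Sketch of the pagination check: decide whether another page exists
// from the current offset and the reported total. maxPages mirrors the
// default 200-page cap mentioned above.
function paginationState(start, count, totalResults, page, maxPages = 200) {
  const nextStart = start + count;
  return {
    nextStart,
    hasMore: nextStart < totalResults && page + 1 < maxPages,
  };
}
```

Raising `maxPages` lifts the 2,400-person ceiling on the workflow's side, but as noted above, LinkedIn may still cap the results its search API returns.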
Want to showcase your own workflows?
Become a Needle workflow partner and turn your expertise into recurring revenue.