Using n8n and AI Agents to Generate LinkedIn Sales Leads Without Paid Tools
Sales teams can automate LinkedIn lead generation using n8n and AI – without paid tools. We’ll build a workflow that searches public LinkedIn for target profiles, scrapes names/roles/companies, enriches each lead (e.g. guessing emails), and feeds the data to an AI agent for scoring via the Model Context Protocol (MCP). Finally the leads go to a CSV or Google Sheet. This pipeline leverages free, publicly available data and open tools (Puppeteer, Browserless, n8n, free email verifiers, local AI agents). The result: a CRM-ready list of qualified leads.
1. Searching and Scraping LinkedIn Profiles
We start by automating a LinkedIn people search for keywords (e.g. “Marketing Director Software”). Because LinkedIn’s official APIs are limited, we treat LinkedIn like any public website. Courts have generally permitted scraping of publicly visible data, though LinkedIn’s terms of service restrict automated access, so proceed with care. n8n can use a headless browser (Puppeteer) or a Browserless.io integration to load LinkedIn search pages and parse the HTML. For example, the n8n Puppeteer node (community plugin) “provides full access to Puppeteer’s API … for any browser automation task”.
- Trigger the workflow (manual, cron, or webhook).
- Open LinkedIn Search: Use the n8n Puppeteer or HTTP Request node to navigate to a URL like `https://www.linkedin.com/search/results/people/?keywords=MARKETING%20DIRECTOR`. (If needed, log into LinkedIn or use a service like Browserless with your account.)
- Wait for results and scrape: In Puppeteer, wait for the results list selector (e.g. `.search-results__list`), then extract each result’s name, title, and company with a page function:

```js
const items = await page.$$eval('.reusable-search__result-container', els =>
  els.map(el => ({
    name: el.querySelector('.entity-result__title-text').innerText,
    title: el.querySelector('.entity-result__primary-subtitle').innerText,
    company: el.querySelector('.entity-result__secondary-subtitle').innerText,
  }))
);
return items;
```

- Collect results: The workflow now has an array of leads shaped as `{name, title, company}`. Save or pass these items to the next steps.
This simple chain – Trigger → LinkedIn Search (Puppeteer) → Extract fields – forms the core of our LinkedIn scraping sub-flow. For example, a text-based diagram of this part might look like:
[Cron Trigger] → [Browserless (Puppeteer) Node: LinkedIn People Search] → [Scrape/HTML Extract: name, title, company]
“A web scraper visits web pages, extracts the HTML code, and identifies the data you need based on predefined rules.” For LinkedIn, that means pulling “names, job titles, company information” from public profiles or search results. n8n makes this easy with nodes for headless browsers and parsing.
2. n8n Workflow: Extracting Names, Roles, Companies
In n8n, we build a workflow that takes the raw search results and formats them. Typical steps include:
- Transform/Format Data: Use a “Function” or “Set” node to ensure each item has `name`, `title`, and `company` fields. You might rename or trim fields, or split first/last names if needed.
- Loop if needed: If the search results span multiple pages, add a loop – for example, a while-loop or repeated HTTP Requests incrementing the `page` parameter in the LinkedIn URL.
- Filter duplicates: (Optional) If the same person appears twice, use n8n’s “Remove Duplicates” node or a Function node that keys on a unique ID.
By the end of this stage, your workflow has a clean list of leads like:
| Name | Job Title | Company |
| --- | --- | --- |
| Jane Smith | Marketing Director | ACME Corp |
| John Doe | VP of Sales | Globex Inc |
| … | … | … |
Each row is one lead. The key is that n8n can chain nodes easily:
[ Trigger ] → [ LinkedIn Search/Scrape ] → [ Function/Set (Normalize fields) ] → [ (optional loops/filters) ]
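The normalize-and-dedupe step above can be sketched as the body of an n8n Function (Code) node. This is a hedged illustration, not n8n’s own code: the `items` array mirrors n8n’s `[{ json: {...} }]` item convention, and the sample input is hypothetical.

```javascript
// Normalize scraped fields and drop duplicate people.
// In a real n8n Code node, `items` is injected by n8n; we define a sample here.
function normalizeLeads(items) {
  const seen = new Set();
  const out = [];
  for (const item of items) {
    const lead = {
      name: (item.json.name || '').trim(),
      title: (item.json.title || '').trim(),
      company: (item.json.company || '').trim(),
    };
    // Key on name + company so the same person isn't emitted twice.
    const key = `${lead.name}|${lead.company}`.toLowerCase();
    if (lead.name && !seen.has(key)) {
      seen.add(key);
      out.push({ json: lead });
    }
  }
  return out;
}

// Hypothetical input, as it might arrive from the scraping step:
const items = [
  { json: { name: ' Jane Smith ', title: 'Marketing Director', company: 'ACME Corp' } },
  { json: { name: 'Jane Smith', title: 'Marketing Director', company: 'ACME Corp' } },
  { json: { name: 'John Doe', title: 'VP of Sales', company: 'Globex Inc' } },
];
const cleaned = normalizeLeads(items);
// In the n8n node itself you would end with: return normalizeLeads(items);
```

In a Code node, the final line would simply be `return normalizeLeads(items);` so downstream nodes receive the cleaned items.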
3. Enriching Leads Data (Emails & Company Info)
With names, titles, and companies in hand, we can enrich each lead using free public info:
- Guess Email Addresses: Use the company name to find its domain (e.g. `acmecorp.com`). Then apply common email patterns (`firstname.lastname@domain`, `firstinitiallastname@domain`, etc.). Tools like Anymail Finder or free scripts can test these – Evaboot, for instance, recommends guessing patterns such as `jane.smith@acmecorp.com` or `jsmith@acmecorp.com`. In n8n, you might use a Function node to generate a list of possible emails for each lead, then validate them (see next bullet).
- Validate Emails: Pass guessed emails to an email verification service. n8n has nodes for free or freemium verifiers (Mailcheck, Hunter, UProc, etc.). Validating reduces bounce rates – even just checking MX records or using a free API helps confirm whether an email is deliverable.
- Company Research: Add any public details about the company. For example, scrape the company website for contacts or use a free API (Clearbit has a limited free tier for domain info). Even Google-searching `"company name team"` or pointing n8n’s HTTP Request node at `companywebsite.com/about` can reveal org info.
- Role Clues: If you know your ideal customer profile (e.g. tech startup, 10–50 employees), add tags or flags based on company size/industry. n8n can use HTTP or RSS nodes to query free databases or Crunchbase (limited free access) for industry data.
By the end of enrichment, each lead item might look like:
```json
{
  "name": "Jane Smith",
  "title": "Marketing Director",
  "company": "ACME Corp",
  "email": "jane.smith@acmecorp.com",
  "companySize": "100-200",
  "industry": "Technology"
}
```
This extra context makes leads more actionable. As Evaboot notes, if you can’t find a verified email, “guessing email patterns” is often the next step. n8n handles all of this data stitching in the workflow.
4. AI Lead Scoring via Model Context Protocol
Now comes the AI agent part. We want the AI to score or qualify each lead (e.g. 1–5, “hot vs cold”, or a recommendation). This is where the Model Context Protocol (MCP) shines. MCP is an open standard that lets AI agents (LLMs) connect to tools and data. In practice:
- Set up an MCP Agent: Run a local AI assistant that supports MCP (e.g. Claude Desktop or an OpenAI agent with MCP enabled).
- Use n8n MCP Client Node: n8n’s community MCP node can connect to this agent. This “MCP Client” node “lets you interact with MCP servers … connect to MCP servers, access resources, execute tools, and use prompts”.
- Pass Lead Data to AI: Configure the MCP node to send the list of leads (or one lead at a time) as the context or in a prompt. For example, ask:
“Here is a list of sales leads with name, title, and company. Based on our ideal customer profile (technology companies, 10-50 employees), assign each lead a score 1–5 and a brief justification.”
The AI will process it like a multi-turn chat. MCP ensures n8n and the AI understand each other’s data format.
- Receive Scored Output: The AI returns JSON or text that n8n can parse. For example, each lead item might gain a field like `"score": 4` or `"qualified": true`. n8n can then split or merge this result back into the lead items.
This effectively turns the AI into a dynamic “qualification engine”. As 2 Acre’s article on MCP explains, the protocol standardizes tool interactions so “any AI that speaks MCP can use it”. With n8n as the client, the AI agent scores leads on your behalf.
“MCP is rapidly gaining traction in the AI community… more and more LLM-driven applications and frameworks are integrating MCP to let AI agents fetch information, perform calculations, or execute actions in a standardized way.”
Using MCP, your sales lead data flows seamlessly to the AI and back, within the n8n workflow.
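However the agent replies, the scores still have to be merged back into the lead items. A minimal sketch for a Function node, assuming (this reply shape is our choice, not part of MCP) the prompt asked the agent to answer with a JSON array of `{ name, score, justification }` objects:

```javascript
// Merge AI scores back into the lead list by matching on name.
// The reply format is an assumption – it is whatever structure
// your prompt instructs the agent to produce.
function mergeScores(leads, aiReply) {
  const byName = new Map(JSON.parse(aiReply).map(r => [r.name, r]));
  return leads.map(lead => {
    const match = byName.get(lead.name) || {};
    return {
      ...lead,
      score: match.score ?? null,          // null if the agent skipped this lead
      justification: match.justification ?? '',
    };
  });
}

// Hypothetical data: one lead, and the agent's (assumed) JSON reply.
const leads = [
  { name: 'Jane Smith', title: 'Marketing Director', company: 'ACME Corp' },
];
const aiReply =
  '[{"name": "Jane Smith", "score": 4, "justification": "Decision-maker at a mid-size tech firm"}]';
const scored = mergeScores(leads, aiReply);
// scored[0].score === 4
```

Asking the agent for strict JSON (and wrapping `JSON.parse` in a try/catch in production) keeps this merge step robust.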
5. Exporting Leads to Spreadsheet/CRM
Once each lead has a score and enriched data, we output it for sales action. Options in n8n include:
- Spreadsheet File (CSV): The Spreadsheet File node can append rows to a CSV with all fields. This CSV can be opened in Excel or uploaded to a CRM.
- Google Sheets: The Google Sheets node can add rows to a Google Sheet. This creates a live, shared lead list. You might have columns: Name, Title, Company, Email, Score, etc.
- CRM APIs: If your CRM has an API (HubSpot, Salesforce, Zoho, etc.), use n8n’s HTTP Request or dedicated CRM nodes to push the lead directly into your CRM’s “Leads” table.
In any case, the output is structured: one row per lead. Lobstr.io notes that scrapers “can save [data] in a structured format like a spreadsheet”. We follow that here. For example, after the AI scoring step you might have a node like:
[MCP Scoring Result] → [Spreadsheet File: Append] → [CSV: leads.csv]
This final step means your marketing or sales team has a ready-to-use list of qualified prospects. Use n8n’s built-in Excel, Google Sheets, or database nodes as needed.
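If you prefer to assemble the CSV yourself rather than rely on the Spreadsheet File node, a Function node can build the text directly. A sketch with minimal RFC 4180-style quoting; the column list matches the fields used above:

```javascript
// Turn scored leads into CSV text: one header line, one row per lead.
function toCsv(leads) {
  const cols = ['name', 'title', 'company', 'email', 'score'];
  // Quote any field containing commas, quotes, or newlines.
  const escape = v => {
    const s = String(v ?? '');
    return /[",\n]/.test(s) ? `"${s.replace(/"/g, '""')}"` : s;
  };
  const header = cols.join(',');
  const rows = leads.map(l => cols.map(c => escape(l[c])).join(','));
  return [header, ...rows].join('\n');
}

// Hypothetical scored lead from the previous step:
const csv = toCsv([
  { name: 'Jane Smith', title: 'Marketing Director', company: 'ACME Corp',
    email: 'jane.smith@acmecorp.com', score: 4 },
]);
// First line: name,title,company,email,score
```

The resulting string can be written out with n8n’s file nodes, or the same per-lead objects can feed the Google Sheets node row by row.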
6. Automating and Triggering the Workflow
To keep leads fresh, automate the workflow’s execution. Here are some trigger ideas:
- Cron/Schedule Node: Run daily or weekly at set times. (E.g. every morning at 8 AM, gather new leads.)
- Webhook / Google Sheets Trigger: Have a Google Sheet where sales adds new search terms or target companies. The n8n Google Sheets Trigger can kick off the flow whenever that sheet is updated. Alternatively, use a generic Webhook node: trigger it via Zapier, IFTTT, or a simple Google Apps Script.
- New Email or Form: Trigger on receiving an email or web form submission with criteria (e.g. a marketing form asking for leads). n8n has email and HTTP triggers.
By scheduling or connecting to Google Sheets, you can run this pipeline hands-free. For example:
[Cron (every morning)] → [LinkedIn Search & Scrape] → [Process & Enrich] → [MCP Scoring] → [Export to Sheets]
This way, each day your team has new AI-scored leads in their inbox or CRM.
Conclusion
Using n8n and MCP, even small teams can build a LinkedIn sales automation pipeline without expensive tools. We scraped public LinkedIn data, enriched it (email, company info), and tapped an AI agent to do lead scoring. The result is a dynamic leads database ready for outreach – all on open, mostly free components.
This approach scales: add more search terms, refine your enrichment (additional API calls), or improve the AI prompt for better scoring. If you hit a snag, n8n’s community nodes (like the Puppeteer and MCP nodes) offer examples and support. (For more on connecting AI agents via MCP, see our blog “Connecting AI Agents to n8n with the Model Context Protocol (MCP)”.)
Ultimately, “n8n LinkedIn lead generation” becomes just one automated workflow among many. By combining LinkedIn scraping with AI lead scoring via MCP, you streamline prospecting and can focus on personalized outreach to high-quality leads.
Happy automating – and happy selling!