NetSPI Blog

Collecting Contacts from LinkedIn Using linkedin_crawl

Mike Larch
June 1st, 2015

linkedin_crawl is a module for the Recon-ng framework that collects employee names and titles for a specified company from LinkedIn. It operates by spidering through the “People Also Viewed” pane available on most public LinkedIn profile pages and scraping user data. That information can be used to generate a list of email addresses for phishing campaigns, or usernames for online dictionary attacks executed during internal/external penetration tests.


Since linkedin_crawl is part of the Recon-ng framework, a simple

git clone

of the Recon-ng repository should do the trick. For more information, follow the usage guide here.


*examples are edited for anonymity*
1. A seed employee for the targeted company must be identified. This is pretty easy with Google: search for “company name LinkedIn,” or use this Google dork by Tim Tomes: inurl:pub -inurl:dir “at ” “Current”

2. This seed employee’s profile should have the name of the targeted company spelled correctly, and the “Viewers of this profile also viewed…” section should exist. Copy this employee’s URL. In the example below, we will use a seed page for John Doe from “Example Company”.

3. Load up the Recon-ng framework, navigate to the linkedin_crawl module, set the options, and run it.

root@kali:~/recon-ng# ./recon-ng
[recon-ng][default] > use recon/companies-contacts/linkedin_crawl
[recon-ng][default][linkedin_crawl] > show options

  Name     Current Value  Req  Description
  -------  -------------  ---  -----------
  COMPANY                 no   override the company name harvested...
  URL                     yes  public LinkedIn profile URL (seed)

[recon-ng][default][linkedin_crawl] > set URL
URL =>
[recon-ng][default][linkedin_crawl] > show options

  Name     Current Value                   Req  Description
  -------  -------------                   ---  -----------
  COMPANY                                  no   override the company...
  URL                                      yes  public LinkedIn profile...

[recon-ng][default][linkedin_crawl] > run

[*] Parsing ‘
[*] Added: John Doe, Software Developer at Example Company (Washington...
[*] Parsing ‘
[*] Added: Ali Price, Director at Example Company
[*] Parsing ‘
[*] Parsing ‘
[*] Added: Matt James, Director of Software Services at Example Company...

Expected Results

The module crawls contacts from the “Viewers of this profile also viewed…” section and scrapes their information if they belong to the company found on the seed page. For a small company it may only find a handful of contacts and finish in about 30 seconds; for a large company it can find thousands of contacts and take hours to run. Either way, it should be steadily collecting contacts from the targeted company. When the module finally finishes, view the contacts in the database.
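The crawl itself is essentially a breadth-first walk over the “also viewed” graph. Here is a rough Python sketch of that logic, assuming a hypothetical `fetch_also_viewed(url)` helper that stands in for the module's actual LinkedIn scraping (all names here are illustrative, not the module's real internals):

```python
from collections import deque

def crawl_contacts(seed_url, company, fetch_also_viewed):
    """Breadth-first crawl over the 'also viewed' graph, keeping only
    profiles whose title mentions the target company.

    fetch_also_viewed(url) -> (name, title, [neighbor profile URLs])
    """
    seen = {seed_url}
    queue = deque([seed_url])
    contacts = []
    while queue:
        url = queue.popleft()
        name, title, neighbors = fetch_also_viewed(url)
        # Only keep (and expand from) employees of the target company;
        # profiles at other companies are dead ends for the crawl.
        if company.lower() not in title.lower():
            continue
        contacts.append((name, title))
        for n in neighbors:
            if n not in seen:
                seen.add(n)
                queue.append(n)
    return contacts
```

With a stubbed-out fetch function, you can see how profiles outside the target company are dropped and never expanded, which is why a small company converges quickly while a large one keeps feeding the queue.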

[recon-ng][default] > show contacts

  | rowid | first_name | last_name | email |                  title                  |
  | 1     | Ali        | Price     |       | Director at Example Company             |
  | 2     | John       | Doe       |       | Software Developer at Example Company   |
  | 3     | Marc       | Smith     |       | Computer Tech at Example Company        |
  | 4     | Matt       | James     |       | Director at Example Company             |
  | 6     | Robert     | Fiker     |       | Floor Manager at Example Company        |
  | 5     | Tina       | Beard     |       | Marketing Consultant at Example Company |

[*] 6 rows returned

This produces a nice list of names, titles, and regions that can be helpful for a social engineering campaign or for generating candidate username dictionaries. The Recon-ng framework also has plenty of other modules to mangle the contacts or export them to another format, which I find useful.
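As a quick illustration of the kind of mangling those modules perform, the sketch below derives candidate email addresses from a contact's name using a few common corporate naming patterns (the function name and default patterns are my own, not Recon-ng's):

```python
def mangle(first, last, domain,
           patterns=("{f}{last}", "{first}.{last}", "{first}{l}")):
    """Generate candidate email addresses for one contact.

    Patterns use {first}/{last} for full names and {f}/{l} for
    initials; the defaults cover a few common corporate formats.
    """
    first, last = first.lower(), last.lower()
    emails = []
    for pattern in patterns:
        local = pattern.format(first=first, last=last, f=first[0], l=last[0])
        emails.append(f"{local}@{domain}")
    return emails

# e.g. mangle("Ali", "Price", "example.com")
# -> ['aprice@example.com', 'ali.price@example.com', 'alip@example.com']
```

Run over every row of the contacts table, this yields a wordlist ready for phishing or password-spraying engagements; strip the domain and you have a username dictionary instead.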


Hopefully this short intro was helpful in getting you started with this tool for all of your contact-gathering needs. Since this is part of a community framework, please feel free to contribute fixes or features, and thanks to those who already have!
