Remote Data Mining And Management Job In Data Science And Analytics

Developer Needed for Web Crawler/Data Scraper Project to Obtain Contact Info of People and Companies


I am looking for a developer to build a web crawler/spider to systematically browse the internet and extract contact information for employees of commercial real estate investment firms and lenders/banks. The crawler would need to find company websites, conference attendee lists, and other online sources covering commercial real estate, then extract First Name, Last Name, Title, Phone, Email, Company Name, and Company Summary (About Us, Overview, Description), where available. The simplest source is a company website, where the crawler would extract every team member's name and contact info; harder sources might be a PDF report of some kind on an attendee or conference website.
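A minimal sketch of the extraction step described above, using only the Python standard library. The sample HTML layout, field names, and regexes are illustrative assumptions, not details from the posting; real team pages vary widely and would need per-site handling:

```python
import re

# Hypothetical fragment of a firm's "Team" page; real markup differs per site.
SAMPLE_HTML = """
<div class="team-member">
  <h3>Jane Doe</h3>
  <p class="title">Managing Director</p>
  <p>jane.doe@examplecapital.com | (555) 010-1234</p>
</div>
"""

# Simple patterns for US-style phone numbers and common email formats.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE_RE = re.compile(r"\(?\d{3}\)?[ .-]?\d{3}[ .-]?\d{4}")

def extract_contacts(html: str) -> dict:
    """Pull the first email and phone number found in a page fragment."""
    email = EMAIL_RE.search(html)
    phone = PHONE_RE.search(html)
    return {
        "email": email.group(0) if email else None,
        "phone": phone.group(0) if phone else None,
    }

print(extract_contacts(SAMPLE_HTML))
```

In practice the name, title, and company summary fields would come from site-specific CSS/XPath selectors (e.g. via a scraping framework), with the regexes above as a fallback for unstructured text and PDFs.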
About the recruiter
Member since Mar 14, 2020
Sr Consultancy
from Texas, United States

Open for hiring. Apply before: Oct 1, 2024

Work from Anywhere

40 hrs / week

Fixed Type

Remote Job

Cost: $476.87


Similar Projects

Data Mining/ Extraction

I need the following put in a spreadsheet:

I need someone to find me Groupon page links, names, phone numbers, email addresses, links to the company Facebook page, and the business owner's PERSONAL Facebook page or PERSONAL Instagram for the fo...read more

Downloading lists of numbers for me to call

I am looking for someone who is organized and can get a job done relatively quickly.
I have a list of buildings, and I need someone to go through my list of buildings, take the address, go into my other platform that allows you to type the building...read more

Web crawling with scrapy and import to Drupal 8

Need to establish a process to crawl data from various websites, store it on an S3-like server, and import it into a Drupal 8 database.
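The Drupal 8 import step in this project could go through Drupal core's JSON:API module, which accepts node-creation POSTs. A small sketch of building that request body; the node type, field names, and text format below are assumptions, not details from the posting:

```python
import json

# Assumed endpoint (Drupal 8 with the core JSON:API module enabled):
#   POST {site}/jsonapi/node/{node_type}
#   Content-Type: application/vnd.api+json
def drupal_node_payload(title: str, body: str, node_type: str = "article") -> str:
    """Build the JSON:API request body for creating a Drupal node."""
    return json.dumps({
        "data": {
            "type": f"node--{node_type}",       # JSON:API resource type
            "attributes": {
                "title": title,
                "body": {"value": body, "format": "plain_text"},
            },
        }
    })

payload = drupal_node_payload("Crawled page", "Body text scraped from the source site.")
print(payload)
```

A scrapy item pipeline could call a function like this for each crawled item, after first writing the raw page to the S3-like store.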