Remote Web Development Job In IT And Programming

Backend developer needed to build a custom data scraper & parser service & API


The goal of this project is to ingest, sanitize, and structure data from LinkedIn in order to receive a continuous stream of updated profiles that meet specific criteria. We need both an initial dataset in a machine-readable format (CSV, XML, etc.) and updated versions whenever a change is found.

Preliminary research into data access has shown that some information is available via:
The LinkedIn API
Query parameters in the URL that can be reverse-engineered to map to different values
Structured data accessible via a profile page's markup
Unstructured information
Different search products offered by LinkedIn (the free product, Recruiter, etc.).

We're looking to run searches for specific values (no current employment, number of years tenured in the most recent position, etc.), keywords (practice area(s) selected from a specified list, etc.), and compound queries (attended a Top 50 ranked law school, Tier 1 law schools, etc.).

The types of queries we're looking to track for a change are similar to:
When an individual from a specified list of companies leaves their current employer and changes to a new employment status that meets specific criteria.
When someone we currently track in our CRM switches jobs.
Etc.

We will be developing a separate rules engine that will provide the parameters, bounds, and frequency of updates for any given ongoing search. While the structure of these requests is not yet defined, the tool should be built so that a query can be carried out based on a set of parameters (JSON, etc.).
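Since the request structure is not yet defined, the sketch below shows one possible shape for a JSON query spec and how the tool might turn it into filters. All field names (`search_id`, `parameters`, `min_tenure_years`, etc.) are illustrative assumptions, not a defined schema.

```python
import json

# Hypothetical rules-engine query spec; every field name here is
# illustrative only -- the real schema is not yet defined.
query_spec = json.loads("""
{
  "search_id": "q-001",
  "parameters": {
    "employment_status": "unemployed",
    "min_tenure_years": 2,
    "practice_areas": ["litigation", "corporate"]
  },
  "bounds": {"max_results": 500},
  "frequency": "weekly"
}
""")

def build_filters(spec):
    """Translate the spec's parameters into simple filter callables
    that each accept a profile dict and return True/False."""
    p = spec["parameters"]
    return [
        lambda prof: prof.get("employment_status") == p["employment_status"],
        lambda prof: prof.get("tenure_years", 0) >= p["min_tenure_years"],
        lambda prof: bool(set(prof.get("practice_areas", []))
                          & set(p["practice_areas"])),
    ]

filters = build_filters(query_spec)
profile = {"employment_status": "unemployed", "tenure_years": 3,
           "practice_areas": ["litigation"]}
print(all(f(profile) for f in filters))  # → True (profile meets all criteria)
```

The point of the callable-per-parameter design is that the rules engine can add or drop parameters without the scraper's matching loop changing.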

The tool should be able to search on behalf of specific employees who have granted it access to their accounts, aggregating results across those accounts.

Results and updates should be delivered every week. This check could poll only for deltas, compare two entire data sets, etc. Each result should carry a unique identifier and a flag signaling whether it is new or an update.

Each batch of results should also have relevant timestamps to reflect both the time of change and the time the change was crawled.
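A minimal sketch of the weekly delta check described above, assuming the full-snapshot-comparison approach and a stable profile ID as the unique identifier. The record fields (`last_modified`, `title`) are hypothetical placeholders for whatever the crawl actually captures.

```python
from datetime import datetime, timezone

def diff_snapshots(previous, current):
    """Compare two snapshots keyed by a stable profile ID and flag
    each changed or new record, stamping both the time of change
    (if the source exposes one) and the time of the crawl."""
    crawled_at = datetime.now(timezone.utc).isoformat()
    results = []
    for pid, record in current.items():
        old = previous.get(pid)
        if old is None:
            flag = "new"
        elif old != record:
            flag = "update"
        else:
            continue  # unchanged: not included in the weekly batch
        results.append({
            "id": pid,
            "flag": flag,
            "changed_at": record.get("last_modified"),  # time of change
            "crawled_at": crawled_at,                   # time of crawl
            "data": record,
        })
    return results

prev = {"p1": {"title": "Associate", "last_modified": "2024-08-01"}}
curr = {"p1": {"title": "Partner", "last_modified": "2024-09-01"},
        "p2": {"title": "Counsel", "last_modified": "2024-09-02"}}
deltas = diff_snapshots(prev, curr)
print([(d["id"], d["flag"]) for d in deltas])  # → [('p1', 'update'), ('p2', 'new')]
```

Polling only for deltas would replace the full comparison with a source-side filter, but the flag/identifier/timestamp output format could stay the same.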

As an output, we will need all of the data in a single file in a machine-readable format (CSV, XML, etc.).
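For the single-file output, a sketch using CSV via the standard library. The column set is an assumption for illustration; the real columns would follow the fields the crawl captures.

```python
import csv
import io

def to_csv(results):
    """Flatten delta records (id, flag, timestamps, plus an
    illustrative 'title' field) into one CSV string."""
    buf = io.StringIO()
    writer = csv.DictWriter(
        buf, fieldnames=["id", "flag", "changed_at", "crawled_at", "title"])
    writer.writeheader()
    for r in results:
        row = {k: r.get(k, "") for k in ("id", "flag", "changed_at", "crawled_at")}
        row["title"] = r.get("data", {}).get("title", "")
        writer.writerow(row)
    return buf.getvalue()

records = [{"id": "p1", "flag": "update", "changed_at": "2024-09-01",
            "crawled_at": "2024-09-08T00:00:00+00:00",
            "data": {"title": "Partner"}}]
print(to_csv(records))
```

Swapping CSV for XML or JSON would only change this serialization step, not the upstream diffing.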

Additionally, providing the data via a documented API that delivers the output in XML/JSON could be part of the initial scope or another phase.
About the recruiter
Member since Nov 11, 2022
Pembba Tsering
from Liege, Belgium

Skills & Expertise Required

Data Scraping, Web Scraping

Open for hiring. Apply before: Oct 7, 2024

Work from Anywhere

40 hrs / week

Hourly Type

Remote Job

$17.23

Cost


Similar Projects

Need Data Scraped From Dropdown Searches on Site

Looking for a developer to perform a scrape of a website that uses a simple drop-down system of search values. To give you a great parallel, think about a car part website. We need to scrape a drop-down from a site like this that uses a make, model,...read more

Data scraping/email scraping databases

I have a few databases that I would like to scrape -- to obtain email addresses. Would like to show a freelancer those databases and see if they are possible to scrape, and then to export the email addresses, etc.

Thanks, and looking forward...read more

Virtual Assistant needed for Web Data Scraping

-Manually source top websites, blogs, forums, and any other relevant digital media platforms covering crowdfunding projects on Kickstarter and Indiegogo within the past three years.

-Utilize findings to build a comprehensive organized media...read more

Simple Excel/VBA Auto-fill auto-populate dynamic PDF project

Our existing excel workbook has the following Data:
Sheet 1 = Every Customer information (Name, Address, Phone, email etc)
Sheet 2 = Every Contractor information (Name, Address, Phone, email etc)
Sheet 3 = Every Project Information (type...read more

Computer Programmer Experienced With Python & R.

We need a programmer who is experienced in data development with R and/or Python, and in data scraping.