Remote Data Mining And Management Job In Data Science And Analytics

Looking for an Azure ETL & Analytics Engineer


Requirements:
* 5+ years of software development experience building backend applications that integrate with database systems.
* 3+ years of experience using public cloud-based data processing, analytics, and storage services such as Azure Data Factory, Azure Databricks, Azure Functions/Logic Apps, Azure SQL DB, Azure SQL Data Warehouse, Azure Search/Cognitive Services, and Azure Data Lake.
* Experience designing and implementing Azure Data Factory ETL/ELT pipelines: API/web-based data extraction and collector frameworks, job schedulers, and database-related transform and load workflows (see the Data Factory sketch below).
* Ability to perform advanced data transforms using Python-, R-, or Scala-based analytics toolsets in Azure Databricks (see the Databricks sketch below).
* Knowledge of modern continuous integration and deployment practices for ETL frameworks such as Azure Data Factory and Azure Databricks.
* Ability to define data models, write stored procedures that perform in-SQL data transforms, ensure data consistency across replicas, and tune database query performance for application use cases (see the stored-procedure sketch below).
* Practice of documenting ETL solution designs, functional specifications, test cases, and user manuals as needed.
* Habit of making data-driven decisions, proving ideas through action, and staying open-minded and receptive to ideas from team members.
* Ability to work with remote team members in different time zones and collaborate with them effectively.
* Composure to work in a challenging, fast-paced, dynamic but planned environment.
Language Expertise: Python, C#, Transact-SQL, PL/SQL, R, Scala
Databases: Azure SQL DB, Azure SQL Data Warehouse, Microsoft SQL Server
DB Migration/Analytics/BI Tools: Azure Data Factory, Azure Databricks, Power BI, Tableau
Operating Systems: Windows Server, Linux
Any of the following skill sets in a candidate are a huge plus:
Public Cloud Certifications: Azure, AWS, GCP
Big Data + ML Technologies: Hadoop YARN, Spark, Kafka, Azure ML + TensorFlow
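
For the Data Factory requirement above, here is a minimal sketch of defining and triggering a copy pipeline with the azure-mgmt-datafactory Python SDK. The subscription, resource group, factory, pipeline, and dataset names are placeholders, and the referenced datasets and linked services are assumed to already exist in the factory.

    # Minimal Azure Data Factory pipeline sketch (placeholder names throughout).
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient
    from azure.mgmt.datafactory.models import (
        PipelineResource, CopyActivity, DatasetReference, BlobSource, AzureSqlSink,
    )

    adf_client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

    # One copy step: pull extracted files from Blob storage into an Azure SQL staging table.
    copy_step = CopyActivity(
        name="CopyRawFilesToStaging",
        inputs=[DatasetReference(type="DatasetReference", reference_name="RawBlobDataset")],
        outputs=[DatasetReference(type="DatasetReference", reference_name="StagingSqlDataset")],
        source=BlobSource(),
        sink=AzureSqlSink(),
    )

    pipeline = PipelineResource(activities=[copy_step])
    adf_client.pipelines.create_or_update(
        "<resource-group>", "<data-factory>", "IngestDailyFeeds", pipeline
    )

    # Trigger an on-demand run; in practice this would hang off a schedule trigger.
    run = adf_client.pipelines.create_run("<resource-group>", "<data-factory>", "IngestDailyFeeds")
    print(run.run_id)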
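
To make the Databricks requirement concrete, a minimal PySpark sketch of the kind of transform the role involves. The storage account, container paths, and column names are placeholders invented for illustration.

    # Minimal Databricks-style transform sketch (paths and columns are placeholders).
    from pyspark.sql import SparkSession, functions as F

    # In a Databricks notebook a SparkSession is already provided as `spark`.
    spark = SparkSession.builder.getOrCreate()

    # Read raw orders landed in the data lake by the ingestion pipeline.
    orders = spark.read.parquet("abfss://raw@<storage-account>.dfs.core.windows.net/orders/")

    # Cleanse and aggregate: daily revenue per customer for completed orders only.
    daily_revenue = (
        orders
        .filter(F.col("status") == "COMPLETED")
        .withColumn("order_date", F.to_date("order_timestamp"))
        .groupBy("customer_id", "order_date")
        .agg(F.sum("amount").alias("daily_revenue"))
    )

    # Write the curated output back to the lake for downstream warehouse/BI loads.
    (daily_revenue.write
        .mode("overwrite")
        .partitionBy("order_date")
        .parquet("abfss://curated@<storage-account>.dfs.core.windows.net/daily_revenue/"))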
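
For the in-SQL transform requirement, a minimal sketch of a T-SQL stored procedure created and invoked from Python via pyodbc. The connection string, tables, and procedure are hypothetical; the point is that the set-based transform stays inside the database and Python only orchestrates it.

    # In-SQL transform sketch: the heavy lifting stays in T-SQL (hypothetical schema).
    import pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 18 for SQL Server};"
        "SERVER=<server>.database.windows.net;DATABASE=<database>;"
        "UID=<user>;PWD=<password>;Encrypt=yes;"
    )
    cursor = conn.cursor()

    # Hypothetical stored procedure that re-aggregates one day of staging rows.
    cursor.execute("""
    CREATE OR ALTER PROCEDURE dbo.usp_LoadDailyRevenue @LoadDate date
    AS
    BEGIN
        SET NOCOUNT ON;
        DELETE FROM dbo.DailyRevenue WHERE OrderDate = @LoadDate;
        INSERT INTO dbo.DailyRevenue (CustomerId, OrderDate, Revenue)
        SELECT CustomerId, CAST(OrderTimestamp AS date), SUM(Amount)
        FROM dbo.StagingOrders
        WHERE CAST(OrderTimestamp AS date) = @LoadDate
        GROUP BY CustomerId, CAST(OrderTimestamp AS date);
    END
    """)
    conn.commit()

    # A scheduler (for example a Data Factory stored-procedure activity) would call this per load date.
    cursor.execute("EXEC dbo.usp_LoadDailyRevenue @LoadDate = ?", "2024-10-01")
    conn.commit()
    conn.close()
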
About the recruiter
Member since May 20, 2018
Roopam Goenka
from Zhejiang, China

Skills & Expertise Required

Python, Oracle PL/SQL, Microsoft SQL Server, C#, Scala

Open for hiring. Apply before Oct 28, 2024.

Work from Anywhere

40 hrs / week

Type: Hourly

Remote Job

Cost: $26.78


Similar Projects

Looking for a detail-oriented engineer to integrate our digital health product with hospital systems

Electronic health record systems are notoriously complex and varied, so we need a careful engineer who is up for the challenge of ensuring a high quality of data collection. You will script data collection routines using our proprietary Python-based...read more

I need an expert with the Dialogic DSI stack.

I need to convert some DSI commands into a PHP or Python wrapper.

I need help with an assignment

Looking for an experienced Python professional to help me with a data exercise for an interview.

Generate CSR and save to Azure Key Vault (Private and Public)

Generate a CSR certificate and store it in Azure Key Vault.
C# code showing how to get the thumbprint from an Azure Key Vault Certificates object.
C# code showing how to get the encrypted private and public keys stored in an Azure Key Vault Key object.
Experience with X...read more

Writer for Java/Spring Tutorial Blog

We're looking for programmers to write high-quality articles on Java/Spring topics. We'd generally like to keep it focused on back-end and back-end infrastructure:

- Java (Java SE, Java EE)
- The Spring Framework
- Spring Boot
-...read more