Requirements:
* 5+ years of software development experience building backend applications that integrate with database systems.
* 3+ years of experience using public cloud-based data processing, analytics, and storage services such as Azure Data Factory, Azure Databricks, Azure Functions/Logic Apps, Azure SQL DB, Azure SQL Data Warehouse, Azure Search/Cognitive Services, and Azure Data Lake.
* Experience designing and implementing Azure Data Factory ETL/ELT pipelines: API/web-based data extraction and collector frameworks, job schedulers, and database-related transform and load workflows.
* Ability to perform advanced data transforms using Python/R/Scala-based analytics toolsets in Azure Databricks (a minimal sketch follows this list).
* Knowledge of modern continuous integration and deployment (CI/CD) practices for ETL frameworks such as Azure Data Factory and Azure Databricks.
* Ability to define data models, write stored procedures to perform in-SQL data transforms (see the stored-procedure sketch after this list), ensure data consistency across replicas, and tune database query performance for application use cases.
* Practice of documenting ETL-based solution designs, functional specifications, test cases, and user manuals as needed.
* Habit of making data-driven decisions, proving ideas through action, and staying open-minded and receptive to ideas from team members.
* Ability to collaborate effectively with remote team members across different time zones.
* Composure to work in a challenging, fast-paced, dynamic yet planned environment.
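As a rough illustration of the Databricks transform work mentioned above, here is a minimal PySpark sketch of a clean-and-aggregate job. The input path and the column names (sales_raw, order_ts, order_id, amount, region) are hypothetical placeholders for illustration, not part of this posting.

    # Minimal PySpark sketch of a Databricks-style transform (hypothetical data).
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("sales-daily-rollup").getOrCreate()

    # Read raw records from an assumed Data Lake location (Parquet assumed).
    raw = spark.read.parquet("/mnt/datalake/sales_raw")

    # Clean and aggregate: drop rows with no amount, roll up by day and region.
    daily = (
        raw.filter(F.col("amount").isNotNull())
           .withColumn("order_date", F.to_date("order_ts"))
           .groupBy("order_date", "region")
           .agg(F.sum("amount").alias("total_amount"),
                F.countDistinct("order_id").alias("order_count"))
    )

    # Write the result back for a downstream load into Azure SQL DB or the warehouse.
    daily.write.mode("overwrite").parquet("/mnt/datalake/sales_daily")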
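Similarly, for the in-SQL transform requirement, the sketch below shows one way to trigger a transform implemented as a stored procedure on Azure SQL DB from Python. The connection-string placeholders and the procedure name dbo.usp_RefreshDailySales are assumptions for illustration only.

    # Sketch: invoke an in-database transform via a stored procedure on Azure SQL DB.
    # Connection details and the procedure name are hypothetical placeholders.
    import pyodbc

    conn_str = (
        "Driver={ODBC Driver 18 for SQL Server};"
        "Server=tcp:<server>.database.windows.net,1433;"
        "Database=<database>;Uid=<user>;Pwd=<password>;Encrypt=yes;"
    )

    conn = pyodbc.connect(conn_str)
    try:
        cursor = conn.cursor()
        # Run the transform inside the database, then persist the result.
        cursor.execute("EXEC dbo.usp_RefreshDailySales @RunDate = ?", "2024-01-01")
        conn.commit()
    finally:
        conn.close()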
Language Expertise: Python, C#, Transact-SQL, PL/SQL, R, Scala
Databases: Azure SQL DB, Azure SQL Data Warehouse, Microsoft SQL Server
DB Migration/Analytics/BI Tools: Azure Data Factory, Azure Databricks, Power BI, Tableau
Operating Systems: Microsoft Windows Server, Linux
Any of the following skill sets in a candidate is a huge plus:
Public Cloud Certifications: Azure, AWS, GCP
Big Data + ML Technologies: Hadoop YARN, Spark, Kafka, Azure ML + TensorFlow
About the recruiter: Roopam Goenka, member since May 20, 2018, from Zhejiang, China.