Remote Network And System Administration Job In IT And Networking

Train me on the Hadoop ecosystem.


1. Sourcing data from multiple systems into different file formats.
2. Ingesting the extracted data into a Hadoop cluster (CSV, XML, JSON, fixed-width, etc.).
3. Using MapReduce and Pig to transform the data and load it into an MPP database.
4. Stitching all of the above into one process using schedulers (a rough sketch of steps 2 and 3 follows below).
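
As a purely illustrative sketch of steps 2 and 3 above, here is a minimal PySpark job that reads a CSV extract from HDFS, applies a simple transformation, and loads the result into an MPP target over JDBC. Spark appears in the required skills, though the post itself names MapReduce and Pig; every path, table name, credential, and the JDBC URL below is a placeholder, not something specified in the job.

    # Hypothetical PySpark sketch of steps 2 and 3: read a CSV extract from HDFS,
    # transform it, and load the result into an MPP database over JDBC.
    # All paths, table names, credentials, and the JDBC URL are placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("ingest_transform_load").getOrCreate()

    # Read the raw extract (CSV here; XML, JSON, or fixed-width files would use other readers).
    orders = (spark.read
              .option("header", "true")
              .option("inferSchema", "true")
              .csv("hdfs:///landing/orders/*.csv"))

    # A minimal transformation: fix types and aggregate totals per customer per day.
    daily_totals = (orders
                    .withColumn("order_ts", F.to_timestamp("order_ts"))
                    .groupBy("customer_id", F.to_date("order_ts").alias("order_date"))
                    .agg(F.sum("amount").alias("total_amount")))

    # Load into the MPP target through a generic JDBC sink
    # (requires the target database's JDBC driver on the classpath).
    (daily_totals.write
     .format("jdbc")
     .option("url", "jdbc:postgresql://mpp-host:5432/warehouse")
     .option("dbtable", "analytics.daily_totals")
     .option("user", "etl_user")
     .option("password", "***")
     .mode("append")
     .save())

    spark.stop()

In practice, step 4 would wrap a job like this in a scheduler such as Oozie, Airflow, or cron so that extract, ingest, transform, and load run end to end without manual steps.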


About the recruiter
Member since Mar 14, 2020
Credit Manageme
from Antioquia, Colombia

Skills & Expertise Required

Apache Flume, Apache Hive, Apache Spark, Hadoop, HBase

Open for hiring. Apply before: Sep 14, 2024

Work from Anywhere

40 hrs / week

Hourly Type

Remote Job

Cost: $19.14



Similar Projects

Data Engineer

We are looking for a freelancer who has proven experience in Data Engineering projects.

The requirements we are looking for:
- Experience with Python
- Experience with Big Data tools (e.g., Hadoop, Cassandra, Kafka)
- Experience wi...read more

Small Task on Python & Spark (PySpark)

Hi,
I need a small task, taking less than one hour, done using PySpark.
Add some Spark code to existing Python code.
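
The task itself is not spelled out, so purely as an illustration of what "adding Spark code to existing Python code" usually looks like, here is a minimal hypothetical sketch that hands the output of an existing Python function to a SparkSession as a DataFrame; the function, column names, and values are made up.

    # Hypothetical sketch: hand data from existing Python code to Spark.
    from pyspark.sql import SparkSession

    def existing_python_logic():
        # Stand-in for whatever the current script already produces.
        return [("a", 3), ("b", 7)]

    spark = SparkSession.builder.appName("small_pyspark_task").getOrCreate()

    # Wrap the plain Python output in a Spark DataFrame and run a simple aggregation.
    df = spark.createDataFrame(existing_python_logic(), ["user", "clicks"])
    df.groupBy("user").sum("clicks").show()

    spark.stop()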

Build Out an Advanced Analytics and BI Platform

I will share my use case and am looking to build out an advanced analytics and BI platform that is cost-effective yet viable and scalable, together with shortlisted candidates.

The current use case is to collect data from CRM...read more

Converting JSON or Avro files to Parquet

I need to convert JSON, Avro, or other row-based files in S3 into the Parquet columnar format using an AWS service like EMR or Glue.

I already have code that converts JSON to Parquet using Python, but the process is very manual, acco...read more
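
The posting says working Python code already exists, so the sketch below is only an illustration of the less manual, cluster-based approach it alludes to: a minimal PySpark job of the kind that could run on EMR, reading row-based JSON from S3 and rewriting it as Parquet. The bucket names, prefixes, and the partition column are placeholders; Avro input would additionally need the external spark-avro package.

    # Hypothetical PySpark job for EMR: convert JSON in S3 to Parquet.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("json_to_parquet").getOrCreate()

    # Placeholder source path; on EMR, s3:// URIs are readable via EMRFS.
    events = spark.read.json("s3://my-source-bucket/raw/events/")

    # Placeholder target path; partitionBy assumes an event_date column exists.
    (events.write
     .mode("overwrite")
     .partitionBy("event_date")
     .parquet("s3://my-target-bucket/curated/events/"))

    spark.stop()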