Part-time developer required for Big Data/Hadoop who can build data pipelines or ETL using Kafka, Flume, Sqoop.
₹24000-25000 INR
Paid on delivery
We require a single dedicated part-time developer with the following skills:
At least 3 years of experience working with the Hadoop ecosystem and big data technologies
Build data pipelines and ETL from heterogeneous sources into Hadoop using Kafka, Flume, Sqoop, Spark Streaming, etc.
Experience in batch (Spark, Scala) or real-time data streaming (Kafka)
Ability to adapt to conventional big-data frameworks and open-source tools as the project demands
Knowledge of design strategies for developing scalable, resilient, always-on data lakes
Experience in Agile (Scrum) development methodology
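For context, the kind of pipeline described above follows an extract-transform-load shape. The sketch below is a minimal, stdlib-only illustration of that shape with hypothetical in-memory stages; in a real deployment, the extract stage would consume from Kafka/Flume/Sqoop and the load stage would write to HDFS or Hive, typically via Spark.

```python
# Minimal in-memory sketch of the extract -> transform -> load shape of an
# ETL pipeline. All stage names and fields here are hypothetical stand-ins:
# in production, "extract" would consume from Kafka/Flume/Sqoop and "load"
# would write to HDFS/Hive, with Spark handling the heavy lifting.

import json

def extract(raw_records):
    """Parse raw JSON lines from a (hypothetical) source stream."""
    for line in raw_records:
        yield json.loads(line)

def transform(records):
    """Drop malformed events and normalise the fields we keep."""
    for rec in records:
        if "user" in rec and "amount" in rec:
            yield {"user": rec["user"].lower(), "amount": float(rec["amount"])}

def load(records, sink):
    """Append cleaned records to the (hypothetical) sink."""
    for rec in records:
        sink.append(rec)

raw = ['{"user": "Alice", "amount": "10.5"}', '{"user": "Bob"}']
sink = []
load(transform(extract(raw)), sink)
print(sink)  # [{'user': 'alice', 'amount': 10.5}]
```

The second record is dropped because it lacks an `amount` field, which is the kind of validation a transform stage typically performs before data lands in the lake.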
The developer must be willing to work IST mornings between 7 and 11 am, and be ready to work remotely, e.g., via screen sharing.
I need daily support for a minimum of 3 to 4 hours on average, on a monthly basis.
This is a fixed monthly-paid project, so the mentioned budget is for 30 days.
Bids will not be accepted for less than 30 days or for more than the mentioned budget.
Project ID: #18362185
About the project
15 freelancers are bidding on average ₹28200 for this job
Greetings! Having gone through the requirement, we figured out that you are looking for Big Data developers for your project. We are a team of big data developers with 2-3 years of experience in developing big data ba…
I am working as a software developer with 10+ years of experience. My industry experience includes Big Data, Backend, and DevOps (AWS and Azure), etc. I did my MTech in the Department of Computer Science Engineering, India…
I have 5+ years of experience in Java, Hadoop, Spark, and Scala. I have mostly worked on real-time applications using Spark, Kafka, and HBase. I am also experienced in the Apache NiFi tool, Hive, Oozie, and Falcon. As per your description, you…
Hello! I am a data scientist with over 5 years of experience in using Python for data mining, analysis, visualization, and modeling. I can provide intuitive Python and Scala code for different analytical tasks. I look forward…
Hi, I can do this for you. Please let me know the requirements if you are interested. Thanks and regards.
I have over 18 years of software development experience in telecom and data analytics using Kafka, Cassandra, and Spark. Experience in C, Java, and Python.
I am working on data pipelines, which matches the current requirement. Please ping me if you want to know more about the development.
Hello! I have 5+ years of experience as a Hadoop developer. I can build the workflow you mentioned. Let's discuss!
1) I have 3.5 years of experience writing Spark applications in Java and Scala. 2) I have a deep understanding of Spark architecture and have worked on all versions from 1.6 and above (currently working on 2.3). 3) Skille…
Let me know about the project; I can help you out. I have around 5.5 years of experience with Spark and big data, as well as with Python and Java.
I am a senior developer in Java/J2EE, Scala/Akka/Play, Groovy & Grails, and big data technologies such as Hadoop, HBase, HDFS, Spark, Hive, Kafka, NiFi, Flume, and Storm, with over 14 years of experience in this technology stack…
I have a team with good knowledge of and experience in big data. I have read your description and agree to the conditions.