LinkedIn: [login to view URL]
I am a Senior Big Data Developer/ Data Engineer, specialized in designing and building scalable data pipelines to collect, parse, clean and transform data from multiple source systems and generate high-quality data sets for advanced analytics, dashboards, alerts, and visualizations.
I have 6.5 years of experience in Data Engineering and Big Data technologies, both on-premises and on the Azure cloud.
I love to connect the dots and am passionate about learning anything that deals with data.
• Experience in data feasibility and data quality analysis, business requirements gathering, prioritizing features for development, and user acceptance testing.
• Experienced in building data pipelines to ingest and transform semi-structured data (CSV, JSON, XML, and production log files) and structured data (Oracle, MySQL) using a range of Big Data technologies.
• Experience migrating existing on-premises systems and applications to the Azure cloud.
• Experience leading end-to-end data efforts for a digital project, and in mentoring junior and vendor-team developers.
Skillset:
• Data Processing and Storage Technologies: Apache Spark, Python, SQL, Databricks, Azure Data Factory, ADLS, Cloudera, Hadoop, Hive, Impala, Pentaho DI
• Programming Languages: Python, Scala, Unix Shell Scripting
• Databases: MySQL, Oracle, SQL Server
• Tools: Git, JIRA, Postman, Control-M, Crontab, Tableau
• Project Domains: Telco, Banking, Mining, Manufacturing, Shipping and logistics
Certifications:
• Cloudera Spark and Hadoop Developer (CCA175) - December 2018
• Microsoft Azure Fundamentals (AZ-900) - October 2020
I believe that we can achieve more on a team — that the whole is greater than the sum of its parts. I rely on others' candid feedback for continuous improvement.
Stack Overflow: [login to view URL]