Senior Data Engineer

Salary undisclosed

Key Responsibilities:

  • Design, develop, document, and implement data pipelines and integration processes, including ETL/ELT jobs and workflows.
  • Perform data analysis, profiling, cleansing, lineage, mapping, and transformation to meet business requirements.
  • Monitor and optimize ETL/ELT processes for enhanced data quality, efficiency, and reliability.
  • Recommend and implement best practices for data management, coding standards, error handling, auditing, and data archiving.
  • Prepare test data and assist in creating and executing test plans, cases, and scripts.
  • Collaborate with Data Architects, Data Modelers, IT teams, SMEs, and stakeholders to gather requirements and deliver data solutions aligned with business goals.
  • Provide BAU support for data issues and change requests, documenting investigations and resolutions.

Skills & Experience:

  • Bachelor's degree in IT, Computer Science, or Engineering.
  • 3–5 years of hands-on experience with Big Data technologies, including Azure and AWS Big Data solutions, Hadoop, Hive, HBase, Spark, Sqoop, Kafka, and Spark Streaming.
  • Proven experience in data pipeline development, ETL/ELT processes, and data transformation.
  • Strong knowledge of data analysis, profiling, cleansing, and mapping techniques.
  • Expertise in optimizing data workflows and implementing best practices for data lifecycle management.
  • Ability to prepare and execute test plans and scripts with attention to detail.
  • Strong collaboration and communication skills to work effectively with cross-functional teams.
