Data Analyst/Engineer

Salary undisclosed

Location: Kuala Lumpur

Employment: 12-Month Contract, Renewable / Convertible to Permanent with SQ based on performance

Industry: Insurance

*Local candidates preferred

Main Responsibilities:

Data Pipeline Development

  • Collaborate with the Solution Architect on data pipeline design
  • Develop data pipelines based on the proposed solution (see the sketch after this list)
  • Manage and maintain data pipelines
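
For illustration only, and not part of the original posting: a minimal PySpark sketch of the kind of pipeline step described above, reading raw data from S3, applying a simple cleanup, and writing to a curated zone. The paths, column names, and app name are assumptions.

```python
# Hypothetical pipeline step: raw S3 data -> basic cleanup -> curated S3 zone.
# All paths, column names, and the app name are placeholders, not from the posting.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("policy_daily_load").getOrCreate()

raw = spark.read.parquet("s3://example-raw-zone/policies/")   # assumed source path

curated = (
    raw.filter(F.col("status").isNotNull())        # drop incomplete records
       .dropDuplicates(["policy_id"])              # keep one row per policy
       .withColumn("load_date", F.current_date())  # audit column used for partitioning
)

(curated.write
        .mode("overwrite")
        .partitionBy("load_date")
        .parquet("s3://example-curated-zone/policies/"))      # assumed target path
```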

Data Modelling

  • Create and maintain data models, including entity-relationship diagrams, to represent data structures and relationships
  • Design database schemas that optimize data storage, retrieval, and query performance (a minimal sketch follows this list)
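
Purely as a sketch (table and column names are assumptions, not from the posting), the Spark SQL DDL below shows the sort of fact/dimension relationship an entity-relationship diagram for this role might capture:

```python
# Illustrative star-schema fragment: one dimension table, one fact table.
# Names, types, and partitioning are assumptions for the sketch.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("claims_model_ddl").getOrCreate()

spark.sql("""
    CREATE TABLE IF NOT EXISTS dim_policyholder (
        policyholder_id BIGINT,
        full_name       STRING,
        date_of_birth   DATE
    ) USING parquet
""")

spark.sql("""
    CREATE TABLE IF NOT EXISTS fact_claim (
        claim_id        BIGINT,
        policyholder_id BIGINT,          -- joins to dim_policyholder
        claim_amount    DECIMAL(12, 2),
        claim_date      DATE
    ) USING parquet
    PARTITIONED BY (claim_date)
""")
```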

BAU Tasks

  • Understand and translate business requirements into technical requirements
  • Develop/Enhance SAP reports based on business requirements
  • Develop/Enhance Dashboards based on business requirements
  • Fix bugs whenever detected
  • Deploy changes via GitHub
  • Create documentation for data pipelines, data models and challenges faced in BAU tasks

Key Result Areas

  • Design and optimize ETL/ELT architectures and performance
  • Develop efficient and scalable data processing workflows
  • Monitor data pipelines, detect and fix issues (see the sketch after this list)
  • Improve query performance
  • Optimize database storage
  • Develop reports, dashboards and visualizations that effectively communicate data-driven insights
  • Ensure reports are delivered on schedule and in a timely manner
  • Deliver reports to the business team
  • Ensure data accuracy and consistency
  • Knowledge sharing among team members
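
As a hedged illustration of the pipeline-monitoring result area above (the Glue job name is a placeholder, and alerting is out of scope), one simple check could poll the latest AWS Glue job run and flag failures:

```python
# Sketch only: poll the most recent run of an AWS Glue job and flag failures.
# The job name below is hypothetical, not taken from the posting.
import boto3

glue = boto3.client("glue")

def latest_run_failed(job_name: str) -> bool:
    """Return True if the most recent run of the given Glue job did not succeed."""
    runs = glue.get_job_runs(JobName=job_name, MaxResults=1).get("JobRuns", [])
    return bool(runs) and runs[0]["JobRunState"] in ("FAILED", "ERROR", "TIMEOUT")

if latest_run_failed("example-policy-daily-load"):   # hypothetical job name
    print("Latest pipeline run failed - investigate, fix, and redeploy via GitHub.")
```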

Qualification and Experience Requirements:

  • Education: Bachelor’s degree in Computer Science, Engineering, or a related field; Master’s degree preferred
  • Experience: 3 years of hands-on experience in data integration and analytics in a corporate or consulting setting
  • Design, develop and maintain efficient data pipelines to support data processing, transformation and integration needs.
  • Work with various data sources and formats, ensuring data quality and consistency.
  • Monitor and troubleshoot data pipelines to ensure consistency of data flow.
  • Handle Management Information Systems (MIS) including data collection, analysis and reporting for decision-making purposes.
  • Able to build reports using BI tools (SAP, Power BI, or other reputable BI tools)
  • Hands-on experience with AWS services (AWS Redshift, AWS Glue, AWS S3, AWS Lambda, AWS RDS, AWS AuroraDB, AWS API Gateway); a sketch of how some of these fit together follows this list
  • Coding languages and tools: Python, PySpark, SQL, Linux
  • Other skills: ETL/ELT skills, Architecture Modelling, Data Modelling, Dashboarding, Storytelling, Documentation
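
To show how some of the listed AWS services commonly fit together (this is not a requirement from the posting, and every name below is a placeholder), here is a minimal AWS Lambda handler that starts a Glue job when a new file lands in S3:

```python
# Hypothetical wiring of S3 + Lambda + Glue: an S3 put event triggers this
# Lambda, which starts a Glue job for the newly arrived object.
import boto3

glue = boto3.client("glue")

def lambda_handler(event, context):
    # S3 event notification: pull out the bucket and key that triggered us
    s3_info = event["Records"][0]["s3"]
    bucket = s3_info["bucket"]["name"]
    key = s3_info["object"]["key"]

    # Kick off the downstream Glue job, passing the new object as a job argument
    run = glue.start_job_run(
        JobName="example-policy-daily-load",               # placeholder job name
        Arguments={"--input_path": f"s3://{bucket}/{key}"},
    )
    return {"statusCode": 200, "body": run["JobRunId"]}
```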