Data Engineer

RM 4,000 – RM 7,999 per month

Job Summary:

We are seeking a skilled and experienced Data Engineer to join our dynamic team. The Data Engineer will be responsible for designing, developing, and maintaining our data infrastructure, ensuring efficient data flow, and supporting analytics and data science initiatives. The ideal candidate will have a strong grasp of data architecture, data warehousing, and ETL processes, along with a passion for optimizing and automating data pipelines to keep the organization's data assets accessible and reliable.

Key Responsibilities:

  • Data Pipeline Development: Design, develop, and maintain scalable data pipelines to ingest, process, and store data from various sources into data warehouses, data lakes, or other storage solutions.
  • ETL Processes: Build and optimize Extract, Transform, Load (ETL) processes to ensure data accuracy and integrity across multiple systems.
  • Data Infrastructure: Implement and maintain data architectures, such as data warehouses and data lakes, ensuring they are secure, scalable, and efficient.
  • Collaboration: Work closely with data scientists, analysts, and other engineering teams to understand data needs, develop solutions, and support analytics and machine learning efforts.
  • Data Quality & Governance: Ensure data is clean, accurate, and adheres to governance standards, performing regular audits and implementing data quality checks.
  • Performance Optimization: Identify and implement performance improvements for existing data pipelines and data structures.
  • Monitoring & Troubleshooting: Proactively monitor data pipelines and systems to ensure reliable operation, troubleshooting and resolving issues as they arise.
  • Documentation: Maintain up-to-date documentation of data architecture, data flows, and processes to support team knowledge and onboarding of new team members.

Qualifications:

  • Education: Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related field.
  • Experience: 2+ years of experience in data engineering, with a strong focus on data architecture, ETL, and database management.
  • Technical Skills:
    • Proficiency in programming languages such as Python, Java, or Scala.
    • Experience with SQL and working with relational databases like PostgreSQL, MySQL, or SQL Server.
    • Familiarity with big data technologies such as Hadoop, Spark, or Kafka.
    • Experience with cloud platforms such as AWS, Google Cloud Platform (GCP), or Microsoft Azure for data storage and processing.
    • Knowledge of data warehousing solutions like Snowflake, Redshift, or BigQuery.
  • Data Modeling: Strong understanding of data modeling and data architecture principles.
  • ETL Tools: Experience with ETL tools such as Apache NiFi, Talend, Informatica, or Airflow.
  • Version Control: Proficient in using Git for version control and collaboration.
  • Problem-Solving Skills: Strong analytical and problem-solving skills with a passion for optimizing data solutions.
  • Communication: Excellent verbal and written communication skills, with the ability to explain technical concepts to non-technical stakeholders.