APAC Data Architect

  • Full Time, onsite
  • Medline Industries, LP
  • Wilayah Persekutuan Kuala Lumpur, Malaysia
Salary undisclosed

ABOUT THE ROLE

The APAC Data Architect plays a critical role in designing and implementing data solutions across the APAC region. You will be responsible for creating and maintaining a robust data architecture that supports data integration, analytics, and business intelligence across the Azure landscape.

RESPONSIBILITIES

  • Data Architecture Strategy and Planning: Collaborate with stakeholders to define data architecture standards, principles, and guidelines. Develop a data strategy that ensures scalability, security, and performance. Create a roadmap for data architecture enhancements and improvements.
  • Data Architecture Design: Develop and maintain end-to-end Azure data architecture, adhering to data-related policies and standards. Analyse system integration challenges and propose effective solutions.
  • Data Modeling and Design: Design and maintain conceptual, logical, and physical data models. Ensure data consistency, integrity, and quality. Evaluate and select appropriate data storage technologies (e.g., Azure SQL Database, Synapse Analytics, Cosmos DB).
  • Data Integration and ETL: Architect and implement data pipelines for ingesting, transforming, and loading data from various sources. Optimise ETL processes for efficiency and reliability.
  • Security and Compliance: Define data access controls, encryption, and authentication mechanisms. Ensure compliance with data privacy regulations (e.g., GDPR, PDPA, and other APAC data protection laws).
  • Collaboration: Work closely with data scientists, data analysts, and business stakeholders to understand data requirements. Translate business needs into technical solutions. Collaborate with other IT team members to support data initiatives and maintain a consistent, high-quality data delivery architecture across projects.
  • Data Engineering and Operations: Design, build, and maintain scalable data pipelines for batch and real-time data processing. Implement ETL processes to transform raw data into usable formats for analysis.
  • Data Analytics: Perform data analysis to extract insights and support business decision-making. Create reports and dashboards to visualize data trends and patterns. Collaborate with data analysts and business stakeholders to understand data needs and provide actionable insights.
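The ETL responsibilities above can be sketched in miniature in plain Python. This is a library-agnostic illustration of the extract, transform, and load stages; the "orders" source and its field names are hypothetical, and a production pipeline would use a service such as Azure Data Factory or Databricks instead:

```python
# Minimal extract-transform-load sketch using only the standard library.
# Source data and column names are illustrative assumptions.

def extract():
    # In practice this stage reads from a source system (file, API, database).
    return [
        {"order_id": "1001", "amount": "250.00", "country": "MY"},
        {"order_id": "1002", "amount": "99.90", "country": "SG"},
        {"order_id": "1003", "amount": "bad", "country": "MY"},
    ]

def transform(rows):
    # Cast types and drop rows that fail validation.
    clean = []
    for row in rows:
        try:
            clean.append({
                "order_id": int(row["order_id"]),
                "amount": float(row["amount"]),
                "country": row["country"],
            })
        except ValueError:
            continue  # a real pipeline would route rejects to a quarantine path
    return clean

def load(rows, target):
    # Idempotent upsert keyed on order_id, so reruns do not duplicate rows.
    for row in rows:
        target[row["order_id"]] = row
    return target

warehouse = {}
load(transform(extract()), warehouse)
```

The same shape (validate during transform, upsert during load) carries over directly to pipelines built on Spark or Azure Data Factory.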

REQUIREMENTS

  • Bachelor’s degree in Computer Science, Information Systems, or a related field.
  • 3 to 5 years of experience in data architecture, data modeling, database design, and data engineering.
  • Solid grasp of data warehousing concepts, including star schema, snowflake schema, fact tables, and dimension tables. Must understand how to model data, create tables, manage partitions, and optimise performance within Synapse Analytics.
  • Architectural Skills: Ability to design scalable, distributed data systems using modern cloud-based platforms. Knowledge of microservices architecture and containerization (e.g., Docker, Kubernetes).
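As an illustration of the star-schema concepts listed above (fact and dimension tables joined by surrogate keys), here is a minimal sketch in plain Python; all table and column names are hypothetical, and in Synapse Analytics these would be distributed, partitioned tables queried with SQL:

```python
# Star schema in miniature: one fact table referencing two dimension tables
# via surrogate keys. Names are illustrative assumptions only.

dim_product = {
    1: {"product_key": 1, "name": "Gauze", "category": "Wound Care"},
    2: {"product_key": 2, "name": "Gloves", "category": "PPE"},
}

dim_date = {
    20240101: {"date_key": 20240101, "year": 2024, "month": 1},
}

fact_sales = [
    {"date_key": 20240101, "product_key": 1, "qty": 10, "amount": 50.0},
    {"date_key": 20240101, "product_key": 2, "qty": 5, "amount": 20.0},
]

def sales_by_category(facts, products):
    # A typical star-schema query: join the fact table to a dimension
    # on the surrogate key, then aggregate a measure by an attribute.
    totals = {}
    for f in facts:
        cat = products[f["product_key"]]["category"]
        totals[cat] = totals.get(cat, 0.0) + f["amount"]
    return totals
```

A snowflake schema would further normalize the dimensions (e.g., splitting category into its own table), trading simpler storage for extra joins.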

Technical Skills:

  • Azure Data Services: Proficiency in using Azure services such as Azure Data Factory, Azure Synapse Analytics, Azure Stream Analytics, Azure Event Hubs, Azure Data Lake Storage, and Azure Databricks.
  • Data Processing Languages: Solid knowledge of SQL, Python, and Scala for data processing tasks.
  • Big Data Tools: Experience with Apache Spark for large-scale data processing.
  • ETL (Extract, Transform, Load): Ability to design and build ETL pipelines.
  • Data Modeling: Proficiency in designing and maintaining data models.
  • Security and Compliance: Knowledge in implementing data security measures and ensuring compliance with regulations.

Please note only shortlisted candidates will be contacted.