Data Architect
Salary undisclosed
Job Title: Data Architect
Job Type: Payroll and Contract through Accord Innovations Sdn Bhd
Duration: 12 months (extendable/renewable)
Location: Work From Home (WFH)
Project: Aviation
Key Responsibilities
- Research and evaluate sources of information to assess reliability and usability.
- Identify, design, and assist teams in implementing internal process improvements.
- Identify opportunities for automating manual processes, optimizing data delivery, and redesigning infrastructure for scalability.
- Design the infrastructure for optimal extraction, transformation, and loading (ETL) of data from diverse sources using SQL and big data technologies.
- Prepare detailed reports for management and other departments by analyzing and interpreting data.
- Train team members on organizing findings and effectively interpreting collected data.
- Develop and enhance software and applications by writing and updating computer code in various programming languages.
- Utilize previous findings to determine the best methods for data collection.
Requirements
- Bachelor’s or Master’s degree in Mathematics, Statistics, Computer Science, or a related field.
- Strong mathematical and analytical skills.
- Experience handling private and sensitive personal information.
- Confident decision-maker with the ability to explain processes and choices clearly.
- Strong computer skills and proficiency in using relevant databases and software.
- Excellent interpersonal and customer service skills for client interactions.
- Outstanding multitasking abilities and effective task management strategies.
- Proven ability to meet multiple deadlines and complete milestones simultaneously.
- Minimum 5 years of experience as a Data Architect, specializing in designing optimized data pipelines, data lakes, and business intelligence (BI) solutions.
- 7-10 years of experience in building optimized big data pipelines.
- Proficiency with:
- Data Lake/Data Mesh/Data Fabric architectures.
- At least two cloud platforms (Azure, AWS, GCP, or Private).
- Architecting solutions for hybrid environments.
- Building cloud-agnostic systems.
- Big data tools (Hadoop, Spark, Kafka, BigQuery, etc.).
- Relational SQL and NoSQL databases (Postgres, Cassandra, MongoDB, Cosmos DB, Neo4j).
- Data pipeline and workflow management tools (ADF, AWS Glue, Talend, Airflow, Azkaban, Luigi, etc.).
- Azure cloud services (Data Lake, Data Factory, etc.).
  - Stream-processing systems (Kafka Streams, Kafka Connect, KSQL, Storm, Spark Streaming, etc.).
- Flexible and willing to take on new roles and responsibilities as directed by the company.