Data Architect
Job Title: Data Architect
Duration: One year extendable contract
Location: Kuala Lumpur, Malaysia
Work Type: Remote
Overall years of experience: 5
Relevant years of experience: 7-10
Skills: Please refer to the below JD
Mandatory skills :
- Designing data pipelines, data lakes, and BI solutions (5+ years)
- Building big data pipelines (7-10 years)
- Data Lake/Data Mesh/Data Fabric architectures
- Cloud platforms: Azure/AWS/GCP/Private Cloud (experience with at least 2)
- Hybrid environment and cloud-agnostic solutions
- Big data tools: Hadoop, Spark, Kafka, BigQuery, etc.
- Relational SQL and NoSQL databases: Postgres, Cassandra, MongoDB, Cosmos DB, Neo4j
- Data pipeline and workflow tools: ADF, AWS Glue, Talend, Airflow, Azkaban, Luigi, etc.
- Azure cloud services: Data Lake, Data Factory, etc.
- Stream-processing systems: Kafka Streams, KSQL, Storm, Spark Streaming
KEY RESPONSIBILITIES:
• Research and evaluate sources of information to determine possible limitations in reliability or usability
• Identify, design, and help teams implement internal process improvements.
• Identify scope for automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
• Design the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and ‘big data’ technologies.
• Prepare detailed reports for management and other departments by analyzing and interpreting data
• Train assistants and other team members to organize findings and interpret collected data
• Write code in various languages to improve and update software and applications
• Refer to previous instances and findings to determine the best method for gathering data
QUALIFICATIONS:
• Bachelor's or master's degree in mathematics, statistics, computer science, or a related field
• Strong math and analytical skills are essential to complete job requirements successfully
• Experience working with private and sensitive personal information
• Confident in decision making, with the ability to explain processes or choices as needed
• Strong computer skills and ability to use necessary databases and software
• Interpersonal and customer service skills are required when meeting with and interviewing potential clients
• Excellent multitasking skills and task management strategies
• Ability to complete milestones and work toward multiple deadlines simultaneously
EXPERIENCE:
• At least 5 years of experience as a Data Architect designing optimized data pipelines, data lakes, and BI solutions
• At least 7-10 years of experience building optimized big data pipelines
• Candidates should also have experience with the following software/tools:
o Experience building Data Lake/Data Mesh/Data Fabric architectures
o Experience with at least two cloud platforms: Azure/AWS/GCP/Private Cloud
o Experience architecting solutions for hybrid environments
o Experience building cloud-agnostic solutions
o Experience with big data tools: Hadoop, Spark, Kafka, BigQuery, etc.
o Experience with relational SQL and NoSQL databases, including Postgres, Cassandra, MongoDB, Cosmos DB, and Neo4j
o Experience with data pipeline and workflow management tools such as ADF, AWS Glue, Talend, Airflow, Azkaban, Luigi, etc.
o Experience with Azure cloud services: Data Lake, Data Factory, etc.
o Experience with stream-processing systems: Kafka Streams, Kafka Connect, KSQL, Storm, Spark Streaming, etc.
o Flexible and able to take on new roles and responsibilities as directed by the company