PRIMARY OBJECTIVE
- Serve as a subject matter expert in the field of data science.
- Provide support to data scientists on machine learning, AI, and other advanced analytics.
- Continuously share and transfer knowledge of best practices and techniques in the data science space.
REQUIREMENTS (Qualification/Experience/Skills)
- Bachelor's, Master's, or PhD degree in a quantitative field such as Computer Science, Economics, Finance, Mathematics, or Statistics, or equivalent.
- Python
- Machine learning
- SAS
- Advanced statistical knowledge
- Dataiku
- Gen AI (LLMs, knowledge libraries)
- MLOps
KEY RESPONSIBILITIES
- Lead complex data mining and extraction/transformation projects.
- Utilize advanced machine learning tools for model building to support decision-making and drive use cases.
- Contribute to the development of Gen AI use cases.
- Contribute to the development of best practices in data science.
- Develop insights from data patterns to support decision-making and drive implementation.
- Contribute to the development of projects using a data-driven approach.
- Provide thought leadership on emerging trends in data science.
- Mentor and guide data scientists in analytics projects as well as in other data science activities.
- Collaborate with peers to share expertise and insights.
- Support data scientists in regular review sessions with Retail, SME, and Commercial Banking stakeholders to ensure alignment on data solutions, deliverables, and standards against business challenges.
- Contribute to the development of the data science strategy.
- Formulate an MLOps framework for the department.
Senior Data Engineer
PRIMARY OBJECTIVES
- Provide technical leadership and guidance, and act as the de facto point of reference for data engineering and analytical solutions, especially Data Products.
- Develop Data Engineering solutions in line with the latest technologies and best practices.
- Drive knowledge journeys and facilitate communication on Data Engineering, DataOps, and MLOps across units.
REQUIREMENTS (Qualification/Experience/Skills)
- Bachelor's, Master's, or PhD in Computer Science, Data Engineering, or equivalent.
- Operational knowledge of data engineering architectures and implementation; well versed in the Medallion Architecture concept.
- Minimum 5–7 years of experience and domain knowledge in banking and insurance.
- Minimum 3–4 years of experience with Apache Hive, HBase, Solr, and Kafka.
- Minimum 1–2 years of experience implementing Apache Iceberg and migrating from Apache Hive to Iceberg.
- Operational experience in Data Mesh and Data Product conceptualization and development.
KEY RESPONSIBILITIES
- Designing and evolving the overall data architecture, ensuring scalability, flexibility, and alignment with business goals.
- Assessing and integrating third-party solutions into the data architecture.
- Optimizing end-to-end data pipelines for maximum efficiency and performance.
- Implementing advanced caching, parallel processing, and optimization techniques.
- Establishing and enforcing security protocols to protect sensitive data.
- Ensuring compliance with data privacy regulations and industry standards.