Job Summary
The Data Engineer - Digital Banking is responsible for designing, developing, and managing
robust data pipelines and systems to support KAF Digital Bank’s digital transformation
initiatives. This role ensures the seamless integration and transformation of data across
diverse platforms while leveraging cloud technologies and advanced data architectures.
With expertise in the Azure ecosystem, Snowflake, and big data technologies, the Data
Engineer contributes to the efficient and secure management of data, supporting analytics
and operational excellence in a dynamic digital banking environment.
Key Responsibilities
Data Integration and Transformation:
Lead the integration of structured and unstructured data from multiple sources, ensuring quality, consistency, and availability.
Develop and manage end-to-end data workflows using Azure Data Factory, Talend, or equivalent ETL platforms.
Cloud and Platform Expertise:
Design, implement, and optimize data engineering solutions using Azure tools such as Data Factory, API Management, and Azure SQL.
Manage modern cloud data warehousing solutions such as Snowflake and/or Microsoft Fabric for high-performance analytics.
Data Architecture and Management:
Develop scalable, reliable, and secure data architectures following the medallion (bronze, silver, and gold) layers for data storage and processing.
Optimize complex SQL queries for efficient data analysis and processing.
Big Data and Distributed Systems:
Use big data tools such as Apache Spark and manage large-scale databases to process massive datasets effectively.
CI/CD and Workflow Automation:
Create and maintain CI/CD pipelines to streamline deployments and ensure smooth data engineering workflows.
Data Governance and Compliance:
Establish and maintain robust data governance processes to ensure data quality, consistency, and adherence to organizational policies.
Collaboration and Stakeholder Engagement:
Collaborate with Product, Finance, and Operations teams to gather requirements and deliver actionable data solutions.
Qualifications and Requirements
Experience:
3+ years as a Data Engineer, preferably in banking or financial services.
Technical Expertise:
Proficiency in Azure tools (Data Factory, API Management) and modern data warehousing (Snowflake, Fabric).
Strong skills in integrating data across multiple systems and platforms.
Advanced SQL programming and familiarity with Spark for big data management.
Proficiency in Python or Scala for data manipulation and transformation.
Data Governance:
Experience in establishing data governance frameworks and processes.
Preferred Skills:
Familiarity with core banking systems such as Temenos T24, Oracle FLEXCUBE, or Infosys Finacle.
Knowledge of data visualization tools like Power BI or Tableau.
Why Join Us?
Work in a fast-paced, innovative digital banking environment.
Access cutting-edge tools and technologies like Snowflake, Fabric, and the Azure ecosystem.
Competitive compensation and benefits with opportunities for growth and collaboration.