Data Engineer - ETL

Location:
Department: Manufacturing IT, Business Intelligence (BI) Domain

Experience Level & Education
• Minimum 5 years of IT working experience, specifically in Data Engineering and ETL-related work.
• Undergraduate or master's degree in Computer Science, Information Technology, Engineering, or a related field.

Main Purpose & Job Scope
• Support Assembly & Test (Assy & Test) manufacturing data analytics operations and projects.
• Deliver ETL (Extract-Transform-Load) solutions using data from manufacturing, enterprise, and other source systems/applications in Hadoop and cloud environments.

Key Responsibilities
• Design, develop, and deploy ETL solutions across BI and Big Data Analytics projects.
• Build and maintain automated data feeds from various source applications to target systems.
• Perform data integration tasks, including administration, optimization, and performance evaluation.
• Support production portfolios and troubleshoot and resolve issues.
• Participate in the full software development lifecycle: requirement gathering, design, development, testing, deployment, and documentation.
• Lead and manage User Acceptance Testing (UAT).
• Collaborate with DBAs, infrastructure, and BI teams to drive improvements in data governance and methodologies.
• Demonstrate analytical thinking, scripting skills, and a passion for data-driven decision-making.
• Stay current with new technologies and trends in data analytics.
• Training will be provided as needed.

Required Qualifications
• Strong experience in Business Intelligence or Data Analytics projects.
• Data integration, ETL, data mining, and analytics experience is a MUST; skilled in manipulating and analyzing complex, high-volume, high-dimensionality datasets from multiple sources.
• Expertise in SQL and experience with databases such as MSSQL, Teradata, or Oracle.
• Development experience with, or a good understanding of, Hadoop ecosystem tools such as Apache NiFi, Hive, Spark, Airflow, and Kafka is an added advantage.
• Experience with cloud-based ETL solutions (AWS/GCP/Azure) is highly preferred. An equivalent cloud computing certification is a plus, though not required.
• Experience with AWS services such as S3, Lambda, Glue, DynamoDB (NoSQL), and Athena (data querying/analysis), as well as the AWS Cloud Development Kit (for IaC), is advantageous.
• Other preferred skills: Python, PySpark, Java, JavaScript, TypeScript, Groovy, Shell scripting.
• Databricks experience, especially with the Delta Lake and Unity Catalog frameworks, is valuable but not essential.
• Experience with version control systems (e.g., Git, GitHub, AWS CodeCommit) and CI/CD tools (e.g., Azure DevOps, AWS CodePipeline, GitHub Actions) is an added advantage.
• Familiarity with Business Intelligence tools such as Tableau, Power BI, and Power Query is beneficial.
• Strong written and verbal communication skills for collaborating across global teams are highly desired.
• Background in quantitative analysis, statistics, math, and data modeling is a plus.
• Prior experience in a manufacturing domain is advantageous.
• Familiarity with Agile methodologies and project management practices is beneficial.
• Comfortable working in cross-functional, virtual teams.