NXP Semiconductors - Data Engineer, Contract Role (1-year contract)
Experience level: Minimum 5 years of IT working experience, especially in Data Engineering / ETL-related work.
Education required: Undergraduate or Master's degree.
Main purpose and scope of the job
Purpose: Support Assembly & Test manufacturing data analytics operations and projects.
Scope: Deliver reporting ETL (extract-transform-load) solutions using data elements from Manufacturing, Enterprise, and other source systems/applications in Hadoop and cloud environments.
Responsibilities
The individual will be responsible for providing and deploying ETL (extract-transform-load) solutions, and will be involved in various BI and Big Data analytics projects.
The individual will also be responsible for building the automated data feed (ETL/ELT) mechanisms that transfer data from various source applications to target applications. Perform data integration administration and monitor, evaluate, and optimize the performance of these feeds.
Support the existing production portfolio and troubleshoot issues, providing fixes and solutions.
Participate in all aspects of the software development life cycle (SDLC), including requirements gathering, workflow architecture design, development, testing, demos, training, and documentation. Plan, prepare, and lead User Acceptance Testing (UAT) for releases and ensure testing completeness.
The individual should demonstrate high energy and a strong appetite for data mining, scripting, problem solving, and data analysis.
Work collaboratively with DBA, infrastructure, network, and data enablement teams, as well as application business analysts.
Collaborate on improvements to data governance, methodologies, and processes. Demonstrate a deep interest in and understanding of big data, data modelling, data structures, and data catalogues, and know how to manipulate data efficiently. Be quick to formulate quality, feasible, and practical solutions suited to big data applications. Be able to share knowledge, both theoretical and practical. Have a knack for exploring and trying out new developments in the data analytics world.
Training will be provided on an as-needed basis.
Qualifications
Minimum 5 years of IT working experience, especially in a Business Intelligence or Data Analytics related role.
At least a degree in Computer Science, Information Technology, Engineering, or a related field.
Data integration (ETL), data mining, and analytics experience is a must.
Experience manipulating and analyzing complex, high-volume, high-dimensionality data from varying sources, and writing ETL (extract-transform-load) scripts.
Development experience with data lake solutions such as Databricks, Azure Data Lake, Fabric, AWS, Snowflake, or Cloudera, or with data fabric solutions such as Atlan, Cinchy, or IBM, for big data is preferred.
Development experience in a Hadoop environment is an added advantage, along with an understanding of applications in the Hadoop ecosystem and ETL framework components such as NiFi, Kafka, Pentaho, Spark, Data Lake Insights, etc.
Cloud environment development experience is preferred.
An HCIA/HCIP/HCIE certificate in cloud computing, or an equivalent industry certificate (e.g., AWS, Azure, or GCP cloud certifications), is preferred. Familiarity with AWS, particularly the S3, Lambda, EMR, and EC2 services, is an added advantage.
SQL experience is a must; SQL is part of the daily routine.
Working experience developing against databases such as MS SQL Server, Teradata, or Oracle.
Working experience with, or exposure to, Business Intelligence tools is highly recommended, preferably Tableau, Power Query, Power BI Desktop, etc.
A manufacturing background is preferred, but not essential.
Other preferred software skills: PHP, Python, Java, or JavaScript are an added advantage.
Strong visualization capability and a passion for quantitative analysis: statistics, math, modeling, design, etc.