
DATA ENGINEER

Salary undisclosed


Job description

Job Summary:
As a Senior Data Engineer, you will be responsible for designing, developing, and maintaining data pipelines and infrastructure. You will collaborate closely with data scientists, analysts, and other stakeholders to ensure data availability, quality, and accessibility for analytical purposes. Your expertise in data engineering tools and technologies will play a pivotal role in our data-driven decision-making processes.

Key Responsibilities:
• Data Pipeline Development: Design, implement, and maintain data pipelines to collect, process, and store data from various sources.
• Data Modeling: Create and optimize data models and schemas for efficient data storage and retrieval.
• ETL (Extract, Transform, Load): Develop ETL processes to transform raw data into usable formats for analysis.
• Data Integration: Integrate data from different sources, both structured and unstructured, to create a unified data ecosystem.
• Data Quality Assurance: Implement data quality checks and validation processes to ensure the accuracy and consistency of data.
• Scalability: Architect and optimize data infrastructure for scalability, performance, and cost-effectiveness.
• Security: Implement data security measures and best practices to protect sensitive information.
• Monitoring and Optimization: Monitor data pipelines for performance issues and optimize them as needed.
• Documentation: Maintain detailed documentation of data engineering processes and pipelines.
• Collaboration: Work closely with cross-functional teams to understand data requirements and deliver solutions that meet those needs.

Requirements:
• Bachelor's degree in Computer Science, Data Engineering, or a related field.
• Proven experience as a Data Engineer, preferably with at least 5 years in a senior or lead role.
• Proficiency in data engineering technologies such as Apache Spark, Hadoop, Kafka, and SQL databases.
• Strong programming skills in languages such as Python, Java, or Scala.
• Expertise in ETL tools and data integration techniques.
• Familiarity with cloud platforms (e.g., AWS, Azure, GCP) and big data services.
• Knowledge of data modeling and schema design.
• Experience with data warehousing solutions (e.g., Redshift, BigQuery) is a plus.
• Excellent problem-solving skills and the ability to work in a collaborative, fast-paced environment.