Senior Data Engineer
Position: Senior Data Engineer
Location: Riyadh, Saudi Arabia
Contract: 12 months, extendable
Benefits Package: Negotiable salary + Visa + Health Insurance + EOSB (End of Service Benefits) + Annual Leave + Flights
Our client is among the ten largest banks in the GCC, covering Retail Banking, Wealth Management, Corporate and Institutional Banking, Treasury, and Islamic Banking.
Key Responsibilities:
• Data Integration & ETL:
o Utilize ETL tools such as Informatica and Apache NiFi to design and implement data integration solutions.
o Develop and maintain data pipelines for efficient extraction, transformation, and loading (ETL) of data.
o Ensure data quality and integrity through rigorous data validation, cleansing, and error handling.
• Process Improvement & Automation:
o Identify, design, and implement internal process improvements, including re-designing infrastructure for greater scalability, optimizing data delivery, and automating manual processes.
o Automate and improve data processes and workflows to enhance scalability and efficiency.
• Big Data Technologies:
o Work with big data technologies like Hadoop (Cloudera), Spark, Kafka, Hive, and Iceberg.
o Implement Change Data Capture (CDC) mechanisms using tools like Oracle GoldenGate, and develop real-time data processing capabilities with technologies such as Kafka, NiFi, and Spark Streaming.
o Design and optimize data storage solutions, including data warehouses and data lakes.
• Data Modeling:
o Maintain logical and physical data models, ensuring accurate metadata management for the Big Data platform.
o Create conceptual data models to identify key business entities and visualize their relationships.
o Perform data profiling and analysis to establish, modify, and maintain data models.
• Collaboration & Documentation:
o Collaborate with international stakeholders and cross-functional teams to understand data requirements and design appropriate data models.
o Document data pipelines, processes, and best practices for knowledge sharing.
o Address data model performance issues to improve Big Data platform performance and overall system functionality.
• Architectural & Code Quality:
o Demonstrate strong architectural knowledge and comfort working across multiple repositories, services, and environments.
o Improve code and data quality by leveraging and contributing to internal tools that automatically detect and mitigate issues.
Qualifications:
• Proven experience with data integration and ETL tools such as Informatica and Apache NiFi.
• Strong understanding of CDC technologies (e.g., Oracle GoldenGate) and real-time processing (e.g., Kafka, NiFi, Spark Streaming).
• Experience with big data technologies, including Hadoop (Cloudera), Spark, Kafka, Hive, and Iceberg.
• Proficiency in designing and optimizing data pipelines, data storage solutions, and data models.
• Strong analytical and problem-solving skills with attention to detail.
• Excellent communication and interpersonal skills.
• Ability to collaborate effectively with international teams and various stakeholders.
• Experience with data validation, cleansing, error handling, and performance optimization in big data environments.
Preferred Skills:
• Familiarity with cloud-based data platforms and services.
• Experience in developing and maintaining metadata management systems.
• Understanding of data governance and compliance requirements.
Don’t miss this opportunity to advance your career. If you have the passion, skills, and experience to excel in this role, apply today!