Jobs at Hub71 startups

Are you ready to join a vibrant community of tech startups that are shaping the future of innovation?

The Hub71 careers portal connects you with the leading startups that are transforming industries at the heart of Abu Dhabi's Global Tech Ecosystem. Explore a diverse range of opportunities with high-potential startups that are scaling globally from the UAE capital.

Data Engineer (ETL, Data Pipelines & Warehousing)

Aumet
Data Science
Amman, Jordan
Posted on Sep 17, 2024

Aumet is a leading healthcare technology company dedicated to revolutionizing the way medical supplies are sourced and distributed globally. Our platform connects healthcare providers with a vast network of suppliers, streamlining the procurement process and ensuring efficient access to essential medical products.

Job Overview:

The Data Engineer is responsible for building and managing scalable data pipelines, ETL processes, and data warehouses. This role ensures that data is efficiently ingested, transformed, and made available for analytics, reporting, and AI model training, while maintaining data quality and security.

Responsibilities:

  • Design and maintain efficient and scalable data pipelines for structured and unstructured data.
  • Implement and optimize ETL processes to ingest and transform data for various modules.
  • Manage relational and NoSQL databases, such as PostgreSQL for transactional data and MongoDB for unstructured data.
  • Build and optimize data warehouses (e.g., Amazon Redshift, Google BigQuery) to support reporting and advanced analytics.
  • Collaborate with Data Scientists to ensure data pipelines support the needs of AI models.
  • Manage real-time data streaming using tools like Apache Kafka or RabbitMQ.
  • Monitor and troubleshoot data pipelines to ensure data quality, performance, and scalability.
  • Ensure data security, including encryption at rest and in transit.

Requirements:

  • Minimum of 4 years of relevant experience.
  • Strong experience in SQL and NoSQL databases (e.g., PostgreSQL, MongoDB).
  • Expertise in designing and managing data pipelines and ETL processes.
  • Experience with data warehousing technologies such as Amazon Redshift or Google BigQuery.
  • Familiarity with real-time data processing tools such as Kafka or RabbitMQ.
  • Solid understanding of data security, including encryption and compliance protocols.
  • Experience with large-scale data environments and optimizing queries for performance.
  • Proficiency in integrating data across different systems.
  • Experience working with data scientists to support AI and machine learning models.
  • Strong problem-solving skills and ability to manage complex data systems.