Junior Data Engineer
Taager
About Taager
Taager is the first B2B startup specializing in supporting social sellers. We are democratizing social e-commerce by enabling entrepreneurs, whether beginners or experienced, to sell online without capital, inventory, or operational experience. We handle product selection, storage, logistics, payment collection, and customer service on behalf of our merchants.
Launched in 2019 with a team of just 8 people, we’ve grown to over 350 employees across Egypt, Saudi Arabia, the UAE, and more recently, Morocco. We serve over 34,000 social commerce sellers from diverse backgrounds — from students seeking side income to seasoned digital marketers aiming to become independent entrepreneurs. Our sellers have access to over 2,500 high-potential products.
Our teams are driven by our mission and deeply motivated to provide the best experience for our sellers. With a commitment to quality and operational excellence, we're transforming the social commerce landscape in the MENA region!
Our Mission
To empower anyone to start and grow their own e-commerce business.
Our Vision
We envision a world where anyone can sell online, earn a living, and even build wealth — all within a simple, low-risk environment. A world where the magic of technology is made accessible to the most talented merchants.
Why Join Taager?
- You'll work in an international environment with team members from over 10 nationalities.
- You’ll have access to a highly attractive compensation plan that scales with your performance.
- We invest in team development and prioritize internal promotions.
- You’ll work alongside ambitious, kind, and talented individuals.
About the Role
We are looking for a Junior Data Engineer to join our data team and help build and maintain the data infrastructure that powers our analytics and business intelligence. You will work closely with senior engineers to develop data pipelines, ensure data quality, and support the team in delivering reliable data solutions. This is an excellent opportunity to grow your skills in modern data engineering practices.
Responsibilities:
- Build and maintain ETL/ELT pipelines to extract, transform, and load data from various sources into our data warehouse (an illustrative sketch of this kind of task follows this list).
- Write and optimize SQL queries for data transformation and analysis.
- Develop and maintain dbt models following best practices for data modeling (Bronze/Silver/Gold layers).
- Monitor data pipeline health and troubleshoot data quality issues.
- Collaborate with data analysts and data scientists to understand data requirements and implement solutions.
- Write clean, well-documented code and participate in code reviews.
- Support the maintenance of data infrastructure including orchestration tools (e.g., Airflow) and data warehouses.
- Contribute to documentation of data processes, pipelines, and data models.
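To give a flavour of the day-to-day work, here is a minimal, hypothetical sketch of the kind of pipeline step described above: extracting order rows from a CSV export, applying basic cleaning, and loading them into a staging table. All file, table, and column names (stg_orders, order_id, amount) are illustrative only and do not reflect Taager's actual schema or tooling.

```python
# Hypothetical extract-transform-load sketch for a staging table.
# Names and schema are illustrative, not Taager's.
import csv
import sqlite3
from datetime import datetime

def load_orders(csv_path: str, db_path: str) -> int:
    """Load cleaned order rows from csv_path into a staging table in db_path."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        """
        CREATE TABLE IF NOT EXISTS stg_orders (
            order_id TEXT PRIMARY KEY,
            ordered_at TEXT,          -- ISO 8601 timestamp
            amount_cents INTEGER
        )
        """
    )
    loaded = 0
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            # Skip rows with a missing identifier rather than loading bad data.
            if not row.get("order_id"):
                continue
            # Normalise the timestamp and cast the amount; reject unparseable rows.
            try:
                ordered_at = datetime.fromisoformat(row["ordered_at"]).isoformat()
                amount_cents = int(round(float(row["amount"]) * 100))
            except (KeyError, ValueError):
                continue
            conn.execute(
                "INSERT OR REPLACE INTO stg_orders VALUES (?, ?, ?)",
                (row["order_id"], ordered_at, amount_cents),
            )
            loaded += 1
    conn.commit()
    conn.close()
    return loaded
```

In practice, logic like this would typically live inside a dbt model or an Airflow-orchestrated task rather than a standalone script; the sketch is only meant to show the level of SQL and Python involved in the role.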
Required Skills:
- Strong proficiency in SQL for data manipulation and transformation.
- Ability to write Python scripts for data processing, automation, and pipeline development.
- Hands-on experience with dbt Core for data transformation.
- Understanding of database concepts, data modeling, and relational databases.
- Familiarity with version control using Git and collaboration platforms such as GitLab.
- Basic understanding of data warehousing concepts and ETL/ELT processes.
- Strong attention to detail and commitment to data quality.
- Problem-solving mindset and willingness to learn new technologies.
- Good communication skills and ability to work collaboratively in a team.
- Basic understanding of software development best practices.