Data Warehouse Engineer
This position is with Socium - Teams Done Differently, working remotely on UAE hours under an initial one-year contract (extendable). The candidate will build and maintain data pipelines and data models, ensure data quality, and optimize the performance of the data systems that support business analytics. The role offers attractive compensation, a professional working environment, and an international team.
Candidates should have experience in ETL and data engineering; be proficient in SQL and at least one of Python, Java, or Scala; and have hands-on experience with Hadoop, Spark, Hive, Airflow, and cloud platforms (AWS, Azure, GCP). A bachelor's degree in IT or a related field, a solid grasp of data warehousing, logical thinking, and strong collaboration skills are required.
About Socium - Teams Done Differently
The Technology & Transformation Directors’ Delivery engine. Supporting customers through change and technology-enabled transformation, by building program teams to plan, execute and deliver major programs of work, globally, with the unrivaled speed of deployment.
Using our extensive industry networks and connections, we provide quality project teams to clients ranging from enterprise-sized corporates to high-growth tech start-ups.
We operate two models that are designed to generate clear business growth for our clients:
- Teams as a Service - provides enterprise-sized companies undergoing technology-enabled transformation with pre-formed teams that deliver on your objectives and upskill your in-house team along the way
- Socium for Scale - partners with start-up tech companies looking to scale quickly, providing a complete, bespoke, end-to-end recruitment solution for the most challenging tech hiring requirements
Position: Data Warehouse Engineer
Company: Socium - Teams Done Differently
Location: Southeast Asia (Remote)
Employment Setup: Remote / UAE hours
Contract Duration: Initial 1 year, subject to extension
Role Overview
We are looking for a skilled Data Warehouse Engineer to design, build, and maintain robust data infrastructure that supports analytics and business decision-making. You will develop scalable ETL pipelines, optimize data systems, and ensure data quality, security, and reliability while collaborating with cross-functional teams.
Key Responsibilities
- Build and maintain scalable data pipelines and ETL processes for the data warehouse and data lake (a brief illustrative sketch follows this list).
- Design data models and schemas based on business and analytics requirements.
- Ensure data quality, consistency, and timely delivery across systems.
- Monitor, troubleshoot, and optimize data infrastructure performance.
- Implement data governance, security, and compliance best practices.
- Partner with data scientists, analysts, and engineers to support advanced analytics.
- Stay current with emerging data technologies and recommend improvements.
- Document processes, data flows, and architectures for knowledge sharing.
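To make the pipeline responsibility above concrete, here is a purely illustrative sketch, not part of the posting itself: a minimal extract-transform-load DAG written for Apache Airflow (2.4+ style), one of the workflow tools named in the qualifications. The DAG id, task names, and sample data are hypothetical.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Stub for pulling raw records from a source system.
    return [{"order_id": 1, "amount": 125.0}]


def transform(ti, **context):
    # Read the upstream task's return value via XCom and apply light cleaning.
    rows = ti.xcom_pull(task_ids="extract")
    return [{**row, "amount": round(row["amount"], 2)} for row in rows]


def load(ti, **context):
    # A real pipeline would write to the warehouse (e.g. a COPY into a
    # staging table); here we simply print the transformed rows.
    print(ti.xcom_pull(task_ids="transform"))


with DAG(
    dag_id="orders_daily_etl",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Linear dependency chain: extract -> transform -> load.
    extract_task >> transform_task >> load_task
```

In practice each task would delegate heavy lifting to Spark jobs or warehouse SQL rather than pass rows through XCom; the linear chain here only shows how scheduling, task dependencies, and data hand-off fit together.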
Qualifications
- Bachelor’s degree in Computer Science, Engineering, or related field.
- Proven experience in data engineering and large-scale ETL pipeline development.
- Strong skills in SQL and at least one programming language (Python, Java, or Scala).
- Hands-on experience with big data tools (e.g., Hadoop, Spark, Hive) and workflow tools (e.g., Airflow).
- Familiarity with cloud platforms (AWS, Azure, or GCP).
- Knowledge of data warehousing concepts and best practices.
- Strong problem-solving, communication, and teamwork skills.
- Detail-oriented with a focus on delivering high-quality, reliable solutions.