Senior Data Engineer

March 25, 2025
Open
Location
Vietnam
Employment type
Full-time
Experience level
Senior

Key Responsibilities:

  1. Data Architecture and Design
    • Design and implement scalable, secure, and efficient data pipelines and architectures on AWS.
    • Define and manage the architecture for real-time and batch data processing.
  2. Data Pipeline Development
    • Build ETL/ELT processes to ingest, transform, and store structured and unstructured data from diverse sources.
    • Develop solutions for data integration, data quality, and data governance.
  3. Optimization and Monitoring
    • Optimize data processing workflows for performance and cost-effectiveness.
    • Monitor data pipelines and systems, troubleshooting and resolving issues promptly.
  4. Collaboration
    • Partner with data scientists, analysts, and business teams to understand data needs and deliver high-quality solutions.
    • Collaborate with DevOps teams to implement CI/CD for data workflows.
  5. Security and Compliance
    • Ensure data solutions comply with organizational policies and industry security, privacy, and governance standards.
    • Implement data anonymization and encryption techniques as needed.
  6. Innovation
    • Stay updated with emerging AWS technologies and data engineering best practices.
    • Drive continuous improvement in processes, tools, and methodologies.

Required Qualifications:

  1. Education
    • Bachelor’s or Master’s degree in Computer Science, Information Systems, Engineering, or a related field.
  2. Experience
    • 5+ years of experience as a Data Engineer or in a similar role, focusing on AWS cloud solutions.
    • Proven experience designing and building data architectures and pipelines.
  3. Technical Skills
    • Expertise in AWS services, including but not limited to S3, EMR, Lambda, and RDS.
    • Proficiency in SQL, Python, and Java for data processing and manipulation.
    • Experience with data processing and orchestration tools such as Spark, Flink, and Airflow.
    • Strong knowledge of data modeling, data warehousing, and data lakes.
    • Experience with data lake formats such as Delta or Iceberg.
    • Familiarity with CI/CD tools and infrastructure as code (e.g., Terraform).
    • Experience with data governance, security, and compliance standards.
    • Containerization experience with Docker and Kubernetes (K8s).
    • Real-time data integration using Debezium and Kafka.
    • Experience with Linux and Bash commands.
    • Experience with query engines like Trino and Presto.
  4. Soft Skills
    • Strong problem-solving and analytical skills.
    • Excellent communication and collaboration abilities.
    • Ability to work in an agile and fast-paced environment.
    • Ability to communicate efficiently in English.

Nice to have:

  • Knowledge of machine learning workflows and integration with data pipelines.
  • Knowledge of monitoring tools like CloudWatch, Datadog, or Prometheus.
  • Knowledge of modern real-time analytics platforms like Apache Doris and Apache Pinot.
  • Knowledge and experience in Banking/Finance/Securities/Fintech domains.
GTV Vietnam
A multinational, Thailand-based group focusing on power, insurance, and banking businesses. Green Tech Ventures Public Company Limited is a holding company that strives toward corporate growth and sustainable development through worldwide investment. Founded in 2000, the GTV group currently owns eight subsidiaries across three core businesses: power, real estate, and information and utilities. We are building a team in Vietnam for a banking project, with very attractive benefits for candidates.
Company type
Start-up
Domain
Banking