Data Engineer – GCP

Job Category: Engineering
Job Type: Full Time
Job Location: Remote

About the Role:

We are seeking a skilled Data Engineer with strong GCP (Google Cloud Platform) expertise to design, develop, and maintain scalable data pipelines and infrastructure. The ideal candidate will play a key role in managing large-scale data processing and transformation workflows that support business intelligence, analytics, and machine learning initiatives.


Key Responsibilities:

  • Design and implement scalable, reliable data pipelines using GCP services (BigQuery, Dataflow, Pub/Sub, Cloud Storage, etc.).
  • Build and maintain data ingestion frameworks to extract, transform, and load data from various structured and unstructured sources.
  • Work closely with data analysts, scientists, and stakeholders to understand data requirements and ensure data availability.
  • Monitor data quality, performance, and reliability, and implement necessary improvements.
  • Optimize SQL queries and ETL jobs for performance and cost-efficiency in GCP.
  • Ensure data security, compliance, and governance standards are met.
  • Automate and schedule data workflows using tools like Cloud Composer (Apache Airflow); a brief illustrative sketch follows this list.
  • Participate in code reviews, documentation, and CI/CD for data solutions.
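
For illustration only, below is a minimal sketch of the kind of workflow automation referenced above: a Cloud Composer (Apache Airflow) DAG that loads daily files from Cloud Storage into BigQuery. The bucket, project, dataset, and table names are placeholders, not actual resources, and the sketch assumes the Airflow Google provider package is available in the Composer environment.

# Illustrative sketch only: placeholder names, not a prescribed solution.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="load_events_to_bigquery",
    schedule_interval="@daily",        # run once per day
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    load_events = GCSToBigQueryOperator(
        task_id="load_events",
        bucket="example-raw-data",                     # placeholder bucket
        source_objects=["events/{{ ds }}/*.csv"],      # templated daily partition
        destination_project_dataset_table="example-project.analytics.events",
        source_format="CSV",
        skip_leading_rows=1,                           # skip CSV header row
        write_disposition="WRITE_APPEND",              # append each daily load
    )

The same pattern extends to streaming sources (Pub/Sub, Dataflow) or other file formats; only the operator and source configuration change.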

Required Skills & Qualifications:

  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
  • 3–6 years of experience in data engineering or a related role.
  • Strong hands-on experience with Google Cloud Platform (BigQuery, Dataflow, Cloud Storage, Pub/Sub, etc.).
  • Proficiency in SQL, Python, or Java for data processing.
  • Experience with ETL tools and orchestration frameworks (e.g., Apache Airflow).
  • Understanding of data modeling, warehousing concepts, and performance tuning.
  • Familiarity with CI/CD, Git, and DevOps practices.

Preferred Qualifications:

  • Google Cloud Professional Data Engineer certification is a strong plus.
  • Experience with Terraform or Infrastructure as Code (IaC) for GCP.
  • Knowledge of stream processing (e.g., Apache Beam or Kafka).
  • Familiarity with data privacy regulations (e.g., GDPR, HIPAA).
  • Exposure to BI tools like Looker or Data Studio.

Apply for this position
