
Specialist - Software Engineering

Job Req Id:  1455455

Senior Data Engineer – GCP 

 

We are seeking a highly skilled Data Engineer with strong experience in Google Cloud Platform (GCP), particularly BigQuery, to design, build, and maintain scalable data pipelines. The ideal candidate will have hands-on experience with Python, SQL, Airflow/Composer, and CI/CD practices, and will play a key role in expanding and migrating Adobe-based data pipelines while integrating and operationalizing Orion datasets across hybrid cloud environments.


Key Responsibilities

  • Design, build, and maintain ETL/ELT data pipelines using Python, SQL, Airflow, and Cloud Composer on GCP.
  • Support the migration and expansion of Adobe-based data pipelines.
  • Integrate Orion datasets into existing enterprise datasets and data platforms.
  • Develop new data pipelines for Orion event data collection.
  • Implement and manage data ingestion, transformation, and orchestration workflows in BigQuery.
  • Optimize pipeline performance, including runtime, compute usage, storage tiering, and query costs, with a strong focus on BigQuery cost optimization.
  • Establish data quality checks, validation logic, and reconciliation processes to ensure data accuracy and reliability.
  • Work across hybrid data environments, supporting data movement and transformation across AWS, GCP, and Snowflake.
  • Implement and maintain CI/CD pipelines using GitHub and enterprise DevOps practices.
  • Create and maintain clear, comprehensive documentation for all data pipelines, integrations, and operational processes.
  • Collaborate with cross-functional teams to support analytics, reporting, and downstream data consumers.

Required Skills and Qualifications

  • Strong hands-on experience with Google Cloud Platform (GCP), particularly BigQuery.
  • Proficiency in Python and SQL for data engineering use cases.
  • Experience with Apache Airflow and Cloud Composer.
  • Hands-on experience with GitHub and CI/CD pipelines.
  • Solid understanding of data warehousing concepts, including:
    • Star schemas
    • Dimensional modeling
    • OLTP vs. OLAP architectures
  • Experience designing and supporting ETL/ELT pipelines in large-scale data environments.
  • Familiarity with Adobe data platforms and event-based data pipelines.
  • Experience working in multi-cloud and hybrid environments (AWS, GCP, Snowflake).
  • Strong problem-solving skills and attention to data quality and performance.

Nice to Have

  • Experience with large-scale event data and real-time/near-real-time pipelines.
  • Prior experience supporting marketing or analytics platforms (Adobe ecosystem).
  • Exposure to cost optimization strategies in cloud-based data platforms.
  • Strong documentation and stakeholder communication skills.


