Apply now »

GCP Cloud Architect

Job Req Id:  1309187

About Us:


LTIMindtree is a global technology consulting and digital solutions company that enables enterprises across industries to reimagine business models, accelerate innovation, and maximize growth by harnessing digital technologies. As a digital transformation partner to more than 700 clients, LTIMindtree brings extensive domain and technology expertise to help drive superior competitive differentiation, customer experiences, and business outcomes in a converging world. Powered by nearly 90,000 talented and entrepreneurial professionals across more than 30 countries, LTIMindtree — a Larsen & Toubro Group company — combines the industry-acclaimed strengths of the erstwhile Larsen & Toubro Infotech and Mindtree in solving the most complex business challenges and delivering transformation at scale. For more information, please visit our website.


Role: GCP Cloud Architect

Location: Prague, Czech Republic

Employment Type: Permanent

Hybrid Work Model: 2–3 days per week in the office, remainder remote.

Language Proficiency: English & Czech (Preferred)

Total Experience: 12–15+ years


Key Responsibilities:


Cloud Architecture:

Design and implement robust cloud architectures on Google Cloud Platform, ensuring alignment with business requirements for scalability, reliability, and performance.

Collaborate with cross-functional teams to integrate data solutions seamlessly into the GCP environment.


Cloud Services Expertise:

Demonstrate proficiency in Google Cloud Platform services, including Compute Engine, Kubernetes Engine, Cloud Storage, and BigQuery.

Design solutions for cloud automation, CI/CD, privacy, access management, data layers, and orchestration.


Data Architecture and Modeling:

Develop Conceptual, Logical, and Physical data models, showcasing expertise in domain-driven design patterns.

Utilize Data Vault 2.0, 3NF, and Dimensional modeling methodologies, with proficiency in data modeling tools such as Erwin and Innovator.


Big Data Technologies:

Showcase strong technical skills in Big Data technologies and languages, including Kafka, Spark, Hive, and Python.

Leverage hands-on experience with Cloudera/Hortonworks and related ecosystem components in architecture design.


Data Analysis and Governance:

Conduct Data Analysis, Data Profiling, and implement Data Governance solutions.

Develop data quality frameworks to ensure the integrity and reliability of the data.


Collaboration and Communication:

Independently collaborate with customers to understand, translate, and drive business requirements into technical solutions.

Work closely with Data Engineering and Data Science teams to develop ETL pipelines, build data flows, implement data strategies, and conduct performance tests.




Required Qualifications:

  • Bachelor’s degree in Computer Science, Information Technology, or a related field.
  • Minimum 12 years of experience in Data Architecture, Solution Design, and Data Modeling.
  • Hands-on experience designing data solutions, preferably on GCP, with exposure to AWS and Azure clouds.
  • Strong technical skills in Cloud technologies: Compute Engine, Kubernetes Engine, Cloud Storage, and BigQuery, etc.
  • Strong technical skills in Big Data technologies and languages: Kafka, Spark, Hive, Python, etc.
  • Experience designing solutions and frameworks for cloud infrastructure and data pipelines.
  • Experience in Data Analysis, Data Profiling, Data Governance solutions, and creating data quality frameworks.
  • Experience with Data Vault 2.0 modeling, 3NF, and Dimensional modeling methodologies.
  • Ability to independently collaborate with customers and translate business requirements into technical solutions.
  • Collaboration experience with Data Engineering and Data Science teams.



