Google Cloud Platform

  • Bhubaneswar
  • Deloitte

We are looking for candidates with experience in Google Cloud Platform (GCP).


Experience Required:
10+ years.

Location:
Bhubaneswar, Odisha


Responsibilities:

Develop scalable data pipelines using GCP tools such as Dataflow, Dataproc, BigQuery, Composer, and Dataplex.

Write Java/Python code and complex SQL queries for data processing, extraction, transformation, and loading tasks.

Apply strong analytical and problem-solving skills to derive insights from complex datasets.

Build and maintain data lakes/lakehouses on GCP, integrating structured and unstructured data sources while adhering to governance and security standards.

Implement robust auditing mechanisms and exception handling processes to ensure data quality and reliability.

Apply Slowly Changing Dimension methodologies to manage historical data and track changes over time effectively.

Apply expertise in Apache Spark and its optimization techniques to process large-scale data efficiently.

Design and implement Directed Acyclic Graphs (DAGs) for orchestrating complex data workflows (see the sketch after this list).

Work closely with cross-functional teams, including data scientists and business stakeholders.

Work with storage formats such as Parquet, Avro, JSON, and XML, understanding their characteristics and best practices.
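
As a rough illustration of the DAG orchestration and Slowly Changing Dimension work described above, here is a minimal sketch of a Cloud Composer (Airflow 2) DAG that runs a two-step SCD Type 2 load in BigQuery. It assumes the Google provider package is installed; every project, dataset, table, and column name is a hypothetical placeholder, not a detail of this role.

# A minimal sketch, assuming Cloud Composer / Airflow 2 with the Google provider.
# All names below (project, dataset, tables, columns) are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

DIM = "`my-project.warehouse.customer_dim`"      # hypothetical Type 2 dimension table
STG = "`my-project.warehouse.customer_stage`"    # hypothetical daily staging table

# Step 1: close out current dimension rows whose tracked attribute changed.
EXPIRE_SQL = f"""
MERGE {DIM} T
USING {STG} S
ON T.customer_id = S.customer_id AND T.is_current
WHEN MATCHED AND T.address != S.address THEN
  UPDATE SET is_current = FALSE, effective_to = CURRENT_DATE()
"""

# Step 2: insert a fresh current row for every staged key with no current row
# (new customers, plus the rows just expired in step 1).
INSERT_SQL = f"""
INSERT INTO {DIM} (customer_id, address, effective_from, effective_to, is_current)
SELECT S.customer_id, S.address, CURRENT_DATE(), DATE '9999-12-31', TRUE
FROM {STG} S
LEFT JOIN {DIM} T
  ON T.customer_id = S.customer_id AND T.is_current
WHERE T.customer_id IS NULL
"""

with DAG(
    dag_id="scd2_customer_dim",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    expire_changed_rows = BigQueryInsertJobOperator(
        task_id="expire_changed_rows",
        configuration={"query": {"query": EXPIRE_SQL, "useLegacySql": False}},
    )

    insert_new_versions = BigQueryInsertJobOperator(
        task_id="insert_new_versions",
        configuration={"query": {"query": INSERT_SQL, "useLegacySql": False}},
    )

    # Two-node DAG: expire changed rows first, then insert the new current versions.
    expire_changed_rows >> insert_new_versions

A production pipeline would wrap these tasks with the auditing, exception handling, and data-quality checks called out in the responsibilities above.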
