Lead Data Engineer

  • Pune
  • Talent Et Au Dela

Key Responsibilities

  • Lead the development, management, design, and architecture of data pipelines and data products in the cloud.
  • Act as a full-stack data developer, well versed in ingestion, transformation, storage, processing, and consumption.
  • Collect requirements, partner with different consumer personas, propose and implement solutions, and own designs.
  • Implement reliable, reusable, scalable, fast, and maintainable data pipelines, including data integration across large datasets, both clean and unclean.
  • Follow established project execution processes, coding standards, principles, and industry best practices.
  • Be well versed in AWS, Snowflake, and data security principles.
  • Ensure system changes meet business requirements while considering overall system impact, if any.
  • Solve problems with analytical thinking; demonstrate the aptitude to quickly master new services and technologies, adapt to changing ways of working, and mentor and lead the team.
  • Report status updates as required by the project.
  • Engage actively in training, self-development, and knowledge sharing.
  • Participate and contribute in enterprise initiatives.
  • Focus on continuous improvement, both in solutions and as individuals.
  • Accept and adapt to a changing environment, changing technologies, and changing priorities.
  • Acquire good business unit (BU) domain knowledge and facilitate the growth of domain knowledge throughout the technical teams.

Qualifications

Education:
Graduate - Bachelor's degree (any stream)

Must have fluent English communication skills (spoken and written)

Skills set:

  • Strong data background and a strong understanding of data warehouse/data lake concepts
  • Strong relational database and SQL skills

Expertise/hands-on experience in one or more of the following technologies, with the ability to learn and adapt to related data technologies:

  • Cloud platforms – AWS cloud services and Snowflake
  • Storage – object (Amazon S3), SQL (Snowflake, Amazon RDS), NoSQL (Amazon DynamoDB), files (JSON, delimited), and datasets
  • Compute – AWS Glue, AWS Lambda, Amazon EC2
  • Orchestration – Step Functions, Glue Workflow
  • Security – AWS Lake Formation, IAM, VPC, KMS
  • Querying / Reporting – Amazon Athena, Amazon QuickSight / Power BI / BI tools
  • Programming languages such as Python, PySpark
  • Familiarity with CI/CD pipelines (CDK), GitHub, test-driven development, logs, etc.
  • Snowflake architecture – storage, processing, services, roles, warehouses, caching, stored procedures, stages, streams, tasks, Snowpark, Snowpipe, data sharing, etc.

Good to have:

Hands-on experience/familiarity with one or more of the following: databases, data models, data virtualization, scripting, ETL, reporting/visualization, data quality/data protection/data governance practices, data mesh architecture, on-prem to cloud integration

