Data Engineering Advisor [T500-**]

  • Hyderabad
  • Evernorth Health Services
About Evernorth: Evernorth Health Services, a division of The Cigna Group (NYSE: CI), creates pharmacy, care, and benefits solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention, and treatment of illness and disease more accessible to millions of people.

Position Summary: We are looking for a talented Data Engineer to join our Home Based Care (HBC) Engineering team as part of Care Delivery Solutions. As a Data Engineer, the candidate will work with a highly agile team of developers to develop, execute, validate, and maintain the Home Based Care ecosystem. The candidate needs to be creative, responsive, flexible, and willing to participate in an open, collaborative peer environment, guiding the team as necessary. The candidate enjoys working in a team of high performers who hold each other accountable to perform at their very best, and does not shy away from opportunities to give and receive feedback with team members. The candidate works toward delivering a Minimum Viable Product with proper testing, avoids scope creep, and follows software engineering best practices as defined by Evernorth. The candidate is expected to actively participate in all ceremonies, including daily stand-ups, story grooming, user story reviews, and sprint retrospectives.

About HBC Org:

The current HBC Engineering team focuses on enabling the product capabilities for the HBC business. These include the conceptualization, architecture, design, development, and support functions for the Home Based Care business products. The strategic roadmap for HBC focuses on patient activation and routine care for the various LOBs of Home Based Care. The HBC Engineering organization covers the following capabilities:

  • Patient activation capabilities and integrations with native EMRs
  • Development of non-clinical apps
  • Data integrations for the Home Based Care Engineering business
  • Data interoperability
  • Shared services capabilities

Responsibilities:

  • Work with Solution Architects to drive the definition of the data solution design, mapping business and technical requirements to define data assets that meet both business and operational expectations.
  • Own and manage data models and data design artefacts, and provide guidance and consultancy on best practice and standards for customer-focused data delivery and data management practices.
  • Be an advocate for data-driven design within an agile delivery framework.
  • Plan and implement procedures that will maximize engineering and operating efficiency for application integration technologies.
  • Identify and drive process improvement opportunities.
  • Actively participate in the full project lifecycle, from early shaping of high-level estimates and delivery plans through to active governance of the solution as it is developed and built in later phases.
  • Capture and manage risks, issues, and assumptions identified through the lifecycle, articulating the financial and other impacts associated with these concerns.
  • Take complete accountability for the technology assets owned by the team.
  • Provide leadership to the team, ensuring it meets the following objectives: design, configuration, and implementation of middleware products and application design/development within the supported technologies and products; proactive monitoring and management design of supported assets, assuring performance, availability, security, and capacity.
  • Size User Stories based on the time and difficulty required to complete them.
  • Provide input on specific challenges facing User Stories; discuss risks, dependencies, and assumptions.
  • Select User Stories to be completed in the iteration, based on User Story priority and team capacity and velocity.

Qualifications:

  • Experience leading data design and delivering significant assets to an organization, e.g. a Data Warehouse, Data Lake, or Customer 360 Data Platform.
  • Demonstrable experience within data capabilities such as data modelling, data migration, data quality management, and data integration, with a preference for ETL/ELT and data streaming experience.
  • Experience with ETL tools such as Databricks and Apache Airflow, with automation of data pipeline processes, and with the AWS, SQL Server, Tableau, Boomi, and Power BI tool sets.
  • Experience in Python, Java, or Scala; proficiency in SQL is crucial for database management.
  • Experience with big data technologies such as Hadoop, Spark, and Apache Kafka.
  • Experience with data warehousing solutions such as Amazon Redshift or Google BigQuery.
  • A track record of working successfully in a globally dispersed team would be beneficial.
  • Familiarity with agile methodology, including SCRUM team leadership.
  • Familiarity with modern delivery practices such as continuous integration, behavior/test-driven development, and specification by example.
  • Proven experience with architecture, design, and development of large-scale enterprise application solutions.
  • Strong written and verbal communication skills, with the ability to interact with all levels of the organization.
  • Proactive participation in design sessions, Program Increment (PI) planning, and sprint refinement meetings.

Required Experience & Education: 11 to 13 years of IT experience, including 6 to 8 years in a Data Architecture or Data Engineering role, is required. Bachelor's degree in a related technical or business area, or equivalent work experience.

Desired Experience:

  • Exposure to serverless AWS
  • Exposure to EKS