Easter Seals Jobs


UnitedHealth Group Software Engineer - SQL, Python/Scala, Azure in Gurugram, India

Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together.

We are seeking an innovative Data Engineer to join our distinguished team. This role offers a unique opportunity to significantly impact a pioneering function within an industry-leading organization. The primary responsibilities include designing, developing, implementing, testing, deploying, monitoring, and maintaining systematic data delivery methods. This position encompasses all key development and maintenance activities across various technology functions to ensure the delivery of high-quality data for users, applications, and services. The role demands designing, developing, and maintaining data solutions for data generation, collection, and processing: creating data pipelines, ensuring data quality, and implementing ETL (extract, transform, and load) processes to migrate and deploy data across systems.

Primary Responsibilities:

  • Collaborate with lead data engineers, data scientists and architects to understand data requirements and design optimal data solutions

  • Design, develop, implement, and manage cross-domain, modular, optimized, flexible, scalable, secure, reliable, and high-quality data solutions for meaningful analyses and analytics, ensuring operability

  • Enhance data efficiency, reliability, and quality by developing performant data solutions

  • Integrate instrumentation in the development process to monitor data pipelines, using measurements to detect internal issues before they cause user-visible outages or data quality problems

  • Develop processes and diagnostic tools to troubleshoot, maintain, and optimize solutions, responding to customer and production issues

  • Reduce technical debt and transform technology through the adoption of open-source solutions, cloud integration, and HCP assessments

  • Maintain comprehensive documentation, clearly recording process flows, data flows, code flows, and other technical aspects of processes deployed in production

  • Monitor and analyze system performance metrics, identifying areas for improvement and implementing solutions

  • Interpret policies and leverage experience to solve issues

  • Monitor and troubleshoot deployed models, proactively identifying and resolving performance or data-related issues

  • Explore ways to enhance data quality and reliability

  • Lead the development of new concepts, technologies, and products to meet emerging needs

  • Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so

Required Qualifications:

  • Undergraduate degree or equivalent experience

  • Experience working with Big Data and cloud tools such as Spark, Scala, Python, PySpark, GitHub Actions, Maven, Parquet, Kafka, and Avro

  • Solid understanding of distributed systems fundamentals

  • Proficiency in SQL and experience with data modeling, ETL processes, and data warehousing solutions

  • Extensive experience with Snowflake, Azure Data Factory, and Azure Databricks

  • Demonstrated aptitude for gathering business requirements by asking the right questions and documenting the understanding in detailed flow diagrams

Preferred Qualifications:

  • Background in US healthcare

  • Familiarity with Streamlit, Docker, Gen AI APIs

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.
