SoftwareOne Sr Expert Data Engineering in Gurgaon, India

Why SoftwareOne?

Here at SoftwareOne, we give you the flexibility to unleash your creativity without limits. We encourage autonomy and thinking outside the box, and we can’t wait to hear your new ideas. Although all businesses say it, we truly believe in work-life harmony. Our people are our greatest asset, and we’ll go the extra mile to ensure you’re happy here. We want our people to be their true authentic selves at all times, because that’s when real creativity happens.

At SoftwareOne, we believe that our people are our greatest asset. We offer:

  • A flexible work environment that encourages creativity and innovation.

  • Opportunities for professional growth and development.

  • An inclusive team culture where your ideas are valued and your contributions make a difference.

  • The chance to work on ambitious projects that push the boundaries of technology.

The role

We are seeking a highly skilled and motivated Senior Data Engineer with expertise in Databricks and Azure to join our team. As a Senior Data Engineer, you will be responsible for designing, developing and maintaining our data lakehouse and pipelines. You will work closely with the Data & Analytics teams to ensure efficient data flow and enable data-driven decision-making. The ideal candidate will have a strong background in data engineering, experience with Databricks, Azure Data Factory and other Azure services, and a passion for working with large-scale data sets.

Role Description

  • Design, develop and maintain the solutions required for data processing, storage and retrieval.

  • Create scalable, reliable and efficient data pipelines that enable data developers, data engineers, data analysts and business stakeholders to access and analyze large volumes of data.

  • Collaborate closely with other team members and the Product Owner.

What we need to see from you

Key Responsibilities

  • Collaborate with the Product Owner, Business Analyst and other team members to understand requirements and design scalable data pipelines and architectures.

  • Build and maintain data ingestion, transformation and storage processes using Databricks and Azure services.

  • Develop efficient ETL/ELT workflows to extract, transform and load data from various sources into data lakes (a brief illustrative sketch follows this list).

  • Design solutions and drive their implementation to enhance, improve and secure the data lakehouse.

  • Optimize and fine-tune data pipelines for performance, reliability and scalability.

  • Implement data quality checks and monitoring to ensure data accuracy and integrity.

  • Work with data developers, engineers and data analysts to provide them with the necessary data infrastructure and tools for analysis and reporting.

  • Troubleshoot and resolve data-related issues, including performance bottlenecks and data inconsistencies.

  • Stay up to date with the latest trends and technologies in data engineering and recommend improvements to existing systems and processes.
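
For illustration only, the sketch below shows the kind of ingest-transform-load step with a data quality check that these responsibilities describe. It assumes PySpark on Databricks with Delta Lake; the storage paths, column names and rejection threshold are hypothetical and not part of the role description.

  # Minimal sketch of an ingest-transform-load step with a data quality check,
  # assuming PySpark on Databricks with Delta Lake. The storage paths, column
  # names and 5% rejection threshold are hypothetical.
  from pyspark.sql import SparkSession, functions as F

  spark = SparkSession.builder.getOrCreate()  # provided by the Databricks runtime

  RAW_PATH = "abfss://raw@examplelake.dfs.core.windows.net/orders/"          # hypothetical
  CURATED_PATH = "abfss://curated@examplelake.dfs.core.windows.net/orders/"  # hypothetical

  # Extract: read raw JSON landed by an upstream Azure Data Factory copy activity.
  raw = spark.read.json(RAW_PATH)

  # Transform: normalize types and drop rows without a key.
  curated = (
      raw.withColumn("order_ts", F.to_timestamp("order_ts"))
         .withColumn("amount", F.col("amount").cast("double"))
         .filter(F.col("order_id").isNotNull())
  )

  # Data quality check: fail the run if too many rows were rejected.
  total, kept = raw.count(), curated.count()
  if total > 0 and (total - kept) / total > 0.05:
      raise ValueError(f"Rejected {total - kept} of {total} rows, above the 5% threshold")

  # Load: append the cleaned data to the curated Delta table of the lakehouse.
  curated.write.format("delta").mode("append").save(CURATED_PATH)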

Skillset

  • Highly self-motivated and results-oriented, with the ability to work independently and assume ownership.

  • A keen interest in staying up to date with the latest changes in Databricks, Azure and related data platform technologies.

  • Time-management skills and the ability to establish reasonable and attainable deadlines for resolution.

  • Strong programming skills in languages such as SQL, Python or Scala, along with hands-on Spark development experience.

  • Experience working with Databricks and Azure services, such as Azure Data Lake Storage, Azure Data Factory, Azure Databricks, Azure SQL Database and Azure Synapse Analytics.

  • Proficiency in data modeling, database design and Spark SQL query optimization (see the sketch after this list).

  • Familiarity with big data technologies and frameworks like Hadoop, Spark and Hive.

  • Familiarity with data governance and security best practices.

  • Knowledge of data integration patterns and tools.

  • Understanding of cloud computing concepts and distributed computing principles.

  • Excellent problem-solving and analytical skills.

  • Strong communication and collaboration skills to work effectively in an agile team environment.

  • Ability to handle multiple tasks and prioritize work in a fast-paced and dynamic environment.
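
As a small, hedged illustration of the Spark SQL query optimization skills listed above, the snippet below shows partition pruning and a broadcast join hint, assuming PySpark on Databricks; the table and column names are hypothetical.

  # Hedged sketch of two everyday Spark SQL optimizations: partition pruning and
  # a broadcast join hint. Table and column names are hypothetical.
  from pyspark.sql import SparkSession

  spark = SparkSession.builder.getOrCreate()

  # Filtering on the partition column lets Spark prune partitions instead of
  # scanning the whole table (assumes curated.orders is partitioned by order_date).
  recent_orders = spark.sql("""
      SELECT order_id, customer_id, amount
      FROM curated.orders
      WHERE order_date >= '2024-01-01'
  """)

  # Broadcasting the small dimension table avoids a shuffle-heavy sort-merge join.
  enriched = spark.sql("""
      SELECT /*+ BROADCAST(c) */ o.order_id, c.segment, o.amount
      FROM curated.orders o
      JOIN curated.customers c ON o.customer_id = c.customer_id
      WHERE o.order_date >= '2024-01-01'
  """)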

Qualifications

  • Bachelor's degree in Computer Science, Engineering or a related field.

  • 4+ years of proven experience as a Data Engineer, with a focus on designing and building data pipelines.

  • Experience working with large, complex data environments.

  • Certifications in Databricks or Azure services are a plus.

  • Experience with data streaming technologies such as Apache Kafka or Azure Event Hubs is a plus.

Job Function

IT & Solutions

Accommodations

SoftwareOne welcomes applicants from all backgrounds and abilities. If you require reasonable adjustments at any point during the recruitment process, email us at reasonable.accommodations@softwareone.com. Please include the role for which you are applying and your country location. Someone from our organization who is not part of the decision-making process will be in touch to discuss your specific needs, and we will make every effort to accommodate you. Any information shared will be stored securely and treated in the strictest confidence in line with GDPR.

At SoftwareOne, we are committed to providing an environment of mutual respect where equal employment opportunities are available to all applicants and teammates without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws. Additionally, we encourage experienced individuals who have taken an intentional career break and are now prepared to return to work to explore our SOAR program.
