Job Information

AT&T Azure Data Factory (ADF) / Synapse Developer in Bengaluru, India

Job Description:

Roles & Responsibilities:

  • Understand business requirements and actively provide inputs from a data perspective.

  • Understand the underlying data and how it flows through the system.

  • Build simple to complex pipelines & dataflows.

  • Implement modules that incorporate security and authorization frameworks.

  • Recognize and adapt to the changes in processes as the project evolves in size and function.

  • Take ownership of the data integration pipeline.

  • Establish data integration standards and implement them.

  • Build dataflows and workflows with job failover designed in.

  • Build reusable assets and framework components.

Knowledge, Skills & Abilities:

  • Expert-level knowledge of Azure Data Factory.

  • Advanced knowledge of Azure SQL Database, Synapse Analytics, Power BI, T-SQL, Logic Apps, and Function Apps.

  • Should be able to analyze and understand complex data.

  • Monitoring day-to-day Data Factory pipeline activity.

  • Designing, configuring, and managing pipelines to orchestrate data workflows.

  • Implementing different types of activities such as Copy Activity, Data Flow, Databricks Activity, and Control Flow activities.

  • Connecting to and integrating on-premises data sources using Self-hosted Integration Runtime.

  • Setting up and managing triggers (Schedule, Event, Manual) to automate pipeline executions.

  • Configuring linked services to connect to various data stores and defining datasets for data structures.

  • Knowledge of Azure Data Lake is required; experience with Azure services such as Analysis Services, SQL Database, Azure DevOps, and CI/CD is a must.

  • Knowledge of master data management, data warehousing and business intelligence architecture.

  • Experience in data modeling and database design with excellent knowledge of SQL Server best practices.

  • Excellent interpersonal/communication skills (both oral/written) with the ability to communicate at various levels with clarity & precision.

  • Should have a clear understanding of the data warehouse (DW) lifecycle and contribute to preparing design documents, unit test plans, and code review reports.

  • Experience working in an Agile environment (Scrum, Lean, Kanban) is a plus.

  • Working knowledge of big data technologies: Spark, NoSQL, Azure Databricks, Python, Snowflake, Jupyter Notebooks, and R programming.

  • Knowledge of various file systems and the ability to recommend one based on the design.

  • Experience with MPP design and the ability to recommend designs for optimal cluster utilization.

  • Expert in Python and PySpark.

Qualifications & Experience:

  • Bachelor's or master's degree in computer science or a related field.

  • 6-10 years of data engineering or software development experience.

#SoftwareEngineering

Weekly Hours:

40

Time Type:

Regular

Location:

Bangalore, Karnataka, India

It is the policy of AT&T to provide equal employment opportunity (EEO) to all persons regardless of age, color, national origin, citizenship status, physical or mental disability, race, religion, creed, gender, sex, sexual orientation, gender identity and/or expression, genetic information, marital status, status with regard to public assistance, veteran status, or any other characteristic protected by federal, state or local law. In addition, AT&T will provide reasonable accommodations for qualified individuals with disabilities.

AT&T will consider for employment qualified applicants in a manner consistent with the requirements of federal, state and local laws.

We expect employees to be honest, trustworthy, and operate with integrity. Discrimination and all unlawful harassment (including sexual harassment) in employment are not tolerated. We encourage success based on our individual merits and abilities without regard to race, color, religion, national origin, gender, sexual orientation, gender identity, age, disability, marital status, citizenship status, military status, protected veteran status or employment status.
