Job Information

CAI Data Engineer - Tableau in REMOTE, India

Data Engineer - Tableau

Req number:

R4337

Employment type:

Full time

Worksite flexibility:

Remote

Who we are

CAI is a global technology services firm with over 8,500 associates worldwide and a yearly revenue of $1 billion+. We have over 40 years of excellence in uniting talent and technology to power the possible for our clients, colleagues, and communities. As a privately held company, we have the freedom and focus to do what is right—whatever it takes. Our tailor-made solutions create lasting results across the public and commercial sectors, and we are trailblazers in bringing neurodiversity to the enterprise.

Job Summary

We are looking for a Data Engineer with experience building data lake architectures using AWS and related technologies such as S3, Redshift, AWS Glue, EMR, Databricks, ETL pipelines, SQL, Python, and Lambda, as well as the reporting tool Tableau. If you are looking for your next career move, apply now.

Job Description

We are looking for a Data Engineer. This position will be full-time and remote.

What You’ll Do

  • Design and develop data lakes, and manage data flows that integrate information from various sources into a common data lake platform through an ETL tool.

  • Code and manage delta lake implementations on S3 using technologies like Databricks or Apache Hudi.

  • Triage, debug, and fix technical issues related to data lakes.

  • Design and develop data warehouses for scale.

  • Design and evaluate data models (star, snowflake, and flattened).

  • Design data access patterns for OLTP- and OLAP-based transactions.

  • Coordinate with business and technical teams through all phases of the software development life cycle.

  • Participate in making major technical and architectural decisions.

What You'll Need

  • 1+ years of working knowledge of Tableau.

  • 3+ years of experience with AWS data services like S3, Glue, Lake Formation, EMR, Kinesis, RDS, DMS, and Redshift.

  • 3+ years of experience building data warehouses on Snowflake, Redshift, HANA, Teradata, Exasol, etc.

  • 3+ years of experience building delta lakes using technologies like Apache Hudi or Databricks.

  • 3+ years of experience working with ETL tools and technologies.

  • 3+ years of experience in a programming language (Python, R, Scala, Java).

  • Bachelor's degree in computer science, information technology, data science, data analytics, or a related field.

  • Experience working on Agile projects and with Agile methodology in general.

  • 4+ years of experience operating on AWS Cloud and building data lake architectures.

  • Strong RDBMS and data modeling skills.

  • AWS cloud certification is a big plus.

  • Strong communication skills, both written and spoken.

Physical Demands

  • Sedentary work that involves sitting or remaining stationary most of the time with occasional need to move around the office to attend meetings, etc.

  • Ability to conduct repetitive tasks on a computer, utilizing a mouse, keyboard, and monitor.

Reasonable Accommodation Statement

If you require a reasonable accommodation in completing this application, interviewing, completing any pre-employment testing, or otherwise participating in the employment selection process, please direct your inquiries to application.accommodations@cai.io or (888) 824-8111.