Plato Systems Senior Data Engineer in San Francisco, California

Plato Systems is pioneering the use of spatial AI to boost capacity, productivity, and safety across physical operations, starting with manufacturing. Established in 2019 as a spin-off from Stanford and funded by NEA, Plato has developed an Operations Digital Twin and AI copilot powered by our proprietary hardware, which uses machine perception and sensor fusion at the edge to digitize patterns of activity. Advanced manufacturing companies use our platform to root-cause complex, systemic operational issues rapidly and at scale, bringing Kaizen and compliance tracking into the age of AI. We are deployed across multinational electronics manufacturers, semiconductor fabs, and EMS companies in several countries. You can learn more about us on our website (https://www.plato.systems/).

We are seeking a Senior Data Engineer with over 7 years of relevant experience to join our growing team. The ideal candidate will have deep experience designing, developing, and maintaining multiple ETL pipelines in parallel, will be confident working with time-series data, and will have some prior experience in business analysis or business intelligence functions.

Core Responsibilities & Qualifications:

  • ETL Pipeline Design and Development: Work closely with our Head of Product to create and maintain complex ETL data pipelines using SQL and PySpark, while ensuring data quality. Familiarity with Databricks is preferred.

  • Time-Series Data Processing: Prior experience designing and implementing production pipelines that normalize, aggregate, align, and correlate time-series data from sensors, machines, and processes, as well as orchestrating and monitoring those pipelines.

  • Production Data Product Experience: Demonstrated experience building production data products, including handling data quality issues, orchestration and automation, and testing.

  • Coding Skills: Proficiency in Python, SQL, and Java or other backend languages. Familiarity with cloud platforms like AWS, GCP or Azure is a plus.

  • Communication Requirements: Effective communication skills, both verbal and written, to clearly explain complex data concepts to non-technical team members and stakeholders.

  • Team Collaboration: Proven experience working effectively in cross-functional teams, demonstrating a collaborative mindset.

Preferred Qualifications:

  • Domain Engagement: Demonstrated experience translating business requirements and domain knowledge into data products that address customer needs. Success in this role hinges on grasping the nuances and intricacies of our domain in order to deliver relevant and impactful data solutions.

  • Autonomy & attention to detail: Derived tables that feed dashboards and analytics are only useful if they contain clean high quality data. This requires good judgment to find the right balance of unit tests, integration tests, regression tests, and monitoring / alerting to ensure data flow and its quality.

  • BI, Data Visualization & Reporting: Prior experience creating and maintaining automated reporting, dashboards, and consistent analysis to bolster data-driven decisions.

  • Versatility of Prior Work: Experience working across different industries or domains, and across different stages of maturity (from early-stage build-out to mature pipelines and processes), demonstrating flexibility and the ability to adapt to new types of data and business domains.
