Easter Seals Jobs

Job Information

Onix Networking Corp. Technical Lead - Big Lake (GCS) in Lakewood, Ohio

About Onix: Onix is a trusted cloud consulting company that helps companies get the most out of their technology with cloud-powered solutions, best-in-class services, and the Datametica Birds, data migration products that unleash AI potential. We deliver exceptional results for our customers because of our 20+ years of cloud experience, depth of technology expertise, and IP-driven data and AI solutions. We offer solutions across a wide range of use cases and industries, tailored to the unique needs of each customer. From advanced cloud security solutions to innovative AI capabilities and data migration products, we have you covered. Our global team of experts is the most reliable, talented, and knowledgeable in the industry.

Summary: Onix seeks an experienced GCP Technical Lead (Data Platform) with a strong background in managing and optimizing data platforms. The ideal candidate will have primary expertise in PostgreSQL, Starburst/Trino, DBT, Dataform, Big Lake (GCS), BigQuery, Cloud Observability, and GCP services, along with hands-on experience with data technologies. Additionally, they should possess secondary skills in Spark, DataProc, DataFlow, Python, and Airflow, and have experience in data and database migration.

Role: Technical Lead - Big Lake (GCS)

Location: Remote

Primary Responsibilities:

Provide technical leadership and guidance to the data platform team.
Oversee the design, implementation, and maintenance of scalable and high-performance data platforms.
Analyze source data solutions such as PostgreSQL and Starburst/Trino; architect, optimize, and migrate them to BigQuery and other GCP services.
Ensure data models and architectures meet business requirements and best practices.
Lead data migration projects, including: migration from PostgreSQL to BigQuery or Cloud SQL; migration of workloads from Spark to DataProc; migration of historical data from AWS to GCP.
Develop and implement migration strategies, ensuring data integrity, security, and minimal downtime.
Design and implement robust ETL/ELT processes using DBT, Dataform, and other tools.
Ensure data pipelines are efficient, maintainable, and meet performance requirements.
Manage and optimize Big Lake (GCS) and BigQuery environments.
Implement best practices for data storage, partitioning, and querying to ensure cost-effectiveness and performance.
Utilize GCP services to enhance data platform capabilities.
Implement and manage cloud observability tools to monitor data pipelines and infrastructure health.
Use Spark, DataProc, DataFlow, and Python for complex data processing tasks.
Implement orchestration and scheduling of data workflows using Airflow.
Work closely with data engineers, data scientists, and other stakeholders to meet data requirements.
Mentor team members and provide technical training and guidance.

Minimum qualifications:

8+ years of experience in data engineering, data architecture, and technical leadership.
5+ years of overall experience in architecting, developing, testing, and implementing Data Platform projects using GCP components (e.g., BigQuery, Dataflow, Dataproc, DLP, BigTable, Pub/Sub, Composer, etc.).
Good understanding of data structures.
Experience working with large datasets and solving difficult analytical problems.
Experience working with Git for source code management.
Experience working with structured and unstructured data.
End-to-end data engineering and lifecycle management (including non-functional requirements and operations).
Experience working with client teams to design and implement modern, scalable data solutions using a range of new and emerging technologies from Google Cloud Platform.
Experience automating manual processes to speed up delivery.
Good understanding of data pipelines (batch and streaming) and data governance.
Experience in code deployment from lower environments.

DirectEmployers