Staff DataOps Engineer
San Diego, California
What you will do:
Automate. Ramp up DataOps for the development and operation of data pipelines that support multiple teams’ descriptive business analytics. Manage dynamic configurations, CI/CD for data pipelines, cloud data environments, and repositories of versioned code. Use emerging industry standards to create data pipeline infrastructure as code, automate testing and deployments, and employ monitoring tools, thereby managing the lifecycle of data pipelines and the target data sets needed by business analysis teams in Manufacturing Operations and Supply Chain. Maintain, enhance, and deploy development and test environments for the core data engineering team and distributed business analytics teams. Oversee the creation and use of early, automated detection and monitoring of data errors that threaten data pipeline operations.
What you need for this position:
You know DataOps, and you are eager to drive the rigor needed for rapid, automated tests and deployments of core data pipeline code and data sets, because you also own smooth data pipeline operations in production.
Primary Duties and Responsibilities:
• Support data engineers who build data pipelines from a variety of OLTP systems, using ecosystems such as Google Cloud, streaming services such as Confluent (Kafka), and DBT for transformation into cloud platforms such as Snowflake, and who in turn support business analysts who present the data in Tableau or Looker.
• Use Terraform to own and automate, as infrastructure as code, the provisioning and lifecycle of development and test environments, data sets, data pipeline tooling, and cloud data warehouse clusters.
• Facilitate and mentor others on GitHub code branching, pull requests, and merges/commits.
• Deploy sophisticated CI/CD pipelines.
• Administer role-based security in the cloud data platform through Active Directory groups fed by Okta SSO.
• Contribute to building, then own, data pipeline monitoring/observability for early detection of threats to data pipelines.
• Work with data engineers to automatically test, dynamically configure, orchestrate, and deploy their data pipelines.
• Work cross-functionally with business analysts (SQL / DBT / Python) to foster DataOps discipline, while seeking out opportunities for versioned code control over deployable Tableau shared data sources and workbooks.
• Embed into development projects and assist teams in delivering new features, services, and consumer experiences through production with speed and quality.
• Cultivate alignment across teams: development, system administrators, InfoSec, compliance (FDA regulated), and change management for continuous improvement in the quality and velocity of releases and operational service levels.
• Build and maintain CI/CD orchestration using GitHub, Jenkins, Terraform, and Python.
• Ensure work is performed in compliance with company policies including HIPAA and other regulatory requirements
• Other duties as assigned.
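The monitoring and automated-testing duties above can be illustrated with a minimal data-quality gate for a pipeline batch. The check names, fields, and thresholds below are hypothetical assumptions for illustration, not part of the role description:

```python
# Illustrative sketch of an early-detection data-quality gate for a pipeline
# stage. Field names and thresholds are hypothetical examples.

def run_quality_checks(rows, required_fields, min_rows=1, max_null_rate=0.05):
    """Return a list of human-readable failures; an empty list means the batch passes."""
    failures = []
    if len(rows) < min_rows:
        failures.append(f"row count {len(rows)} below minimum {min_rows}")
        return failures
    for field in required_fields:
        nulls = sum(1 for r in rows if r.get(field) is None)
        null_rate = nulls / len(rows)
        if null_rate > max_null_rate:
            failures.append(
                f"field '{field}' null rate {null_rate:.1%} exceeds {max_null_rate:.0%}"
            )
    return failures

batch = [
    {"order_id": 1, "qty": 10},
    {"order_id": 2, "qty": None},
    {"order_id": 3, "qty": 7},
]
print(run_quality_checks(batch, required_fields=["order_id", "qty"]))
```

In practice a check like this would run as a CI/CD or orchestration step and fail the deployment or page the on-call engineer before bad data reaches downstream analytics.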
Required Experience and Qualifications:
• 5+ years of IT operations or DevOps experience provisioning infrastructure, managing configurations, implementing CI/CD pipelines for applications and infrastructure, and implementing testing and monitoring tools
• Experience creating and managing CI/CD orchestration
• Expertise with GitHub, DBT, Jenkins, Terraform, and Python
• Strong customer-service orientation.
• Proficient in communicating with both technical and management audiences
• Enthusiastic support of win-win relationships across multidisciplinary teams, so that helpful information sharing becomes a routine expectation
• Proactive in implementing change initiatives and providing guidance to others in meeting changing needs.
• Strong ability to coordinate with people across teams.
• Experience with a leading cloud data warehouse such as Snowflake, and with leading data replication tools.
• Productive skill level with ANSI SQL
• Knowledge of and experience with FDA regulations, HIPAA, and the ISA-95 industry standard
• Bonus: Exposure to machine learning operations or quantitative analysis methods, such as statistical process control, for process optimization via monitoring
• Optional: Tableau for data visualization
• Wide-ranging experience resolving complex issues in creative and effective ways.
• Success with complex issues where analysis of situations or data requires in-depth evaluation of variable factors.
• Confidence in establishing processes on new assignments and coordinating with others.
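The statistical process control method mentioned above can be sketched as a 3-sigma control-limit check on a pipeline metric such as daily row counts. The baseline data and function names here are illustrative assumptions:

```python
# Hypothetical sketch of a statistical-process-control style monitor:
# flag observations outside mean +/- k * sigma of a baseline window.
# The metric values below are made-up illustrative data.
from statistics import mean, pstdev

def out_of_control(baseline, observations, k=3.0):
    """Return (index, value) pairs that fall outside the k-sigma control limits."""
    mu = mean(baseline)
    sigma = pstdev(baseline)
    lower, upper = mu - k * sigma, mu + k * sigma
    return [(i, x) for i, x in enumerate(observations) if not (lower <= x <= upper)]

baseline = [100, 102, 98, 101, 99, 100, 103, 97]  # e.g., daily row counts
today = [101, 99, 250]                            # 250 is an anomaly
print(out_of_control(baseline, today))            # → [(2, 250)]
```

Wired into pipeline observability, an out-of-control signal like this can trigger an alert before a data error propagates into downstream dashboards.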
Minimum level of education / certifications:
• BS or MS in computer science, analytics, or equivalent combination of education and applicable job experience.
• Equivalent education includes technical school training and certifications in networks and servers.