Job Title : Remote GCP Data Engineer


Company : Insight Global


Location : Edmundson, MO


Created : 2026-04-04


Job Type : Full Time


Job Description

- Design, build, and optimize BigQuery datasets and SQL models
- Develop and maintain batch and streaming pipelines using Dataflow/Beam
- Orchestrate workflows in Airflow/Cloud Composer
- Implement scalable ETL/ELT pipelines with incremental and CDC patterns
- Tune performance and manage query/storage costs
- Ensure data quality, schema evolution, and lineage tracking
- Collaborate with analytics, engineering, and business teams
- Secure sensitive data using best practices for compliance
- Monitor pipelines, troubleshoot failures, and improve reliability
- Contribute to code reviews, documentation, and platform standards

We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances.

If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request. To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy.

Skills and Requirements

- 5+ years of data engineering experience, including 2+ years on Google Cloud Platform
- Expert BigQuery skills:
  - Advanced SQL (CTEs, window functions, complex joins)
  - Partitioning, clustering, and query/cost optimization (see the BigQuery sketch below)
  - Materialized & authorized views
  - Solid understanding of BigQuery architecture (slots, shuffles, distributed execution)
- Hands-on experience with Dataflow & Apache Beam (Python or Java):
  - Batch & streaming pipelines (see the Beam sketch below)
  - Performance tuning, monitoring, and error handling
- Strong Cloud Composer / Airflow experience:
  - DAG development, operators, orchestration, and troubleshooting (see the Airflow sketch below)
- Proven ability to build production-grade ETL/ELT pipelines at terabyte scale
- Expert SQL and a strong understanding of data warehousing concepts
- Strong Python for data pipelines and transformations
- Experience with relational databases (Postgres, MySQL, SQL Server)
- Data security fundamentals:
  - Row-level security (see the row access policy sketch below)
  - PII/PHI handling
  - Audit logging and access controls
- Git-based version control and basic shell scripting
- Google Cloud Professional Data Engineer certification
- Healthcare data experience (clinical or administrative)
- dbt for analytics engineering
- Infrastructure as Code (Terraform)
- DevOps / CI-CD experience for data pipelines
- Experience with:
  - Cloud Spanner
  - Bigtable / Firestore
  - Cloud DLP API
- Knowledge of data mesh / data fabric architectures
- Data visualization tools (Looker, Tableau, Power BI)
- ML workflows on GCP (Vertex AI)
- Docker & Kubernetes (GKE)
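
By way of illustration, a minimal sketch of the BigQuery patterns called out above: a partitioned and clustered table, a window-function dedupe, and an incremental CDC-style MERGE, driven from the google-cloud-bigquery Python client. All project, dataset, table, and column names are hypothetical placeholders, not details from this posting.

    from google.cloud import bigquery

    client = bigquery.Client(project="my-gcp-project")  # hypothetical project id

    # Partitioning by event date and clustering by user_id lets BigQuery prune
    # partitions and blocks, which is the main lever for query/cost optimization.
    ddl = """
    CREATE TABLE IF NOT EXISTS analytics.events (
      event_id STRING,
      user_id  STRING,
      event_ts TIMESTAMP,
      status   STRING
    )
    PARTITION BY DATE(event_ts)
    CLUSTER BY user_id
    """

    # Incremental CDC pattern: dedupe the change feed with a window function,
    # then MERGE it into the target (update matched rows, insert new ones).
    merge_sql = """
    MERGE analytics.events AS t
    USING (
      SELECT * EXCEPT (rn)
      FROM (
        SELECT *,
               ROW_NUMBER() OVER (PARTITION BY event_id
                                  ORDER BY event_ts DESC) AS rn
        FROM staging.events_delta
      )
      WHERE rn = 1
    ) AS s
    ON t.event_id = s.event_id
    WHEN MATCHED THEN UPDATE SET
      user_id = s.user_id, event_ts = s.event_ts, status = s.status
    WHEN NOT MATCHED THEN INSERT (event_id, user_id, event_ts, status)
      VALUES (s.event_id, s.user_id, s.event_ts, s.status)
    """

    for sql in (ddl, merge_sql):
        client.query(sql).result()  # .result() blocks until the job finishes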
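
Likewise, a minimal batch Apache Beam pipeline sketch (a streaming variant would swap in an unbounded source such as Pub/Sub). The bucket paths are hypothetical; it runs locally on the DirectRunner by default.

    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    def parse(line):
        # Each input line is assumed to be one JSON record.
        return json.loads(line)

    # Pass --runner=DataflowRunner (plus project/region/temp_location options)
    # to execute the same pipeline on Dataflow.
    with beam.Pipeline(options=PipelineOptions()) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText("gs://my-bucket/raw/events-*.json")
            | "Parse" >> beam.Map(parse)
            | "KeepActive" >> beam.Filter(lambda r: r.get("status") == "active")
            | "Serialize" >> beam.Map(json.dumps)
            | "Write" >> beam.io.WriteToText("gs://my-bucket/clean/events")
        )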
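
A minimal Cloud Composer / Airflow 2.x DAG sketch. The dag_id, callables, and schedule are hypothetical; a production pipeline would typically use provider operators (for example, the BigQuery operators) rather than PythonOperator stubs.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract(**context):
        print("extract raw data for", context["ds"])  # ds = logical run date

    def load(**context):
        print("load transformed data for", context["ds"])

    with DAG(
        dag_id="daily_events_pipeline",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",  # Airflow 2.4+; older releases use schedule_interval
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)
        extract_task >> load_task  # simple linear dependency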
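
Finally, a minimal sketch of BigQuery row-level security using a row access policy; the table, filter column, and group are hypothetical placeholders.

    from google.cloud import bigquery

    client = bigquery.Client(project="my-gcp-project")  # hypothetical project id

    # A row access policy filters rows per reader; here, members of a
    # (hypothetical) analyst group only see rows whose region is "US".
    policy_sql = """
    CREATE OR REPLACE ROW ACCESS POLICY us_analysts_only
    ON analytics.claims
    GRANT TO ("group:analysts@example.com")
    FILTER USING (region = "US")
    """
    client.query(policy_sql).result()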