Data Platform Engineer

REMOTE, US

Job Id: 157987
Job Location: REMOTE, US
Security Clearance: No Clearance
Business Unit: Piper Companies
Division: Piper Enterprise Solutions
Position Owner: Kiersten Schaefer

Piper Companies is seeking a Data Platform Engineer to join a growing technology organization on a long-term contract with strong potential for conversion. This is a fully remote, US-based role focused on building and operating the core data platform that powers large-scale integrations, analytics, and event-driven services.


Responsibilities

  • Own and evolve multi-stage data pipelines that ingest data from external sources (APIs, direct database queries, and file-based feeds) into a centralized data lake and warehouse.
  • Build and maintain event-driven services and ETL workflows that enrich, deduplicate, and validate data at scale.
  • Design and enforce data contracts, schemas, and cutover strategies to support safe backfills, corrections, and parity with legacy systems.
  • Implement observability across pipelines and jobs, including monitoring, alerting, lineage, and operational runbooks.
  • Operate and secure cloud-based data infrastructure on AWS using infrastructure-as-code practices.
  • Optimize data models and SQL performance for large-scale analytical workloads.
  • Partner with engineering and product teams to support downstream integrations, APIs, and user-facing data access.
  • Document architecture, standards, and workflows to support a growing and collaborative platform team.

Qualifications

  • 5+ years of experience in backend and/or data engineering roles with ownership of production data platforms.
  • Strong hands-on experience building event-driven or batch data pipelines using Python, Node.js, or similar technologies.
  • Deep experience operating AWS-based systems and provisioning infrastructure using Terraform.
  • Advanced SQL skills with experience tuning performance in MPP data stores such as Redshift, Snowflake, or equivalent.
  • Proven experience delivering reliable, idempotent data pipelines and validating outputs against legacy systems.
  • Experience working in regulated or highly governed environments (e.g., healthcare, financial services).
  • Strong understanding of security, compliance, and audit-readiness best practices (e.g., SOC 2, ISO 27001).
  • Comfortable owning systems end-to-end and troubleshooting complex production issues.

Preferred Experience

  • Workflow orchestration tools such as Airflow or similar.
  • Streaming and messaging technologies (Kafka, Kinesis, SQS).
  • Data modeling and transformation tools (e.g., dbt).
  • Data quality, metadata, and lineage tooling.
  • Exposure to real-time enrichment or rules-based processing systems.

Compensation

  • Hourly rate: $80 - $95 per hour, depending on experience (some flexibility possible)
  • Comprehensive benefits: medical, dental, vision, 401(k), and sick leave where required by law

This position opens for applications on 1/15/25, and applications will be accepted for at least 30 days from the posting date.


#LI-KS1

#LI-REMOTE

