Data Engineer II

Remote

Job Id: 160341
Job Category:
Job Location: Remote
Security Clearance: No Clearance
Business Unit: Piper Companies
Division: Piper Enterprise Solutions
Position Owner: Connor Gordon

Piper Companies is seeking a Data Engineer II to support a key federal healthcare initiative under the HQRII CMS contract. This role offers the opportunity to work with a highly collaborative Agile team to design and optimize modern data pipelines, backend systems, and scalable data solutions supporting mission-critical federal health programs.

As a Data Engineer II, you will contribute to building efficient data architectures, optimizing Spark workloads, and ensuring the reliability of backend data services.


Responsibilities

  • Build and maintain PySpark data pipelines within the Databricks platform.
  • Optimize Spark job performance by addressing bottlenecks and improving resource utilization across distributed systems.
  • Design, develop, and maintain backend data components and services that support large-scale data processing.
  • Conduct research and develop proof-of-concepts for new tools, frameworks, and solutions in the data engineering ecosystem.
  • Write clean, maintainable, and scalable code following best practices and coding standards.
  • Perform code reviews to ensure quality, consistency, and reliability across the team.
  • Debug and troubleshoot backend data issues, proactively identifying potential system risks.
  • Develop and maintain thorough technical documentation.

Team & Delivery Responsibilities

  • Actively participate in Agile ceremonies (standups, sprint planning, retrospectives).
  • Support estimation, task breakdown, and prioritization to meet delivery timelines.
  • Stay current with emerging technologies and share insights with the engineering team.

Qualifications

Education

  • Bachelor’s Degree in Computer Science, Computer Engineering, or related technical discipline.

Required Experience

  • 5+ years of experience in data engineering, backend engineering, or similar roles.

Required Technical Skills

  • Strong experience with Python and Apache Spark.
  • Working knowledge of R.
  • Solid understanding of data modeling, ETL processes, and distributed computing architectures.
  • Strong foundation in software engineering fundamentals, including data structures, algorithms, and design patterns.

Required Skills & Abilities

  • Experience working within Agile/Scrum development teams.
  • Ability to work independently and collaboratively.
  • Strong analytical thinking and problem-solving skills.
  • Excellent written and verbal communication abilities.

Preferred Qualifications

  • Experience with AWS services (S3, EC2, Glue, Lambda, etc.).
  • Additional hands-on experience with R.
  • Databricks or Apache Spark professional certifications.
  • Experience with SAS.

Compensation

  • $100,000-$115,000


Keywords:

#DataEngineer, #DataEngineerJobs, #DataEngineering, #PySpark, #ApacheSpark, #Databricks, #PythonDeveloper, #BigData, #ETL, #ETLPipelines, #DataPipelines, #DistributedComputing, #DataModeling, #BackendEngineering, #SoftwareEngineering, #Agile, #AWS, #AWSS3, #AWSEC2, #AWSGlue, #AWSLambda, #SparkOptimization, #RProgramming, #SAS, #CloudEngineering, #CloudData, #HealthcareIT, #CMS, #PiperCompanies, #TechJobs, #Hiring, #NowHiring, #LI-CG1, #REMOTE
