Piper Companies is currently seeking a Data Engineer Technical Lead to lead the team responsible for data and data-related products, drawing on a big data engineering background to collaborate with product owners on prioritization and implementation. This position owns the big data architectural vision and coordinates team delivery. You will act as a hands-on leader for the team as you work together to collect, store, process, and analyze huge data sets using cutting-edge AWS technologies.
Duties and Responsibilities
- Lead the team that transforms big data into the fuel that drives Major Life Purchase products
- Design, develop, and maintain structures, pipelines, and transformations for Data Lake and other key components that comprise our data platform.
- Mentor teammates in best practices for big data engineering, growing the team’s skills and performance over time
- Collaborate with data scientists to implement algorithms and models supporting identity graph and event classification processes
- Coordinate team activity to develop and maintain a high delivery velocity
- Work with product owners to develop both the roadmap for data-oriented products and the short-term goals for the team
- Optimize queries and transformations for fast data retrieval, minimizing processing time on large data sets.
- Triage and analyze data defects, propose solutions to rectify issues, and proactively research potential data issues to ensure data quality.
Who You Are
- You are clever, self-motivated, and not happy unless you are getting things done and seeing how your efforts directly impact your customers.
- You strongly value ownership of the things that you and your team build and communication both within and outside of your immediate team.
- You are a continuous learner and embrace challenging yourself technically to learn new tools, technologies, and processes to deliver the best possible solutions.
- You excel in a collaborative environment where your co-workers rely on your contributions to help guide the team to success.
- You have experience working with Agile/Lean approaches and understand the difference between being Agile/Lean and doing Agile/Lean.
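As a flavor of the identity-graph work mentioned in the duties above, here is a minimal, hypothetical sketch: resolving records that share any identifier (email, device ID, cookie) into a single identity using union-find. The production system would of course run at scale on Spark; this is only an illustration of the underlying idea.

```python
from collections import defaultdict

class IdentityGraph:
    """Union-find over record identifiers: records that share any
    identifier are merged into one identity cluster."""

    def __init__(self):
        self.parent = {}

    def _find(self, x):
        # Find the cluster root for x, compressing the path as we go.
        self.parent.setdefault(x, x)
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x

    def link(self, a, b):
        # Record that identifiers a and b were seen together.
        ra, rb = self._find(a), self._find(b)
        if ra != rb:
            self.parent[rb] = ra

    def clusters(self):
        # Group every known identifier by its cluster root.
        out = defaultdict(set)
        for x in list(self.parent):
            out[self._find(x)].add(x)
        return list(out.values())

# Hypothetical observations of co-occurring identifiers:
g = IdentityGraph()
g.link("email:a@x.com", "device:123")
g.link("device:123", "cookie:abc")
g.link("email:b@y.com", "device:999")

print(len(g.clusters()))  # 2 distinct identities
```

The same merge logic maps naturally onto a distributed connected-components job once the pairwise links live in a data lake table.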
Requirements
- Five or more years of data engineering experience
- Three or more years of big data engineering experience
- Demonstrable big data architectural experience including proposing and designing scalable and maintainable data models
- Experience leading a team (full-time or contract)
- Proficiency building data processing systems with a programmatic approach using Python, Scala, and/or Java (Python preferred).
- Experience building data processing systems with Apache Spark, particularly PySpark.
- Experience using AWS services related to data processing, such as EMR.
- Experience using query tools such as Athena, Redshift, and Hive.
- Bachelor’s degree in Computer Science/Engineering preferred.
Nice to Haves
- Demonstrable experience utilizing Agile (Scrum/Kanban) practices.
- Experience developing AdTech or MarTech applications and solutions.
- Experience with developing Machine Learning applications.
- Experience with version control tools such as GitLab.
- Experience with handling streaming data sets with tools such as AWS Kinesis.
- Experience with automating ETL pipelines with AWS Glue, AWS Lambda, and AWS Step Functions.
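To give a sense of the streaming work listed above, here is a minimal pure-Python sketch of tumbling-window event counting. A real pipeline would consume from AWS Kinesis; the function name and the sample events are hypothetical stand-ins for that aggregation pattern.

```python
from collections import Counter

def tumbling_window_counts(events, window_secs):
    """Bucket (timestamp, event_type) pairs into fixed-size windows
    and count event types per window -- a toy stand-in for the kind
    of aggregation a streaming consumer performs."""
    windows = {}
    for ts, event_type in events:
        bucket = ts - (ts % window_secs)  # start of the window containing ts
        windows.setdefault(bucket, Counter())[event_type] += 1
    return windows

# Hypothetical event stream: (timestamp in seconds, event type)
events = [
    (0, "click"), (3, "view"), (9, "click"),   # window [0, 10)
    (12, "click"), (17, "view"),               # window [10, 20)
]
result = tumbling_window_counts(events, 10)
print(result[0]["click"])  # 2
```

Swapping the in-memory dict for checkpointed state is what turns this toy into the kind of fault-tolerant consumer a Kinesis-backed pipeline needs.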