As a Data Engineer, you will play a key role on our Data & Analytics team. You will have the opportunity to propose and drive high-quality engineering practices while building data infrastructure and ETL pipelines at scale. You will provide technical leadership and oversight and set standards in the realm of Data Governance and Data Management. You will collaborate with business leaders, software developers, data scientists, and data analysts to modernize our current data infrastructure.
Duties and Responsibilities
- Develops and maintains data pipelines, including solutions for data collection, transformation, aggregation, management, metadata, and usage, leading to performant and easy-to-use data models.
- Works with the team to document data lineage and the overall data flow architecture to ease consumption of our data assets.
- Works collaboratively as part of an Agile centralized team to help cross-functional analysts, data scientists, and end users access the data they need to make informed business decisions.
- Breaks down work into estimable pieces and delivers accurate and consistent difficulty estimates.
- Participates in the full SDLC for our data engineering solutions: requirements, development, testing, training, and support.
- Partners with cross-functional stakeholders to understand business and technical requirements, plan and execute projects, and communicate status, risks, and issues.
- Analyzes large datasets to identify gaps and inconsistencies, provide data insights, and drive continuous improvement in the quality of our data assets.
- Acts as the in-house data expert and makes recommendations regarding standards for code quality and data pipeline performance.
Qualifications
- Education: Bachelor’s degree in Computer Science or another quantitative field required.
- Experience: 4+ years implementing ETL pipelines and applying data best practices.
- Experience writing infrastructure as code (IaC) with CloudFormation or Terraform.
- Experience with the following tools/languages is preferred: C#, Qlik, Infor Birst, Infor Datalake.
- Experience with or certification in cloud technologies (Azure, AWS, Snowflake) preferred.
- In-depth knowledge of designing, building, and maintaining data transformation pipelines (ETL, ELT).
- Experience building and consuming APIs for data flows (e.g., Flask, FastAPI).
- Proficiency in Python and SQL.
- Experience with data warehousing, relational databases, NoSQL databases, and data lakes.
- Experience with project documentation and Agile methodologies.
- Strong interpersonal and communication skills required.
- Strong oral and visual presentation skills.