
Burtch Works
Senior Data Engineer needed to support a leading industry consultancy. This is an internal opportunity to serve as an SME for the design and optimization of data platforms, developing strategic roadmaps and translating requirements for customers and stakeholders. Remote with some travel. This role does not offer sponsorship.
Responsibilities:
- Develop strategic roadmaps and project plans based on customer goals
- Develop customer solutions for data privacy and security
- Design end-to-end modern data platforms for analytics and AI use cases across multiple clients simultaneously
- Design data ingestion, data storage, data modeling, data virtualization, self-service data preparation, and analytics pipelines
- Create workload orchestration using common tools such as Jenkins, Airflow, and MLflow
- Document and present recommendations to customers, including senior leadership
- Work with Big Data Engineers to translate requirements into level-of-effort (LOE) estimates, design documents, and project plans
- Implement customer roadmaps in coordination with Big Data Engineers
- Be the technical liaison between customers and engineering teams; communicate complex technical concepts in easy-to-understand non-technical language
- Work with Project Managers to set customer expectations, drive alignment, and coordinate timelines
- Support pre-sales engineers in proposal design and positioning
- Mentor Big Data Engineers and more junior consultants
- Support Data Science & Engineering process improvement initiatives
- Support Data Science & Engineering recruiting efforts by participating on interview panels
Requirements:
- 5+ years of experience architecting solutions for the optimal extraction, transformation, and loading of data from a wide variety of traditional and non-traditional sources (structured, semi-structured, and unstructured), using SQL, NoSQL, and data pipelines for real-time, streaming, batch, and on-demand workloads
- 5+ years of experience designing and implementing creative data solutions leveraging the latest Big Data frameworks
- 3+ years of experience with analytics/data management strategy formulation, architectural blueprinting, and business case development
- Strong SQL, database, data modeling, data warehousing, and development skills
- Advanced experience with AWS, GCP, or Azure
- Experience with industry-standard dashboarding and reporting tools (Tableau, Qlik, etc.)
- Experience leading teams, training, and mentoring more junior team members
- Experience implementing data security, encryption, PII/PSI compliance, and identity and access management across sources and environments
Keywords: Python, Data Engineer, pipelines, GCP, AWS, Azure
To apply for this job, please visit www.linkedin.com.