Our client is looking for an experienced and mission-driven Senior Principal to join their Platform Data Engineering team. The role reports directly to the VP of Enterprise Architecture & Data Engineering and joins the company’s transformative IT team. You will be responsible for developing and mentoring a highly motivated group of data engineers covering multiple disciplines, refining the company’s transformative systems to meet its customers’ needs, and contributing to the strategic direction of the business.
Essential Duties & Responsibilities
- Architect, develop, and implement Data Engineering strategies to support organizational initiatives.
- Lead the architecture function of the Data Engineering (Pipeline) team.
- Provide leadership and mentorship to Data Engineering staff.
- Lead teams of data and analytics engineers that build, manage, and support the enterprise data warehouse and analytics technology infrastructure, tools, and products.
- Inform and influence the overall Enterprise Data technology vision.
- Accelerate building and improving capabilities centered around data and analytics, such as foundational data assets, data engineering pipelines, data platforms, and products.
- Build cross-functional relationships with Business product owners, data scientists, and analysts to understand product needs and delivery expectations.
- Oversee compliance reviews, partnering with Information Security and Privacy teams on policy content, creation, maintenance, and awareness to ensure customer data is secure and protected from unauthorized access.
- Stay informed on strategic & emerging trends in the Data and Analytics world, lead proofs of concept, and continuously drive innovation across the data and analytics teams.
Education & Experience
- 12+ years of tech industry experience, including prior experience architecting and implementing data strategies
- 3+ years of architecture experience
- Data warehousing experience with SQL Server, Oracle, Redshift, Teradata, etc.
- Experience with Big Data technologies (NoSQL databases, Hadoop, Hive, HBase, Pig, Spark, Elasticsearch, Databricks, etc.)
- Experience with BigQuery and Snowflake
- Experience with Confluent Kafka for real-time data processing, and with API platforms
- Experience using Python, Java, and/or other data engineering languages
- Experience with a breadth of data integration tools, databases, Big Data platforms, cloud-based data platforms, and data streaming technologies, including the Azure and GCP cloud platforms
- Strong understanding of data architecture and, most importantly, a deep understanding of enterprise-wide data and the ability to interpret it in the context of business goals
- Expert/advanced-level experience with ETL technologies and data streaming, using integration patterns built on Kafka and Spark Streaming
- Experience architecting data pipelines and solutions for streaming and batch integrations using tools/frameworks like dbt, Talend, Azure Data Factory, Spark, Spark Streaming, etc.
- Experience building a modern, next-generation data warehouse platform from the ground up