The Data Science team is looking for a Senior Data Engineer to help us take our platform to the next level. You can do so by improving data ingestion and standardization within our development environment, ensuring the Data Scientists have predictable, reliable, scalable data available for our models. Moving forward, you'll also have a significant opportunity to help define our deployment strategy, ensuring what we build makes a direct business impact in production.
The Senior Data Engineer will help build, scale, and maintain the entire data science platform. The ideal candidate will have deep technical understanding and hands-on experience in distributed computing, big data, ETL, dimensional modeling, columnar databases, and data visualization. The candidate should thrive on challenges and love working hands-on with recent technologies. This role plays a key part in elevating the data science function at Auth0 by building reliable pipelines and infrastructure that help the team improve modeling efficiency. You should be passionate about continuous learning, experimenting with, applying, and contributing to cutting-edge open source data technologies and software paradigms.
What You’ll Do:
- Create efficiencies for Data Scientists in the team by building reliable, standardized data ingestion pipelines
- Help reduce data scientist stress by intercepting and alerting on upstream schema changes that would have a material impact on our model architectures
- Collaborate with software engineers and architects to ensure our team gains and retains access to the upstream data we need to build effective models.
- Feel comfortable working in a distributed team environment (think remote, lots of Zoom + Slack), interacting asynchronously with multiple groups on a daily basis.
- Become an Auth0 product power user to understand what motivates our team's priorities.
- Document your pipeline architectures regularly to help the rest of the team understand data lineage.
- Advocate for the Data Science team’s needs in conversation with other Data Engineers throughout the organization
What You’ll Need:
- BA/BS in Computer Science, a related technical field, or equivalent practical experience.
- 5+ years of experience in a professional data engineering role
- Experience with Airflow, Luigi or similar tools for batch execution/orchestration
- Experience with Snowflake or a similar cloud-hosted database
- Familiarity with BI platforms such as Tableau/Looker
Bonus points for:
- Familiarity with Data Science & Machine Learning concepts
- Hands-on experience with Spark (or a similar distributed data processing tool) in production environments
- Experience designing and implementing efficient data warehouse architectures (e.g., building systems to consume data from log or event streams)