We are looking for a Senior Data Engineer to join our team in Business Technology as part of a core, centralized business intelligence function serving the entire organization. In this role, you will be responsible for designing and developing scalable solutions for a large data infrastructure in a fast-paced, Agile environment. You will participate in detailed technical design, development, and implementation of applications using cutting-edge technology stacks.
Our main focus is on building platforms and capabilities that are utilized across the organization by sales, marketing, engineering, finance, product, and operations. The ideal candidate will have a strong engineering background with the ability to tie engineering initiatives to business impact.
Job Duties and Responsibilities:
- Design scalable and reliable data pipelines to consume, integrate, and analyze large volumes of complex data from different sources to support the growing needs of our business
- Build a data access layer to provide data services to internal and external stakeholders
- Analyze a variety of data sources, structures, and metadata, and develop mappings, transformation rules, aggregations, and ETL specifications
- Proactively develop architectural patterns to improve efficiency
- Interface with stakeholders to gather requirements and build functionality
- Support and enhance existing data infrastructure
- Build data expertise and own data quality for areas of ownership
- Experiment with different tools and technologies, and share learnings with the team
- Contribute to the evaluation of new technologies such as Docker, AWS Lambda, and ECS
Required Knowledge, Skills, and Abilities:
- BS in Computer Science, Engineering, or another quantitative field of study
- 5+ years in a data engineering role
- 3+ years in the data warehouse and data lake space
- Experience with building distributed systems
- Expertise in a programming language (preferably Python or Java)
- 5+ years of experience working with SQL, with strong knowledge of its capabilities and best practices
- 3+ years of experience with ETL and workflow orchestration tools such as Airflow, Oozie, Luigi, or Informatica
- 3+ years of experience with relational databases and columnar MPP databases such as Snowflake, Athena, or Redshift
- 3+ years of experience with database and application performance tuning
- Experience with CI/CD tools and procedures such as Jenkins, Git, Chef, and Ansible
- Experience with cloud infrastructure/platforms (AWS, Azure, Google Cloud Platform)
- Familiarity with Jira, Confluence, and Agile methodologies
- Experience with real-time data streaming using Storm, Kinesis, or Spark
- Experience building APIs using Java Spring Boot or other frameworks and languages
- Familiarity with data visualization tools such as Tableau, Looker, QlikView, or MicroStrategy
- Team player
- Detail-oriented and innovative, with the ability to execute
- Self-driven
- Excellent oral and written communication skills with both technical and non-technical audiences
Okta is an Equal Opportunity Employer