Senior Data Engineer
This role is eligible for our hybrid work model: two days in-office.
Why this job’s a big deal:
Enjoy working with big data? We do too! In fact, data lies at the very core of our business. As a data engineer, you will work on data systems that serve and generate billions of events each day. As part of the Data Services team working with a data warehouse, you will leverage data to understand, meet, and anticipate the needs of finance business users.
In this role you will get to:
- Participate in cloud migration and the modernization of current data systems.
- Oversee data systems that serve millions of real-time events a day.
- Design and maintain early-warning and analytics systems using cutting-edge big data technologies.
- Collaborate on building software that collects and queries data, and compose queries for investigations.
- Analyze data and provide data-supported recommendations to improve product performance and customer acquisition.
- Diagnose and troubleshoot issues within the data infrastructure.
- Collaborate effectively in team efforts, mentor junior team members, advocate for what you believe are the best solutions, and discuss options respectfully, compromising when necessary.
Who you are:
- Bachelor’s degree in Computer Science or a related field.
- Minimum of 5 years of experience in software engineering and development, with a strong focus on Big Data technologies and data-driven solutions.
- Demonstrated expertise in Big Data solutions, including frameworks such as Apache Spark and Kafka; proficiency in the Google Cloud Platform (GCP) stack is highly desirable.
- Advanced knowledge of data modeling and schema design for optimized data storage and retrieval.
- Strong command of databases, including Cloud SQL (MySQL, PostgreSQL), BigQuery, Bigtable, and Oracle.
- Extensive experience with GCP tools, specifically Dataflow, Dataproc, and Datastream, for building and managing large-scale data processing pipelines.
- Proficiency in data orchestration frameworks, particularly Apache Airflow, for scheduling and managing complex workflows.
- Proven ability to analyze, debug, and resolve data processing issues efficiently.
- Proficiency in Python and/or Java and in Unix scripting, with the ability to write efficient, maintainable code in either language.
- Experience with data visualization tools, such as Tableau and Looker, for representing data insights.
- Familiarity with advanced data processing strategies, including indexing, machine-learned ranking algorithms, clustering techniques, and distributed computing concepts.
- Demonstrated history of living the values central to Priceline: Customer, Innovation, Team, Accountability, and Trust.
- The Right Results, the Right Way is not just a motto at Priceline; it’s a way of life. Unquestionable integrity and ethics are essential.
#LI-hybrid