Senior Data Engineer
This role is eligible for our hybrid work model: two days in-office.
Our Technology team is the backbone of our company: constantly creating, testing, learning, and iterating to better meet the needs of our customers. If you thrive in a fast-paced, ideas-led environment, you’re in the right place.
Why this job’s a big deal:
Enjoy working with big data? We do too! In fact, data lies at the very core of our business. As a data engineer, you will work on data systems that serve and generate billions of events each day. As part of the Fintech team, working with financial data warehouse and ERP systems, you will leverage data to understand, meet, and anticipate the needs of finance business users.
In this role you will get to:
- Participate in cloud migration and the modernization of current data systems.
- Oversee data systems that serve millions of real-time events a day.
- Design and maintain early-warning and analytics systems using cutting-edge big data technologies.
- Collaborate on building software that collects and queries data, and compose queries for investigative purposes.
- Analyze data and provide data-supported recommendations to improve product performance and customer acquisition.
- Diagnose and troubleshoot any issues within the data infrastructure.
- Collaborate effectively in team efforts, mentor junior members of the team, advocate for what you believe are the best solutions, and compromise respectfully when necessary.
Who you are:
- Bachelor’s degree in Computer Science or a related field.
- Minimum of 6 years of experience in software engineering and development, with a strong focus on Big Data technologies and data-driven solutions.
- Demonstrated expertise in Big Data solutions, including frameworks such as Apache Spark and Kafka. Proficiency in the Google Cloud Platform (GCP) stack is highly desirable.
- Advanced knowledge of data modeling and schema design for optimized data storage and retrieval.
- Strong command of databases, including Cloud SQL (MySQL, PostgreSQL), BigQuery, and Oracle.
- Extensive experience with GCP tools, specifically Dataflow and Dataproc, for building and managing large-scale data processing pipelines.
- Proficiency in data orchestration frameworks, particularly Apache Airflow, for scheduling and managing complex workflows.
- Proven ability to analyze, debug, and resolve data processing issues efficiently.
- Proficiency in Java or Python, with the ability to write efficient, maintainable code in either language.
- Experience with data visualization tools, such as Tableau, for representing data insights.
- Familiarity with advanced data processing strategies, including indexing, machine-learned ranking algorithms, clustering techniques, and distributed computing concepts.
- A demonstrated history of living the values central to Priceline: Customer, Innovation, Team, Accountability, and Trust.
- The Right Results, the Right Way is not just a motto at Priceline; it’s a way of life. Unquestionable integrity and ethics are essential.
#LI-hybrid