Data Engineer
This role is eligible for our hybrid work model: Two days in-office.
Our Technology team is the backbone of our company: constantly creating, testing, learning and iterating to better meet the needs of our customers. If you thrive in a fast-paced, ideas-led environment, you’re in the right place.
Why this job’s a big deal
As part of the Flights Engineering team, you’ll design and build scalable data pipelines that curate and consolidate high-impact data generated across our platforms in real time. Your work will power critical business decisions, elevate platform productivity and observability, and enable customers to experience the moments that matter—through travel, experiences, and loyalty—helping Priceline become the world’s ultimate dealmaker.
In this role, you will get to:
1. Build and Optimize Data Pipelines
- Build and optimize data pipelines for ingestion, transformation, and validation
- Assist with CI/CD practices to automate pipeline deployment and monitoring
- Develop and maintain data validation and quality assurance tests to ensure reliable outputs
2. Data Architecture & Modeling
- Work in tandem with Data Architects to align on data architecture requirements
- Build and maintain data models that support analytics, reporting, and product features
- Define solution parameters for data preparation, integration, and storage
3. Collaboration & Cross-Functional Alignment
- Partner with Product Managers, Analysts, and Engineering teams to understand data requirements
- Facilitate communication between data, product, and development teams to ensure alignment
- Provide guidance on best practices for data management and pipeline design
4. Continuous Improvement
- Identify opportunities to improve pipeline performance, reliability, and maintainability
- Evaluate and implement new data tools, technologies, and frameworks
- Share knowledge and mentor team members on data engineering practices
5. Reporting & Quality Metrics
- Monitor and report on data pipeline performance, failures, and data quality issues
- Ensure transparency of data availability, integrity, and lineage for stakeholders
- Support release readiness by validating that data pipelines deliver accurate and timely data
Who you are
- Bachelor’s degree in Computer Science or a related field
- 3–5 years of hands-on experience building and maintaining data pipelines and architectures in a corporate or product-driven environment
- Proficient with SQL and modern data tools (e.g., Spark, Airflow, Snowflake, dbt)
- Demonstrated expertise in Big Data solutions, including frameworks such as Apache Spark and Kafka
- Strong command of databases, including Cloud SQL (MySQL, PostgreSQL), BigQuery, and Oracle
- Familiar with data governance, security protocols, and compliance best practices
- Proficiency in data orchestration frameworks, particularly Apache Airflow, for scheduling and managing complex workflows
- Proven ability to analyze, debug, and resolve data processing issues efficiently
- Proficient in Java or Python, with the ability to write efficient, maintainable code in either language
- Demonstrated history of living the values central to Priceline: Customer, Innovation, Team, Accountability and Trust
- The Right Results, the Right Way is not just a motto at Priceline; it’s a way of life. Unquestionable integrity and ethics are essential
#LI-hybrid