Data Developer

R4968

Location: Toronto

Career Track: Technology

This role is eligible for our hybrid work model: Two days in-office.

Our Technology team is the backbone of our company: constantly creating, testing, learning and iterating to better meet the needs of our customers. If you thrive in a fast-paced, ideas-led environment, you’re in the right place.

Why this job’s a big deal:

Enjoy working with big data? We do too! In fact, data lies at the very core of our business! As a data engineer, you will work on data systems that serve and generate billions of events each day. With the most advanced big data technologies at your fingertips, you will leverage data to understand, meet, and anticipate the needs of our customers.

In this role you will get to:

  • Participate in cloud migration and the modernization of current data systems.

  • Oversee data systems that serve hundreds of millions of real-time events a day.

  • Collaborate on building software that collects and queries data, while also composing queries for investigation purposes.

  • Diagnose and solve any issues within data infrastructure.

  • Develop and maintain scalable data pipelines on a cloud platform (e.g., GCP, AWS, Azure), using data orchestration tools such as Apache Airflow or cloud-native equivalents.

  • Develop ETL and ELT processes to ingest, transform, and cleanse data from various sources into cloud-based data warehouses.

  • Create and maintain data models in cloud data warehouses (BigQuery, Redshift, Synapse) optimized for analytics, reporting, and financial analysis.

  • Implement data validation and testing frameworks to ensure the accuracy, quality, and compliance of financial data.

  • Implement monitoring and alerting for data pipelines.

  • Develop and maintain machine learning pipelines for financial data analysis, including data preparation, feature engineering, model training, and deployment.

Who you are:

  • Bachelor’s degree in Computer Science.

  • 5+ years of experience in software engineering and development.

  • Proficient with search engines such as Apache Lucene, Apache Solr, and Elasticsearch.

  • Expert knowledge of big data solutions such as Cassandra, Hadoop, Spark, Kafka, and RabbitMQ.

  • Experienced with strategies such as indexing, machine-learned ranking, clustering, and distributed computing concepts.

  • Expertise in Google Cloud Platform (GCP) services, especially BigQuery, Cloud Dataflow, Cloud Composer, Cloud Storage, Pub/Sub, and Cloud Functions.

  • Strong command of ETL/ELT processes, data ingestion, and data transformation, particularly with tools like Apache Beam, Dataflow, and Airflow.

  • Proficiency in Python, SQL, and ideally one or more of Java/Scala for data processing.

  • Experience with continuous integration and deployment for data pipelines using tools such as GitHub Actions and Jenkins.

  • Demonstrated history of living the values important to Priceline: Customer, Innovation, Team, Accountability and Trust.

  • The Right Results, the Right Way is not just a motto at Priceline; it’s a way of life. Unquestionable integrity and ethics are essential.

#LI-JB1 #LI-Hybrid