This role is eligible for our five-day flex office work model.
Our Technology team is the backbone of our company: constantly creating, testing, learning and iterating to better meet the needs of our customers. If you thrive in a fast-paced, ideas-led environment, you’re in the right place.
Are you passionate about complex, highly scalable software systems? Do you enjoy coming up with appropriate solutions to the most challenging business problems? Do you enjoy collaborating with the business to improve and grow the company’s products? Are you looking for a company that is not too big, not too small, and has a casual work environment? If so, then we have the perfect job for you!
At Priceline we are all about teamwork, accountability, innovation, and a customer-first approach. We work hard, but do so in a collaborative, fun and flexible work environment. As a member of the team, you will have the opportunity to work on mission-critical projects with direct impact on the evolution of Priceline's business. You will be able to apply your programming skills towards building low latency and high throughput transactional services.
You will work on systems that serve hundreds of millions of events a day, generate billions of events, and are monitored 24×7 by early-warning and analytics systems built on big data technologies including Kafka, RabbitMQ, Cassandra, SolrCloud, Splunk, and cloud-native technologies in GCP. We constantly explore new technologies and engineer better solutions to meet the dynamic needs of the business.
We are looking for a seasoned engineer who combines a craftsman’s coding expertise with a leader’s inspiring approach. We want someone with a demonstrated ability to collaborate closely and efficiently with customers and engineers around the globe. If this describes you, then join our team today!
Java: Strong object-oriented design and development skills and advanced knowledge of core Java or other JVM programming languages. Knowledge of and experience with third-party libraries, frameworks, and technologies is a plus.
Kafka: Administer enterprise-wide Kafka Messaging Infrastructure, design and implement event sourcing strategies, and provide daily management of the Kafka ecosystem.
Database: Strong SQL composition skills. Knowledge of big data and NoSQL databases is a plus! We not only write software that collects and queries data; we also compose queries for investigation and analysis. We collect large volumes of data in real time from our applications, and the ability to compose ad hoc queries is vital to developing and supporting our products.
Analysis & Problem Solving: You will need to understand our codebase, our systems, and the business requirements they implement so you can effectively make changes to our applications and investigate issues.
Communication: Whether face-to-face or via phone, email, chat, whiteboarding, or other collaboration platforms, you are an effective communicator who can advise, explain, enable, teach, persuade, and coordinate.
Team Collaboration: Effectively collaborate and share ownership of your team's codebase and applications. You must be willing to fully engage in team efforts, speak up for what you think are the best solutions, and be able to converse respectfully and compromise when necessary.
Knowledge and Experience: A well-rounded software engineer will have broad and/or deep knowledge of various topics, tools, frameworks, and methodologies related to software engineering.
A 4-year degree in Computer Science or a related field; a graduate degree is helpful.
3+ years of solid Kafka administration experience managing critical 24/7 applications.
Kafka Developer and/or Admin certification is a huge plus.
Experience with the Spring and Spring Boot frameworks.
Design, build, assemble, and configure application or technical architecture components from business requirements.
Experience with high-availability cluster setup, maintenance, and ongoing 24/7 support.
Hands-on experience standing up and administering a Kafka platform from scratch, including backup and mirroring of Kafka cluster brokers, broker sizing, topic sizing, hardware sizing, performance monitoring, broker security, topic security, and consumer/producer access management (ACLs) in the cloud.
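Topic and broker sizing of this kind usually starts from throughput arithmetic. The sketch below illustrates one common heuristic for estimating a partition count; all the throughput figures in it are hypothetical examples, not numbers from our systems.

```java
// Rough partition-count estimate for a topic: take the larger of the
// partitions needed to sustain produce throughput and the partitions
// needed to sustain consume throughput, then round up.
public class PartitionSizing {
    static int estimatePartitions(double targetMbPerSec,
                                  double producerMbPerSecPerPartition,
                                  double consumerMbPerSecPerPartition) {
        double byProducer = targetMbPerSec / producerMbPerSecPerPartition;
        double byConsumer = targetMbPerSec / consumerMbPerSecPerPartition;
        return (int) Math.ceil(Math.max(byProducer, byConsumer));
    }

    public static void main(String[] args) {
        // Hypothetical numbers: 100 MB/s target, 10 MB/s produce and
        // 20 MB/s consume per partition -> 10 partitions.
        System.out.println(estimatePartitions(100, 10, 20));
    }
}
```

In practice the per-partition throughput inputs come from benchmarking the actual cluster, and the result is typically padded for headroom and future growth.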
Knowledge of the Kafka API and experience writing Kafka producer and consumer applications using Java or related technologies.
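A producer application of this kind starts from client configuration. The sketch below builds a typical producer config using only the standard library; the broker address is a placeholder, and the commented-out lines show how the properties would feed `org.apache.kafka.clients.producer.KafkaProducer` once the kafka-clients dependency is on the classpath.

```java
import java.util.Properties;

// Minimal Kafka producer configuration sketch. All values are
// illustrative defaults, not a recommended production profile.
public class ProducerConfigSketch {
    static Properties producerProps(String bootstrapServers) {
        Properties props = new Properties();
        props.put("bootstrap.servers", bootstrapServers);
        props.put("key.serializer",
                  "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                  "org.apache.kafka.common.serialization.StringSerializer");
        props.put("acks", "all");                // wait for all in-sync replicas
        props.put("enable.idempotence", "true"); // avoid duplicates on retry
        return props;
    }

    public static void main(String[] args) {
        Properties props = producerProps("localhost:9092"); // placeholder broker
        // With kafka-clients on the classpath, this config would be used as:
        //   KafkaProducer<String, String> producer = new KafkaProducer<>(props);
        //   producer.send(new ProducerRecord<>("events", "key", "value"));
        System.out.println(props.getProperty("acks"));
    }
}
```

The consumer side is symmetric: the same `bootstrap.servers` plus deserializer classes, a `group.id`, and an offset policy such as `auto.offset.reset`.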
Knowledge of Kafka Schemas and use of Schema Registry
Establish standards for configuring Source and Sink connectors.
Knowledge of Kafka clustering and its fault-tolerance model supporting high availability.
Solid understanding of Kafka client configurations and troubleshooting.
Experience with open-source and Confluent Kafka, ZooKeeper, Kafka Connect, Schema Registry, KSQL, REST Proxy, and Confluent Control Center.
Experience with Kafka MirrorMaker or Confluent Replicator.
Experience with big data solutions such as Cassandra, Spark, Hadoop, etc.
Experience building Kafka pipelines using Terraform, Ansible, CloudFormation templates, shell scripts, etc.
Knowledge of best practices related to security, performance, and disaster recovery.
Ability to work through a wide range of loosely defined, complex situations that require creativity and originality, often where guidance and counsel are unavailable.
Demonstrated product mindset, with the ability to set forward-looking vision and direction.
Ability to synthesize large amounts of complex data into meaningful conclusions and present recommendations
Ability to maintain a positive demeanor while meeting high demands and short deadlines, which may require working after hours.
A demonstrated history of living the values essential to Priceline: Customer, Innovation, Team, Accountability and Trust.
The Right Results, the Right Way is not just a motto at Priceline; it’s a way of life. Unquestionable integrity and ethics are essential.