Senior Data Engineer

Our project, the world’s largest travel guidance platform, helps hundreds of millions of people each month become better travelers, from planning to booking to taking a trip. Travelers across the globe use our site and app to discover where to stay, what to do and where to eat based on guidance from those who have been there before. With more than 1 billion reviews and opinions of nearly 8 million businesses, travelers turn to our project to find deals on accommodations, book experiences, reserve tables at delicious restaurants and discover great places nearby. As a travel guidance company available in 43 markets and 22 languages, our project makes planning easy no matter the trip type.

 

Let's talk about You:

  • Bachelor’s degree in Computer Science or related field
  • 5+ years of experience in commercial software development
  • Ability to break down complex problems into simple solutions
  • Excellent verbal and written communication skills; proven interpersonal skills and the ability to convey key insights from complex analyses in summarized business terms; ability to communicate effectively with technical teams
  • Ability to work with shifting deadlines in a fast-paced environment
  • Experience with Big Data technologies such as Hive/Spark
  • Experience in data mining, profiling, and analysis (a minimal Spark profiling sketch follows the qualifications below)
  • Expert level skills writing and optimizing complex SQL
  • Experience with complex data modeling and ETL/ELT design, including defining SLAs, performance measurements, tuning, and monitoring
  • Proficiency with the Linux command line and systems administration
  • Experience in configuring and operating cloud infrastructure using infrastructure-as-code technologies
  • Knowledge of cloud data warehouse concepts
  • Experience building/operating highly available, distributed systems of data extraction, ingestion, and processing of large data sets
  • Experience writing large-scale application software in Java or similar language
  • Experience with relational and NoSQL databases
  • Demonstrated proficiency in managing data: data lineage, data quality, data observability, and data discoverability

Preferred Qualifications:

  • At least 3 years’ experience building out data pipelines and products on top of AWS and/or Snowflake; AWS certifications desired
  • Authoritative in ETL optimization, designing, coding, and tuning big data processes in Spark, Snowflake, or similar technologies
  • Experience building stream-based data pipelines and applications with low latencies
  • Experience in migrating and porting on-premises big data workloads into cloud Big Data technologies
  • While not mandatory, experience in building and operating data pipelines and products in compliance with the data mesh philosophy would be beneficial
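
To make the profiling expectation above concrete, here is a minimal PySpark sketch that computes a row count and per-column null rates in a single scan; the `analytics.page_views` table name is a hypothetical placeholder, not part of this role's actual stack:

    # Minimal profiling pass: row count plus per-column null rates.
    # The table name below is illustrative only.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = (
        SparkSession.builder
        .appName("profile-page-views")
        .enableHiveSupport()  # lets spark.table() resolve Hive tables
        .getOrCreate()
    )

    df = spark.table("analytics.page_views")  # hypothetical table

    total = df.count()
    # One aggregation pass: fraction of nulls per column
    null_rates = df.select([
        (F.sum(F.col(c).isNull().cast("int")) / total).alias(c)
        for c in df.columns
    ])
    print(f"rows={total}")
    null_rates.show(truncate=False)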

You are passionate about working on:

  • Ongoing architecture improvements in a growing startup
  • Developing new features
  • Designing and implementing architecture approaches for new infrastructure components
  • Maintaining high product quality
  • Agile values and principles

Your area of responsibility and growth:

  1. Provide expert assembly of AWS and Snowflake technologies to solve data lake and warehouse ingestion pattern problems that are compatible with Tripadvisor’s data tool stack:
     a. Ingesting on-premises psql into S3/Snowflake with change-data-capture
     b. Ingesting RDS into S3/Snowflake with change-data-capture
     c. Ingesting event data into S3/Snowflake from cloud-native microservices
     d. Ingesting Salesforce data into S3/Snowflake (stream and batch)
     e. Streaming event data from on-premises Kafka into S3/Snowflake (a minimal sketch follows this list)
  2. Build out cost prevention and monitoring measures within our cloud data platform topology, which includes S3, Athena, EMR, EKS, SageMaker, and Snowflake.
  3. In collaboration with in-house data engineering leads, migrate on-premises Hadoop workloads to our new cloud data platform, which is built on both AWS and Snowflake. The migration will avoid lift-and-shift as much as possible and instead will undergo a domain decomposition exercise:
     a. Build domain-driven data pipelines using our in-house data tool stack for orchestration, dependency management, and lineage
     b. Port on-premises Hive SQL data workloads to supported SQL dialects and technologies: Spark SQL, Trino, Snowflake
     c. Migrate on-premises Spark workloads to AWS
     d. Optimize the cost profile for cloud-based workloads where possible
  4. Help migrate in-house tooling applications from an on-premises data center to AWS. We have a mix of Kubernetes applications and web-based Java applications running on VMs.
  5. Provide expert-level cost optimization guidance and oversight for big data workloads in our cloud data platform, including Snowflake, Athena, S3, EMR/Spark, SageMaker, and EKS.
  6. Provide cloud expertise on cost-optimized data archiving strategies and help build out or onboard tooling capable of enforcing defined retention policies across our cloud data platform data stack.
  7. Help provide operational support for our data infrastructure, data toolkits, and data pipelines.
  8. Provide knowledge transfer and proper hand-over for all deliverables in the form of thorough documentation and curated training sessions where applicable:
     a. Documentation should include items like design docs, test plans, and operational runbooks.
     b. Training sessions should include guided design discussions and operational walk-throughs.
  9. Build out a pod capable of executing against all items captured in the statement of work. Provide managerial assistance for directing staff and providing execution oversight.
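
As referenced in item 1e, here is a minimal sketch, not the project's actual pipeline, of one such ingestion pattern: streaming Kafka events into S3 as Parquet with Spark Structured Streaming. Broker, topic, and bucket names are placeholders, and a Snowflake landing step (for example, Snowpipe over the S3 prefix) would follow separately:

    # Stream events from Kafka into S3 as Parquet. Requires the
    # spark-sql-kafka connector on the classpath; every name below
    # is a placeholder, not a project value.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("kafka-to-s3").getOrCreate()

    events = (
        spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "onprem-kafka:9092")  # placeholder broker
        .option("subscribe", "site-events")                      # placeholder topic
        .option("startingOffsets", "latest")
        .load()
        .selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)", "timestamp")
    )

    query = (
        events.writeStream
        .format("parquet")
        .option("path", "s3a://data-lake/raw/site-events/")
        .option("checkpointLocation", "s3a://data-lake/_checkpoints/site-events/")
        .trigger(processingTime="1 minute")  # micro-batch cadence
        .start()
    )
    query.awaitTermination()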

Technical Landscape:

  • Big Data: Cloudera Hadoop (5.14.2), Trino (Presto), Hive, Spark
  • Databases: Postgres
  • Languages: Java, Kotlin, Python
  • Infrastructure Tools: Terraform/CDK, CloudFormation (see the CDK sketch after this list)
  • AWS: S3, Athena, SageMaker, EKS, Glue, EMR, RDS, EC2, MSK/Kinesis
  • Snowflake
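
Since the role pairs the infrastructure-as-code tools above with the retention work in responsibility 6, here is an illustrative AWS CDK (Python) sketch of an S3 bucket whose lifecycle rules enforce an archiving and expiration policy; the stack name, bucket id, and durations are assumptions for the example only:

    # Hypothetical CDK stack: an S3 bucket that transitions cold data
    # to Glacier after 90 days and expires it after a year. Names and
    # durations are example assumptions, not project policy.
    from aws_cdk import App, Stack, Duration, aws_s3 as s3
    from constructs import Construct

    class DataLakeStack(Stack):
        def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
            super().__init__(scope, construct_id, **kwargs)
            s3.Bucket(
                self, "RawEvents",
                lifecycle_rules=[
                    s3.LifecycleRule(
                        transitions=[
                            # Move cold objects to cheaper storage after 90 days
                            s3.Transition(
                                storage_class=s3.StorageClass.GLACIER,
                                transition_after=Duration.days(90),
                            )
                        ],
                        expiration=Duration.days(365),  # assumed retention window
                    )
                ],
            )

    app = App()
    DataLakeStack(app, "data-lake")
    app.synth()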

Why join us:

  • Ability to work remotely from anywhere in the world
  • Close cooperation with the development team and client
  • Opportunity to influence product development
  • Professional growth: the certification preparation course is free for our specialists, and the company pays for two exam attempts regardless of the result
  • We cover English classes (with a native speaker)
  • Boost your professional brand: participate in local conferences as a listener or a speaker
  • Regular team-building events: have fun with teammates
  • Gifts for significant life events (marriage, childbirth)
  • Tech and non-tech Zazmic Communities: support and share experience with each other

Job ID:

  • 2250133

Contact: hr@zazmic.com

Interested? You know what to do. Apply for this position.