Python - Zazmic

Dear Candidate,

Before submitting your resume, please pay attention to the location requirement – we will not be able to review your resume or provide feedback if you are not actually located in the vacancy's location.

Thank you for your understanding.

This position requires working hours aligned with the UK/Lisbon time zone.

 

About the project – №1 Travel platform in the world!

We believe that we are better together, and we welcome you for who you are. Our workplace is for everyone, as is our people-powered platform. We want you to bring your unique perspective and experiences, so we can collectively revolutionize travel and together find the good out there.

Product is the world’s largest travel site, operating at scale with over 500 million reviews, opinions, photos, and videos, and reaching over 390 million unique visitors each month. We are a data-driven company that leverages data to inform our decisions. Product is extremely excited to play a pivotal role in supporting our travelers.

Our data engineering team is focused on delivering Product’s first-in-class data products to all data users across the organization. As a member of the Data Platform Enterprise Services Team, you will collaborate with engineering and business stakeholders to build, optimize, maintain, and secure the full data vertical – including tracking instrumentation, information architecture, ETL pipelines, and tooling – providing key analytics insights for business-critical decisions at the highest levels of Product, Finance, Sales, CRM, Marketing, Data Science, and more. All of this happens in a dynamic environment with a continuously modernizing tech stack, including highly scalable architecture, cloud-based infrastructure, and real-time responsiveness.

Product provides a unique, global work environment that captures the speed, innovation, and excitement of a startup within a thriving, growing, and well-established industry brand.

We take pride in our data engineering and are looking for a talented, highly motivated engineer with a passion for solving interesting problems to join our high-performing team.

What you will do:

  • Provide the organization’s data consumers with high-quality data sets through curation, consolidation, and manipulation of a wide variety of large-scale (terabyte and growing) sources
  • Build first-class data products and ETL processes that interact with terabytes of data on leading platforms such as Snowflake and BigQuery
  • Partner with our Analytics, Product, CRM, and Marketing teams
  • Be responsible for the data pipelines’ SLAs and dependency management
  • Write technical documentation for data solutions and present at design reviews
  • Resolve data pipeline failures and implement anomaly detection
  • Work with teams across Data Science, Product, Marketing, and software engineering on data solutions and technical challenges
  • Mentor junior members of the team

Who You Are:

  • Bachelor’s degree in Computer Science or a related field
  • 6+ years of experience in commercial data engineering or software development
  • Experience with Big Data technologies such as Snowflake, Databricks, and PySpark
  • Expert-level skills in writing and optimizing complex SQL; advanced data exploration skills with a proven record of querying and analyzing large datasets
  • Solid experience developing complex ETL processes from concept through implementation, deployment, and operations, including SLA definition, performance measurement, and monitoring
  • Hands-on knowledge of the modern AWS data ecosystem, including AWS S3
  • Experience with relational databases such as Postgres, and with programming languages such as Python and/or Java
  • Knowledge of cloud data warehouse concepts
  • Experience building and operating data pipelines and products in line with the data mesh philosophy is beneficial, as is demonstrated proficiency in data lineage, data quality, data observability, and data discoverability
  • Excellent verbal and written communication skills: the ability to convey key insights from complex analyses in summarized business terms to non-technical stakeholders, and to communicate effectively with other technical teams
  • Strong interpersonal skills and the ability to work in a fast-paced, dynamic environment
  • Ability to make progress on projects independently and enthusiasm for solving difficult problems

Why join us:

  • Ability to work remotely from anywhere in the world
  • Close cooperation with the development team and client
  • Opportunity to influence product development
  • Professional growth: the certification preparation course is free for our specialists, and the company pays for two exam attempts regardless of the result
  • We cover English classes (with a native speaker)
  • Boost your professional brand: participate in local conferences as a listener or a speaker
  • Regular team-building events: have fun with your teammates
  • Gifts for significant life events (marriage, childbirth)
  • Tech and non-tech Zazmic Communities: support and share experience with each other

Apply




    PDF, DOC, or DOCX allowed; max 20 MB.



    About the project – №1 Travel platform in the world! 


    Our team is building a next-generation Machine Learning Platform for all data scientists and machine learning engineers across Company’s brands. Our mission is to empower data scientists to work independently and scale their productivity, enabling broader and deeper use of machine learning techniques to improve business performance. Company hosts over 400 million monthly active visitors and operates across multiple cloud environments. Our data is at the petabyte scale, requiring a scalable, efficient, and reliable machine learning platform to support it.

    We are seeking a talented, experienced Senior Software Engineer to pave the way toward revolutionizing how we leverage data and Machine Learning!

    We leverage Kubernetes and a variety of open-source software (e.g. Kubeflow, Seldon, Istio, MLflow) to train and deploy over one hundred models serving a billion requests per day.

    What you will do:

    • Develop across our evolving technology stack – we use Python, Java, Kubernetes, Apache Spark, Postgres, Redis, ArgoCD, Argo Workflows, Seldon, MLflow, and more. We are migrating to the AWS cloud and adopting many of the services available in that environment
    • Learn many cutting-edge technologies in the Machine Learning Platform space. You will push boundaries to test, develop, and implement new ideas, technologies, and opportunities, and be well rewarded and recognized for doing so
    • Take responsibility for all aspects of software engineering, from design to implementation, QA, maintenance, and support
    • Touch code at every level – from the UI, backend microservices, database, big data processing, and operations to CI/CD automation
    • Collaborate closely with data science teams to define requirements, discuss solutions, and develop high-quality deliverables for our customers

    Who You Are:

    • Computer Science degree or equivalent experience
    • At least 5 years of commercial software development experience
    • Willingness and ability to take on new projects and technologies
    • Ability to break down complex problems into simple solutions
    • Strong analytical skills and a desire to write clean, correct, and efficient code
    • Sense of ownership, urgency, and pride in your work
    • Experience with Python
    • Experience with Java, Docker, Kubernetes, Argo, Spark, and AWS cloud services is a plus
    • Exposure to Machine Learning practices is a plus

    Why join us:

    • Ability to work remotely from anywhere in the world
    • Close cooperation with the development team and client
    • Opportunity to influence product development
    • Professional growth: the certification preparation course is free for our specialists, and the company pays for two exam attempts regardless of the result
    • We cover English classes (with a native speaker)
    • Boost your professional brand: participate in local conferences as a listener or a speaker
    • Regular team-building events: have fun with your teammates
    • Gifts for significant life events (marriage, childbirth)
    • Tech and non-tech Zazmic Communities: support and share experience with each other

    Apply





      https://www.eazyops.com/

      Eazyops AI is the leading all-in-one platform for Kubernetes automation, optimization, security, and cost management.

      Role and Responsibilities:

      • Developing and maintaining backend services for the EazyOps platform
      • Implementing server-side logic and integrating front-end elements
      • Writing reusable, testable, and efficient code
      • Designing and implementing low-latency, high-availability, performant applications
      • Integrating user-facing elements developed by front-end developers with server-side logic
      • Implementing security and data protection measures
      • Performance tuning, load balancing, usability improvements, and automation

      Required Skills and Qualifications:

      • Bachelor's or Master's degree in Computer Science, Engineering, or a related field
      • Proven work experience as a Python Developer (middle or senior level)
      • Familiarity with ORM (Object-Relational Mapping) libraries
      • 4+ years of experience with Python web frameworks such as Django and Flask
      • Understanding of the fundamental design principles behind scalable applications
      • Knowledge of user authentication and authorization across multiple systems, servers, and environments
      • Familiarity with event-driven programming in Python
      • Ability to create database schemas that represent and support business processes
      • Strong unit testing and debugging skills
      • Proficient understanding of code versioning tools such as Git

      Preferred (but not required) Skills:

      • Knowledge of Kubernetes, Docker, and other cloud technologies
      • Experience with automation and configuration management tools
      • Knowledge of network security and data protection
      • Familiarity with continuous integration/continuous deployment (CI/CD) and Agile methodologies
      • Experience with other programming languages such as JavaScript or Ruby
      • Experience in using cloud services like AWS, Google Cloud, or Azure

      Why join us:

      • Close cooperation with the development team and client
      • Opportunity to influence product development
      • We cover English classes (with a native speaker)
      • Boost your professional brand: participate in local conferences as a listener or a speaker
      • Regular team-building events: have fun with your teammates
      • Gifts for significant life events (marriage, childbirth)
      • Tech and non-tech Zazmic Communities: support and share experience with each other

      Apply



