Remote - Zazmic

Dear Candidate,

Before submitting your resume, please check the location requirement – we will not be able to review your resume or provide feedback if you are not actually located in the location specified for the vacancy.

Thank you for understanding.

About the project – №1 Travel platform in the world! 

We believe that we are better together, and at our company we welcome you for who you are. Our workplace is for everyone, as is our people-powered platform.
At our company, we want you to bring your unique identities, abilities, and experiences, so we can collectively revolutionize travel and together find the good out there.

Our client, the world’s largest travel site, operates at scale, with over 500 million reviews, opinions, photos, and videos reaching over 390 million unique visitors each month. We are a data-driven company that leverages our data to empower our decisions. The project is extremely excited to play a pivotal role in supporting our travelers.

Our data engineering team is focused on delivering the product’s first-in-class data products that serve all data users across the organization. As a member of the Data Platform Enterprise Services Team, you will collaborate with engineering and business stakeholders to build, optimize, maintain, and secure the full data vertical, including tracking instrumentation, information architecture, ETL pipelines, and tooling that provide key analytics insights for business-critical decisions at the highest levels of Product, Finance, Sales, CRM, Marketing, Data Science, and more. All of this happens in a dynamic environment with a continuously modernizing tech stack, including highly scalable architecture, cloud-based infrastructure, and real-time responsiveness.
The product provides a unique, global work environment that captures the speed, innovation, and excitement of a startup at a thriving, growing, and well-established industry brand.

We take pride in our data engineering and are looking for a talented and highly-motivated engineer with a passion for solving interesting problems to add to our high-performing team.

What you’ll do:

  • Providing the organization’s data consumers with high-quality data sets through curation, consolidation, and manipulation of a wide variety of large-scale (terabyte and growing) sources
  • Building data pipelines and ETL processes that interact with terabytes of data on leading platforms such as Snowflake and BigQuery
  • Developing and improving our enterprise data by creating efficient and scalable data models to be used across the organization
  • Partnering with our analytics, data science, CRM, and machine learning teams
  • Taking responsibility for enterprise data integrity, validation, and documentation
  • Resolving data pipeline failures and implementing sound anomaly detection

What we are looking for:

  • 4+ years of data engineering or general software development experience
  • Experience working with large datasets (terabyte scale and growing) and familiarity with the technologies and tooling associated with databases and big data
  • Demonstrated proficiency in data design and data modeling
  • Relational databases (PostgreSQL/MySQL)
  • Big data platforms (e.g. Hadoop, Hive, BigQuery, Snowflake)
  • Experience in developing complex ETL processes from concept to implementation, including defining SLAs, performance measurements, and monitoring
  • Proficiency in query languages and data exploration, with a proven record of writing complex SQL queries across large datasets
  • Systems performance and tuning experience, with an eye for how systems architecture and design impact performance and scalability
  • Strong software engineering principles
  • Experience in functional programming in Python or an equivalent language
  • A self-driven, organized, and detail-oriented person with a strong sense of ownership
  • Strong communication skills for working effectively with both business and technical teams
  • Ability to work in a fast-paced and dynamic environment
  • Ability to break down complex problems into simple solutions.
  • Strong interpersonal skills, intense curiosity, and an enthusiasm for solving difficult problems.

Nice to have:

  • Experience with data governance
  • Exposure to and/or interest in machine learning and data science specifically to help solve day-to-day problems and reach objectives in an innovative way
  • Experience working with task orchestration tools such as Airflow or similar systems
  • Experience in programming languages such as Java or equivalent

Why join us:

  • Ability to work remotely from anywhere in the world
  • Close cooperation with the development team and client
  • Opportunity to influence product development
  • Professional growth: the certification preparation course is free for our specialists. The company pays for two attempts to pass the exam, regardless of the exam result
  • We cover English classes (with a native speaker)
  • Boost your professional brand: you can participate in local conferences as a listener or a speaker
  • Regular team-building events: have fun with your teammates
  • Gifts for significant life events (marriage, childbirth)
  • Tech and non-tech Zazmic Communities: support and share experience with each other

Apply





    About the project – №1 Travel platform in the world! 

    We believe that we are better together, and at our company we welcome you for who you are. Our workplace is for everyone, as is our people-powered platform.
    At our company, we want you to bring your unique identities, abilities, and experiences, so we can collectively revolutionize travel and together find the good out there.

    Our client, the world’s largest travel site, operates at scale, with over 500 million reviews, opinions, photos, and videos reaching over 390 million unique visitors each month. We are a data-driven company that leverages our data to empower our decisions. The project is extremely excited to play a pivotal role in supporting our travelers.

    Our data engineering team is focused on delivering the product’s first-in-class data products that serve all data users across the organization. As a member of the Data Platform Enterprise Services Team, you will collaborate with engineering and business stakeholders to build, optimize, maintain, and secure the full data vertical, including tracking instrumentation, information architecture, ETL pipelines, and tooling that provide key analytics insights for business-critical decisions at the highest levels of Product, Finance, Sales, CRM, Marketing, Data Science, and more. All of this happens in a dynamic environment with a continuously modernizing tech stack, including highly scalable architecture, cloud-based infrastructure, and real-time responsiveness.
    The product provides a unique, global work environment that captures the speed, innovation, and excitement of a startup at a thriving, growing, and well-established industry brand.

    We take pride in our data engineering and are looking for a talented and highly-motivated engineer with a passion for solving interesting problems to add to our high-performing team.

    What you’ll do:

    • Providing the organization’s data consumers with high-quality data sets through curation, consolidation, and manipulation of a wide variety of large-scale (terabyte and growing) sources
    • Building data pipelines and ETL processes that interact with terabytes of data on leading platforms such as Snowflake and BigQuery
    • Developing and improving our enterprise data by creating efficient and scalable data models to be used across the organization
    • Partnering with our analytics, data science, CRM, and machine learning teams
    • Taking responsibility for enterprise data integrity, validation, and documentation
    • Resolving data pipeline failures and implementing sound anomaly detection

    What we are looking for:

    • 4+ years of data engineering or general software development experience
    • Experience with Big Data technologies such as Snowflake, Databricks, BigQuery
    • Demonstrated proficiency in data design and data modeling
    • Experience in developing complex ETL processes from concept to implementation to deployment and operations, including SLA definition, performance measurements and monitoring.
    • Proficiency in writing and optimizing SQL queries; data exploration skills with proven record of querying and analyzing large datasets
    • Hands-on knowledge of the AWS ecosystem, including storage (S3) and compute (EKS, ECS, Fargate) services
    • Experience with relational databases such as Postgres, and with programming languages such as Python and/or Java
    • Knowledge of cloud data warehouse concepts

    Why join us:

    • Ability to work remotely from anywhere in the world
    • Close cooperation with the development team and client
    • Opportunity to influence product development
    • Professional growth: the certification preparation course is free for our specialists. The company pays for two attempts to pass the exam, regardless of the exam result
    • We cover English classes (with a native speaker)
    • Boost your professional brand: you can participate in local conferences as a listener or a speaker
    • Regular team-building events: have fun with your teammates
    • Gifts for significant life events (marriage, childbirth)
    • Tech and non-tech Zazmic Communities: support and share experience with each other

    Apply





      About the project – №1 Travel platform in the world! 

      We believe that we are better together, and at our company we welcome you for who you are. Our workplace is for everyone, as is our people-powered platform.
      At our company, we want you to bring your unique identities, abilities, and experiences, so we can collectively revolutionize travel and together find the good out there.

      About the position:

      Are you ready to revolutionize the way travelers discover and explore the world? Join the Machine Learning Team as a Machine Learning Scientist II and embark on a thrilling journey to build the next-generation Points of Interest (POI) system.

      The entire platform is built on top of our POI metadata for the millions of hotel, attraction, and restaurant listings. This role is pivotal in maintaining the accuracy, coverage, and quality of the place listings that form the bedrock of the client’s unrivaled user experience. As an ML Scientist II, you will partner with product and engineering to create innovative solutions that evaluate, match, and integrate POI data from different sources to build the most accurate, complete, fresh, and rich database of travel-related POIs in the world.

      What you’ll do:

      • Work with product and engineering to understand the business problems related to place metadata, including how it’s processed and stored, and define the requirements for machine learning models and algorithms to solve them.
      • Automate ETL pipelines to process large databases of POIs from different sources efficiently and reliably.
      • Prototype, deploy, and maintain new models and algorithms for POI matching/conflation, source evaluation, identifying closed businesses, scraping metadata, extracting attributes from UGC, and flagging inaccurate data
      • Communicate progress, experimental findings, and insights clearly and concisely to both technical and business stakeholders

      What we are looking for:

      • A master’s degree or PhD in computer science, data science, engineering, statistics, or a related field.
      • 2+ years of experience in developing and deploying end-to-end machine learning models and algorithms at scale.
      • Proficiency in Python, R or a similar numerical/statistical programming language
      • Ability to interpret and write complex SQL queries
      • Experience with handling data heterogeneity, incompleteness, duplication, and uncertainty issues in large datasets
      • Preferred: Experience working with large POI databases and/or geospatial data
      • Preferred: Experience with LLMs and traditional NLP techniques
      • Preferred: Experience with text embeddings generation and manipulation, as well as storage and retrieval technology such as vector databases

      Why join us:

      • Ability to work remotely from anywhere in the world
      • Close cooperation with the development team and client
      • Opportunity to influence product development
      • Professional growth: the certification preparation course is free for our specialists. The company pays for two attempts to pass the exam, regardless of the exam result
      • We cover English classes (with a native speaker)
      • Boost your professional brand: you can participate in local conferences as a listener or a speaker
      • Regular team-building events: have fun with your teammates
      • Gifts for significant life events (marriage, childbirth)
      • Tech and non-tech Zazmic Communities: support and share experience with each other

      Apply





        We are seeking a skilled and experienced Marketing Automation Specialist to join our team. In this role, you will be responsible for updating and creating landing pages, email templates, and form styling using various marketing automation platforms, including Marketo, HubSpot, Salesforce Account Engagement (Pardot), and Salesforce Marketing Cloud. Your expertise in these platforms will be essential in driving effective marketing campaigns and enhancing our overall digital presence.

        What you’ll do:

        • Update and create landing pages, email templates, and form styling utilizing Marketo, Hubspot, Salesforce Account Engagement (Pardot), and Salesforce Marketing Cloud platforms
        • Develop effective and engaging content assets for digital campaigns
        • Optimize content and layouts for maximum audience conversion and engagement
        • Manage and support the process of marketing communication automation through various CRM and marketing platforms

        What we are looking for:

        • 3+ years of experience with landing page and email templating using Marketo
        • Marketo certification required

        Nice to have:

        • Experience with Hubspot
        • Experience with Salesforce Account Engagement (Pardot)
        • Experience with Salesforce Marketing Cloud platforms

        Why join us:

        • Ability to work remotely from anywhere in the world
        • Close cooperation with the development team and client
        • Opportunity to influence product development
        • Professional growth: the certification preparation course is free for our specialists. The company pays for two attempts to pass the exam, regardless of the exam result
        • We cover English classes (with a native speaker)
        • Boost your professional brand: you can participate in local conferences as a listener or a speaker
        • Regular team-building events: have fun with your teammates
        • Gifts for significant life events (marriage, childbirth)
        • Tech and non-tech Zazmic Communities: support and share experience with each other

        Apply





          About the project – №1 Travel platform in the world! 

          We believe that we are better together, and at our company we welcome you for who you are. Our workplace is for everyone, as is our people-powered platform.
          At our company, we want you to bring your unique identities, abilities, and experiences, so we can collectively revolutionize travel and together find the good out there.

          What we do in Cloud Data Services:

          Our client is the world’s largest travel website, and we have a LOT of data: over 1 billion reviews, opinions, photos, and videos, reaching an audience of hundreds of millions worldwide each month. We are a data-driven company, and our data infrastructure forms the foundation.

          Our team is responsible for building the tools that unlock that data across the product. We build orchestration tools, scheduling tools, and ETL generation tools to enable cloud migrations and support petabyte-scale operations.

           

          What you’ll do:

          Across the product, we’ve worked hard to create a great environment for engineers – minimizing process, shipping products quickly, and doing everything to avoid big-company paralysis.

          In this role, within the data platform engineering group, you will help us design, build, and operate our data services infrastructure, which is at the core of our data-driven culture.


          • Take responsibility for the quality of the code produced and for all aspects of software engineering, from design to implementation, QA, and maintenance.
          • Operate across our evolving technology stack - Java, Kotlin, React, SQL, Spark, Snowflake, and more.
          • Implement new features in a powerful and widely used ETL orchestration tool.
          • Investigate and use cloud technologies to deploy our software.
          • Integrate with production data sources and our petabyte-scale data lake.
          • Collaborate closely with Product, Data Engineering, Machine Learning, Analytics as well as other functional teams to define feature specifications and develop high-quality deliverables for our customers.
          • Work with technical leadership to make strategic technology decisions
          • Work alongside other engineering groups located around the world (US, Lisbon, UK, India).

          What we are looking for:

          • 3+ years of experience as a professional engineer
          • BS or MS in Computer Science or equivalent
          • A strong history of development with Java or a JVM based language
          • Proven record of successful software development

          Nice to have:

          • Knowledge of the modern AWS Data Ecosystem, including AWS Glue, AWS Athena, AWS EKS, AWS S3, and AWS Lambda
          • Experience developing ETL processes and streaming data pipelines; including defining SLAs and performance monitoring
          • Experience with Kotlin.
          • Experience with Spring.
          • Experience developing for large-scale, full life-cycle, software applications
          • Strong interpersonal skills, intense curiosity, and enthusiasm for solving difficult problems.
          • Familiarity with big data modeling and tools (Spark, Snowflake, BigQuery, Presto, etc.)
          • Experience with frontend technologies (React, GraphQL, etc.)

          Why join us:

          • Ability to work remotely from anywhere in the world
          • Close cooperation with the development team and client
          • Opportunity to influence product development
          • Professional growth: the certification preparation course is free for our specialists. The company pays for two attempts to pass the exam, regardless of the exam result
          • We cover English classes (with a native speaker)
          • Boost your professional brand: you can participate in local conferences as a listener or a speaker
          • Regular team-building events: have fun with your teammates
          • Gifts for significant life events (marriage, childbirth)
          • Tech and non-tech Zazmic Communities: support and share experience with each other

          Apply





            About the project – №1 Travel platform in the world! 


            We believe that we are better together, and at our company we welcome you for who you are. Our workplace is for everyone, as is our people-powered platform.
            At our company, we want you to bring your unique identities, abilities, and experiences, so we can collectively revolutionize travel and together find the good out there.

            What we do in the Tracking & Experimentation Team: 

            Our client hosts 400 million monthly active visitors and assists them in exploring the world and up-leveling their travel. The client also has a culture of “test and learn” to encourage finding new ways to better serve travelers.

            We are responsible for providing software that enables the product to better understand visitor preferences so that we can tailor our site, app, and product offerings to help travelers have more delightful experiences. We are also responsible for the software system that empowers Product Managers, General Managers, and Engineering to experiment and test out new, innovative ideas across all product surfaces, including the website, mobile app, and customer communications in email. These experiments in turn drive millions of dollars of revenue annually.

            What you’ll do:

            Our client is looking for a Software Engineer II to take this exciting opportunity to join our fast-moving tracking and experimentation group. In this role, you will help us build, upgrade, and sustain successful tracking and experimentation infrastructure to serve the world’s largest and most trusted travel site, visited by over 500 million travelers each month, and the world’s leading travel brands, from large OTAs to independent boutique chains.

            Do you like building features end to end? Do you like working with a large number of technologies? Do you like moving quickly, releasing features daily, and working with other smart and talented engineers? If this sounds like you, we’d love to talk to you.


            • Take on projects with independence and a mandate to leave things better than you found them.
            • Participate in the planning and initial steps for key changes on the site.
            • Be pragmatic when solving problems with a deep understanding of the purpose and goal of your work.
            • Touch code at all levels, from client ingestion to data storage and data analysis – whatever is required to complete your project. Take responsibility for all aspects of software engineering, from design to implementation, QA, and maintenance.
            • Have a CI/CD mindset. Most of our engineers release code to production every few days, and we have a daily release cycle.
            • Be integral to the code quality on your team through leadership in design and code review. Take responsibility for the quality of the code produced by you and the team.
            • Operate across our evolving technology stack - we’re developing in Java, React, SQL, and more.
            • Collaborate closely with Product and design teams to define feature specifications and develop high quality deliverables for our stakeholders.
            • Work alongside other engineering groups located around the world.

            What we are looking for:

            • 5+ years of experience in commercial software development
            • A strong history of development with Java
            • Some exposure to the following technologies is a plus: HTML5, JavaScript, React, GraphQL, CSS, SQL, Postgres, Linux, Python, Gradle, Apache Tomcat, BERT, Hive, Spark
            • Familiarity with Linux
            • Familiarity with designing infrastructure on AWS or other cloud providers
            • Bachelor of Science in Computer Science, Engineering or equivalent
            • Solid foundation in data structures, algorithms, and OO design
            • Willingness and ability to take on new technologies
            • Ability to break down complex problems into simple solutions
            • Strong analytical skills and desire to write clean, correct, and efficient code
            • Sense of ownership, urgency, and pride in your work
            • Exposure to developing scalable code for high-volume systems is a plus

            Nice to haves:

            • A data mindset alongside software engineering expertise, including experience designing infrastructure for processing large data sets.
            • Experience working with and processing large quantities of data - Hive, Snowflake, NoSQL databases.

            Why join us:

            • Ability to work remotely from anywhere in the world
            • Close cooperation with the development team and client
            • Opportunity to influence product development
            • Professional growth: the certification preparation course is free for our specialists. The company pays for two attempts to pass the exam, regardless of the exam result
            • We cover English classes (with a native speaker)
            • Boost your professional brand: you can participate in local conferences as a listener or a speaker
            • Regular team-building events: have fun with your teammates
            • Gifts for significant life events (marriage, childbirth)
            • Tech and non-tech Zazmic Communities: support and share experience with each other

            Apply





              About the project – №1 Travel platform in the world! 


              We believe that we are better together, and at our company we welcome you for who you are. Our workplace is for everyone, as is our people-powered platform.
              At our company, we want you to bring your unique identities, abilities, and experiences, so we can collectively revolutionize travel and together find the good out there.

              What we do in Cloud Data Services:

              Our client is the world’s largest travel website, and we have a LOT of data: over 1 billion reviews, opinions, photos, and videos, reaching an audience of hundreds of millions worldwide each month. We are a data-driven company, and our data infrastructure forms the foundation.

              Our team is responsible for building the tools that unlock that data across the product. We build orchestration tools, scheduling tools, and ETL generation tools to enable cloud migrations and support petabyte-scale operations.

               

              What you’ll do:

              • Take responsibility for the quality of the code produced and for all aspects of software engineering, from design to implementation, QA, and maintenance
              • Operate across our evolving technology stack - Java, Kotlin, React, SQL, Spark, Snowflake, and more
              • Implement new features in a powerful and widely used ETL orchestration tool
              • Investigate and use cloud technologies to deploy our software
              • Integrate with production data sources and our petabyte-scale data lake
              • Collaborate closely with Product, Data Engineering, Machine Learning, Analytics as well as other functional teams to define feature specifications and develop high-quality deliverables for our customers
              • Work with technical leadership to make strategic technology decisions
              • Work alongside other engineering groups located around the world (US, Lisbon, UK, India)

              What we are looking for:

              • 5+ years of experience as a professional engineer
              • BS or MS in Computer Science or equivalent
              • A strong history of development with Java or a JVM-based language
              • Proven record of innovation via non-trivial solutions to day-to-day problems
              • Experience developing for large-scale, full life-cycle, software applications
              • Experience developing complete JVM-based (or equivalent) applications, preferably for large distributed systems.

              Nice to have:

              • Hands-on knowledge of the modern AWS Data Ecosystem, including AWS Glue, AWS Athena, AWS Kinesis, AWS S3, and AWS Lambda
              • Experience developing ETL processes and streaming data pipelines, including defining SLAs and performance monitoring
              • Experience with frontend technologies (React, GraphQL, etc.)
              • Strong interpersonal skills, intense curiosity, and enthusiasm for solving difficult problems
              • Familiarity with big data modeling and tools (Spark, Hive, Snowflake, BigQuery, Presto, etc.)

              Why join us:

              • Ability to work remotely from anywhere in the world
              • Close cooperation with the development team and client
              • Opportunity to influence product development
              • Professional growth: the certification preparation course is free for our specialists. The company pays for two attempts to pass the exam, regardless of the exam result
              • We cover English classes (with a native speaker)
              • Boost your professional brand: you can participate in local conferences as a listener or a speaker
              • Regular team-building events: have fun with your teammates
              • Gifts for significant life events (marriage, childbirth)
              • Tech and non-tech Zazmic Communities: support and share experience with each other

              Apply





                Our client is seeking an amazing back-end Senior Software Engineer (SSE) to be part of its award-winning Conversation AI team. Conversation AI leverages the power of AI to create the ultimate sales coaching and productivity improvement experience. As a senior back-end engineer, you will team up with Product, SRE, and other engineers to deliver its next-generation capabilities, taking on complex challenges and delivering amazing outcomes.

                Primary Responsibilities (other duties may be assigned):

                • Take ownership of features across the entire development lifecycle, working closely with your Product counterparts
                • Apply industry best practices and tooling to server-side development
                • Write clean, maintainable code that is well tested and observable
                • Refactor code as features evolve to increase clarity and improve velocity of future development
                • Build expert-level proficiency with relevant 3rd party integrations
                • Identify and advocate for process improvements both within the team and at the organization level
                • Contribute pull request reviews and respond to feedback
                • Author technical documentation as needed to share knowledge on complex areas
                • Actively manage work in Jira, proactively communicating updates so as to keep constituents informed
                • Be an active and amazing teammate!

                Qualifications:

                • 5 - 7+ years of commercial Java development experience in SaaS/PaaS environments
                • Significant experience with Spring 5/Spring Boot 2, Maven, ORMs
                • Strong, demonstrable experience in designing and developing high quality, secure REST APIs, preferably in API first environments
                • Experience working with Redis and PostgreSQL
                • Expertise in designing, developing, debugging, and operating software in resilient, distributed systems
                • Experience with front end development in React or similar Javascript framework(s) a plus
                • Experience with Elasticsearch a strong plus
                • An excellent communicator, with the ability to simplify key messages, present compelling stories and promote technical and personal credibility with internal and external stakeholders
                • Passion for teamwork and collaboration, adaptability, communication, problem-solving, customer focus, results, and innovation

                Apply





                  This position requires analyzing tasks, understanding their logic and connections to other systems, and transforming them into Airflow Directed Acyclic Graphs (DAGs) on GCP, while documenting the process; a minimal illustrative sketch of such a DAG appears below. Some familiarity with systems, networking, and infrastructure is preferred due to our hybrid setup of on-premises and cloud data systems.
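
                  For illustration only – a minimal sketch of what such an Airflow DAG might look like, assuming a hypothetical daily job that moves data from a legacy source into a cloud warehouse; the DAG name, task name, and logic are invented placeholders, not part of the client’s actual stack:

                      # Hypothetical example – names and logic are placeholders, not the real pipeline
                      from datetime import datetime
                      from airflow import DAG
                      from airflow.operators.python import PythonOperator

                      def extract_and_load():
                          # In a real DAG this step would pull from a legacy source (e.g. SQL Server/Hive)
                          # and load the result into a GCP target such as BigQuery.
                          print("extracting from legacy source and loading to cloud warehouse")

                      with DAG(
                          dag_id="legacy_task_migration_example",  # placeholder name
                          start_date=datetime(2024, 1, 1),
                          schedule_interval="@daily",
                          catchup=False,
                      ) as dag:
                          migrate = PythonOperator(
                              task_id="extract_and_load",
                              python_callable=extract_and_load,
                          )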

                  Responsibilities:

                  • Create & document efficient data pipelines (ETL/ELT).
                  • Write and optimize complex queries on large data sets.
                  • Transform data and map them to more valuable and understandable sets for consumption.
                  • Work with other teams (Big Data, Marketing, Customer Support, etc.) to gather requirements, enable them to access datasets, and empower them to understand and contribute to our data processes.
                  • Design and implement data retrieval, storage, and transformation systems at scale.
                  • Understand and implement data lineage, data governance, and data quality practices.
                  • Create tooling to help with day-to-day tasks.
                  • Troubleshoot issues related to data accuracy and maintain an accurate source of truth.
                  • Introduce new technologies to the environment through research and POCs.

                  Legacy tech stack:

                  • Python scripts
                  • Shell scripting
                  • SQL scripts and stored procedures
                  • Understanding of systems / infrastructure deployment
                  • Hadoop / Hive / MapR / SQL Server / MySQL

                  Newer tech stack:

                  • Python
                  • Airflow
                  • GCP
                  • Kubernetes / working with Docker containers
                  • Messaging systems / RabbitMQ

                  And comfortable with:

                  • Using the terminal / SSH / Git
                  • Software development workflows
                  • CI/CD terminology and tools
                  • Infrastructure as code

                  Should understand concepts like:

                  • Handling of PII data
                  • Data governance
                  • Data lakes, data warehouses, batch/stream processing, etc.

                  Requirements:

                  • Advanced experience (~5 years) with Python and SQL.
                  • Experience with RabbitMQ and Kubernetes.
                  • Experience working in hybrid environments (GCP / Azure / AWS / on-prem), specifically with storage, database, and distributed processing services.
                  • Experience working with REST APIs and Python data tools (Pandas/SQLAlchemy)
                  • Experience building ETL and big data pipelines (Airflow/Dagster/Prefect)
                  • Ability to read and understand existing code/scripts/logic
                  • Experience with an IaC tool like Terraform.
                  • Comfortable using Terminal / ssh / Powershell / Bash / Code versioning, branching / git
                  • Experience managing a project backlog and working cross-functionally with multiple stakeholders.
                  • Ability to work effectively on a self-organizing team with minimal supervision.
                  • Initiative in communicating with co-workers, asking questions and learning.
                  • Excellent oral and written communication skills.

                  Bonus points for:

                  • Experience with dbt.
                  • Experience with CICD tools (Jenkins/Github/Gitlab).
                  • Experience with stream-processing systems (Storm/Flume/Kafka).
                  • Experience with schema design and dimensional data modeling.

                  Apply





                    This role is for a Field Sales Representative with German, who will drive strategic growth initiatives and contribute to the organization’s expansion.

                    Responsibilities:

                    • Sales and Business Development:
                        • Develop and execute sales and sales operations strategies to increase the reach of the customer-facing organization.
                        • Report on key sales metrics to track progress and identify potential growth opportunities.
                        • Forge strategic alliances to broaden the company's reach and enhance customer acquisition.
                        • Work closely with the GCP Sales Team to showcase Zazmic's offerings and expand market presence.
                        • Acquire new customers to boost revenue and strengthen market position.
                        • Retain existing clientele through exceptional customer service, fostering long-term relationships.
                        • Some travel may be required each quarter for events.
                    • Collaboration with Google:
                        • Actively engage with Google Cloud teams to build partnerships and stay informed about product offerings and customer incentives.
                        • Maintain Google Cloud certifications and engage in continuous learning to gain a comprehensive understanding of services and products.
                    • Sales Process:
                        • Qualify leads using the BANT (Budget, Authority, Need, Timeline) framework to identify potential resell opportunities.
                        • Manage the sales process, from lead qualification to contract execution.
                        • Understand Zazmic's internal offerings to identify upselling and cross-selling opportunities for customers.
                        • Provide exceptional customer service and support to address inquiries.
                    • Collaboration internally and externally:
                        • Collaborate with cross-functional Zazmic teams, including Marketing, Project Management, and Technical Support.

                    Preferred Qualifications:

                    • 3-5 years of experience in Google Cloud products and services, with a deep understanding of their benefits for customers.
                    • Proven track record of sales success and achievement of KPI sales targets.
                    • Familiarity with using CRM systems to track leads and manage sales pipelines.
                    • Ability to build relationships with Customers and Googlers, understanding their needs.
                    • Excellent communication and interpersonal skills for effective engagement with customers and internal teams, both verbally and through email.
                    • Ability to work independently and as part of a team, collaborating with sales team members, engineers, and product managers.

                    Why join us:

                    • Ability to work remotely from anywhere in the world
                    • Close cooperation with the development team and client
                    • Opportunity to influence product development
                    • Professional growth: the certification preparation course is free for our specialists. The company pays for two attempts to pass the exam, regardless of the exam result
                    • We cover English classes (with a native speaker)
                    • Boost your professional brand: you can participate in local conferences as a listener or a speaker
                    • Regular team-building events: have fun with your teammates
                    • Gifts for significant life events (marriage, childbirth)
                    • Tech and non-tech Zazmic Communities: support and share experience with each other

                    Apply



