DevOps Engineer
Task information:
ABOUT THE PROJECT / ROLE:
You will join a diverse DevOps team of multi-skilled engineers across LT/DK/PL. We focus on backend infrastructure that processes airfare data, handling billions of messages daily. Our new projects are hosted on AWS, and we are in the middle of migrating to a fully cloud-based setup. We store data in SQL databases, Snowflake, and S3. Our infrastructure is managed with Terraform, our code lives in GitHub, and monitoring is set up via the DataDog observability platform.
YOU WILL:
- Contribute to our shared Cloud Platform initiative by developing and implementing standardised solutions and best practices to expedite service delivery and enhance reliability. This will empower our organisation to operate more efficiently and automate processes as much as possible.
- Manage core AWS services within a multi-account environment, employing an Infrastructure as Code (IaC) approach.
- Oversee and maintain a range of Kubernetes clusters, both on-premises and cloud-based.
- Collaborate with developer teams to enable them to migrate, build and run their own solutions.
- Work closely with SRE Team to improve observability and reliability of our solutions with DataDog, keeping SLA/SLO/SLI in mind.
Requirements:
- Experience as a DevOps Engineer or in a related role.
- Experience with AWS services in a multi-account environment: EC2, ALB/NLB, EFS, S3, EKS/ECS/ECR, MSK, RDS/DynamoDB/Aurora, ElastiCache.
- Experience managing multiple Kubernetes clusters (EKS, EKS-D), both on-premises and in the cloud.
- Experience with infrastructure automation tools such as Terraform and Ansible.
- Experience with an observability platform such as DataDog, New Relic, Splunk, ELK, or EFK.
- Fluent in English, both written and spoken.
BONUS IF YOU HAVE:
- AWS Cloud migration experience.
- Experience in debugging live production issues.
- Advanced knowledge in Linux, Networking, Docker/Containers, Kubernetes.
- Experience with Azure DevOps or similar CI/CD tools such as GitHub Actions or Bitbucket Pipelines.
- Experience with HashiCorp Vault and Consul or similar secret management and service discovery solutions.
- Experience with Kafka or Snowflake management.
- Understanding of data pipelines and ETL processes.
- Experience using AWS Organizations and AWS Account Factory for Terraform (AFT).
Company offers:
- The opportunity to work with passionate individuals at the intersection of technology and the travel business domain.
- A modern office in a convenient location, with flexible working arrangements that allow remote work or office attendance.
- Opportunities to travel to our other offices, fostering a global perspective.
- The chance to join an industry-leading international company with a commitment to innovation.
- An attractive compensation and benefits package, including private health insurance, a company bonus scheme, and voluntary participation in a company-supported retirement scheme.
- A generous annual leave policy that grows with each year of service, and a day off during your birthday month.
- Great growth and development opportunities, both professionally and personally.
- The exact salary offered will be based on your qualifications, competencies, professional experience, and the requirements of the corresponding job function.
Contacts
Contact person:
Reda Maumevičienė
Phone:
E-mail:
Address:
K. Donelaičio str.62-320, BLC Business Centre, Kaunas, Lithuania
Confidentiality guaranteed. Only selected candidates will be informed.