Description
About the Role
We’re hiring a DevOps Engineer to join our Platform team and take full ownership of infrastructure and DevOps processes for our Data Platform. Today, data engineers are stretched across both data pipeline development and infrastructure maintenance. You’ll step in to streamline, automate, and elevate the backend systems that power our analytics and data-driven products.
This is a hands-on and collaborative role where you’ll have the opportunity to reshape core infrastructure, focusing on AWS, CI/CD systems, database ops, and performance monitoring, while directly supporting some of the most critical projects in the company.
About the Team
You’ll be part of the Platform Engineering team, working closely with:
- Data engineers (your closest collaborators and internal customers)
- Backend and Frontend engineers building shared services
- DevOps colleagues building EKS clusters, migration pipelines, and automation tools
Responsibilities
- Own and evolve cloud infrastructure for the Data Platform, with a focus on AWS services like RDS, S3, IAM, and VPC
- Build and maintain CI/CD pipelines for data-related workloads (e.g., Airflow, dbt, ingestion pipelines, Terraform)
- Collaborate with data engineers to orchestrate and scale data pipelines, improving reliability and observability
- Manage database operations, including migrations, replication, backups, and performance tuning for PostgreSQL/MySQL
- Implement and enforce IAM policies and security best practices for data infrastructure
- Optimize data platform cost, availability, and scalability across environments
- Monitor platform health and implement alerting for failures, bottlenecks, or data anomalies
- Enable self-service infrastructure for the data team by creating reusable templates and automation tools
- Champion infrastructure best practices across the data lifecycle
Requirements
- 4+ years of DevOps or Infrastructure Engineering experience
- Proven track record working with data engineering teams or on data-intensive platforms
- Strong hands-on experience with AWS, especially:
  - RDS (PostgreSQL/MySQL): migration, replication, HA, tuning
  - S3 for data lakes
  - IAM, VPC, networking, security best practices
- Experience with CI/CD for data pipelines or infrastructure-as-code (e.g., deploying Airflow DAGs, dbt jobs)
- Proficient in Terraform, GitLab CI/CD, and scripting (Bash, Python)
- Deep understanding of monitoring and observability (e.g., Prometheus, OpenTelemetry, ELK)
- Comfortable enabling cross-functional collaboration and self-service tooling
Nice to Have
- Experience with Databricks, Delta Lake, or S3-based data lakes
- Familiarity with Apache Airflow, Step Functions, or other workflow orchestration tools
- Exposure to Spark, EMR, or large-scale data processing environments
- Background in data validation, schema evolution, or lineage tracking
- Experience managing secrets and sensitive credentials in data environments (e.g., AWS Secrets Manager, Vault)
What it’s like to work with Us
*Opportunity to join our Employee Stock Options program.
*Opportunity to help scale a unique product.
*Various bonus systems: performance-based, referral, additional paid leave, personal learning budget.
*Paid volunteering opportunities.
*Work location of your choice: office, remote, opportunity to work and travel.
*Personal and professional growth at an exponential rate supported by well-defined feedback and promotion processes.
*Please attach CVs in English.
Are you interested in this position?
Apply by clicking on the “Apply Now” button below!