
Data Platform Engineer
Infolet•Kraków
🏢 remote · ⭐ senior · 📄 permanent
💰 Salary
17000 - 20400 PLN/month
Expires in: 27 days
📋 Information
Location: Kraków
Work mode: Remote
Employment: Full-time
Experience level: Senior
Min. years of experience: 10+
Contract type: Employment contract
Category: Data Science
🛠 Required technologies
AWS, SQL, Python, Scala, Databricks, Airflow, Terraform
🌐 Required languages
PL (C2), EN (B2)
📝 Main description / Introduction
You will:
- Design and implement a cloud-based data lakehouse platform ingesting engineering and security telemetry from multiple sources
- Build scalable Bronze/Silver/Gold data layers, ensuring reliable ingestion, transformation, and consumption patterns
- Develop and maintain streaming and batch pipelines using technologies such as Kafka, Flink, Spark Streaming, or Kinesis
- Model complex relationships using graph databases (Neo4j, Neptune, TigerGraph) to support advanced analytics and AI-driven use cases
- Collaborate with data scientists, platform engineers, and security teams to deliver high‑quality, production-grade data solutions
- Implement robust data transformation frameworks (dbt, Databricks SQL, Dataform, or custom SQL/Python pipelines)
- Build and maintain orchestration workflows using Airflow, Prefect, Dagster, Step Functions, or similar tools
- Define and enforce data modeling standards, including dimensional modeling, SCD2, and normalization strategies
- Use Infrastructure as Code (Terraform, CloudFormation, Pulumi) to provision and manage cloud resources
- Ensure observability and reliability of pipelines through monitoring, alerting, and data quality validation
- Work autonomously in a fast‑moving environment, proposing solutions even when requirements are incomplete
Candidate profile
Must have
- 10+ years of experience in data engineering (at least 2 years building lakehouse architectures)
- Proven experience delivering production-grade data platforms in cloud environments
- Strong hands-on expertise with AWS (S3/Blob storage, RDS/SQL DB, managed Kafka, serverless compute)
- Expert-level SQL and deep understanding of data modeling (dimensional, SCD2, normalization/denormalization)
- Proficiency with Python or Scala (data processing and automation)
- Experience with stream processing (Kafka, Flink, Spark Streaming, Kinesis)
- Hands-on experience with dbt, Databricks SQL, Dataform, or similar transformation frameworks
- Strong orchestration skills using Airflow, Prefect, Dagster, or equivalent
- Solid understanding of IaC (Terraform, CloudFormation, Pulumi)
- Ability to explain technical trade-offs to non-technical stakeholders
- Strong problem-solving skills
- Fluent English (business and technical)
📡 Statistics metadata
Source: solidjobs
Slug / ID: remote-data-platform-engineer-infolet-b81dbe
Published: 19 March 2026
Expires: 19 April 2026
Ingested: 19 March 2026