
Data Engineer
ROCKWOOL GBS • Poznań, Wilda
Salary
Range not disclosed
Our requirements
- 3-5+ years of experience in data engineering (the specific tech stack is flexible - we value your way of thinking and problem-solving above tools)
- Strong consulting mindset or experience working closely with business stakeholders, with the ability to ask the right questions, challenge assumptions, and translate business needs into technical solutions
- End-to-end ownership approach - from designing and building solutions to monitoring, improving, and maintaining them
- Excellent communication and collaboration skills, enabling you to work effectively across teams and influence decisions
- Degree in Computer Science/Engineering or equivalent hands-on experience
- Experience with Databricks, Spark, and streaming technologies (e.g., Kafka) - a strong plus
- Proficiency in English at a minimum B2 level, spoken and written
Benefits
- sharing the costs of sports activities
- private medical care
- sharing the costs of professional training & courses
- life insurance
- remote work opportunities
- flexible working time
- fruits
- corporate products and services at discounted prices
- integration events
- corporate sports team
- saving & investment scheme
- corporate library
- no dress code
- video games at work
- coffee / tea
- drinks
- leisure zone
- holiday funds
- redeployment package
- birthday celebration
- employee referral program
- opportunity to obtain permits and licenses
- charity initiatives
- office massages
ROCKWOOL GBS
We are a global leader in stone wool solutions. Our team of over 12,000 people across 40 countries delivers products to customers in more than 120 markets.

Our mission is to support sustainable development. We help reduce energy consumption, noise, and fire risks, improving quality of life wherever our solutions are used.

The ROCKWOOL Global Business Services Center has been operating since 2016. We started with 27 people, and today we are a team of over 600 - and still growing. In Poznań, we are developing competence centers in areas such as IT, R&D, Engineering, Sourcing, and Digital Marketing.
Your responsibilities
You'll be the go-to Databricks expert on the team. You'll play an important role in the migration from the legacy stack while designing and building the new platform in parallel, and "in parallel" is doing real work in that sentence.
The legacy platform runs. Not beautifully, but it runs, and it serves real business needs that can't wait for the migration to finish. You'll split your time between keeping it stable (and gradually less painful) and building its replacement. If the idea of legacy firefighting makes you want to close this tab, this probably isn't the right role. If you see it as part of the job and take quiet satisfaction in fixing things that are broken, read on.
More specifically, you will:
- Design & build the new streaming platform (Kafka → Databricks with Declarative Pipelines)
- Migrate existing batch workflows from Airflow + Docker + on-prem Databricks to a cloud-native architecture
- Keep the current platform stable while improving its reliability, performance, and operability
- Architect the serving layer
- Govern data properly (Unity Catalog, lineage, access control, data quality), not as an afterthought
- Enable data sharing across the organization with Polaris and Iceberg
- Collaborate with data scientists, ML engineers, and business teams across regions
- Use AI tools daily: we use GitHub Copilot and internal assistants/agents we build ourselves within the team, and we expect you to help the team get real value from them
What we offer
By joining our team, you become part of the people-centric work environment of a Danish company. We offer a competitive salary, a permanent contract after the probation period, a development package, team-building events, and an activity-based office in Poznań's city center in the prestigious new office building Nowy Rynek. The building is recognized as barrier-free, which means it is fully adapted to the needs of people with disabilities.
Our compensation package on employment contracts includes:
- An office-first approach: home office is available up to 2 days per week
- Adaptable Hours: start your workday anytime between 7:00 AM and 9:00 AM
- Home office subsidy
- Private Medical Care
- Multikafeteria MyBenefit
- Wellbeing program
- Extra Day Off for voluntary activities
… and while in the office you can also enjoy a modern office space with a beautiful view and high-standard furniture, bicycle parking facilities & showers, chill-out rooms with a PlayStation, football table, pool table, and board games, and a subsidized canteen with delicious food & fruit.
Recruitment stages
- Phone interview with a Recruiter – 30 min
- Technical interview with the Manager – 60 min
- Final meeting – 40 min
The team you will join
You will join our Data Science & Engineering Team, a group of 14 skilled professionals including the Team Leader. The team combines strong expertise in data engineering, analytics, and machine learning, and is structured into several project-focused sub-teams working across a variety of business areas.
Our data platform is already in the cloud, already on Databricks. But we're not here to maintain the status quo: we're rebuilding it from the ground up to jump into the exciting world of real-time data and streaming.
We will migrate from a batch-oriented Airflow + Databricks setup to a streaming-first architecture: Kafka, Databricks with new features such as Declarative Pipelines, Unity Catalog, Apache Iceberg / Polaris Catalog, and a new serving layer, which you will help us select.
This is a greenfield build inside a global company: real budget, real data, real stakes. No startup chaos, but real room to make meaningful architectural decisions.