IT Data Engineer
Universal Investment•Polska
💰 Salary
Salary range not disclosed
📋 Information
📝 Main description / Introduction
Grow with us! Universal Investment is on its way to becoming Europe's leading fund services platform and Super ManCo. We have provided innovative fund solutions for asset managers and institutional investors since 1968, and we are well established in Germany, Luxembourg, Ireland, and Poland. From our offices in Frankfurt, Krakow, Luxembourg, Dublin, Paris, and Stockholm, we continue to expand internationally. We are currently looking for entrepreneurial people who want to grow with us and share in our success.
We are looking for a hands-on, business-oriented Data Engineer to support the rollout of our new enterprise data platform. This role is critical to enabling high-quality, reusable data products and pipelines that support fund administration, management company services, and API-based products. The successful candidate will play a key part in the digital evolution of our data landscape, ensuring it aligns with the operational, regulatory, and strategic needs of the financial industry.
Your tasks and responsibilities:
- Build data pipelines: Design and maintain scalable data workflows to ingest, transform, and expose fund-related data in a cloud environment.
- Deliver data products: Support the development of governed, reusable datasets aligned with business needs and data domains.
- Support APIs: Enable backend data flows for APIs, ensuring accuracy and reliability.
- Ensure quality and governance: Apply validation, documentation, and lineage practices to maintain data trust and compliance.
- Modernize systems: Migrate legacy processes to platforms like Microsoft Fabric and Databricks.
- Collaborate in agile teams: Deliver iteratively in close coordination with analysts, architects, and domain leads.
Your skills and experience:
- University degree in Computer Science, Data Engineering, or similar field.
- 3–5 years of experience in data engineering, ideally in the financial industry.
- Proficient in SQL and procedural extensions (e.g., PL/SQL, T-SQL) for data manipulation and modeling.
- Proficient in Python, with hands-on experience in modern data platforms (e.g., Azure Data Factory, Azure Databricks, or comparable services on AWS and GCP) for data processing, automation, and deployment, complemented by practical experience with CI/CD tooling (e.g., GitLab CI, Azure DevOps, Bicep).
- Team-oriented, proactive, and comfortable working in agile setups.
- Understanding of fund-related data and reporting processes is a plus.
- Proficiency in business English (spoken and written); knowledge of German or French is a plus.