We are looking for an experienced Data Engineer to join our team and support the full lifecycle of our data and integration platforms. You will work with a modern Microsoft- and IBM-aligned data stack to design, build, administer, and maintain scalable data solutions and system integrations. Your work will ensure that data across the organization is accurate, secure, and readily available for analytics and operational needs.
Key responsibilities:
- Design, build, administer, and maintain ETL/ELT data pipelines using IBM DataStage and Microsoft integration technologies (e.g., Microsoft Fabric Data Pipelines, Azure Data Factory, Logic Apps).
- Develop, maintain, and administer datasets, dataflows, warehouses, and pipelines within Microsoft Fabric.
- Work with IBM Cloud Pak for Data to integrate, virtualize, refine, and govern enterprise data.
- Build, maintain, and administer integrations between internal and external systems using both IBM- and Microsoft-based integration tools.
- Support migration, integration, and modernization initiatives across the Azure cloud ecosystem.
- Manage and optimize data environments for performance, scalability, quality, and cost efficiency.
- Ensure data quality, security, compliance, and governance standards are met.
- Collaborate closely with analysts, BI developers, architects, and data scientists.
- Troubleshoot issues, analyze root causes, and improve reliability of data pipelines and integrations.
Requirements:
- Bachelor's degree in Computer Science, Information Systems, or Engineering, or equivalent experience.
- Proficiency in SQL.
- Experience with IBM DataStage (development and administration).
- Experience with Microsoft Fabric (Data Pipelines, Lakehouse, Dataflows, Warehouse).
- Experience with Microsoft integration technologies such as Azure Data Factory, Logic Apps, Azure Functions, or Fabric Pipelines.
- Experience with IBM Cloud Pak for Data (Data Virtualization, Data Refinery, governance modules).
- Familiarity with Azure data services: Data Lake, Synapse, Storage Accounts, Key Vault, etc.
- Understanding of data modeling, warehousing concepts, integration patterns, and API-based connections.
- Strong analytical, troubleshooting, and problem-solving skills.
Preferred qualifications:
- Experience with Dynamics 365 Finance & Operations (D365FO) as an ERP solution, especially its data structures and integration capabilities.
- Experience with Azure DevOps, CI/CD pipelines, or Git-based workflows.
- Knowledge of Python or PowerShell for automation.
- Experience tuning large-scale data workloads and administering enterprise data platforms.
How to apply:
If joining a dynamic, evolving company appeals to you, please send us your CV in English using the form below. Ensure your CV contains only information relevant to your academic and professional background, and avoid including sensitive personal data.
All applications will be treated in full confidentiality. Only short-listed applicants will be contacted.
The personal data you provide will be processed in strict confidence and in accordance with applicable legislation.