Data Engineer
Huvepharma® is a privately-owned, fast-growing global pharmaceutical and feed additives company with a focus on developing, manufacturing, and marketing human and animal health products.
Over recent years, Huvepharma® further broadened its manufacturing platform through significant investments in its existing facilities, complemented by strategic acquisitions in Europe and the USA.
Today, Huvepharma® is active in more than 100 countries, with production facilities across the globe: two sites in France, three biotechnology and vaccine facilities in Bulgaria, and six sites in the USA for vaccines and veterinary medicines.
We are looking for an experienced Data Engineer to join our team and support the full lifecycle of our data and integration platforms. You will work with a modern Microsoft- and IBM-aligned data stack to design, build, administer, and maintain scalable data solutions and system integrations. Your work will ensure that data across the organization is accurate, secure, and readily available for analytics and operational needs.
Key Responsibilities:
- Design, build, administer, and maintain ETL/ELT data pipelines using IBM DataStage and Microsoft integration technologies (e.g., Microsoft Fabric Data Pipelines, Azure Data Factory, Logic Apps).
- Develop, maintain, and administer datasets, dataflows, warehouses, and pipelines within Microsoft Fabric.
- Work with IBM Cloud Pak for Data to integrate, virtualize, refine, and govern enterprise data.
- Build, maintain, and administer integrations between internal and external systems, using both IBM and Microsoft-based integration tools.
- Support migration, integration, and modernization initiatives across the Azure cloud ecosystem.
- Manage and optimize data environments for performance, scalability, quality, and cost efficiency.
- Ensure data quality, security, compliance, and governance standards are met.
- Collaborate closely with analysts, BI developers, architects, and data scientists.
- Troubleshoot issues, analyze root causes, and improve reliability of data pipelines and integrations.
Requirements:
- Bachelor’s degree in Computer Science, Information Systems, Engineering, or equivalent experience.
- Proficiency in SQL.
- Experience with IBM DataStage (development and administration).
- Experience with Microsoft Fabric (Data Pipelines, Lakehouse, Dataflows, Warehouse).
- Experience with Microsoft integration technologies such as Azure Data Factory, Logic Apps, Azure Functions, or Fabric Pipelines.
- Experience with IBM Cloud Pak for Data (Data Virtualization, Data Refinery, governance modules).
- Familiarity with Azure data services: Data Lake, Synapse, Storage Accounts, Key Vault, etc.
- Understanding of data modeling, warehousing concepts, integration patterns, and API-based connections.
- Strong analytical, troubleshooting, and problem-solving skills.
Preferred Qualifications:
- Experience with Dynamics 365 Finance & Operations (D365FO) as an ERP solution, especially its data structures and integration capabilities.
- Experience with Azure DevOps, CI/CD pipelines, or Git-based workflows.
- Knowledge of Python or PowerShell for automation.
- Experience tuning large-scale data workloads and administering enterprise data platforms.
The company offers:
- Competitive remuneration package and benefits.
- Opportunity to work in an established global company with a strong commitment to employee growth and professional development.
- Career advancement opportunities in an international environment.
- 23 days of paid vacation per year.
- Food vouchers - 200 BGN per month.
- Premium health insurance package.
- Multisport card.
If you are interested in this opportunity, do not hesitate to apply today.
Please send us your CV in English. Ensure your CV contains only information relevant to your academic and professional background, avoiding sensitive personal data. Your application package will be treated with strict confidentiality. Only shortlisted applicants will be contacted.
The personal data you provide will be treated in strict compliance with applicable personal data protection legislation.