We are looking for a Data Engineer to join our scrum team for the global reporting tool.
Tasks:
- General data engineering tasks
- Develop new functionality for the data pipeline, including improved data-quality functions, harmonization of data across multiple countries, and derivation of new KPIs from that data for use in dashboards
- Maintain and improve existing global reporting functionality
Requirements:
- Experience building and optimizing big data pipelines and architectures
- Hands-on experience with Cloud services (preferably Azure Data Factory)
- Deep knowledge of writing complex queries for relational databases (PostgreSQL)
- Solid Spark knowledge
- Experience with Databricks and Synapse
- Hands-on experience with Python
- General understanding of networking and IT security principles
Other skills:
- Hands-on experience with SQL database design is a plus
- Practical experience with Linux, shell scripting, and Git (and other DevOps-related tools) is a plus
- Experience with data governance frameworks (e.g. Informatica) and principles is a plus
- Hands-on experience with Power BI implementation is a plus
What we offer in return:
- Home office option
- Competitive salary
- Thirteenth- and fourteenth-month salary
- Annual financial bonus – based on individual targets
- Wide range of certified training courses
- Opportunities to develop existing and new skills
- Support of work-life balance