Impact Fund Management, Data Engineer

Job Offer from Innpact S.A.

Role

Innpact is looking for a highly skilled and motivated Data Engineer to join its Fund Management team. The Data Engineer will play a pivotal role in developing, implementing and maintaining the company's data infrastructure, while also contributing to software development projects related to its growing impact and fund management activities. Working in close collaboration with cross-functional teams in an agile environment, the Data Engineer will ensure the smooth management of data pipelines, databases, and ETL processes, with a strong focus on quality, security, and compliance.

Responsibilities

  • Data Pipeline Development: Design, build, and maintain scalable data pipelines to process large volumes of data efficiently. Ensure data integrity, quality, and consistency across diverse data sources while adhering to European data privacy regulations (e.g. GDPR).
  • Database Management: Develop, manage, and optimize high-performance, secure databases. Perform data modelling and schema design to meet business requirements. Implement data storage solutions, including relational and NoSQL databases, ensuring reliability and availability.
  • ETL Processes: Create, monitor and troubleshoot robust ETL processes to extract, transform, and load data from multiple sources. Collaborate with data analysts and scientists to understand business needs and deliver timely, reliable data.
  • Agile Collaboration: Work within an agile framework, participate in project meetings, and contribute to data-driven decision making. Continuously promote best practices in data engineering, governance and security while automating processes to improve team efficiency.

The ideal candidate has

  • A strong passion for contributing to sustainable and responsible finance, with a deep commitment to advancing the impact finance sector.
  • 3-5 years of experience as a data engineer or in a related role.
  • Master’s degree in Computer Science, Information Technology, or a related field.
  • Expertise in SQL, Python and data pipeline tools (e.g., Apache Kafka, Apache Airflow). Proficiency with relational and NoSQL databases, both on-premises and in the cloud (AWS, Azure, or Google Cloud).
  • Comprehensive knowledge of data security protocols, governance, and compliance standards, especially within the financial sector.
  • Advanced proficiency in English, both written and spoken, with excellent communication and presentation skills. Able to work both independently and collaboratively in a dynamic team environment.
  • Strong attention to detail with a focus on accuracy and high-quality work.