Connect with Quadient

At Quadient, we support businesses of all sizes in their digital transformation and growth journey, unlocking operational efficiency with reliable, secure, and sustainable automation processes. Our success in delivering innovation and business growth is inspired by the connections our diverse teams create every day, with our clients and with each other. It’s these connections that make Quadient an exceptional place to grow your career, develop your skills, and make a real impact. Join us and help our future-focused business lead the way in powering secure and sustainable business connections through digital and physical channels.

Your role in our future

We are looking for a Data Engineer to design, build, and evolve the next generation of our data platform. In this role, you will take ownership of the end-to-end data lifecycle, from ingestion to activation, ensuring data is reliable, scalable, and ready to support business decision-making across the organization.

You will work closely with cross-functional teams to translate business needs into robust data solutions and contribute to the development of modern, cloud-based data architecture, including real-time data services and AI-enabled workflows.

  • Design, implement, and maintain data pipelines (event-based and batch) using modern orchestration tools (e.g., Dagster)

  • Develop and manage reverse-ETL (rETL) workflows to sync warehouse data to third-party systems such as CRM, CDP, and marketing platforms

  • Own and evolve the data architecture, covering ingestion, transformation, storage, and data serving layers

  • Design scalable data models and ensure data quality, governance, and performance optimization

  • Build data services, APIs, and internal data products to enable real-time and programmatic data access

  • Integrate machine learning models and AI capabilities into data pipelines and workflows

  • Ensure reliability through monitoring, observability, and continuous improvement of data processes

  • Collaborate with product, engineering, and business teams to define and deliver data-driven solutions
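To give a flavor of the reverse-ETL work described above, here is a minimal sketch in plain Python. All names here (`fetch_segment`, `to_crm_payload`, `sync_to_crm`) are hypothetical illustrations, not Quadient APIs; a real pipeline would query the warehouse and call an actual CRM client.

```python
def fetch_segment(rows):
    # Stand-in for a warehouse query,
    # e.g. SELECT ... FROM customers WHERE churn_risk > 0.8
    return [r for r in rows if r["churn_risk"] > 0.8]

def to_crm_payload(row):
    # Map warehouse columns to the field names the CRM expects
    return {"email": row["email"], "risk_tier": "high"}

def sync_to_crm(rows, push):
    # Push each at-risk customer to the CRM;
    # `push` abstracts the real API client so the logic stays testable
    payloads = [to_crm_payload(r) for r in fetch_segment(rows)]
    for p in payloads:
        push(p)
    return len(payloads)
```

In production, a step like this would typically run as an orchestrated job (e.g. a Dagster asset) with retries, observability, and idempotent writes.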

 

Your profile

  • Proven experience as a Data Engineer or in a similar role, with strong ownership of data pipelines and architecture

  • Strong proficiency in Python and SQL

  • Experience with orchestration tools such as Dagster, Airflow, or Prefect

  • Hands-on experience with modern cloud data platforms (e.g., BigQuery, Snowflake, Redshift, Databricks)

  • Experience with data modeling, data quality frameworks, and ETL/ELT processes

  • Familiarity with building APIs or microservices for data access

  • Experience integrating machine learning models into production environments

  • Understanding of event-driven architectures (e.g., Kafka, Pub/Sub) is a plus

  • Strong analytical thinking and problem-solving skills

  • Ability to work effectively in cross-functional, international teams

  • Fluent in English

Knowledge gaps can be filled. Even if you don’t meet every qualification listed, we still want to hear from you.
