Data Warehouse Engineer (m/f/x) (Remote or Munich) - full-time

Munich (Germany)

About Joyn 

Do you love stories? If so, please keep reading, because we certainly do. We believe the ability to tell stories is what makes us human. Joyn is your streaming app with over 65 live TV channels, exclusive previews, originals and collections. We understand Joyn as a partnership – an invitation to content providers and users alike to make entertainment more meaningful and fun. Our app aggregates global and especially local content in a relevant way for Germany, both live TV and on-demand content. All kinds of stories, and more to come, every day.

We hire the best, because we need people who are as customer-focused as we are. We are looking for champions to help us further connect with our audience. It’s not a small or easy task, but it’s a fun and rewarding one. Do you think you’re up for it? Great. Then send us your application!


About the Job

At Joyn, our users, services, and connected third parties produce about one billion events per day. Collecting and processing that data is the fundamental prerequisite for getting the insights we need to grow our business and improve our product. You and your team of talented and passionate data engineers and scientists enable this by building a highly performant, scalable, and resilient data platform. We live a “we build it, we ship it, we own it” culture, constantly improving our tools, processes, and software stack. If you love what you do, want to make an impact, and want to embrace the challenge, Joyn is the right place for you to work.

Opportunities to make an impact - what you do

  • Develop, implement and maintain data pipelines, data catalogs, and tools to make data available to other teams and the business with the highest possible quality.
  • Design, implement and maintain a state-of-the-art fully cloud-based data warehouse solution for a broad range of applications working on Snowflake (we use Data Vault 2.0) and BigQuery (the home of our Data Lake).
  • Actively communicate with stakeholders across the business to understand their data needs, and translate those needs into corresponding tables.
  • Test, integrate, and deploy code automatically using GitLab CI/CD and take care of running your services in production on Google Cloud Platform and AWS.
  • Apply software engineering best practices to implement processes, systems, and tools that help you and your team to move fast with high confidence.
  • Participate in technical design and architectural discussions with your own and other teams to solve real user issues.
  • Learn and strive to excel in areas you have not touched before and share your knowledge and learnings with your colleagues.

What we are looking for

  • You have a degree in computer science or a related field or a high level of practical experience working with data at scale.
  • You have advanced knowledge of data warehouse design and optimization practices - preferably in Snowflake or BigQuery.
  • You use SQL on a daily basis to write highly performant, readable, and understandable queries.
  • You have knowledge of at least one of the following tools: dbt, Airflow, Airbyte, Prefect, or Databricks.
  • You have working knowledge of at least one programming language, preferably Python 3.
  • You strive for “everything-as-code” and ideally have working experience with CI/CD & data test automation.
  • You take care of code & data quality, share your knowledge, and love to do code reviews.
  • You are comfortable working in a fast-paced, ever-changing environment that lets you grow.