Job Description
This is a remote position.
Profitroom – Empowering hotels directly
We are currently looking for an experienced Data Engineering Team Leader to join our Data Team and help us make our company even more data-driven. The results of your work will directly impact product development, the way we support our customers, and influence our high-level business strategy. If you are ready to take initiative and believe in data-driven decision-making – this role is perfect for you!
Serve as a technical and team leader in the Data Engineering team:
Lead the team's development and provide supportive, people-focused management
Support career paths and mentor team members
Act as a Delivery Manager for data-related projects:
Define, prioritize, and ensure timely delivery of engineering tasks
Take ownership of major technical decisions and engineering excellence:
Establish and enforce standards for testing, code review, CI/CD, documentation, and monitoring
Oversee the architecture of our data platform:
Maintain and evolve our Lakehouse infrastructure (preferably Databricks-based)
Introduce new tools and technologies aligned with business and technical goals
Supervise the logical structure and modeling of data:
Ensure semantic and architectural consistency of datasets
Leverage approaches such as the Kimball model or Medallion architecture
Contribute to coding:
Develop and optimize data pipelines using Dagster, Python, and PySpark
Collaborate cross-functionally with data analysts, PMs, and other business stakeholders
Drive and mature data governance practices, including cataloguing and lineage
Requirements
At least 3 years of experience as a Data Engineer or in a similar data-related role
At least 1 year of experience leading technical teams or managing data projects (a big advantage), or a willingness to grow into it
Strong knowledge of Python and SQL, orchestration tools such as Dagster or Airflow, and ideally PySpark (nice to have)
Experience working with cloud data platforms (Databricks experience is a plus) and Lake/Lakehouse architectures
Ability to define and uphold high engineering standards and processes
Proficiency in data modeling (Kimball, Medallion)
Excellent communication skills (English at B2+ level)
Strong ownership and self-organization
Nice to have:
Experience with Databricks and Delta Lake
Familiarity with tools such as DataHub, Terraform, Docker
Background in implementing data governance, lineage, and quality frameworks
Tech stack: Python, PySpark, SQL, Databricks, GCP (BigQuery), Dagster, Airflow, Delta Lake, Docker, Terraform, DataHub
Benefits
Enjoy Work-Life Balance: Embrace a fully remote and flexible work environment.
Explore the World: Take advantage of annual 'Work with Us, Travel with Us' vouchers.
Grow Your Skills: Access to English language classes along with a dedicated team development fund.
Stay Healthy: Benefit from co-financed life and medical insurance, access to sports facilities, and professional mental health support whenever needed.
Take Time Off: Get 26 days off on a Contract of Employment, or 24 days off on a B2B contract.
Share Hospitality: Take 2 extra days off annually for CSR activities.
Join Celebrations: Participate in company retreats and events, receive wedding & baby packs, and benefit from our employee referral program.
Transparent Culture: Experience a flat hierarchy and open communication channels for transparency.
Contract Enhancements: Earn between 25,500 and 30,000 PLN on a B2B contract, or between 21,000 and 25,000 PLN gross on a Contract of Employment.
About Us:
We are a global hospitality software technology company founded in 2008 in Poznań, Poland (still home to our HQ), and we have kept growing ever since: we now span 5 continents, with over 3,500 customers constantly improving their revenue streams.
We deliver leading SaaS technology and marketing services that give hoteliers the tools to increase revenue performance, bookings, and efficiency, while providing their guests with the best services and experiences.