Company Description
Techland is one of the biggest video game companies in Poland, with over 30 years of experience in the gaming industry. From our studios in Wrocław and Warsaw, we’ve built an international team of more than 500 talented professionals, all dedicated to pushing the boundaries of game development.
We’re known for creating iconic franchises like Call of Juarez and the genre-defining zombie series Dying Light, which has been played by over 45 million players worldwide. With a focus on open-world action, storytelling, and community engagement, we’re committed to delivering unforgettable experiences to our players.
We’re constantly striving to improve, innovate, and take on new challenges. With ambitious plans for the future, we’re looking for passionate people to be part of this exciting journey.
Job Description
Your daily tasks:
Building and maintaining data pipelines to collect and store data from various sources and in various formats.
Implementing data transformation and ingestion processes optimized for large-scale data.
Developing and maintaining Airflow DAGs for workflow scheduling and orchestration.
Designing and optimizing data storage solutions, including data warehouses and data lakes.
Implementing comprehensive data quality checks and validation processes.
Identifying and resolving potential data inconsistencies and production issues.
Developing custom solutions for data processing requirements.
Working closely with data analysts and game developers to understand data requirements.
Qualifications
At least 2 years of professional experience in data engineering.
Demonstrated experience building and maintaining production data pipelines.
Experience with both batch and streaming data processing.
Proficiency in Python with experience in data manipulation and data science packages (NumPy, pandas, SciPy, PyTorch, scikit-learn, etc.).
Hands-on experience with Apache Spark (PySpark preferred) for large-scale data processing.
Hands-on experience with cloud infrastructure (GCP and AWS preferred).
Strong experience building and maintaining Airflow DAGs and workflows.
Extensive experience with data warehousing solutions such as BigQuery (preferred) or Snowflake.
Strong experience integrating data from a variety of APIs.
Experience working with SQL and NoSQL databases.
Eagerness to continuously learn and improve across our tools, techniques, and industry practices.
Very good communication and collaboration skills.
Good command of English.
Passion for video games.
Nice-to-have technical skills:
Experience with data visualization tools (e.g. Power BI, Looker).
Basic understanding of data security and compliance requirements.
Experience with monitoring, alerting and logging tools.
Experience with MLOps tasks.
Knowledge of Apache tools for batch and stream processing (Flink, Kafka, Beam, Storm, etc.).
Experience with dbt/Dataform analytical pipelines.
Experience with Google Analytics.
Familiarity with Databricks and Delta Tables.
Additional Information
What we can offer:
- A wide array of benefits: private medical care, life insurance, pro-health campaigns, gifts for different occasions.
- An outstanding work atmosphere in a highly skilled team of professionals, with flexible working hours, no dress code, and full support from a dedicated HR Business Partner.
- Many opportunities for personal development: a dedicated development budget for each employee, two extra paid days for training and CSR, stable career paths, extensive internal and external training, and financing of English and Polish language classes.
- State-of-the-art offices filled with chillout zones, a fully equipped kitchen, a gym (Wrocław office), and a free car park (limited spaces at the Warsaw office).