At Uvik Software, we are looking for an experienced Data Analyst with strong Python skills and deep experience working with large-scale transactional datasets. You will be at the core of building reliable data workflows, uncovering actionable insights, and ensuring data quality across complex systems.
This is a long-term opportunity to contribute to data-driven decision-making in a modern, cloud-oriented environment. You will collaborate with engineers, analysts, and product teams to transform raw data into valuable business intelligence.
Responsibilities:
— Perform advanced data analysis on large structured transactional datasets to uncover insights, trends, and anomalies
— Build and manage data pipelines using Python and version control tools such as Git
— Conduct data profiling to assess data accuracy, completeness, and consistency
— Define and implement data validation rules and quality assurance processes to ensure data integrity
— Collaborate with cross-functional teams (data engineers, business analysts, product owners) to design data-driven workflows that support business objectives
— Translate business requirements into clear data models, metrics, and dashboards
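To give candidates a flavor of the profiling and validation work described above, here is a minimal sketch in plain Python (all field names, rules, and sample records are hypothetical illustrations, not part of the actual role or codebase):

```python
# Hypothetical transactional records; in practice these would come from a
# database or pipeline stage rather than being hard-coded.
transactions = [
    {"id": 1, "amount": 120.50, "currency": "PLN"},
    {"id": 2, "amount": -5.00, "currency": "PLN"},   # violates the amount rule
    {"id": 3, "amount": 99.99, "currency": None},    # incomplete record
]

def profile(rows, field):
    """Completeness metric: share of rows where `field` is present and non-null."""
    filled = sum(1 for r in rows if r.get(field) is not None)
    return filled / len(rows)

# Example validation rules: each maps a record to True (valid) or False.
rules = {
    "positive_amount": lambda r: r.get("amount") is not None and r["amount"] > 0,
    "known_currency": lambda r: r.get("currency") in {"PLN", "EUR", "USD"},
}

def validate(rows, rules):
    """Return the ids of records that fail each rule."""
    return {name: [r["id"] for r in rows if not rule(r)]
            for name, rule in rules.items()}

print(profile(transactions, "currency"))   # completeness of the currency field
print(validate(transactions, rules))       # which records break which rules
```

In real workflows the same pattern scales up via libraries such as pandas, with the rule definitions and results versioned in Git for reproducibility.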
Requirements:
— Minimum 4 years of hands-on experience as a Data Analyst working with large and complex datasets
— Strong proficiency in Python for data manipulation, cleaning, and automation
— Proficient in Git for versioning and managing analysis workflows
— Solid experience in data profiling, validation, and data quality assurance techniques
— Proven experience working with structured transactional data (e.g., financial, retail, user activity data)
— Deep understanding of data-driven workflows, pipelines, and lifecycle management
— Strong analytical thinking, attention to detail, and ability to independently solve data-related problems
— Excellent communication skills and ability to present findings to both technical and non-technical stakeholders
— Experience working in or currently residing in an EU country (EU citizenship or legal relocation required)
Nice to have:
— Experience with GCP, BigQuery, or other cloud data platforms for large-scale data querying and storage optimization
— Experience documenting analysis processes, data definitions, and validation logic for transparency and reproducibility
— Experience supporting stakeholders with ad hoc and recurring data reports, KPIs, and root cause investigations
— Strong understanding of data governance and compliance best practices
Job type: Full-time, Permanent
Pay: 12,500.00 zł per month
Schedule:
- 8-hour shift
- Weekends off
- Monday to Friday
Work location: Remote