Swedium Global is looking for Data Engineers for one of its customers.
Location: Poland
Key Responsibilities
- Design, build, and maintain scalable ETL/ELT pipelines and data workflows using Databricks.
- Implement and optimize data ingestion frameworks from multiple sources (structured, semi-structured, unstructured).
- Collaborate with Solution Architects and Data Scientists to prepare clean, reliable, and secure datasets for advanced analytics and ML workflows.
- Lead or participate in migration projects from on-prem platforms (Teradata, Oracle, Cloudera) to Databricks on Azure/AWS.
- Implement data quality, validation, and monitoring processes to ensure accuracy and reliability.
- Apply best practices in data modeling, partitioning, performance tuning, and cost optimization.
- Ensure data governance, lineage, and compliance with regulatory standards (GDPR, AML/KYC, Financial Crime data requirements).
- Collaborate in Agile teams and contribute to CI/CD pipelines for data engineering.
Qualifications & Skills
- Proven experience as a Data Engineer with expertise in Databricks, Spark, and Delta Lake.
- Strong programming skills in Python, PySpark, and SQL (Scala is a plus).
- Experience in ETL/ELT design, data modeling, and pipeline orchestration (Azure Data Factory, AWS Glue, or Airflow).
- Strong knowledge of Azure or AWS cloud platforms.
- Hands-on experience with Terraform or other Infrastructure-as-Code (IaC) tools for cloud resource provisioning (nice to have).
- Familiarity with data governance frameworks, security, and compliance in financial services or cybersecurity domains.
- Strong problem-solving skills, with the ability to tune performance and troubleshoot complex data issues.
Job Type: Full-time