PySpark Developer
1 week ago
We are looking for an experienced PySpark Developer with strong Microsoft Fabric and Azure engineering skills to join a major transformation programme within the financial-markets domain. This role is fully hands-on, focused on building and optimising large-scale data pipelines, dataflows, semantic models, and lakehouse components.

Key Responsibilities
- Design, build and optimise Spark-based data pipelines for batch and streaming workloads
- Develop Fabric dataflows, pipelines, and semantic models
- Implement complex transformations, joins, aggregations and performance tuning
- Build and optimise Delta Lake / Delta tables
- Develop secure data solutions including role-based access, data masking and compliance controls
- Implement data validation, cleansing, profiling and documentation
- Work closely with analysts and stakeholders to translate requirements into scalable technical solutions
- Troubleshoot and improve reliability, latency and workload performance

Essential Skills
- Strong hands-on experience with PySpark, Spark SQL, Spark Streaming, DataFrames
- Microsoft Fabric (Fabric Spark jobs, dataflows, pipelines, semantic models)
- Azure: ADLS, cloud data engineering, notebooks
- Python programming; Java exposure beneficial
- Delta Lake / Delta table optimisation experience
- Git / GitLab, CI/CD pipelines, DevOps practices
- Strong troubleshooting and problem-solving ability
- Experience with lakehouse architectures, ETL workflows, and distributed computing
- Familiarity with time-series, market data, transactional data or risk metrics

Nice to Have
- Power BI dataset preparation
- OneLake, Azure Data Lake, Kubernetes, Docker
- Knowledge of financial regulations (GDPR, SOX)

Details
- Location: London (office-based)
- Type: Contract
- Duration: 6 months
- Start: ASAP
- Rate: Market rates

If you are a PySpark / Fabric / Azure Data Engineer looking for a high-impact contract role, apply now for immediate consideration.
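The responsibilities above include data masking for compliance. As a minimal illustrative sketch only (the ad does not specify an implementation), the rule below is shown in plain Python; in PySpark the same transformation would typically be applied to a DataFrame column via `regexp_replace` or a UDF. The function name and the sample identifier are hypothetical.

```python
def mask_account(account_id: str, visible: int = 4) -> str:
    """Mask all but the last `visible` characters of an identifier.

    Illustrative only: a real pipeline would apply this per-column in
    PySpark rather than on single strings.
    """
    if len(account_id) <= visible:
        # Too short to partially reveal: mask everything.
        return "*" * len(account_id)
    return "*" * (len(account_id) - visible) + account_id[-visible:]


# Example: mask a (made-up) IBAN, keeping the last four characters.
print(mask_account("GB29NWBK60161331926819"))  # ******************6819
```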
-
PySpark Developer
2 days ago
London, United Kingdom DCV Technologies Full time
We are looking for an experienced PySpark Developer with strong Microsoft Fabric and Azure engineering skills to join a major transformation programme within the financial-markets domain. This role is fully hands-on, focused on building and optimising large-scale data pipelines, dataflows, semantic models, and lakehouse components. Key Responsibilities *...
-
PySpark Developer (PySpark/Python, SQL, AWS)
4 days ago
London, United Kingdom Modis Full time
Role: PySpark/Python, SQL, AWS
Location: London (Hybrid)
Contract: 12 Months
Must need: PySpark/Python, SQL, AWS. Good to have: SAS and Pandas.
- PySpark: modules are built on PySpark.
- SQL: to manipulate the data and provide accurate results.
- SAS: to convert SAS code into PySpark, understanding...
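The Modis listing above pairs SAS-to-PySpark conversion with Pandas as a nice-to-have. As a hedged sketch of what such a conversion typically looks like (all data and column names here are invented), a SAS `PROC MEANS` summary maps onto a groupby aggregation; the same shape carries over to PySpark as `df.groupBy("desk").agg(F.mean("notional"), F.sum("notional"))`:

```python
import pandas as pd

# Hypothetical trade data; columns are illustrative only.
trades = pd.DataFrame({
    "desk":     ["rates", "rates", "fx", "fx"],
    "notional": [100.0, 300.0, 50.0, 150.0],
})

# SAS equivalent:
#   proc means data=trades mean sum;
#     class desk; var notional;
#   run;
summary = trades.groupby("desk")["notional"].agg(["mean", "sum"]).reset_index()
print(summary)
```

The pandas version is shown because it is the usual intermediate step when porting SAS logic before rewriting against the Spark DataFrame API.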
-
Developer (PySpark + Fabric)
3 weeks ago
London, United Kingdom Stackstudio Digital Full time
Job Description
Role / Job Title: Developer (PySpark + Fabric)
Work Location: London (Office Based)
The Role
The role will be integral to realizing the customer's vision and strategy in transforming some of their critical application and data engineering components. As a global financial markets infrastructure and data provider, the customer leverages cutting-edge...
-
PySpark Developer
1 week ago
London, United Kingdom eFinancial Careers Full time
PySpark Developer - London - Python Spark - £100k-£140k
PySpark Developer required by my leading Financial Services client, who are building a risk analytics platform in Azure: 100% cloud native with zero on-prem services. This is a permanent opportunity offering hybrid working with 3 days in the London office and 2 days working from home. It is...
-
Data Engineer
3 weeks ago
London, United Kingdom Gazelle Global Full time
Senior Data Engineer (Developer) – PySpark
We are supporting a leading global financial markets infrastructure and data provider as they modernise and scale their core data engineering capabilities. This role sits at the centre of their transformation programme, delivering high-quality data pipelines, models, and platforms that underpin critical services...
-
Python Data Engineer Azure
6 days ago
London, United Kingdom Brightbox GRP Full time
Python Data Engineer (Azure & PySpark) - SC Cleared
Contract: £400-£458pd (Inside IR35)
SC Clearance is Essential
Summary
We're looking for a Python Data Engineer skilled in PySpark, Delta Lake, Azure services, containerized development, and Behave-based testing. You'll design and build scalable data pipelines and maintain high-quality, test-driven code in a cloud...
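The Brightbox listing above asks for test-driven pipeline code exercised through Behave. Behave itself wraps steps like this in Gherkin feature files; as a self-contained plain-Python sketch (the function and data are hypothetical, not from the ad), this is the kind of small, assertable cleansing step such a test suite would cover:

```python
def drop_null_rows(rows):
    """Remove records with any missing (None) field.

    Illustrative only: in the actual role this logic would run over a
    PySpark DataFrame (e.g. df.dropna()), not Python dicts.
    """
    return [r for r in rows if all(v is not None for v in r.values())]


# A Behave scenario would feed rows like these in a Given step and
# assert on the cleaned output in a Then step.
raw = [{"id": 1, "price": 9.5}, {"id": 2, "price": None}]
clean = drop_null_rows(raw)
print(clean)  # [{'id': 1, 'price': 9.5}]
```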