Location
London Area, United Kingdom
Salary
Not specified
Type
Full-time
Posted
Today
Job Description
About the company
BB Energy has a legacy spanning more than six decades. Today, the Group provides expertise across energy trading, operations, logistics, storage and retail. Our diversified portfolio cuts across many segments of the energy industry from trading crude oil and petroleum products to natural gas and renewables. With global operations anchored out of five hub offices in London, Dubai, Singapore, Houston and Athens, BB Energy has a global network of nearly 400 employees in a dozen offices across five continents. Growth and diversification remain cornerstones of BB Energy's business strategy as it supports an energy ecosystem capable of delivering affordable, secure, and increasingly clean energy to the societies that need it most.
For our office in London, we are looking for a Data Engineer to join our team, focused on building, managing, and optimizing risk analytics workflows. You will play a key role in designing and evolving data pipelines, models, and tooling that support critical risk processes, while also contributing to our broader data platform leveraging Microsoft Fabric. Working closely with the platform team, you will help develop shared infrastructure, scalable ingestion frameworks, and high-quality data products.
This role sits at the intersection of data engineering, risk technology, and modern data platform practices within a global, always-on trading environment.
Why Join Us?
- We are a fast-growing business in one of the most dynamic markets.
- Our team includes some of the leading experts in this field.
- We have a strong entrepreneurial spirit.
- We are a well-established business with a strong heritage and a global track record of success.
- We have a flat management structure and an open management style: all voices are heard, and all voices count.
- We offer a competitive salary and benefits package.
Job Accountabilities
- Understand risk workflows end-to-end and translate them into reliable, production-grade data pipelines and products.
- Build and maintain batch and near-real-time data ingestion pipelines from diverse sources including relational databases, REST APIs, FTP/SFTP feeds, and cloud storage.
- Contribute to the data platform delivering harmonised, governed data products that serve multiple business functions.
- Collaborate with risk, analytics, and engineering teams to productionize and maintain risk models and scripts.
- Implement best practices for code quality, testing, and release management across the data platform.
- Build and support Power BI semantic models and Direct Lake datasets.
- Manage and maintain Azure DevOps pipelines for deployment, version control, and CI/CD of risk-related scripts and data workflows.
- Monitor system performance and troubleshoot issues related to data pipelines and deployments.
- Ensure proper data governance, security, and compliance standards are applied.
Required Skills & Experience
- Hands-on experience with Microsoft Fabric, Azure data services (e.g., Synapse Analytics, Data Factory), or Databricks for large-scale data processing.
- Proficiency in Python and SQL for data engineering and scripting.
- Familiarity with risk analytics environments or financial data.
- Strong experience with Apache Spark, including performance optimization and distributed data processing.
- Strong experience ingesting data from diverse sources including relational databases, REST APIs, FTP/SFTP file feeds, and cloud storage.
- Experience managing data pipelines and production workflows.
- Experience with Azure DevOps (CI/CD pipelines, repos, release management).
- Experience with version control (Git) and software development lifecycle practices.
Nice to Have
- Experience with streaming data technologies (e.g. Kafka, Azure Event Hubs).
- Exposure to metadata-driven framework design and config-driven pipeline development.
- Knowledge of non-relational databases (e.g. MongoDB, Cosmos DB).
- Familiarity with Data Mesh principles and domain-oriented data ownership.
- Experience with monitoring/logging tools in Azure.
What We're Looking For
- Strong problem-solving mindset and attention to detail.
- Ability to work across technical and business teams.
- Ownership of workflows and proactive approach to improvements.
- Interest in modern data platforms and evolving engineering practices.
- Comfortable working in a fast-moving, always-on trading environment where data quality issues have real business impact.