Work Experience

Senior Consultant Data Engineer at Aiimi

I design, develop, and optimize data systems that transform raw data into actionable intelligence, driving informed decision-making and business growth. By delivering high-quality data solutions, I help clients stay ahead in their industry.

  • I am a co-leader of the Data Community of Practice, facilitating an environment where any data professional feels empowered to share experiences, knowledge, and recent achievements.
  • I design, develop, and maintain scalable data products that integrate various data models and datasets, delivering actionable insights to drive business decisions.
  • I developed a comprehensive data validation framework, empowering teams to create robust and repeatable tests, ensuring data quality and confidence in their data pipelines.
  • I built an automated documentation tool that generates detailed documentation for complex data models, including markdown files and interactive mermaid diagrams for visualizing lineage, impact analysis, and entity relationships (a minimal sketch of the diagram generation follows this list).
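To illustrate the lineage side of the documentation tool, here is a minimal sketch of generating a Mermaid flowchart from model metadata. The `models` mapping, function name, and output file are illustrative assumptions, not the tool's actual schema or API.

```python
# Minimal sketch: render a Mermaid lineage diagram from model metadata.
# The `models` mapping and output path are illustrative, not the real tool's schema.
from pathlib import Path

models = {
    "stg_orders": [],
    "stg_customers": [],
    "fct_sales": ["stg_orders", "stg_customers"],
}

def build_lineage_diagram(models: dict[str, list[str]]) -> str:
    """Render a Mermaid flowchart with edges from upstream to downstream models."""
    lines = ["graph LR"]
    for model, upstreams in models.items():
        for upstream in upstreams:
            lines.append(f"    {upstream} --> {model}")
    return "\n".join(lines)

# Write the diagram source so it can be embedded in the generated markdown docs.
Path("fct_sales_lineage.mmd").write_text(build_lineage_diagram(models))
```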

Data Engineer at The Access Group

As the primary data engineer for finance in the healthcare sector, I played a critical role in driving financial reporting, forecasting, and invoice generation. Additionally, I led complex patient data migrations to support onboarding new clients and maintained the integrity of our data warehouse reporting layer.

  • I developed and maintained a comprehensive financial forecasting tool that Integrated Care Boards (ICBs) used to report to NHS England.
  • I led patient data migration efforts for new clients, guaranteeing accurate and timely importation of vital patient information into our systems.
  • I created a secure data deletion solution that ensured GDPR compliance by removing client data from our multi-client database. The application logged all deletions and operated within specified timeframes to balance data erasure with system performance (a minimal sketch of this batched, logged approach follows this list).
  • I enhanced the existing invoice generation system, significantly reducing error rates and incorporating advanced debugging capabilities for faster troubleshooting.
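A minimal sketch of the batched, logged deletion idea described above; the table, column names, and batch size are illustrative, and SQLite stands in for the real multi-client database purely to keep the example self-contained.

```python
# Illustrative sketch of a logged, client-scoped deletion pass (hypothetical schema).
import logging
import sqlite3
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("gdpr_deletion")

def delete_client_data(conn: sqlite3.Connection, client_id: str, batch_size: int = 500) -> None:
    """Delete one client's rows in small batches so the purge never starves other workloads."""
    while True:
        cur = conn.execute(
            "DELETE FROM patient_records WHERE rowid IN ("
            "  SELECT rowid FROM patient_records WHERE client_id = ? LIMIT ?)",
            (client_id, batch_size),
        )
        conn.commit()
        # Every pass is logged so the erasure is auditable after the fact.
        log.info("%s: removed %d rows for client %s",
                 datetime.now(timezone.utc), cur.rowcount, client_id)
        if cur.rowcount < batch_size:
            break
```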

Service Desk Analyst at CDW

Served as the primary point of contact for technical support issues across a diverse portfolio of clients. Managed and facilitated access to secure Data Centre facilities, ensuring efficient resolution of technical queries.

  • Liaised with second- and third-line teams, on-site engineers, and end-users to provide timely and effective support, ensuring seamless resolution of technical issues and maintaining strong relationships with key stakeholders.
  • Consistently met or exceeded Service Level Agreement (SLA) targets, demonstrating a commitment to delivering exceptional customer service while minimizing downtime and maximizing productivity for our clients.
  • Bridged the gap between technical and non-technical audiences, translating complex technical concepts into clear, concise explanations that end-users could understand in a rapidly changing technical environment.

Projects

Met Office NIMROD Data Processing

This application is a data processing pipeline for UK Met Office Rain Radar NIMROD data. It automates the workflow of extracting compressed rainfall radar images, converting them into useful raster formats (ASCII), and generating consolidated timeseries data suitable for hydrological modeling (specifically formatted for Infoworks ICM).

  • Concurrent Processing: the application is explicitly designed for Python 3.14t (free-threaded) to leverage multi-threading, ensuring efficient parallel processing of large datasets.
  • End-to-End Orchestration: a central main.py script manages the entire lifecycle, automatically uncompressing .gz.tar archives, converting .dat binaries to .asc rasters, and compiling final CSV reports without manual intervention.
  • Resource Management: a configurable batch processing system (processing a set number of TAR files at a time) handles large volumes of data without overwhelming system resources (a minimal sketch of the batched extraction stage follows this list).
  • User Experience Features: the pipeline includes startup safety checks to prevent accidental partial overwrites of existing data and provides a real-time dynamic ETA to track processing progress.
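As a rough illustration of the batching and threading described above, the sketch below processes a fixed number of archives per batch with a thread pool. The directory names, batch size, and extraction-only scope are assumptions for the example; the real pipeline also performs the .dat-to-.asc conversion and CSV compilation.

```python
# Minimal sketch of a batched, multi-threaded extraction stage (paths and batch
# size are illustrative; the real pipeline is driven by main.py and its config).
import tarfile
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path

RAW_DIR = Path("data/raw")          # incoming archives of gzipped NIMROD frames
EXTRACT_DIR = Path("data/extracted")
BATCH_SIZE = 4                      # archives processed per batch to cap memory use

def extract_archive(archive: Path) -> None:
    """Unpack one NIMROD archive into its own subfolder."""
    target = EXTRACT_DIR / archive.stem
    target.mkdir(parents=True, exist_ok=True)
    with tarfile.open(archive) as tar:
        tar.extractall(target)

def run_in_batches(archives: list[Path]) -> None:
    """Process a fixed number of archives at a time; threads do the per-archive work."""
    for start in range(0, len(archives), BATCH_SIZE):
        batch = archives[start:start + BATCH_SIZE]
        with ThreadPoolExecutor(max_workers=len(batch)) as pool:
            list(pool.map(extract_archive, batch))

if __name__ == "__main__":
    run_in_batches(sorted(RAW_DIR.glob("*.tar")))
```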

Devin

My own custom "Jarvis" from Iron Man: it can look for folders on my PC, tell me the weather for any location, and control my Philips Hue lights, with plans to add more automations over time.

  • An at-home AI agent, powered entirely by local models, with the option to use online LLMs
  • Understands the Model Context Protocol (MCP), building and using its own tools (a minimal sketch follows this list)
  • A constantly growing tool base that becomes more helpful over time
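To show how one of these tools might be exposed to the agent, here is a minimal sketch assuming the official MCP Python SDK's FastMCP interface; the server name, tool name, and search logic are illustrative, not Devin's actual implementation.

```python
# Illustrative MCP tool server, assuming the MCP Python SDK's FastMCP interface.
from pathlib import Path

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("devin-tools")

@mcp.tool()
def find_folders(name: str, root: str = "~") -> list[str]:
    """Return paths of folders under `root` whose name contains `name`."""
    base = Path(root).expanduser()
    return [str(p) for p in base.rglob("*") if p.is_dir() and name.lower() in p.name.lower()]

if __name__ == "__main__":
    mcp.run()  # serves the tool so a local or online LLM agent can call it
```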

Verity, Financial Planner & Forecaster

Verity is a local-first, privacy-focused financial planning and budgeting application designed to help you manage your money without relying on external cloud services. It runs completely locally on your machine, ensuring that your sensitive financial data remains private and under your control.

  • Local-First Privacy: Verity stores all data in a local SQLite database rather than the cloud. This design choice prioritizes security and privacy, ensuring you are the only one with access to your financial details.
  • Comprehensive Web Interface: it provides a browser-based dashboard for managing your finances, including creating budgets with nested categories (up to 3 tiers, sketched below), tracking expenses, and managing multiple types of accounts (Current, Cash, Savings, Credit).
  • Modern Tech Stack: the application is built with Python using Flask for the web server, Rust-based tooling (like uv and ruff) for performance and code quality, and a modular architecture that cleanly separates the frontend from the backend logic.
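Here is a minimal sketch of how the three-tier nested categories could be modelled in SQLite: a self-referencing table plus a depth check on insert. The table name, columns, and example categories are illustrative assumptions, not Verity's actual schema.

```python
# Illustrative nested-category model: self-referencing table capped at three tiers.
import sqlite3

conn = sqlite3.connect("verity.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS categories (
    id        INTEGER PRIMARY KEY,
    name      TEXT NOT NULL,
    parent_id INTEGER REFERENCES categories(id)
);
""")

def add_category(name: str, parent_id: int | None = None) -> int:
    """Insert a category, refusing anything deeper than three tiers."""
    depth, current = 1, parent_id
    while current is not None:
        depth += 1
        current = conn.execute(
            "SELECT parent_id FROM categories WHERE id = ?", (current,)
        ).fetchone()[0]
    if depth > 3:
        raise ValueError("categories are limited to three tiers")
    cur = conn.execute(
        "INSERT INTO categories (name, parent_id) VALUES (?, ?)", (name, parent_id)
    )
    conn.commit()
    return cur.lastrowid

bills = add_category("Bills")
utilities = add_category("Utilities", parent_id=bills)
add_category("Electricity", parent_id=utilities)
```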

Data Pipeline for YNAB

I designed and developed a robust data ingestion application, adhering to industry best practices in ETL and Medallion architecture, with a focus on seamless scalability, reliability, and maintainability. The application features comprehensive error handling, logging, unit testing, and proper exit codes, ensuring fast and reliable debugging capabilities, all while leveraging optimal tools such as Polars, YAML, and Parquet files.

  • Improved data handling: the application detects and resolves duplicated and missing data, reducing unnecessary data transfer and improving load times through its delta-style approach (a minimal sketch follows this list).
  • Accelerated processing: Rust-built Polars enables rapid data processing, while Parquet storage minimizes file sizes for efficient data transfer.
  • Enhanced transparency: integrated documentation empowers users to quickly understand and debug the application's inner workings, while self-documenting code ensures ease of maintenance and updates.
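A minimal sketch of the delta-style load described above, using a Polars anti-join to append only unseen rows to the curated Parquet layer. The file paths, the `id` key, and the bronze/silver naming are illustrative assumptions, not the application's exact layout.

```python
# Illustrative delta-style load: append only rows not already in the curated layer.
from pathlib import Path

import polars as pl

BRONZE = Path("bronze/transactions.parquet")   # latest raw extract from the YNAB API
SILVER = Path("silver/transactions.parquet")   # cleaned, de-duplicated history

new_rows = pl.read_parquet(BRONZE).unique(subset="id")

if SILVER.exists():
    existing = pl.read_parquet(SILVER)
    delta = new_rows.join(existing, on="id", how="anti")  # keep only unseen ids
    merged = pl.concat([existing, delta])
    print(f"appended {delta.height} new rows")
else:
    merged = new_rows
    print(f"initial load of {merged.height} rows")

SILVER.parent.mkdir(parents=True, exist_ok=True)
merged.write_parquet(SILVER)
```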