Over the past several years I have worked on a multitude of projects, using a wide range of technologies and reacting quickly to changing agile requirements. The languages and technologies used were chosen on a project-by-project basis: I would recommend the most suitable technologies based on the requirements of the system, then design, implement and deploy the solution.
I obtained a First Class degree in Computer Studies from Liverpool John Moores University. My dissertation focused on programme management software and its use within large corporations; I produced a piece of software that generated Gantt charts for the user and could predict the time to complete a project based on the productivity of the team. Modules studied at university included Web Development, Database Systems, Mainframe Computing and Object Oriented Programming.
When: 2021
Contribution: Redesign and implementation of the entire client and server setup.
Technologies: Node.js, Redis
Overview: Redesigned the existing client-side implementation to be server-side. The reason for switching to a server-side approach was to reduce authentication errors, as only one token could be issued at a time. By handling authentication server-side, I ensured that client users shared the same token, allowing renewals to be managed at a single point.
Key Features & Contributions:
In my role, I led the transition of an existing client-side OAuth authentication system to a server-side implementation. This shift was essential to address recurring authentication errors caused by the limitation of issuing only one token at a time. By centralizing token management on the server, I ensured that multiple instances of a client user could share the same token, with renewals handled at a single, reliable point. Previously, multiple browser tabs would not work for a single user because each tab attempted to initiate OAuth independently, resulting in different tokens being issued.
A key challenge in this project was resolving race conditions that arose during token renewal. Multiple instances attempting to refresh a token simultaneously led to conflicts, breaking authentication for users with concurrent sessions. To mitigate this, I designed a solution leveraging Redis Semaphores and Mutex locks, ensuring that the renewal process was both orderly and resistant to simultaneous requests from a single user.
A Mutex lock was created when the first request was made for a user, and it was released once the token had been returned. Any subsequent requests made while the token was being retrieved would check the lock and return the same token once obtained. When a new tab was opened, it would retrieve the active, valid token instead of requesting a new one. Token renewals followed an identical process—where the first request would acquire the lock, and all subsequent requests would wait for the first request to complete.
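A minimal sketch of this single-flight renewal pattern, assuming a Node.js service using ioredis, is shown below; the key names and the fetchTokenFromProvider helper are illustrative rather than the production implementation.

```typescript
// Minimal sketch of a single-flight token renewal with a Redis lock (ioredis).
// TOKEN_KEY, LOCK_KEY and fetchTokenFromProvider are illustrative names.
import Redis from "ioredis";

const redis = new Redis();
const TOKEN_KEY = "oauth:token";
const LOCK_KEY = "oauth:token:lock";

async function getToken(): Promise<string> {
  // If a valid shared token is already cached, every caller reuses it.
  const cached = await redis.get(TOKEN_KEY);
  if (cached) return cached;

  // Try to become the single renewer: SET NX with a TTL acts as the mutex.
  const acquired = await redis.set(LOCK_KEY, "1", "EX", 30, "NX");
  if (acquired) {
    try {
      const token = await fetchTokenFromProvider(); // hypothetical OAuth request
      await redis.set(TOKEN_KEY, token, "EX", 3600); // share it with all instances
      return token;
    } finally {
      await redis.del(LOCK_KEY); // release the lock once the token is stored
    }
  }

  // Another request holds the lock: wait for the shared token to appear.
  for (let attempt = 0; attempt < 50; attempt++) {
    await new Promise((resolve) => setTimeout(resolve, 100));
    const token = await redis.get(TOKEN_KEY);
    if (token) return token;
  }
  throw new Error("Timed out waiting for token renewal");
}

async function fetchTokenFromProvider(): Promise<string> {
  // Placeholder for the real OAuth client-credentials request.
  return "example-token";
}
```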
Throughout this implementation, I followed a Test-Driven Development (TDD) approach, rigorously validating each aspect of the system to ensure reliability in production. As a result, the new authentication flow has eliminated customer issues related to token renewal and significantly improved system stability.
When: 2019 - 2020
Contribution: Worked with a small team on the design and development of the entire system; helped junior members of the team implement features.
Technologies: Node.js (Signalling), WebRTC, Rust (General backend)
Overview: Developed a peer-to-peer video conferencing solution using WebRTC, enabling real-time communication with minimal latency. Implemented STUN/TURN servers for NAT traversal, a WebSocket-based signaling server for session initiation, and secure media transmission using DTLS-SRTP, making it a core component for multiple bespoke video conferencing applications.
Key Features & Contributions:
This project focused on implementing a peer-to-peer (P2P) video conferencing solution using WebRTC, enabling real-time, low-latency communication without relying on centralised servers. WebRTC can connect multiple clients directly for media streaming, but establishing direct connections across various network conditions required careful handling of NAT traversal and firewall restrictions.
To facilitate seamless connectivity, I deployed both STUN (Session Traversal Utilities for NAT) and TURN (Traversal Using Relays around NAT) servers. The STUN server was used to help clients discover their public IP addresses and determine the type of NAT they were behind, allowing for direct peer-to-peer connections in most cases. However, in scenarios where direct connectivity was not possible—such as symmetric NAT or restrictive firewalls—the TURN server acted as a relay to ensure reliable media transmission.
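As a rough illustration of this set-up, the browser-side configuration looked broadly like the following sketch; the server addresses and credentials shown are placeholders, not the production infrastructure.

```typescript
// Illustrative RTCPeerConnection set-up with STUN discovery and TURN fallback;
// the URLs and credentials below are placeholders, not real infrastructure.
async function createPeerConnection(): Promise<RTCPeerConnection> {
  const peerConnection = new RTCPeerConnection({
    iceServers: [
      // STUN: lets the client discover its public address for direct P2P links.
      { urls: "stun:stun.example.com:3478" },
      // TURN: relays media when a direct connection cannot be established.
      {
        urls: "turn:turn.example.com:3478",
        username: "example-user",
        credential: "example-credential",
      },
    ],
  });

  // Attach the local camera and microphone tracks so they can be offered to the peer.
  const localStream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
  localStream.getTracks().forEach((track) => peerConnection.addTrack(track, localStream));

  return peerConnection;
}
```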
Additionally, a signaling server was implemented using WebSockets to handle session initiation, exchange SDP (Session Description Protocol) offers and answers, and transmit ICE (Interactive Connectivity Establishment) candidates between peers. While WebRTC itself does not define a signaling mechanism, this server was essential for bootstrapping peer connections before they could establish direct media streams.
In addition to implementing the core WebRTC infrastructure, we developed a Rust-based API to handle various non-signalling connectivity aspects, including authentication, payments, and donation processing. While WebRTC efficiently manages peer-to-peer media transmission, many conferencing applications require additional services beyond signalling—such as user management, session persistence, and monetisation features. Our API provided a secure and high-performance backend to support these needs.
Rust was chosen for its speed, memory safety, and concurrency capabilities, making it ideal for handling real-time transactions and network operations. The API exposed endpoints for processing payments and donations, ensuring seamless integration with the video conferencing platform. We utilised Stripe for payment processing, allowing users to purchase premium features, contribute to creators, or make one-time donations during a call. Webhooks and event-driven architecture enabled real-time updates, ensuring transactions were reflected instantly within the application.
Beyond payments, the API also managed user authentication, access control, and session validation. JWT-based authentication ensured secure access to restricted features, while rate limiting and request validation mechanisms protected against abuse. The API played a crucial role in enhancing the overall conferencing experience, allowing developers to integrate monetisation and security features without interfering with the core peer-to-peer communication layer. By keeping these services separate from WebRTC signalling, we maintained a clean architecture, improving maintainability and scalability across multiple projects that leveraged this conferencing solution.
When: 2019
Contribution: Design and development of entire system
Technologies: Node.js, AWS S3, PGP, HSM
Overview: I developed a highly secure document storage solution that leveraged PGP encryption and AWS S3, ensuring that sensitive files were never transmitted or stored in an unencrypted state.
Key Features & Contributions:
End-to-End Encryption: Implemented a system where files were encrypted before leaving the user's browser using PGP (Pretty Good Privacy). This meant that even if intercepted during transmission, the files remained unreadable to unauthorised parties.
Secure Storage with AWS S3: Utilised signed URLs to facilitate direct uploads to Amazon S3, preventing unnecessary exposure of unencrypted data to backend systems.
Hardware Security Module (HSM) Integration: Allowed encryption keys to be securely managed using HSMs, ensuring the highest level of protection for private keys while maintaining ease of access for authorised users.
Zero Trust Data Protection: Even if the S3 storage was compromised, the encrypted data remained useless to an attacker, as decryption could only occur with the user's private key.
This system was designed with security, scalability, and efficiency in mind, allowing for seamless document encryption and retrieval while maintaining compliance with strict data protection standards.
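The core client-side flow can be sketched as follows, assuming OpenPGP.js for the browser encryption; the endpoint name and key handling shown here are simplified assumptions rather than the production code.

```typescript
// Simplified sketch: encrypt in the browser with OpenPGP.js, then PUT the
// ciphertext to a pre-signed S3 URL. The /api/documents/sign-upload endpoint
// and the key handling are assumptions for illustration only.
import * as openpgp from "openpgp";

async function uploadDocument(file: File, armoredPublicKey: string): Promise<void> {
  const publicKey = await openpgp.readKey({ armoredKey: armoredPublicKey });

  // Encrypt the file client-side so plaintext never leaves the browser.
  const ciphertext = await openpgp.encrypt({
    message: await openpgp.createMessage({ binary: new Uint8Array(await file.arrayBuffer()) }),
    encryptionKeys: publicKey,
    format: "binary",
  });

  // Ask the backend for a pre-signed S3 PUT URL (hypothetical endpoint).
  const { uploadUrl } = await fetch("/api/documents/sign-upload", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ fileName: file.name }),
  }).then((response) => response.json());

  // Upload the ciphertext directly to S3; the backend never sees the plaintext.
  await fetch(uploadUrl, { method: "PUT", body: ciphertext as Uint8Array });
}
```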
When: 2015 – 2016
Contribution: Creation of fundamental systems within the API, such as transformations and API integrations into external systems.
Technologies: PHP, MySQL, and RabbitMQ.
Requirements: This API sat between existing APIs and products, and acted as an external endpoint for systems such as emails and websites to provide booking details.
This API was set up to replace existing systems and legacy API endpoints, and to migrate sections of the existing site into a system that was easier to maintain and develop on. The whole product was developed using TDD to ensure that changes were suitable and regressions were not introduced.
One of the areas I worked on was translating an existing API endpoint into a format that could be used by another system which required the data in a different shape. I created a middleware that interpreted the request data and translated it into a format that Hyperon could use; the normal endpoint would then be called from this middleware and return the data. The middleware could also translate the returned data if required. The translations were performed based on the customer calling the endpoint, and the translation mappings could inherit from each other and be extended.
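As a rough, language-agnostic sketch of the mapping idea (written in TypeScript rather than the original PHP, with purely illustrative field names):

```typescript
// Language-agnostic sketch of per-customer translation mappings (TypeScript
// rather than the original PHP); all field names here are illustrative.
type Mapping = Record<string, string>; // source field -> target field

const baseMapping: Mapping = { bookingRef: "reference", startDate: "date_from" };

// A customer-specific mapping inherits the base mapping and overrides or extends it.
const customerMapping: Mapping = { ...baseMapping, startDate: "departure_date" };

function translate(payload: Record<string, unknown>, mapping: Mapping): Record<string, unknown> {
  const translated: Record<string, unknown> = {};
  for (const [source, target] of Object.entries(mapping)) {
    if (source in payload) translated[target] = payload[source];
  }
  return translated;
}
```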
When: 2012 – 2013
Contribution: Integration and adaptation of an existing code base to change its use and adapt it to customer requirements
Technologies: Java, JTwitter, Flat File DB, GATE
Requirements: Collect information from social media based on location of tweet and the information contained within the post.
While working with Clinical Justice, one of the side projects was to develop a system that could collect information from social media sources; this information could then be analysed with a system such as GATE to semantically analyse the content of people's posts or tweets. The project created for these requirements was written in Java. Twitter provides an API which allows users to collect data based on search parameters, with the matching information relayed through the 'fire-hose'; the Java system would listen to the fire-hose and collect the information if further search criteria were matched. The library used was based on JTwitter, which provides a Java implementation for accessing the fire-hose. The system was configured to listen for tweets coming from the UK, as that is where Clinical Justice operated, and search terms such as 'medical negligence' and 'justice' were used, which allowed the data to be filtered down by the fire-hose relatively well. Each post was linked to a user so that re-engagement with the customer could occur at a later date.
Once the data had been collected, GATE was used to perform semantic analysis on the collected posts so that the true meaning of each post could be deduced automatically; this filtered list could then be used to re-engage with the people who had posted.
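The filtering step itself was simple in principle; a simplified sketch (in TypeScript rather than the original Java, with an illustrative tweet shape) is below.

```typescript
// Simplified sketch of the filtering step (TypeScript rather than the original
// Java); the Tweet shape and the term list are illustrative only.
interface Tweet {
  text: string;
  countryCode?: string;
  userId: string;
}

const SEARCH_TERMS = ["medical negligence", "justice"];

function matchesCriteria(tweet: Tweet): boolean {
  const fromUk = tweet.countryCode === "GB";
  const text = tweet.text.toLowerCase();
  const mentionsTerm = SEARCH_TERMS.some((term) => text.includes(term));
  return fromUk && mentionsTerm;
}
```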
When: 2013-2014
Contribution: Holiday System, Manager and Auditor access. Warehousing Management system, Linux system administration.
Technologies: HTML5, CSS3, JavaScript, ExtJS 4.2, PHP, Symfony2, Doctrine, MySQL, Extensible, Elasticsearch, Continuous Integration with TeamCity, RabbitMQ.
The Ingot Portal is a system that has been created and developed by Warrant Group over a long period of time, and there have been several iterations of the product. These iterations have ranged from a standard HTML, CSS and PHP deployment to the latest version, which uses Symfony, ExtJS and Doctrine to create a structured system implementing an MVC architecture that logically separates the elements of the system. The system is used internally by Warrant Group staff for management, and also by customers, who are informed of status updates when their shipment reaches its destination or an announcement is made against it. The customers that Warrant Group deals with previously had no way to automate the process of obtaining information about where their items were; the Ingot Portal enables the management and distribution of this information across a wide user base effortlessly.
The part of the system that received most of my contributions is the Holiday System, which employees log into to book, view and manage their holidays. Several of the manager views were built in ExtJS and make use of context menus to provide the richer user interaction that is now generally expected of web pages. ExtJS is considered one of the harder JavaScript frameworks to pick up because of its steep learning curve; however, the documentation is excellent, and this made the creation of these views very intuitive.
Another part of the system that saw significant development is the conflicting-holidays view, which is data-bound to the grid used to accept and deny holidays. Once a user's holiday request is selected, the grid refreshes to the date the request was made for; the manager can then view any holidays which conflict with the request and use their discretion to accept or reject it. Extensible, a calendar plugin for ExtJS, was used to provide the calendar functionality.
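A very rough sketch of such a grid in ExtJS 4 syntax is shown below; the store fields, URL and column names are illustrative assumptions, not the actual implementation.

```typescript
// Very rough sketch of the conflicting-holidays grid in ExtJS 4 syntax; the
// store fields, URL and column names are illustrative assumptions.
declare const Ext: any;

const conflictStore = Ext.create("Ext.data.Store", {
  fields: ["employee", "startDate", "endDate"],
  proxy: { type: "ajax", url: "/holidays/conflicts", reader: { type: "json", root: "rows" } },
});

Ext.create("Ext.grid.Panel", {
  title: "Conflicting holidays",
  store: conflictStore,
  columns: [
    { text: "Employee", dataIndex: "employee", flex: 1 },
    { text: "From", dataIndex: "startDate" },
    { text: "To", dataIndex: "endDate" },
  ],
  listeners: {
    // When a holiday request is selected, reload the grid around the requested date.
    selectionchange: (_selModel: unknown, selected: any[]) => {
      if (selected.length) {
        conflictStore.load({ params: { date: selected[0].get("startDate") } });
      }
    },
  },
  renderTo: Ext.getBody(),
});
```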
When: 2012 – 2013
Contribution: PHP security (injection and XSS prevention), jQuery component development, web design and validation
Technologies: MySQL, PHP, HTML, CSS, JavaScript, jQuery
Clinical Justice are a Liverpool-based company for whom a new website was developed to replace the old WordPress site; for versatility, a non-CMS approach was adopted. The new site was hand-coded in HTML following the W3C's recommendations and best practices, so that it would be viewable and accessible by everyone.
jMenu was used for the navigation on the website. There was a large array of pages which needed to be included, so a multi-tiered structure was required, allowing the menus to drop down through multiple levels to reach the required information. The default styling that came with jMenu was quite abrasive, so the .js and .css files supplied by the library were edited to follow a more subtle style appropriate to the requirements of the website.
Prepared statements were used within PHP to stop any SQL injection from being able to output data from the database. PHP prepared statements require the use of mysqli or PDO; mysqli was used in this case. The parameters are bound to the values before the statement is submitted, which prevents the statement from being altered by injection attacks.
A CAPTCHA was used to hinder bots from submitting information; the SecurImage plugin was used to achieve the required functionality.
There was also a requirement to use cURL to submit the POST data to multiple locations: a third-party service that recorded statistics about which pages the user viewed before submitting a form, and a MySQL database where a copy of the information was stored as a record.
When: 2013
Contribution: Frontend responsive GUI with Masonry grid
Technologies: CSS3, jQuery, Masonry and HTML
Requirements: Responsive recipe shopping system; users need to easily add a recipe to their basket from the recipe browser
Link: Chicory
On this project I worked closely with the development team to link the server-side code to the responsive interface. I worked with the owners of the company to understand and develop their requirements and goals for the website, and to convert those requirements into the finished application. The idea was to create a Pinterest-style board with multiple columns of recipe pictures, along with the ability to add a recipe to the cart without viewing the item.
CSS3 provided much of the site's presentation, such as text-overflow to hide any overflowing text, as the recipes were aggregated automatically from outside sources. The responsive recipe layout was implemented with the Masonry JavaScript library, a cascading grid layout which could add and remove columns based on the width of the page.
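A rough sketch of the Masonry set-up is below, assuming the library is loaded globally as it typically was via a script tag; the selectors are illustrative.

```typescript
// Rough sketch of the Masonry set-up, assuming the library is loaded globally
// via a script tag; the selectors here are illustrative.
declare const Masonry: any;

const grid = new Masonry(".recipe-grid", {
  itemSelector: ".recipe-card",
  columnWidth: ".grid-sizer", // a sizer element lets CSS control the column width
  percentPosition: true,      // columns are added or removed as the page resizes
});

// Re-run the layout once recipe images have finished loading.
window.addEventListener("load", () => grid.layout());
```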