I'm a Fullstack Engineer at Behaviour Lab, where I lead the front-end team. At Behaviour Lab, we use behavioural science and advanced analytics to transform how the £70 trillion active asset management industry makes its investments. My focus is data-driven information visualisation and user experience, drawing on creative coding, math, and engineering.
I studied Electrical & Electronic Engineering at Imperial College London and graduated on the Dean's List (top 10% academically) with First Class Honours. I was also the recipient of the Dennis Gabor Prize for best overall contribution to the department.
During my time there, I built wireless autonomous supermarket checkout systems, simulations of self-organising multi-agent systems, and probabilistic inference modules to detect COVID from patient blood pathologies for the NHS.
For my dissertation, I built a visual search engine for exploring 100 million research publications through a novel search representation inspired by bibliometric mapping, using D3.js, React, Express, and Elasticsearch.
While search engines are ubiquitous in academic research and modern internet culture, most current digital search applications present results as large blocks of text. However, it is widely understood that accompanying textual information with a graphic representation improves understanding, retention, and insight.
VRSE (Visual Research Search Engine) was developed and tested as a prototype visual search engine that leverages quantitative techniques to visually encode publication metadata in search results from a database built with approximately 25 million publications.
This prototype represents publications as nodes in an animated, interactive force-directed graph, using colour to encode publication year. The novel visual representation in VRSE satisfies established design guidelines for fluid interaction, narrows Norman's Gulfs of Execution and Evaluation, and makes the time distribution of results distinguishable at a glance.
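The core of this encoding is mapping a publication year onto a colour scale so that recency is visible at a glance. As an illustrative sketch only (not VRSE's actual code, which uses D3.js), the idea can be reduced to a linear interpolation between two endpoint colours; the year range and the blue-to-orange palette here are assumptions for demonstration:

```javascript
// Illustrative sketch: map a publication year to a node colour by linear
// interpolation between two endpoint colours. The 1970-2020 range and the
// blue/orange palette are illustrative choices, not VRSE's actual values.
function yearToColour(year, minYear = 1970, maxYear = 2020) {
  // Normalise the year to t in [0, 1], clamping out-of-range values.
  const t = Math.min(1, Math.max(0, (year - minYear) / (maxYear - minYear)));
  const from = [49, 54, 149]; // dark blue for the oldest publications
  const to = [244, 109, 67];  // warm orange for the most recent
  // Interpolate each RGB channel independently.
  const rgb = from.map((c, i) => Math.round(c + t * (to[i] - c)));
  return `rgb(${rgb[0]}, ${rgb[1]}, ${rgb[2]})`;
}

console.log(yearToColour(1970)); // → "rgb(49, 54, 149)"
console.log(yearToColour(2020)); // → "rgb(244, 109, 67)"
```

In D3.js the same effect is typically achieved with a sequential scale (e.g. `d3.scaleSequential` over a colour interpolator) applied to each node's `fill` in the force simulation's tick handler.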