This is an Android application that lets the user add a resizable, transparent widget anywhere on their home screen. Double-tapping the widget locks the screen.
It requires the accessibility permission in order to lock the screen, collects no data, and contains no metrics or usage-tracking libraries.
I created this out of annoyance that the Google Pixel devices do NOT have this feature. My LG G3 from 2014 and Samsung S10E from 2019 both had it.
Meanwhile, Google ignores its users' requests, and people have to resort to third-party applications.
This project expanded upon my fourth-year dissertation. Here, I looked at (1) how an epidemic algorithm could disseminate data between participants in a peer-to-peer manner and (2) how federated model training performs in such an environment. I also gathered performance data to gauge how the application functioned as a whole. Unfortunately, the study was inconclusive, as there were not enough participants and the data was messy. Nonetheless, it suggested that such a system could be beneficial with more participants.
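The dissemination step can be pictured as a push-based gossip round: every participant that already holds the data forwards it to one randomly chosen peer. This is a minimal illustrative sketch in plain Python, not the dissertation's actual implementation:

```python
import random

def gossip_round(peers, infected):
    """One push-based epidemic (gossip) round: each node that already
    holds the data forwards it to one randomly chosen peer."""
    newly_infected = set()
    for node in infected:
        target = random.choice(peers)
        if target not in infected:
            newly_infected.add(target)
    return infected | newly_infected

peers = list(range(100))   # 100 participants
infected = {0}             # one node starts with the update
rounds = 0
while len(infected) < len(peers):
    infected = gossip_round(peers, infected)
    rounds += 1
# The data typically reaches every node after roughly O(log n) rounds.
```

The appeal of this style of protocol is that no central server coordinates the spread, which is what makes it a natural fit for a peer-to-peer federated setting.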
This project aimed to develop an accurate method of identifying socially anxious individuals based on their application usage session data, i.e. the applications people use whenever their smartphone is unlocked. Three tree-based models were created: a decision tree, extra trees, and a random forest. The project suggested that application usage session data can be a good indicator of social anxiety, but as this was a student dissertation, no firm conclusion can be drawn without more research.
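A comparison of those three tree-based models could look like the following scikit-learn sketch, trained here on synthetic stand-in data rather than the real application usage features:

```python
# Illustrative sketch only: synthetic data stands in for the real
# application-usage-session features used in the dissertation.
from sklearn.datasets import make_classification
from sklearn.ensemble import ExtraTreesClassifier, RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {
    "decision tree": DecisionTreeClassifier(random_state=0),
    "extra trees": ExtraTreesClassifier(random_state=0),
    "random forest": RandomForestClassifier(random_state=0),
}
scores = {}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    scores[name] = model.score(X_te, y_te)  # held-out accuracy per model
```

Comparing a single tree against the two ensemble variants is a common way to check whether averaging many randomised trees actually buys accuracy on a given dataset.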
During the 2020 lockdown, I had some spare time and wanted a small side project. I decided to learn more about Vue.js, as I was using it for a work project at the same time. The goal was to create a website that could auto-scroll through image posts from Reddit. With multiple monitors, I would use one for work and another for scrolling through nature and city photos.
A for-fun Discord bot made in my spare time to play audio files. The program starts by downloading audio files from a Google Drive folder that I own; the bot then lets you play them by name in a voice channel. Additionally, I implemented functionality to download blog posts from a specific URL and convert them to audio via a text-to-speech library. Unfortunately, the conversion proved slow, so I disabled it.
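The play-by-name part can be sketched as a simple index from file stem to path over the downloaded folder. This is an illustrative stand-in, not the bot's actual code:

```python
import tempfile
from pathlib import Path

def build_sound_index(folder):
    """Map a lowercased file stem to its path, so a chat command like
    `!play airhorn` can look up Airhorn.mp3 regardless of case."""
    return {p.stem.lower(): p for p in Path(folder).glob("*.mp3")}

# Demo with a throwaway folder standing in for the downloaded Drive files.
demo = Path(tempfile.mkdtemp())
for name in ("Airhorn.mp3", "tada.mp3", "notes.txt"):
    (demo / name).touch()

index = build_sound_index(demo)  # only the .mp3 files are indexed
```

Keeping the lookup key lowercased makes the chat command forgiving about capitalisation, which matters when users type names from memory.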
The Sustanabinity bin is an automatic robotic assistant that helps you figure out where to throw your litter. You can simply follow the signal, or educate yourself by trying to predict it first. It creates a universal way of throwing away trash, free of any gender/race/ethnicity bias, as everyone simply watches for a light that turns green or stays red. The assistant used image recognition to determine the type of litter, and a Raspberry Pi controlled several motors that opened and closed the relevant plastic bin. The project took first place.
This was built for the JPMorgan Chase Code for Good hackathon. Our chosen charity was the Glasgow Science Centre, and we created a virtual reality tour that aimed to be easy to use and accessible to everyone. The project took first place, and the idea was carried over to the JPMorgan Chase Force for Good initiative, which then built a complete solution for the Science Centre based on it.
This was my third-year group dissertation project. We were tasked with building a chatbot for the University to help deal with common queries regarding short courses. For example, a common query could resemble "Hi, what is the price for the course Art in the Renaissance?", and the chatbot would answer it easily. This was done to relieve pressure on the administration team.
This was my first Android application. I am a big fan of the Learn X in Y Minutes website and wanted to see how I could translate it into a mobile app. A separate Python script scrapes the Learn X in Y Minutes GitHub page and generates the HTML content, which is saved in Firebase's Firestore. The app reads from the Firestore database and caches the content locally.
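The local caching behaviour amounts to a read-through cache: serve the stored copy if present, otherwise fetch and save it. In this sketch, `get_page` and `fetch_remote` are hypothetical names standing in for the app's Firestore read, not its actual API:

```python
import tempfile
from pathlib import Path

def get_page(doc_id, cache_dir, fetch_remote):
    """Read-through cache sketch: `fetch_remote` stands in for the
    Firestore read; all names here are illustrative."""
    cached = Path(cache_dir) / f"{doc_id}.html"
    if cached.exists():
        return cached.read_text()   # cache hit: no network needed
    html = fetch_remote(doc_id)     # cache miss: go to the store
    cached.write_text(html)
    return html

# Demo with a fake remote that records how often it is called.
calls = []
def fake_fetch(doc_id):
    calls.append(doc_id)
    return f"<h1>{doc_id}</h1>"

cache = tempfile.mkdtemp()
first = get_page("python", cache, fake_fetch)   # fetched remotely
second = get_page("python", cache, fake_fetch)  # served from the cache
```

The second call never touches the fake remote, which is the property that keeps the app usable offline once content has been viewed.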
This was a group hackathon project from the Do You Have The GUTS Hackathon 2018. Craneware's challenge was to see who could create the most creative way to visualise big medical data. Our team created a 3D environment that showed scaled statistics of the various data points so that anyone could get "the big picture" view quickly. The data manipulation and calculation were done in Python. The Unity game engine was used to read the processed statistics and then generate the UI elements. The project took first place.