The App

The different Views of the App

The AURA app works hand in hand with the bracelet to protect users from the dangers of sun exposure. It not only tracks UV levels and provides bioimpedance data but also sends smart notifications: reminders to apply sunscreen, prompts when it is time to perform a skin check, and warnings when UV levels suddenly increase.

AURA also integrates artificial intelligence into the diagnostic process. Users can take a close-up photo of a mole with their phone’s camera, and the app analyzes the image locally using a trained CoreML model that performs binary classification of moles as benign or malignant. The model was trained in PyTorch on a dataset of over 10,000 labeled dermatological images and then converted to the CoreML format for on-device processing; it achieved 87% training accuracy and 85% validation accuracy (Figure 16). To improve the model’s generalizability, we used data augmentation and dropout regularization, and validation was performed on a stratified test set. This approach is inspired by published work such as Esteva et al. (Nature, 2017) and Brinker et al. (Lancet Digital Health, 2019), which demonstrate that deep neural networks can match or exceed dermatologists in mole-classification accuracy. Importantly, all analysis is performed offline, which preserves privacy and reduces latency. After classification, the app presents the user with a simple message such as “Malignant risk: 83% confidence.”
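The stratified validation mentioned above can be illustrated with a short sketch: the held-out set keeps the same benign/malignant proportions as the full dataset, which matters for an imbalanced dermatological corpus. This is not the project's actual code; the function name, parameters, and label values are illustrative.

```python
import random
from collections import defaultdict

def stratified_split(labels, test_fraction=0.2, seed=42):
    """Split sample indices so each class contributes the same
    proportion to the held-out set (illustrative sketch)."""
    rng = random.Random(seed)
    by_class = defaultdict(list)
    for idx, label in enumerate(labels):
        by_class[label].append(idx)
    train, test = [], []
    for label, indices in by_class.items():
        rng.shuffle(indices)
        n_test = max(1, round(len(indices) * test_fraction))
        test.extend(indices[:n_test])
        train.extend(indices[n_test:])
    return sorted(train), sorted(test)

# Example: an imbalanced label set, 80 benign vs. 20 malignant samples
labels = ["benign"] * 80 + ["malignant"] * 20
train_idx, test_idx = stratified_split(labels, test_fraction=0.2)
# The 20-sample test set keeps the 4:1 benign-to-malignant ratio.
```

In practice a library routine (e.g. scikit-learn's stratified splitters) would be used, but the principle is the same: sample within each class rather than across the pooled dataset.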
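The final step, turning the classifier's two raw outputs into the user-facing message, can be sketched in plain Python. The logit values, threshold, and message wording below are assumptions for illustration; the on-device code runs in Swift against the CoreML model's actual output names.

```python
import math

def classification_message(benign_logit, malignant_logit, threshold=0.5):
    """Convert two raw class scores into a simple user message
    (illustrative sketch; threshold and wording are assumptions)."""
    # Softmax over the two logits yields class probabilities.
    exps = [math.exp(benign_logit), math.exp(malignant_logit)]
    p_malignant = exps[1] / sum(exps)
    if p_malignant >= threshold:
        return f"Malignant risk: {p_malignant:.0%} confidence"
    return f"Benign: {1 - p_malignant:.0%} confidence"

print(classification_message(-0.5, 1.1))  # → "Malignant risk: 83% confidence"
```

Reporting a calibrated-looking percentage rather than a bare label keeps the message honest about uncertainty while remaining readable to a non-expert user.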