thinking about distributed, privacy-preserving and collaborative ml 🤔
budding machine learning researcher; member of the cohere for ai community.
At present, my machine learning interests centre on distributed, privacy-preserving, efficient and collaborative machine learning. In particular, I am drawn to paradigms such as federated learning and neural network compression, branching off into areas such as mechanistic interpretability. The motivation for this is that I am wary of the promise of centralised machine learning paradigms, which require enormous amounts of data and computation to reach state-of-the-art performance. Instead, I believe there is much to be gained from investigating alternative paradigms that can scale while promoting privacy inherently, supported by sound mathematical foundations.
I am actively looking for PhD opportunities starting in Spring/Autumn 2025. If you know of any vacancies, please do not hesitate to reach out.
year | title | authors | tags | paper | code | misc |
---|---|---|---|---|---|---|
2023 | Rumour Detection in the Wild: A Browser Extension for Twitter | jovanović & ross | 💻🗣️🌀 | NLP-OSS @ EMNLP2023 | ACL | |
2023 | EDAC: Efficient Deployment of Audio Classification Models For COVID-19 Detection | jovanović*, mihaly* & donaldson* | 🌳🌀 | arXiv preprint | | Awarded second place IBM prize in the MLP course awards at UoE. |
October 2023 - August 2024
(mphil) machine learning and machine intelligence, specialising in speech and language processing. thesis exploring Second-order Optimisation and Imbalanced Class Distribution in Emotion Analysis, under the supervision of Brian (Guangzhi) Sun.
Sept 2019 - May 2023
bsc (hons) artificial intelligence and computer science. thesis completed under the supervision of björn ross.
Oct 2024 - March 2025
Working as a research assistant under the supervision of Dan Alistarh in the Distributed Algorithms and Systems (DAS) group, focusing on federated learning applied to large-scale models.
Summer 2023
Developed unsupervised learning methods for anomaly detection, with an associated parallel data processing pipeline, improving on previous performance by 30%.
Summer 2022
Pursued personal research in Natural Language Processing, building a search engine powered by dense passage retrieval with Q&A capabilities. Assisted researchers in delivering proof-of-concept projects and onboarded them to CI/CD platforms.
I have had the good fortune to work with many kind and talented people throughout my young career. In particular, thank you to the following individuals for inspiring me (in no particular order):
alessandro palmarini
simon yu
sree harsha nelaturu