Daniel Barley, M.Sc.
Daniel Barley is a PhD candidate with the Computing Systems Group at the Institute of Computer Engineering at Heidelberg University. He works primarily on resource-efficient deep learning, with a focus on memory consumption, data movement, and efficient utilization of compute resources on GPUs. His work centers on the training stage of deep neural networks and, to that end, considers pruning and compression of input activations, as these make up the vast majority of the training memory footprint.
Daniel’s work is also part of the “Model-Based AI” project, which is funded by the Carl Zeiss Foundation.
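As a rough illustration of this line of work, the sketch below shows how whole blocks of an activation tensor might be pruned before being stashed for the backward pass. It is a minimal, hypothetical PyTorch example: the class name, block size, keep ratio, and the magnitude-based block selection are illustrative assumptions, not the method from the publications listed below.

import torch

class BlockPrunedLinear(torch.autograd.Function):
    """Linear op that keeps only the strongest activation blocks for backward.
    Hypothetical sketch, not the method from the publications below."""

    @staticmethod
    def forward(ctx, x, weight, block_size=32, keep_ratio=0.5):
        # The forward pass uses the dense input as usual.
        out = x @ weight.t()
        # Group input activations into blocks along the feature dimension
        # and rank the blocks by their L2 norm.
        b, f = x.shape
        blocks = x.view(b, f // block_size, block_size)
        norms = blocks.norm(dim=-1)                      # (batch, num_blocks)
        k = max(1, int(keep_ratio * norms.shape[1]))
        topk = norms.topk(k, dim=1).indices
        # Zero out the weakest blocks before saving for backward; a real
        # implementation would store them in a compressed/sparse layout.
        mask = torch.zeros_like(norms).scatter_(1, topk, 1.0)
        pruned = (blocks * mask.unsqueeze(-1)).view(b, f)
        ctx.save_for_backward(pruned, weight)
        return out

    @staticmethod
    def backward(ctx, grad_out):
        x_pruned, weight = ctx.saved_tensors
        # The weight gradient is computed from the pruned activations;
        # the input gradient only needs the weights and is unaffected.
        grad_x = grad_out @ weight
        grad_w = grad_out.t() @ x_pruned
        return grad_x, grad_w, None, None

# Usage: drop-in replacement for a dense matmul in a training step.
x = torch.randn(8, 128, requires_grad=True)
w = torch.randn(64, 128, requires_grad=True)
y = BlockPrunedLinear.apply(x, w)
y.sum().backward()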
Research interests
- Hardware-efficient training of deep neural networks
- Pruning/compression
- Efficient (block-)sparse operators
- GPU architecture
Recent news (2-year horizon)
- 01/2024: Paper presentation at the 6th Workshop on Accelerated Machine Learning (AccML), co-located with the HiPEAC 2024 Conference in Munich - “Compressing the Backward Pass of Large-Scale Neural Architectures by Structured Activation Pruning”
General information
- Short CV: pdf
Recent Teaching (4-year horizon)
- Summer term 2022
  - Teaching assistant; graduate course “Parallel Computer Architecture”
- Winter term 2022/23
  - Teaching assistant; graduate course “Introduction to High Performance Computing”
Publications
- Less Memory Means smaller GPUs: Backpropagation with Compressed Activations. ITEM Workshop, co-located with ECML-PKDD, 2024.
@article{barley2024backprop,
  author  = {Barley, Daniel and Fr{\"{o}}ning, Holger},
  title   = {Less Memory Means smaller GPUs: Backpropagation with Compressed Activations},
  journal = {ITEM Workshop, co-located with ECML-PKDD},
  year    = {2024},
}
- Compressing the Backward Pass of Large-Scale Neural Architectures by Structured Activation Pruning. CoRR, abs/2311.16883, 2023.
@article{DBLP:journals/corr/abs-2311-16883,
  author     = {Barley, Daniel and Fr{\"{o}}ning, Holger},
  title      = {Compressing the Backward Pass of Large-Scale Neural Architectures by Structured Activation Pruning},
  journal    = {CoRR},
  volume     = {abs/2311.16883},
  year       = {2023},
  url        = {https://arxiv.org/abs/2311.16883},
  doi        = {10.48550/ARXIV.2311.16883},
  eprinttype = {arXiv},
  eprint     = {2311.16883},
}