Abstract
Deep neural networks incur high computational costs, which is especially problematic in mHealth applications running on resource-constrained hardware. This project explores knowledge distillation for time series classification tasks. The technique transfers knowledge from a larger network (the teacher) to a smaller one (the student), which is trained to reproduce similar outputs at a lower computational cost. This approac…
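To make the teacher-student idea concrete, the sketch below shows one common way such a transfer can be implemented: the classical soft-target distillation loss, in which the student is trained to match the teacher's temperature-softened output distribution while also fitting the ground-truth labels. This is an illustrative assumption of a PyTorch setting, not the specific method of this project; the function name, `temperature`, and `alpha` are hypothetical choices.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Illustrative soft-target distillation loss (a common KD formulation).

    Blends a KL term between temperature-softened teacher and student
    outputs with the ordinary cross-entropy on the hard labels.
    """
    # Soften both distributions with the temperature, then compare them.
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    log_soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    kd = F.kl_div(log_soft_student, soft_teacher,
                  reduction="batchmean") * temperature ** 2
    # Ordinary supervised loss on the ground-truth class labels.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce
```

In this formulation the temperature controls how much of the teacher's "dark knowledge" (relative probabilities of wrong classes) is exposed to the student, and `alpha` balances imitation of the teacher against fitting the labels.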