Korean Title |
Num Worker Tuner: An Automated Spawn Parameter Tuner for Multi-Processing DataLoaders |
English Title |
Num Worker Tuner: An Automated Spawn Parameter Tuner for Multi-Processing DataLoaders |
Author |
DoangJoo Synn, JongKook Kim |
Citation |
VOL 28 NO. 02 PP. 0446 ~ 0448 (2021. 11) |
Korean Abstract |
|
English Abstract |
In training a deep learning model, it is crucial to tune various hyperparameters to gain speed and accuracy. While hyperparameters that mathematically induce convergence affect training speed, system parameters that govern host-to-device data transfer are also crucial. It is therefore important to properly tune and select the data loader's system parameters to accelerate overall training time. We propose an automated framework called Num Worker Tuner (NWT) to address this problem. NWT searches a search space for an appropriate number of multiprocessing subprocesses and uses that count to accelerate training. By tuning this system-dependent parameter, the number of multiprocess spawns, the method achieves both memory efficiency and speed-up.
|
Keyword |
|
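The search described in the abstract, timing a data-loading workload at several worker counts and keeping the fastest, can be sketched as follows. This is a hypothetical illustration, not the authors' NWT implementation; thread workers stand in for DataLoader worker processes to keep the sketch portable, and the candidate counts, sample count, and `load_sample` workload are all assumptions.

```python
# Hypothetical sketch of the NWT idea: benchmark a data-loading workload
# over a search space of worker counts and return the fastest setting.
import time
from concurrent.futures import ThreadPoolExecutor


def load_sample(i):
    # Stand-in for decoding/augmenting one training sample (assumed workload).
    return sum(j * j for j in range(2000))


def tune_num_workers(candidates=(1, 2, 4, 8), num_samples=200):
    """Time the workload for each candidate worker count and pick the best.

    Returns (best_worker_count, {worker_count: elapsed_seconds}).
    """
    timings = {}
    for n in candidates:
        start = time.perf_counter()
        # Threads stand in here for multiprocessing subprocesses.
        with ThreadPoolExecutor(max_workers=n) as pool:
            list(pool.map(load_sample, range(num_samples)))
        timings[n] = time.perf_counter() - start
    best = min(timings, key=timings.get)
    return best, timings


if __name__ == "__main__":
    best, timings = tune_num_workers()
    print("best worker count:", best)
```

In a real PyTorch setting the timed workload would be one pass over a `DataLoader` constructed with `num_workers=n`, with the rest of the search loop unchanged.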