Supercomputing speeds up deep learning training

Researchers from UC Berkeley, UC Davis, and TACC used Stampede2 to complete a 100-epoch ImageNet deep neural network training in 11 minutes — the fastest time recorded to date. Using 1,600 Skylake processors, they also bested Facebook's prior result, finishing a 90-epoch ImageNet training with ResNet-50 in 32 minutes. Given TACC's large user base and huge capacity, this capability will have a major impact across all fields of science.

Source: EurekAlert

