References
Bronstein, Michael M., Joan Bruna, Taco Cohen, and Petar Veličković.
2021. “Geometric Deep Learning: Grids, Groups, Graphs, Geodesics,
and Gauges.” CoRR abs/2104.13478. https://arxiv.org/abs/2104.13478.
Cho, Dongjin, Cheolhee Yoo, Jungho Im, and Dong-Hyun Cha. 2020.
“Comparative Assessment of Various Machine Learning-Based Bias
Correction Methods for Numerical Weather Prediction Model Forecasts of
Extreme Air Temperatures in Urban Areas.” Earth and Space
Science 7 (4): e2019EA000740. https://doi.org/10.1029/2019EA000740.
Cho, Kyunghyun, Bart van Merriënboer, Çağlar Gülçehre, Fethi Bougares,
Holger Schwenk, and Yoshua Bengio. 2014. “Learning Phrase
Representations Using RNN Encoder-Decoder for Statistical
Machine Translation.” CoRR abs/1406.1078. http://arxiv.org/abs/1406.1078.
Dumoulin, Vincent, and Francesco Visin. 2016. “A Guide to Convolution
Arithmetic for Deep Learning.” arXiv e-Prints, March,
arXiv:1603.07285. https://arxiv.org/abs/1603.07285.
He, Kaiming, Xiangyu Zhang, Shaoqing Ren, and Jian Sun. 2015.
“Deep Residual Learning for Image Recognition.”
CoRR abs/1512.03385. http://arxiv.org/abs/1512.03385.
Hochreiter, Sepp, and Jürgen Schmidhuber. 1997. “Long Short-Term
Memory.” Neural Computation 9 (8): 1735–80.
Ioffe, Sergey, and Christian Szegedy. 2015. “Batch Normalization:
Accelerating Deep Network Training by Reducing Internal Covariate
Shift.” https://arxiv.org/abs/1502.03167.
Loshchilov, Ilya, and Frank Hutter. 2016. “SGDR:
Stochastic Gradient Descent with Warm Restarts.” CoRR
abs/1608.03983. http://arxiv.org/abs/1608.03983.
Olah, Chris, Alexander Mordvintsev, and Ludwig Schubert. 2017.
“Feature Visualization.” Distill. https://doi.org/10.23915/distill.00007.
Osgood, Brad. 2019. Lectures on the Fourier Transform and Its
Applications. American Mathematical Society.
Ronneberger, Olaf, Philipp Fischer, and Thomas Brox. 2015. “U-Net:
Convolutional Networks for Biomedical Image Segmentation.”
CoRR abs/1505.04597. http://arxiv.org/abs/1505.04597.
Sandler, Mark, Andrew G. Howard, Menglong Zhu, Andrey Zhmoginov, and
Liang-Chieh Chen. 2018. “Inverted Residuals and Linear
Bottlenecks: Mobile Networks for Classification, Detection and
Segmentation.” CoRR abs/1801.04381. http://arxiv.org/abs/1801.04381.
Smith, Leslie N. 2015. “No More Pesky Learning Rate Guessing
Games.” CoRR abs/1506.01186. http://arxiv.org/abs/1506.01186.
Smith, Leslie N., and Nicholay Topin. 2017. “Super-Convergence:
Very Fast Training of Residual Networks Using Large Learning
Rates.” CoRR abs/1708.07120. http://arxiv.org/abs/1708.07120.
Srivastava, Nitish, Geoffrey Hinton, Alex Krizhevsky, Ilya Sutskever,
and Ruslan Salakhutdinov. 2014. “Dropout: A Simple Way to Prevent
Neural Networks from Overfitting.” J. Mach. Learn. Res.
15 (1): 1929–58.
Trefethen, Lloyd N., and David Bau. 1997. Numerical Linear
Algebra. SIAM.
Vistnes, Arnt Inge. 2018. Physics of Oscillations and Waves: With
Use of Matlab and Python. Springer.
Warden, Pete. 2018. “Speech Commands: A Dataset for
Limited-Vocabulary Speech Recognition.” CoRR
abs/1804.03209. http://arxiv.org/abs/1804.03209.
Zhang, Hongyi, Moustapha Cisse, Yann N. Dauphin, and David Lopez-Paz.
2017. “mixup: Beyond Empirical Risk
Minimization.” arXiv e-Prints, October,
arXiv:1710.09412. https://arxiv.org/abs/1710.09412.