
Excitement and frustration in today’s machine learning area

Time: Fri 2018-09-21 15.00

Location: Fantum

Participating: Saikat Chatterjee


Abstract: All of us are probably aware of the term 'deep learning'. Deep
learning, mainly in the form of deep neural network architectures, provides
unprecedented performance in classification and prediction. The excitement
is evident in all conferences and workshops. However, it comes with
considerable challenges in understanding the material, and eventual
frustration. In this seminar, I will discuss some issues in neural
networks. For example: why neural networks have a standard architecture,
regularization and overfitting issues, why convolutional neural networks
became so famous, certain things that are very difficult to understand (for
example, dropout), alternative architectures (deep sparse representation,
deep wavelet stacks), and a very simple network with very good performance
that almost anybody can implement (called the extreme learning machine),
together with the frustration that still remains.
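
As a taste of why the extreme learning machine is considered easy to implement, here is a minimal sketch in Python/NumPy: the hidden layer is random and fixed, and only the output weights are solved in closed form by regularized least squares. The hidden size, activation, and ridge regularization shown here are illustrative assumptions, not details from the talk.

```python
import numpy as np

def elm_fit(X, Y, n_hidden=100, reg=1e-3, seed=None):
    """Fit an extreme learning machine: random hidden layer, closed-form output weights."""
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    # Hidden-layer weights and biases are drawn at random and never trained.
    W = rng.normal(size=(n_features, n_hidden))
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)  # hidden-layer activations
    # Output weights from ridge-regularized least squares (closed form).
    beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ Y)
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy usage (illustrative only): learn y = sin(x) from noisy samples.
X = np.linspace(-3, 3, 200).reshape(-1, 1)
Y = np.sin(X) + 0.05 * np.random.randn(*X.shape)
W, b, beta = elm_fit(X, Y, n_hidden=50, seed=0)
Y_hat = elm_predict(X, W, b, beta)
print("mean squared error:", float(np.mean((Y - Y_hat) ** 2)))
```

Since no gradient-based training is involved, the whole method reduces to one random projection and one linear solve, which is why it is often cited as a network "almost anybody can implement".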