- "Gradient Descent Algorithm and Its Variants" by Imad Dabbura, in TDS Archive, Dec 21, 2017 — Optimization refers to the task of minimizing/maximizing an objective function f(x) parameterized by x. In machine/deep learning…
- "Batch normalization in 3 levels of understanding" by Johann Huber, in TDS Archive, Nov 6, 2020 — What do we know about it so far: from a 30-second digest to a comprehensive guide.
- "Understanding binary cross-entropy / log loss: a visual explanation" by Daniel Godoy, in TDS Archive, Nov 21, 2018 — Have you ever thought about what it really means to use this loss function?
- "A Comprehensive Guide to Convolutional Neural Networks — the ELI5 way" by Sumit Saha, in TDS Archive, Dec 15, 2018 — Artificial Intelligence has been witnessing monumental growth in bridging the gap between the capabilities of humans and machines…
- "Activation Functions in Neural Networks" by Sagar Sharma, in TDS Archive, Sep 6, 2017 — Sigmoid, tanh, Softmax, ReLU, and Leaky ReLU explained.
- "Epoch vs Batch Size vs Iterations" by Sagar Sharma, in TDS Archive, Sep 23, 2017 — Know your code…
- "Applied math concepts important for understanding Machine Learning: Linear Algebra (Part 1)" by Prabjot Kaur, Nov 15, 2021
- "Multi-Layer Neural Networks with Sigmoid Function — Deep Learning for Rookies (2)" by Nahua Kang, in TDS Archive, Jun 27, 2017 — Chapter 1: Introducing Deep Learning and Neural Networks
- "The Ultimate Beginner's Guide to TensorFlow" by Siddharth Sharma, in TDS Archive, Aug 24, 2020 — Learn all the basics.
- "Coding Deep Learning for Beginners — Start!" by Kamil Krzyk, in TDS Archive, Feb 12, 2018 — An intuition-based series of articles about Neural Networks dedicated to programmers who want to understand the basic math behind the code and…