K**R
Excellent book written by an educator, but definitely check the github...
This book steps you through coding layers, backpropagation, optimization, and the trainer for the neural net: all the little bits and pieces that you'll want to know in depth. It is well organized and clearly explained. The author keeps coming back to framing ideas as computational graphs, and that provides an anchor for the reader as the concepts become more complex. You will apply your code to model data sets, many of them the "usual suspects" that everyone should know.

You will want to check the author's github for this book. In some later chapters, parts of the code are omitted from print but are present in the github repo. There are also some minor corrections and updates that have been made to the repo since the book was printed. This was an excellent resource for me.
T**R
Could be better. Certainly a 2nd Ed is warranted
Some great points to be noted: an illustrative approach and lots of code. I like that, less talk and more code. I'm just getting started on it, but in the math intro, why does the author not use lim h→0 (f(a+h) − f(a))/h and instead use lim h→0 (f(a+h) − f(a−h))/(2h) as the mathematically precise definition of the derivative? It's an odd choice for introducing derivatives when the norm is the former. I mean, I'm no math major, but isn't the latter used when f(x) is symmetric about the point a, i.e. f(a+h) = f(a−h)? It's such a restrictive definition. Anyway, I would like to be corrected. The code examples are cool.
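A quick numerical sketch comparing the two difference quotients the review mentions. The example function, evaluation point, and step size are arbitrary choices of mine; the centered quotient is the standard central-difference approximation, and it typically gives a smaller error than the one-sided quotient, which is presumably why the book uses it for numerical derivative checks.

```python
import numpy as np

def forward_diff(f, a, h=1e-5):
    # One-sided (forward) difference quotient: (f(a+h) - f(a)) / h
    return (f(a + h) - f(a)) / h

def central_diff(f, a, h=1e-5):
    # Central difference quotient: (f(a+h) - f(a-h)) / (2h)
    return (f(a + h) - f(a - h)) / (2 * h)

f = np.sin               # example function (arbitrary choice)
a = 1.0                  # point at which to approximate the derivative
exact = np.cos(a)        # exact derivative of sin at a, for comparison

print(abs(forward_diff(f, a) - exact))  # error of the forward difference
print(abs(central_diff(f, a) - exact))  # error of the central difference (smaller)
```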
K**N
This book will teach you how to code neural networks - MLPs, ConvNets, RNNs with LSTM and GRU cells!
This book is the one book I have found that actually teaches how to code these networks from scratch. The vast majority of other books are simply theoretical in nature, or use a toolkit like Theano, TensorFlow, or PyTorch, which gives little understanding of how neural networks actually work. It is one thing to read the paper by Hochreiter on LSTM cells, but another to have a complete code implementation. Furthermore, knowledge of precisely how the networks work is necessary to design new types of architectures.
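To illustrate what "from scratch" means here, this is a minimal sketch of a single forward step of a standard LSTM cell in NumPy. It follows the usual gate equations rather than the book's particular implementation, and the function name, weight layout, and shapes are my own choices.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One forward step of a standard LSTM cell.

    x:      input vector, shape (n_in,)
    h_prev: previous hidden state, shape (n_hid,)
    c_prev: previous cell state, shape (n_hid,)
    W:      stacked gate weights, shape (4 * n_hid, n_in + n_hid)
    b:      stacked gate biases, shape (4 * n_hid,)
    """
    n_hid = h_prev.shape[0]
    z = W @ np.concatenate([x, h_prev]) + b
    f = sigmoid(z[0 * n_hid:1 * n_hid])   # forget gate
    i = sigmoid(z[1 * n_hid:2 * n_hid])   # input gate
    o = sigmoid(z[2 * n_hid:3 * n_hid])   # output gate
    g = np.tanh(z[3 * n_hid:4 * n_hid])   # candidate cell update
    c = f * c_prev + i * g                # new cell state
    h = o * np.tanh(c)                    # new hidden state
    return h, c

# Example usage with arbitrary sizes (n_in=8, n_hid=16):
rng = np.random.default_rng(0)
h, c = lstm_step(rng.standard_normal(8), np.zeros(16), np.zeros(16),
                 rng.standard_normal((64, 24)) * 0.1, np.zeros(64))
```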
O**O
Code examples are very intuitive!
- Concepts are explained very clearly.
- An explanation of how Jacobian tensors are used to compute partial derivatives of matrix transformations (best placed in an appendix) would clarify how the back-propagation equations in the book were obtained.
  * e.g. computing full Jacobians in our code would use up significant computer resources; therefore, rather than computing them for back-propagation, we use only their essential values (hence the partial derivatives in the code examples, and the sketch below).
  * Jacobians make the concept of back-propagation intuitive to understand.
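A minimal sketch of that point, using my own example rather than the book's: for a linear layer y = x @ W, back-propagation never builds the full Jacobian of y with respect to x; it only needs products of the upstream gradient with W, which are exactly the "essential values" mentioned above.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal((32, 64))        # batch of inputs
W = rng.standard_normal((64, 10))        # weights of a linear layer, y = x @ W
grad_y = rng.standard_normal((32, 10))   # upstream gradient dL/dy

# The full Jacobian dy/dx would have shape (32, 10, 32, 64) and is never formed.
# Back-propagation only needs the vector-Jacobian products:
grad_x = grad_y @ W.T    # dL/dx, shape (32, 64)
grad_W = x.T @ grad_y    # dL/dW, shape (64, 10)
```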
A**R
Great
Very informative
N**E
Miserably Written. You do the Author's Job for Them.
First chapter, and the promise to include all code is already broken, as several functions are called that are not written down anywhere. So if you want to follow along, you need to reverse engineer a lot of what the author has done. Not sure why they thought this was acceptable.

Edit after more reading: The notation is a disaster. Derivations are left unfinished, even in the appendix. Meaningless diagrams. Abuse of the notation the author makes up (without warning!). Major parts of the code are just left out. The author is a maniac and I'm convinced the positive reviews are fake. Don't bother with this book, it's horrible. The author seems to know their stuff, but they left about half the book out.

Information is presented out of order. Critical information is relegated to a footnote and motivated long after it should be. The math and code, while on their own easy enough to read, seem to come from nowhere, as the author just presents them in a stream-of-consciousness fashion. There does not seem to have been an editor, as there is no other way a text could come out this manic.
Z**Y
Not bad
I personally disliked the layout of this book. I think some concepts could have been handled better. But I do have A LOT of highlighting in some chapters. His explanation and examples were very good. If you aren't really comfortable with object oriented programming, you might want to look for another book.
D**Y
This time it's personal
Deep learning is back, if it ever really went away. This is a very good book covering what you need to know, plus some in-depth content. As a programmer, I am always looking to the next level. I have been circling deep learning for a while now and have simply been intimidated, in part from a conversation I was a part of some years ago. I see that to get to the next level, I need to be working at this level. This book has got me past my inaccurate mental blocks and has given me a good base knowledge. I am ready for the next step!