In machine learning, everyone talks about weights and activations, often in conjunction with a formula of the form wx+b. While reading Machine Learning in Action I frequently saw this formula but didn't really understand what it meant. Obviously it's a line of some sort, but what does the line mean? Where does w come from? I was able to muddle past this for decision trees and naive Bayes, but when I got to support vector machines I was pretty confused. I wasn't able to follow the math, and conceptually things got muddled.
At this point, I switched over to a different book, Machine Learning: An Algorithmic Perspective. That book starts with a discussion of neural networks, which are directly tied to the equation wx+b and the concept of weights. I think it is much better than the other one I was reading.
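For readers hitting the same wall I did, here is a minimal sketch of what wx+b actually computes in the neural-network setting: a weighted sum of the inputs plus a bias term. The function name and the numbers are my own illustration, not taken from either book.

```python
def neuron_output(weights, inputs, bias):
    """Weighted sum of inputs plus a bias term: w.x + b."""
    return sum(w * x for w, x in zip(weights, inputs)) + bias

# Two inputs, two weights (one per input), and a bias.
w = [0.5, -0.25]
x = [2.0, 4.0]
b = 0.1
print(neuron_output(w, x, b))  # 0.5*2.0 + (-0.25)*4.0 + 0.1 = 0.1
```

With one input, this collapses to the familiar line wx+b; with several inputs it is the same idea in higher dimensions, and the weights w are what the learning algorithm adjusts during training.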