6.3 and 6.4 read funny one after the other: put references to the paper in the comments, but change all the notation from the paper.
Probably the worst piece of advice in the article:
> 6.4 Avoid mathematical notations in your variable names. Let's say that some quantity in the algorithm is a matrix denoted A. Later, the algorithm requires the gradient of the matrix over the two dimensions, denoted dA = (dA/dx, dA/dy). Then the names of the variables should not be dA_dx and dA_dy, but gradient_x and gradient_y. Similarly, if an equation system requires a convergence test, then the variables should not be prev_dA_dx and dA_dx, but error_previous and error_current. Always name things for what physical quantity they represent, not whatever letter notation the authors of the paper used (e.g. gradient_x and not dA_dx), and always express the more specific to the less specific from left to right (e.g. gradient_x and not x_gradient).
Especially when you're just starting out, creating your own naming scheme just creates more opportunities to do something wrong.
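For concreteness, the naming style the quoted advice argues for might look like this. This is a hypothetical sketch (a central-difference gradient over a 2-D array `A`), not code from the article; the function and grid are made up for illustration:

```javascript
// Hypothetical sketch: name variables for the physical quantity
// (gradient_x / gradient_y) rather than the paper's dA_dx / dA_dy.
// Central differences over a small 2-D grid A, indexed A[y][x].
function gradientAt(A, x, y) {
  const gradient_x = (A[y][x + 1] - A[y][x - 1]) / 2;
  const gradient_y = (A[y + 1][x] - A[y - 1][x]) / 2;
  return { gradient_x, gradient_y };
}

const A = [
  [0, 1, 2],
  [0, 1, 2],
  [0, 3, 6],
];
console.log(gradientAt(A, 1, 1)); // { gradient_x: 1, gradient_y: 1 }
```

Whether `gradient_x` beats `dA_dx` is exactly what the commenters below dispute.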
Have to disagree. Those derivatives are a bad example of a good point. Most mathematical symbols aren't representable in code, at least until we're able to use unicode identifiers and sub/superscripts in every language. When you're forced to write 'theta' instead of θ, you might as well just say 'angle' so your future maintenance programmer will have an easier time of it.
An equation and an algorithm might achieve the same result but the ways they get there are so different that using different notation styles makes perfect sense. For example, a simple finite summation is a compact block in mathematical notation but it's a multi-line for loop in C. Trying to force the constraints of the 'source' notation on the implementation makes no sense.
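That translation looks something like the following, sketched in JavaScript for consistency with the other examples in this thread rather than the C the comment mentions. The compact block S = Σ_{i=1}^{n} i² necessarily unrolls into a multi-line loop:

```javascript
// Hypothetical sketch: the one-symbol summation Σ_{i=1}^{n} i²
// becomes a multi-line loop once translated into code.
function sumOfSquares(n) {
  let total = 0;
  for (let i = 1; i <= n; i++) {
    total += i * i;
  }
  return total;
}

console.log(sumOfSquares(4)); // 1 + 4 + 9 + 16 = 30
```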
Remember, dA/dx means 'gradient'. You're not creating your own naming scheme, you're translating the concept of 'gradient' to the appropriate notation for the medium you're working in.
>Most mathematical symbols aren't representable in code, at least until we're able to use unicode identifiers and sub/superscripts in every language.
If the Linux compose key supported more of the common mathematical symbols, I would have so much trouble not using them in all of my JS code. It's already hard not to use names like î to denote unit vectors.
As a side note, I really want to make a JS library called "Eta" for creating progress bars (puns!), where the global namespace is under Η (the Greek letter), but I think that might piss people off, even if I did allow the visually identical H as an alias.
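JavaScript identifiers may contain Unicode letters, so the Greek capital Eta really is a legal global name. A hypothetical sketch of that "Eta" namespace with its look-alike Latin alias (the library and its `bar` function are invented for illustration):

```javascript
// U+0397 GREEK CAPITAL LETTER ETA is a valid JS identifier.
const Η = {
  // Render a 10-character text progress bar for a fraction in [0, 1].
  bar(fraction) {
    const filled = Math.round(fraction * 10);
    return "[" + "#".repeat(filled) + "-".repeat(10 - filled) + "]";
  },
};

// Visually identical alias: U+0048 LATIN CAPITAL LETTER H.
const H = Η;

console.log(H.bar(0.5)); // [#####-----]
```

The two names render identically in most fonts, which is precisely why it might piss people off.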
Google '.xcompose github' (without quotes) and see what you come up with. I use this one, myself:
https://github.com/kragen/xcompose
but there are a lot of others.