Here is an image of the tanh (hyperbolic tangent) function from Gnuplot 3.7, overlaid with the hypertanf sAPL function from my "neuralxr" workspace. This sAPL workspace will accept the MNnet4~1.WTT file of Xerion weights for the MarketNet network, and use dot-products to multiply the weights against the input vectors to "activate" the Xerion-trained network. This will let me "run" the network on the iPad.

I wrote one function to load the Xerion weights file into sAPL (usage: wt <- readfile fname) and a second function to convert the text into numeric form (usage: wnet <- procwt wt). Currently, wnet is just a high-precision vector of 1281 32-bit floats. Since I'm using the hyperbolic tangent instead of the logistic function as my transfer function, I needed to write this tiny transfer function.

The tanh function already exists in Gnuplot 3.7: start Gnuplot and enter "plot tanh(x)" to see this S-curve, which is the mechanism by which machine intelligence is stored in a neural network. Getting closer to an NN-based, iPad-runnable Augmenter.

[Update: I wrote the function shown at top-left, but then remembered APL's built-in trigonometric functions, and yes, "7○X" gives the hyperbolic tangent of X. The "○" (circle) operator is typed as ALT-o, and when used dyadically (with two arguments) it gives access to all the trigonometric functions. With the full precision of 18 digits enabled, the built-in tanh function gives slightly more precise results.]
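The sAPL readfile/procwt pair isn't reproduced here, so this is only a minimal Python sketch of the same two steps, under the assumption that the .WTT weights file is plain text containing whitespace-separated numbers (the real Xerion layout may carry headers or structure this ignores). The function names mirror the sAPL ones but are otherwise hypothetical:

```python
def readfile(fname):
    """Read the raw text of a weights file (sAPL: wt <- readfile fname)."""
    with open(fname) as f:
        return f.read()

def procwt(text):
    """Convert whitespace-separated numeric text into a flat list of
    floats (sAPL: wnet <- procwt wt).  Assumes no header lines."""
    return [float(tok) for tok in text.split()]
```

For the MarketNet file described above, `len(procwt(readfile("MNnet4~1.WTT")))` would be expected to come out at 1281 if the file really is a bare list of weights.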
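The "tiny transfer function" itself is just the exponential definition of tanh. A sketch of what a hand-written hypertanf amounts to, checked against the library built-in (analogous to checking the sAPL version against 7○X):

```python
import math

def hypertanf(x):
    """Hyperbolic tangent from first principles:
    tanh(x) = (e^x - e^-x) / (e^x + e^-x).
    Note: the naive formula overflows for |x| beyond ~710 in IEEE
    doubles, where math.tanh simply saturates at +/-1."""
    e_pos = math.exp(x)
    e_neg = math.exp(-x)
    return (e_pos - e_neg) / (e_pos + e_neg)
```

This is the same situation the update describes: the hand-rolled version works, but the built-in (math.tanh here, dyadic ○ in APL) is the more robust and precise choice.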
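"Use dot-product to activate the network" can be made concrete with a sketch of a single unit: take the dot product of a weight vector with the incoming activations (APL's +.× inner product), then squash it through tanh. The bias handling and layer structure here are assumptions, not the actual MarketNet topology:

```python
import math

def dot(w, v):
    """Dot product of two equal-length vectors (APL: w +.x v)."""
    return sum(wi * vi for wi, vi in zip(w, v))

def activate(weights, inputs, bias=0.0):
    """One unit's output: tanh applied to the weighted input sum.
    A full forward pass repeats this for every unit in every layer,
    slicing its weights out of the flat wnet vector."""
    return math.tanh(dot(weights, inputs) + bias)
```

Running the trained network on the iPad is then just repeated applications of this step, walking through the 1281-element weight vector layer by layer.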