A collection of lisp snippets I’ll be adding to as I learn more.

### Factorial (recursive function)

```lisp
(defun factorial (n)
  (cond ((= n 0) 1)                       ; base case
        ((> n 0) (* n (factorial (- n 1)))))) ; recurse on n-1
```
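For comparison, here's the same recursion translated into Python (a sketch, not part of the original snippet; like the Lisp version, it only handles non-negative integers):

```python
def factorial(n):
    # Recursive factorial, mirroring the Lisp snippet above
    if n == 0:
        return 1
    return n * factorial(n - 1)

print(factorial(5))  # 120
```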




Hidden-layer neurons are large groups of neurons *between* an input layer and an output layer, between your sensory receptors and your motor actions. Their size, type, and network topology (which neuron connects to which) affect the complexity of the problems they solve and the decisions they make. Successful networks find solutions because they are statistically favored. Neurons compete for resources: those that predict successfully grow stronger, while those that fail to make patterns out of chaos weaken. These cells even contain programming to destroy themselves, a process called apoptosis. This occurs at a rapid rate early in life, as the brain first organizes itself, then slows in adulthood.

We seem to be able to direct our own mental processes, somehow, which is absurd to me. One part of the brain exerting control over another? Different patterns resonating in electrochemical waves: metastable yet amplifying. Consciousness subsuming itself with each passing moment. Alive. It seems awfully political: these neurons in such chaotic, demanding labor to produce a transient mind. They are decision-making agents: competing for scarce resources, struggling for strong connections, forming computational alliances. The competition for statistical significance may be experienced as a tug of attention, a drive to act, or a flash of inspiration. The steps neurons take to assemble personal truth are invisible.

In this tutorial we’ll look at some simple matrix multiplication tools in Python. The arrays are constructed using lists within lists, and the multiplication itself is implemented using for loops. The program works well for small arrays, but really starts bogging down at sizes 500×500 and above. Anyone doing serious work will want to check out NumPy, which has matrix operations built-in and runs several thousand times faster. In Part 2 I’ll compare NumPy’s performance to other optimized methods.

Let’s feast our eyes on the function definitions we’ll be using in matrix.py:

```python
import random
import cProfile
from time import perf_counter


def zero(m, n):
    # Create an m x n zero matrix
    return [[0 for row in range(n)] for col in range(m)]


def rand(m, n):
    # Create an m x n matrix of random floats
    return [[random.random() for row in range(n)] for col in range(m)]


def show(matrix):
    # Print out the matrix, one row per line
    for row in matrix:
        print(row)


def mult(matrix1, matrix2):
    # Matrix multiplication
    if len(matrix1[0]) != len(matrix2):
        # Check matrix dimensions
        print('Matrices must be m*n and n*p to multiply!')
    else:
        # Multiply if dimensions are compatible
        new_matrix = zero(len(matrix1), len(matrix2[0]))
        for i in range(len(matrix1)):
            for j in range(len(matrix2[0])):
                for k in range(len(matrix2)):
                    new_matrix[i][j] += matrix1[i][k] * matrix2[k][j]
        return new_matrix


def time_mult(matrix1, matrix2):
    # Clock the time matrix multiplication takes
    start = perf_counter()
    mult(matrix1, matrix2)
    end = perf_counter()
    print('Multiplication took', end - start, 'seconds')


def profile_mult(matrix1, matrix2):
    # A more detailed timing with process information
    # Arguments must be strings for this function,
    # e.g. profile_mult('a', 'b')
    cProfile.run('matrix.mult(' + matrix1 + ',' + matrix2 + ')')
```
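As a quick sanity check, here's the same triple-loop scheme applied to a small hand-checkable pair of matrices (a standalone sketch using inline comprehensions rather than the helpers above):

```python
# Multiply a 2x3 matrix by a 3x2 matrix with the triple-loop scheme
a = [[1, 2, 3],
     [4, 5, 6]]
b = [[7, 8],
     [9, 10],
     [11, 12]]

# new[i][j] = sum over k of a[i][k] * b[k][j]
result = [[sum(a[i][k] * b[k][j] for k in range(len(b)))
           for j in range(len(b[0]))]
          for i in range(len(a))]

print(result)  # [[58, 64], [139, 154]]
```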