
A.I. Design Notes 1

Here is a brief outline of some of my views on AGI design which I'll expand on in time.

Noise is Necessary.
Noise is an essential ingredient for any reasonably intelligent system. This is less controversial to say now than it was a few years ago, given recent findings in several fields. Maybe you can have limited success in limited domains with a noise-free brain, but that's about it. The deltas that come from a rich, complex environment can supply much of what an agent brain needs, but again I suspect the limitations will show up and become more pronounced as time goes on.
We know, for example, from hearing implants that there is an immediate loss of fidelity if the implant is not a noise-sensitive circuit. As an agent becomes more advanced and its brain becomes larger, I believe this need for noise only grows.
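To make the idea concrete, here is a minimal sketch (in NumPy, with a hypothetical noise scale `sigma` of my own choosing) of injecting Gaussian noise into a single neural-net layer's pre-activations. The same input then produces a distribution of responses rather than one fixed answer:

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_forward(x, W, sigma=0.1):
    """One layer with Gaussian noise added to the pre-activation.

    sigma is an illustrative noise scale, not a value from the post.
    """
    pre = W @ x + sigma * rng.standard_normal(W.shape[0])
    return np.tanh(pre)

x = rng.standard_normal(4)        # one fixed input
W = rng.standard_normal((3, 4))   # one fixed weight matrix
outputs = np.stack([noisy_forward(x, W) for _ in range(100)])
print(outputs.std(axis=0))  # nonzero spread: same input, varied responses
```

With `sigma=0` the spread collapses to zero and the layer becomes a deterministic function; the noise term is what provides the "free deltas" when the environment does not.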

Open Architecture & Self Organization.
One take on this idea I came across is Andy Clark's 'Natural Born Cyborgs', with his great line 'Everything Leaks'. I see it as going up the meta-algorithm hierarchy: arbitrary algorithms and the related brain architecture are induced via development. I had my own saying - 'the intelligence is in the data', or 'the intelligence is already out there'. More on this later.

Sameness & Difference
Both are necessary. However, since the environment gives us plenty of free deltas, and noise adds more, the brain can be biased to 'integrating' and 'unifying' functions. This saves on the workload and required resources.
With sensory deprivation we can see a limited balancing response by the human brain, due to the lack of deltas. This also suggests the open architecture of the human brain described above - it is not a wholly-specified 'unifying machine'. The behaviour is induced, and there are higher meta-algorithms that determine the architecture.

Given the last two points: if your brain design has a diagram with boxes and arrows, then in my opinion it's suspect. If an AI designer tries to specify the structure of a brain, they run the risk of all sorts of pathologies. For such a highly adaptive system, they may end up being a necessary component of the machine itself - constantly intervening to re-establish the intended structure and functionality. A cog in their own machine. As a human being, the designer (or engineering team) becomes the bottleneck, the weakest link, for such an extremely complex, high-throughput machine.
I consider the necessary meta-algorithms to be primarily a function of the agent's embodiment. True, there is a downside: the cost of a less-specified, more adaptive brain is the resource consumption required for self-organization.

Aside note: Jordan Pollack had an article a while back (2001), 'software is a cultural solvent', and I like that metaphor. Even more so, continued IT and broad technological advancement can be thought of as a material solvent. Briefly looking at the article again, he seems to allude to that. As above, the brain in its 'integrating' role can also be seen as a 'solvent'.

More is Better
Peter Norvig has recently pointed out that the performance of several traditional AI algorithms goes up dramatically once a certain threshold of scale is reached - training data set, training time, and machine size all increased together.
This lines up with my own work. Several years ago, while I was very eager and active on my project, I got that terrible, sinking feeling as I began to realize what resources would be needed to make something with non-trivial performance. The human brain is massive for a reason. And this scaling is unbounded.

That's it for now.

