
Showing posts from March, 2008

One citizen's take

I remember watching the Republican Party Presidential debates a few months back & was dumbfounded that every one of the nominees (except for my main man, Ron Paul) was for staying in Iraq. In fact, they accused each other of wanting to pull out of Iraq... like it was a bad thing.

Here we are - the dollar is approaching the peso in value, and we're officially in a 'recession'. Hmm, could this have anything to do with it?

Federal Contract Awards by Major Funding Agency (from fedspending.org)

Federal Assistance by Major Agency (from fedspending.org)

Actually, the bottom pie is much larger in total. Homeland Security (30%) amounts to about $700 billion in 2006, and Defense contracts (54%) amount to about $300 billion. It just makes me think that if we could somehow shrink the pie or the biggest slices of the pie... you and I could eat more pie! (FedSpending.org is a great site. I have yet to try out FFATA.org, a similar site with broader data that was recently created because of a bill co-sponsored by Barack Obama.)

I know economies can't be summarized by a simple chart... but the visuals above answer a very important question: where is our government spending your tax dollars?

I would say that our government is broken... except Americans re-elected George W. Bush AFTER we'd had a chance to see what he was made of, so maybe the American ideology is what's broken. I sincerely hope we don't vote another war-monger into office & that our country begins to repair its reputation across the world. That is the best 'homeland security' money can buy.

The Turk

I'm a fan of the new Fox TV series "Terminator: The Sarah Connor Chronicles". Film series like "The Terminator", "2001", & "The Matrix" were great because they rested their plots on a chilling premise that speaks to geeks & technophobes alike: artificial intelligence/life will eventually surpass humans and nature... leading to our own extinction (of course, only if we don't destroy ourselves first). A popular article on this topic is "Why the Future Doesn't Need Us" by Bill Joy (computer scientist & co-founder of Sun Microsystems).

In The Sarah Connor Chronicles, "The Turk" was a computer that was built for playing chess... but demonstrated emotion & 'child-like' qualities. The Turk was destined to be the technological ancestor of SkyNET - the intelligent computer which would eventually launch a war against humankind. That makes for great fiction... but I think the questions about the potential of artificial intelligence are far more interesting. In reality, artificial intelligence is a joke. There is no computer that comes close to replicating the complexity of the human mind. Computers built for chess are just that... they search ahead through all possible moves & counter-moves, selecting the best one. That is not intelligence... it is raw CPU power with a very simple algorithm.
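To make that concrete, here's a minimal Python sketch of that kind of brute-force game search - the classic minimax algorithm. The toy game tree is made up for illustration; real chess programs add enormous optimizations, but the core idea really is just this exhaustive lookahead:

```python
# Minimax over a toy game tree: leaves are payoffs for the maximizing
# player, internal nodes are nested lists of available moves.
# The program's whole "intelligence" is enumerating every line of play
# and picking the move with the best guaranteed outcome.

def minimax(node, maximizing=True):
    if isinstance(node, (int, float)):  # leaf: score of a final position
        return node
    scores = [minimax(child, not maximizing) for child in node]
    return max(scores) if maximizing else min(scores)

# A tiny two-ply game: we move (max), the opponent replies (min).
tree = [[3, 12], [2, 4], [14, 1]]
print(minimax(tree))  # 3 - the best outcome we can force
```

Note that nothing here understands chess, or anything else - the exact same dozen lines "play" any game you can encode as a tree, which is the author's point about raw CPU power plus a simple algorithm.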

The classic concept from A.I. is the "Turing Test", which posed the question: if you could type questions into a computer and couldn't determine from the responses whether a computer or a person was answering... is that computer intelligent? To me, that question is pointless. Intelligence should be qualified by what happens inside the box... not what comes out of the box. Could there be some yet-to-be-discovered special algorithm which could create an intelligent machine like "The Turk"? No way. Chess-playing & stock-market-predicting programs are not intelligent. Expert systems which store huge amounts of data & spit out answers are not intelligent. A truly intelligent machine will need extremely rich perceptual systems. Animals, insects, even microbes are able to see, smell, feel, or hear to some degree. Intelligence and learning occur by applying cognitive tools to categorize and formulate relationships from post-processed information. Visual & auditory stimuli are processed by the brain into their respective shapes, phonemes, etc.... then recognized in context as the 'fuzzy logic' of a neural network constantly feeds the "conscious" mind its surroundings.
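For contrast, here's roughly the simplest possible "learning" categorizer - a single artificial neuron trained with the classic perceptron rule (the data and labels are invented for illustration). Even this toy shows the flavor of tuning connection weights to categorize inputs, though it's a far cry from the rich perception described above:

```python
# One artificial neuron learning to categorize 2-D inputs.
# The perceptron rule nudges each weight toward reducing its error -
# a crude, single-cell caricature of a neural network categorizing stimuli.

def step(x):
    return 1 if x > 0 else 0

def train(samples, epochs=20, lr=0.1):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), label in samples:
            err = label - step(w[0] * x1 + w[1] * x2 + b)
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# Learn the logical AND of two binary "stimuli".
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train(data)
print([step(w[0] * x1 + w[1] * x2 + b) for (x1, x2), _ in data])  # [0, 0, 0, 1]
```

A famous limitation: a single neuron like this can only learn linearly separable categories, which is part of why genuinely perceptual systems need vastly more machinery.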

Perception of the environment is just half of the challenge. If we had that, could we ever create a machine that is creative and original? In order to do so, I believe emotion is required. Emotion is an essential part of our own cognition & learning. When you are sad, embarrassed, happy, or proud - different chemicals are released by the limbic system of the brain, which in turn encourage the growth or elimination of neural connections. I guess that means Data from "Star Trek" or Vicki from "Small Wonder" wouldn't be very smart robots. Call it 'emotion' or not... but when you yell "Bad computer!", the computer isn't very smart unless it understands and corrects its behavior. Any useful intelligent machine would also require some 'social awareness'. It should understand and predict our desires and intentions. It would also need to be instinctively rewarded (or "feel good") by pleasing us. That's where the scary sci-fi & the "three laws" from Asimov's I, Robot come in.
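As a loose illustration (not a claim about how brains or any real system work), that "Bad computer!" feedback can be modeled as a reward signal that strengthens or weakens a machine's tendency toward each action. The action names and learning rate below are entirely made up:

```python
# A toy praise/scold learner: +1 reward for "good computer",
# -1 for "bad computer". Feedback nudges the machine's preference
# for each action - a crude analogue of emotion-driven reinforcement.

def respond(prefs, action, reward, lr=0.2):
    prefs[action] += lr * reward  # praise reinforces, scolding suppresses
    return prefs

prefs = {"dim_lights": 0.0, "blast_music": 0.0}
for _ in range(5):
    prefs = respond(prefs, "blast_music", -1)  # "Bad computer!"
    prefs = respond(prefs, "dim_lights", +1)   # "Good computer."

best = max(prefs, key=prefs.get)
print(best)  # dim_lights
```

The interesting (and hard) part the sketch skips is exactly what the paragraph demands: perceiving that the yelling happened, attributing it to the right action, and "wanting" the praise in the first place.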

Imagine a futuristic home where an intelligent computer controls your security, lights, communications, entertainment, and more. In order for the home computer to be truly useful, it should be able to learn by watching its inhabitants' habits - learning the likes/dislikes of each individual. The computer would need enough perception to tell its inhabitants apart, to know their authority, what they desire, and when they are pleased or displeased. Like they say, "necessity is the mother of invention"... and the convenience of this kind of system is what I can see being a good early application of artificial intelligence - not The Turk. Hopefully one day, we won't have SkyNET - but maybe something more useful, like HomeNET.
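A hypothetical sketch of that habit-learning idea: record what each inhabitant does in each situation, then predict the most frequent choice. The names, situations, and actions are invented for illustration - a real system would need the hard perceptual work of recognizing people and moods that the paragraph describes:

```python
from collections import Counter, defaultdict

# A HomeNET-style habit learner: tally each person's observed actions
# per situation, and anticipate them by predicting the most common one.

class HabitLearner:
    def __init__(self):
        self.history = defaultdict(Counter)

    def observe(self, person, situation, action):
        self.history[(person, situation)][action] += 1

    def predict(self, person, situation):
        seen = self.history[(person, situation)]
        return seen.most_common(1)[0][0] if seen else None

home = HabitLearner()
for _ in range(4):
    home.observe("alice", "evening", "dim_lights")
home.observe("alice", "evening", "play_jazz")

print(home.predict("alice", "evening"))  # dim_lights
```

Keying the history on (person, situation) is what lets the system keep each individual's likes/dislikes separate, rather than averaging the whole household together.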