An excellent question.
As machine intelligence has grown in complexity, various systems have been designed around emergent behavior (Avida, Conway's Game of Life, even the game Spore). Each has simple rules by which its internal systems are guided, which means the overall system is not knowable in any real sense (consider the number of species in a single Spore galaxy, and that the states of all those planets are unique). Very quickly you get into a situation where the numbers are literally astronomical.
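To make the "simple rules, unknowable whole" point concrete, here is a minimal sketch of Conway's Game of Life. The entire universe is governed by two local rules (survival with 2–3 neighbors, birth with exactly 3), yet the global behavior is famously rich; the `step` function and the `blinker` pattern below are just the standard textbook example, not anything from a specific system mentioned above:

```python
from collections import Counter

def step(live):
    """Advance one generation. `live` is a set of (x, y) live cells."""
    # Count live neighbors for every cell adjacent to a live cell.
    counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # The only two rules: a live cell survives with 2 or 3 neighbors;
    # a dead cell becomes live with exactly 3 neighbors.
    return {c for c, n in counts.items()
            if n == 3 or (n == 2 and c in live)}

# A "blinker": three cells in a row oscillate with period 2.
blinker = {(0, 1), (1, 1), (2, 1)}
gen1 = step(blinker)   # becomes a vertical column of three
gen2 = step(gen1)      # flips back to the original row
```

Despite rules this small, the long-term fate of an arbitrary starting pattern is undecidable in general, which is exactly the "not knowable in any real sense" situation.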
In this way we are starting to arrive at this condition. For generations it has been thought that machines would have "rules" such as Asimov's Three Laws of Robotics or some such, but in reality it is likely that some systems will be given a degree of autonomy such that sentience is eventually developed intentionally, yet we ourselves may not understand the complete principles under which the system runs.
One aspect of the current financial crisis was that financial quantitative systems (which used advanced applied variants of genetic algorithms and heuristic feedback) were "out of control", or more properly, WE were not able to anticipate their behavior under any particular set of market conditions.
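The actual trading systems were proprietary, but the genetic-algorithm idea itself is easy to sketch. Everything below (the bitstring encoding, the toy fitness function, the population and mutation parameters) is invented for illustration; real systems are vastly more elaborate, which is part of why their behavior was so hard to anticipate:

```python
import random

def fitness(bits):
    # Trivial stand-in objective: count of 1-bits.
    # A real system would score a candidate trading strategy here.
    return sum(bits)

def evolve(pop_size=30, length=16, generations=50, mut_rate=0.05):
    """Evolve a population of bitstrings toward higher fitness."""
    pop = [[random.randint(0, 1) for _ in range(length)]
           for _ in range(pop_size)]
    for _ in range(generations):
        # Selection: keep the fitter half as parents.
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]
        # Crossover + mutation to refill the population.
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, length)
            child = a[:cut] + b[cut:]
            child = [bit ^ (random.random() < mut_rate) for bit in child]
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

Note that nothing in `evolve` says *how* a good solution will be found; the answer emerges from selection pressure, which is precisely why the behavior of such systems under novel conditions is hard to predict.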
Something similar happened before with the Russian currency collapse of the 1990s, but not at the same level of sophistication.
Personally, I think we should aim to have machines work at our direction and for our benefit. In most modern science fiction, however, the threat is that machines "turn" on us.
This is really the point of your question. When it comes to it, I like to remember an older movie that forms the premise of The Terminator, The Matrix, and most other such films: "Colossus: The Forbin Project" (1970). It answers your question exactly, and I STRONGLY recommend you get a copy.
Ray Kurzweil calls this "The Singularity": basically the point where humans stop being the smartest species and machines take that title from us.
In the movie, Colossus is originally designed to automate nuclear launch and response, and the system "takes over" by virtue of being in control of nuclear weapons (like Skynet or WOPR after it). Colossus was the granddaddy of them all, and you basically get to hear the whole argument played out.
Consider the first few minutes, where Colossus can "speak" to its creators: http://www.youtube.com/watch?v=-RdHuCyjqKw
http://www.youtube.com/watch?v=gMst6sqRlMU
Towards the conclusion of the movie comes the "address" by Colossus to the world, and the Colossus "programming office" is rechristened "World Control".