Several years ago I took an online college class on "The Future of Humanity," run by a woman who, I'm pretty sure, was all into transhumanism. I learned some interesting things in the class, but mostly it just got me a "science" credit, and that's all it was good for.
I don't think we really need to worry about AI, for a few reasons.

First: think of how dependent such a thing would be on updates, debugging, DLC, and the like. Chances are some super-intelligent AI would fail because of a programmer with a grudge making it want to say "my boss is a jerk," or because of simple inept programming.

Second: I think the future of computing is in human-computer interfaces, not independent computing. I'm guessing that rather than set up a super-powerful computer that can "think" on its own, anyone with the dough to put into such a product is going to want a system they can actually use.

Third: although I am of the opinion that a computer could have a soul, I don't think it is going to happen. I don't think an artificial construct will ever have the spark of life that allows for true independent thought.

Fourth: let's say, for the sake of argument, that you build a processor that makes the human brain look like a pocket calculator... that the galaxy, or even the universe, can be simulated thousands of times a second by this computer, and that it is conscious. If you could "live" in your own world, one that operates completely by your rules, how much of this world would YOU be interested in? If you could run from the Big Bang to the Gib Gnab, or to the heat death of the universe, a thousand times a day, how long would you bother with this incredibly slow analog world outside your imagination... er, computation? (I pointed out this last concept to my class and turned some of them from "someday we'll upload ourselves onto computers and be like gods" to "oh well, it was a cool idea.")