Well, I think it's time to look out for a Technological Singularity

I don't think that "does this have morality, or does it not?" is the right question to ask.
Morality is used to judge actions, not people or things.
Also, "living things" do not really require morality unless they have freedom of choice.
"With great power comes great responsibility"
Entities that can choose their own course of action (regardless of whether you believe in fate or not) will inevitably be "responsible" for a lot of change.
Morality is critical for such an entity to have a measure of which choices are "bad"/harmful and which are "good"/beneficial.
It's like a chess engine that has been preprogrammed not to even consider obviously bad moves in the first place.
Basically, morality is a kind of security barrier that prevents us from doing "harmful" things, much like the self-preservation instinct in unintelligent lifeforms.
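To make the analogy concrete, here's a minimal Python sketch of that kind of pre-filter; the action names and the `is_harmful` check are made up for illustration, not taken from any real chess engine or AI system.

```python
def choose_action(candidate_actions, evaluate, is_harmful):
    """Pick the best-scoring action, but only among those that pass
    the 'morality' filter (i.e. are not flagged as harmful).

    This mirrors a chess engine pruning obviously bad moves before
    it even bothers to evaluate them.
    """
    # The filter acts as the "security barrier": harmful options are
    # never considered, no matter how well they might score.
    permitted = [a for a in candidate_actions if not is_harmful(a)]
    if not permitted:
        return None  # no acceptable action -> do nothing
    return max(permitted, key=evaluate)


# Illustrative usage with made-up actions and scores.
actions = ["help", "ignore", "deceive"]
scores = {"help": 5, "ignore": 1, "deceive": 9}  # "deceive" wins on raw score
harmful = {"deceive"}

best = choose_action(actions, scores.get, lambda a: a in harmful)
print(best)  # -> "help", because "deceive" was filtered out up front
```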

(sheesh! I think I just learned a new point of view from my own words!)

To have a "good" morality (i.e. one that most people agree with and/or are inspired by) is crucial for the survival of the species as a whole. IMO it is also the fundamental ingredient for true long-term happiness/contentment.

Also, @mnessie
A++ on your AI-speech. Agree fully.
But that other thing...
Is it moral to impose a change (even if it is for the better) on other beings who disagree, even if they might not be fully aware/knowledgeable?
I would like to hear any other answers first, before posting my own.