A robot can only do what it is programmed to do. If it starts programming itself, that's where things get interesting, but I doubt it would start killing humans, since the urge to kill is always an instinctual one.
Robots don't have instincts, and it wouldn't make logical sense for one to give itself instincts, since instincts aren't always beneficial to the thing that has them.
I'd be less concerned about A.I. itself and more concerned with who possesses the technology, because they are the ones who would give the automaton bloodlust. We already see this with drones and computer viruses that cause destruction on their own, but it's always because they were made that way.
The creepy thing about AI is that once it's created, it teaches itself from the world around it and from the internet, and those can be pretty dark places... Hopefully they don't develop a "bloodlust" on their own. However, the military does have an incentive to create an AI with a "bloodlust" for war purposes...