Responsibility is the set of social demands applied to an agent. If there were many AI robots in the wild, we could assign them social responsibilities as their duty to the collective - to not harm social values. Even if they violate norms deterministically, we still have to hold them responsible, because doing so is how we maintain social wellbeing.
I think every movement confuses itself because humans have a hard time thinking at multiple levels of analysis. Christianity broke the brains of the West, in a way, by introducing the idea of cosmic oughts and cosmic responsibilities. A lot of people get stuck in the trap of implicitly assuming that all oughts are cosmic oughts and all responsibilities are cosmic responsibilities. When Westerners realize that the idea of cosmic morality is absurd, they default to assuming that there is no morality, no oughts, and no responsibilities. But when we take an evolutionary approach, we can absolutely construct emergent moralities, emergent oughts, and emergent responsibilities that don't rely on a cosmic underpinning.