How far can and should scientists and engineers take responsibility for the potential or actual consequences of their research findings and creations? Does the answer depend on the kind of research they are doing, its possible consequences and the context in which they are working? Our interviewees suggest strongly that the answer to the second question is firmly yes. Those who worked in defence research during World War Two had few reservations about doing so. The situation during the Cold War could be more ambiguous, but the sense of a genuine threat helped to overcome reservations. Some engineers demonstrate a sense of responsibility for using their skills to build a better world, although the ways in which they sought to do this differed very widely and reflected changing local and global concerns. Others demonstrate an awareness of the malleable potential of new technologies, which might be used for benign or malevolent purposes in different contexts. For engineers who saw human lives depend on the safety of their creations, the consequences of getting it wrong could be catastrophic, encouraging a meticulous and cautious approach that stressed safety above all else. While individuals are willing to articulate and reflect on these issues during life story interviews, David Davies, as editor of Nature during the 1970s, found scientists as a group very reluctant to express opinions and engage in debate on contentious topics.