Roko's basilisk is a thought experiment positing that an otherwise benevolent future AI might torture anyone who knew of its potential existence but did not directly contribute to its development, as a way of incentivizing that development in the present.