I've been playing with the GPT-2 neural net text generators, and they produce interesting results when primed with two disparate themes, e.g. secure C++ coding meets trashy romance novel:
Most of the time GPT-2 veers towards a single theme and forgets the other. Occasionally, however, it produces absolute gold. For example, in this alternative universe Brent ignores Sylvia's advice and takes a risk:
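If you want to try this kind of two-theme priming yourself, here's a minimal sketch using the Hugging Face `transformers` library. (This is an assumption about tooling; the original experiments may have used a different interface, and the prompt below is my own invented example in the spirit of "secure C++ coding meets trashy romance".)

```python
# Minimal sketch: prime GPT-2 with two disparate themes and sample a
# continuation. The prompt mixes a secure-coding comment with romance
# prose; both the prompt and parameter choices are illustrative.
from transformers import pipeline, set_seed

generator = pipeline("text-generation", model="gpt2")
set_seed(42)  # make sampling repeatable

prompt = (
    "// Always validate buffer lengths before calling memcpy.\n"
    "Sylvia gazed into Brent's eyes and whispered, "
)

result = generator(
    prompt,
    max_new_tokens=60,   # length of the continuation
    do_sample=True,      # sample rather than greedy-decode
    temperature=0.9,     # higher temperature = weirder blends
)
print(result[0]["generated_text"])
```

As the post notes, most samples will collapse into one theme or the other; re-running with different seeds (or a higher temperature) is how you fish for the occasional gold.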