We wrote the style transfer code based on https://github.com/anishathalye/neural-style.
To run the VGG model:
(1) Download the TensorFlow VGG19 model from http://www.vlfeat.org/matconvnet/models/beta16/imagenet-vgg-verydeep-19.mat
(2) Put the model into the ./plainModel/ folder
(3) Modify vggModel.py to point to your content image and style image
(4) Run vggModel.py
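For background on what the style-transfer scripts optimize: style transfer of this kind matches Gram matrices of feature activations between the generated image and the style image. A minimal numpy sketch of that style term (shapes and function names here are illustrative, not the scripts' actual code):

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of a (height, width, channels) feature map:
    channel-by-channel correlations, normalized by map size."""
    h, w, c = features.shape
    f = features.reshape(h * w, c)   # flatten the spatial dimensions
    return f.T @ f / (h * w * c)     # (c, c) matrix

def style_loss(gen_features, style_features):
    """Squared Frobenius distance between the two Gram matrices."""
    g_gen = gram_matrix(gen_features)
    g_style = gram_matrix(style_features)
    return np.sum((g_gen - g_style) ** 2)
```

In the actual scripts this loss is computed on activations from several network layers and minimized with respect to the generated image's pixels.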
To run the Inception models, download the models below and also put them into the ./plainModel/ folder:
(1) Inception V3 model from http://download.tensorflow.org/models/image/imagenet/inception-2015-12-05.tgz (this model is compatible with older TensorFlow; rename it to InceptionV3.pb)
(2) Inception V1 and V4 models from https://github.com/beniz/deepdetect/issues/89 (rename them to InceptionV1.pb and InceptionV4.pb)
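The models above ship as .tgz archives that need to be unpacked into ./plainModel/. A small helper sketch for fetching and extracting one (the function name is ours; the URL and folder come from the steps above):

```python
import os
import tarfile
import urllib.request

def fetch_and_extract(url, dest_dir="./plainModel"):
    """Download a .tgz model archive (if not already present)
    and unpack its contents into dest_dir."""
    os.makedirs(dest_dir, exist_ok=True)
    archive = os.path.join(dest_dir, os.path.basename(url))
    if not os.path.exists(archive):
        urllib.request.urlretrieve(url, archive)
    with tarfile.open(archive, "r:gz") as tar:
        tar.extractall(dest_dir)

# e.g.:
# fetch_and_extract(
#     "http://download.tensorflow.org/models/image/imagenet/inception-2015-12-05.tgz")
```

After extracting, rename the graph files as described above so the scripts can find them.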
Other models[*] can be used as well, including Inception, ResNet, VGG-19, and VGG-16 from tensorflow/slim; download the .pb files from https://drive.google.com/open?id=0B-YQJ1l195yzMUItcFBvaTV5aFk
[*] The *_model.pb files are generated by meta2pb.py from the meta graph models downloaded from https://github.com/tensorflow/models/tree/master/slim. (Loading the meta graphs requires the newest version of TensorFlow, so I converted them to .pb files in the hope of keeping some compatibility with older TensorFlow, but this does not work well with version 0.8 on the cluster. The .pb files may still not be backward-compatible if high-level APIs were used when the graph was created.)
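The conversion meta2pb.py performs is the standard TF freezing pattern: import the meta graph, restore the checkpoint, fold variables into constants, and serialize the GraphDef. A hedged sketch of that pattern (function and argument names are ours, not meta2pb.py's actual interface; written against the TF1-style compat API):

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

def meta_to_pb(meta_path, ckpt_path, output_node_names, pb_path):
    """Load a meta graph plus checkpoint, freeze its variables into
    constants, and write the resulting GraphDef to a .pb file."""
    with tf.Graph().as_default():
        with tf.Session() as sess:
            saver = tf.train.import_meta_graph(meta_path)
            saver.restore(sess, ckpt_path)
            frozen = tf.graph_util.convert_variables_to_constants(
                sess, sess.graph.as_graph_def(), output_node_names)
    with tf.gfile.GFile(pb_path, "wb") as f:
        f.write(frozen.SerializeToString())
```

The resulting .pb still embeds whatever ops the original graph used, which is why graphs built with newer high-level APIs can fail to load on older TensorFlow even after freezing.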
File structure:
inceptionModel.py: script to run style transfer using the Inception models
vgg.py: script to create the VGG19 model
vggModel.py: script to run style transfer using the VGG19 model
meta2pb.py: script to convert a meta graph to a .pb file