Frameworks for deep learning with dynamic graph construction

Two frameworks, PyTorch and DyNet, were released recently. Both feature dynamic graph construction, which is more flexible than the static graphs of existing frameworks such as TensorFlow and Theano (though TensorFlow does offer some functions for dynamic network architectures), along with other features of their own.
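To make the idea concrete, here is a minimal PyTorch sketch of dynamic graph construction: the computation graph is rebuilt on every forward pass, so ordinary Python control flow (here, a step count decided at runtime) changes the graph's shape from call to call. The function and variable names are my own illustration, not from any framework's API.

```python
import torch

def dynamic_forward(x, w, n_steps):
    # The graph is rebuilt on every call: the number of matmul
    # nodes depends on the runtime value of n_steps.
    h = x
    for _ in range(n_steps):
        h = torch.tanh(h @ w)
    return h.sum()

w = torch.randn(4, 4, requires_grad=True)
x = torch.randn(1, 4)

# Calls with different step counts build different graphs;
# autograd backpropagates through whichever graph was built.
loss = dynamic_forward(x, w, n_steps=3)
loss.backward()
```

In a static-graph framework the architecture is fixed before any data is seen, so this kind of data-dependent structure requires special control-flow operations instead of plain Python loops.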



Update on Feb 16:

TensorFlow announced a new feature called TensorFlow Fold, which supports dynamic computation graphs. More importantly, unlike the two frameworks above, it also supports dynamic batching, which makes it more powerful. I would expect other frameworks such as MXNet and PyTorch to support this feature soon as well.
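The core idea behind dynamic batching can be sketched in plain Python without any framework: inputs whose per-example graphs have the same structure (here approximated by sequence length) are grouped so each group can be executed as one fixed-shape batch rather than one example at a time. This is only an illustrative sketch of the concept, not TensorFlow Fold's actual algorithm or API.

```python
from collections import defaultdict

def dynamic_batch(sequences):
    # Group variable-length sequences by length so that each group
    # can be processed as a single fixed-shape batch; index i is
    # kept so results can be scattered back to the original order.
    groups = defaultdict(list)
    for i, seq in enumerate(sequences):
        groups[len(seq)].append((i, seq))
    return dict(groups)

batches = dynamic_batch([[1, 2], [3], [4, 5], [6]])
# sequences of equal length land in the same batch
```

Fold generalizes this beyond sequence length: it identifies identical sub-graph operations across examples and merges them into batched ops automatically.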

Details here:


More on these topics soon.

