Excellent tutorial explaining Recurrent Neural Networks (RNNs), which hold great promise for learning general sequences, and have applications for text analysis, handwriting recognition and even machine translation.
By Gregory Piatetsky,
@kdnuggets.
While feats of Deep Learning have been gathering much attention, there have also been breakthroughs in a related technology, Recurrent Neural Networks (RNNs). RNNs hold great promise for learning general sequences, and have applications for text analysis, handwriting recognition and even machine translation.
An RNN learning to paint house numbers (Andrej Karpathy)
See a fantastic post by Andrej Karpathy, "The Unreasonable Effectiveness of Recurrent Neural Networks", where he uses RNNs to do amazing stuff like paint house numbers in this image, or generate text in the style of Paul Graham, Shakespeare, and even LaTeX.
See below an excellent tutorial
"General Sequence Learning using Recurrent Neural Networks"
by Alec Radford, Indico Head of Research, who led a workshop on general sequence learning using recurrent neural networks at Next.ML in San Francisco, Feb 2015.
Alec introduces RNNs, sketches how to implement them, and covers the tricks necessary to make them work well. Then he investigates using RNNs as general text classification and regression models, examining where they succeed and where they fail compared to more traditional text analysis models.
Finally, he presents a simple Python and Theano library for training RNNs with a scikit-learn style interface, and you'll see how to use it through several hands-on tutorials on real-world text datasets.
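The post doesn't show the library itself, but to give a feel for what "training RNNs with a scikit-learn style interface" means, here is a minimal NumPy sketch of a vanilla RNN classifier. All class and parameter names here are hypothetical illustrations, not Alec's actual library: it implements only the forward recurrence h_t = tanh(W x_t + U h_{t-1} + b) with a logistic readout from the final hidden state.

```python
import numpy as np

class TinyRNNClassifier:
    """Hypothetical scikit-learn-style RNN sketch (illustration only).

    Runs a vanilla RNN over a sequence of feature vectors and reads a
    binary-class probability off the final hidden state. Training (fit)
    is omitted; the point is the interface and the recurrence.
    """

    def __init__(self, n_features, n_hidden, seed=0):
        rng = np.random.default_rng(seed)
        s = 1.0 / np.sqrt(n_hidden)
        self.W = rng.uniform(-s, s, (n_hidden, n_features))  # input -> hidden
        self.U = rng.uniform(-s, s, (n_hidden, n_hidden))    # hidden -> hidden
        self.b = np.zeros(n_hidden)                          # hidden bias
        self.w_out = rng.uniform(-s, s, n_hidden)            # hidden -> output
        self.b_out = 0.0

    def _forward(self, X):
        # X: (seq_len, n_features). The same weights are reused at every
        # time step -- this sharing is what lets RNNs handle sequences
        # of arbitrary length.
        h = np.zeros_like(self.b)
        for x_t in X:
            h = np.tanh(self.W @ x_t + self.U @ h + self.b)
        return h

    def predict_proba(self, X):
        h = self._forward(np.asarray(X, dtype=float))
        return 1.0 / (1.0 + np.exp(-(self.w_out @ h + self.b_out)))

    def predict(self, X):
        return int(self.predict_proba(X) >= 0.5)

# Usage: a length-5 sequence of 3-dimensional feature vectors.
clf = TinyRNNClassifier(n_features=3, n_hidden=8)
proba = clf.predict_proba(np.ones((5, 3)))
label = clf.predict(np.ones((5, 3)))
```

A real library would add backpropagation through time, embedding layers for text, and gated units (LSTM/GRU) for longer sequences; the scikit-learn-style surface, though, looks much like the `predict`/`predict_proba` pair above.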
Related:
- Top /r/MachineLearning Posts, May: Unreasonable Effectiveness of Recurrent Neural Networks, Time-Lapse Mining
- Deep Learning RNNaissance, an insightful, comprehensive, and entertaining overview
- Top KDnuggets tweets, Jan 14-15: 10 FB likes predicts personality better than a co-worker; A Deep Dive into Recurrent Neural Nets