
Learn about Artificial Neural Networks, Deep Learning, Recurrent Neural Networks and LSTMs like never before and use NLP to build a Chatbot!



By Jaime Zornoza, Universidad Politecnica de Madrid


Ever fantasized about having your own personal assistant to answer any question you ask, or to have conversations with? Well, thanks to Machine Learning and Deep Neural Networks, this is not so far from happening. Think of the amazing capabilities exhibited by Apple’s Siri or Amazon’s Alexa.

Don’t get too excited: in this series of posts we are not going to create an omnipotent Artificial Intelligence. Instead, we will create a simple chatbot that, given some input information and a question about that information, responds to yes/no questions about what it has been told.

It is nowhere near Siri’s or Alexa’s capabilities, but it illustrates very well how amazing results can be obtained even with very simple deep neural network structures. In this post we will learn about Artificial Neural Networks, Deep Learning, Recurrent Neural Networks and Long Short-Term Memory Networks. In the next post we will use them on a real project to build a question-answering bot.

Before we start with all the fun regarding Neural Networks, I want you to first take a close look at the following image. In it there are two pictures: one of a school bus driving along a road, and one of an ordinary living room, each with a description attached by human annotators.

Figure: Two different images with a text description made by human annotators.

 

Done? Let’s get on with it then!

 

The beginning — Artificial Neural Networks

 
To construct the neural network model that will be used to create the chatbot, we will use Keras, a very popular Python library for neural networks. However, before going any further, we first have to understand what an Artificial Neural Network or ANN is.

ANNs are Machine Learning models that try to mimic the functioning of the human brain, whose structure is built from a large number of neurons connected to one another — hence the name “Artificial Neural Networks”.

 

The Perceptron

 
The simplest ANN model is composed of a single neuron, and goes by the Star-Trek-sounding name Perceptron. It was invented in 1957 by Frank Rosenblatt, and it consists of a simple neuron which takes the weighted sum of its inputs (which in a biological neuron would be the dendrites), applies a mathematical function to it, and outputs the result (the output would be the equivalent of the axon of a biological neuron). We won’t dive into the details of the different functions that can be applied here, as the intention of the post is not to become experts, but rather to get a basic understanding of how a neural network works.

Figure: A single neuron, with the inputs on the left, the weights that multiply each input, and the neuron itself, which applies a function to the weighted sum of the inputs and outputs the result.
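As a minimal sketch of the idea (with hand-picked, illustrative weights rather than learned ones), a perceptron fits in a few lines of Python:

```python
# A perceptron: a weighted sum of the inputs passed through a step function.
def perceptron(inputs, weights, bias):
    weighted_sum = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 if weighted_sum > 0 else 0  # step activation

# With these hand-picked weights the neuron computes a logical AND.
and_weights = [1.0, 1.0]
and_bias = -1.5

print(perceptron([1, 1], and_weights, and_bias))  # 1
print(perceptron([1, 0], and_weights, and_bias))  # 0
```

Real networks replace the hard step with smooth functions such as the sigmoid or tanh, so that the weights can be adjusted gradually during training.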

 

These individual neurons can be stacked on top of each other forming layers of the size that we want, and then these layers can be sequentially put next to each other to make the network deeper.
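To make this stacking concrete, here is a small sketch of a forward pass through such a network using NumPy. The layer sizes and random weights are purely illustrative; in the next post we will let Keras handle all of this:

```python
import numpy as np

rng = np.random.default_rng(0)

def layer(inputs, weights, biases):
    # One layer: weighted sums of the inputs followed by a sigmoid activation.
    return 1 / (1 + np.exp(-(inputs @ weights + biases)))

# Illustrative sizes: 3 inputs -> 4 hidden neurons -> 2 outputs.
x = rng.normal(size=3)
W1, b1 = rng.normal(size=(3, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 2)), np.zeros(2)

hidden = layer(x, W1, b1)       # hidden layer
output = layer(hidden, W2, b2)  # output layer
print(output.shape)  # (2,)
```

Making the network deeper is just a matter of inserting more weight matrices and calling `layer` once per layer, each call feeding on the previous one.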

When networks are built in this way, the neurons that don’t belong to the input or output layers are considered part of the hidden layers, whose name depicts one of the main characteristics of an ANN: they are almost black-box models; we understand the mathematics behind what happens, and have some intuition of what goes on inside the black box, but if we take the output of a hidden layer and try to make sense of it, we will probably scratch our heads and achieve no positive results.

Still, they produce amazing results, so nobody complains about their lack of interpretability.

Figure: A larger neural network, composed of many individual neurons and layers: an input layer, 2 hidden layers and an output layer.

 

Neural network structures and how to train them have been known for more than two decades. What is it, then, that has led to all the fuss and hype about ANNs and Deep Learning going on right now? The answer to this question is just below, but first we have to understand what Deep Learning really is.

 

What is Deep Learning then?

 
Deep Learning, as you might guess from the name, is just the use of many layers to progressively extract higher-level features from the data that we feed to the neural network. It is as simple as that: the use of multiple hidden layers to enhance the performance of our neural models.

Now that we know this, the answer to the question above is pretty easy: scale. Over the last two decades the amount of available data of all sorts, and the power of our data storing and processing machines (yes, computers), have increased exponentially.

These computing capabilities and the massive increases in the amount of available data to train our models with have allowed us to create larger, deeper neural networks, which just perform better than smaller ones.

Andrew Ng, one of the world’s leading experts in Deep Learning, makes this clear in this video. In it, he shows an image similar to the following one, and with it he explains the benefits of having more data to train our models with, and the advantage of large neural networks over other Machine Learning models.

Figure: The evolution of the performance of different algorithms as we feed them more training data.

 

For traditional Machine Learning algorithms (linear or logistic regression, SVMs, Random Forests and so on), performance increases as we train the models with more data, up to a certain point, after which it stops improving no matter how much more data we feed in. When this point is reached, it is as if the model does not know what to do with the additional data, and its performance cannot be improved any further by providing more of it.

With neural networks, on the other hand, this never happens. Performance almost always increases with data (if the data is of good quality, of course), and it does so at a faster pace depending on the size of the network. Therefore, if we want the best possible performance, we need to be somewhere on the green line (Large Neural Network) and towards the right of the X-axis (a high amount of data).

Aside from this, there have also been some algorithmic improvements, but the main factor that has led to the magnificent rise of Deep Learning and Artificial Neural Networks is just scale: scale of computing and scale of data.

Another important personality in the field, Jeff Dean (one of the instigators of the adoption of Deep Learning within Google), says the following about deep learning:

When you hear the term deep learning, just think of a large deep neural net. Deep refers to the number of layers typically and so this is kind of the popular term that’s been adopted in the press. I think of them as deep neural networks generally.

When talking about Deep Learning, he highlights the scalability of neural networks, indicating that results get better with more data and larger models, which in turn require more computation to train, just as we have seen before.

 

Okay, perfect, I understood all that, but how do Neural Networks actually learn?

 
Well, you probably have guessed already: they learn from data.

Remember the weights that multiplied our inputs in the single perceptron? Well, these weights are also included in every edge that joins two different neurons. This means that in the image of a larger neural network they are present in every single one of the black edges, taking the output of one neuron, multiplying it, and then giving it as input to the neuron that the edge is connected to.

Figure: A neural network with two hidden layers and the weights between each of the layers.

 

When we train a neural network (training a neural network is the ML expression for making it learn) we feed it a set of known data (in ML this is called labelled data), have it predict a characteristic that we know about such data (like whether an image represents a dog or a cat), and then compare the predicted result to the actual one.

As this process goes on and the network makes mistakes, it adapts the weights of the connections between the neurons to reduce the number of mistakes it makes. Because of this, as shown before, giving the network more and more data will most of the time improve its performance.
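As a toy illustration of this loop (labelled data in, predictions compared to the labels, weights nudged to reduce the error), here is a single sigmoid neuron learning a logical AND. The learning rate and iteration count are arbitrary choices, and real frameworks like Keras automate all of this for full networks:

```python
import numpy as np

rng = np.random.default_rng(1)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([0., 0., 0., 1.])  # labels: logical AND of the two inputs

w, b, lr = rng.normal(size=2), 0.0, 1.0
for _ in range(2000):
    pred = 1 / (1 + np.exp(-(X @ w + b)))  # predict on the labelled data
    error = pred - y                       # compare with the true labels
    w -= lr * (X.T @ error) / len(X)       # adapt the weights...
    b -= lr * error.mean()                 # ...to reduce the mistakes

final = (1 / (1 + np.exp(-(X @ w + b)))).round()
print(final.tolist())  # [0.0, 0.0, 0.0, 1.0]
```

After training, the rounded predictions match the labels: the neuron has learned the AND function purely from examples.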

 

Learning from sequential data — Recurrent Neural Networks

 
Now that we know what Artificial Neural Networks and Deep Learning are, and have a slight idea of how neural networks learn, let’s start looking at the type of network that we will use to build our chatbot: Recurrent Neural Networks, or RNNs for short.

Recurrent Neural Networks are a special kind of neural network designed to deal effectively with sequential data. This kind of data includes time series (a list of values of some parameter over a certain period of time), text documents, which can be seen as a sequence of words, or audio, which can be seen as a sequence of sound frequencies.

The way RNNs do this is by taking the output of each neuron and feeding it back to it as an input. By doing this, a neuron not only receives new pieces of information at every time step, but also adds a weighted version of its previous output to them. This gives these neurons a kind of “memory” of the previous inputs they have had, as those inputs are somehow quantified by the output being fed back into the neuron.

Figure: A recurrent neuron, where the output is multiplied by a weight and fed back into the input.
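A single recurrent neuron can be sketched like this (scalar weights, hand-picked for illustration). Note how the influence of the first input fades over the following steps, which is exactly the forgetting problem discussed below:

```python
import numpy as np

# One recurrent neuron: each new output mixes the current input with a
# weighted copy of the previous output. Weights are chosen for illustration.
w_in, w_rec = 0.5, 0.8

def rnn_step(x_t, h_prev):
    return float(np.tanh(w_in * x_t + w_rec * h_prev))

sequence = [1.0, 0.0, 0.0, 0.0]  # a single "event" followed by silence
h, states = 0.0, []
for x_t in sequence:
    h = rnn_step(x_t, h)
    states.append(h)

print([round(s, 3) for s in states])  # the memory of the first input fades
```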

 

Cells that are a function of inputs from previous time steps are also known as memory cells.

The problem with RNNs is that, as time passes and they are fed more and more new data, they start to “forget” the data they saw earlier, as it gets diluted between the new data, the transformations of the activation function, and the weight multiplications. This means they have a good short-term memory, but a slight problem when trying to remember things that happened a while ago (data they saw many time steps in the past).

We need some sort of long-term memory, which is just what LSTMs provide.

 

Enhancing our memory — Long Short-Term Memory Networks

 
Long Short-Term Memory networks, or LSTMs, are a variant of RNNs that solves the long-term memory problem of the former. We will end this post by briefly explaining how they work.

They have a more complex cell structure than a normal recurrent neuron, which allows them to better regulate how to learn from, or forget, the different input sources.

Figure: Representation of an LSTM cell. Don’t pay attention to the blue circles and boxes; as you can see, it has a way more complex structure than a normal RNN unit, and we won’t go into it in this post.

 

An LSTM neuron can do this by incorporating a cell state and three different gates: the input gate, the forget gate and the output gate. At each time step, the cell can decide what to do with the state vector: read from it, write to it, or delete it, thanks to an explicit gating mechanism. With the input gate, the cell can decide whether to update the cell state or not; with the forget gate, it can erase its memory; and with the output gate, it can decide whether to make the output information available or not.
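A scalar sketch of one LSTM step, with the three gates named above (real cells use weight matrices over vectors; the random scalar weights here are purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, p):
    i = sigmoid(p["wi"] * x_t + p["ui"] * h_prev)        # input gate: write?
    f = sigmoid(p["wf"] * x_t + p["uf"] * h_prev)        # forget gate: erase?
    o = sigmoid(p["wo"] * x_t + p["uo"] * h_prev)        # output gate: expose?
    c_tilde = np.tanh(p["wc"] * x_t + p["uc"] * h_prev)  # candidate state
    c = f * c_prev + i * c_tilde  # keep part of the old state, write new info
    h = o * np.tanh(c)            # output a gated view of the cell state
    return float(h), float(c)

params = {k: float(rng.normal())
          for k in ["wi", "ui", "wf", "uf", "wo", "uo", "wc", "uc"]}
h, c = 0.0, 0.0
for x_t in [1.0, 0.5, -0.3]:
    h, c = lstm_step(x_t, h, c, params)
print(-1.0 < h < 1.0)  # True: the output is bounded by the tanh
```

Because the forget gate multiplies the previous cell state directly, the cell can carry information across many time steps almost unchanged, which is what gives LSTMs their long-term memory.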

LSTMs also mitigate the problems of exploding and vanishing gradients, but that is a story for another day.

That’s it! Now we have a superficial understanding of how these different kinds of neural networks work, and we can put it to use to build our first Deep Learning project!

 

Conclusion

 
Neural Networks are awesome. As we will see in the next post, even a very simple structure with just a few layers can create a very competent chatbot.

Oh, and by the way, remember this image?

Figure: Two different images with a short text description made by a neural network.

 

Well, just to prove how cool Deep Neural Networks are, I have to admit something: I lied about how the descriptions for the images were produced.

At the beginning of the post I said that those descriptions were made by human annotators; the truth is that these short texts describing what can be seen in each image were actually produced by an Artificial Neural Network.

Insane right?

If you want to learn how to use Deep Learning to create an awesome chatbot, follow me on Medium, and stay tuned for my next post!

Until then, take care, and enjoy AI!

 

Additional Resources:

 
As the explanations of the different concepts described in this post have been very superficial, here are some fantastic additional resources for any of you who want to go further and keep learning.

Okay, that is all; I hope you liked the post. Feel free to connect with me on LinkedIn or follow me on Twitter at @jaimezorno. You can also take a look at my other posts on Data Science and Machine Learning here. Have a good read!

 
Bio: Jaime Zornoza is an Industrial Engineer with a bachelor specialized in Electronics and a Masters degree specialized in Computer Science.

Original. Reposted with permission.
