Question: Please give a conclusion on the differences between the two neural networks, based on the excerpt below.

The network learns the received data until it is able to provide an output that is in accordance with the set criteria (the number of training epochs or the error tolerance).

Application of neural networks

In this study we used the popular pattern recognition approach, which is based on a feedforward neural network and the backpropagation learning method, as well as a probabilistic neural network (PNN). The PNN is a multilayered feedforward network derived from the Bayesian network (a probabilistic directed acyclic graphical model) and kernel Fisher discriminant analysis, and it has three layers. PNNs are often used for classification problems because of their ability to map inputs to outputs even when the relation between them is hard to define. They also train very fast, because training is performed in a single pass over each training instance rather than in several passes, as with recurrent networks.

The architecture of a probabilistic neural network is composed of an input, a radial basis, a competitive and an output layer. The radial basis layer records the difference between the input vector and the training vectors; in this way a vector is obtained whose elements show how close the inputs are to the training inputs. The next layer creates the sum of these contributions for each category and produces a new vector of probabilities describing how strongly each instance belongs to a specific group. Finally, a complete transfer function returns 1 for the class the instance belongs to and 0 for the other classes.

Feedforward neural network

A multilayer perceptron network for pattern recognition usually consists of three layers (input, hidden and output), but this depends on the complexity of the problem; sometimes a more complex network with more than one hidden layer is needed. The numbers of neurons in the input and output layers are determined by the concrete data and its structure, and they are easy to define for each classification problem. The number of neurons in the hidden layer, on the other hand, depends on the complexity of the problem. Even though there are many different opinions in the literature on how to compute the number of hidden neurons, there is no generally accepted formula.

Pattern recognition is based on the backpropagation supervised learning algorithm. Backpropagation is very successful in classification problems (Chaudhari, 2010) and it is the most widely used method for training neural networks. During learning, the algorithm seeks the minimum of the error function using the method of gradient descent: the difference between the target values and the resulting output is computed and sent back through the network in order to adjust the weights of each connection. Training the network is based on minimizing this difference through an iterative process.
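To make the PNN architecture described above concrete, here is a minimal sketch in Python/NumPy of the radial basis layer, the per-class summation and the winner-takes-all output. It is an illustration only: the function name pnn_predict, the Gaussian kernel and the spread parameter sigma are assumptions of this sketch, not details taken from the study.

import numpy as np

def pnn_predict(X_train, y_train, X_new, sigma=0.5):
    # Minimal PNN sketch (illustrative, not the authors' code).
    classes = np.unique(y_train)
    predictions = []
    for x in X_new:
        # Radial basis layer: how close the new input is to every training input
        dist_sq = np.sum((X_train - x) ** 2, axis=1)
        activations = np.exp(-dist_sq / (2.0 * sigma ** 2))
        # Summation layer: average contribution per category
        scores = np.array([activations[y_train == c].mean() for c in classes])
        # Competitive layer: the class with the largest score wins
        predictions.append(classes[np.argmax(scores)])
    return np.array(predictions)

# Toy usage with random data: five inputs per instance, labels 0 and 1
X_train = np.random.rand(40, 5)
y_train = np.array([0] * 20 + [1] * 20)
print(pnn_predict(X_train, y_train, np.random.rand(3, 5)))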
6. ANALYSIS OF OBTAINED RESULTS FOR NEURAL NETWORKS

In order to predict the probability of company bankruptcy we created two neural networks: a multilayer perceptron and a PNN. The inputs to both networks are the financial ratios from Altman's Z model, and the output is the class to which the company belongs; the class indicates whether the company is active or bankrupt. As already mentioned, the data set consists of 52 companies registered in Serbia, of which 26 are active and 26 are bankrupt. Since Altman's Z model uses five financial ratios, there are five neurons in the input layer; the output layer has two neurons, because each company is classified as either active or bankrupt. For the hidden layer, we decided to use 10 neurons in a single layer.

Pattern recognition NN

First, we created a pattern recognition neural network that used scaled conjugate gradient for training. The data set was randomly divided into three groups: one for training, one for validation and one for testing the network's performance. Our goal was to create a new model that could better capture the correlation of Altman's five financial ratios. After training the pattern recognition neural network ten times, we obtained an average training error of 0.03172, computed with the method of least squared error. The network was trained so that, for new data, it could predict the probability that a company is going to go bankrupt; it successfully predicted bankruptcy for 82% of the new companies, with a variance of 0.78. Looking closely at the results, we can see that the pattern recognition network mainly made mistakes when classifying companies that lie in Altman's gray zone. From this we can conclude that pattern recognition assigned a similar significance to the financial ratios as Altman's Z score does.

Probabilistic neural network

The second neural network we created was a probabilistic neural network. Its average success rate over 10 trainings, when the network is used for predictions on new data, is 90.7%, with a variance of 0.55.

Interesting comparisons
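The pattern recognition experiment described above (one hidden layer with 10 neurons, a random split of the data, and results averaged over 10 training runs) could be reproduced roughly with the sketch below. The study appears to use scaled conjugate gradient training, which scikit-learn does not offer, so the Adam optimizer is substituted here; the function name evaluate_pattern_net, the split sizes and the hyperparameter values are assumptions for illustration, not the authors' settings.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

def evaluate_pattern_net(X, y, n_runs=10, seed=0):
    # X: (n_companies, 5) matrix of Altman's five financial ratios
    # y: class labels (0 = active, 1 = bankrupt)
    rng = np.random.RandomState(seed)
    accuracies = []
    for _ in range(n_runs):
        # Random split into training and test data; early_stopping below
        # reserves part of the training data as a validation set.
        X_tr, X_te, y_tr, y_te = train_test_split(
            X, y, test_size=0.3, random_state=rng.randint(10000))
        net = MLPClassifier(hidden_layer_sizes=(10,),  # one hidden layer, 10 neurons
                            solver="adam",             # substitute for scaled conjugate gradient
                            max_iter=2000,
                            early_stopping=True,
                            random_state=rng.randint(10000))
        net.fit(X_tr, y_tr)
        accuracies.append(net.score(X_te, y_te))       # classification accuracy on new data
    return float(np.mean(accuracies)), float(np.var(accuracies))

Given the 52 x 5 matrix of ratios as X and the active/bankrupt labels as y, the returned mean and variance correspond in role to the 82% accuracy and 0.78 variance quoted above, though the exact figures depend on the solver and the split.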
