
Swarm Optimization with Neural Networks for Effective Classification Techniques

Dr. K. Kalyani

Assistant Professor, PG and Research Department of Computer Science, Marudupandiyar College, Thanjavur - 613 403, Tamil Nadu (Affiliated to Bharathidasan University, Tiruchirappalli)

Abstract: Swarm intelligence is the cooperative behaviour of collective natural systems, exemplified by ant colony optimization (ACO), fish schooling, bird flocking, bee colony optimization (BCO) and particle swarm optimization (PSO). In this paper, a hybrid approach for data classification and information extraction is proposed. The Honey Bee Mating Optimization (HBMO) algorithm combined with an Artificial Neural Network can be regarded as a distinctive swarm-based optimization in which the search is inspired by the mating behaviour of real honey bees: the iterative mating process is mimicked and suitable drones are selected for the mating progression through an enriched fitness function, which in turn selects the best weights for the hidden layers of the neural network classifier. The Extended HBMO with Neural Network (EHBMO-NN) algorithm is applied to classify data efficiently by training the neural network, and its classification accuracy is compared with that of several other methods. The EHBMO-NN procedure is presented and verified on a few benchmark instances. The improved Honey Bee Mating Optimization combined with a neural network increases accuracy and reduces time delay on a number of complex real-world datasets.

Key words: Swarm Intelligence, Extended Honey Bee Mating Optimization, Fitness Function, Effective Classification, Neural Network.

I. INTRODUCTION

The Honey Bee Mating Optimization (HBMO) algorithm is a population-based optimization technique in which the search process imitates the mating behaviour of honey-bee colonies. The HBMO algorithm therefore belongs to the general field of swarm intelligence, but the mating process, which relies on crossover and mutation operators, also relates it closely to evolutionary computing [1]. The fundamentals of the HBMO algorithm are briefly described below, based on the general models presented in the literature. A honey-bee colony comprises a single queen bee, some thousands of drones and several tens of thousands of worker bees. The queen is specialised in egg laying and lives up to 5 or 6 years, whereas drones and worker bees live no longer than 6 months. Drones, regarded as the fathers of the colony, mate with the queen and then die. They are haploid and act to transmit the genome inherited from their mother to the next generation without changing its genetic composition, except through mutations. The evolutionary phase of the algorithm starts with the mating flight of the colony [2]. During the mating flight the queen mates with drones to form a genetic pool, called the spermatheca, which contains the chromosomes the queen receives from the drones [3]. The second phase of the evolutionary process begins after the spermatheca has been filled and consists of breeding eggs with genetic material from the spermatheca, based on crossover operations between chromosomes. The final stage consists of culling the broods according to the fitness function evaluated during the second stage and creating a new generation of bees by means of mutation operators [4].

An investigation of the prevailing technique, based on the results it produces, showed that it suffers from scalability problems, loss of precision and excessive time consumption in the case of large datasets [5]. To address this concern, our work concentrates on the approaches discussed here and on improving the fitness-function evaluation in Extended Honey Bee Mating Optimization (EHBMO). Our objectives are to resolve the complexity and scalability issues encountered on real-world datasets and to improve the efficiency of data classification. The weights are optimized through the evaluation of an enhanced fitness function. The results obtained from Extended Honey Bee Mating Optimization with a Neural Network avoid scalability issues on large datasets, reduce time consumption and provide better accuracy and performance.

In our work, the University of California, Irvine (UCI) Machine Learning repository is used. Several other researchers have used this repository for honey-bee-mating optimization studies, with datasets such as Iris, Wine, Heart Disease, Cancer, Diabetes and Soybean. The UCI datasets are used here to evaluate running time, accuracy and efficiency over a variety of data, and the sources of the datasets are distinct from one another. The repository has been widely used by scholars, educators and scientists all over the world as a primary source of machine learning datasets, and its contents are commonly regarded as standard benchmarks.

II. NEURAL NETWORKS

Neural networks are known to be a flexible and fast technique for output prediction.

The multilayer perceptron (MLP) is regarded as a class of feed-forward networks comprising several layers of computational units [6]. Each unit is a representation of a neuron and consists of a linear or non-linear activation function. The structure of the network is a directed, multi-layer graph in which the neurons of each layer are fully connected to the neurons of the subsequent layer [7].

The weights of the network are usually initialized randomly and are progressively adjusted at each iteration of the training process so that the network learns a target function. Training such a network means that each input, together with its corresponding output, is presented to the network [8], and the learning algorithm adjusts the weights in all layers so that the error between the computed output and the correct output becomes small. The best-known and most popular learning procedure is back-propagation [9], a learning algorithm that reduces the overall error of the network using an optimization method called gradient descent.

After an input is presented to the first layer of the network, the error function is evaluated at the output layer. The resulting weight corrections are then propagated backwards until the weights of the first layer have been adjusted [10]. This procedure is repeated for every record of the training data in each epoch. The best hyperparameters of the NN model (the type of activation function in the hidden layer and the number of hidden-layer neurons) are determined alongside the training of the network in each iteration.
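To make the training procedure concrete, the following is a minimal sketch (in Python/NumPy, not the authors' code) of a one-hidden-layer MLP trained by back-propagation with gradient descent; the sigmoid activation, layer sizes and learning rate are illustrative assumptions.

```python
# Minimal MLP with one hidden layer, trained by batch gradient descent.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_mlp(X, y, n_hidden=8, lr=0.1, epochs=500, seed=0):
    """X: (n_samples, n_features); y: (n_samples, n_outputs) one-hot targets."""
    rng = np.random.default_rng(seed)
    # Weights are initialized randomly and adjusted iteratively during training.
    W1 = rng.normal(scale=0.5, size=(X.shape[1], n_hidden))
    W2 = rng.normal(scale=0.5, size=(n_hidden, y.shape[1]))
    for _ in range(epochs):
        # Forward pass: each layer is fully connected to the next.
        h = sigmoid(X @ W1)
        out = sigmoid(h @ W2)
        # Backward pass: propagate the output error back through the layers.
        delta_out = (out - y) * out * (1 - out)      # delta at the output layer
        delta_hid = (delta_out @ W2.T) * h * (1 - h) # delta at the hidden layer
        # Gradient-descent weight updates.
        W2 -= lr * h.T @ delta_out
        W1 -= lr * X.T @ delta_hid
    return W1, W2
```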

III. HBMO COMBINED WITH NEURAL NETWORK (NN)

The HBMO-NN algorithm is employed to optimize the weights of the network. The weights are optimized through the evaluation of a fitness function. The HBMO algorithm operates on the bee population using crossover and mutation operators, and bees are replaced according to the fitness computed after the crossover and mutation steps. The fitness function of HBMO is then extended to obtain EHBMO: HBMO uses F0 and F1 as fitness functions, whereas EHBMO adds two further functions, F2 and F3.


Fitness F0 is the sum of the Euclidean distances of the worker bees to their drone and of the drone to the queen bee. Fitness F1 is the ratio of the average energy sustainability of the worker bees to that of their drone. Fitness F2 is the ratio of the average Euclidean distance of the drone to the queen to the sum of the Euclidean distances of all worker bees to the queen. Fitness F3 filters the input particles against a threshold value of the worker bees and drones. Using these functions, an improved bee optimization technique, Extended Honey Bee Mating Optimization, is combined with a Neural Network to form the hybrid algorithm EHBMO-NN, which improves accuracy and reduces time delay on numerous complex real-world datasets.

• The HBMO process is initiated with the mating flight, during which a queen (the best solution) selects drones probabilistically to form the spermatheca.

• A drone is then selected from the spermatheca at random for the creation of broods.

• A number of new broods are created by crossing over the drones' genotypes with the queen's.

• Workers are used to conduct a local search on the broods. The workers' fitness is updated according to the amount of improvement achieved on the broods, and weaker queens are replaced by fitter broods.

The procedure begins with three user-defined parameters and one predefined parameter. The predefined parameter is the number of workers, representing the number of heuristics provided in the package. The three user-defined parameters are the number of queens, the queen's spermatheca size and the number of broods born to all queens. A minimal sketch of this loop is given below.
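As a hedged illustration of the loop just described (not the authors' implementation), the sketch below encodes candidate solutions as real-valued vectors, fills the spermatheca during a simulated mating flight, creates broods by crossover and mutation, improves them with simple worker heuristics and replaces the queen when a fitter brood appears; all parameter values and helper names are assumptions.

```python
# HBMO-style search loop for minimizing a generic objective over real vectors.
import numpy as np

def hbmo(fitness, dim, n_drones=30, spermatheca_size=10, n_broods=20,
         n_flights=50, seed=0):
    """Minimize `fitness` over real vectors of length `dim` with an HBMO-style loop."""
    rng = np.random.default_rng(seed)
    drones = rng.uniform(-1.0, 1.0, size=(n_drones, dim))
    queen = min(drones, key=fitness).copy()             # best solution found so far
    # Workers: a few fixed local-search heuristics (here, Gaussian perturbations).
    workers = [lambda x, s=s: x + rng.normal(0.0, s, dim) for s in (0.01, 0.05, 0.1)]
    for _ in range(n_flights):
        # Mating flight: the queen probabilistically accepts encountered drones
        # into the spermatheca; her speed (acceptance tolerance) decays over time.
        speed, spermatheca = 1.0, []
        for d in drones[rng.permutation(n_drones)]:
            if len(spermatheca) == spermatheca_size:
                break
            if rng.random() < np.exp(-abs(fitness(queen) - fitness(d)) / speed):
                spermatheca.append(d)
            speed *= 0.9
        if not spermatheca:                              # ensure at least one mate
            spermatheca.append(drones[rng.integers(n_drones)])
        # Brood creation: crossover between the queen and a randomly chosen drone,
        # followed by mutation and improvement by a randomly chosen worker.
        broods = []
        for _ in range(n_broods):
            mate = spermatheca[rng.integers(len(spermatheca))]
            mask = rng.random(dim) < 0.5
            brood = np.where(mask, queen, mate)          # uniform crossover
            brood = brood + rng.normal(0.0, 0.01, dim)   # mutation
            brood = workers[rng.integers(len(workers))](brood)
            broods.append(brood)
        # Replace the queen if a fitter brood was produced; regenerate drones.
        best_brood = min(broods, key=fitness)
        if fitness(best_brood) < fitness(queen):
            queen = best_brood.copy()
        drones = rng.uniform(-1.0, 1.0, size=(n_drones, dim))
    return queen
```

For example, `hbmo(lambda x: float((x ** 2).sum()), dim=5)` would search for the minimum of a simple quadratic objective.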

HBMO-Neural Networks Algorithm

Step 1: Input neuron n is fed with the training data xm and the desired target yn.

Step 2: The training data xm undergoes an iterative process; in each iteration the weight of each node, w(n), is computed.

Step 3: The bias or error of each node is computed as a delta function.

Step 4: The weight of each node, w(n), is computed from the delta value and the input data xm; w(n) denotes the weights of the connections between the network input xm and neuron n in the input layer, and yn denotes the output signal of neuron n.

Step 5: The weights w(n) of the hidden layer are adjusted by the Extended Honey Bee Mating Optimization process.

Step 6: The target output is examined and passed through the back-propagation scheme until a bias-reduced effective output is reached.
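A hedged sketch of how Steps 1-6 could be wired together is given below: the hidden-layer weight matrix is flattened into a vector and tuned by the bee optimizer (reusing the `hbmo` sketch above), with the classification error on the training data serving as the fitness to be minimized. The function names, layer sizes and the choice to keep the output-layer weights fixed are illustrative assumptions, not the authors' exact method.

```python
# Coupling the bee optimizer to the neural network's hidden-layer weights.
import numpy as np

def nn_fitness(w_vec, X, y_onehot, n_hidden, W_out):
    """Classification error of the MLP for one candidate hidden-layer weight vector."""
    W_hid = w_vec.reshape(X.shape[1], n_hidden)
    h = 1.0 / (1.0 + np.exp(-(X @ W_hid)))       # Steps 2/4: hidden-layer activations
    out = 1.0 / (1.0 + np.exp(-(h @ W_out)))     # Step 6: network output
    return float(np.mean(np.argmax(out, axis=1) != np.argmax(y_onehot, axis=1)))

def train_ehbmo_nn(X, y_onehot, n_hidden=8, seed=0):
    """Step 5: tune the hidden-layer weights with the bee optimizer (`hbmo` above)."""
    rng = np.random.default_rng(seed)
    W_out = rng.normal(scale=0.5, size=(n_hidden, y_onehot.shape[1]))  # kept fixed here
    best = hbmo(lambda w: nn_fitness(w, X, y_onehot, n_hidden, W_out),
                dim=X.shape[1] * n_hidden, seed=seed)
    return best.reshape(X.shape[1], n_hidden), W_out
```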

Here f0 is the sum of the Euclidean distances of the worker bees to their drone and of the drone to the queen bee; Br is the replacement of bees in the current round; αl (l = 1, ..., β) is the number of worker bees; β is the number of clusters; dw(b),d is the Euclidean distance from worker bee i in cluster j to its drone; and dd,Q(B) is the Euclidean distance from the j-th drone to the queen bee. Function f1 is the ratio of the average energy of the worker bees to that of their drone. Function f2 is the ratio of the average Euclidean distance of the drone to the queen bee to the sum of the Euclidean distances of all worker bees to the queen bee. Function f3 filters the input particles against a threshold value of the worker bees (α1, ..., αn) and drones (β1, ..., βn), so that worker bees are eliminated based on this threshold, which reduces the number of iterations and improves energy efficiency.


The constants A, B, C and D are predefined weights for the contribution of each of the sub-objectives, with A + B + C + D = 1. The fitness function defined above has the objective of simultaneously minimizing the intra-cluster distance between worker bees and the drone, as quantified by f0; maximizing the cluster head's energy in its cluster, as quantified by f1; producing clusters of unequal size, as quantified by f2; and optimizing the energy dissipation in the clusters, as quantified by f3. According to the fitness function, small values of f0 and f1 indicate compact clusters with an optimal set of worker bees that have sufficient energy to perform the drone tasks. A small value of f2 means that the clusters located closer to the base station are smaller, and a small value of f3 indicates that the resulting clusters are more energy efficient.
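The weighted combination described here can be written compactly; since the constituent equations for f0-f3 are not reproduced in the text, the sketch below assumes they have already been computed and only illustrates the weighted sum with A + B + C + D = 1 (the weight values shown are placeholders).

```python
# Hypothetical helper: combine the four sub-objectives described in the text.
def combined_fitness(f0, f1, f2, f3, A=0.4, B=0.3, C=0.2, D=0.1):
    """Weighted sum of the sub-objectives; A, B, C, D are predefined and sum to 1."""
    assert abs((A + B + C + D) - 1.0) < 1e-9, "weights A..D must sum to one"
    return A * f0 + B * f1 + C * f2 + D * f3
```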

IV. EXPERIMENTAL RESULTS AND DISCUSSION

Table-1: Input Dataset Statistics

Dataset Used        Iris    Liver   Cancer   Diabetes   Arrhythmia
No. of Instances    150     345     32       768        452
No. of Classes      3       2       2        2          16
No. of Attributes   5       78      57       9          280

Table-2: Data analysis using the HBMO-NN algorithm with the standard repository datasets.

Dataset Used   TP Rate   FP Rate   Precision   Recall   F-Measure   MCC     ROC Area   Accuracy    No. of Classes
Iris           0.987     0.011     0.968       0.944    0.881       0.91    1.005      1.000547    3
Liver          0.739     0.281     0.77        0.739    0.711       0.405   0.751      1.052125    2
Cancer         0.69      0.24      0.738       0.5      0.726       0.315   0.66       1.028441    2

Table-3: Data analysis using the EHBMO-NN algorithm with the standard repository datasets.

Dataset Used   TP Rate   FP Rate   Precision   Recall   F-Measure   MCC     ROC Area   Accuracy    No. of Classes
Iris           0.989     0.004     1.008       0.987    0.989       0.97    1.087      1.001585    3
Liver          0.797     0.214     0.89        0.799    0.788       0.469   0.784      1.065278    2
Cancer         0.9       0.4       0.789       0.8      0.768       0.367   0.81       1.050737    2

Figure-1 illustrates that the accuracy of HBMO-NN is 75% while that of EHBMO-NN is 95%.

Figure-2 illustrates that the running time of HBMO-NN is 15 min while that of EHBMO-NN is 9.4 min.

Figure-3 illustrates that the performance efficiency of HBMO-NN is 70% while that of EHBMO-NN is 97%.


V. CONCLUSION

In this research, the efficiency of HBMO-NN on data classification tasks was examined first; the results obtained show that HBMO-NN causes scalability problems in the case of large datasets. To address this scalability issue, the research focuses on the alternatives examined and on improving the fitness-function evaluation in the Extended HBMO-NN. Our objectives are to solve the complexity and scalability issues on real-world datasets and to improve the efficiency of data classification. We implemented and compared EHBMO-NN with HBMO-NN and conclude from the results that EHBMO-NN obtains competitive results on the real-world datasets used, although with some growth in the computational effort required. Directions for future work include applying this method to more challenging data sources comprising continuous attributes.

References

1. Omid Bozorg Haddad and M. A. Mariño, "HBMO in Engineering Optimization", Iran University of Science and Technology, Department of Civil Engineering, Tehran, Iran, Ninth International Water Technology Conference, IWTC9 2005, Sharm El-Sheikh, Egypt, p. 1053.

2. L. Qingyong, S. Zhiping, S. Jun and S. Zhongzhi, "Swarm Intelligence Clustering Algorithm Based on Attractor", Lecture Notes in Computer Science, Springer, Vol. 3621, 2005, pp. 496-504.

3. Lavanya Gunasekaran and Srinivasan Subramaniam, "A Survey on Artificial Bee Colony Models for Numerical Optimizations and its Work in Image Segmentation and Data Classification", Australian Journal of Basic and Applied Sciences, ISSN 1991-8178.

4. Gopakumar, C., and S. Reshma, "Wavelet Based Analysis of ECG Signal for the Detection of Myocardial Infarction Using SVM Classifier", International Journal of Electronics and Communication Engineering 4.4 (2015): 9-16.

5. "An Enhanced K Means Clustering using Improved Hopfield Neural Network and Genetic Algorithm", International Journal of Recent Technology and Engineering (IJRTE), ISSN 2277-3878, Volume-2, Issue-3, July 2013.

6. Baris Yuce and Michael S. Packianather, "Honey Bees Inspired Optimization Method: The Bees Algorithm", Insects 2013, 4, 646-662; doi:10.3390/insects4040646.

7. K. Lenin and B. R. Reddy, "Bumble Bees Mating Optimization (BBMO) Algorithm for Solving Optimal Reactive Power Dispatch Problem", International Journal of Electronics and Electrical Engineering, Vol. 3, No. 4, August 2015, Jawaharlal Nehru Technological University, Kukatpally, Hyderabad 500 085, India.

8. Khosravi, Moein, Milad Askari Hashemabadi, and Vajihe Sharifi Davarani, "Loss Reduction with Optimization of Capacitor Placement Using BFA Algorithm - Case Study for a 20 kV Network in Iran", International Journal of Electrical and Electronics Engineering (3) (2014): 69-80.

9. Manish Gupta, "Association Rules Optimization using Artificial Bee Colony Algorithm with Mutation", International Journal of Computer Applications (0975-8887), Volume 116, No. 13, April 2015, p. 29, Vikrant Institute of Technology & Management, Gwalior, India.

10. "Training a Feed-Forward Neural Network with Artificial Bee Colony Based Backpropagation Method", International Journal of Computer Science & Information Technology (IJCSIT), Vol. 4, No. 4, August 2012.

11. Yuksel Celik and Erkan Ulker, "An Improved Marriage in Honey Bees Optimization Algorithm for Single Objective Unconstrained Optimization", Hindawi Publishing, The Scientific World Journal, Volume 2013, Article ID 370172.

12. Kumari, K. Karuna, and P. V. Sridevi, "Phase-only Synthesis of Linear Microstrip Patch Antenna Array using Improved Local Search Particle Swarm Optimization", International Journal of Applied Engineering Research 12.6 (2017): 818-832.

13. Yannis Marinakis, Magdalene Marinaki and Nikolaos Matsatsinis, "A Hybrid Clustering Algorithm based on Honey Bees Mating Optimization and Greedy Randomized Adaptive Search Procedure", UCI Machine Learning Repository, https://archive.ics.uci.edu/ml/datasets.

14. Senthiil, P. V., V. A. Sirusshti, and T. Sathish, "Artificial Intelligence Based Green Manufacturability Quantification of a Unit Production Process", International Journal of Mechanical and Production Engineering Research and Development 9.2 (2019): 841-852.

15. Lonkar, Priyanka, and D. C. Mehetre, "Data Forwarding Based on Interest Using ABC Algorithm in SANs", International Journal of Computer Networking, Wireless and Mobile Communication 5 (2015).

16. Naidu, P. Sanyasi, Babita Bhagat, and Neha Rathi, "Particle Swarm Optimization Techniques: Principle, Comparison & Application", International Journal of Computer Science Engineering and Information Technology Research (IJCSEITR) 8.2, June 2018, 37-48.
