Communications on Applied Electronics
Foundation of Computer Science (FCS), NY, USA
Volume 4 - Issue 3
Published: January 2016
Authors: P. R. Deshmukh, Roshani Ade
P. R. Deshmukh, Roshani Ade. An Incremental Ensemble of Classifiers based on Hypothesis Strength and Ambiguity Grade. Communications on Applied Electronics. 4, 3 (January 2016), 10-14. DOI=10.5120/cae2016652040
@article{10.5120/cae2016652040, author = {P. R. Deshmukh and Roshani Ade}, title = {An Incremental Ensemble of Classifiers based on Hypothesis Strength and Ambiguity Grade}, journal = {Communications on Applied Electronics}, year = {2016}, volume = {4}, number = {3}, pages = {10-14}, doi = {10.5120/cae2016652040}, publisher = {Foundation of Computer Science (FCS), NY, USA} }
%0 Journal Article %D 2016 %A P. R. Deshmukh %A Roshani Ade %T An Incremental Ensemble of Classifiers based on Hypothesis Strength and Ambiguity Grade %J Communications on Applied Electronics %V 4 %N 3 %P 10-14 %R 10.5120/cae2016652040 %I Foundation of Computer Science (FCS), NY, USA
The massive amount of raw student data held by educational organizations can be converted into information, and the knowledge buried within it can be extracted for various student-related applications. Because the volume of student data in educational systems grows day by day, an incremental learning algorithm, unlike a classical batch learning algorithm, tries to forget unrelated information while training on fresh examples. Nowadays, combining classifiers, that is, taking more than one opinion into account, contributes substantially to obtaining more accurate results. Therefore, this paper proposes an incremental ensemble of two classifiers, Naïve Bayes and K-Star, using a voting scheme based on hypothesis strength and ambiguity grade. The voting rule proposed in this paper is compared with the existing majority voting rule on the student data set.
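The abstract does not give the exact formulas for hypothesis strength and ambiguity grade, so the following is only a minimal sketch of the general idea: each base classifier (stand-ins for Naïve Bayes and K-Star) casts a vote weighted by a per-classifier confidence score, in contrast to plain majority voting where every vote counts equally. The probabilities, weights, and helper names below are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def weighted_vote(prob_list, weights):
    """Combine per-classifier class-probability arrays using per-classifier
    weights (an illustrative stand-in for weighting by hypothesis strength
    and ambiguity grade; the paper's exact formula is not reproduced here)."""
    # prob_list: list of (n_samples, n_classes) arrays, one per classifier
    # weights:   list of non-negative scalars, one per classifier
    combined = sum(w * p for w, p in zip(weights, prob_list))
    return combined.argmax(axis=1)

def majority_vote(pred_list):
    """Plain majority vote over hard predictions, shown for comparison."""
    preds = np.stack(pred_list, axis=0)  # shape: (n_classifiers, n_samples)
    return np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, preds)

# Toy usage with two hypothetical classifiers and three classes.
p_nb = np.array([[0.6, 0.3, 0.1],       # "Naïve Bayes" probabilities (made up)
                 [0.2, 0.5, 0.3]])
p_kstar = np.array([[0.1, 0.8, 0.1],    # "K-Star" probabilities (made up)
                    [0.4, 0.4, 0.2]])

# Assumed confidence weights; in the paper these would be derived from
# hypothesis strength and ambiguity grade during incremental training.
print(weighted_vote([p_nb, p_kstar], weights=[0.7, 0.3]))
print(majority_vote([p_nb.argmax(axis=1), p_kstar.argmax(axis=1)]))
```

The design point the sketch illustrates is that a confidence-weighted combination can break ties and overrule a weak classifier, which a plain majority vote over two classifiers cannot do.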