
Abhimanyu Bar

Assistant Professor

Department of Computer Science and Engineering

Contact Details

abhimanyu.b@srmap.edu.in

Office Location

Education

2023
PhD
University of Hyderabad, Telangana
India
2011
MCA
BPUT Rourkela, Odisha
India
2008
B.Sc
Utkal University, Odisha
India

Personal Website

Experience

  • Assistant Professor, School of Computer Science Engineering and Technology, Bennett University, Greater Noida, Delhi-NCR

Research Interest

  • My research focuses on classical rough set theory, with special emphasis on feature subset selection (reduct computation) and dimensionality reduction. I have developed optimal and near-optimal methods for reduct discovery, including approaches based on A* search and coarsest-granularity criteria, aimed at simplifying decision systems while retaining predictive accuracy. My broader interests include rough sets, soft computing, data mining, and artificial intelligence, with applications in building reliable, efficient, and interpretable machine learning models. This work has contributed to the advancement of reduct computation techniques and their use in classification, knowledge representation, and ensemble model design.
  • My current research focuses on developing explainable ensemble machine learning models using Classical Rough Set Theory. I investigate diversity-driven multiple reduct generation to construct ensemble frameworks where each base learner operates on distinct, information-preserving feature subsets. By leveraging the rule-induction capability and attribute reduction strength of Rough Sets, my work aims to enhance model transparency and interpretability without compromising predictive performance. I explore how reduct-based feature selection and discernibility concepts can provide clear, human-understandable explanations for ensemble decisions. Through this integration of Rough Set theory and Explainable AI, I seek to design robust, scalable, and interpretable ensemble systems for reliable decision-making in real-world applications.
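The reduct concept underlying the interests above can be made concrete with a minimal, illustrative Python sketch (not code from the publications, and assuming a consistent decision table): an attribute subset is a reduct if it preserves decision consistency and no single attribute can be dropped. The function names `consistent` and `is_reduct` are illustrative assumptions.

```python
def consistent(objects, decisions, attrs):
    """True if objects indiscernible on `attrs` (equal values on every
    listed attribute) always share the same decision."""
    seen = {}
    for obj, d in zip(objects, decisions):
        key = tuple(obj[a] for a in attrs)
        if seen.setdefault(key, d) != d:
            return False
    return True

def is_reduct(objects, decisions, attrs):
    """A reduct preserves decision consistency, and dropping any single
    attribute breaks that consistency (minimality)."""
    return consistent(objects, decisions, attrs) and all(
        not consistent(objects, decisions, [a for a in attrs if a != r])
        for r in attrs)
```

Ensemble construction along the lines described above would then train each base learner on a different such information-preserving subset.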

Memberships

  • IEEE

Publications

  • A novel approach for shortest optimal reduct computation

    Phani Kumar G.Y., Bar A., Sai Prasad P.S.V.S.

    Article, Information Sciences, 2026, DOI Link


    Rough set theory has emerged as a robust soft computing paradigm for feature selection, commonly known as reduct computation. A decision system may contain multiple reducts of varying sizes, all offering equivalent classification capabilities. However, when model performance is a critical factor, the shortest reduct is generally preferred due to its simplicity and interpretability. The discernibility matrix method is a widely used technique for computing such reducts. Despite its effectiveness, this method is computationally intensive and classified as NP-hard, limiting its scalability for datasets where discernibility matrix computation becomes infeasible. This study addresses the limitations of traditional discernibility matrix-based approaches by introducing a novel method that combines a Breadth-First Search control strategy with an incremental approach to compute the absorbed discernibility matrix. The Breadth-First Search strategy enables efficient exploration of the search space to identify the shortest optimal reduct early, while the incremental absorbed discernibility matrix enhances the computational scalability of the algorithm. To validate the proposed method, an experimental evaluation was conducted against two state-of-the-art algorithms: Breadth-First Search, representing the discernibility matrix-based strategy, and MinReduct, a benchmark for absorbed discernibility matrix-based approaches. Results demonstrate superior computational performance and earlier discovery of shortest reducts without compromising correctness or optimality.
  • Diversity among multiple reducts computation with application to ensemble of classification model

    Bar A., Kumar A., Phani Kumar G.Y., Sai Prasad P.S.V.S.

    Article, International Journal of Approximate Reasoning, 2025, DOI Link


    In rough set–based feature selection, the discernibility matrix provides a mathematical framework for computing all or multiple reducts. However, for applications such as ensemble model induction, there is no need to generate an exhaustive set of reducts; instead, selecting a smaller subset with better individual performance and sufficient diversity tends to be more effective. Diversity among base classifiers is critical for improving the predictive performance of ensemble models, yet most existing rough set–based methods do not explicitly address this aspect when generating multiple reducts. To overcome this limitation, this paper proposes two strategies that embed diversity directly into the reduct generation process. The first introduces a novel partition refinement cardinality heuristic that selects mutually exclusive reducts with maximum partition cardinality differences to promote classifier diversity. The second presents an efficient adaptation of an existing least overlap heuristic, combined with an incremental construction of the absorbed discernibility matrix to ensure scalability for large datasets where conventional discernibility matrix construction is infeasible. Finally, empirical analysis with state-of-the-art algorithms demonstrates that the diverse reducts generated by the proposed methods successfully achieve their goal of enhancing ensemble model performance through improved diversity and predictive accuracy.
  • Approaches for coarsest granularity based near-optimal reduct computation

    Bar A., Prasad P.S.V.S.S.

    Article, Applied Intelligence, 2023, DOI Link


    Traditionally, the shortest length has been used as the optimality criterion in rough set based optimal / near-optimal reduct computation. A more generalizable alternative was recently introduced, with the coarsest granular space as the optimality criterion. However, owing to its exponential time complexity, it is not scalable to even moderate-sized data sets. This article formulates two near-optimal reduct computation alternatives for scaling to comparatively larger data sets. The first algorithm employs a controlled A∗ search based strategy to find a near-optimal reduct while reducing both space utilization and computational time. The second algorithm employs a greedy sequential backward elimination (SBE) strategy on the higher granular space attribute ordering for achieving a coarsest granular space based near-optimal reduct. A comparative experimental study is conducted among the proposed approaches, the coarsest granular space based optimal reduct algorithm A∗RSOR, and state-of-the-art shortest length based optimal and near-optimal reduct algorithms. The experimental study amply validates the relevance of the proposed approaches in obtaining near-optimal reducts with increased scalability and the induction of comparable or improved generalizable classification models.
  • Coarsest granularity-based optimal reduct using A* search

    Bar A., Kumar A., Sai Prasad P.S.V.S.

    Article, Granular Computing, 2023, DOI Link


    The optimal reduct computation problem aims to obtain the best reduct out of all possible reducts of a given decision system. In the rough set literature, two optimality criteria exist for computing an optimal reduct: shortest length based and coarsest granular space based. The coarsest granular space-based optimal reduct has the ability to induce a better generalizable classification model. A∗RSOR is an existing A∗ search-based optimal reduct computation algorithm that uses the coarsest granular space as its optimality criterion. This article proposes an improved coarsest granularity-based optimal reduct approach, MA∗_RSOR, by analyzing the search process's behaviour in the A∗RSOR algorithm. To minimize the search space utilization and arrive at an optimal reduct in less time, suitable modifications are incorporated using the domain knowledge of rough set theory. The relevance of MA∗_RSOR is demonstrated through theoretical analysis and comparative experimental validation with state-of-the-art algorithms. The experimental results with benchmark data sets established that MA∗_RSOR achieves significant computational time gain (49-99%) and space reduction (37-96%) over A∗RSOR. MA∗_RSOR could induce classification models with significantly better classification accuracies than state-of-the-art shortest length-based optimal/near-optimal reduct computation algorithms. In addition, a coefficient of variation based CVNonCore heuristic is proposed for predicting when the MA∗_RSOR algorithm is appropriate to use. Experimental results validate the relevance of the heuristic, as its prediction was correct in 8 out of 10 data sets.
  • Multiple Reducts Computation in Rough Sets with Applications to Ensemble Classification

    Bar A., Sai Prasad P.S.V.S.

    Conference paper, Lecture Notes in Electrical Engineering, 2020, DOI Link


    Rough set theory has emerged as an influential soft-computing approach for feature subset selection (reduct computation) in decision systems amidst incompleteness and inconsistency. Multiple reducts computation using rough sets provides an elegant way to construct ensemble classifiers for better and more stable classification. The existing approaches for multiple reducts computation are primarily based on genetic algorithms and select diverse multiple reducts after generating abundant candidate reducts. This work proposes the MRGA_MRC algorithm for multiple reducts computation by utilizing the systematically evolving search space of the all-reducts computation in the MRGA algorithm, without generating many candidate reducts. A novel heuristic is introduced for the selection of diverse multiple reducts. Experiments conducted on benchmark decision systems establish the relevance of the proposed approach in comparison to the genetic algorithm based multiple reducts computation approach REUCS.
  • Finding Optimal Rough Set Reduct with A* Search Algorithm

    Bar A., Kumar A., Sai Prasad P.S.V.S.

    Conference paper, Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 2019, DOI Link


    Feature subset selection or reduct computation is a prominent domain for classical rough set theory, which can preserve the most predictive features of a decision system. A given decision system has several reducts. Computation of all possible reducts is achieved through computing the prime implicants of the discernibility function. Currently, an optimal reduct based on any optimality criterion can only be obtained after the generation of all possible reducts; indeed, it is an NP-hard problem. Several researchers have explored alternative search strategies such as Genetic Algorithms, Ant Colony Optimization, Simulated Annealing, etc., for obtaining near-optimal reducts. In this paper, we propose an admissible and consistent heuristic for computing the optimal reduct having the least number of induced equivalence classes or granules. The A*RSOR reduct computation algorithm is developed using the proposed consistent heuristic in A* search. The proposed approach is validated both theoretically and experimentally. The comparative results establish the relevance of the proposed optimality criterion, as the achieved optimal reduct obtained significantly better accuracies with different classifiers.
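    The discernibility-matrix pipeline recurring in the abstracts above can be illustrated with a small, hedged Python sketch. This is a simplified illustration, not the published algorithms: it builds pairwise discernibility entries, applies the absorption law to keep only minimal entries, and enumerates attribute subsets breadth-first by size so that the first hitting set found is a shortest reduct. All function names are illustrative assumptions.

    ```python
    from itertools import combinations

    def discernibility_matrix(objects, decisions, n_attrs):
        """For each pair of objects with different decisions, record the
        set of attributes on which they differ (a discernibility entry)."""
        entries = []
        for i in range(len(objects)):
            for j in range(i + 1, len(objects)):
                if decisions[i] != decisions[j]:
                    diff = frozenset(a for a in range(n_attrs)
                                     if objects[i][a] != objects[j][a])
                    if diff:
                        entries.append(diff)
        return entries

    def absorb(entries):
        """Absorption law: keep only entries with no subset among the
        others, shrinking the matrix the search must cover."""
        minimal = []
        for e in sorted(set(entries), key=len):
            if not any(m <= e for m in minimal):
                minimal.append(e)
        return minimal

    def shortest_reduct(objects, decisions, n_attrs):
        """Enumerate subsets breadth-first by increasing size; the first
        subset intersecting every absorbed entry is a shortest reduct.
        Exponential in the worst case -- a sketch, not the published
        BFS / incremental-absorption algorithms."""
        entries = absorb(discernibility_matrix(objects, decisions, n_attrs))
        if not entries:
            return set()
        for size in range(1, n_attrs + 1):
            for subset in combinations(range(n_attrs), size):
                s = set(subset)
                if all(s & e for e in entries):
                    return s
        return set(range(n_attrs))
    ```

    On a small toy decision table this returns a single-attribute reduct; the published methods replace this brute-force enumeration with a Breadth-First Search control strategy and incremental absorption to scale to larger data sets.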

Patents


Projects

Scholars

Interests

  • Artificial Intelligence
  • Data Science
  • Machine Learning
  • Soft Computing

Thought Leaderships

There are no Thought Leaderships associated with this faculty.

Top Achievements

Research Area

No research areas found for this faculty.

Computer Science and Engineering is a fast-evolving discipline and this is an exciting time to become a Computer Scientist!


Recent Updates

No recent updates found.
