Mid Sweden University

A hyper-heuristic guided by a probabilistic graphical model for single-objective real-parameter optimization
2022 (English) In: International Journal of Machine Learning and Cybernetics, ISSN 1868-8071, E-ISSN 1868-808X, Vol. 13, no 12, p. 3743-3772. Article in journal (Refereed). Published.
Abstract [en]

Metaheuristic algorithms are designed to find approximate solutions to challenging optimization problems. An algorithm's success on a given optimization task depends on how well its search heuristics suit the problem domain, so designing custom metaheuristic algorithms leads to more accurate solutions. Hyper-heuristics (HH) are important tools commonly used to select low-level heuristics (LLHs) for solving a specific problem. HH are able to acquire knowledge from the problems to which they are applied. However, as with other artificial intelligence tools, it is necessary to identify how this knowledge affects the performance of the algorithm. One way to generate such knowledge is to capture interactions between variables using probabilistic graphical models, such as Bayesian networks (BN), in conjunction with estimation of distribution algorithms (EDA). This article presents a method that uses a BN-based EDA as the high-level selection mechanism of an HH, called the Hyper-heuristic approach based on Bayesian learning and evolutionary operators (HHBNO). Knowledge is extracted from the BN to evolve sequences of LLHs in an online learning process that explores the inter-dependencies among the LLHs. The proposed approach is tested on the CEC'17 benchmark set of single-objective real-parameter optimization functions. Statistical tests verify that HHBNO yields competitive results in comparison with other metaheuristic algorithms, with high performance in terms of convergence. The generated BN, which is built from the probabilities of each LLH, is further inspected visually to display the knowledge acquired during the evolutionary process.
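
The abstract describes an EDA whose Bayesian network drives the selection of LLH sequences. As a rough illustration only, the Python sketch below replaces the paper's Bayesian network with a much simpler first-order transition model over LLH indices, uses three hypothetical toy LLHs, and optimizes a plain sphere function rather than the CEC'17 suite; none of these operators, names, or parameters come from the article.

```python
# A minimal sketch, NOT the authors' HHBNO implementation: the BN over
# low-level heuristics (LLHs) is simplified to a first-order transition
# model re-estimated EDA-style from elite LLH sequences.

import numpy as np

rng = np.random.default_rng(42)
DIM, SEQ_LEN, POP, ELITE, GENS = 10, 5, 40, 10, 30

def objective(x):
    """Toy single-objective function (stand-in for a CEC'17 benchmark)."""
    return float(np.sum(x ** 2))

# Hypothetical low-level heuristics: each returns a perturbed copy of x.
llhs = [
    lambda x: x + rng.normal(0.0, 0.3, size=x.shape),    # Gaussian step
    lambda x: x * rng.uniform(0.5, 0.95),                 # contraction toward 0
    lambda x: x + rng.uniform(-1.0, 1.0, size=x.shape),   # uniform step
]
K = len(llhs)

def sample_sequence(p0, T):
    """Sample a sequence of LLH indices from the probabilistic model."""
    seq = [rng.choice(K, p=p0)]
    for _ in range(SEQ_LEN - 1):
        seq.append(rng.choice(K, p=T[seq[-1]]))
    return seq

def apply_sequence(seq, x):
    """Apply the LLHs in order, keeping each move only if it improves."""
    best, f_best = x, objective(x)
    for i in seq:
        cand = llhs[i](best)
        f_cand = objective(cand)
        if f_cand < f_best:
            best, f_best = cand, f_cand
    return best, f_best

# Start from a uniform model and a random incumbent solution.
p0 = np.full(K, 1.0 / K)
T = np.full((K, K), 1.0 / K)
x_best = rng.uniform(-5.0, 5.0, size=DIM)
f_best = objective(x_best)

for gen in range(GENS):
    scored = []
    for _ in range(POP):
        seq = sample_sequence(p0, T)
        x_new, f_new = apply_sequence(seq, x_best)
        scored.append((f_new, seq))
        if f_new < f_best:
            x_best, f_best = x_new, f_new
    # EDA step: re-estimate the model from elite sequences (Laplace smoothing).
    scored.sort(key=lambda t: t[0])
    counts0, countsT = np.ones(K), np.ones((K, K))
    for _, seq in scored[:ELITE]:
        counts0[seq[0]] += 1
        for a, b in zip(seq, seq[1:]):
            countsT[a, b] += 1
    p0 = counts0 / counts0.sum()
    T = countsT / countsT.sum(axis=1, keepdims=True)

print(f"best objective after {GENS} generations: {f_best:.4f}")
```

The loop mirrors the general pattern the abstract describes: sample LLH sequences from the learned probabilistic model, evaluate them, and re-estimate the model online from the best-performing sequences.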

Place, publisher, year, edition, pages
Springer Nature, 2022. Vol. 13, no 12, p. 3743-3772
National Category
Computer and Information Sciences
Identifiers
URN: urn:nbn:se:miun:diva-51064
DOI: 10.1007/s13042-022-01623-6
Scopus ID: 2-s2.0-85135626725
OAI: oai:DiVA.org:miun-51064
DiVA, id: diva2:1849200
Available from: 2024-04-05. Created: 2024-04-05. Last updated: 2024-04-11. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records

Seyed Jalaleddin, Mousavirad

Search in DiVA

By author/editor
Oliva, Diego; Seyed Jalaleddin, Mousavirad
In the same journal
International Journal of Machine Learning and Cybernetics
Computer and Information Sciences
