Workshops

Advances in Multimodal Optimization
Mike Preuss, University of Münster, Germany
Michael G. Epitropakis, Lancaster University, UK
Xiaodong Li, RMIT University, Australia
[Sunday, 9 Sep, 14:00-15:30, 16:00-17:30]

The workshop aims to bring together researchers from evolutionary computation and related areas who are interested in multimodal optimization. This is an emerging field, and we aim for a highly interactive and productive meeting that takes a step towards defining it. The workshop will provide a unique opportunity to review advances in the current state of the art in niching methods. Further discussion will address experimental and theoretical scenarios, performance measures, real-world and benchmark problem sets, and will outline possible future developments in this area. Position statements, suggestions, and comments are very welcome!

Website: http://www.epitropakis.co.uk/ppsn2018-niching/

Black Box Discrete Optimization Benchmarking (BB-DOB)
Pietro S. Oliveto, University of Sheffield, UK
Markus Wagner, University of Adelaide, Australia
Thomas Weise, Hefei University, China
Borys Wróbel, Adam Mickiewicz University, Poland
Aleš Zamuda, University of Maribor, Slovenia
[Saturday, 8 Sep, 13:30-15:30, 16:00-18:00]

The aim of BB-DOB is to establish a process leading to a standard methodology for benchmarking black-box optimisation algorithms in discrete and combinatorial search spaces. Our long-term aim is to produce:

  1. a well-motivated benchmark function testbed,
  2. an experimental set-up,
  3. generation of data output for post-processing, and
  4. presentation of the results in graphs and tables.

In this workshop we encourage discussion of which functions should be included in the benchmarking testbed (i.e., point 1 above). The functions should capture the difficulties of combinatorial optimisation problems in practice while remaining comprehensible, so that algorithm behaviour can be interpreted from performance on a given benchmark problem. The desired search behaviour should be clear and algorithm deficiencies understood in depth; this understanding should lead to the design of improved algorithms. Ideally (though not necessarily for all functions), the benchmark functions should be scalable in problem size and non-trivial in the black-box optimisation sense (e.g., a function may be shifted so that the global optimum can be any point). This work is organised as part of the COST Action CA15140: ImAppNIO.

Website: http://iao.hfuu.edu.cn/bbdob-ppsn18

Bridging the Gap Between Theory and Practice in Nature-Inspired Optimisation
Fernando G. Lobo, University of Algarve, Portugal
Thomas Jansen, Aberystwyth University, UK
[Sunday, 9 Sep, 09:00-10:30, 11:00-12:30]

Nature-inspired search and optimisation heuristics have been used for decades to solve practical problems across many domains. Alongside, their theoretical understanding has improved substantially, giving a clearer picture of what they can and cannot do in terms of solution quality and runtime.

Despite this progress on the theoretical side, a large gap remains between theoretical foundations and practical applications. Theory and practice reinforce each other: theory is driven by the need to better understand challenges observed in practice, and practical applications can benefit from insights and guidelines derived from theory.

The workshop seeks to bring together researchers interested in the debate on how to narrow the gap between theory and practice, in the hope of improving the current state of the field. Everyone is welcome to participate, and we encourage the submission of extended abstracts (at most 4 pages) with position statements.

Website: http://fernandolobo.info/ppsn2018workshop/

Developmental Neural Networks
Dennis Wilson, University of Toulouse, France
Julian Miller, University of York, UK
Sylvain Cussat-Blanc, University of Toulouse, France
[Saturday, 8 Sep, 16:00-17:30]

In nature, brains are built through a process of biological development in which many aspects of the network of neurons and connections are shaped by external information received through sensory organs. Biological developmental mechanisms such as axon guidance and dendrite pruning have been shown to rely on neural activity. Despite this, most artificial neural network (ANN) models do not include developmental mechanisms and regard learning as the adjustment of connection weights, while those that do use development often restrict it to a period before the ANN is used. It is worthwhile to understand the cognitive functions offered by development and to investigate the fundamental questions raised by artificial neural development. In this workshop, we will explore existing and future approaches that aim to incorporate development into ANNs. Invited speakers will present their work with neural networks, both artificial and biological, in the context of development. Accepted submissions on contemporary work in this field will be published and presented, and we will hold an open discussion on the topic.

Website: https://www.irit.fr/devonn/

Evolutionary Machine Learning
Giovanni Squillero, Politecnico di Torino, Italy
Alberto Tonda, National Institute of Agronomic Research, France
[Sunday, 9 Sep, 14:00-15:30, 16:00-17:30]

Evolutionary machine learning (EML) can be defined as a crossbreed between the fields of evolutionary computation (EC) and machine learning (ML). A first line of research ascribable to EML predates the recent ML bonanza and focuses on using EC algorithms to optimize ML frameworks: it includes remarkable studies from the 1990s, such as attempts to determine the optimal topology of an artificial neural network using a genetic algorithm. In the other direction, a line of research tackling the use of ML techniques to boost EC algorithms appeared before 2000. More recently, scholars have proposed truly hybrid approaches, in which EC algorithms are deeply embedded in frameworks performing ML tasks. This workshop’s topics of interest include, but are not limited to: EAs applied to ML tasks (e.g., evolutionary classifiers), EAs applied to ML algorithms (e.g., evolutionary optimization of deep neural networks), ML applied to EAs (e.g., optimal parameter prediction), and real-world applications of EML.

Website: https://evolearning.github.io/ppsn18/

Investigating Optimization Problems from Machine Learning and Data Analysis
Marcus Gallagher, University of Queensland, Australia
Mike Preuss, University of Münster, Germany
Pascal Kerschke, University of Münster, Germany
[Sunday, 9 Sep, 09:00-10:30, 11:00-12:30]

In continuous black-box optimization, there are a number of benchmark problem sets and competitions. However, the focus has mainly been on the performance and comparison of algorithms on artificial problems. The aim of this workshop is to instead make a set of optimization problems the center of focus, bringing together researchers to discuss and develop deeper insights into the structure and difficulty of the problem set, as well as experimental methodology (including algorithms). Several problem classes (and specific problem instances) from the area of machine learning and data analysis will be proposed in advance of the workshop submission deadline. Participants will be invited to submit a brief paper that shows new insights into the problems, for example via exploratory landscape analysis, algorithm performance (with a focus on “why”) or analysis of the quality/diversity of solutions present in the problem instances.

Website: https://sites.google.com/view/optml-ppsn18/home