Research Reports and Expert Reports (18)

[1] Evelyne Lutton and Jean-Daniel Fekete. Visual analytics and experimental analysis of evolutionary algorithms. Research Report RR-7605, INRIA, April 2011. [ bib | http | .pdf ]
Experimental analysis of evolutionary algorithms usually aims at tuning the parameter setting or at improving knowledge about internal mechanisms (operator efficiency, genetic material distribution, or diversity management, for instance). This crucial step relies on the analysis of a huge amount of multidimensional data, including numeric and symbolic data. Existing EA visualisation systems usually display time- or generation-dependent curves (fitness, diversity, or other statistics). But when dealing with detailed genomic information, the task becomes more difficult, as a convenient visualisation strongly depends on the fitness landscape under consideration. In this latter case the raw data are usually sets of successive populations of points in a complex multidimensional space. The purpose of this paper is to evaluate the potential interest of some recent visual analytics tools for navigating complex sets of EA data, and to sketch future developments of visual analytics tools adapted to the needs of EA experimental analysis.

Keywords: Visual Analytics; experimental analysis of evolutionary algorithms; parameter tuning; fitness landscape visualisation
[2] Olivier Barrière, Evelyne Lutton, Pierre-Henri Wuillemin, Cédric Baudrit, Mariette Sicard, Bruno Pinaud, and Nathalie Perrot. Modeling an agrifood industrial process using cooperative coevolution algorithms. Technical Report 6914, INRIA Rocquencourt, April 2009. [ bib | .pdf ]
This report presents two experiments related to the modeling of an industrial agrifood process using evolutionary techniques. Experiments focused on a specific problem, the modeling of a Camembert-cheese ripening process. Two related complex optimisation problems have been considered: (1) a deterministic modeling problem, the phase prediction problem, for which a search for a closed-form tree expression has been performed using genetic programming (GP); (2) a Bayesian network structure estimation problem, treated in two stages, i.e. first searching for an approximation of an independence model using an EA, and then deducing, via a deterministic algorithm, a Bayesian network which represents the equivalence class of the independence model found at the first stage. In both of these problems, cooperative-coevolution techniques (also called “Parisian” approaches) proved successful. These approaches represent the searched solution as an aggregation of several individuals (or even as a whole population), each individual bearing only a part of the solution. This scheme makes more economical use of artificial Darwinism principles, and yields important gains in robustness and efficiency.

[3] Evelyne Lutton, Yann Landrin-Schweitzer, and Jacques Lévy Véhel. Experiments on controlled regularity fitness landscapes. Technical Report 5823, INRIA Rocquencourt, February 2006. [ bib | .pdf ]
We present an experimental analysis of the influence of the local irregularity of the fitness function on the behavior of a simple version of an evolutionary algorithm (EA). Previous theoretical as well as experimental work on this subject suggests that the performance of an EA strongly depends on the irregularity of the fitness function. Several irregularity measures have been derived in order to numerically characterize this source of difficulty for EAs. These characterizations are mainly based on Hölder exponents. Previous studies used a global characterization of fitness regularity (namely the global Hölder exponent), with experimental validations conducted on test functions of uniform irregularity. The present work refines the analysis by investigating the behavior of an EA on functions displaying variable local regularity. Our experiments confirm and quantify the intuition that performance decreases as irregularity increases. In addition, they suggest a way to modify the genetic topology to accommodate variable regularity: more precisely, it appears that the mutation parameter, which controls the size of the neighbourhood of a point, should increase when regularity decreases. These results open the way to a theoretical analysis based on local Hölder exponents, and pose several questions with respect to on-line measurement and usage of regularity for fitness functions.

[4] Evelyne Lutton, Mario Pilz, and Jacques Lévy Véhel. An interactive evolution scheme applied to multifractal image denoising. Technical Report 5535, INRIA, March 2005. [ bib | .pdf ]
Interactive evolutionary algorithms (IEA) often suffer from what is called the user bottleneck. In this paper, we propose and analyse a method to limit user interactions while still providing sufficient information for the EA to converge. The method has been developed on a multifractal image denoising application: a multifractal denoising method is adapted to complex images, but depends on a set of parameters that are quite difficult to tune by hand. A simple IEA was developed for this purpose in previous work. We now experiment with an approximation of the user judgment, via a fitness map, that helps reduce the number of user interactions. The method is easily extensible to other interactive, or computationally expensive, evolutionary schemes.
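The fitness-map idea described above can be sketched as follows. This is a minimal illustration of the assumed mechanism, not the report's implementation; `FitnessMap`, `radius`, and `ask_user` are hypothetical names. Ratings already given by the user are reused for nearby genomes, so only genuinely novel genomes trigger an interaction:

```python
# Hedged sketch (assumed mechanics, not the report's code): approximate
# user judgment with a cache of past ratings, querying the user only for
# genomes far from every already-rated point.
import math

class FitnessMap:
    def __init__(self, radius):
        self.radius = radius          # reuse ratings within this distance
        self.rated = []               # list of (genome, user_score) pairs

    def evaluate(self, genome, ask_user):
        """Return a cached nearby score when possible, else query the user."""
        for g, score in self.rated:
            if math.dist(genome, g) <= self.radius:
                return score          # approximation: no interaction needed
        score = ask_user(genome)      # genuine user interaction
        self.rated.append((genome, score))
        return score

calls = []
def fake_user(g):                     # stands in for the human evaluator
    calls.append(g)
    return -sum(x * x for x in g)     # pretend preference for small genomes

fmap = FitnessMap(radius=0.5)
pop = [(0.0, 0.0), (0.1, 0.1), (2.0, 2.0), (0.2, 0.0)]
scores = [fmap.evaluate(g, fake_user) for g in pop]
# Only two of the four evaluations required user input here.
```

The trade-off is the one the abstract describes: a larger reuse radius means fewer interactions but a coarser approximation of the user's judgment.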

[5] Pierre Collet, Evelyne Lutton, and Marc Schoenauer. PPSN VI reviewers and papers: an EASEA match - Affectation automatique des relecteurs de la conférence PPSN VI : un exemple d'utilisation du langage EASEA. Technical Report 4177, INRIA Rocquencourt, May 2001. [ bib | .pdf ]
This paper presents the algorithm used by the PPSN VI organisers to allocate papers to reviewers, a typical Constraint Satisfaction Problem (CSP). Its aim is not to present a revolutionary new method for solving CSPs, but rather to show how such a problem can be implemented simply using EASEA, a language designed specifically for writing evolutionary algorithms.

[6] Pierre Collet, Marc Schoenauer, Evelyne Lutton, and Jean Louchet. EASEA : un langage de spécification pour les algorithmes évolutionnaires - Specifying evolutionary algorithms with EASEA. Technical Report 4218, INRIA Rocquencourt, June 2001. [ bib | .pdf ]
Evolutionary algorithms are not straightforward to implement, and the lack of any specialised language forces users to write their algorithms in C, C++ or Java. However, most evolutionary algorithms follow a similar design, and the only really specific part is the code representing the problem to be solved. Therefore, nothing in theory should prevent a user from designing and running an evolutionary algorithm from a graphical user interface (GUI), with no programming effort other than the function to be optimised. Writing such a GUI rapidly poses the problem of saving and reloading the evolutionary algorithm the user is working on, and of translating that information into compilable code. This strongly suggests a specification language and its compiler. The EASEA software was created for this purpose and, to our knowledge, it is the first and only usable compiler of a language specific to evolutionary algorithms. This report describes how EASEA was designed and the problems that needed to be solved to achieve its implementation.

[7] Evelyne Lutton, Pierre Collet, Jean Louchet, Michele Sebag, and Cyril Fonlupt. Evolution artificielle. Technical report, ENSTA lecture notes, March 2000. [ bib ]

[8] Pierre Collet, Evelyne Lutton, Frédéric Raynal, and Marc Schoenauer. Polar IFS + individual genetic programming = efficient IFS inverse problem solving. Technical Report 3849, INRIA Rocquencourt, December 1999. [ bib | .pdf ]
The inverse problem for Iterated Function Systems (finding an IFS whose attractor is a target 2D shape) with non-affine IFS is a very complex task. Successful attempts have been made using Genetic Programming, but there is still room for improvement in both the IFS and the GP parts. The main difficulty with non-linear IFS is the efficient handling of contraction constraints. This paper introduces Polar IFS, a specific representation of IFS functions that shrinks the search space to mostly contractive functions. Moreover, the polar representation gives direct access to the fixed points of the functions, whereas the fixed point of a general non-linear IFS can only be estimated numerically. On the evolutionary side, the individual approach is similar to the Michigan approach of Classifier Systems: each individual of the population embodies a single function rather than the whole IFS. A solution to the inverse problem is then built from a set of individuals. Both improvements show a drastic cut in CPU time: good results are obtained with small populations in a few generations.
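The idea of a representation that exposes the fixed point directly can be illustrated with a deliberately simple construction of my own (not the report's actual Polar IFS representation): writing a map in polar coordinates around its fixed point makes the fixed point an explicit genome parameter, and choosing a radial scale below one guarantees contractivity by design.

```python
# Hedged illustration (my own minimal construction, not the report's exact
# representation): a map expressed in polar coordinates around its fixed
# point (cx, cy). The fixed point is read directly from the parameters,
# and any radial scale s < 1 yields a contractive map by construction.
import math

def make_polar_map(cx, cy, s, phi):
    """Map z -> fixed + s * rot(phi) * (z - fixed), written in polar form."""
    def w(x, y):
        r = math.hypot(x - cx, y - cy)           # radius around fixed point
        theta = math.atan2(y - cy, x - cx)       # angle around fixed point
        return (cx + s * r * math.cos(theta + phi),
                cy + s * r * math.sin(theta + phi))
    return w

w = make_polar_map(1.0, 2.0, 0.5, 0.3)   # fixed point (1, 2), ratio 0.5 < 1
p = (10.0, -4.0)
for _ in range(50):                      # iteration contracts towards (1, 2)
    p = w(*p)
```

The report's representation additionally covers non-linear maps; the point here is only that a polar parameterisation around the fixed point removes the need to estimate it numerically.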

[9] Benoît Leblanc and Evelyne Lutton. Bitwise regularity coefficients as a tool for deception analysis. Technical Report 3274, INRIA Rocquencourt, October 1997. [ bib | .pdf ]
We present in this paper a theoretical analysis that relates an irregularity measure of a fitness function to so-called GA-deception. This approach is a continuation of previous work that presented a deception analysis of Hölder functions. The analysis developed here generalizes that work in two ways: first, we use a bitwise regularity instead of a Hölder exponent as the basis of our deception analysis; second, we perform a similar deception analysis of a GA with uniform crossover. We finally propose to use the bitwise regularity coefficients to analyze the influence of a chromosome encoding on GA efficiency, and we present experiments with Gray encoding.

[10] Evelyne Lutton, Jacques Lévy Véhel, Guillaume Cretin, Philippe Glevarec, and Cédric Roll. Mixed IFS: resolution of the inverse problem using genetic programming. Technical Report 2631, INRIA Rocquencourt, August 1995. [ bib | .pdf ]
We address here the resolution of the so-called inverse problem for IFS. This problem has already been widely considered, and some studies have been performed for affine IFS, using deterministic or stochastic methods (Simulated Annealing or Genetic Algorithms) [?]. When dealing with non-affine IFS, the usual techniques do not perform well unless some a priori hypotheses on the structure of the IFS (number and type of functions) are made. In this work, a Genetic Programming method is investigated to solve the "general" inverse problem, which permits numeric and symbolic optimization to be performed at the same time. The use of "mixed IFS", as we call them, may enlarge the scope of some applications, such as image compression, because they allow a wider range of shapes to be coded.

[11] Evelyne Lutton and Jacques Lévy Véhel. Some remarks on the optimization of Hölder functions with genetic algorithms. Technical Report 2627, INRIA Rocquencourt, July 1995. 35 p. [ bib | .pdf ]
We investigate the problem of Hölder function optimization using Genetic Algorithms (GA). We first derive a relation between the Hölder exponent of the function, the sampling rate, and the accuracy of the optimum localization, both in the domain and the range of the function. This relation holds for any optimization method that works on a sampled search space. We then present a finer analysis in the case of a GA, based on a deceptivity analysis. Our approach uses a decomposition on the Haar basis, which reflects in a natural way the Hölder structure of the function. It allows the deceptivity to be related to the exponent and to some parameters of the GA (including the sampling precision). These results provide some indications which may help make the convergence of a GA easier.
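A relation of the kind mentioned in the abstract follows directly from the Hölder condition; the form below is a standard consequence stated with hypothetical symbols ($C$ for the Hölder constant, $\delta$ for the sampling step), not the report's exact statement:

```latex
% If f is Hölder with exponent h and constant C,
%   |f(x) - f(y)| \le C\,|x - y|^{h},
% and the domain is sampled on a grid of step \delta, then some grid
% point x_s lies within \delta/2 of the true optimum x^*, so
%   |f(x_s) - f(x^*)| \le C \left(\tfrac{\delta}{2}\right)^{h}.
```

In words: the rougher the function (smaller $h$), the finer the sampling must be to guarantee a given accuracy on the optimum value.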

[12] Evelyne Lutton. Etat de l'art des algorithmes génétiques. Technical report, Rapport d'expertise pour le SGDN, December 1993. [ bib ]

[13] Evelyne Lutton and Patrice Martinez. A genetic algorithm for the detection of 2D geometric primitives in images. Technical Report 2210, INRIA Rocquencourt, November 1993. [ bib | .pdf ]
We investigate the use of genetic algorithms (GAs) in the framework of image primitive extraction (of segments, circles, ellipses or quadrilaterals, for example). This approach complements the well-known Hough transform, in the sense that GAs are efficient where the Hough approach becomes too expensive in memory, i.e. when we search for complex primitives having more than 3 or 4 parameters.

Indeed, a GA is a stochastic technique, relatively slow, but it provides an efficient tool for searching a high-dimensional space. The philosophy of the method is very similar to that of the Hough transform: searching for an optimum in a parameter space. However, we will see that the implementation is different.

The idea of using a GA for that purpose is not new: Roth and Levine proposed a method for 2D and 3D primitives in 1992. For the detection of 2D primitives, we re-implement that method and improve it in three main ways:

* by using distance images instead of using contour images directly, which tends to smooth the function to optimize,

* by using a GA-sharing technique, to detect several image primitives in the same step,

* by applying some recent theoretical results on GAs (about mutation probabilities) to reduce convergence time.
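The first improvement can be sketched in a few lines. This is an assumed setup, not the report's code: the synthetic edge image, `circle_fitness`, and all parameter choices are mine. Scoring a circle hypothesis against a distance image gives every hypothesis an informative score, instead of the flat, all-or-nothing response obtained by counting exact hits on contour pixels:

```python
# Hedged sketch (not the authors' code): scoring a circle hypothesis
# against a distance image, illustrating why distance images smooth
# the function to optimize.
import math

# Synthetic edge image: points on a circle of centre (16, 16), radius 8.
edges = {(round(16 + 8 * math.cos(t * math.pi / 18)),
          round(16 + 8 * math.sin(t * math.pi / 18))) for t in range(36)}

def dist_to_edge(x, y):
    """Brute-force distance-transform value at (x, y)."""
    return min(math.hypot(x - ex, y - ey) for ex, ey in edges)

def circle_fitness(cx, cy, r, samples=24):
    """Mean distance from sampled circle points to the nearest edge pixel.
    Lower is better; nearby hypotheses still score moderately well, so the
    GA receives gradient-like information instead of a needle-in-haystack
    objective."""
    total = 0.0
    for k in range(samples):
        t = 2 * math.pi * k / samples
        total += dist_to_edge(cx + r * math.cos(t), cy + r * math.sin(t))
    return total / samples

good = circle_fitness(16, 16, 8)    # true parameters: near-zero score
near = circle_fitness(12, 16, 6)    # close hypothesis: moderate score
far  = circle_fitness(5, 5, 3)      # distant hypothesis: poor score
```

With raw contour images, `near` and `far` would both score close to zero hits and be indistinguishable; the distance image orders them.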

[14] Evelyne Lutton. 3D model-based stereo reconstruction using coupled Markov random fields. Technical Report 1951, INRIA Rocquencourt, June 1993. [ bib | .pdf ]
Many methods have been proposed in the field of stereo reconstruction. We address here the problem of model-based three-dimensional reconstruction from an already segmented and matched stereo pair. This work is a continuation of previously presented work concerning the reconstruction problem. We study here a method based on Markov random fields, which allows the a priori segmentation and matching to be refined during the reconstruction of the 3D surfaces. A new segmentation and matching is then produced which respects the 3D coherence (or equivalently the disparity coherence) of each segmented region pair. In this first approach, we use simple segmentation energies for each image (without line processes), plus a coupling term between the left and right images, associating planes (as surface primitives) with each region pair. This is the justification for using coupled Markov random fields. We present results on synthetic and real images. These preliminary results allow us to assess the feasibility of a hierarchical stereo reconstruction method with no a priori segmentation.

[15] Jacques Lévy Véhel and Evelyne Lutton. Optimisation of fractal functions using genetic algorithms. Technical Report 1941, INRIA Rocquencourt, June 1993. [ bib | .pdf ]
In this work, we investigate the difficult problem of the optimization of fractal functions. We first derive some relations between the local scaling exponents of the functions, the sampling rate, and the accuracy of the localization of the optimum, both in the domain and the range of the functions. We then apply these ideas to the resolution of the inverse problem for iterated function systems (IFS) using a genetic algorithm. Under the conditions studied (a 2D problem on sets), the optimization process yields the optimum with good precision and within a tractable computing time.

[16] Evelyne Lutton and Henri Maître. Positioning a camera from a photograph and a scene model. Technical Report 92 D 002, Télécom Paris, Département Images, January 1992. [ bib ]

[17] Evelyne Lutton, Henri Maître, and Jaime Lopez-Krahe. Determination of vanishing points using the Hough transform. Technical Report 91 D 008, Télécom Paris, Département Images, September 1991. [ bib ]

[18] Evelyne Lutton and Henri Maître. Systèmes de gestion de bases de données et images. Narcisse Project, Workpackage Report T104, Télécom Paris, June 1990. [ bib ]

This file has been generated by bibtex2html 1.85.