Papers in Journals (28)

[1] Evelyne Lutton, Alberto Tonda, Sébastien Gaucel, Alain Riaublanc, and Nathalie Perrot. Food model exploration through evolutionary optimization coupled with visualization: application to the prediction of a milk gel structure. IFSET Journal (Innovative Food Science and Emerging Technologies), 2014. [ bib | http ]
Obtaining reliable in-silico food models is fundamental for a better understanding of these systems. The complex phenomena involved in these real-world processes are reflected in the intricate structure of the models, so that thoroughly exploring their behaviour and, for example, finding meaningful correlations between variables, becomes a relevant challenge for the experts. In this paper, we present a methodology based on visualisation and evolutionary computation to assist experts during model exploration. The proposed approach is tested on an established model of milk gel structures, and we show how experts are eventually able to find a correlation between two parameters previously considered independent. Reverse-engineering the final outcome, the emergence of such a pattern is proved by the physical laws underlying oil-water interface colonisation. It is interesting to note that, while the present work is focused on milk gel modelling, the proposed methodology can be straightforwardly generalised to other complex physical phenomena.

[2] Nadia Boukhelifa, Waldo Cancino, Anastasia Bezerianos, and Evelyne Lutton. Evolutionary visual exploration: Evaluation with expert users. Computer Graphics Forum, 32(3):31-40, 2013. [ bib | http ]
We present an Evolutionary Visual Exploration (EVE) system that combines visual analytics with stochastic optimisation to aid the exploration of multidimensional datasets characterised by a large number of possible views or projections. Starting from dimensions whose values are automatically calculated by a PCA, an interactive evolutionary algorithm progressively builds (or evolves) non-trivial viewpoints in the form of linear and non-linear dimension combinations, to help users discover new interesting views and relationships in their data. The criteria for evolving new dimensions are not known a priori and are partially specified by the user via an interactive interface: (i) The user selects views with meaningful or interesting visual patterns and provides a satisfaction score. (ii) The system calibrates a fitness function (optimised by the evolutionary algorithm) to take into account the user input, and then calculates new views. Our method leverages automatic tools to detect interesting visual features and human interpretation to derive meaning, validate the findings and guide the exploration without having to grasp advanced statistical concepts. To validate our method, we built a prototype tool (EvoGraphDice) as an extension of an existing scatterplot matrix inspection tool, and conducted an observational study with five domain experts. Our results show that EvoGraphDice can help users quantify qualitative hypotheses and try out different scenarios to dynamically transform their data. Importantly, it allowed our experts to think laterally, better formulate their research questions and build new hypotheses for further investigation.

[3] Etienne Descamps, Nathalie Perrot, Sébastien Gaucel, Cristian Trelea, Alain Riaublanc, Alan Mackie, and Evelyne Lutton. Coupling deterministic and random sequential approaches for structure and texture prediction of a dairy oil-in-water emulsion. IFSET Journal (Innovative Food Science and Emerging Technologies), 2013. [ bib | http ]
Dairy products made of concentrated milk protein powder and milk fat have been experimentally shown to behave like complex systems: The resulting textures depend on various factors, including concentration and type of proteins, nature of heat treatment and homogenisation process. The aim of this paper is to combine two models in order to predict the composition of the interface of a homogenised oil-in-water emulsion, and the resulting bridge structure between the fat droplets. This structure is then correlated to the texture of the emulsion. Free unknown parameters of both models have been estimated from experimental data using an evolutionary optimisation algorithm. The resulting model fits the experimental data, and is coherent with the macroscopic texture measurements. Industrial relevance: Sustainability is nowadays at the heart of industrial requirements. The development of mathematical approaches should facilitate common approaches to risk/benefit assessment and nutritional quality in food research and industry. These models will enhance knowledge of process-structure-property relationships from the molecular to the macroscopic level, and facilitate the creation of in-silico simulators with functional and nutritional properties. The stochastic optimisation techniques (evolutionary algorithms) employed in these works allow the users to thoroughly explore the systems and optimise them. With regard to the complexity of food systems and their dynamics, the challenge for the mathematical approaches is to achieve a complete dynamic description of food processing. In order to reach this objective, it is mandatory to use innovative strategies, exploiting the most recent advances in cognitive and complex system sciences.

[4] Alberto Tonda, Evelyne Lutton, and Giovanni Squillero. A benchmark for cooperative coevolution. Memetic Computing, 4(4):263-277, December 2012. Special Issue on Nature Inspired Cooperative Strategies for Optimization. Regular research paper. [ bib | .pdf ]
Cooperative co-evolution algorithms (CCEA) are a thriving sub-field of evolutionary computation. This class of algorithms makes it possible to exploit the artificial Darwinist scheme more efficiently, as soon as an optimisation problem can be turned into a co-evolution of interdependent sub-parts of the searched solution. Testing the efficiency of new CCEA concepts, however, is not straightforward: while there is a rich literature of benchmarks for more traditional evolutionary techniques, the same does not hold true for this relatively new paradigm. We present a benchmark problem designed to study the behavior and performance of CCEAs, modeling a search for the optimal placement of a set of lamps inside a room. The relative complexity of the problem can be adjusted by operating on a single parameter. The fitness function is a trade-off between conflicting objectives, so the performance of an algorithm can be examined by making use of different metrics. We show how three different cooperative strategies, Parisian Evolution (PE), Group Evolution (GE) and Allopatric Group Evolution (AGE), can be applied to the problem. Using a Classical Evolution (CE) approach as comparison, we analyse the behavior of each algorithm in detail, with respect to the size of the problem.

[5] F.P. Vidal, P.F. Villard, and E. Lutton. Tuning of patient specific deformable models using an adaptive evolutionary optimization strategy. Biomedical Engineering, IEEE Transactions on, 59(10):2942-2949, October 2012. [ bib | .pdf ]
We present and analyze the behavior of an evolutionary algorithm designed to estimate the parameters of a complex organ behavior model. The model is adaptable to account for patient specificities. The aim is to finely tune the model so that it is accurately adapted to various real patient datasets. It can then be embedded, for example, in high-fidelity simulations of the human physiology. We present here an application focused on respiration modeling. The algorithm is automatic and adaptive. A compound fitness function has been designed to take into account the various quantities that have to be minimized. The algorithm's efficiency is experimentally analyzed on several real test cases: i) three patient datasets acquired with the breath-hold protocol, and ii) two datasets corresponding to 4D CT scans. Its performance is compared with two traditional methods (downhill simplex and conjugate gradient descent), a random search and a basic real-valued genetic algorithm. The results show that our evolutionary scheme provides significantly more stable and accurate results.

[6] Leonardo Trujillo, Gustavo Olague, Evelyne Lutton, Francisco Fernandez de Vega, Leon Dozal, and Eddie Clemente. Speciation in behavioral space for evolutionary robotics. Journal of Intelligent & Robotic Systems, pages 1-29, 2011. [ bib | http | .pdf ]
In Evolutionary Robotics, population-based evolutionary computation is used to design robot neurocontrollers that produce behaviors which allow the robot to fulfill a user-defined task. However, the standard approach is to use canonical evolutionary algorithms, where the search tends to make the evolving population converge towards a single behavioral solution, even if the high-level task could be accomplished by structurally different behaviors. In this work, we present an approach that preserves behavioral diversity within the population in order to produce a diverse set of structurally different behaviors that the robot can use. In order to achieve this, we employ the concept of speciation, where the population is dynamically subdivided into sub-groups, or species, each one characterized by a particular behavioral structure that all individuals within that species share. Speciation is achieved by describing each neurocontroller using a representation that we call a behavior signature; these are descriptors that characterize the traversed path of the robot within the environment. Behavior signatures are coded as character strings, which allows us to compare them using a string similarity measure; three measures are tested. The proposed behavior-based speciation is compared with canonical evolution and a method that speciates based on network topology. Experimental tests were carried out using two robot tasks (navigation and homing behavior), several training environments, and two different robots (Khepera and Pioneer), both real and simulated. Results indicate that behavior-based speciation increases the diversity of the behaviors based on their structure, without sacrificing performance. Moreover, the evolved controllers exhibit good robustness when the robot is placed within environments that were not used during training. In conclusion, the speciation method presented in this work allows an evolutionary algorithm to produce several robot behaviors that are structurally different but are all able to solve the same robot task.

[7] Cynthia Perez, Gustavo Olague, Francisco Fernandez, and Evelyne Lutton. An artificial life approach to dense stereo disparity. The Journal of the Artificial Life and Robotics, AROB, 13(2), March 2009. [ bib | .pdf ]
This paper presents an adaptive approach to improve the infection algorithm that we have used to solve the dense stereo matching problem. The algorithm presented in this paper incorporates two different epidemic automata within a single execution of the infection algorithm. The new algorithm attempts to provide a general behaviour for guessing the best correspondence between a pair of images. Our aim is to provide a new strategy, inspired by evolutionary computation, which combines the behaviours of both automata within a single correspondence problem. The new algorithm decides which automaton will be used based on the transmission of information and mutation, as well as on the attributes, texture and geometry, of the input images. This article gives details about how the rules of the infection algorithm are coded. Finally, we show experiments with a real stereo pair, as well as with a standard test bed, to show how the infection algorithm works.

[8] P. Legrand, C. Bourgeois-Republique, V. Pean, E. Harboun-Cohen, J. Lévy Véhel, B. Frachet, E. Lutton, and P. Collet. Interactive evolution for cochlear implants fitting. GPEM, 8(4):319-354, December 2007. Special Issue on Medical Applications. [ bib | .pdf ]
Cochlear implants are devices that are becoming more and more sophisticated and adapted to the needs of patients, but at the same time they are becoming more and more difficult to tune. After a deaf patient has been surgically implanted, a specialised medical practitioner has to spend hours over months to precisely fit the implant to the patient. This is a complex process involving two intertwined tasks: the practitioner has to tune the parameters of the device (optimisation) while the patient's brain needs to adapt to the new data it receives (learning). This paper presents a study that intends to make the implant more adaptable to the environment (auditory ecology) and to simplify the fitting process. Real experiments on volunteer implanted patients are presented, showing the efficiency of interactive evolution for this purpose.

[9] Gregory Valigiani, Cyril Fonlupt, Evelyne Lutton, and Pierre Collet. Optimisation par "hommilière" de chemins pédagogiques pour un logiciel de e-learning. TSI Techniques et Sciences Informatiques, 26(10):1245-1268, 2007. Hermès. [ bib | .pdf ]
This paper describes experiments aimed at adapting Ant Colony Optimisation (ACO) techniques to an e-learning environment, thanks to the fact that the available online material can be organised in a graph by means of hyperlinks between educational topics. The idea is to find paths in the graph making it easier for students to improve. ACO is based on an ant-hill metaphor. In this case, however, the agents that move on the graph are students who unconsciously leave pheromones in the environment. Tests showed that humans did not behave as ants, meaning that the ACO paradigm had to be modified so that it could work with human agents. A new word has been coined to describe the new paradigm: "man-hill" optimization.
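The pheromone bookkeeping behind the ant-hill (and man-hill) metaphor can be sketched as follows. This is a generic illustration under stated assumptions, not Paraschool's actual system: the evaporation rate, reward value, and graph are made up for the example.

```python
# Minimal sketch of ACO-style pheromone maintenance on a graph of
# pedagogical items: every cycle, all edges evaporate a little, and each
# traversal by an agent (here, a student) reinforces the edges it used.
# Parameter values (rho, reward) are illustrative assumptions.

def evaporate(pheromone, rho=0.1):
    """Decay every edge's pheromone by a factor (1 - rho)."""
    return {edge: (1.0 - rho) * tau for edge, tau in pheromone.items()}

def deposit(pheromone, path, reward=1.0):
    """Reinforce the edges along one traversal of the graph."""
    updated = dict(pheromone)
    for edge in zip(path, path[1:]):
        updated[edge] = updated.get(edge, 0.0) + reward
    return updated

# Edges between hypothetical pedagogical items A -> B -> C.
tau = {("A", "B"): 1.0, ("B", "C"): 1.0}
tau = evaporate(tau)            # both edges decay towards 0.9
tau = deposit(tau, ["A", "B"])  # the used edge A->B is reinforced
```

Edges that students keep using accumulate pheromone faster than evaporation removes it, which is the signal the guiding system reads.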

[10] L. Trujillo, G. Olague, P. Legrand, and E. Lutton. Regularity-based descriptor computed from local image oscillations. Optics Express, on-line journal of the Optics Society of America, OSX, 15(10):6140-6145, 2007. [ bib | .pdf ]
This work presents a novel local image descriptor based on the concept of pointwise signal regularity. Local image regions are extracted using either an interest point or an interest region detector, and discriminative feature vectors are constructed by uniformly sampling the pointwise Hölderian regularity around each region center. Regularity estimation is performed using local image oscillations, the most straightforward method directly derived from the definition of the Hölder exponent. Furthermore, estimating the Hölder exponent in this manner has proven to be superior when compared to wavelet-based estimation. Our descriptor shows invariance to illumination change, JPEG compression, image rotation and scale change. Results show that the proposed descriptor is stable with respect to variations in imaging conditions, and reliable performance metrics prove it to be comparable and in some instances better than SIFT, the state of the art in local descriptors.
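The oscillation-based estimation mentioned above can be sketched in one dimension: the oscillation of a signal around a point is its max-minus-min over a window of radius tau, and the Hölder exponent is the slope of log(oscillation) versus log(tau) as tau shrinks. This is an illustrative reconstruction, not the authors' code; the test signal and the set of radii are assumptions.

```python
# Sketch: estimate a pointwise Hölder exponent from local oscillations
# via a log-log least-squares fit. Radii are in samples.
import math

def oscillation(signal, center, radius):
    """Max minus min of the signal in a window of the given radius."""
    lo = max(0, center - radius)
    hi = min(len(signal), center + radius + 1)
    window = signal[lo:hi]
    return max(window) - min(window)

def holder_exponent(signal, center, radii=(1, 2, 4, 8, 16)):
    """Least-squares slope of log(oscillation) against log(radius)."""
    xs, ys = [], []
    for r in radii:
        osc = oscillation(signal, center, r)
        if osc > 0:
            xs.append(math.log(r))
            ys.append(math.log(osc))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
        (x - mx) ** 2 for x in xs
    )

# For f(t) = |t|^0.5 sampled around t = 0, the oscillation over radius r
# scales like r^0.5, so the fitted slope recovers the exponent 0.5.
samples = [abs(t / 1000.0) ** 0.5 for t in range(-1000, 1001)]
alpha = holder_exponent(samples, center=1000)
```

A small alpha means a locally irregular signal; sampling it around a region center, as the abstract describes, yields the descriptor's feature vector.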

[11] Evelyne Lutton. Evolution artificielle et applications industrielles. Revue de l'Electricité et de l'Electronique, (8), September 2006. [ bib | http ]
The computational transposition of the principles of natural evolution according to Charles Darwin underlies a family of stochastic optimisation techniques (genetic algorithms, evolution strategies, or more generally evolutionary algorithms) that are increasingly appreciated for their flexibility and efficiency. We give here a concise presentation of artificial evolution methods. These tools have a very broad field of application, which is not limited to pure optimisation. Their use nevertheless comes at a significant computational cost, hence the need to understand the mechanisms of artificial evolution well in order to adapt and tune their components effectively. Moreover, the flagship applications of this field are often based on hybridisation with other optimisation techniques. Evolutionary algorithms should therefore not be seen as competitors of more classical optimisation algorithms, but rather as complementary to them.

[12] Evelyne Lutton, Gustavo Olague, and S. Cagnoni. Introduction to the special issue on evolutionary computer vision and image understanding. Pattern Recognition Letters, 27(11):1161-1163, August 2006. [ bib | .pdf ]
 

[13] Pierre Collet, Evelyne Lutton, and Gregory Valigiani. Etude comportementale des hommilières pour l'optimisation. EpiNet, EPI Electronic Magazine,  (83), March 2006. [ bib | http ]
An Ant Colony Optimisation technique has been implemented in order to help students visit pedagogical items proposed by Paraschool (a leading French e-learning company). The large number of students (more than 250,000) suggested using the students themselves as artificial ants, leaving stigmergic information on the web-site graph. This difference brought many changes to the original ACO process, but also a large improvement in the students' guiding system. The concept of Man-Hill has therefore been introduced. At this stage, it becomes necessary to rate the pedagogical items in order to refine the model and propose items corresponding to the students' level. The Elo rating (used in chess competitions) is used for this purpose. As a side effect, it also proved to be a powerful audit system that can track semantic problems in exercises.

[14] G. Valigiani, E. Lutton, Y. Jamont, R. Biojout, and P. Collet. Automatic rating process to audit a man-hill. WSEAS Transactions on Advances in Engineering Education, 3(1):1-7, January 2006. ISSN 1790-1979. [ bib | .pdf ]
An Ant Colony Optimisation technique has been implemented in order to help students roaming among pedagogical items proposed by the Paraschool system (a leading French e-learning company). The large number of students raised the idea of using the students themselves, instead of artificial ants, to leave stigmergic information on the web-site graph. This difference brought many changes to the original ACO process, but also a large improvement in the students' guiding system. The concept of “Man-Hill” has therefore been introduced. At this stage, the need to rate pedagogical items arises in order to present students with strength-adapted items. The Elo automatic chess rating process has been applied to the Paraschool system. Thanks to this mechanism, both students and pedagogical items can be rated. As a side effect, it is also a powerful audit system that can track semantic problems in exercises.
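The Elo process referred to above can be sketched as a student/exercise "duel": solving the exercise counts as a win for the student, and both ratings move by the usual Elo update. This is a hedged illustration; the function names, initial ratings, and K-factor are assumptions, not the paper's implementation.

```python
# Sketch of the standard Elo update applied to rating both students and
# pedagogical items. K controls how fast ratings move; 32 is a common
# illustrative choice, not Paraschool's setting.

def expected_score(r_a: float, r_b: float) -> float:
    """Elo model probability that player A beats player B."""
    return 1.0 / (1.0 + 10 ** ((r_b - r_a) / 400.0))

def elo_update(r_student: float, r_item: float, solved: bool, k: float = 32.0):
    """Return updated (student, item) ratings after one attempt."""
    e = expected_score(r_student, r_item)
    s = 1.0 if solved else 0.0
    return r_student + k * (s - e), r_item + k * ((1.0 - s) - (1.0 - e))

student, item = 1200.0, 1200.0
student, item = elo_update(student, item, solved=True)
# Equal ratings give an expected score of 0.5, so the student gains
# k/2 = 16 points and the item loses 16 (ratings are zero-sum here).
```

The audit effect mentioned in the abstract follows naturally: an exercise whose rating drifts in an implausible direction relative to its difficulty is a candidate for a semantic problem.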

[15] Yann Landrin-Schweitzer, Pierre Collet, and Evelyne Lutton. Introducing lateral thinking in search engines. GPEM, Genetic Programming and Evolvable Machines Journal, W. Banzhaf et al. Eds., 7(1):9-31, 2006. [ bib | .pdf ]
Too much information kills information. This common statement applies to huge databases, where state of the art search engines may retrieve hundreds of very similar documents for a precise query. In fact, this is becoming so problematic that Novartis Pharma, one of the leaders of the pharmaceutical industry, has come up with the somewhat odd request to decrease the precision of their search engine, in order to keep some diversity in the retrieved documents. Rather than decreasing precision by introducing random noise, this paper describes ELISE, an Evolutionary Learning Interactive Search Engine that interactively evolves rewriting modules and rules (a kind of elaborated user profile) along a Parisian Approach. Additional documents are therefore retrieved that are related both to the domains of interest of the user and to the original query, with results that are suggestive of lateral thinking capabilities.

[16] Enrique Dunn, Gustavo Olague, and Evelyne Lutton. Parisian camera placement for vision metrology. Pattern Recognition Letters, 27(11):1209-1219, 2006. [ bib | .pdf ]
This paper presents a novel camera network design methodology based on the Parisian evolutionary computation approach. This methodology proposes to partition the original problem into a set of homogeneous elements, whose individual contribution to the problem can be evaluated separately. A population comprised of these homogeneous elements is evolved with the goal of creating a single solution by a process of aggregation. The goal of the Parisian evolutionary process is to locally build better individuals that jointly form better global solutions. The implementation of the proposed approach requires addressing aspects such as problem decomposition and representation, local and global fitness integration, as well as diversity preservation mechanisms. The benefit of applying the Parisian approach to our camera placement problem is a substantial reduction in the computational effort expended in the evolutionary optimisation process. Moreover, experimental results coincide with previous state of the art photogrammetric network design methodologies, while incurring only a fraction of the computational cost.

[17] Evelyne Lutton. Evolution of fractal shapes for artists and designers. IJAIT, International Journal of Artificial Intelligence Tools, 15(4):651-672, 2006. Special Issue on AI in Music and Art. [ bib | .pdf ]
We analyse in this paper the way randomness is considered and used in ArtiE-Fract. ArtiE-Fract is interactive software that allows the user (artist or designer) to explore the space of 2D fractal shapes with the help of an interactive genetic programming scheme. The basic components of ArtiE-Fract are first described, then we focus on its use by two artists, illustrated by samples of their works. These “real life” tests have led us to implement additional components in the software. It seems obvious to the people who use ArtiE-Fract that this system is a versatile tool for creation, especially regarding the specific use of controlled random components.

[18] G. Olague, F. Fernandez, C. Perez, and E. Lutton. The infection algorithm: an artificial epidemic approach for dense stereo correspondence. Artificial Life, 12(4):593-615, 2006. [ bib | http ]
We present a new bio-inspired approach applied to a problem of stereo image matching. This approach is based on an artificial epidemic process, which we call the infection algorithm. The problem at hand is a basic one in computer vision for 3D scene reconstruction. It has many complex aspects and is known as an extremely difficult one. The aim is to match the contents of two images in order to obtain 3D information that allows the generation of simulated projections from a viewpoint different from those of the initial photographs. This process is known as view synthesis. The algorithm we propose exploits the image contents in order to produce only the necessary 3D depth information, while saving computational time. It is based on a set of distributed rules, which propagate like an artificial epidemic over the images. Experiments on a pair of real images are presented, and realistic reprojected images have been generated.

[19] Olivier Pauplin, Jean Louchet, Evelyne Lutton, and Arnaud de la Fortelle. Evolutionary optimisation for obstacle detection and avoidance in mobile robotics. Journal of Advanced Computational Intelligence and Intelligent Informatics (JACIII), 9(6):622-629, 2005. Special Issue on ISCIIA'04. [ bib | .pdf ]
This paper presents an artificial evolution based method for stereo image analysis and its application to real-time obstacle detection and avoidance for a mobile robot. It uses the Parisian approach, which consists here in splitting the representation of the robot's environment into a large number of simple primitives, the "flies", which are evolved according to a biologically inspired scheme. Results obtained on real scenes with different fitness functions are presented and discussed, and an exploitation for obstacle avoidance in mobile robotics is proposed.

[20] Evelyne Lutton. Darwinisme artificiel: une vue d'ensemble. Revue Technique et Science Informatique, TSI, Traitement du Signal, numéro spécial "Méthodologie de la gestion intelligente des senseurs", 22(4):339-354, 2005. [ bib | .pdf ]
Genetic algorithms, genetic programming, evolution strategies, and what are now generally called evolutionary algorithms, are stochastic optimisation techniques inspired by Darwin's theory of evolution. We give here a global view of these techniques, emphasising the extreme flexibility of the concept of artificial evolution. This tool has a very broad field of application, which is not limited to pure optimisation. Its use nevertheless comes at a significant computational cost, hence the need to understand these evolution mechanisms well in order to adapt and tune the components of these algorithms effectively. Moreover, the flagship applications of this field are quite often based on hybridisation with other optimisation techniques. Evolutionary algorithms should therefore not be seen as an optimisation method competing with classical optimisation methods, but rather as a complementary approach.

[21] B. Leblanc, B. Braunschweig, H. Toulhoat, and E. Lutton. Improving the sampling efficiency of Monte Carlo molecular simulation: an evolutionary approach. Molecular Physics, 101(22):3293-3308, November 2003. [ bib | .pdf ]
We present a new approach in order to improve the convergence of Monte Carlo (MC) simulations of molecular systems belonging to complex energetic landscapes: the problem is redefined in terms of the dynamic allocation of MC move frequencies depending on their past efficiency, measured with respect to a relevant sampling criterion. We introduce various empirical criteria with the aim of accounting for the proper convergence in phase space sampling. The dynamic allocation is performed over parallel simulations by means of a new evolutionary algorithm involving 'immortal' individuals. The method is benchmarked against conventional procedures on a model for melt linear polyethylene. We record significant improvement in sampling efficiencies, and thus in computational load, while the optimal sets of move frequencies are liable to allow interesting physical insights into the particular systems simulated. This last aspect should provide a new tool for designing more efficient new MC moves.
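The dynamic allocation idea above can be sketched as a reweighting of move frequencies in proportion to each move type's measured past efficiency, with a floor so that no move is starved. This is a hypothetical illustration; the move names, efficiency scores, and floor value are assumptions, not the paper's criteria.

```python
# Sketch: turn per-move efficiency scores into new, normalised move
# frequencies, keeping every move above a minimum frequency so it can
# still be re-evaluated later.

def reallocate(efficiencies, floor=0.05):
    """Map per-move efficiency scores to normalised move frequencies."""
    total = sum(efficiencies.values())
    raw = {m: e / total for m, e in efficiencies.items()}
    # Enforce a minimum frequency, then renormalise.
    clipped = {m: max(f, floor) for m, f in raw.items()}
    norm = sum(clipped.values())
    return {m: f / norm for m, f in clipped.items()}

# Hypothetical move types for a polymer simulation.
freqs = reallocate({"translation": 8.0, "rotation": 1.5, "reptation": 0.5})
# Frequencies sum to 1 and favour the historically efficient moves.
```

Repeating this reallocation as efficiency statistics accumulate is the "dynamic" part; the paper additionally evolves the allocations across parallel simulations.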

[22] Hatem Hamda, François Jouve, Evelyne Lutton, Marc Schoenauer, and Michèle Sebag. Compact unstructured representations for evolutionary design. Applied Intelligence,  (16):139-155, 2002. [ bib | .pdf ]
This paper proposes a few steps to escape structured extensive representations for evolutionary solving of Topological Optimum Design (TOD) problems: early results have shown the ability of Evolutionary methods to find numerical solutions to yet unsolved TOD problems, but those approaches were limited because the complexity of the representation was that of a fixed underlying mesh. Different compact unstructured representations are introduced, the complexity of which is self-adaptive, i.e. is evolved by the algorithm itself. The Voronoi-based representations are variable length lists of alleles that are directly decoded into shapes, while the IFS representation, based on fractal theory, involves a much more complex morphogenetic process. First results demonstrate that Voronoi-based representations allow one to push further the limits of Evolutionary Topological Optimum Design by actually removing the correlation between the complexity of the representations and that of the discretization. Further comparative results among all these representations on simple test problems indicate that the complex causality in the IFS representation disfavors it compared to the Voronoi-based representations.

[23] Benoit Leblanc, Evelyne Lutton, and Françoise Axel. Genetic algorithms as a tool in the study of aperiodic long range order: the case of X-ray diffraction spectra of GaAs-AlAs multilayer heterostructures. The European Physical Journal B, 29(4):619-629, 2002. [ bib | .pdf ]
We present the first application of Genetic Algorithms to the analysis of data from an aperiodically ordered system, high resolution X-ray diffraction spectra from multilayer heterostructures arranged according to a deterministic or random scheme. This method paves the way to the solution of the “inverse problem”, that is, the retrieval of the generating disorder from the investigation of the spectra of an unknown sample having non-crystallographic, non-quasi-crystallographic order.

[24] Pierre Collet, Evelyne Lutton, Frédéric Raynal, and Marc Schoenauer. Polar IFS + Parisian genetic programming = efficient IFS inverse problem solving. Genetic Programming and Evolvable Machines Journal, 1(4):339-361, October 2000. [ bib | .pdf ]
This paper proposes a new method for treating the inverse problem for Iterated Function Systems (IFS) using Genetic Programming. This method is based on two original aspects. On the fractal side, a new representation of the IFS functions, termed Polar Iterated Function Systems, is designed, shrinking the search space to mostly contractive functions. Moreover, the Polar representation gives direct access to the fixed points of the functions. On the evolutionary side, a new variant of GP, the "Parisian" approach, is presented. The paper explains its similarity to the "Michigan" approach of Classifier Systems: each individual of the population only represents a part of the global solution. The solution to the inverse problem for IFS is then built from a set of individuals. A local contribution to the global fitness of an IFS is carefully defined for each one of its member functions and plays a major role in the fitness of each individual. It is argued here that both proposals result in a large improvement in the algorithms. We observe a drastic cut-down in CPU time, obtaining good results with small populations in a few generations.

[25] E. Lutton and J. Lévy Véhel. Hölder functions and deception of genetic algorithms. IEEE Transactions on Evolutionary Computation, 2(2):56-72, July 1998. [ bib | .pdf ]
We present a deception analysis for Hölder functions. Our approach uses a decomposition on the Haar basis, which reflects in a natural way the Hölder structure of the function. It allows us to relate deception, the Hölder exponent, and some parameters of the genetic algorithm (GA). These results prove that deception is connected to the irregularity of the fitness function, and shed new light on the schema theory. In addition, this analysis may assist in understanding the influence of some of the parameters on the performance of a GA.

[26] E. Lutton, J. Lévy Véhel, G. Cretin, P. Glevarec, and C. Roll. Mixed IFS: resolution of the inverse problem using genetic programming. Complex Systems, 9:375-398, 1995. (see also Inria Research Report No 2631). [ bib | .pdf ]
We address here the resolution of the so-called inverse problem for IFS. This problem has already been widely considered, and some studies have been performed for affine IFS, using deterministic or stochastic methods (Simulated Annealing or Genetic Algorithms). When dealing with non-affine IFS, the usual techniques do not perform well, unless some a priori hypotheses on the structure of the IFS (number and type of functions) are made. In this work, a Genetic Programming method is investigated to solve the "general" inverse problem, which makes it possible to perform numeric and symbolic optimization at the same time. The use of "mixed IFS", as we call them, may enlarge the scope of some applications, such as image compression, because they allow coding a wider range of shapes.

[27] Jacques Lévy Véhel, Khalid Daoudi, and Evelyne Lutton. Fractal modeling of speech signals. Fractals, 2(3):379-382, September 1994. [ bib | .pdf ]
In this paper, we present a method for speech signal analysis and synthesis based on IFS theory. We consider a speech signal as the graph of a continuous function whose irregularity, measured in terms of its local Hölder exponents, is arbitrary. We extract a few remarkable points in the signal and perform a fractal interpolation between them using a classical technique based on IFS theory. We thus obtain a functional representation of the speech signal, which is well adapted to various applications, such as voice interpolation.

[28] Evelyne Lutton, Henri Maitre, and Jaime Lopez-Krahe. Determination of vanishing points using Hough transform. PAMI, IEEE Transactions on Pattern Analysis and Machine Intelligence, 16(4):430-438, April 1994. [ bib | .pdf ]
We propose a method to locate three vanishing points on an image, corresponding to three orthogonal directions of the scene. This method is based on two cascaded Hough transforms. We show that, even in the case of synthetic images of high quality, a naive approach may fail, essentially due to the limitation of the image size. We take into account these errors as well as errors due to detection inaccuracy of the image segments, and provide a method that remains efficient even in the case of real complex scenes.


This file was generated by bibtex2html 1.96.