ESM'2007, October 22-24, 2007, Westin Dragonara Hotel, St.Julian's, Malta, Conference Keynotes


The ESM'2007 conference will feature an opening speech by the Minister for Investment, Industry and Information Technology of Malta.

Keynote Speakers

The conference will feature the following keynotes:

Pierre L'Ecuyer,
Département d'Informatique et de Recherche Opérationnelle
Université de Montréal, C.P. 6128, Succ. Centre-Ville
Montréal (Québec), H3C 3J7, Canada

Why Is Variance Reduction So Important?

Monte Carlo simulation is an incredibly versatile tool for studying complex stochastic systems. By replicating the simulation several times independently, one can in principle estimate performance measures of the system to arbitrary accuracy. Decisions and operating rules can also be optimized via simulation. A major drawback, however, is that the method converges very slowly and often requires an excessive amount of computing time.

Efficiency improvement methods provide ways of either reducing the required computing time for a given target accuracy, or obtaining an estimator with better accuracy for a given computational budget. Variance reduction is one way to improve efficiency. Many key ideas in variance reduction were introduced in the early days of the Monte Carlo method, in the late 1940s at Los Alamos, but enormous progress has been made since then in our understanding of these methods.

This talk is a guided tour of some situations where efficiency improvement is essential for the simulation approach to be viable. We will discuss importance sampling and splitting for rare-event simulation, common random numbers and their synchronization for comparing similar systems and for optimization, perturbation analysis for derivative estimation, smoothing estimators by taking conditional expectations, exploiting auxiliary information via control variates, and reducing the noise via stratification and quasi-Monte Carlo.
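As a toy illustration of the first technique mentioned above (a hypothetical example constructed for this page, not taken from the talk), the following Python sketch uses importance sampling to estimate the rare-event probability P(X > 4) for a standard normal X. Sampling instead from a normal shifted into the rare region and reweighting each draw by the likelihood ratio makes nearly every sample informative, whereas crude Monte Carlo would see only a handful of hits:

```python
import math
import random

def importance_sampling_estimate(n=100_000, seed=7):
    """Estimate P(X > 4) for X ~ N(0, 1) by sampling from the shifted
    proposal N(4, 1) and weighting by the likelihood ratio.

    For densities p ~ N(0,1) and q ~ N(4,1), the ratio simplifies to
    p(y)/q(y) = exp(8 - 4*y).
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        y = rng.gauss(4.0, 1.0)            # draw from the proposal N(4, 1)
        if y > 4.0:                        # indicator of the rare event
            total += math.exp(8.0 - 4.0 * y)  # likelihood-ratio weight
    return total / n
```

The true probability is about 3.17e-5; with 100,000 proposal draws roughly half fall in the rare region, so the weighted estimator is accurate to within a few percent, while crude Monte Carlo with the same budget would observe only about three events.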

We will provide concrete examples where a clever use of these methods makes a huge difference in the computing time required to obtain a given accuracy.
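In the same spirit, here is a minimal control-variate sketch (again an illustrative example, not material from the talk): we estimate E[exp(U)] for U uniform on (0, 1), using U itself as the control variate since its mean 1/2 is known exactly. The exact answer is e - 1, so the gain in accuracy can be checked directly:

```python
import math
import random

def estimate_plain_and_cv(n=100_000, seed=12345):
    """Estimate E[exp(U)], U ~ Uniform(0,1), without and with a control
    variate. Returns (plain mean, CV mean, plain variance, CV variance)."""
    rng = random.Random(seed)
    xs = [rng.random() for _ in range(n)]
    ys = [math.exp(x) for x in xs]
    mean_y = sum(ys) / n
    mean_x = sum(xs) / n
    # Optimal coefficient beta* = Cov(Y, U) / Var(U), estimated from the sample.
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / n
    var_x = sum((x - mean_x) ** 2 for x in xs) / n
    beta = cov / var_x
    # Control-variate estimator: Z = Y - beta * (U - E[U]), with E[U] = 1/2 known.
    zs = [y - beta * (x - 0.5) for x, y in zip(xs, ys)]
    mean_z = sum(zs) / n
    var_y = sum((y - mean_y) ** 2 for y in ys) / n
    var_z = sum((z - mean_z) ** 2 for z in zs) / n
    return mean_y, mean_z, var_y, var_z
```

Both estimators are unbiased for e - 1 ≈ 1.71828, but the per-sample variance drops from about 0.24 to about 0.004, a roughly sixty-fold reduction obtained simply by exploiting auxiliary information that is known exactly.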


Renate Sitte,
Griffith University,
Gold Coast Campus,
Fac. of Information and Communications Technologies,
9726 Gold Coast, Queensland, Australia

About the Predictability and Complexity of Complex Systems

Understanding and prediction are the ultimate goals in Modeling and Simulation. In this endeavor, we are faced with ever-increasing complexity. This complexity is twofold: ever more complex models and modeling techniques mapping to the increasingly complex phenomena that we are trying to simulate. With steadily rising computing capability, we are able to incorporate more complexity into our models and simulations. In turn, this has had an impact on our thinking paradigm and on our capability for abstraction and modeling techniques. So, when are things complex? Our thorough analysis has revealed that complexity can mean very different things to different people and in different contexts and environments. In many cases it refers to an aspect or part of the system under study; in others, to the magnitude and variety of the system itself. To some, “complexity” rather means the unknown: that which cannot easily be understood, modeled or predicted with current knowledge. In this keynote address I will present a unifying and systematic approach to complexity, to bring some clarity into the unknown and to take a step further towards predictability. We will look at different types of complexity, ranging from simple structural and functional complexity to those really complex systems that change (but do not necessarily evolve) through their own dynamics. With this in mind, we will be able to identify what can be predicted and what cannot. Knowing this, we will be in a better position to find the unknown, which is the most important step towards the prediction of complex systems.


Invited Speaker

Eugene Kindler, Ostrava University
Faculty of Sciences, Dept. of Mathematics,
Ostrava, Czech Republic

SIMULA and 40 Years of Object-Oriented Programming

Forty years ago, the IFIP conference on simulation programming languages was held, at which certain principles of programming were presented. Among them were classes as structures of data and procedures, subclasses, and the virtuality of procedures; these were later covered by the term object-oriented programming (OOP). But other principles presented at the conference transcend OOP: some have their roots in earlier simulation languages, while others relate to the block orientation of programming, which was considered topical in the sixties, condemned in the seventies, and is nowadays step by step being accepted again as suitable and advantageous.
Two phases in the development of the programming paradigm will be analyzed, especially from the viewpoint of simulation: the first from 1958 to 1967, the second from 1968 to the present. The principles that transcend OOP enable, for example, (1) formalizing world views using concepts represented by classes (which, for different world views, can differ in their semantics despite having the same names), (2) nesting models, (3) secure communication among simulation models with different modeled time flows, and (4) nesting world views. Simulation thus becomes a vanguard of the computer-aided analysis and comparison of formal theories that use mutually corresponding concepts with the same names but different interpretations. Illustrations of applications in industry, logistics and services will be presented.
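The SIMULA 67 principles named above — classes bundling data with procedures, subclasses, and virtual procedures redefined by subclasses — can be sketched in any modern object-oriented language. The following Python fragment is an illustrative analogue (not SIMULA code, and not from the talk):

```python
class Vehicle:
    """A class bundles data (attributes) with procedures (methods) --
    the SIMULA 67 idea later named object-oriented programming."""

    def __init__(self, speed):
        self.speed = speed          # data held by each object

    def describe(self):             # procedure operating on that data
        return f"{self.kind()} moving at {self.speed} km/h"

    def kind(self):                 # "virtual" procedure: subclasses may redefine it
        return "vehicle"


class Truck(Vehicle):               # subclass: inherits data and procedures
    def kind(self):                 # late binding selects this override at run time
        return "truck"


print(Vehicle(50).describe())       # prints "vehicle moving at 50 km/h"
print(Truck(80).describe())         # prints "truck moving at 80 km/h"
```

Note that `describe`, written once in the base class, automatically calls the redefined `kind` of `Truck` — the virtual-procedure mechanism that SIMULA introduced and that every later OOP language inherited.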
