Invited Talks

Tuesday, August 30th

Patrick Valduriez, INRIA and LIRMM, Montpellier, France

Title:

"Principles of Distributed Data Management in 2020?"

Download Slides

ABSTRACT.

Distributed data management went from a potentially significant technology in the early 1980s to one that is now commonplace. The advent of high-speed networks, fast commodity hardware, and the web has certainly changed the way we typically look at data distribution. The third edition of the Özsu-Valduriez textbook Principles of Distributed Database Systems (Springer, 2011) reflects this evolution with in-depth treatment of recent topics such as peer-to-peer data management, web and XML data management, stream data management, and cloud data management. What is interesting is that the fundamental principles of distributed data management could still be presented based on the same architectural models of the earlier editions, which characterize distributed data management along three dimensions: distribution, heterogeneity, and autonomy of the data sources.
In retrospect, the focus on fundamental principles and generic techniques has been useful not only to understand and teach the material, but also to enable an infinite number of variations. The primary application of these generic techniques has obviously been in distributed and parallel DBMSs, which all DBMS vendors now provide. But many other systems, e.g. application servers, search engines, and directories, have used distributed data management techniques as well.
Today, to support the requirements of important data-intensive applications (e.g. social networks, web data analytics, scientific applications), new distributed data management techniques and systems (e.g. MapReduce, Hadoop, SciDB, Peanut, Pig Latin) are emerging and receiving much attention from the research community. Although they do well in terms of consistency/flexibility/performance trade-offs for specific applications, they seem to be ad hoc and might hurt data interoperability.
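To give a flavour of the programming model behind systems such as MapReduce and Hadoop, here is a minimal single-process sketch in Python; the map_phase/shuffle_phase/reduce_phase helpers are illustrative simplifications chosen for this page, not any system's actual API, and real engines add distribution across a cluster, fault tolerance, and a storage layer.

    # Minimal in-memory sketch of the MapReduce programming model (illustrative only).
    from collections import defaultdict
    from typing import Callable, Iterable, List, Tuple

    KV = Tuple[str, int]

    def map_phase(records: Iterable[str], mapper: Callable[[str], Iterable[KV]]) -> List[KV]:
        # Apply the user-defined mapper to every input record independently.
        return [kv for record in records for kv in mapper(record)]

    def shuffle_phase(pairs: Iterable[KV]) -> dict:
        # Group all intermediate values by key.
        groups = defaultdict(list)
        for key, value in pairs:
            groups[key].append(value)
        return groups

    def reduce_phase(groups: dict, reducer: Callable[[str, List[int]], KV]) -> List[KV]:
        # Apply the user-defined reducer once per key.
        return [reducer(key, values) for key, values in groups.items()]

    # Classic word-count example.
    def word_count_mapper(line: str) -> Iterable[KV]:
        return [(word, 1) for word in line.split()]

    def word_count_reducer(key: str, values: List[int]) -> KV:
        return (key, sum(values))

    if __name__ == "__main__":
        docs = ["the quick brown fox", "the lazy dog", "the quick dog"]
        counts = reduce_phase(shuffle_phase(map_phase(docs, word_count_mapper)),
                              word_count_reducer)
        print(sorted(counts))
        # [('brown', 1), ('dog', 2), ('fox', 1), ('lazy', 1), ('quick', 2), ('the', 3)]

The appeal of the model is that the user supplies only the two pure functions (mapper and reducer), while the system owns partitioning, scheduling, and recovery; the trade-offs mentioned above arise from what the system gives up (e.g. general transactions) to make that automation possible.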
The key questions I will discuss are: What are the fundamental principles behind the emerging solutions? Is there any generic architectural model to explain those principles? Do we need new foundations to look at data distribution?

SPEAKER BIOGRAPHY.

Patrick Valduriez is a senior researcher at INRIA, Montpellier. He has also been a professor of CS at University Paris 6 (1999-2002) and a researcher at Microelectronics and Computer Technology Corp. in Austin, Texas (1985-1989). He received his Ph.D. degree and Doctorat d'Etat in CS from University Paris 6 in 1981 and 1985, respectively. His research focuses on data management in large-scale distributed and parallel systems (P2P, cluster, grid, cloud), in particular scientific data management. He has authored and co-authored over 200 technical papers and several textbooks, among which “Principles of Distributed Database Systems”. He has been a member of the SIGMOD board, a trustee of the VLDB Endowment, and an associate editor of several journals, including ACM TODS, the VLDB Journal, Distributed and Parallel Databases, and Internet and Databases. He has served as PC chair of major conferences such as PDIS93, SIGMOD97, VLDB98 (industrial chair), and VLDB99 (European chair). He was the general chair of SIGMOD04, EDBT08, and VLDB09. He was the recipient of the 1993 IBM scientific prize in CS in France. He obtained the best paper award at VLDB00.



Wednesday, August 31st

Bernhard Thalheim, Christian-Albrechts-University Kiel, Germany

Title:

"The Science of Conceptual Modelling"

Download Slides

ABSTRACT.

Conceptual modelling is one of the central activities in Computer Science.
Conceptual models are mainly used as intermediate artifacts for system construction.
They are schematic descriptions of an origin, i.e. of a system, a theory, or a phenomenon, and thus form models. A conceptual model is a model enhanced by concepts. The process of conceptual modelling is governed by the purpose of the modelling and of the models. It is based on a number of modelling acts, on a number of correctness conditions, on modelling principles and postulates, and on paradigms of the background or substance theories. Purposes determine the (surplus) value of a model.
Conceptual modelling is performed by a modeller who directs the process based on his or her experience, education, understanding, intention, and attitude.
Conceptual models are products that are used by other stakeholders such as programmers, learners, business users, and evaluators. Conceptual models use a language as a carrier for the modelling artifact and are restricted by the expressiveness of this carrier.

This keynote aims at a discussion of a general theory of modelling as a culture and an art. A general theory of modelling also considers modelling as an apprenticeship and as a technology; it is thus an art. Modelling is one of the main elements of Computer Science culture, which consists of commonly accepted behaviour patterns, arts, consensus, institutions, and all other supporting means and thoughts.

SPEAKER BIOGRAPHY.

Bernhard Thalheim's research interests include database technology, database programming, (distributed) object-relational information systems, business informatics, web information systems, performance tuning and forecasting, data mining, data warehouses and OLAP foundations, content management, database and information systems theory, database systems and software architecture, discrete mathematics, and logics.
Bernhard Thalheim has published more than 300 research papers, authored three monographs, edited more than 30 conference and survey proceedings, and supervised more than 30 PhD students. His work was awarded the P.P. Chen Award by Elsevier in 2008, and since 2009 he has been an ER Fellow. He also holds the honorary Kolmogorov chair at Lomonosov University in Moscow, Russia. He received his diploma in mathematics from Dresden University of Technology in Germany, his PhD in mathematics from Lomonosov University in Moscow, Russia, and his advanced PhD (habilitation) in Computer Science from Dresden University of Technology in Germany.

He has chaired database groups at universities in Dresden, Kuwait, Rostock, Cottbus, and currently in Kiel. Through a cooperative program with one of the largest database consultancies, his group was able to collect a large library of database applications from around the world; the library contains around 4,500 existing and currently deployed database applications. This library is the basis for the development of universal and generic applications in areas such as e-commerce, web information systems, health care, telecommunication, insurance, travel, and financial services. These universal models led to the development of a theory of conceptual modelling for database applications.

He has chaired more than threescore projects aimed at the development of large-scale database systems, content management systems, database farms, and web information systems. The results and implementations of these projects have been and are being used in industry and government. More than a dozen of these projects were developed by teams spanning several groups in several countries.

Bernhard Thalheim has participated in the development of the foundations of conceptual modelling. One result of this work is a full-fledged theoretical foundation of the entity-relationship model. Another is a theory of modelling, models, and modelling activities, especially for conceptual models.



Thursday, September 1st

Gabriele Kern-Isberner, Technical University of Dortmund, Germany

Title:

"Probabilistic Logics in Expert Systems: Approaches, Implementations, and Applications"

Download Slides

ABSTRACT.

Authors of the paper:
Gabriele Kern-Isberner, Christoph Beierle, Matthias Thimm, and Marc Finthammer.

The handling of uncertain information is of crucial importance for the success of expert systems. This talk gives an overview of logic-based approaches to probabilistic reasoning and goes into more detail about recent developments in relational (i.e., first-order) probabilistic methods such as Markov logic networks and Bayesian logic programs. In particular, we will feature the maximum entropy approach as a powerful and elegant method that combines convenient knowledge representation with excellent inference properties, and present the KReator system as a versatile toolbox for probabilistic relational learning, modelling, and inference. Moreover, we will illustrate applications of probabilistic logics in various scenarios.
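For readers unfamiliar with the method, the propositional core of the maximum entropy approach can be sketched as follows (the notation here is chosen for illustration; the relational methods in the talk generalize this principle). Given a knowledge base of probabilistic conditionals, maximum entropy reasoning selects the single least-biased distribution among all distributions satisfying it:

\[
\mathcal{KB} = \{ (B_1 \mid A_1)[x_1], \ldots, (B_n \mid A_n)[x_n] \}, \qquad
P \models (B \mid A)[x] \;\text{ iff }\; P(B \mid A) = x,
\]
\[
P^{*}_{\mathrm{ME}}(\mathcal{KB}) \;=\; \operatorname*{arg\,max}_{P \,\models\, \mathcal{KB}} H(P),
\qquad H(P) \;=\; -\sum_{\omega \in \Omega} P(\omega) \log P(\omega).
\]

Queries are then answered by evaluating them under this uniquely determined distribution; lifting this kind of inference to first-order settings is what the relational approaches discussed in the talk aim at.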

SPEAKER BIOGRAPHY.

Gabriele Kern-Isberner received her diploma in mathematics in 1979 and her doctoral degree in mathematics in 1985, both from the University of Dortmund. In 2000, she completed her habilitation in computer science at the FernUniversitaet in Hagen, the German Open University, and received the venia legendi for computer science. She has worked as a research assistant and as a lecturer at the universities of Dortmund, Hagen, and Leipzig. Since 2004, she has been a Professor for Information Engineering in the department of computer science at the University of Dortmund (now Technische Universitaet Dortmund).

Prof. Kern-Isberner's scientific work focuses on qualitative and quantitative approaches to knowledge representation, such as default and non-monotonic logics, uncertain reasoning, probabilistic reasoning, belief revision, and argumentation, as well as multi-agent systems and knowledge discovery. In this wide scope of domains, she is pursuing in particular the development of methods that help to integrate approaches and points of view from different fields.

Prof. Kern-Isberner has organized workshops and conferences in the field of knowledge representation and reasoning, and has been involved in the organization of major conferences, e.g. ECAI, IJCAI, KR, ECSQARU, IPMU, and COMMA, to name just a few. Likewise, she has been a reviewer for leading journals in the field of artificial intelligence. Moreover, she is the speaker of the interest group on knowledge representation of the Gesellschaft fuer Informatik (German Informatics Society).



Thursday, September 1st

Ram Herkanaidu, Education Manager, Global Educational Programs Development, Kaspersky Lab

Title:

"The shifting security perimeter and the computer in your pocket"

ABSTRACT.

Today there's no clear distinction between the workplace and the rest of our lives. The boundary between what's personal and what's business is becoming blurred. Moreover, employees today are often required to be 'always on', conducting business not just from the office, but also at home, at the airport, in a hotel, and so on. Employees are also using mobile devices for work and for personal business, sometimes both on the same device. Yet there are dangers in mobile working that threaten to undermine corporate security. This presentation will outline the nature of the threat from mobile malware and the increasing risk of data leakage from mobile devices.