DEXA 2016 Keynote Talks
Bruno Buchberger
Johannes Kepler University (JKU) in Linz, Austria
"From Natural Language to Automated Reasoning"
Abstract: We outline the possible interaction between knowledge mining, natural language processing, sentiment analysis, database systems, ontology technology, algorithm synthesis, and automated reasoning for enhancing the sophistication of web-based knowledge processing.
We focus, in particular, on the transition from parsed natural language texts to formal texts in the frame of logical systems and the potential impact of automating this transition on methods for finding hidden knowledge in big (or small) data and the automated composition of algorithms (cooperation plans for networks of application software).
Simple cooperation apps like IFTTT and the new version of Siri demonstrate the power of (automatically) combining clusters of existing applications under the control of desires expressed in natural language.
In the speaker's Theorema Working Group, quite powerful algorithm synthesis methods have been developed that can generate algorithms for relatively difficult mathematical problems. These methods are based on automated reasoning and start from formal problem specifications in the frame of predicate logic. We ask ourselves how the deep reasoning used in mathematical algorithm synthesis could be combined with recent advances in natural language processing to reach a new level of intelligence in the communication between humans and the web for everyday and business applications.
The talk is expository and tries to draw a big picture of how we could and should proceed in this area, but it will also explain some technical details and demonstrate some surprising results in the formal reasoning aspect of the overall approach.
Short Bio: Prof. Bruno Buchberger is an international researcher, best known for his invention of the theory of Gröbner bases (established in his PhD thesis, 1965), which provides a general algorithmic method for solving fundamental problems in non-linear polynomial systems. His current main research topic is automated mathematical theory exploration (the "Theorema Project") with application to the automation of invention and the derivation of hidden knowledge in big data. He has led numerous research and industrial projects and has been a consultant to the Austrian and Upper Austrian governments. For his contribution to the economic and technological development of Austria, he was elected "Austrian of the Year" 2010 (by the Austrian newspaper Die Presse) and has received numerous other Austrian awards.
Torben Bach Pedersen
Aalborg University, Denmark
"Managing Big Multidimensional Data: A Journey from Acquisition to Prescriptive Analytics"
Abstract: More and more data is being collected from a variety of new sources such as sensors, smart devices, social media, crowd-sourcing, and (Linked) Open Data. Such data is large, fast, and often complex. There is a universal wish to perform multidimensional OLAP-style analytics on such data, i.e., to turn it into "Big Multidimensional Data". The keynote will look at challenges and solutions in managing Big Multidimensional Data. This is a multi-stage journey from its initial acquisition, over cleansing and transformation, to (distributed) storage, indexing, and query processing, further on to building (predictive) models over it, and ultimately performing prescriptive analytics that couples analytics with optimization to suggest optimal actions. A number of case studies from advanced application domains such as Smart Energy, Smart Transport, and Smart Logistics will be used for illustration.
Short Bio: Torben Bach Pedersen is a Professor of Computer Science at Aalborg University, Denmark. His research interests include many aspects of Big Data analytics, with a focus on technologies for "Big Multidimensional Data" - the integration and analysis of large amounts of complex and highly dynamic multidimensional data in domains such as logistics (indoor/outdoor moving objects), smart grids (energy data management), transport (GPS data), and Linked Open Data. He is an ACM Distinguished Scientist, and a member of the Danish Academy of Technical Sciences, the SSTD Endowment, and the SSDBM Steering Committee. He has served as Area Editor for Information Systems and Springer EDBS, PC Chair for DaWaK, DOLAP, SSDBM, and DASFAA, and regularly serves on the PCs of the major database conferences.
Gottfried Vossen
University of Muenster, Germany
"The Price of Data"
Abstract: As data is becoming a commodity similar to electricity, as individuals become more and more transparent thanks to the comprehensive data traces they leave, and as data gets increasingly connected across company boundaries, the question arises of whether a price tag should be attached to data and, if so, what it should say. In this talk, the price of data is studied from a variety of angles and areas, including telecommunication, social networks, advertising, and automation; the issues discussed include aspects such as fair pricing, data ownership, and ethics. Special attention is paid to data marketplaces, where nowadays everybody can trade data, although the currency in which buyers are requested to pay may no longer be what they expect.
Short Bio: Gottfried Vossen is a Professor of Computer Science in the Department of Information Systems at the University of Muenster in Germany. He is a Fellow of the German Computer Science Society and an Honorary Professor at the University of Waikato Management School in Hamilton, New Zealand. He received his master’s and Ph.D. degrees as well as the German Habilitation from the Technical University of Aachen in Germany, and is an Editor-in-Chief of Elsevier's Information Systems - An International Journal. His current research interests include conceptual as well as application-oriented challenges concerning databases, information systems, business process modelling, Smart Web applications, cloud computing, and big data.
University of Linz, Austria
"The IS Perspective in E-Government Research and Development"
Abstract: Five years ago, DEXA established a conference line under the name EGOVIS to underscore the importance of the Information Systems view. Obviously, the Information Systems aspect plays a pivotal role in the whole field of E-Government Research and Development. The term denotes the study of organizational systems with a specific reference to information and the complementary networks of hardware and software. It comprises quite a broad scope and thus addresses a breadth of themes ranging from a strategic and design focus to managerial and operational questions. Accordingly, the IS perspective induces a holistic and comprehensive approach to E-Government R&D. In addition, it helps to comprehend and improve a series of current and novel innovations. The contribution first outlines the general merits of the IS perspective and then moves on to discussing some recent challenges. Respective themes comprise collaboration features, Mobile Government, Open Government, and modelling approaches.
Corvinus University of Budapest, Hungary
"Trust or security - stakeholders' responsibility"
Abstract: Security is one of the most frequently used terms in the world of ICT. Indeed, on closer inspection, information security is an important phenomenon: Gartner estimated the total expenditure on information security in 2015 at 75.4 billion USD, a 4.7% growth compared to 2014 spending.
Demand for security goes back in time well before the age of ICT, somehow in parallel with the level of vulnerability. Vulnerability is the likelihood of losing something that is in our possession, regardless of whether we worked for it or inherited it. The likelihood of losing something depends on the variety and extent of threats. The occurrence of a negative event that adversely affects individuals, organisations, physical objects, processes, etc. is often called risk. Risk management weighs each risk by its likelihood of occurrence and by the seriousness of the consequences of the bad event. This weighted risk is then set against the security measures. Security measures range from very simple physical solutions (e.g. fences), through logical-level solutions (e.g. authentication), up to strategic-level security governance.
If we raise the question of how much money it is worthwhile to spend on security in order to minimize the vulnerability (= the potential loss), the answer is easy and simple: spending on security is justified up to the level where the cost of security is still a little less than, or at most equal to, the value of the potential loss. This is the answer to the question, if… we are fully aware of the value of every property we have. But how can we translate the value of every property into monetary terms? Hardly at all.
From the angle of threat types, threats can be grouped in several ways. In the following we focus on information security only. The scale starts from blocking the use of an information system or information service (Denial of Service, DoS), through spamming, alteration of data, identity theft, diversion of money flows, and industrial espionage, up to destroying data. As a consequence there are tangible financial losses: e.g., if an outgoing invoice file is destroyed, the company cannot claim money from its customers. Besides financial losses, important know-how can be transferred to a competitor, which may create financial losses indirectly. On the level of individuals, personal or highly sensitive personal data (e.g. health status) can be stolen and misused. These and similar problems encourage ICT managers to increase the level of security, but the calculation of the needed level lies in a grey zone, since there is no exact information on the monetary sum of potential losses.
The problem is even more complex because, due to national security, anti-cyberterrorism, budgetary reasons (tax offices), competition law (anti-trust, anti-cartel actions), and many other community reasons, there is a social need for "legal backdoors" in properly secured information systems: the state eagerly needs the data of its enterprises and citizens. Without questioning the rationality of the state's data inquiries, this raises extra security and data privacy issues.
Back to the original question: what is the sufficient extent of security spending or effort?
The usual approach is weighted risk analysis (risk likelihood x impact): the foreseeable potential loss, weighted in this way, provides a reliable basis for judging what kind of security efforts are justified.
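As an illustration, the spending ceiling implied by weighted risk analysis can be sketched as follows; all threat figures and names here are hypothetical, not from the talk:

```python
# Illustrative weighted risk analysis (risk likelihood x impact).
# All figures below are hypothetical.

def weighted_risk(likelihood, impact):
    """Expected loss of one threat: probability of occurrence times impact."""
    return likelihood * impact

def justified_spend(threats):
    """Security spending is justified up to the total expected loss."""
    return sum(weighted_risk(p, loss) for p, loss in threats)

# Hypothetical threat portfolio: (annual likelihood, monetary impact in USD).
threats = [
    (0.05, 200_000),  # data breach
    (0.20, 10_000),   # denial of service
    (0.01, 500_000),  # industrial espionage
]

# Anything spent on security beyond this ceiling costs more than it protects.
print(justified_spend(threats))
```

The point the abstract makes is that the `impact` column is exactly what is missing in practice: the monetary value of the assets is rarely known.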
Behind the vulnerabilities and threats (risk) and the security measures, in most cases, strategies are at work. Strategy means in this case a properly defined goal or goals (to block services, to obtain secret or hidden information, etc.) and procedures aiming to reach those goals. On the other side, security management has the opposite goals and its own suitable procedures. Game theory typically addresses similar problems; therefore it is worthwhile to investigate whether game theory can be applied in the security domain. From a game theory angle this is an equilibrium problem, especially if we take its dynamic character into consideration.
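To make the game-theoretic reading concrete, here is a minimal sketch of a one-shot attacker-defender game; the payoff numbers and strategy names are illustrative assumptions, not from the talk:

```python
# A toy attacker-defender game with hypothetical payoffs, illustrating the
# idea of treating security as an equilibrium problem.
from itertools import product

# Payoff matrices: rows = defender action, columns = attacker action.
DEFEND, RELAX = 0, 1        # defender: invest in security, or not
ATTACK, ABSTAIN = 0, 1      # attacker: attack, or not

defender = [[-2, -1],       # defending is costly but limits damage
            [-10, 0]]       # relaxing invites a huge loss if attacked
attacker = [[-1, 0],        # attacking a defended system does not pay
            [5, 0]]         # attacking an undefended one does

def pure_nash(defender, attacker):
    """Return all pure-strategy Nash equilibria as (defender, attacker) pairs."""
    equilibria = []
    for d, a in product(range(2), range(2)):
        best_d = all(defender[d][a] >= defender[d2][a] for d2 in range(2))
        best_a = all(attacker[d][a] >= attacker[d][a2] for a2 in range(2))
        if best_d and best_a:
            equilibria.append((d, a))
    return equilibria

print(pure_nash(defender, attacker))
```

In this particular toy game the function finds no pure-strategy equilibrium: whatever fixed choice the defender makes, the attacker profits from deviating, and vice versa. Only a mixed (randomized) equilibrium exists, which mirrors the dynamic character of the security-trust interplay noted above.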
Having a deeper look into goals and strategies, we find that the goals are relatively stable, while strategies change frequently. The variety of strategies is partly explained by the fast-changing technology and partly due to the differing social environment. Social engineering has developed as a separate industry branch on the ground of the social components of security. Having analysed and decomposed the social environment, we found trust to be one of the most effective factors. Trust, in contrast to "hard" (= well-defined) security procedures, is a "soft" concept.
Trust, by a very general definition, is a set of beliefs according to which the other interacting parties are benevolent. Trust has different interpretations in psychology, sociology, social psychology, economics, and philosophy. Many other concepts are linked to the concept of trust, like reliance, trustworthiness, stereotypes, and values and value sharing; the list is quite long. From the security point of view, the social psychological interpretation looks the most relevant. In this approach, trust is a common belief in a given community (a society, or the affected part of a society) which is based on a combination of shared values and expectations. The recent Panama offshore case highlights the complexity of the trust issue. The Panamanian law firm Mossack Fonseca offered shell-corporation services for tax evasion, money laundering, and many other illegal actions. After the leaking of its customer list, investigative journalists published several very delicate findings, pushing governments to act. From the point of view of the law firm, the leak created a big security problem (an employee was the highest risk factor, as always); however, this security breach fit very well with high-priority social and ethical values, and, apart from the interested parties, citizens' belief in justice and order, in other words trust, was strengthened. The list of positive and negative examples is endless.
Trust can therefore also be reformulated as a strategy, in the sense of expectations and procedures. Expectations strongly correlate with shared values and with the variety of priority orders of values: the less variety in priority orders, the stronger the effect of the values on the procedures. Procedures are very much linked to the actions which may follow a negative effect on trust (being "betrayed", disappointment, losing confidence); in the above example, the publicity that will push governments to rethink regulations.
At this point there are different situations, different players, and different strategies on both the security and the trust "side". What is interesting is at which point the prevention of break-in actions (= security measures) will be in equilibrium with the actions to be taken (strategies) based on social, economic, and individual requirements (= trust). We believe the security and trust phenomena can be approached and investigated by seeking the equilibrium between them.
Plenty of research issues arise, just to mention a few: which equilibrium concept fits better, Nash or Pareto? How can the strategies behind trust be operationalized? What can be taken into account as payoff? How can we cope with the global character of the virtual world and the geographically diverse communities, and hence with the geographically and sociologically diverse nature of trust (its components and levels)?
Despite the many open questions, one conclusion can already be drawn: good security governance should address not only advanced technical and technological solutions but should also be open to trust issues; a security strategy must also be based on the socio-components of trust.
INESC TEC/ FCUP, Universidade do Porto, Portugal
"Scalable Online Top-N Recommender Systems"
Abstract: Given the large volumes of data recommender systems currently have to deal with, we need online stream based approaches that are able to cope with high throughput observations. In this talk I will describe the work on incremental matrix factorization approaches for binary ratings, starting with a general introduction, looking at various approaches and describing existing enhancements. I will also focus on adequate procedures for the evaluation of online recommender algorithms.
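As a rough illustration of the stream-based setting the abstract describes, the following is a minimal incremental matrix factorization sketch for positive-only (binary) feedback, updated one observation at a time; the class name, update rule, and hyperparameters are illustrative assumptions, not the speaker's exact algorithm:

```python
# Toy stream-based matrix factorization for positive-only feedback:
# each observed (user, item) event triggers one SGD step toward target 1.
import random

class IncrementalMF:
    def __init__(self, k=8, lr=0.05, reg=0.01, seed=0):
        self.k, self.lr, self.reg = k, lr, reg
        self.rng = random.Random(seed)
        self.P = {}  # user -> latent factor vector
        self.Q = {}  # item -> latent factor vector

    def _vec(self):
        return [self.rng.uniform(-0.1, 0.1) for _ in range(self.k)]

    def predict(self, u, i):
        p = self.P.setdefault(u, self._vec())
        q = self.Q.setdefault(i, self._vec())
        return sum(pf * qf for pf, qf in zip(p, q))

    def update(self, u, i):
        # One SGD step on a single observed event; the implicit target
        # for an observed (user, item) pair is 1.
        err = 1.0 - self.predict(u, i)
        p, q = self.P[u], self.Q[i]
        for f in range(self.k):
            pf, qf = p[f], q[f]
            p[f] += self.lr * (err * qf - self.reg * pf)
            q[f] += self.lr * (err * pf - self.reg * qf)

    def top_n(self, u, n):
        # Score every known item for user u and return the n best.
        return sorted(self.Q, key=lambda i: self.predict(u, i), reverse=True)[:n]

# Replay a tiny event stream of observed (user, item) pairs.
model = IncrementalMF()
stream = [("alice", "a"), ("alice", "b"), ("bob", "b"), ("bob", "c")] * 100
for u, i in stream:
    model.update(u, i)
print(model.top_n("alice", n=2))
```

The design point is that the model never revisits past events: each observation is consumed once, which is what makes the approach viable under high-throughput streams. Proper evaluation of such online recommenders (prequential protocols rather than a fixed train/test split) is the other theme of the talk.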
Stewart James Kowalski
Norwegian Information Security Lab, Norwegian University of Science and Technology
"Security Value Chains in Theory and Practice: Can We Trust the Cheapest Link?"