Overview of speakers and programme for the Big Techday 3 on Friday, May 28th, 2010
An overview of the programme can be found here.
Europe currently leads the global Square Kilometre Array (SKA) project: a future radio telescope aimed at answering fundamental questions about our Universe, and requiring significant technological innovation in electronics, ICT and green energy. The reach of SKA research will be broad: astroscience; detection, processing and analysis of data via hardware and software solutions of generic use, e.g. in medical sciences; renewable energy; defence and space science. The impact of the SKA makes it a priority for international funders working alongside commercial and non-commercial partners, as it bears directly on grand challenges such as security, the energy industry and leading-edge computing. As a science project of truly global scale addressing some of mankind’s most fundamental questions, it will build human capital, foster knowledge transfer, and act as an inspirational tool with unprecedented potential for discovery. As a driver of innovation, Bruce Elmegreen of IBM has noted that the SKA will “generate new ways of doing ICT that could revolutionize the world”. Comprising a collecting area of a million square metres, formed by several thousand receiving dishes and arrays of novel receiver technologies, the SKA will be constructed in either Southern Africa or Western Australia (the decision will be made in 2012).
Dr. Christian Kaiser, TNG
Many achievements of IT that are taken for granted today trace back to physics. Everyone knows the history of the World Wide Web, which started at CERN. But few people would guess that astrophysicists are responsible for technologies as common today as Wi-Fi. Bursts of innovation in fundamental physics research have almost always produced new ideas and technologies for IT, even if this is not always obvious.
What has physics ever done for IT? A lot, and a good deal more to come.
Prof. Dr. Marc Stamminger, Universität Erlangen, Chair for Graphical Data Processing
Computer graphics are omnipresent: in computer games, in movies, and in the daily TV news. The last 10 years in particular have seen tremendous development. Effects that once kept computers busy for days can now be rendered in real time, that is, within a few milliseconds. The main reasons are the rapid increase in the computing power of graphics cards and new display techniques that make optimal use of their enormous parallel computing power.
In this talk I will show examples from very different applications to illustrate what today's computer graphics can do. I will try to give an outlook on what will be possible in the medium term and where the challenges lie.
Dr. Michael Bussmann, Forschungszentrum Dresden-Rossendorf, Laser Particle Acceleration Division
Today's graphics cards can deliver a floating-point performance of several TFLOPS. They are thus an interesting platform for cheap, power-efficient high-performance computing. Building GPU-driven simulation software for real-world physics applications requires a hierarchical structuring of computation and communication tasks which goes beyond the simple message-passing paradigm widely used in HPC today. I will present a GPGPU implementation of a laser plasma simulation that can be scaled to large clusters and introduce performance tools that give information on process execution on CPUs and GPUs simultaneously. Finally, I will discuss why hierarchical algorithmic design cannot be avoided when aiming for large-scale simulations and how this will affect software engineering in science.
Business rules can quickly reach unmanageable complexity, be it for processing usage data in mobile networks, for validating insurance applications, or for checking medical bills. This talk highlights how this challenge is handled in practice and what experience has been gained with rules engines, based on projects in mobile communications, life insurance, and health care.
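To make the idea concrete, here is a minimal sketch of the core of a rules engine: rules are kept as data (a condition plus a message) separate from the code that applies them. All names and the sample rules are invented for this illustration and are not taken from any specific engine or project mentioned in the talk:

```python
# Minimal rules-engine sketch: each rule is a (predicate, message) pair.
# The rules below are illustrative, not real billing rules.

def validate_claim(claim, rules):
    """Apply every rule to the claim; collect the messages of those that fire."""
    return [message for condition, message in rules if condition(claim)]

# Hypothetical rules for checking a medical bill
rules = [
    (lambda c: c["amount"] > 10_000, "amount exceeds auto-approval limit"),
    (lambda c: not c["diagnosis_code"], "missing diagnosis code"),
    (lambda c: c["patient_age"] < 0, "implausible patient age"),
]

claim = {"amount": 12_500, "diagnosis_code": "", "patient_age": 47}
print(validate_claim(claim, rules))
# → ['amount exceeds auto-approval limit', 'missing diagnosis code']
```

The point of a real engine is that such rule sets can be edited by domain experts without redeploying the application; this sketch only shows the separation of rules from evaluation.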
Today, every software developer struggles with an ever higher frequency of changes and constantly shifting requirements. The times of the one true architecture are gone, and adding unit tests alone does not solve the problem. This talk shows how to choose your architectures, keep them alive, and adapt them to the challenge of dynamically evolving web applications. Additionally, we will show typical problems and pitfalls for developers, and how to avoid them.
Prof. Dr. Andreas Zeller, Universität des Saarlandes, Software Engineering Chair
Oops! My program just failed. Is there a programmer who has never experienced this? In his talk, Andreas Zeller presents techniques that (a) automatically detect problems - by learning "normal" behavior and searching for differences; (b) automatically determine failure causes - by systematically isolating the factors relevant to the failure; and (c) predict where problems will occur in the future - by learning which process and product features were correlated with errors in the past. Case studies on real programs with real errors, from AspectJ via Firefox to Windows, demonstrate the applicability and scalability of the proposed techniques.
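Point (b), systematically isolating the factors relevant to a failure, can be sketched in a few lines. The following is a simplified greedy reduction in the spirit of delta debugging (ddmin), not Zeller's actual implementation; the failure predicate is invented for the example:

```python
def simplify(failing_input, fails):
    """Greedily drop chunks of the input while the test still fails
    (a simplified take on delta debugging's ddmin)."""
    chunk = len(failing_input) // 2
    while chunk >= 1:
        i = 0
        while i < len(failing_input):
            candidate = failing_input[:i] + failing_input[i + chunk:]
            if fails(candidate):
                failing_input = candidate   # chunk was irrelevant: drop it
            else:
                i += chunk                  # chunk is needed: keep it, move on
        chunk //= 2
    return failing_input

# Hypothetical failure: the program crashes whenever 'X' and 'Y' both occur.
fails = lambda chars: "X" in chars and "Y" in chars
print(simplify(list("aXbcYd"), fails))
# → ['X', 'Y']  -- the minimal input that still triggers the failure
```

Starting from six characters, the reduction homes in on the two that actually matter, which is exactly the kind of automatic cause isolation the talk describes.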
Andreas Zeller is a full professor at Saarland University in Saarbrücken, Germany. His research concerns the analysis of large, complex software systems and their problems. His book "Why Programs Fail - A Guide to Systematic Debugging" received a Jolt Software Development Productivity Award in 2006.
Prof. Dr. Lutz Prechelt, professor of Informatics and Head of the Software Engineering Research Group at Freie Universität Berlin
The one topic about which every programmer has a really strong opinion: programming languages, and which of them is best or worst. One reason why the religious wars about languages have raged for decades is that strong empirical evidence is so scarce. This talk will present some such evidence.
Expect some of your prejudices to be confirmed and others to be heavily shaken!
Maike Kaufman, Robotics Research Group, Department of Engineering Science, University of Oxford
Nonparametric Bayesian Methods have generated much interest in the Machine Learning community in recent years, due to their ability to perform inference in models with infinite numbers of free parameters without over-fitting. Recent theoretical advancements, along with the continued increase in processing power, have allowed their application on data sets of considerable size and complexity.
I will outline the Bayesian nonparametric framework and introduce some of the most widely used methods, along with example applications.
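One of the most widely used constructions in this framework is the Chinese Restaurant Process, the prior over partitions that underlies Dirichlet process mixture models. The sketch below is illustrative only (the function name and parameters are invented for this example); it shows the key property of models with an unbounded number of parameters: the number of clusters is not fixed in advance but grows with the data:

```python
import random

def crp(n, alpha, seed=0):
    """Draw table assignments for n customers from a Chinese Restaurant
    Process with concentration parameter alpha."""
    rng = random.Random(seed)
    tables = []        # tables[k] = number of customers seated at table k
    assignment = []
    for i in range(n):
        # customer i joins existing table k with probability tables[k]/(i+alpha),
        # or opens a new table with probability alpha/(i+alpha)
        r = rng.uniform(0, i + alpha)
        acc = 0.0
        for k, count in enumerate(tables):
            acc += count
            if r < acc:
                tables[k] += 1
                assignment.append(k)
                break
        else:
            assignment.append(len(tables))
            tables.append(1)
    return assignment

seats = crp(100, alpha=1.0)
print(max(seats) + 1)  # number of clusters used; grows roughly like alpha*log(n)
```

Because new tables keep a nonzero probability of opening, the model can always accommodate new structure, which is why such priors avoid fixing the number of mixture components beforehand.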
Dr. Bernd Schönwälder, Managing Director, Nickwoods GmbH
The discussion about the functioning of the financial markets and the causes of their collapse is under way. It is seldom sufficiently visible that today's market mechanics are dominated by fully automated server infrastructures. In key markets, 60%-70% of all transactions are executed by specialized high-performance data centers without involving a human being. The competition to exploit every technical possibility has escalated into a cost-intensive arms race for all participants.
The bleeding edge of algorithmic trading: using examples and live demos, I will highlight the technical challenges, operating modes, and risks of this normally completely closed technical world.
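As a toy illustration of what "executing transactions without a human" means at its very simplest, here is a moving-average crossover strategy in Python. Everything here is invented for the sketch; real high-frequency systems react to order-book events at microsecond scale and bear little resemblance to this:

```python
def signals(prices, short=3, long=5):
    """Emit +1 (buy) when the short moving average crosses above the long
    one, -1 (sell) when it crosses below, and 0 otherwise. A toy strategy,
    not a model of actual high-frequency trading."""
    def sma(i, w):
        return sum(prices[i - w + 1:i + 1]) / w

    out = [0] * len(prices)
    for i in range(long, len(prices)):
        prev = sma(i - 1, short) - sma(i - 1, long)
        curr = sma(i, short) - sma(i, long)
        if prev <= 0 < curr:
            out[i] = 1      # short MA crossed above long MA: buy
        elif prev >= 0 > curr:
            out[i] = -1     # short MA crossed below long MA: sell
    return out

print(signals([1, 1, 1, 1, 1, 2, 3, 4, 5, 6]))  # a buy fires once the rise begins
```

The gap between this and production systems (co-location, kernel-bypass networking, FPGA feed handlers) is precisely the arms race described above.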
C++ is one of the most widely used languages in the software development landscape. But C++ has not seen a major update since 1998. Since then, many new and successful programming languages have entered the game, and many of them offer features that C++ is missing. Until now. The new, almost final C++ standard (a.k.a. C++0x) adds these missing pieces to the language, and a lot more. C++ is back!
A Coding Kata is a practical approach to learning programming best practices. Invented by Dave Thomas and inspired by martial arts, the Coding Kata is an exercise that helps developers hone their programming skills through practice and repetition.
Attendees be warned! You will see source code in this session.
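In that spirit, a classic kata is FizzBuzz: a small, well-defined problem solved repeatedly, usually test-first, to practise technique rather than to ship code. This Python version is illustrative and not taken from the session:

```python
def fizzbuzz(n):
    """The FizzBuzz kata: multiples of 3 become 'Fizz', multiples of 5
    become 'Buzz', multiples of both become 'FizzBuzz'."""
    if n % 15 == 0:
        return "FizzBuzz"
    if n % 3 == 0:
        return "Fizz"
    if n % 5 == 0:
        return "Buzz"
    return str(n)

# a kata is typically driven by tests written before the implementation
assert fizzbuzz(3) == "Fizz"
assert fizzbuzz(10) == "Buzz"
assert fizzbuzz(15) == "FizzBuzz"
assert fizzbuzz(7) == "7"
print("kata passed")
```

The value lies not in the solution itself but in repeating the exercise while varying the constraints (no if-statements, pure functions only, strict TDD, and so on).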
Scala is a general purpose programming language designed to express common programming patterns in a concise, elegant, and type-safe way. It smoothly integrates features of object-oriented and functional languages, enabling Java and other programmers to be more productive. Code sizes are typically reduced by a factor of two to three when compared to an equivalent Java application.
What happens when science fiction becomes battlefield reality? An amazing revolution is taking place on the battlefield, starting to change not just how wars are fought, but also the politics, economics, laws, and ethics that surround war itself. This upheaval is already afoot - remote-controlled drones take out terrorists in Afghanistan, while the number of unmanned systems on the ground in Iraq has gone from zero to 12,000 over the last five years. But it is only the start. Military officers quietly acknowledge that new prototypes will soon make human fighter pilots obsolete, while the Pentagon researches tiny robots the size of flies to carry out reconnaissance work now handled by elite Special Forces troops.
Scrum does not put any requirements on how to set up your development environment and on specific development practices. This talk shows how principles of Scrum and agile software development help projects gain technical excellence. We present best practices from several years' experience on agile projects, in particular:
The discussion about the right process model for projects has gained new momentum through the adaptation of Kanban, a method for production flow control, to IT projects. The presentation will start by looking from Scrum to Kanban and will highlight what is special about Kanban, where to use it, and how you can extend Scrum with ideas from Kanban.
Short reaction times as well as fast delivery are normally expected of development projects in the field of business intelligence and, more technically, data warehousing (DWH). Agile methods lend themselves to this task. A particular challenge is that in DWH projects you not only have to change complex processing logic, but also apply changes to extremely large sets of data.
We will highlight how agile techniques can be applied in a data warehousing environment to sustainably achieve short development cycles and add new functionality at constantly high quality while complexity grows.
Twitter, as a phenomenon of Web 2.0, has by now outgrown its hip image. 50 million tweets a day make Twitter an excellent data source for virtually everything connected to social interaction. While Twitter has grown up, complex event processing (CEP) is currently still more of a buzzword than an actual application, although it is quickly emerging as a future market. By cleverly combining established technologies originating in telecommunications, rules engines, and smartphone apps, we will show a new social network platform based on Twitter in a live demo and explain the basics of CEP as well as the technology used.
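The basic idea of CEP - deriving a higher-level composite event from a pattern over a stream of low-level events - can be sketched in a few lines. The scenario, names, and thresholds below are invented for this illustration and are not the platform shown in the talk:

```python
from collections import deque

def detect_bursts(events, keyword, window=60, threshold=3):
    """Tiny complex-event-processing sketch: raise a composite event
    whenever `threshold` tweets containing `keyword` arrive within
    `window` seconds of each other."""
    recent = deque()   # timestamps of matching tweets in the current window
    alerts = []
    for timestamp, text in events:
        if keyword in text:
            recent.append(timestamp)
            # drop matches that have slid out of the time window
            while recent and timestamp - recent[0] > window:
                recent.popleft()
            if len(recent) >= threshold:
                alerts.append(timestamp)
    return alerts

stream = [(0, "hello"), (10, "goal!"), (20, "goal!"), (30, "goal!"), (200, "goal!")]
print(detect_bursts(stream, "goal"))
# → [30]  -- three matches within 60 seconds; the late tweet does not re-trigger
```

Production CEP engines express such windowed patterns declaratively and evaluate thousands of them concurrently; the sliding-window aggregation above is the underlying principle.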