Time: 05.10.2009, 16:30

Location: Leopold-Schmetterer-Seminarraum, 1010 Wien, Univ.str. 5, 3rd floor


P. Putz (ISDS, Univ. Wien): Exact approaches to the single source network loading problem

This is joint work with I. Ljubic and J.-J. Salazar Gonzalez

We consider the problem of deploying a broadband telecommunications system that lays optical fiber cable from a central office to a number of end customers. This is a capacitated network design problem in which fiber-optic cables with sufficient capacity to carry the traffic from the central office to the end customers must be installed, namely the single-source variant of the network loading problem.

We propose a new compact disaggregated mixed-integer programming formulation for the problem. We project out the flow variables by introducing Benders cuts, which are further strengthened by additional inequalities, and incorporate the whole procedure into a branch-and-cut framework. In our computational experiments we observe improved gaps in several cases when the Benders inequalities are deployed. Reducing the computational cost of separating the Benders cuts and improving our primal heuristic to close the gap faster are the main focus of our ongoing work.
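
A minimal sketch of the flow-feasibility subproblem underlying such Benders cuts, assuming a single source and a standard super-sink construction (node names, the data layout, and the use of networkx are illustrative; the actual cuts come from the dual of this subproblem and are not derived here):

    import networkx as nx

    def routing_is_feasible(capacities, demands, source="central_office"):
        """Check whether installed edge capacities can carry all customer demands
        from the single source, via a max-flow computation to an artificial sink.
        `capacities` maps undirected edges (u, v) to installed capacity, `demands`
        maps customer nodes to their demand; all names here are illustrative."""
        G = nx.DiGraph()
        for (u, v), cap in capacities.items():
            G.add_edge(u, v, capacity=cap)   # an undirected cable is modeled by
            G.add_edge(v, u, capacity=cap)   # two opposite arcs of equal capacity
        sink = "_sink"
        for customer, dem in demands.items():
            G.add_edge(customer, sink, capacity=dem)
        flow_value, _ = nx.maximum_flow(G, source, sink)
        return flow_value >= sum(demands.values()) - 1e-9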

Presentation

Time: 12.10.2009, 16:30

Location: Leopold-Schmetterer-Seminarraum, 1010 Wien, Univ.str. 5, 3rd floor


W. Polonik (UC Davis): The excess mass approach and related statistical methods

The excess mass approach (Hartigan, 1987; Müller and Sawitzki, 1991) is based on the idea of measuring mass concentration directly. The approach has connections to a surprisingly large number of other statistical methods and ideas. In this talk, several such connections are presented, touching among others on the following topics: majorization, statistical learning and classification, the estimation of "split points", the Hough transform, nonparametric maximum likelihood density estimation, and the so-called "vertical density representation".
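
As a small illustration of the idea of measuring mass concentration directly, a sketch of the empirical excess mass functional over intervals for a one-dimensional sample (the brute-force search over pairs of order statistics and all names are purely illustrative):

    import numpy as np

    def empirical_excess_mass(x, lam):
        """Empirical excess mass over intervals at level lam:
        E_n(lam) = max over intervals C of (F_n(C) - lam * |C|)."""
        xs = np.sort(np.asarray(x, dtype=float))
        n = len(xs)
        best = 0.0
        for i in range(n):
            for j in range(i, n):
                mass = (j - i + 1) / n      # empirical probability of [xs[i], xs[j]]
                length = xs[j] - xs[i]      # Lebesgue measure of the interval
                best = max(best, mass - lam * length)
        return best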

Knowing such connections deepens the understanding of the individual methods and, at the same time, contributes to a broader view of this area of statistics, a combination that has the potential to advance research in this field.

Presentation

Time: 19.10.2009, 16:30

Location: Leopold-Schmetterer-Seminarraum, 1010 Wien, Univ.str. 5, 3rd floor


A. Weingessel (Erste Bank Wien): Current challenges for strategic risk management

In the current market crisis, the appropriateness of existing risk management models is one of the issues widely discussed in the banking world. This talk highlights the challenges banks currently face in meeting the requirements that regulators, investors, and other stakeholders place on risk management. We will discuss the use of mathematical models in the area of strategic risk management and their strengths and weaknesses in providing added value for the management of a bank.

Presentation

Time: 09.11.2009, 16:30

Location: Leopold-Schmetterer-Seminarraum, 1010 Wien, Univ.str. 5, 3rd floor


C. Helmberg (TU Chemnitz): Graph realizations corresponding to optimized extremal eigenvalues of the Laplacian

This is joint work with S. Dienelt, F. Göring and M. Wappler

In spectral graph partitioning heuristics, the initial partition is typically based on the sign structure of the Fiedler vector, i.e., the eigenvector corresponding to the second smallest eigenvalue of the Laplace matrix of the graph. In order to better understand the connections between the spectrum of the Laplacian and separator structures in the graph, we study graph realizations in Euclidean space obtained from optimal solutions of semidefinite programs that optimize the extremal eigenvalues of the Laplacian by redistributing the mass on the edges of the graph. The latter problem also appears in maximum variance unfolding techniques for graphical representation in data mining and in optimizing the mixing rate of Markov chains.
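
For reference, a minimal sketch of the sign-based Fiedler partition mentioned above (a standard construction, not the semidefinite programs studied in the talk; names are illustrative):

    import numpy as np
    import networkx as nx

    def fiedler_partition(G):
        """Bipartition a connected graph by the sign pattern of the Fiedler vector,
        i.e. the eigenvector of the second smallest Laplacian eigenvalue."""
        L = nx.laplacian_matrix(G).toarray().astype(float)
        eigenvalues, eigenvectors = np.linalg.eigh(L)   # eigenvalues in ascending order
        fiedler = eigenvectors[:, 1]                    # eigenvector to lambda_2
        nodes = list(G.nodes())
        part_a = {v for v, val in zip(nodes, fiedler) if val >= 0}
        return part_a, set(nodes) - part_a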

We show that the geometric structure of corresponding optimal graph realizations is tightly linked to the separator structure of the graph. Moreover, in both cases (maximizing the second smallest and minimizing the maximum eigenvalue) there even exist optimal realizations whose dimension is bounded by the tree width of the graph plus one. In the case of the second smallest eigenvalue, introducing node weights and edge lengths and maximizing the minimum dimension of optimal realizations over all possible node weights and edge lengths gives rise to a minor monotone graph parameter, the rotational dimension of the graph. We give a forbidden minor characterization for graphs of dimension 0, 1, and 2.

Presentation

Time: 16.11.2009, 16:30

Location: Leopold-Schmetterer-Seminarraum, 1010 Wien, Univ.str. 5, 3rd floor


F. Frommlet (Univ. Wien): Asymptotic optimality of multiple testing and model selection procedures under sparsity

Asymptotic optimality of a large class of multiple testing rules is investigated using the framework of Bayesian Decision Theory. A normal scale mixture model is considered, leading to an asymptotic framework which can be naturally motivated under the assumption of sparsity, where the proportion of "true" alternatives converges to zero. Within this setup, optimality of a rule is proved by showing that the ratio of its Bayes risk to that of the Bayes oracle (a rule which minimizes the Bayes risk) converges to one.

The class of fixed-threshold multiple testing rules that are asymptotically optimal is fully characterized, as is the class of optimal rules controlling the Bayesian False Discovery Rate (BFDR). Furthermore, conditions are provided under which the popular Benjamini-Hochberg and Bonferroni procedures are asymptotically optimal. It is shown that for a wide class of sparsity levels, the threshold of the former can be approximated very well by a non-random threshold.
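
For concreteness, a sketch of the Benjamini-Hochberg step-up rule referred to above (a textbook implementation; variable names are illustrative):

    import numpy as np

    def benjamini_hochberg(pvals, alpha=0.05):
        """Reject the k smallest p-values, where k is the largest index i
        (1-based) with p_(i) <= alpha * i / m."""
        p = np.asarray(pvals, dtype=float)
        m = len(p)
        order = np.argsort(p)
        thresholds = alpha * np.arange(1, m + 1) / m
        below = np.nonzero(p[order] <= thresholds)[0]
        rejected = np.zeros(m, dtype=bool)
        if below.size > 0:
            rejected[order[:below.max() + 1]] = True
        return rejected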

Apart from multiple testing, the problem of model selection for multiple regression under sparsity is considered. Under the assumption of orthogonality and for known variances, results from multiple testing immediately translate into the regression setting, where the scale mixture model is extended to a more general class of priors. We illustrate asymptotic optimality properties of modified versions of the Bayesian Information Criterion (mBIC) and specifically discuss modifications that allow controlling the FDR. Finally, optimality of mBIC in the case of unknown variances is proven.

Presentation

Time: 23.11.2009, 16:30

Location: Leopold-Schmetterer-Seminarraum, 1010 Wien, Univ.str. 5, 3rd floor


M. Meitz (Koc Univ. Istanbul): Parameter estimation in nonlinear AR-GARCH models

This is joint work with Pentti Saikkonen

This paper develops an asymptotic estimation theory for nonlinear autoregressive models with conditionally heteroskedastic errors. We consider a functional coefficient autoregression of order p (AR(p)) with the conditional variance specified as a general nonlinear first order generalized autoregressive conditional heteroskedasticity (GARCH(1,1)) model. Strong consistency and asymptotic normality of the global Gaussian quasi maximum likelihood (QML) estimator are established under conditions comparable to those recently used in the corresponding linear case. To the best of our knowledge, this paper provides the first results on consistency and asymptotic normality of the QML estimator in nonlinear autoregressive models with GARCH errors.
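
As a concrete special case, a sketch of the (negated) Gaussian quasi log-likelihood for a linear AR(1) model with standard GARCH(1,1) errors; the talk treats more general functional coefficient AR(p) and nonlinear GARCH specifications, and the variance initialization below is an arbitrary choice (positivity of the variance parameters is not enforced):

    import numpy as np
    from scipy.optimize import minimize

    def neg_gaussian_qml(params, y):
        """Negative Gaussian quasi log-likelihood of an AR(1)-GARCH(1,1) model
        (additive constants dropped)."""
        phi, omega, alpha, beta = params
        eps = y[1:] - phi * y[:-1]                # AR(1) residuals
        h = np.empty_like(eps)
        h[0] = np.var(y)                          # crude start value for the recursion
        for t in range(1, len(eps)):
            h[t] = omega + alpha * eps[t - 1] ** 2 + beta * h[t - 1]
        return 0.5 * np.sum(np.log(h) + eps ** 2 / h)

    # usage sketch, for a given observed series y:
    # result = minimize(neg_gaussian_qml, x0=[0.2, 0.1, 0.05, 0.8], args=(y,),
    #                   method="Nelder-Mead")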

Presentation

Time: 30.11.2009, 16:30

Location: Leopold-Schmetterer-Seminarraum, 1010 Wien, Univ.str. 5, 3rd floor


M. Jünger (Univ. Köln): GEODUAL - fun with geometric duality and combinatorial optimization

Almost 20 years ago, William R. Pulleyblank and I started assigning geometric interpretations to dual solutions of certain combinatorial optimization problems. Among other things, this resulted in colorful pictures that were not only aesthetically pleasing but also of educational value, because certain theorems can be appreciated visually without any formalism. For many years I regretted not being able to spice up my lectures in this subject area with such pictures or even animations. The recently developed GEODUAL software closes this gap. Built on the multi-platform Qt 4 open source library, it is a widely usable tool for creating and solving geometric instances of the Minimum Spanning Tree problem, the Perfect Matching problem, and the Traveling Salesman problem, along with visual proofs of optimality.


Presentation

Time: 07.12.2009, 16:30

Location: Leopold-Schmetterer-Seminarraum, 1010 Wien, Univ.str. 5, 3rd floor


P. Reiter (Univ. Wien): Two exact hybrid algorithms for solving a bi-objective vehicle routing problem

This is joint work with W. Gutjahr

In this work, we present two exact hybrid algorithms for solving a bi-objective vehicle routing problem with minimization of total travel cost as the first objective and minimization of the length of the longest route as the second objective. The underlying capacitated vehicle routing problem (CVRP) can be defined as follows: find routes for a fleet of K identical vehicles, each of capacity Q, based at a single depot and serving a set of n customers with deterministic demands, such that the total cost of serving all customers is minimized.

In general, there will be no single solution that attains the optimum of both objectives at the same time. Therefore, it is desirable to compute the set of Pareto-optimal (or: efficient) solutions. In our work, we use the adaptive epsilon-constraint method in combination with a branch-and-cut algorithm and either (i) a multiobjective genetic algorithm or (ii) a single-objective genetic algorithm (GA) to solve the considered problem. The efficiency of the algorithms is assessed on a set of standard CVRP benchmark instances from the TSPLIB.
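
A generic sketch of an epsilon-constraint loop of this kind, with the single-objective solver passed in as a black box (in the talk this role is played by the branch-and-cut/GA hybrids; the function below, the integer step `delta`, and the assumption that the first solution minimizes f1 are simplifications):

    def epsilon_constraint(solve_min_f1, f2_of, first_solution, delta=1):
        """Enumerate Pareto-optimal solutions of (min f1, min f2):
        starting from a first solution (assumed to minimize f1), repeatedly
        minimize f1 subject to f2(x) <= eps while tightening eps below the
        f2-value just attained.  `solve_min_f1(eps)` must return an optimal
        solution or None if the constrained problem is infeasible; ties in f1
        would require a lexicographic refinement, omitted here."""
        pareto, x = [], first_solution
        while x is not None:
            pareto.append(x)
            eps = f2_of(x) - delta      # exclude the current second-objective value
            x = solve_min_f1(eps)
        return pareto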

Presentation

Time: 14.12.2009, 16:30

Location: Leopold-Schmetterer-Seminarraum, 1010 Wien, Univ.str. 5, 3rd floor


P. Mutzel (TU Dortmund): Recent approximation results for crossing minimization

This is joint work with M. Chimani, C. Gutwenger, P. Hlinený, and Ch. Wolf.

The crossing number problem asks for the minimum number cr(G) of edge crossings that can be achieved when drawing a given graph G=(V,E) in the 2-dimensional plane. Although many interesting results have been published on this problem, the crossing number is known only for very few graph classes; even for complete graphs it is known only up to 12 vertices. The decision variant of the crossing number problem is NP-complete, and no polynomial-time approximation algorithm with a non-trivial factor is known for general graphs G. For many years, the only known algorithms approximated cr(G)+|V| for bounded-degree graphs. Recent papers have presented polynomial-time algorithms for sparse graphs that approximate cr(G) itself.

A sparse graph is called an "apex graph" if it is planar after the removal of one vertex, the apex v. For apex graphs, we present a polynomial-time algorithm which finds the optimal embedding, so that inserting the apex v and its incident edges leads to a drawing with the minimum number of crossings over the set of all possible planar embeddings. We have also shown that this algorithm yields crossing numbers of at most (deg(v) · maxdeg(G) / 2) · cr(G), where maxdeg(G) denotes the maximum degree of G, thus providing the first constant-factor approximation algorithm for apex graphs of bounded degree.
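
A small sketch of the apex structure assumed above: finding a vertex whose removal leaves a planar graph, using the planarity test available in networkx (the optimal edge re-insertion step of the algorithm itself is not sketched here):

    import networkx as nx

    def find_apex(G):
        """Return a vertex v such that G - v is planar (an apex), or None."""
        for v in list(G.nodes()):
            H = G.copy()
            H.remove_node(v)
            is_planar, _certificate = nx.check_planarity(H)
            if is_planar:
                return v
        return None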


Presentation

Time: 11.01.2010, 16:30

Location: Leopold-Schmetterer-Seminarraum, 1010 Wien, Univ.str. 5, 3rd floor


G. Eichfelder (Univ. Erlangen-Nürnberg): A new copositivity test

In this talk, three new tests for the copositivity of a matrix are presented, all of which use a difference-of-convex (d.c.) decomposition of the quadratic form of the matrix. We employ spectral information to obtain this d.c. decomposition. The tests are based on the solutions of convex quadratic and of linear subproblems. The resulting copositivity conditions can be combined into a copositivity test of branch-and-bound type. For branching, we use subdivisions of the standard simplex based on the solutions of the subproblems that occur in the copositivity conditions. We also report on first numerical experience with this procedure.
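
A minimal sketch of the type of spectral d.c. decomposition referred to above (this only produces the convex/concave split of the quadratic form; the actual tests and the branch-and-bound scheme of the talk build on top of it):

    import numpy as np

    def spectral_dc_split(A):
        """Write x^T A x as a difference of convex functions:
        x^T A x = x^T (A + rho*I) x - rho * ||x||^2,
        with rho = max(0, -lambda_min(A)) so that A + rho*I is positive semidefinite."""
        lam_min = np.linalg.eigvalsh(A)[0]      # smallest eigenvalue
        rho = max(0.0, -lam_min)
        return A + rho * np.eye(A.shape[0]), rho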


Presentation

Time: 18.01.2010, 16:30

Location: Leopold-Schmetterer-Seminarraum, 1010 Wien, Univ.str. 5, 3rd floor


A. Tomazic (Univ. Wien): Computing bounds for a new general Connected Facility Location problem

We introduce a new general model for Connected Facility Location, generalizing previously considered versions. Besides respecting facility capacities and prize-collecting customers, we allow customer coverage constraints. By Lagrangian decomposition we derive lower bounds, and we use dual and primal heuristics to compute upper bounds. Computational results show the efficiency of our heuristic bounding approach.
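
A generic subgradient skeleton for the kind of Lagrangian lower bounds mentioned here, with the decomposed subproblems passed in as black boxes (everything below is a schematic assumption, not the actual decomposition of the talk):

    def lagrangian_lower_bound(solve_relaxation, coupling_violation, lam0,
                               steps=100, step0=1.0):
        """Maximize the Lagrangian dual by a simple subgradient method.
        `solve_relaxation(lam)` solves the decomposed subproblems and returns
        (dual value, solution); `coupling_violation(solution)` returns the
        violation of the dualized (equality) coupling constraints."""
        lam, best = list(lam0), float("-inf")
        for k in range(steps):
            value, solution = solve_relaxation(lam)
            best = max(best, value)                      # best lower bound so far
            g = coupling_violation(solution)             # subgradient of the dual function
            step = step0 / (k + 1)                       # diminishing step size
            lam = [l + step * gi for l, gi in zip(lam, g)]
        return best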

Presentation

Time: 25.01.2010, 16:30

Location: Leopold-Schmetterer-Seminarraum, 1010 Wien, Univ.str. 5, 3rd floor


F. Gach (Univ. Wien): Efficiency in indirect inference

Indirect inference is a simulation-based alternative to maximum likelihood estimation when neither an explicit nor a computable form of the likelihood function is available. The method was introduced in 1993 by Smith and also by Gourieroux et al.; their proposed estimator turns out to be consistent and asymptotically normal, but it is efficient only under the somewhat restrictive assumption that the so-called auxiliary model is correctly specified. In this talk, I give an overview of the method and present a new framework leading to efficient estimation.
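
A toy sketch of the basic indirect inference idea for a single MA(1) coefficient, using the lag-one autocorrelation as auxiliary statistic (the example, the common-random-numbers device, and all names are illustrative; the efficiency issues of the talk are not addressed):

    import numpy as np
    from scipy.optimize import minimize_scalar

    def lag_one_autocorr(y):
        """Auxiliary statistic: sample lag-one autocorrelation."""
        return np.corrcoef(y[:-1], y[1:])[0, 1]

    def indirect_inference_ma1(y, n_sim=20, seed=0):
        """Match the auxiliary statistic of the data with its average over
        paths simulated from an MA(1) model y_t = e_t + theta * e_{t-1}."""
        rng = np.random.default_rng(seed)
        n = len(y)
        shocks = rng.standard_normal((n_sim, n + 1))   # fixed across theta values
        target = lag_one_autocorr(y)

        def criterion(theta):
            sims = [lag_one_autocorr(e[1:] + theta * e[:-1]) for e in shocks]
            return (np.mean(sims) - target) ** 2

        return minimize_scalar(criterion, bounds=(-0.99, 0.99), method="bounded").x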

Presentation
