Frédéric Gardi
Vice President Products — Innovation 24 & LocalSolver
LocalSolver: a mathematical optimization solver based on neighborhood search
The talk deals with local search for combinatorial optimization and its extension to mixed-variable optimization. Although not yet well understood from a theoretical point of view, local search is the paradigm of choice for tackling large-scale real-life optimization problems. Today, end users ask for interactivity with decision support systems; for optimization software, this means obtaining good-quality solutions quickly.

In this talk, we introduce LocalSolver, a heuristic solver for large-scale optimization problems. It provides good solutions in short running times for problems described in their mathematical form, without any particular structure. Models supported by LocalSolver involve linear and nonlinear objectives and constraints, including algebraic and logical expressions, in continuous and discrete variables. LocalSolver starts from a possibly infeasible solution and iteratively improves it by exploring neighborhoods. A differentiator from classical solvers is the integration of small-neighborhood moves whose incremental evaluation is fast, allowing millions of feasible solutions to be explored in minutes on some problems.

We will present the modeling formalism of LocalSolver through examples in combinatorial and continuous optimization. We will give the main ideas of how the solver works and illustrate its performance on various benchmarks. Finally, we will provide an overview of ongoing developments in the areas of vehicle routing and black-box optimization.
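To make the idea of small-neighborhood moves with fast incremental evaluation concrete, here is a toy sketch, not LocalSolver's actual implementation: single bit-flip moves on a 0-1 knapsack (the instance data are invented for illustration), where each move's effect on the objective and on feasibility is evaluated in constant time.

```python
import random

# Toy local search with small-neighborhood moves (illustrative only, not
# LocalSolver's algorithm): maximize the value of a 0-1 knapsack by flipping
# one item per move, evaluating each move incrementally in O(1).
values = [10, 7, 4, 9, 6]
weights = [5, 4, 3, 6, 2]
capacity = 12

random.seed(0)
x = [0] * len(values)          # current feasible solution: nothing packed
total_value = total_weight = 0

for _ in range(10_000):        # single-flip moves
    i = random.randrange(len(values))
    dv = values[i] * (1 - 2 * x[i])    # incremental change in objective
    dw = weights[i] * (1 - 2 * x[i])   # incremental change in total weight
    # accept the move if it keeps the solution feasible and does not worsen it
    if total_weight + dw <= capacity and dv >= 0:
        x[i] ^= 1
        total_value += dv
        total_weight += dw

print(x, total_value, total_weight)
```

Because each move is evaluated via the two deltas rather than by recomputing the objective from scratch, such a search can visit a very large number of candidate solutions per second, which is the point made in the abstract.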
John Hansen
Independent consultant and founder of Project2 ApS — Project2
Process Mining – Data-driven Process Analysis

Process mining is a shortcut to getting an overview of the business process map and improving the process. Instead of never-ending workshops, process mining techniques are applied to data already present in the IT systems. In this way, the process map is generated automatically and process performance is analysed. It is LEAN on steroids.

Traditional process-mapping techniques, where workshops are facilitated and diagrams are drawn manually, suffer from being slow and based on subjectivity. With process mining, fact-based data already present in the IT systems are used, bringing results faster.
Process mining is the most innovative initiative the BPM industry has seen in many years.
In this presentation, process mining will be introduced, accompanied by a live demo in a process mining tool.
Laurent Perron
Software Engineer — Google
Optimization at Google

Google is a big company, with plenty of resources to optimize and lots of talented engineers. This translates into many interesting challenges for the Operations Research team. We will explore some of those in this presentation.
Christian Plum
Analytics Expert — Maersk Line
What makes OR projects successful?

Many good OR projects never mature enough to have a real impact on the processes they model, even though in theory they show significant benefits. Most of the causes of failure lie outside the core of the optimization model: data quality, stakeholder engagement, visualization and user interface, process design (opt-in or opt-out, etc.) and many others.
This presentation will go through some of the areas where a good idea can run aground, and discuss how some of these issues can be managed. We will take our starting point in concrete examples of projects from Maersk Line, some that made it and others that failed, and investigate the causes.
Peder Wikström
Consultant — Peder Wikström Skogsanalys AB
“TIMBHA” – New software for optimal harvest scheduling and road upgrading that anyone can use

TIMBHA is the working name for a new Windows program for short-term harvest planning. It is designed to be used by a forest planner at a small or large forest company, for example at a district office. Using it for teaching is encouraged.
The program starts from a given set of stands already planned for harvest in the next few years, as obtained from any long-term plan. TIMBHA then schedules harvest timings in more detail by assigning a season and year to each stand. The optimization problem is to minimize the total cost of terrain transport and road upgrading, subject to timber flow constraints, the soil bearing capacity of each stand, and road-class accessibility in a given season. The user supplies two data sets in the form of GIS shape files, one for the stands and one for the roads. It will also be possible to import stand data directly from a Heureka result database.
The user enters the number of planning years, the length of the seasons, road classes, road upgrade costs, and timber demands for one or more assortments. The program then builds an internal road-network graph, including hauling “roads” with any public road as destination, and solves shortest path problems to obtain a number of alternative routes from each stand to a public road. The routes are used as input to the optimization model to obtain a strong formulation and to avoid the need for “arc-to-arc” constraints. Each route may be a subset or superset of another route, and this is exploited to strengthen the formulation.
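The route-generation step described above can be sketched with a standard Dijkstra search from a stand's entry node to the cheapest reachable public road. The graph, node names and costs below are invented for illustration and are not TIMBHA's actual data model.

```python
import heapq

# Hypothetical road-network graph: node -> [(neighbour, hauling cost), ...].
# All names and costs are invented for this sketch.
graph = {
    "stand": [("a", 3), ("b", 5)],
    "a": [("b", 1), ("public1", 7)],
    "b": [("public2", 2)],
    "public1": [],
    "public2": [],
}
public_roads = {"public1", "public2"}

def cheapest_route(graph, start, targets):
    """Dijkstra: return (cost, route) of the cheapest path from start
    to any node in targets, or None if no target is reachable."""
    pq = [(0, start, [start])]
    seen = set()
    while pq:
        cost, node, route = heapq.heappop(pq)
        if node in targets:
            return cost, route
        if node in seen:
            continue
        seen.add(node)
        for nxt, w in graph[node]:
            if nxt not in seen:
                heapq.heappush(pq, (cost + w, nxt, route + [nxt]))
    return None

print(cheapest_route(graph, "stand", public_roads))
# (6, ['stand', 'a', 'b', 'public2'])
```

Running such a search per stand (and retaining several near-optimal alternatives rather than only the cheapest) yields the set of candidate routes that feeds the optimization model.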
The program currently has built-in links to a number of commercial solvers; the user must install and acquire a license for one of these. For problem generation, a linear programming matrix generator has been developed that generates text files in LP format, which can be read by any solver. Results are presented in maps and pivot tables, and can be exported as shape files and to spreadsheet programs.
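As a minimal, hypothetical illustration of what such a matrix generator emits, the sketch below writes a tiny route-selection model (invented names and coefficients, not TIMBHA's actual model) as text in the CPLEX LP file format, which most solvers can read.

```python
# Hypothetical LP-format generator: choose harvest routes to minimize
# transport cost while meeting a timber demand. Variable names (r0, r1, ...)
# and all coefficients are invented for illustration.
def write_lp(route_costs, route_volumes, demand):
    """Return a binary route-selection model as CPLEX LP-format text."""
    obj = " + ".join(f"{c} r{i}" for i, c in enumerate(route_costs))
    flow = " + ".join(f"{v} r{i}" for i, v in enumerate(route_volumes))
    binaries = "\n".join(f" r{i}" for i in range(len(route_costs)))
    return (
        "Minimize\n cost: " + obj + "\n"
        "Subject To\n demand: " + flow + f" >= {demand}\n"
        "Binary\n" + binaries + "\nEnd\n"
    )

lp_text = write_lp([120, 95, 140], [300, 250, 420], 500)
print(lp_text)
```

The resulting text file is solver-agnostic: it can be handed to any LP-format-capable solver, which is what decouples the problem generation from the choice of commercial solver.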
TIMBHA version 1.0 is scheduled for release at the end of this year.
The program is developed in .NET and uses the third-party libraries DotSpatial, QuickGraph (open source) and DevExpress (commercial). The matrix generator will probably be made available as open source on the CodePlex website.
Stephen Hall
Tech Sales for IBM Decision Optimisation — IBM
What it takes to make a decision based on data

Operations research is a powerful decision support tool because it is based on facts and data. For a complex problem with a large amount of data, it makes it possible to reveal new solutions, explore alternatives and streamline the decision-making process. Current technology solves real-life problems with unparalleled accuracy and precision, thanks to faster algorithms, computers and networks, and to the digitalization and instrumentation of our world. But what does it take to create an application that helps decision makers while managing huge amounts of data in a fast-changing environment?

This talk will review the different technologies used in an application involving operations research, such as data management, business analytics, business intelligence, optimization and visualization. Because accuracy and precision are not enough for the business success of an application, I will also cover user acceptance, reliability, flexibility and speed of deployment. The talk will conclude with the IBM response to the challenges mentioned above: IBM Watson Analytics.