Sunday, May 11, 2008

Managing Technology Risk in R&D Project Planning: Optimal Timing and Parallelization of R&D Activities

by Crama P; De Reyck B; Degraeve Z; Leus R.

Abstract
An inherent characteristic of R&D projects is technological uncertainty, which may result in project failure, and in time and resources spent without any tangible return. In pharmaceutical projects, for instance, stringent scientific procedures have to be followed to ensure patient safety and drug efficacy in pre-clinical and clinical tests before a medicine can be approved for production. A project consists of several stages, and may have to be terminated in any of these stages, typically with a low likelihood of success. In project planning and scheduling, this technological uncertainty has typically been ignored, and project plans are developed only for scenarios in which the project succeeds. In this paper, we examine how to schedule projects in order to maximize their expected net present value when the project activities have a probability of failure and an activity's failure leads to overall project termination. We formulate the problem, show that it is NP-hard, and develop a branch-and-bound algorithm that allows us to obtain optimal solutions. We also present polynomial-time algorithms for special cases, and present a number of managerial insights for R&D project planning, including the advantages and disadvantages of parallelizing R&D activities in different settings.
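
To make the scheduling trade-off concrete, here is a minimal sketch, not the paper's branch-and-bound and with all numbers invented, of how expected NPV is computed for a sequence of failure-prone activities: costs are incurred only while the project is still alive, and the payoff is earned only if every stage succeeds, so cheap and risky stages tend to move forward.

```python
import math
from itertools import permutations

# Hypothetical stage data: (name, cost, duration, success probability).
# All figures, and the discount rate r, are invented for illustration.
acts = [("tox", 2.0, 1.0, 0.5), ("chem", 1.0, 2.0, 0.8), ("trial", 5.0, 3.0, 0.6)]
payoff, r = 30.0, 0.10

def expected_npv(seq):
    t, p_alive, npv = 0.0, 1.0, 0.0
    for name, cost, dur, p in seq:
        npv -= p_alive * cost * math.exp(-r * t)  # cost paid only if still alive
        t += dur
        p_alive *= p                              # a failure terminates the project
    return npv + p_alive * payoff * math.exp(-r * t)  # payoff iff all stages succeed

best = max(permutations(acts), key=expected_npv)
print([a[0] for a in best], round(expected_npv(best), 3))
```

Brute-force enumeration of sequences is only viable for tiny instances, which is exactly why the paper's NP-hardness result and branch-and-bound algorithm matter.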

Keywords: Applications; Analysis of algorithms; Branch-and-bound; Computational complexity; Exact algorithms; Integer programming; Pharmaceutical industry; Project management; Project scheduling; R&D projects; Risk

For detail, download here (right click)

The European Consumer: United in Diversity?

by Lemmens A; Croux C; Dekimpe MG.

Abstract
The ongoing unification taking place on the European political scene, along with recent advances in consumer mobility and communication technology, raises the question whether the European Union can be treated as a single market so as to fully exploit the potential synergy effects of pan-European marketing strategies. Previous research, which mostly used domain-specific segmentation bases, has resulted in mixed conclusions. In this paper, a more general segmentation base is adopted, as we consider the homogeneity in the European countries' Consumer Confidence Indicators. Moreover, rather than analyzing more traditional static similarity measures, we adopt the concepts of dynamic correlation and cohesion between countries. The short-run fluctuations in consumer confidence are found to be largely country-specific. However, a myopic focus on these fluctuations may inspire management to adopt multi-country strategies, foregoing the potential longer-run benefits of more standardized marketing strategies. Indeed, the Consumer Confidence Indicators become much more homogeneous as the planning horizon is extended. However, this homogeneity is found to remain inversely related to the cultural, economic and geographic distances among the various Member States. Hence, pan-regional rather than pan-European strategies are called for.

Keywords: Consumer confidence; Dynamic correlation; European unification; European Union; Marketing strategy; Similarity

For detail, download here (right click)

Pricing Bridges to Cross a River

by Grigoriev A; van Hoesel S; van der Kraaij AF; Spieksma FCR; Uetz M; Bouhtou M.

Abstract
We consider a Stackelberg pricing problem in directed, uncapacitated networks. Tariffs have to be defined by an operator, the leader, for a subset of m arcs, the tariff arcs. Costs of all other arcs are assumed to be given. There are n clients, the followers, that route their demand independently of each other on paths with minimal total cost. The problem is to find tariffs that maximize the operator's revenue. Motivated by problems in telecommunication networks, we consider a restricted version of this problem, assuming that each client utilizes at most one of the operator's tariff arcs. The problem is equivalent to pricing bridges that clients can use in order to cross a river. We prove that this problem is APX-hard. Moreover, we show that uniform pricing yields both an m–approximation and a (1 + ln D)–approximation. Here, D is upper bounded by the total demand of all clients. We furthermore discuss some polynomially solvable special cases, and present a short computational study with instances from France Télécom. In addition, we consider the problem under the additional restriction that the operator must serve all clients. We prove that this problem does not admit approximation algorithms with any reasonable performance guarantee, unless NP = ZPP, and we prove the existence of an n–approximation algorithm.
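
A toy sketch of the uniform-pricing idea in the bridge setting (all data invented): each client uses the operator's bridge whenever the uniform tariff does not exceed a cap derived from the client's best alternative route, so a revenue-maximizing uniform tariff can be found among the caps themselves.

```python
# (cap, demand) per client: cap = highest tariff at which client j still
# prefers the operator's bridge over its best alternative (numbers invented).
clients = [(4.0, 3.0), (1.0, 10.0), (6.0, 2.0)]

def revenue(p, clients):
    # Clients route on a cheapest path, so client j pays tariff p only if p <= cap.
    return p * sum(d for cap, d in clients if cap >= p)

# Revenue is piecewise linear in p with breakpoints at the caps, so an optimal
# uniform tariff can be chosen among the caps themselves.
best_p = max((cap for cap, _ in clients), key=lambda p: revenue(p, clients))
print(best_p, revenue(best_p, clients))
```

Uniform pricing ignores that different clients would tolerate different tariffs, which is exactly the gap the m– and (1 + ln D)–approximation guarantees bound.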

Keywords: Pricing; Networks; Tariffs; Approximation algorithms; Computational complexity

For detail, download here (right click)

The Transportation Problem with Exclusionary Side Constraints

by Goossens D; Spieksma FCR.

Abstract
We consider the so-called Transportation Problem with Exclusionary Side Constraints (TPESC), which is a generalization of the ordinary transportation problem. We determine the complexity status of each of two special cases of this problem, by proving NP-completeness and by exhibiting a pseudo-polynomial time algorithm. For the general problem, we show that it cannot be approximated with a constant performance ratio in polynomial time (unless P = NP). These results settle the complexity status of the TPESC.

For detail, download here (right click)

Thin-Trading Effects in Beta: Bias v. Estimation Error

by Sercu P; Vandebroek M; Vinaimont T.

Abstract
Two regression coefficients often used in finance, the Scholes-Williams (1977) quasi-multiperiod 'thin-trading' beta and the Hansen-Hodrick (1980) overlapping-periods regression coefficient, can both be written as instrumental-variables estimators. Competitors are Dimson's beta and the original Hansen-Hodrick OLS beta. We check the performance of all these estimators and the validity of the t-tests in small and medium samples, both within and outside their stated assumptions, and we report their performance in a hedge-fund-style portfolio-management application. In all experiments, as well as in the real-data estimates, less bias comes at the cost of a higher standard error. Our hedge-portfolio experiment shows that the safest procedure is simply to match by size and industry; any estimation just adds noise. There is a clear relation between portfolio variance and the variance of the beta estimator used in market-neutralizing the portfolio, dwarfing the beneficial effect of bias reduction.

Keywords: Market model; Thin trading; Beta estimation; Bias; Portfolio variance; Regression

For detail, download here (right click)

Modeling Customer Loyalty Using Customer Lifetime Value

by Glady N; Baesens B; Croux C.

Abstract
The definition and modeling of customer loyalty have been central issues in customer relationship management for many years. Recent papers propose solutions to detect customers that are becoming less loyal, also called churners. The churner status is then defined as a function of the volume of commercial transactions. In the context of a Belgian retail financial service company, our first contribution is to redefine the notion of customer loyalty by considering it from a customer-centric instead of a product-centric point of view. We hereby use the customer lifetime value (CLV), defined as the discounted value of future marginal earnings, based on the customer's activity. Hence, a churner is defined as someone whose CLV, and thus the related marginal profit, is decreasing. As a second contribution, the loss incurred by the CLV decrease is used to appraise the cost of misclassifying a customer, by introducing a new loss function. In the empirical study, we compare the accuracy of various classification techniques commonly used in the domain of churn prediction, including two cost-sensitive classifiers. Our final conclusion is that, since profit is what really matters in a commercial environment, standard statistical accuracy measures of prediction need to be revised, and a more profit-oriented focus may be desirable.
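
A minimal sketch of the CLV-based churn notion (discount rate, horizon and margins all invented for illustration): CLV is the discounted sum of predicted future marginal earnings, and a customer is flagged as a churner when that value decreases between two evaluation dates.

```python
def clv(margins, rate=0.10):
    # CLV = discounted value of predicted future marginal earnings; a finite
    # horizon and a 10% annual rate are assumptions for this sketch.
    return sum(m / (1.0 + rate) ** t for t, m in enumerate(margins, start=1))

clv_last_year = clv([120.0, 110.0, 100.0])  # margins predicted a year ago
clv_now = clv([80.0, 60.0, 40.0])           # margins predicted today
is_churner = clv_now < clv_last_year        # churner: CLV is decreasing
print(round(clv_last_year, 2), round(clv_now, 2), is_churner)
```

Note how this is customer-centric: the flag depends on the customer's projected profitability, not on transaction volume in any single product.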

Keywords: Churn prediction; Classification; Customer lifetime value; Prediction models

For detail, download here (right click)

The Matrix Bid Auction: Micro-Economic Properties and Expressiveness

by Goossens D; Spieksma FCR.

Abstract
A combinatorial auction is an auction where multiple items are for sale simultaneously to a set of buyers. Furthermore, buyers are allowed to place bids on subsets of the available items. This paper focuses on a combinatorial auction where a bidder can express his preferences by means of a so-called ordered matrix bid. This matrix bid auction was developed by Day (2004) and allows bids on all possible subsets, although there are restrictions on what a bidder can bid for these sets. We give an overview of how this auction works. We elaborate on the relevance of the matrix bid auction and we develop methods to verify whether a given matrix bid satisfies a number of properties related to micro-economic theory. Finally, we investigate how a collection of arbitrary bids can be represented as a matrix bid.

Keywords: Combinatorial auction; Matrix bids; Free disposal; Subadditivity; Submodularity; Gross substitutes; Expressiveness

For detail, download here (right click)

Exploring The Bullwhip Effect by Means of Spreadsheet Simulation

by Boute R; Lambrecht MR.

Abstract
An important supply chain research problem is the bullwhip effect: demand fluctuations increase as one moves up the supply chain from retailer to manufacturer. It has been recognized that demand forecasting and ordering policies are two of the key causes of the bullwhip effect. In this paper we present a spreadsheet application which explores a series of replenishment policies and forecasting techniques under different demand patterns. It illustrates how tuning the parameters of the replenishment policy induces or reduces the bullwhip effect. Moreover, we demonstrate how bullwhip reduction (order variability dampening) may have an adverse impact on inventory holdings. Indeed, order smoothing may increase inventory fluctuations, resulting in poorer customer service. As such, the spreadsheets can be used as an educational tool to gain a clear insight into the use or abuse of inventory control policies and improper forecasting in relation to the bullwhip effect and customer service.
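
The mechanism can be reproduced in a few lines rather than a spreadsheet. This sketch (smoothing parameter, lead time and demand distribution all assumed) places exponential-smoothing forecasts inside an order-up-to replenishment rule and measures how much the order variance exceeds the demand variance:

```python
import random, statistics
random.seed(1)

alpha, L = 0.4, 2  # smoothing parameter and lead time (assumed values)
demand = [random.gauss(100.0, 10.0) for _ in range(2000)]

forecast, orders = demand[0], []
for d in demand:
    prev = forecast
    forecast += alpha * (d - forecast)        # exponential-smoothing forecast
    # order-up-to policy: pass demand through, plus the lead-time-scaled
    # correction of the forecast shift
    orders.append(d + (L + 1) * (forecast - prev))

amplification = statistics.variance(orders) / statistics.variance(demand)
print(round(amplification, 2))  # ratio above 1 is the bullwhip effect
```

Raising alpha or the lead time L makes the amplification worse, while smoothing the orders dampens it, at the cost of larger inventory swings, which is precisely the trade-off the paper explores.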

Keywords: Bullwhip; Bullwhip effect; Forecasting techniques; Inventory fluctuations; Replenishment rule; Simulation; Spreadsheet simulation

For detail, download here (right click)

The predictive power of the European Economic Sentiment Indicator

by Gelper S; Croux C.

Abstract
Economic sentiment surveys are carried out by all European Union member states on a monthly basis. The survey outcomes are used to obtain early insight into future economic evolutions and often receive extensive press coverage. Based on these surveys, the European Commission constructs an aggregate European Economic Sentiment Indicator (ESI). This paper compares the ESI with more sophisticated aggregation schemes based on two statistical methods: dynamic factor analysis and partial least squares. We compare the aggregate sentiment indicators and the weights used in their construction. Afterwards, we compare their forecast performance for two real economic series: industrial production growth and unemployment. Our findings are twofold. First, the ESI, although constructed in a rather ad hoc way, can compete with the indicators constructed according to statistical principles. Secondly, the predictive power of the sentiment indicators, as tested in an out-of-sample Granger causality framework, is limited.

Keywords: Common indicators; Dimension reduction methods; Economic sentiment indicator; Forecasting

For detail, download here (right click)

Predicting Business/ICT Alignment with AntMiner+

by Cumps B; Martens D; De Backer M; Haesen R; Viaene S; Dedene G; Baesens B; Snoeck M.

Abstract
In this paper we report on the results of a European survey on business/ICT alignment practices. The goal of this study is to come up with practical guidelines for managers on how to strive for better alignment of ICT investments with business requirements. Based on Luftman's alignment framework, we examine 18 ICT management practices belonging to 6 different competency clusters. We use AntMiner+, a rule induction technique, to create an alignment rule set. The results indicate that B/ICT alignment is a multidimensional goal which can only be obtained through focused investments covering different alignment aspects. The obtained rule set is an interesting mix of both formal engineering and social interaction processes and structures. We discuss the implications of the alignment rules for practitioners.

Keywords: Business/ICT alignment; Artificial ant systems; Data mining; Rule induction; Management practices

For detail, download here (right click)

A Methodology for Integrated Risk Management and Proactive Scheduling of Construction Projects

by Schatteman D; Herroelen WS; Van De Vonder S; Boone A.

Abstract
An integrated methodology is developed for planning construction projects under uncertainty. The methodology relies on a computer-supported risk management system that allows the major risk factors to be identified, analyzed and quantified, and that derives the probability of their occurrence and their impact on the duration of the project activities. Using project management estimates of the marginal cost of activity starting-time disruptions, a proactive baseline schedule is developed that is sufficiently protected against the anticipated disruptions while maintaining acceptable project makespan performance. The methodology is illustrated on a real-life application.

Keywords: Risk management; Proactive scheduling; Construction projects; Uncertainty; Project management

For detail, download here (right click)

A Note on a Motion Control Problem for a Placement Machine

by Coenen S; van Hop N; van de Klundert J; Spieksma FCR.

Abstract
Assembling printed circuit boards efficiently using automated placement machines is a challenging task. Here, we focus on a motion control problem for a specific type of placement machine. More specifically, the problem is to establish movement patterns for the robot arm, the feeder rack and, when appropriate, the work table of a sequential pick-and-place machine. In this note we show that a (popular) greedy strategy may not always yield an optimum solution. However, under the Tchebychev metric, as well as under the Manhattan metric, we can model the problem as a linear program, thereby establishing the existence of a polynomial-time algorithm for this motion control problem. Finally, we give experimental evidence that computing optimal solutions to this motion control problem can yield significantly better solutions than those found by a greedy method.
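
The role of the metric can be illustrated with a toy sketch (coordinates invented, and a plain point-to-point tour rather than the full arm/feeder/table model): under the Tchebychev metric all axes move simultaneously at unit speed, so a move's duration is the maximum per-axis distance, and a greedy nearest-neighbor sequence need not match the brute-force optimum.

```python
from itertools import permutations

def tchebychev(p, q):
    # Simultaneous axis motion: move time = max of the per-axis distances.
    return max(abs(p[0] - q[0]), abs(p[1] - q[1]))

def tour_time(order, start=(0, 0)):
    t, cur = 0, start
    for p in order:
        t += tchebychev(cur, p)
        cur = p
    return t

points = [(0, 5), (6, 1), (2, 2), (7, 7)]  # toy placement locations

# Greedy: always visit the nearest unvisited location next.
greedy, cur, rest = [], (0, 0), points[:]
while rest:
    nxt = min(rest, key=lambda p: tchebychev(cur, p))
    greedy.append(nxt)
    rest.remove(nxt)
    cur = nxt

# Brute force over all visiting orders (only viable for tiny instances).
best = min(permutations(points), key=tour_time)
print(tour_time(greedy), tour_time(best))  # greedy can only match or exceed this
```

On larger or asymmetric instances greedy falls strictly behind the optimum, which is the gap the note's linear-programming formulation closes in polynomial time.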

Keywords: Motion control; Placement machines; Printed circuit boards; Linear programming; Polynomial-time algorithm; Greedy strategy

For detail, download here (right click)

An Investigation of Resource-allocation Decisions by Means of Project Networks

by Leus R.

Abstract
This paper investigates the relationship between resource allocation and ES-policies, a class of scheduling policies introduced for stochastic scheduling that can be represented by a directed acyclic graph. We present a formal treatment of resource flows as a representation of resource-allocation decisions, extending the existing literature. A number of complexity results are established, showing that several recently proposed objective functions for evaluating the quality of ES-policies lead to difficult problems. Finally, some reflections are provided on possible efficiency enhancements to enumeration algorithms for ES-policies.

Keywords: Complexity; Project scheduling; Resource allocation; Resource constraints

For detail, download here (right click)

Business Process Verification: a Petri Net Approach

by De Backer M; Snoeck M.

Abstract
In this report, we discuss the use of Petri Net language theory for business process modeling. Essentially, the focus is on the opportunities the modeling technique offers for analysis and verification. Semantic compatibility, as opposed to syntactic compatibility, is concerned with the meaningfulness of the distributed business process. We start with a description and motivation of different notions of semantically compatible business processes. Next, these different types of compatibility are formalized by means of Petri Net language theory. Finally, we describe the foundations of an algorithm that enables us to verify semantic compatibility in an automated way.

Keywords: Business process modeling; Petri Net theory; Semantic compatibility; Verification

For detail, download here (right click)

Rank-order Conjoint Experiments: Efficiency and Design

by Vermeulen B; Goos P; Vandebroek M.

Abstract
In a rank-order conjoint experiment, the respondent is asked to rank a number of alternatives instead of choosing the preferred one, as is the standard procedure in conjoint choice experiments. In this paper, we study the efficiency of these experiments and propose a D-optimality criterion for rank-order conjoint experiments to find designs yielding the most precise parameter estimators. For that purpose, an expression of the Fisher information matrix for the rank-ordered multinomial logit model is derived which clearly shows how much additional information is provided by each extra ranking step made by the respondent. A simulation study shows that Bayesian D-optimal ranking designs are slightly better than Bayesian D-optimal choice designs and (near-)orthogonal designs, and perform considerably better than other designs commonly used in marketing in terms of estimation and prediction accuracy. Finally, it is shown that improvements of about 50% to 60% in estimation and prediction accuracy can be obtained by ranking a second alternative. If the respondent ranks a third alternative, a further improvement of 30% in estimation and prediction accuracy is obtained.
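
In the rank-ordered multinomial logit model named in the abstract (also known as the exploded logit), the probability of an observed ranking factors into successive multinomial-logit choices among the not-yet-ranked alternatives, which is why each extra ranking step contributes an additional block of Fisher information:

\[
P(a_1 \succ a_2 \succ \cdots \succ a_m \mid \beta)
= \prod_{s=1}^{m-1} \frac{\exp(x_{a_s}'\beta)}{\sum_{j=s}^{m} \exp(x_{a_j}'\beta)}
\]

Here \(x_{a}\) is the attribute vector of alternative \(a\) and \(\beta\) the preference parameters; the first factor is the standard conjoint choice probability, and each later factor is the information gained from one more ranking step.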

For detail, download here (right click)

Evaluation of Scan Methods Used in the Monitoring of Public Health Surveillance

by Shannon Elizabeth Fraker

Abstract
With the recent increase in the threat of biological terrorism, as well as the continual risk of other diseases, research in public health surveillance and disease monitoring has grown tremendously. There is an abundance of data available in all sorts of forms. Hospitals, federal and local governments, and industries are all collecting data and developing new methods to be used in the detection of anomalies. Many of these methods are developed, applied to a real data set, and incorporated into software. This research, however, takes a different view of the evaluation of these methods.

Keywords: Anomaly detection; CUSUM charts; EWMA charts; Recurrence interval; Scan method; Time-to-signal

For detail, download here (right click)

Least Squares Optimal Scaling of Partially Observed Linear Systems

by Jan de Leeuw

Abstract
We study linear systems in which both the coefficients of the linear combinations and the variables which are combined linearly are only partially known. This includes the two logical extremes: completely known and completely unknown. The systems we study include the usual linear systems of simultaneous equations, as well as nonlinear multivariate analysis systems. Throughout, we use unweighted least squares loss functions and majorization algorithms to minimize them. We incorporate both errors-in-equations and errors-in-variables as additional unknowns into the loss function, and arrive at algorithms for decompositions of partially unknown matrices.

For detail, download here (right click)

Bayesian Modeling of Accelerated Life Tests with Random Effects

by Avery J. Ashby; Ramón V. León; Jayanth Thyagarajan
Department of Statistics, The University of Tennessee, Knoxville, TN 37996-0532

Abstract
We show how to use Bayesian modeling to analyze data from an accelerated life test where the test units come from different groups (such as batches) and the group effect is random and significant. Our approach can handle multiple random effects and several accelerating factors. However, we present our approach on the basis of an important application concerning pressure vessels wrapped in Kevlar 49 fibers, where the fibers of each vessel come from a single spool and the spool effect is random. We show how Bayesian modeling using Markov chain Monte Carlo (MCMC) methods can be used to easily answer questions of interest in accelerated life tests with random effects that are not easily answered with more traditional methods. For example, we can predict the lifetime of a pressure vessel wound with Kevlar 49 fiber either from a spool used in the accelerated life test or from another random spool from the population of spools. We comment on the implications that this analysis has for the estimates of reliability (and safety) of the Space Shuttle, which has a system of 22 such pressure vessels. Our approach is implemented in the freely available WinBUGS software so that readers can easily apply the method to their own data.
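
The practical consequence of the random spool effect can be shown without any MCMC. This sketch (all parameter values invented, and a simulation rather than the paper's Bayesian fit) generates log-lifetimes from a simple random-effects model and shows why predictions for a vessel wound from a new, untested spool are more dispersed than for a spool observed in the test:

```python
import random, statistics
random.seed(0)

# Invented parameters: log-lifetime = mu + (random spool effect) + noise.
mu, sigma_spool, sigma_err = 7.0, 0.5, 0.3
spools = {s: random.gauss(0.0, sigma_spool) for s in range(8)}  # tested spools

def log_life(spool=None):
    # spool=None draws a NEW spool from the population, matching the second
    # prediction question in the abstract.
    eff = spools[spool] if spool is not None else random.gauss(0.0, sigma_spool)
    return mu + eff + random.gauss(0.0, sigma_err)

seen = [log_life(spool=3) for _ in range(5000)]  # vessel from a tested spool
new = [log_life() for _ in range(5000)]          # vessel from an untested spool
print(round(statistics.stdev(seen), 2), round(statistics.stdev(new), 2))
```

For a known spool, only the residual noise remains; for a new spool, the spool-level variance is added on top, which is why prediction intervals for the fleet-level reliability question must account for both sources of variation.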

Keywords: Markov chain Monte Carlo (MCMC); WinBUGS; Credibility interval; Prediction interval; Quantile


For detail, download here (right click)