Showing posts with label Economy.

Wednesday, June 11, 2008

The predictive power of the European Economic Sentiment Indicator

Sarah Gelper and Christophe Croux

Abstract
Economic sentiment surveys are carried out by all European Union member states on a monthly basis. The survey outcomes are used to obtain early insight into future economic evolutions and often receive extensive press coverage. Based on these surveys, the European Commission constructs an aggregate European Economic Sentiment Indicator (ESI). This paper compares the ESI with more sophisticated aggregation schemes based on two statistical methods: dynamic factor analysis and partial least squares. We compare the aggregate sentiment indicators and the weights used in their construction. Afterwards a comparison of their forecast performance for two real economic series, industrial production growth and unemployment, follows. Our findings are twofold. First it is found that the ESI, although constructed in a rather ad hoc way, can compete with the indicators constructed according to statistical principles. Secondly, the predictive power of the sentiment indicators, as tested for in an out-of sample Granger causality framework, is limited.

Keywords: Common indicators; Dimension reduction methods; Economic sentiment indicator; Forecasting.

For detail, download here (right click)
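The out-of-sample Granger-causality idea in the abstract can be sketched numerically: compare one-step-ahead forecast errors of an autoregressive benchmark with and without a lagged sentiment term. A minimal Python sketch on synthetic data (all series and coefficients are invented for illustration, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
T = 200
sentiment = rng.normal(size=T)
noise = rng.normal(scale=0.5, size=T)
# synthetic target: "growth" partly driven by last month's sentiment
growth = np.empty(T)
growth[0] = noise[0]
growth[1:] = 0.5 * sentiment[:-1] + noise[1:]

y = growth[1:]                                    # value to forecast
ones = np.ones(T - 1)
X_ar = np.column_stack([ones, growth[:-1]])       # AR(1) benchmark
X_full = np.column_stack([X_ar, sentiment[:-1]])  # plus lagged sentiment

def one_step_mse(y, X, burn=100):
    """Expanding-window one-step-ahead out-of-sample forecast MSE."""
    errs = []
    for t in range(burn, len(y)):
        beta, *_ = np.linalg.lstsq(X[:t], y[:t], rcond=None)
        errs.append((y[t] - X[t] @ beta) ** 2)
    return float(np.mean(errs))

mse_ar, mse_full = one_step_mse(y, X_ar), one_step_mse(y, X_full)
# sentiment "Granger-causes" growth out of sample when mse_full < mse_ar
```

On real data the paper finds this gain to be limited; here the synthetic series is built so the sentiment lag genuinely helps.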

Brecht Cardoen, Erik Demeulemeester, Jeroen Beliën
Katholieke Universiteit Leuven, Faculty of Economics and Applied Economics, Department of Decision Sciences and Information Management,
Naamsestraat 69, B-3000 Leuven, Belgium,

Abstract
In this paper we will investigate how to sequence surgical cases in a day-care facility so that multiple objectives are simultaneously optimized. The limited availability of resources and the occurrence of medical precautions, such as an additional cleaning of the operating room after the surgery of an infected patient, are taken into account. A branch-and-price methodology will be introduced in order to develop both exact and heuristic algorithms. In this methodology, column generation is used to optimize the linear programming formulation of the scheduling problem. Both a dynamic programming approach and an integer programming approach will be specified in order to solve the pricing problem. The column generation procedure will be combined with various branching schemes in order to guarantee the integrality of the solutions. The resulting solution procedures will be thoroughly tested and evaluated using real-life data of the surgical day-care center at the university hospital Gasthuisberg in Leuven (Belgium). Computational results will be summarized and conclusions will eventually be formulated.

Keywords: Branch-and-price; Column generation; Health care operations; Scheduling.

For detail, download here (right click)
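The sequencing trade-offs described above (limited resources plus extra cleaning of the operating room after an infected patient) can be illustrated on a toy instance. This sketch brute-forces all case orders in place of the paper's branch-and-price procedure; the durations, due time, and cleaning time are invented:

```python
from itertools import permutations

# (duration in minutes, needs_extra_cleaning) per surgical case; illustrative
cases = {"A": (60, False), "B": (90, True), "C": (45, False), "D": (30, True)}
CLEAN = 25  # extra cleaning time after an infected patient's surgery

def makespan_and_tardiness(order, due=240):
    """Lexicographic objective: (total room occupation, summed lateness)."""
    t, tardiness = 0, 0
    for c in order:
        dur, infected = cases[c]
        t += dur
        tardiness += max(0, t - due)   # lateness of each case completion
        if infected:
            t += CLEAN                 # room blocked for additional cleaning
    return t, tardiness

# brute force stands in for branch-and-price on this toy instance
best = min(permutations(cases), key=makespan_and_tardiness)
```

Even on four cases the cleaning constraint changes the optimal order: scheduling the infected cases too early pushes later completions past the due time.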


Boute R, Lambrecht MR, Lambrechts O and Sterckx P
Abstract: Various inventory studies have been published in recent decades. Some emphasize the importance of low inventories; others examine the evolution of inventories over time, focusing especially on the impact of the just-in-time (JIT) revolution. The aim of this paper is to investigate the level of inventories held by Belgian companies at one moment in time, namely May 2004. First we examine differences in inventory ratios between manufacturing industry sectors as well as between wholesale and retail. We find empirical evidence that the type of production process is the most important driver of work-in-process inventory. The finished goods inventory ratio also differs significantly among industry sectors, but here the reasons for the difference are harder to distinguish. Finally, we find the inventory ratio to be significantly higher in retail than in wholesale. Furthermore, we examine the financial impact of inventories in the manufacturing industry. We find that companies with very high inventory ratios are more likely to be poor financial performers. Regression analyses partially support the hypothesis of a negative relationship between inventory ratio and financial performance, but significant results could not be obtained for all sectors.

Key words: Inventory; Manufacturing
For detail, download here (right click)

A Modified Pareto/NBD Approach for Predicting Customer Lifetime Value

Nicolas Glady, Bart Baesens and Christophe Croux
Faculty of Economics and Management, K.U.Leuven
School of Management, University of Southampton
Abstract
Valuing customers is a central issue for any commercial activity. The customer lifetime value (CLV) is the discounted value of the future profits that this customer yields to the company. In order to compute the CLV, one needs to predict the future number of transactions a customer will make and the profit of these transactions. With the Pareto/NBD model, the future number of transactions of a customer can be predicted, and the CLV is then computed as a discounted product between this number and the expected profit per transaction. Usually, the number of transactions and the future profits per transaction are estimated separately. This study proposes an alternative. We show that the dependence between the number of transactions and their profitability can be used to increase the accuracy of the prediction of the CLV. This is illustrated with a new empirical case from the retail banking sector.
Keywords: Customer lifetime value; Value; Yield; Companies; Order; Model; Product; Expected.
For detail, download here (right click)
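The CLV computation the abstract refers to, a discounted product of expected transaction counts and expected profit per transaction, reduces to a short formula. A hedged sketch with invented numbers (the Pareto/NBD estimation of the expected counts is not shown here):

```python
def clv(expected_transactions, profit_per_transaction, rate=0.1):
    """CLV as the sum over periods t of E[n_t] * E[profit_t] / (1 + rate)^t."""
    return sum(n * p / (1.0 + rate) ** t
               for t, (n, p) in enumerate(zip(expected_transactions,
                                              profit_per_transaction), start=1))

# e.g. a customer expected to make ~2 transactions next year, decaying over
# time, at roughly 50 in profit per transaction (all figures illustrative)
value = clv([2.0, 1.6, 1.3], [50.0, 50.0, 50.0])
```

The paper's modification replaces the independent estimation of the two factors with a joint one; the discounting mechanics stay the same.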

Characteristics and Performance of Students in an Online Section of Business Statistics

John Dutton
North Carolina State University

Marilyn Dutton
North Carolina Central University
Journal of Statistics Education Volume 13, Number 3 (2005),
Abstract
We compare students in online and lecture sections of a business statistics class taught simultaneously by the same instructor using the same content, assignments, and exams in the fall of 2001. Student data are based on class grades, registration records, and two surveys. The surveys asked for information on preparedness, reasons for section choice, and evaluations of course experience and satisfaction. Using descriptive statistics, regression analysis and standard hypothesis tests, we test for significant differences between the online and lecture sections with regard to performance and satisfaction with the course as well as motivation and preparedness for taking an online course. We report several differences, including better performance by online students.
Keywords: Distance education; Internet course; Online education.
For detail, click here (right click)

Sunday, May 11, 2008

The European Consumer: United in Diversity?

by Lemmens A; Croux C; Dekimpe MG.

Abstract
The ongoing unification which takes place on the European political scene, along with recent advances in consumer mobility and communication technology, raises the question whether the European Union can be treated as a single market to fully exploit the potential synergy effects from pan-European marketing strategies. Previous research, which mostly used domain-specific segmentation bases, has resulted in mixed conclusions. In this paper, a more general segmentation base is adopted, as we consider the homogeneity in the European countries' Consumer Confidence Indicators. Moreover, rather than analyzing more traditional static similarity measures, we adopt the concepts of dynamic correlation and cohesion between countries. The short-run fluctuations in consumer confidence are found to be largely country specific. However, a myopic focus on these fluctuations may inspire management to adopt multi-country strategies, foregoing the potential longer-run benefits from more standardized marketing strategies. Indeed, the Consumer Confidence Indicators become much more homogeneous as the planning horizon is extended. However, this homogeneity is found to remain inversely related to the cultural, economic and geographic distances among the various Member States. Hence, pan-regional rather than pan-European strategies are called for.

Keywords: Communication; Consumer confidence; Country; Dynamic correlation; Effects; European unification; European Union; Indicators; Management; Market; Marketing; Planning; Research; Similarity; Strategy; Technology

For detail, download here (right click)
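The finding that confidence indicators grow more homogeneous as the planning horizon lengthens can be mimicked with a crude proxy: correlate the raw series versus moving-average-smoothed versions. This is not the paper's dynamic-correlation measure, just an illustrative sketch on synthetic data with a shared long-run component and country-specific short-run noise:

```python
import numpy as np

rng = np.random.default_rng(1)
T = 600
common = 2.0 * np.sin(np.arange(T) / 50.0)      # shared long-run confidence swing
cci_a = common + rng.normal(scale=3.0, size=T)  # country A: noisy short run
cci_b = common + rng.normal(scale=3.0, size=T)  # country B: noisy short run

def smooth(x, h):
    """Moving average over a planning horizon of h months."""
    return np.convolve(x, np.ones(h) / h, mode="valid")

corr_short = np.corrcoef(cci_a, cci_b)[0, 1]                    # month-to-month
corr_long = np.corrcoef(smooth(cci_a, 24), smooth(cci_b, 24))[0, 1]  # 2-year view
```

Smoothing averages out the country-specific noise while preserving the shared cycle, so the long-horizon correlation is much higher than the short-run one.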

Pricing Bridges to Cross a River

by Grigoriev A; van Hoesel S; van der Kraaij AF; Spieksma FCR; Uetz M; Bouhtou M.

Abstract
We consider a Stackelberg pricing problem in directed, uncapacitated networks. Tariffs have to be defined by an operator, the leader, for a subset of m arcs, the tariff arcs. Costs of all other arcs are assumed to be given. There are n clients, the followers, that route their demand independently of each other on paths with minimal total cost. The problem is to find tariffs that maximize the operator's revenue. Motivated by problems in telecommunication networks, we consider a restricted version of this problem, assuming that each client utilizes at most one of the operator's tariff arcs. The problem is equivalent to pricing bridges that clients can use in order to cross a river. We prove that this problem is APX-hard. Moreover, we show that uniform pricing yields both an m-approximation and a (1 + ln D)-approximation. Here, D is upper bounded by the total demand of all clients. We furthermore discuss some polynomially solvable special cases, and present a short computational study with instances from France Télécom. In addition, we consider the problem under the additional restriction that the operator must serve all clients. We prove that this problem does not admit approximation algorithms with any reasonable performance guarantee, unless NP = ZPP, and we prove the existence of an n-approximation algorithm.

Keywords: Pricing; Networks; Tariffs; Costs; Cost; Demand; Problems; Order; Yield; Studies; Approximation; Algorithms; Performance

For detail, download here (right click)
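Uniform pricing, which the abstract shows to be an m-approximation, is easy to evaluate on a toy river-crossing instance: each client uses the operator's bridge only if the tariff does not exceed that client's gain over the best alternative route. The willingness-to-pay values below are invented:

```python
# w_i = (cost of client i's best alternative) - (fixed cost of the bridge
# route for client i): the maximum tariff client i will accept; illustrative
w = [8.0, 5.0, 4.0, 1.0]

def uniform_revenue(p, w):
    """Revenue when every tariff arc carries the same price p."""
    return p * sum(1 for wi in w if wi >= p)

# the optimal uniform price is always one of the w_i values
best_uniform = max(uniform_revenue(p, w) for p in w)
discriminating = sum(w)  # upper bound: a separate tariff per client
```

Here uniform pricing earns 12 (price 4.0, three clients) against an upper bound of 18, illustrating the gap the approximation guarantees bound.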

The Transportation Problem with Exclusionary Side Constraints

by Goossens D; Spieksma FCR.

Abstract
We consider the so-called Transportation Problem with Exclusionary Side Constraints (TPESC), which is a generalization of the ordinary transportation problem. We determine the complexity status of each of two special cases of this problem, by proving NP-completeness and by exhibiting a pseudo-polynomial time algorithm. For the general problem, we show that it cannot be approximated with a constant performance ratio in polynomial time (unless P=NP). These results settle the complexity status of the TPESC.

For detail, download here (right click)

Thin-Trading Effects in Beta: Bias v. Estimation Error

by Sercu P; Vandebroek M; Vinaimont T.

Abstract
Two regression coefficients often used in finance, the Scholes-Williams (1977) quasi-multiperiod 'thin-trading' beta and the Hansen-Hodrick (1980) overlapping-periods regression coefficient, can both be written as instrumental-variables estimators. Competitors are Dimson's beta and the original Hansen-Hodrick OLS beta. We check the performance of all these estimators and the validity of the t-tests in small and medium samples, in and outside their stated assumptions, and we report their performances in a hedge-fund-style portfolio-management application. In all experiments as well as in the real-data estimates, less bias comes at the cost of a higher standard error. Our hedge-portfolio experiment shows that the safest procedure is to simply match by size and industry; any estimation just adds noise. There is a clear relation between portfolio variance and the variance of the beta estimator used in market-neutralizing the portfolio, dwarfing the beneficial effect of reduced bias.

Keywords: Market model; Thin trading; Size; Noise; Portfolio; Variance; Estimator; Bias; Regression; Finance; Performance; Cost

For detail, download here (right click)
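Of the estimators compared above, Dimson's aggregated-coefficients beta is the simplest to sketch: regress the stock return on lead, current, and lagged market returns and sum the slope coefficients. The thin-trading data below are synthetic, with a true beta of 1.0 deliberately split across current and lagged market moves so plain OLS is biased downward:

```python
import numpy as np

rng = np.random.default_rng(2)
T = 1000
market = rng.normal(size=T)
# thinly traded stock: observed return mixes current and lagged market moves
stock = 0.6 * market + 0.4 * np.roll(market, 1) + rng.normal(scale=0.5, size=T)
stock[0] = 0.0  # discard the wrapped first observation

def dimson_beta(r, m, lags=1):
    """Dimson's aggregated-coefficients beta: regress on lead, current and
    lagged market returns, then sum the slope coefficients."""
    cols = [np.ones(len(m))] + [np.roll(m, k) for k in range(-lags, lags + 1)]
    X = np.column_stack(cols)
    # trim the edge rows contaminated by np.roll wrap-around
    beta, *_ = np.linalg.lstsq(X[lags:-lags], r[lags:-lags], rcond=None)
    return float(beta[1:].sum())

naive = float(np.polyfit(market, stock, 1)[0])  # plain OLS beta, biased down
corrected = dimson_beta(stock, market)          # recovers the full beta
```

The corrected estimate sits near 1.0 while the naive slope stays near 0.6, at the price of a larger standard error, exactly the bias-variance trade-off the abstract reports.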

Modeling Customer Loyalty Using Customer Lifetime Value

by Glady N; Baesens B; Croux C.

Abstract
The definition and modeling of customer loyalty have been central issues in customer relationship management for many years. Recent papers propose solutions to detect customers who are becoming less loyal, also called churners. The churner status is then defined as a function of the volume of commercial transactions. In the context of a Belgian retail financial service company, our first contribution will be to redefine the notion of customer loyalty by considering it from a customer-centric point of view instead of a product-centric point of view. We will hereby use the customer lifetime value (CLV), defined as the discounted value of future marginal earnings, based on the customer's activity. Hence, a churner will be defined as someone whose CLV, thus the related marginal profit, is decreasing. As a second contribution, the loss incurred by the CLV decrease will be used to appraise the cost of misclassifying a customer by introducing a new loss function. In the empirical study, we will compare the accuracy of various classification techniques commonly used in the domain of churn prediction, including two cost-sensitive classifiers. Our final conclusion is that since profit is what really matters in a commercial environment, standard statistical accuracy measures for prediction need to be revised and a more profit-oriented focus may be desirable.

Keywords: Churn prediction; Classification; Customer lifetime value; Prediction models

For detail, download here (right click)
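The paper's idea of a CLV-based loss function can be sketched as cost-sensitive thresholding: instead of counting misclassifications, weight each missed churner by the CLV that would be lost, then pick the score threshold that minimizes expected cost. Everything below (scores, contact cost, CLV losses) is simulated for illustration and is not the paper's classifier:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 2000
churn = rng.random(n) < 0.2                    # true churner flags
# a hypothetical model score: churners score higher on average
score = np.clip(0.5 * churn + rng.normal(scale=0.25, size=n), 0.0, 1.0)
clv_loss = rng.gamma(2.0, 100.0, size=n)       # CLV decrease if a churner is missed
CONTACT_COST = 20.0                            # cost of one retention action

def expected_cost(threshold):
    flagged = score >= threshold
    missed = churn & ~flagged
    # misclassification cost is the lost CLV, not a 0/1 error count
    return CONTACT_COST * flagged.sum() + clv_loss[missed].sum()

thresholds = np.linspace(0.0, 1.0, 101)
best_t = thresholds[np.argmin([expected_cost(t) for t in thresholds])]
```

Because missing a high-CLV churner is far costlier than one retention contact, the cost-minimizing threshold lands well below the accuracy-maximizing one.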

The Matrix Bid Auction: Micro-Economic Properties and Expressiveness

by Goossens D; Spieksma FCR.

Abstract
A combinatorial auction is an auction where multiple items are for sale simultaneously to a set of buyers. Furthermore, buyers are allowed to place bids on subsets of the available items. This paper focuses on a combinatorial auction where a bidder can express his preferences by means of a so-called ordered matrix bid. This matrix bid auction was developed by Day (2004) and allows bids on all possible subsets, although there are restrictions on what a bidder can bid for these sets. We give an overview of how this auction works. We elaborate on the relevance of the matrix bid auction and we develop methods to verify whether a given matrix bid satisfies a number of properties related to micro-economic theory. Finally, we investigate how a collection of arbitrary bids can be represented as a matrix bid.

Keywords: Bids; Combinatorial auction; Expressiveness; Free disposal; Gross substitutes; Matrix; Matrix bids; Methods; Preference; Subadditivity; Submodularity; Theory; Work

For detail, download here (right click)
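Property checks of the kind mentioned, such as free disposal and submodularity, can be brute-forced for any valuation over a small item set. The sketch below tests a plain set valuation rather than the matrix-bid representation, and the numbers are invented:

```python
from itertools import combinations

items = frozenset({1, 2, 3})
# toy valuation on every subset of items (must be exhaustive)
v = {frozenset(): 0, frozenset({1}): 4, frozenset({2}): 3, frozenset({3}): 2,
     frozenset({1, 2}): 6, frozenset({1, 3}): 5, frozenset({2, 3}): 4,
     frozenset({1, 2, 3}): 7}

def subsets(s):
    return [frozenset(c) for r in range(len(s) + 1) for c in combinations(s, r)]

def free_disposal(v):
    """v is monotone: adding items never lowers the bid."""
    return all(v[a] <= v[b]
               for a in subsets(items) for b in subsets(items) if a <= b)

def submodular(v):
    """Decreasing marginal values: v(A) + v(B) >= v(A|B) + v(A&B)."""
    return all(v[a] + v[b] >= v[a | b] + v[a & b]
               for a in subsets(items) for b in subsets(items))
```

Enumeration over all subset pairs is exponential in the number of items; the point of the paper's matrix-bid structure is precisely to make such checks tractable.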

Exploring The Bullwhip Effect by Means of Spreadsheet Simulation

by Boute R; Lambrecht MR.

Abstract
An important supply chain research problem is the bullwhip effect: demand fluctuations increase as one moves up the supply chain from retailer to manufacturer. It has been recognized that demand forecasting and ordering policies are two of the key causes of the bullwhip effect. In this paper we present a spreadsheet application which explores a series of replenishment policies and forecasting techniques under different demand patterns. It illustrates how tuning the parameters of the replenishment policy induces or reduces the bullwhip effect. Moreover, we demonstrate how bullwhip reduction (order variability dampening) may have an adverse impact on inventory holdings. Indeed, order smoothing may increase inventory fluctuations, resulting in poorer customer service. As such, the spreadsheets can be used as an educational tool to gain a clear insight into the use or abuse of inventory control policies and improper forecasting in relation to the bullwhip effect and customer service.

Keywords: Bullwhip; Bullwhip effect; Forecasting techniques; Inventory fluctuations; Replenishment rule; Simulation; Spreadsheet simulation

For detail, download here (right click)
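The spreadsheet logic described above translates directly into a few lines of code: an order-up-to replenishment policy with exponential-smoothing forecasts, whose order variance exceeds the demand variance. The lead time, smoothing constant, and demand process below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)
T = 2000
demand = 20.0 + rng.normal(scale=2.0, size=T)   # i.i.d. retail demand

def retailer_orders(demand, alpha=0.3, L=2):
    """Order-up-to policy with exponential-smoothing demand forecasts."""
    forecast = demand[0]
    inventory_position = (L + 1) * demand[0]
    orders = []
    for d in demand:
        forecast = alpha * d + (1 - alpha) * forecast
        target = (L + 1) * forecast        # base-stock level over the lead time
        order = max(0.0, target - (inventory_position - d))
        inventory_position += order - d
        orders.append(order)
    return np.array(orders)

orders = retailer_orders(demand)
bullwhip = orders.var() / demand.var()  # variance amplification ratio
```

With these parameters the ratio comes out well above 1: the orders sent upstream fluctuate far more than the demand itself, which is the bullwhip effect. Lowering `alpha` (smoother forecasts) dampens the amplification, at the cost of larger inventory swings.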


Predicting Business/ICT Alignment with AntMiner+

by Cumps B; Martens D; De Backer M; Haesen R; Viaene S; Dedene G; Baesens B; Snoeck M.

Abstract
In this paper we report on the results of a European survey on business/ICT alignment practices. The goal of this study is to come up with some practical guidelines for managers on how to strive for better alignment of ICT investments with business requirements. Based on Luftman's alignment framework we examine 18 ICT management practices belonging to 6 different competency clusters. We use AntMiner+, a rule induction technique, to create an alignment rule set. The results indicate that B/ICT alignment is a multidimensional goal which can only be obtained through focused investments covering different alignment aspects. The obtained rule set is an interesting mix of both formal engineering and social interaction processes and structures. We discuss the implication of the alignment rules for practitioners.

Keywords: Alignment; Artificial ant systems; Business; Business/ICT alignment; Data; Data mining; Framework; Investment; Investments; Management; Management practices; Managers; Practical guidelines; Processes; Requirements; Rules; Structure; Studies; Systems

For detail, download here (right click)

A Methodology for Integrated Risk Management and Proactive Scheduling of Construction Projects

by Schatteman D; Herroelen WS; Van De Vonder S; Boone A.

Abstract
An integrated methodology is developed for planning construction projects under uncertainty. The methodology relies on a computer-supported risk management system that allows one to identify, analyze and quantify the major risk factors and derive the probability of their occurrence and their impact on the duration of the project activities. Using project management estimates of the marginal cost of activity starting time disruptions, a proactive baseline schedule is developed that is sufficiently protected against the anticipated disruptions with acceptable project makespan performance. The methodology is illustrated on a real-life application.

Keywords: Risk; Risk management; Management; Scheduling; Construction; Planning; Uncertainty; Factors; Probability; Impact; Project management; Cost; Time; Performance; Real life

For detail, download here (right click)

A Note on a Motion Control Problem for a Placement Machine

by Coenen S; van Hop N; van de Klundert J; Spieksma FCR.

Abstract
Assembling printed circuit boards efficiently using automated placement machines is a challenging task. Here, we focus on a motion control problem for a specific type of placement machine. More specifically, the problem is to establish movement patterns for the robot arm, the feeder rack, and, when appropriate, the work table of a sequential pick-and-place machine. In this note we show that a (popular) greedy strategy may not always yield an optimum solution. However, under the Tchebychev metric, as well as under the Manhattan metric, we can model the problem as a linear program, thereby establishing the existence of a polynomial-time algorithm for this motion control problem. Finally, we give experimental evidence that computing optimal solutions to this motion control problem can yield significantly better solutions than those found by a greedy method.

Keywords: Algorithms; Control; Model; Optimal; Patterns; Printed circuit boards; Strategy; Time; Work; Yield
For detail, download here (right click)
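The note's central observation, that a greedy movement strategy can be suboptimal, can be reproduced on a toy instance: under the Tchebychev metric, travel time is the maximum of the axis distances (both motors move simultaneously). The coordinates below are invented, and brute force over visit orders stands in for the paper's linear-programming formulation:

```python
from itertools import permutations

def cheb(a, b):
    """Travel time when x- and y-motors move simultaneously (Tchebychev)."""
    return max(abs(a[0] - b[0]), abs(a[1] - b[1]))

start = (0.0, 0.0)
points = [(1.0, 1.0), (-1.5, 0.5), (3.0, 2.0)]  # illustrative pick locations

def tour_cost(order):
    pos, cost = start, 0.0
    for p in order:
        cost += cheb(pos, p)
        pos = p
    return cost

def greedy():
    """Always move to the nearest unvisited location."""
    left, pos, order = list(points), start, []
    while left:
        nxt = min(left, key=lambda p: cheb(pos, p))
        left.remove(nxt)
        order.append(nxt)
        pos = nxt
    return order

best = min(permutations(points), key=tour_cost)
```

On this instance the greedy route costs 7.5 while the optimum is 6.0: grabbing the nearest point first forces an expensive detour later, exactly the failure mode the note exhibits.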

An Investigation of Resource-allocation Decisions by Means of Project Networks

by Leus R.

Abstract
This paper investigates the relationship between resource allocation and ES-policies, which are a type of scheduling policy introduced for stochastic scheduling and which can be represented by a directed acyclic graph. We present a formal treatment of resource flows as a representation of resource-allocation decisions, extending the existing literature. A number of complexity results are established, showing that a number of recently proposed objective functions for evaluating the quality of ES-policies lead to difficult problems. Finally, some reflections are provided on possible efficiency enhancements to enumeration algorithms for ES-policies.

Keywords: Complexity; Project scheduling; Resource allocation; Resource constraints

For detail, download here (right click)

Business Process Verification: a Petri Net Approach

by De Backer M; Snoeck M.

Abstract
In this report, we discuss the use of Petri Net language theory for business process modeling. Essentially, the focus is on the opportunities of the modeling technique for analysis and verification. Semantic compatibility, as opposed to syntactic compatibility, is concerned with the meaningfulness of the distributed business process. We start with a description and motivation of different notions of semantically compatible business processes. Further, these different types of compatibility are formalized by means of Petri Net language theory. Finally, we describe the foundations of an algorithm that enables us to verify the semantic compatibility in an automated way.

Keywords: Business process modeling; Petri Net theory; Semantic compatibility; Verification; Theory; Business; Processes; Process modeling; Opportunities

For detail, download here (right click)
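The Petri Net machinery underlying this verification approach rests on a simple firing rule: a transition is enabled when its input places hold enough tokens, and firing moves tokens from input places to output places. A minimal sketch with an invented two-step order process:

```python
# A transition is a pair (pre, post): tokens consumed from input places
# and tokens produced into output places.
def enabled(marking, transition):
    pre, _ = transition
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, transition):
    """Return the new marking after firing an enabled transition."""
    pre, post = transition
    assert enabled(marking, transition), "transition not enabled"
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

# a two-step business process: receive order -> check stock -> ship
t_check = ({"order": 1}, {"checked": 1})
t_ship = ({"checked": 1, "stock": 1}, {"shipped": 1})

m0 = {"order": 1, "stock": 1}
m1 = fire(m0, t_check)
m2 = fire(m1, t_ship)
```

Compatibility questions of the kind the report formalizes amount to properties of the reachable markings and firing sequences (the net's language), which this firing rule generates.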

Evaluation of Scan Methods Used in the Monitoring of Public Health Surveillance

Shannon Elizabeth Fraker

Abstract
With the recent increase in the threat of biological terrorism as well as the continual risk of other diseases, research in public health surveillance and disease monitoring has grown tremendously. There is an abundance of data available in all sorts of forms. Hospitals, federal and local governments, and industries are all collecting data and developing new methods to be used in the detection of anomalies. Many of these methods are developed, applied to a real data set, and incorporated into software. This research, however, takes a different view of the evaluation of these methods.

Keywords: Anomaly detection; CUSUM charts; EWMA charts; Recurrence interval; Scan method; Time-to-signal.

For detail, download here (right click)
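A CUSUM chart, one of the monitoring methods named in the keywords, accumulates the excess of each observation over a reference value and signals when the cumulative sum crosses a decision limit. The counts, shift size, and chart parameters below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(5)
# daily syndrome counts: in-control mean 10, then an outbreak shifts it to 14
counts = np.concatenate([rng.poisson(10, 60), rng.poisson(14, 20)])

def cusum_signal(x, target=10.0, k=2.0, h=8.0):
    """One-sided CUSUM: accumulate max(0, s + x - target - k) and signal
    when the sum exceeds the decision limit h; return the first signal
    index, or None if the chart never signals."""
    s = 0.0
    for i, xi in enumerate(x):
        s = max(0.0, s + (xi - target - k))
        if s > h:
            return i
    return None

alarm = cusum_signal(counts)
```

The allowance `k` and limit `h` trade false alarms against detection speed, the time-to-signal trade-off the dissertation evaluates across scan methods.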

Least Squares Optimal Scaling of Partially Observed Linear Systems

JAN DE LEEUW

Abstract
We study linear systems in which both the coefficients of the linear combinations and the variables which are combined linearly are only partially known. This includes the two logical extremes: completely known and completely unknown. The systems we study include the usual linear systems of simultaneous equations, as well as nonlinear multivariate analysis systems. Throughout, we use unweighted least squares loss functions and majorization algorithms to minimize them. We incorporate both errors-in-equations and errors-in-variables as additional unknowns into the loss function, and arrive at algorithms for decompositions of partially unknown matrices.

For detail, download here (right click)
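The flavor of alternating, monotone least-squares minimization the abstract describes can be sketched for the simplest case: a partially observed matrix factored by unweighted least squares, where each half-step solves an ordinary least-squares problem and therefore never increases the loss. This is an illustration of the idea, not the paper's general algorithm:

```python
import numpy as np

rng = np.random.default_rng(6)
A = rng.normal(size=(8, 2)) @ rng.normal(size=(2, 6))  # true rank-2 matrix
mask = rng.random(A.shape) < 0.7                        # observed cells only

def als(Y, mask, rank=2, iters=50):
    """Alternating least squares on the observed cells of Y: each half-step
    minimizes the unweighted squared loss exactly, so the loss decreases
    monotonically, as in majorization algorithms."""
    m, n = Y.shape
    U, V = rng.normal(size=(m, rank)), rng.normal(size=(n, rank))
    for _ in range(iters):
        for i in range(m):            # update each row of U given V
            w = mask[i]
            U[i], *_ = np.linalg.lstsq(V[w], Y[i, w], rcond=None)
        for j in range(n):            # update each row of V given U
            w = mask[:, j]
            V[j], *_ = np.linalg.lstsq(U[w], Y[w, j], rcond=None)
    return U, V

U, V = als(A, mask)
loss = float(((A - U @ V.T)[mask] ** 2).sum())
```

On this noiseless rank-2 instance the observed-cell loss drops to near zero; with partially known coefficients and variables the paper's algorithms generalize the same alternating scheme.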