Praise for the First Edition
“…[t]he book is great for readers who need to apply the methods and models presented but have little background in mathematics and statistics.” – MAA Reviews
Thoroughly updated throughout, Introduction to Time Series Analysis and Forecasting, Second Edition presents the underlying theories of time series analysis that are needed to analyze time-oriented data and construct real-world short- to medium-term statistical forecasts.
Authored by highly experienced academics and professionals in engineering statistics, the Second Edition features discussions of both popular and modern time series methodologies as well as an introduction to Bayesian methods in forecasting. Introduction to Time Series Analysis and Forecasting, Second Edition also includes:
- Over 300 exercises from diverse disciplines including health care, environmental studies, engineering, and finance
- More than 50 programming algorithms using JMP®, SAS®, and R that illustrate the theory and practicality of forecasting techniques in the context of time-oriented data
- New material on frequency domain analysis and spatio-temporal data analysis
- Expanded coverage of the variogram and spectrum with applications, as well as transfer functions and intervention models
- A supplementary website featuring PowerPoint® slides, data sets, and select solutions to the problems
Introduction to Time Series Analysis and Forecasting, Second Edition is an ideal textbook for upper-undergraduate and graduate-level courses in forecasting and time series. The book is also an excellent reference for practitioners and researchers who need to model and analyze time series data to generate forecasts.
Table of Contents
Preface xi
1 Introduction to Forecasting 1
1.1 The Nature and Uses of Forecasts 1
1.2 Some Examples of Time Series 6
1.3 The Forecasting Process 13
1.4 Data for Forecasting 16
1.4.1 The Data Warehouse 16
1.4.2 Data Cleaning 18
1.4.3 Imputation 18
1.5 Resources for Forecasting 19
Exercises 20
2 Statistics Background for Forecasting 25
2.1 Introduction 25
2.2 Graphical Displays 26
2.2.1 Time Series Plots 26
2.2.2 Plotting Smoothed Data 30
2.3 Numerical Description of Time Series Data 33
2.3.1 Stationary Time Series 33
2.3.2 Autocovariance and Autocorrelation Functions 36
2.3.3 The Variogram 42
2.4 Use of Data Transformations and Adjustments 46
2.4.1 Transformations 46
2.4.2 Trend and Seasonal Adjustments 48
2.5 General Approach to Time Series Modeling and Forecasting 61
2.6 Evaluating and Monitoring Forecasting Model Performance 64
2.6.1 Forecasting Model Evaluation 64
2.6.2 Choosing Between Competing Models 74
2.6.3 Monitoring a Forecasting Model 77
2.7 R Commands for Chapter 2 84
Exercises 96
3 Regression Analysis and Forecasting 107
3.1 Introduction 107
3.2 Least Squares Estimation in Linear Regression Models 110
3.3 Statistical Inference in Linear Regression 119
3.3.1 Test for Significance of Regression 120
3.3.2 Tests on Individual Regression Coefficients and Groups of Coefficients 123
3.3.3 Confidence Intervals on Individual Regression Coefficients 130
3.3.4 Confidence Intervals on the Mean Response 131
3.4 Prediction of New Observations 134
3.5 Model Adequacy Checking 136
3.5.1 Residual Plots 136
3.5.2 Scaled Residuals and PRESS 139
3.5.3 Measures of Leverage and Influence 144
3.6 Variable Selection Methods in Regression 146
3.7 Generalized and Weighted Least Squares 152
3.7.1 Generalized Least Squares 153
3.7.2 Weighted Least Squares 156
3.7.3 Discounted Least Squares 161
3.8 Regression Models for General Time Series Data 177
3.8.1 Detecting Autocorrelation: The Durbin–Watson Test 178
3.8.2 Estimating the Parameters in Time Series Regression Models 184
3.9 Econometric Models 205
3.10 R Commands for Chapter 3 209
Exercises 219
4 Exponential Smoothing Methods 233
4.1 Introduction 233
4.2 First-Order Exponential Smoothing 239
4.2.1 The Initial Value, ỹ₀ 241
4.2.2 The Value of λ 241
4.3 Modeling Time Series Data 245
4.4 Second-Order Exponential Smoothing 247
4.5 Higher-Order Exponential Smoothing 257
4.6 Forecasting 259
4.6.1 Constant Process 259
4.6.2 Linear Trend Process 264
4.6.3 Estimation of σₑ² 273
4.6.4 Adaptive Updating of the Discount Factor 274
4.6.5 Model Assessment 276
4.7 Exponential Smoothing for Seasonal Data 277
4.7.1 Additive Seasonal Model 277
4.7.2 Multiplicative Seasonal Model 280
4.8 Exponential Smoothing of Biosurveillance Data 286
4.9 Exponential Smoothers and ARIMA Models 299
4.10 R Commands for Chapter 4 300
Exercises 311
5 Autoregressive Integrated Moving Average (ARIMA) Models 327
5.1 Introduction 327
5.2 Linear Models for Stationary Time Series 328
5.2.1 Stationarity 329
5.2.2 Stationary Time Series 329
5.3 Finite Order Moving Average Processes 333
5.3.1 The First-Order Moving Average Process, MA(1) 334
5.3.2 The Second-Order Moving Average Process, MA(2) 336
5.4 Finite Order Autoregressive Processes 337
5.4.1 First-Order Autoregressive Process, AR(1) 338
5.4.2 Second-Order Autoregressive Process, AR(2) 341
5.4.3 General Autoregressive Process, AR(p) 346
5.4.4 Partial Autocorrelation Function, PACF 348
5.5 Mixed Autoregressive–Moving Average Processes 354
5.5.1 Stationarity of ARMA(p, q) Process 355
5.5.2 Invertibility of ARMA(p, q) Process 355
5.5.3 ACF and PACF of ARMA(p, q) Process 356
5.6 Nonstationary Processes 363
5.6.1 Some Examples of ARIMA(p, d, q) Processes 363
5.7 Time Series Model Building 367
5.7.1 Model Identification 367
5.7.2 Parameter Estimation 368
5.7.3 Diagnostic Checking 368
5.7.4 Examples of Building ARIMA Models 369
5.8 Forecasting ARIMA Processes 378
5.9 Seasonal Processes 383
5.10 ARIMA Modeling of Biosurveillance Data 393
5.11 Final Comments 399
5.12 R Commands for Chapter 5 401
Exercises 412
6 Transfer Functions and Intervention Models 427
6.1 Introduction 427
6.2 Transfer Function Models 428
6.3 Transfer Function–Noise Models 436
6.4 Cross-Correlation Function 436
6.5 Model Specification 438
6.6 Forecasting with Transfer Function–Noise Models 456
6.7 Intervention Analysis 462
6.8 R Commands for Chapter 6 473
Exercises 486
7 Survey of Other Forecasting Methods 493
7.1 Multivariate Time Series Models and Forecasting 493
7.1.1 Multivariate Stationary Process 494
7.1.2 Vector ARIMA Models 494
7.1.3 Vector AR (VAR) Models 496
7.2 State Space Models 502
7.3 ARCH and GARCH Models 507
7.4 Direct Forecasting of Percentiles 512
7.5 Combining Forecasts to Improve Prediction Performance 518
7.6 Aggregation and Disaggregation of Forecasts 522
7.7 Neural Networks and Forecasting 526
7.8 Spectral Analysis 529
7.9 Bayesian Methods in Forecasting 535
7.10 Some Comments on Practical Implementation and Use of Statistical Forecasting Procedures 542
7.11 R Commands for Chapter 7 545
Exercises 550
Appendix A Statistical Tables 561
Appendix B Data Sets for Exercises 581
Appendix C Introduction to R 627
Bibliography 631
Index 639
About the Authors
DOUGLAS C. MONTGOMERY, PhD, is Regents’ Professor and ASU Foundation Professor of Engineering at Arizona State University. With over 35 years of academic and consulting experience, Dr. Montgomery has authored or coauthored over 250 journal articles and 13 books. His research interests include design and analysis of experiments, statistical methods for process monitoring and optimization, and the analysis of time-oriented data.
CHERYL L. JENNINGS, PhD, is a Faculty Associate at Arizona State University. With more than 30 years of experience in the automotive, semiconductor, and banking industries, Dr. Jennings has coauthored two books. Her areas of professional interest include Six Sigma, modeling and analysis, performance management, and process control and improvement.
MURAT KULAHCI, PhD, is Associate Professor of Statistics at the Technical University of Denmark and Guest Deputy Professor at the Luleå University of Technology in Sweden. He is the author or coauthor of over 60 journal articles and two books. Dr. Kulahci’s research interests include time series analysis, design of experiments, and statistical process control and monitoring.