
Forecasting: principles and practice

Welcome to our online textbook on forecasting. This textbook is intended to provide a comprehensive introduction to forecasting methods and to present enough information about each method for readers to be able to use them sensibly. We don’t attempt to give a thorough discussion of the theoretical details behind each method, although the references at the end of each chapter will fill in many of those details. The book is written for three audiences: (1) people finding themselves doing forecasting in business when they may not have had any formal training in the area; (2) undergraduate students studying business; (3) MBA students doing a forecasting elective. We use it ourselves for a second-year subject for students undertaking a Bachelor of Commerce degree at Monash University, Australia. For most sections, we only assume that readers are familiar with algebra, and high school mathematics should be sufficient background.

The Debunking Handbook. Posted on 27 November 2011 by John Cook. The Debunking Handbook, a guide to debunking misinformation, is now freely available to download. Although there is a great deal of psychological research on misinformation, there is no summary of the literature that offers practical guidelines on the most effective ways of reducing the influence of myths. The Debunking Handbook boils the research down into a short, simple summary, intended as a guide for communicators in all areas (not just climate) who encounter misinformation. The Handbook explores the surprising fact that debunking myths can sometimes reinforce the myth in people's minds. It also looks at a key element of successful debunking: providing an alternative explanation. The authors: John Cook is the Climate Change Communication Fellow for the Global Change Institute at the University of Queensland. Professor Lewandowsky is an Australian Professorial Fellow and a cognitive scientist at the University of Western Australia.

What is volatility? Some facts and some speculation. Definition: volatility is the annualized standard deviation of returns, often expressed in percent. A volatility of 20 means that there is about a one-third probability that an asset's price a year from now will have fallen or risen by more than 20% from its present value. In R the computation, given a series of daily prices, looks like: sqrt(252) * sd(diff(log(priceSeriesDaily))) * 100. Usually, as here, log returns are used (though it is unlikely to make much difference). Historical estimation: what frequency of returns should be used when estimating volatility? There is folklore that it is better to use monthly data than daily data because daily data are noisier. However, this is finance, so things aren't that easy. Another complication arises if there are assets from around the globe. Through time: volatility would be more boring if finance were like other fields where standard deviations never change. The post goes on to discuss volatility across assets, implied volatility, and why volatility is not the same as risk.
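The R one-liner above can be mirrored in Python. This is a minimal sketch, not code from the post: the price series is synthetic, and `ddof=1` is used so that NumPy's standard deviation matches R's sample `sd`.

```python
import numpy as np

def annualized_volatility(prices, trading_days=252):
    """Annualized volatility in percent from daily prices, using log returns.

    Mirrors the R one-liner: sqrt(252) * sd(diff(log(prices))) * 100.
    """
    log_returns = np.diff(np.log(np.asarray(prices, dtype=float)))
    return np.sqrt(trading_days) * np.std(log_returns, ddof=1) * 100

# Synthetic daily prices with a daily log-return sd of about 0.0126,
# which corresponds to an annualized volatility of roughly 20.
rng = np.random.default_rng(0)
prices = 100 * np.exp(np.cumsum(rng.normal(0, 0.0126, 500)))
vol = annualized_volatility(prices)
```

A constant price series gives a volatility of exactly zero, which is a quick sanity check on the computation.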

Forecasting within limits. It is common to want forecasts to be positive, or to require them to lie within some specified range [a, b]. Both of these situations are relatively easy to handle using transformations. Positive forecasts: to impose a positivity constraint, simply work on the log scale. Forecasts constrained to an interval: to see how to handle data constrained to an interval, imagine that the egg prices were constrained to lie within some limits a and b. Then a scaled logit transform maps (a, b) to the whole real line: y = log((x - a)/(b - x)), where x is on the original scale and y is the transformed data. The prediction intervals from these transformations have the same coverage probability as on the transformed scale, because quantiles are preserved under monotonically increasing transformations.
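A minimal Python sketch of the scaled-logit idea described above; the limits a = 50 and b = 400 are chosen for illustration only, and the function names are mine:

```python
import math

def to_unbounded(x, a, b):
    """Scaled logit: map x in the interval (a, b) onto the whole real line."""
    return math.log((x - a) / (b - x))

def from_unbounded(y, a, b):
    """Inverse transform: map a real-valued forecast back into (a, b)."""
    return (b - a) * math.exp(y) / (1 + math.exp(y)) + a

# Round trip: transform, forecast on the unbounded scale, back-transform.
a, b = 50, 400      # illustrative lower and upper limits
x = 120.0
y = to_unbounded(x, a, b)
x_back = from_unbounded(y, a, b)   # recovers 120.0 up to rounding
```

Because the transform is monotonically increasing, quantiles (and hence prediction-interval endpoints) computed on the transformed scale can simply be back-transformed, which is the coverage-preservation property the text mentions.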

FLCT: Funny Little Calculus Text - Robert W. Ghrist

Classics in Psychology. An internet resource developed by Christopher D. Green, ISSN 1492-3173. (Return to Classics index) Last updated . 19th- & 20th-Century Psychology. Can't find what you want? Ancient Thought: Plato. Aristotle. For additional works by the Presocratics, Plato, Aristotle, Hippocrates, Euclid, Lucretius, Epictetus, Galen, Plotinus, and Augustine, see the Links to Documents at Other Sites page. Medieval & Renaissance Thought: for works by Aquinas, Roger Bacon, Pico, and Machiavelli see the Links to Documents at Other Sites page. Modern Philosophical Thought: Berkeley, George (1732). Bowen, Francis (1860). McCosh, James (1874). Herbart, J. Fiske, John (1902). Royce, Josiah (1902). Stumpf, Carl (1930). Titchener, E. Creighton, J. For additional works by Descartes, Hobbes, Pascal, Locke, Leibniz, Spinoza, Voltaire, Hume, Smith, Malthus, Kant, Hegel, Marx, Mill, Brentano, Mach, Peirce, James, Dewey, Husserl, Russell, Mead, and Merleau-Ponty see the Links to Documents at Other Sites page.

Time Series Analysis | R Statistics.Net. Any metric that is measured over time is a time series. Time series are of high practical importance in industry, especially for forecasting (demand, sales, supply, etc.), and a series can be broken down into its components so as to forecast it systematically. What is a time series? Any metric that is measured over regular time intervals makes a time series. How to create a time series in R? Upon importing your data into R, use the ts() function as follows: ts(inputData, frequency = 4, start = c(1959, 2)) # frequency 4 => quarterly data; ts(1:10, frequency = 12, start = 1990) # frequency 12 => monthly data. Understanding your time series: each data point (Yt) in a time series can be expressed as either a sum or a product of three components, namely seasonality (St), trend (Tt) and error (et), a.k.a. white noise. For an additive time series, Yt = St + Tt + et. For a multiplicative time series, Yt = St * Tt * et. A multiplicative time series can be converted to additive by taking the log of the series.
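The log-conversion claim at the end can be checked numerically. A small Python sketch (the component shapes below are invented for illustration):

```python
import numpy as np

# A multiplicative series: Yt = St * Tt * et
t = np.arange(1, 49)
trend = 10 + 0.5 * t                               # Tt: linear trend
season = 1 + 0.2 * np.sin(2 * np.pi * t / 12)      # St: yearly cycle around 1
noise = np.exp(np.random.default_rng(1).normal(0, 0.01, t.size))  # et > 0
y = season * trend * noise

# Taking logs turns the product into a sum:
# log Yt = log St + log Tt + log et  (an additive series)
log_y = np.log(y)
additive_sum = np.log(season) + np.log(trend) + np.log(noise)
```

The two arrays agree to machine precision, which is exactly why a log transform lets additive decomposition methods be applied to multiplicative series (provided all values are positive).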

Interpreting noise. When watching the TV news, or reading newspaper commentary, I am frequently amazed at the attempts people make to interpret random noise. For example, the latest tiny fluctuation in the share price of a major company is attributed to the CEO being ill. When the exchange rate goes up, the TV finance commentator confidently announces that it is a reaction to Chinese building contracts. No one ever says "The unemployment rate has dropped by 0.1% for no apparent reason." What is going on here is that the commentators are assuming we live in a noise-free world. They imagine that everything is explicable; you just have to find the explanation. The finance news: every night on the nightly TV news bulletins, a supposed expert will go through the changes in share prices, stock price indexes, currency rates, and economic indicators from the past 24 hours, and offer an explanation for every movement, however small. It would be refreshing to hear a commentator admit that some of these movements are just noise. Sadly, that's unlikely to happen.

Big Data, Data Mining, Predictive Analytics, Statistics, StatSoft Electronic Textbook. "Thank you and thank you again for providing a complete, well-structured, and easy-to-understand online resource. Every other website or snobbish research paper has not deigned to explain things in words consisting of less than four syllables. I was tossed to and fro like a man holding on to a frail plank that he calls his determination until I came across your electronic textbook... You have cleared the air for me. You have enlightened. You have illuminated." — Mr. "As a professional medical statistician of some 40 years standing, I can unreservedly recommend this textbook as a resource for self-education, teaching and on-the-fly illustration of specific statistical methodology in one-to-one statistical consulting." — Mr. "Excellent book." — Dr. "Just wanted to congratulate whoever wrote the 'Experimental Design' page." — James A. StatSoft has freely provided the Electronic Statistics Textbook as a public service since 1995.

historic documents in computer science. Fortran: Fortran Automated Coding System For the IBM 704, the very first Fortran manual, by John Backus et al., Oct. 1956. Al Kossow also has an IBM 704 manual in his collection, if you want to have a look at the machine that this original Fortran language was made for. The following IBM manuals are also from the collection at his web site, where you can find a number of Fortran manuals for many machines from various manufacturers. The FORTRAN II General Information Manual and IBM 7090/7094 Programming Systems: FORTRAN II Programming are two IBM manuals of 1963 describing the FORTRAN II language. IBM 7090/7094 Programming Systems: FORTRAN IV Language, 1963, and IBM System/360 and System/370 FORTRAN IV Language, 1974: FORTRAN IV was, alongside the ANSI standard of 1966, for a long time the reference language for Fortran as used by legions of scientists and engineers. Algol 60: I think this is the original Peter Naur edition of the Algol 60 report.

R Video tutorial for Spatial Statistics: Introductory time-series analysis of US Environmental Protection Agency (EPA) pollution data. Download EPA air pollution data: the US Environmental Protection Agency (EPA) provides tons of free data about air pollution and other weather measurements through their website. An overview of their offer is available here: The data are provided in hourly, daily and annual averages for the following parameters: Ozone, SO2, CO, NO2, PM2.5 FRM/FEM Mass, PM2.5 non-FRM/FEM Mass, PM10, Wind, Temperature, Barometric Pressure, RH and Dewpoint, HAPs (Hazardous Air Pollutants), VOCs (Volatile Organic Compounds) and Lead. All the files are accessible from this page: The web links to download the zip files are very similar to each other; they share an initial starting URL: and then the name of the file has the following format: type_property_year.zip. The type can be: hourly, daily or annual. data <- download.EPA(year = 2013, property = "ozone", type = "daily")
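The file-name pattern described above can be sketched in a few lines of Python. This is an illustration only: the function name is mine, the actual tokens EPA uses for each property may be numeric codes rather than plain words, and the base URL (omitted in the text above) is left out here too.

```python
def epa_zip_name(kind, prop, year):
    """Build a zip file name following the type_property_year.zip pattern.

    kind : 'hourly', 'daily' or 'annual'
    prop : a property token, e.g. 'ozone' (the real EPA token may differ)
    year : four-digit year
    """
    if kind not in ("hourly", "daily", "annual"):
        raise ValueError("type must be hourly, daily or annual")
    return f"{kind}_{prop}_{year}.zip"

name = epa_zip_name("daily", "ozone", 2013)  # "daily_ozone_2013.zip"
```

Joining this name onto the common starting URL would give the full download link, which is what the tutorial's download.EPA() helper presumably does internally.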

Errors on percentage errors. The MAPE (mean absolute percentage error) is a popular measure of forecast accuracy and is defined as MAPE = mean(100 |y_t - f_t| / |y_t|), where y_t denotes an observation, f_t denotes its forecast, and the mean is taken over the forecast periods t. Armstrong (1985, p.348) was the first (to my knowledge) to point out the asymmetry of the MAPE, saying that "it has a bias favoring estimates that are below the actual values". A few years later, Armstrong and Collopy (1992) argued that the MAPE "puts a heavier penalty on forecasts that exceed the actual than those that are less than the actual". For example, suppose y_t = 150 and f_t = 100, so that the relative error is 50/150 = 0.33, in contrast to the situation where y_t = 100 and f_t = 150, when the relative error would be 50/100 = 0.50. Thus, the MAPE puts a heavier penalty on negative errors (when y_t < f_t) than on positive errors. With the error defined as e_t = y_t - f_t, positive errors arise only when the forecast is too small. To avoid the asymmetry of the MAPE, Armstrong (1985, p.348) proposed the "adjusted MAPE", defined as adjusted MAPE = mean(200 |y_t - f_t| / (y_t + f_t)). Despite its name, it can be negative (if y_t + f_t < 0) or infinite (if f_t = -y_t).
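The asymmetry can be verified with a few lines of Python; the function names are mine, not from the post:

```python
def ape(actual, forecast):
    """Absolute percentage error for a single observation: |y - f| / |y|."""
    return abs(actual - forecast) / abs(actual)

# Forecast below the actual: y = 150, f = 100
low = ape(150, 100)    # 50/150, about 0.33
# Forecast above the actual by the same amount: y = 100, f = 150
high = ape(100, 150)   # 50/100 = 0.50

def adjusted_mape_term(actual, forecast):
    """One term of the 'adjusted MAPE': 2|y - f| / (y + f)."""
    return 2 * abs(actual - forecast) / (actual + forecast)

# The adjusted version treats the two cases symmetrically:
s1 = adjusted_mape_term(150, 100)  # 100/250 = 0.4
s2 = adjusted_mape_term(100, 150)  # 100/250 = 0.4
```

The same absolute error of 50 is penalized more heavily by the plain APE when the forecast exceeds the actual, exactly the bias Armstrong and Collopy describe, while the adjusted form gives both cases the same score.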

Statistics books for (free) download. This post will eventually grow to hold a wide list of books on statistics (e-books, pdf books and so on) that are available for free download, but for now we'll start off with just a few books. The Elements of Statistical Learning, written by Trevor Hastie, Robert Tibshirani and Jerome Friedman: you can legally download a copy of the book in pdf format from the authors' website (first discovered on the "one R tip a day" blog). Statistics (Probability and Data Analysis): a wikibook. Several of these books were discovered through a CrossValidated discussion. Know of any more e-books freely available for download?

Book Reviews
