29 December 2016

Strategic Management: Decision Trees (Just the Quotes)

"A decision tree does not give management the answer to an investment problem; rather, it helps management determine which alternative at any particular choice point will yield the greatest expected monetary gain, given the information and alternatives pertinent to the decision."  (John F Magee, "Decision Trees for Decision Making", Harvard Business Review, 1964) [source]

"A decision tree of any size will always combine (a) action choices with (b) different possible events or results of action which are partially affected by chance or other uncontrollable circumstances." (John F Magee, "Decision Trees for Decision Making", Harvard Business Review, 1964) [source]

"The unique feature of the decision tree is that it allows management to combine analytical techniques such as discounted cash flow and present value methods with a clear portrayal of the impact of future decision alternatives and events. Using the decision tree, management can consider various courses of action with greater ease and clarity. The interactions between present decision alternatives, uncertain events, and future choices and their results become more visible." (John F Magee, "Decision Trees for Decision Making", Harvard Business Review, 1964) [source]

"[decision trees are the] most picturesque of all the allegedly scientific aids to making decisions. The analyst charts all the possible outcomes of different options, and charts all the latters' outcomes, too. This produces a series of stems and branches (hence the tree). Each of the chains of events is given a probability and a monetary value." (Robert Heller, "The Pocket Manager", 1987)

"Decision trees make decision-making easier by identifying a series of conditions and actions. They are used to determine actions in response to given situations. [...] One benefit of a decision tree is that it gives a visual depiction of all the conditions and actions of a decision. They are also easy to construct and follow, and they may be compressed into a decision table." (Ralph L Kliem & Irwin S Ludin, Tools and Tips for Today's Project Manager, 1999)

"One advantage that decision tree modeling has over other pattern recognition techniques lies in the interpretability of the decision model. Due to this interpretability, information relating to the identification of important features and interclass relationships can be used to support the design of future experiments and data analysis." (S D Brown, A J Myles, in Comprehensive Chemometrics, 2009)

"Decision trees are an important tool for decision making and risk analysis, and are usually represented in the form of a graph or list of rules. One of the most important features of decision trees is the ease of their application. Being visual in nature, they are readily comprehensible and applicable. Even if users are not familiar with the way that a decision tree is constructed, they can still successfully implement it. Most often decision trees are used to predict future scenarios, based on previous experience, and to support rational decision making." (Jelena Djuris et al, "Neural computing in pharmaceutical products and process development", Computer-Aided Applications in Pharmaceutical Technology, 2013)

"Decision trees (DTs) are the simplest modeling techniques and are most appropriate for modeling interventions in which the relevant events occur over a short time period. The main limitation of decision trees is their inflexibility to model decision problems, which involve recurring events and are ongoing over time. " (H Haji Ali Afzali & J Karnon, "Specification and Implementation of Decision Analytic Model Structures for Economic Evaluation of Health Care Technologies", Encyclopedia of Health Economics, 2014)

"Decision trees are considered a good predictive model to start with, and have many advantages. Interpretability, variable selection, variable interaction, and the flexibility to choose the level of complexity for a decision tree all come into play." (Ralph Winters, "Practical Predictive Analytics", 2017)

"Decision trees show the breakdown of the data by one variable then another in a very intuitive way, though they are generally just diagrams that don’t actually encode data visually." (Robert Grant, "Data Visualization: Charts, Maps and Interactive Graphics", 2019)

"Random forests are essentially an ensemble of trees. They use many short trees, fitted to multiple samples of the data, and the predictions are averaged for each observation. This helps to get around a problem that trees, and many other machine learning techniques, are not guaranteed to find optimal models, in the way that linear regression is. They do a very challenging job of fitting non-linear predictions over many variables, even sometimes when there are more variables than there are observations. To do that, they have to employ 'greedy algorithms', which find a reasonably good model but not necessarily the very best model possible." (Robert Grant, "Data Visualization: Charts, Maps and Interactive Graphics", 2019)
