Decision trees are among the most powerful and popular tools for classification and prediction, and in recent years they have been a widely explored methodology in degree projects. A decision tree is a flowchart-like structure in which each internal node represents a test on an attribute, each branch represents an outcome of the test, and each leaf node (terminal node) holds a class label.
Many decision questions can be represented in the form of a table of results. However, especially in the case of complex investment decisions, a different representation of the information relevant to the problem – the decision tree – is useful for showing the paths by which the different possible outcomes are reached.
Likewise, action or decision forks can be indicated with square nodes and chance-event forks with round nodes. Other symbols can be used instead, such as single- or double-line branches, special letters, or colors. It does not matter much which method of distinction is used, so long as one is applied consistently. A decision tree of any size will always combine (a) action choices with (b) different possible events or outcomes of the action, which are partly affected by chance or other uncontrollable circumstances.
Construction of the decision tree
A tree can be developed by splitting the source set into subsets based on an attribute-value test. This process is repeated on each derived subset in a recursive manner known as recursive partitioning. The recursion terminates when all instances in the subset at a node have the same value of the target variable, or when splitting no longer adds value to the predictions.
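The recursive-partitioning process described above can be sketched in a few lines of pure Python. This is a minimal ID3-style sketch, not a production implementation: the toy weather data, the attribute names, and the choice of information gain as the splitting criterion are all illustrative assumptions.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    total = len(labels)
    return -sum((n / total) * math.log2(n / total)
                for n in Counter(labels).values())

def best_attribute(rows, labels, attributes):
    """Pick the attribute whose split yields the largest information gain."""
    base = entropy(labels)
    def gain(attr):
        remainder = 0.0
        for value in {row[attr] for row in rows}:
            subset = [lab for row, lab in zip(rows, labels) if row[attr] == value]
            remainder += len(subset) / len(labels) * entropy(subset)
        return base - remainder
    return max(attributes, key=gain)

def build_tree(rows, labels, attributes):
    """Recursive partitioning: stop when the node is pure or no attributes remain."""
    if len(set(labels)) == 1:
        return labels[0]                                 # pure leaf
    if not attributes:
        return Counter(labels).most_common(1)[0][0]      # majority-class leaf
    attr = best_attribute(rows, labels, attributes)
    tree = {attr: {}}
    for value in {row[attr] for row in rows}:
        idx = [i for i, row in enumerate(rows) if row[attr] == value]
        tree[attr][value] = build_tree([rows[i] for i in idx],
                                       [labels[i] for i in idx],
                                       [a for a in attributes if a != attr])
    return tree

# Toy weather data (illustrative only)
rows = [
    {"Forecast": "Sunny", "Wind": "Weak"},
    {"Forecast": "Sunny", "Wind": "Strong"},
    {"Forecast": "Rain",  "Wind": "Weak"},
    {"Forecast": "Rain",  "Wind": "Strong"},
]
labels = ["Yes", "Yes", "Yes", "No"]
print(build_tree(rows, labels, ["Forecast", "Wind"]))
```

Each recursive call sees only the rows matching the branch's attribute value and the attributes not yet used, which is exactly the "repeat on each derived subset" step described above.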
The construction of the decision tree classifier does not require any domain knowledge or parameter tuning and is therefore appropriate for exploratory knowledge discovery. Decision trees can handle high-dimensional data. In general, the decision tree classifier has good accuracy. Decision tree induction is a typical inductive approach to learning classification knowledge.
Representation of decision trees
Decision trees classify instances by sorting them down the tree from the root to some leaf node, which provides the classification of the instance. An instance is classified by starting at the root node of the tree, testing the attribute specified by this node, and then moving down the branch corresponding to the value of the attribute. This process is repeated for the subtree rooted at the new node. Example:
The decision tree classifies a given morning according to its suitability and returns the classification associated with the particular leaf.
For example, the instance (Forecast = Rain, Temperature = Hot, Humidity = High, Wind = Strong) would be sorted down the leftmost branch of this decision tree and would therefore be classified as a negative instance. In other words, we can say that the decision tree represents a disjunction of conjunctions of constraints on the attribute values of the instances.
Whereas an instance satisfying (Forecast = Sunny ∧ Humidity = Normal) ∨ (Forecast = Cloudy) ∨ (Forecast = Rain ∧ Wind = Weak) would be sorted down the right-hand side of the tree and classified as a positive instance.
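The root-to-leaf traversal described above can be sketched by representing the weather tree as nested dictionaries: internal nodes test one attribute, branch keys are attribute values, and leaves are class labels. The tree structure below mirrors the example's disjunction; it is a sketch, not a canonical implementation.

```python
# Weather decision tree as nested dicts: each internal node tests one
# attribute, each branch key is an attribute value, each leaf is a label.
weather_tree = {
    "Forecast": {
        "Sunny":  {"Humidity": {"High": "No", "Normal": "Yes"}},
        "Cloudy": "Yes",
        "Rain":   {"Wind": {"Strong": "No", "Weak": "Yes"}},
    }
}

def classify(tree, instance):
    """Walk from the root to a leaf, following the branch that matches
    the instance's value for the attribute tested at each node."""
    while isinstance(tree, dict):
        attribute = next(iter(tree))      # attribute tested at this node
        tree = tree[attribute][instance[attribute]]
    return tree

instance = {"Forecast": "Rain", "Temperature": "Hot",
            "Humidity": "High", "Wind": "Strong"}
print(classify(weather_tree, instance))   # prints "No": a negative instance
```

Note that Temperature is simply ignored by the traversal: only the attributes actually tested on the path from root to leaf influence the classification.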
Strengths and weaknesses of the decision tree method
The strengths of decision tree methods are:
They are able to generate understandable rules.
They perform classification without requiring much computation.
They are capable of handling both continuous and categorical variables.
They provide a clear indication of the most important fields for prediction or classification.
Weaknesses of decision tree methods are:
They are less appropriate for estimation tasks where the goal is to predict the value of a continuous attribute.
They are prone to errors in classification problems with many classes and a relatively small number of training examples.
Training a decision tree can be computationally expensive. At each node, each candidate split field must be sorted before its best split can be found. In some algorithms, combinations of fields are used, and the optimal combination weights have to be found. Pruning algorithms can also be expensive, since many candidate subtrees have to be formed and compared.
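The sort-then-scan step mentioned above can be sketched for a single numeric field: after one sort, every midpoint between distinct consecutive values is a candidate threshold, and each is scored. Gini impurity is used here as the scoring criterion, which is an assumption; other algorithms use entropy or different measures.

```python
def gini(labels):
    """Gini impurity of a list of class labels."""
    total = len(labels)
    return 1.0 - sum((labels.count(c) / total) ** 2 for c in set(labels))

def best_numeric_split(values, labels):
    """Sort the field once, then score every midpoint between distinct
    consecutive values; return (threshold, weighted impurity)."""
    pairs = sorted(zip(values, labels))
    n = len(pairs)
    best = (None, float("inf"))
    for i in range(1, n):
        if pairs[i - 1][0] == pairs[i][0]:
            continue                      # no boundary between equal values
        threshold = (pairs[i - 1][0] + pairs[i][0]) / 2
        left = [lab for _, lab in pairs[:i]]
        right = [lab for _, lab in pairs[i:]]
        score = (len(left) * gini(left) + len(right) * gini(right)) / n
        if score < best[1]:
            best = (threshold, score)
    return best

# Toy data (illustrative): temperature vs. "play outside?"
temps  = [18, 21, 24, 28, 31]
labels = ["Yes", "Yes", "Yes", "No", "No"]
print(best_numeric_split(temps, labels))
```

Even in this sketch the cost structure is visible: one sort per field per node, plus a scan over candidate thresholds, which is what makes training expensive on wide, deep trees.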
Continuing with the weather example, let’s say it’s a fairly cloudy Saturday morning and you have 75 people coming in for cocktails in the afternoon. You have a nice garden and your house is not too big; so, if the weather allows it, you might like to set up the refreshments in the garden and have the party there. It would be more pleasant and your guests would be more comfortable.
On the other hand, if you set up your garden party and it starts to rain after all the guests have gathered, the refreshments will be ruined, your guests will get wet, and you will wish you had decided to have the party at home. We could complicate this problem by considering the possibility of a partial commitment to one course of action or another and the opportunity to adjust the estimates as the day progresses, but the simple problem is all we need.
At the first node on the left, the host has the option of holding the party indoors or outdoors. Each branch represents an alternative action or decision. At the end of each branch there is another node representing a chance event: whether or not it will rain. Each subsequent course to the right represents an alternative outcome of this chance event. Each complete course through the tree has an associated result, shown at the end of its rightmost (terminal) branch.
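The single-stage party decision can be worked through numerically: weight each outcome's payoff by its probability and compare the two branches of the decision node. All figures below (the rain probability and the payoff values) are hypothetical, invented purely for illustration.

```python
# Hypothetical figures for the garden-party choice: probability of rain
# and a "satisfaction" payoff for each (decision, weather) outcome.
p_rain = 0.4

payoffs = {
    ("outdoors", "rain"):     -20,   # refreshments ruined, guests wet
    ("outdoors", "no rain"):  100,   # perfect garden party
    ("indoors",  "rain"):      60,   # dry and comfortable
    ("indoors",  "no rain"):   40,   # cramped, wishing you were outside
}

def expected_value(decision):
    """Probability-weighted payoff of one branch of the decision node."""
    return (p_rain * payoffs[(decision, "rain")]
            + (1 - p_rain) * payoffs[(decision, "no rain")])

for decision in ("outdoors", "indoors"):
    print(f"{decision}: expected payoff {expected_value(decision):.1f}")
```

With these invented numbers the outdoor party wins (52 vs. 48), but a slightly higher rain probability would reverse the choice, which is exactly the kind of sensitivity the tree makes visible.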
Decision Event Chains
The example above, while involving only a single decision stage, illustrates the elementary principles on which larger and more complex decision trees are built. Let’s take a somewhat more complicated situation:
Suppose you must decide whether to approve a budget for the development of an improved product. You are urged to do so on the grounds that the development, if successful, will give you a competitive advantage, while if you do not develop the product, your competitor can, and could seriously damage your market share.
Your initial decision is shown on the left. Following the decision to go ahead with the project, if the development is successful, there is a second decision stage at Point A. Assuming there is no major change in the situation between now and Point A, you decide now what alternatives will be important to you at that time. To the right of the tree are the results of different sequences of decisions and events. These results are also based on your current information. In effect, you say, “If what I know now is true, this is what will happen.”
Of course, you do not try to identify all the events that can happen or all the decisions that you will have to make on a topic under discussion. Only decisions and events or outcomes that are important to you and have consequences that you want to compare are exposed in the decision tree.
Choice of course of action
We are now ready for the next step in the analysis: comparing the consequences of different courses of action. A decision tree does not give management the answer to an investment problem, but instead helps management determine which alternative, at any particular choice point, will yield the largest expected monetary gain, given the information and the alternatives relevant to the decision.
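The "rollback" computation behind this comparison can be sketched recursively: at a chance node, take the probability-weighted average of the branch values; at a decision node, take the maximum. The tree below loosely follows the product-development example, but every probability and payoff is a hypothetical figure invented for illustration.

```python
def rollback(node):
    """Return the expected monetary value of a decision-tree node.
    Leaves are numbers; decision nodes pick the best branch;
    chance nodes average their branches by probability."""
    if isinstance(node, (int, float)):
        return node                                   # terminal payoff
    kind, branches = node
    if kind == "decision":
        # choose the alternative with the largest expected value
        return max(rollback(child) for _, child in branches)
    if kind == "chance":
        # probability-weighted average over the possible outcomes
        return sum(p * rollback(child) for p, child in branches)
    raise ValueError(f"unknown node kind: {kind}")

# Hypothetical two-stage development decision (all figures invented):
# develop -> 60% success (then a second decision at "Point A"), 40% failure;
# don't develop -> the competitor erodes your position.
tree = ("decision", [
    ("develop", ("chance", [
        (0.6, ("decision", [                          # Point A, after success
            ("market nationally", ("chance", [(0.5, 500), (0.5, 100)])),
            ("market regionally", 250),
        ])),
        (0.4, -150),                                  # development fails
    ])),
    ("don't develop", -50),                           # lose share to competitor
])
print(rollback(tree))
```

Because the recursion evaluates the Point A decision before the initial one, it reproduces the "decide now what alternatives will be important at that time" logic: later choices are resolved first, and their values propagate back to the root.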
Of course, the gains must be considered along with the risks. For example, a company’s shareholders may view a particular investment as one of a number of possibilities, some of which will work and some of which will fail. A major investment can pose a risk to a middle manager – to his job and his career – no matter what decision is made. Another participant may have much to gain from success, but little to lose from project failure. The nature of the risk, as each individual sees it, will affect not only the assumptions he is willing to make, but also the strategy he will follow to deal with the risk.
The existence of multiple, unstated, and conflicting goals will certainly contribute to the politics of choice, and you can be sure that the political element exists whenever people's lives and ambitions are affected. Continuing with the last example, it is not a bad exercise to think about who the parties to an investment decision are and to attempt these assessments:
What is at risk? Is it the profits or the value of the capital, the survival of the company, the maintenance of a job, the opportunity of an important career?
Who is at risk? The shareholder usually bears the risk in one way. Management, employees, the community, they can all take different risks.
What is the nature of the risk that each person bears? Is it, in that person's terms, unique, once-in-a-lifetime, sequential, insurable? Does it affect the economy, the industry, the company, or a part of the company?
Considerations like these are sure to enter into top management's thinking, and the decision tree does not eliminate them. But the tree will show management which current decision will contribute most to its long-term goals.
In many cases, the uncertain elements take the form of discrete, single-variable alternatives. In others, however, the cash-flow possibilities during a stage may cover a whole spectrum and depend on a series of independent or partially related variables that are subject to chance influences: cost, demand, performance, and so on.
Here, the range of variability or the probability of falling within a given range during a stage can be easily calculated from knowledge of the key variables and the uncertainties surrounding them. The range of possibilities during the stage can then be broken down into two, three or more ‘subsets’, which can be used as discrete random alternatives.
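Breaking a continuous range into discrete chance alternatives can be sketched with the standard library's NormalDist. The normal distribution, its parameters, and the tercile cut points below are all assumptions; any estimated distribution and any banding could be used.

```python
from statistics import NormalDist

# Hypothetical cash-flow forecast for one stage: mean 100, std. dev. 30.
flow = NormalDist(mu=100, sigma=30)

# Cut the spectrum at the tercile boundaries to get three discrete
# "subsets" that can serve as chance branches in the tree.
low_cut, high_cut = flow.inv_cdf(1 / 3), flow.inv_cdf(2 / 3)

branches = [
    ("low",    flow.cdf(low_cut)),                       # P(X <= low_cut)
    ("middle", flow.cdf(high_cut) - flow.cdf(low_cut)),  # P(low < X <= high)
    ("high",   1.0 - flow.cdf(high_cut)),                # P(X > high_cut)
]
for name, p in branches:
    print(f"{name:6s} branch: probability {p:.3f}")
```

Each band's probability (here one third apiece, by construction) becomes the probability on a chance branch, and a representative value from the band (such as its midpoint or conditional mean) becomes the branch's payoff.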
Why should you make a decision tree?
Now that you know exactly what a decision tree is, it’s time to consider why this methodology is so effective.
Decision trees are flexible
Decision trees are nonlinear, which gives you far more flexibility to explore, plan, and predict various possible outcomes of your decisions, regardless of when they occur.
Decision trees effectively communicate complex processes
Decision trees visually demonstrate cause-and-effect relationships, providing a simplified view of a potentially complicated process. They are also straightforward and easy to understand, even if you have never created one before.
Decision trees focus on probability and data, not emotions and biases
While it can certainly be helpful to consult others when making an important decision, relying too much on the opinions of your colleagues, friends, or family members can be risky. For starters, they may not have all the information. Furthermore, their advice may be influenced by their own personal biases, rather than hard facts or probabilities.
Decision trees, by contrast, provide a balanced view of the decision-making process, while calculating both risk and reward.
Decision trees clarify options, risks, goals, and payoffs
A great advantage of decision trees is their predictive framework, which allows different possibilities to be plotted and, ultimately, to determine which course of action has the highest probability of success. This, in turn, helps protect your decisions against unnecessary risk or undesirable outcomes.
Decision trees also allow for a more creative approach to the decision-making process. By visualizing different paths you could take, you might find a course of action you hadn’t considered before, or decide to combine paths to optimize your results. Visualizing your decision-making process can also ease uncertainties and help clarify your position.
Decision Tree Best Practices
Keep it simple
Use data to predict outcomes
When building your decision tree, you will have to make some guesses. It's okay to be unsure; no one expects you to pull out a crystal ball. That said, your decision tree will be much more useful if it takes real data into account when determining possible outcomes. A simple action-plan flowchart will make it easier to reach the right decisions from the data.
Use a professionally designed decision tree template
Using a professionally designed template can make your decision tree more attractive to clients, team members, and stakeholders in your project.
Make better decisions by using decision trees
Decision trees can dramatically improve your decision-making ability. The process of identifying the big decision ("root"), possible courses of action ("branches"), and possible outcomes ("leaves"), as well as assessing the risks, rewards, and odds of success, will give you a bird's-eye view of the decision-making process.