A Regression Tree in Dungeons and Dragons
Directions
Predict HP using D&D character attributes and the Force :-)
1. Libraries
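The library chunk itself isn't shown; the startup messages below suggest something like the following (the exact package list is an assumption reconstructed from those messages):

```r
# Reconstructed library calls (an assumption based on the startup
# messages below): rpart fits regression trees, rpart.plot provides
# prp() for plotting them, forecast provides accuracy() for the error
# tables, and caret loads lattice and ggplot2 as dependencies.
library(rpart)
library(rpart.plot)
library(forecast)
library(caret)
```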
## Registered S3 method overwritten by 'xts':
## method from
## as.zoo.xts zoo
## Registered S3 method overwritten by 'quantmod':
## method from
## as.zoo.data.frame zoo
## Registered S3 methods overwritten by 'forecast':
## method from
## fitted.fracdiff fracdiff
## residuals.fracdiff fracdiff
## Loading required package: lattice
## Loading required package: ggplot2
2. Load data
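The loading chunk isn't reproduced, but the output below is what `head()`, `str()`, `names()`, and `nrow()` print for a data frame. A minimal sketch of those inspection calls on a toy stand-in (the real file name and object name are not shown, so everything here is illustrative):

```r
# Toy stand-in for the character data; the real frame has 734 rows
# and 26 columns, as the str() output below shows.
toy <- data.frame(Name = c("A-Bomb", "Abe Sapien"),
                  STR  = c(18L, 16L),
                  HP   = c(7L, 72L),
                  stringsAsFactors = TRUE)
head(toy)    # first rows of the frame
str(toy)     # number of observations and the type of each column
names(toy)   # the column names
nrow(toy)    # the row count
```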
## ID Name Gender Race Height Publisher
## 1 A001 A-Bomb Male Human 203 Marvel Comics
## 2 A002 Abe Sapien Male Icthyo Sapien 191 Dark Horse Comics
## 3 A004 Abomination Male Human / Radiation 203 Marvel Comics
## 4 A009 Agent 13 Female <NA> 173 Marvel Comics
## 5 A015 Alex Mercer Male Human NA Wildstorm
## 6 A016 Alex Woolsly Male <NA> NA NBC - Heroes
## 7 A024 Angel Male Vampire NA Dark Horse Comics
## 8 A025 Angel Dust Female Mutant 165 Marvel Comics
## 9 A028 Animal Man Male Human 183 DC Comics
## 10 A032 Anti-Monitor Male God / Eternal 61 DC Comics
## Alignment Weight Manipulative Resourceful Dismissive Intelligent Trusting
## 1 good 441 10 10 7 6 7
## 2 good 65 7 7 6 8 6
## 3 bad 441 6 8 1 6 3
## 4 good 61 7 7 1 9 7
## 5 bad NA 10 6 8 3 4
## 6 good NA 8 10 5 5 6
## 7 good NA 8 6 8 7 4
## 8 good 57 9 8 9 4 1
## 9 good 83 7 6 6 5 8
## 10 bad NA 7 7 7 1 9
## Loyal Stubborn Brave HouseID House STR DEX CON INT WIS CHA Level HP
## 1 7 7 9 1 Slytherin 18 11 17 12 13 11 1 7
## 2 7 6 9 1 Slytherin 16 17 10 13 15 11 8 72
## 3 3 5 2 1 Slytherin 13 14 13 10 18 15 15 135
## 4 4 6 6 1 Slytherin 15 18 16 16 17 10 14 140
## 5 4 1 8 1 Slytherin 14 17 13 12 10 11 9 72
## 6 7 7 6 1 Slytherin 14 14 11 13 12 12 1 8
## 7 1 5 2 1 Slytherin 15 17 15 18 13 18 11 88
## 8 6 5 4 1 Slytherin 8 17 12 15 17 18 1 8
## 9 3 3 2 1 Slytherin 10 17 15 18 13 14 8 56
## 10 1 6 5 1 Slytherin 8 10 11 16 12 11 7 63
## 'data.frame': 734 obs. of 26 variables:
## $ ID : Factor w/ 734 levels "A001","A002",..: 1 2 4 9 15 16 24 25 28 32 ...
## $ Name : Factor w/ 715 levels "A-Bomb","Abe Sapien",..: 1 2 4 9 15 16 23 24 27 31 ...
## $ Gender : Factor w/ 2 levels "Female","Male": 2 2 2 1 2 2 2 1 2 2 ...
## $ Race : Factor w/ 61 levels "Alien","Alpha",..: 24 33 32 NA 24 NA 57 43 24 21 ...
## $ Height : num 203 191 203 173 NA NA NA 165 183 61 ...
## $ Publisher : Factor w/ 25 levels "","ABC Studios",..: 13 3 13 13 25 15 3 13 4 4 ...
## $ Alignment : Factor w/ 3 levels "bad","good","neutral": 2 2 1 2 1 2 2 2 2 1 ...
## $ Weight : int 441 65 441 61 NA NA NA 57 83 NA ...
## $ Manipulative: int 10 7 6 7 10 8 8 9 7 7 ...
## $ Resourceful : int 10 7 8 7 6 10 6 8 6 7 ...
## $ Dismissive : int 7 6 1 1 8 5 8 9 6 7 ...
## $ Intelligent : int 6 8 6 9 3 5 7 4 5 1 ...
## $ Trusting : int 7 6 3 7 4 6 4 1 8 9 ...
## $ Loyal : int 7 7 3 4 4 7 1 6 3 1 ...
## $ Stubborn : int 7 6 5 6 1 7 5 5 3 6 ...
## $ Brave : int 9 9 2 6 8 6 2 4 2 5 ...
## $ HouseID : int 1 1 1 1 1 1 1 1 1 1 ...
## $ House : Factor w/ 4 levels "Gryffindor","Hufflepuff",..: 4 4 4 4 4 4 4 4 4 4 ...
## $ STR : int 18 16 13 15 14 14 15 8 10 8 ...
## $ DEX : int 11 17 14 18 17 14 17 17 17 10 ...
## $ CON : int 17 10 13 16 13 11 15 12 15 11 ...
## $ INT : int 12 13 10 16 12 13 18 15 18 16 ...
## $ WIS : int 13 15 18 17 10 12 13 17 13 12 ...
## $ CHA : int 11 11 15 10 11 12 18 18 14 11 ...
## $ Level : int 1 8 15 14 9 1 11 1 8 7 ...
## $ HP : int 7 72 135 140 72 8 88 8 56 63 ...
## [1] "ID" "Name" "Gender" "Race" "Height"
## [6] "Publisher" "Alignment" "Weight" "Manipulative" "Resourceful"
## [11] "Dismissive" "Intelligent" "Trusting" "Loyal" "Stubborn"
## [16] "Brave" "HouseID" "House" "STR" "DEX"
## [21] "CON" "INT" "WIS" "CHA" "Level"
## [26] "HP"
## [1] 734
Remove unnecessary variables for this model
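The selection chunk isn't shown; a sketch of keeping only the six ability scores plus the target HP (the name `full_df` and its one toy row are illustrative; the document's later chunks call the reduced frame `hogwarts`):

```r
# Toy one-row stand-in for the full 26-column frame loaded above.
full_df <- data.frame(ID = "A001", Name = "A-Bomb", Level = 1L,
                      STR = 18L, DEX = 11L, CON = 17L,
                      INT = 12L, WIS = 13L, CHA = 11L, HP = 7L)
keep <- c("STR", "DEX", "CON", "INT", "WIS", "CHA", "HP")
hogwarts <- full_df[, keep]
names(hogwarts)        # the seven remaining columns
t(t(names(hogwarts)))  # the same names as a one-column matrix,
                       # matching the "[,1]" display below
```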
## [1] "STR" "DEX" "CON" "INT" "WIS" "CHA" "HP"
Look at the new column order
## [,1]
## [1,] "STR"
## [2,] "DEX"
## [3,] "CON"
## [4,] "INT"
## [5,] "WIS"
## [6,] "CHA"
## [7,] "HP"
## 'data.frame': 734 obs. of 7 variables:
## $ STR: int 18 16 13 15 14 14 15 8 10 8 ...
## $ DEX: int 11 17 14 18 17 14 17 17 17 10 ...
## $ CON: int 17 10 13 16 13 11 15 12 15 11 ...
## $ INT: int 12 13 10 16 12 13 18 15 18 16 ...
## $ WIS: int 13 15 18 17 10 12 13 17 13 12 ...
## $ CHA: int 11 11 15 10 11 12 18 18 14 11 ...
## $ HP : int 7 72 135 140 72 8 88 8 56 63 ...
## < table of extent 0 >
## [1] 734
3. Training-validation split
We’re using our favourite seed number, but you can use any seed you like. Note that your results may differ slightly with a different seed.
set.seed(666)
train_index <- sample(1:nrow(hogwarts), 0.6 * nrow(hogwarts))
valid_index <- setdiff(1:nrow(hogwarts), train_index)
train_df <- hogwarts[train_index, ]
valid_df <- hogwarts[valid_index, ]
nrow(train_df)
## [1] 440
nrow(valid_df)
## [1] 294
head(train_df)
## STR DEX CON INT WIS CHA HP
## 574 13 11 12 18 12 13 72
## 638 8 13 18 18 18 15 120
## 608 15 13 17 10 10 12 98
## 123 13 10 11 11 10 17 63
## 540 8 10 18 12 18 17 40
## 654 14 12 12 13 18 11 54
head(valid_df)
## STR DEX CON INT WIS CHA HP
## 2 16 17 10 13 15 11 72
## 3 13 14 13 10 18 15 135
## 5 14 17 13 12 10 11 72
## 12 13 14 12 17 12 11 30
## 13 10 14 15 17 12 16 84
## 14 16 14 14 13 10 12 36
str(train_df)
## 'data.frame': 440 obs. of 7 variables:
## $ STR: int 13 8 15 13 8 14 9 11 15 17 ...
## $ DEX: int 11 13 13 10 10 12 16 18 14 18 ...
## $ CON: int 12 18 17 11 18 12 13 15 11 11 ...
## $ INT: int 18 18 10 11 12 13 14 16 18 11 ...
## $ WIS: int 12 18 10 10 18 18 11 10 18 16 ...
## $ CHA: int 13 15 12 17 17 11 18 11 11 10 ...
## $ HP : int 72 120 98 63 40 54 140 66 56 56 ...
str(valid_df)
## 'data.frame': 294 obs. of 7 variables:
## $ STR: int 16 13 14 13 10 16 11 11 9 11 ...
## $ DEX: int 17 14 17 14 14 14 16 13 11 13 ...
## $ CON: int 10 13 13 12 15 14 18 10 15 12 ...
## $ INT: int 13 10 12 17 17 13 14 13 12 16 ...
## $ WIS: int 15 18 10 12 12 10 15 10 14 15 ...
## $ CHA: int 11 15 11 11 16 12 14 10 13 14 ...
## $ HP : int 72 135 72 30 84 36 140 140 72 21 ...
4. Regression tree
## [1] "STR" "DEX" "CON" "INT" "WIS" "CHA" "HP"
4.1 Large tree.
A large tree is harder to read, and with many splits it risks overfitting the training data, so it may not be very useful.
regress_tr <- rpart(HP ~ STR + DEX + CON + INT + WIS + CHA,
data = train_df, method = "anova", maxdepth = 20)
prp(regress_tr)
predict_train <- predict(regress_tr, train_df)
accuracy(predict_train, train_df$HP)
## ME RMSE MAE MPE MAPE
## Test set -2.391647e-15 35.81544 29.87884 -66.19501 93.24161
predict_valid <- predict(regress_tr, valid_df)
accuracy(predict_valid, valid_df$HP)
## ME RMSE MAE MPE MAPE
## Test set -8.012589 37.49316 30.60095 -82.68468 104.7522
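For reference, the columns in these accuracy tables are simple summaries of the prediction errors; a self-contained sketch of their definitions (the numbers here are toy values, not the model's):

```r
actual <- c(72, 135, 72)   # toy HP values, not the real column
pred   <- c(60, 120, 80)   # toy predictions
err    <- actual - pred    # prediction errors
c(ME   = mean(err),                      # mean error (bias)
  RMSE = sqrt(mean(err^2)),              # root mean squared error
  MAE  = mean(abs(err)),                 # mean absolute error
  MPE  = 100 * mean(err / actual),       # mean percentage error
  MAPE = 100 * mean(abs(err / actual)))  # mean absolute percentage error
```

Note that the validation RMSE above (about 37.5) is higher than the training RMSE (about 35.8), the usual sign of a tree fitting the training data too closely.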
4.2 Shallower tree.
A shallower tree may be easier to read and use, but it may also be overly simplistic.
regress_tr_shallow <- rpart(HP ~ STR + DEX + CON + INT + WIS + CHA,
data = train_df, method = "anova",
minbucket = 2, maxdepth = 3)
prp(regress_tr_shallow)
predict_train_shallow <- predict(regress_tr_shallow, train_df)
accuracy(predict_train_shallow, train_df$HP)
## ME RMSE MAE MPE MAPE
## Test set -2.083927e-15 37.61598 32.04895 -75.22096 103.9582
predict_valid_shallow <- predict(regress_tr_shallow, valid_df)
accuracy(predict_valid_shallow, valid_df$HP)
## ME RMSE MAE MPE MAPE
## Test set -7.670169 34.73232 28.2203 -81.03383 99.89169
5. Predict new record
## [1] "STR" "DEX" "CON" "INT" "WIS" "CHA" "HP"
Using the large tree.
padawan_1 <- data.frame(STR = 11,
DEX = 17,
CON = 14,
INT = 17,
WIS = 17,
CHA = 13)
regress_tr_pred <- predict(regress_tr, newdata = padawan_1)
regress_tr_pred
## 1
## 57.18182
Using the shallow tree.
regress_tr_shallow_pred <- predict(regress_tr_shallow, newdata = padawan_1)
regress_tr_shallow_pred
## 1
## 71.62176
Compare the results. The two trees give noticeably different predictions for the same new character, so the choice of tree depth matters even at prediction time.
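The two point predictions reported above can be set side by side (the values are copied from the output):

```r
# Predictions for padawan_1 from the two trees, as printed above.
preds <- c(large = 57.18182, shallow = 71.62176)
preds
diff(range(preds))   # the two trees disagree by about 14.4 HP
```

Neither number is "right" on its own; the validation metrics in section 4, where the shallow tree has the lower RMSE (about 34.7 vs 37.5), are the better guide to which depth generalises.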