
Expose feature names #14

Merged · 3 commits · Mar 2, 2022
Conversation

@ablaom (Member) commented Mar 1, 2022

Context: https://github.com/bensadeghi/DecisionTree.jl/issues/147

Addresses #13:

```julia
using MLJ
tree = (@load DecisionTreeClassifier pkg=DecisionTree)()
X, y = @load_iris
mach = machine(tree, X, y) |> fit!
```

```julia
julia> fitted_params(mach)
(tree = Decision Tree
Leaves: 9
Depth:  5,
 encoding = Dict{CategoricalArrays.CategoricalValue{String, UInt32}, UInt32}("virginica" => 0x00000003, "setosa" => 0x00000001, "versicolor" => 0x00000002),
 features = [:sepal_length, :sepal_width, :petal_length, :petal_width],)

julia> MLJ.report(mach)
(classes_seen = CategoricalArrays.CategoricalValue{String, UInt32}["setosa", "versicolor", "virginica"],
 print_tree = TreePrinter object (call with display depth),
 features = [:sepal_length, :sepal_width, :petal_length, :petal_width],)
```

@ablaom (Member, Author) commented Mar 1, 2022

@roland-KA Could you confirm this meets requirements? If you have the time, a review would be nice.

@roland-KA (Collaborator) commented Mar 1, 2022

That looks good 👍. If the index values of `features` correspond to the `featid` values of a `DecisionTree.Node`, then it's exactly what we need.

With a short test on two different data sets, I could confirm that they do.
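The correspondence described above could be checked with a sketch along the following lines. This is a hypothetical helper, not part of MLJ or DecisionTreeInterface; it assumes `fitted_params(mach).tree` is the raw DecisionTree.jl root node with `featid`, `left`, and `right` fields, which may differ across package versions:

```julia
using MLJ
import DecisionTree

# Hypothetical helper: walk the raw tree and print the feature name used
# at each split, looking up each node's `featid` in the `features` vector
# exposed by `fitted_params`.
function print_split_features(node, features)
    node isa DecisionTree.Leaf && return           # leaves carry no split
    println("split on ", features[node.featid])    # featid indexes into features
    print_split_features(node.left, features)
    print_split_features(node.right, features)
end

tree = (@load DecisionTreeClassifier pkg=DecisionTree)()
X, y = @load_iris
mach = machine(tree, X, y) |> fit!
fp = fitted_params(mach)
print_split_features(fp.tree, fp.features)
```

If the printed names match the splits shown by `report(mach).print_tree`, the indices line up as expected.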

@ablaom merged commit 4c1f0fc into dev on Mar 2, 2022