Herbal plants have been used as traditional medicine from ancient times to the present. Various studies have classified herbal plant leaves using methods such as Naive Bayes, KNN, BNN, and VGG16. The dataset in this study consists of 10 classes: starfruit, guava, lime, basil, aloe vera, jackfruit, pandanus, papaya, celery, and betel. The dataset then goes through preprocessing and data-splitting stages. The EfficientNetV2B0 model was chosen because it combines a compact architecture with high effectiveness, and several layers are added on top of it to reduce overfitting. With this model, the study achieves an accuracy of 99.14% and a loss of 1.95% on the test data.
The dataset contains 10 classes of herbal leaves:
- Belimbing Wuluh (starfruit)
- Jambu Biji (guava)
- Jeruk Nipis (lime)
- Kemangi (basil)
- Lidah Buaya (aloe vera)
- Nangka (jackfruit)
- Pandan (pandanus)
- Pepaya (papaya)
- Seledri (celery)
- Sirih (betel)
The dataset is available at https://data.mendeley.com/datasets/s82j8dh4rr
On top of the EfficientNetV2B0 base model, we add an output layer sized to the number of target classes in the dataset. Before the output layer, we add MaxPooling2D, Flatten, and Batch Normalization layers to reduce overfitting and produce better results. MaxPooling2D reduces the spatial dimensions of the feature representation produced by the previous layer. Flatten then converts the multidimensional feature representation into a one-dimensional vector that can feed a fully connected layer. Batch Normalization helps speed up training and, similar in effect to L2 regularization, helps counter overfitting. The entire model can be seen in the image below.
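In addition to the diagram, here is a minimal Keras sketch of the architecture described above. The 224×224×3 input size, the ImageNet weights, and the softmax activation are assumptions; the README states only the added layers, the 10 classes, and the Adam optimizer with lr=0.001.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Minimal sketch of the described architecture. The (224, 224, 3) input
# size and ImageNet weights are assumptions; the README does not state them.
base_model = tf.keras.applications.EfficientNetV2B0(
    include_top=False,           # drop the original ImageNet classifier head
    weights="imagenet",
    input_shape=(224, 224, 3),
)

model = models.Sequential([
    base_model,
    layers.MaxPooling2D(),       # reduce the spatial dimensions of the feature maps
    layers.Flatten(),            # collapse the feature maps into a 1-D vector
    layers.BatchNormalization(), # stabilize and speed up training
    layers.Dense(10, activation="softmax"),  # one unit per herbal leaf class
])

model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=0.001),  # lr from the README
    loss="categorical_crossentropy",  # assumption: one-hot encoded labels
    metrics=["accuracy"],
)
model.summary()
```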
The model was trained on an Apple M1 GPU (optimizer='adam', lr=0.001, batch_size=128, epochs=30).
The data was split into three sets: train, validation, and test (a sketch of the split and training step follows).
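Here is a minimal sketch of loading, splitting, and training, continuing from the model sketch above. The directory path, the use of image_dataset_from_directory, and the 80/10/10 split ratio are assumptions; the README specifies only the three-way split and the training hyperparameters.

```python
import tensorflow as tf

# Hypothetical local path with one sub-folder per class, as in the
# Mendeley dataset. The 80/10/10 split ratio is an assumption.
full_ds = tf.keras.utils.image_dataset_from_directory(
    "herbal_leaves/",
    image_size=(224, 224),       # matches the assumed model input size
    batch_size=None,             # yield individual samples; batch after splitting
    label_mode="categorical",    # one-hot labels for categorical_crossentropy
    shuffle=False,               # shuffle once below so the splits never overlap
)

n = int(tf.data.experimental.cardinality(full_ds))
# Shuffle once, deterministically, so take/skip give disjoint splits
# on every iteration (buffer of n keeps the order fixed across epochs).
full_ds = full_ds.shuffle(n, seed=42, reshuffle_each_iteration=False)

# 80% train, 10% validation, 10% test.
train_ds = full_ds.take(int(0.8 * n)).batch(128)
rest_ds = full_ds.skip(int(0.8 * n))
val_ds = rest_ds.take(int(0.1 * n)).batch(128)
test_ds = rest_ds.skip(int(0.1 * n)).batch(128)

# Train with the hyperparameters reported above (batch_size=128, epochs=30).
history = model.fit(train_ds, validation_data=val_ds, epochs=30)
```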
The model was evaluated on the test data, with the following results (a sketch of the evaluation code follows the list):
- Accuracy = 99.14%
- Loss = 1.95%
- Macro avg. Precision = 99%
- Macro avg. Recall = 99%
- Macro avg. F1 = 99%
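A minimal sketch of how these metrics could be computed, continuing from the sketches above. Using scikit-learn's classification_report for the macro averages is an assumption; the README reports only the resulting numbers.

```python
import numpy as np
from sklearn.metrics import classification_report

# Overall loss and accuracy on the held-out test set.
loss, accuracy = model.evaluate(test_ds)
print(f"Accuracy = {accuracy:.2%}, Loss = {loss:.2%}")

# Macro-averaged precision / recall / F1 per class. Iterating batch by
# batch keeps predictions aligned with their true labels.
y_true, y_pred = [], []
for images, labels in test_ds:
    probs = model.predict(images, verbose=0)
    y_true.extend(np.argmax(labels.numpy(), axis=1))
    y_pred.extend(np.argmax(probs, axis=1))
print(classification_report(y_true, y_pred, digits=2))
```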
- Stay tuned :D