Naive Bayes Classification (Binary) - Supervised Learning


This example is a good starting point for applying machine learning to a classification problem. In the code snippet below, we apply supervised learning with the naive Bayes classifier. The naive Bayes classifier is built on Bayes' theorem and basic conditional probability. The dataset used in the example is the Breast Cancer dataset, which we load with the sklearn function load_breast_cancer(). It holds records for 569 patients and 30 features computed from digitized images of fine needle aspirates (FNA) of breast masses; example features are radius, texture, perimeter, area, smoothness, and compactness. To keep this example simple, we pick only the first two features. The target is binary, with two classes (malignant and benign). The dataset is split 50/50 into training and testing sets to validate the trained classifier: 284 patients for training and 285 for testing. We measure the outcome of the validation with performance measures such as precision, recall, and f-measure.
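For intuition before the full demo, the short sketch below shows the Bayes' theorem calculation that a Gaussian naive Bayes classifier performs internally: the posterior of a class is proportional to its prior times the product of per-feature Gaussian likelihoods. This sketch is for illustration only; the helper gaussian_nb_posterior and its arguments are made up here and are not part of the demo code, and it uses scipy (a dependency of sklearn) for the normal density.

Code:
# Illustration only: how a Gaussian naive Bayes posterior is computed for one sample.
# P(class | x) is proportional to P(class) * product over features of P(x_i | class),
# where each P(x_i | class) is a normal distribution whose mean and standard deviation
# are estimated per class from the training data.
import numpy as np
from scipy.stats import norm

def gaussian_nb_posterior(x, class_means, class_stds, class_priors):
    # x: 1-D feature vector; class_means/class_stds/class_priors are per-class
    # arrays assumed to have been estimated from the training set.
    log_scores = []
    for mean, std, prior in zip(class_means, class_stds, class_priors):
        log_likelihood = np.sum(norm.logpdf(x, loc=mean, scale=std))
        log_scores.append(np.log(prior) + log_likelihood)
    log_scores = np.array(log_scores)
    # Convert log-scores back to normalized class probabilities
    probs = np.exp(log_scores - log_scores.max())
    return probs / probs.sum()
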


Code:
# https://jupyter.org/try
# Demo2
# M. S. Rakha, Ph.D.
# Post-Doctoral - Queen's University
# Supervised Learning - Naive Bayes Classification
%matplotlib inline
import numpy as np
import pandas as pd
from sklearn import datasets
from sklearn.preprocessing import scale
import sklearn.metrics as sm
from sklearn.metrics import confusion_matrix,classification_report
from sklearn.model_selection import train_test_split

np.random.seed(5)
breastCancer = datasets.load_breast_cancer()

# Class labels of the target: ['malignant', 'benign']
list(breastCancer.target_names)

# Use only the first two features (mean radius and mean texture)
X = breastCancer.data[:, 0:2]
y = breastCancer.target


# 50/50 split: 284 training samples, 285 testing samples
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.50, random_state=42)
X_train[:,0].size

variableNames = breastCancer.feature_names


from sklearn.naive_bayes import GaussianNB
# Fit a Gaussian naive Bayes classifier on the training split
nb = GaussianNB()
nb.fit(X_train, y_train);

# Predict labels for the testing split and report performance
y_pred = nb.predict(X_test)

print(classification_report(y_test, y_pred))
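# Optional addition (not part of the original run): confusion_matrix is imported
# above, so the raw counts of correct and incorrect predictions per class can
# also be printed alongside the classification report.
print(confusion_matrix(y_test, y_pred))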



Below are the results of running this Python code in a Jupyter notebook:
Code:
              precision    recall  f1-score   support

           0       0.93      0.76      0.83        98
           1       0.88      0.97      0.92       187

    accuracy                           0.89       285
   macro avg       0.90      0.86      0.88       285
weighted avg       0.90      0.89      0.89       285
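
Using only the first two features keeps the demo simple but also limits the classifier. As a follow-up experiment (a sketch only; no results are reported for it here, so the numbers above apply to the two-feature run), the same pipeline can be re-run on all 30 features:

Code:
# Follow-up sketch: retrain on all 30 features and compare against the report above.
# Assumes the imports and variables (breastCancer, y) from the demo code.
X_all = breastCancer.data
X_train_all, X_test_all, y_train_all, y_test_all = train_test_split(
    X_all, y, test_size=0.50, random_state=42)
nb_all = GaussianNB().fit(X_train_all, y_train_all)
print(classification_report(y_test_all, nb_all.predict(X_test_all)))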




_________________
M. S. Rakha, Ph.D.
Queen's University
Canada











