CISC 5800: Machine Learning

Final Project
Due: December 13 (extended from December 8). Plan to spend at least 30 hours over the month.

For the final project, you will take a data set and use at least two classification approaches to distinguish classes in your data. This project requires scientific experimentation, programming, and a written report.

For the project you must:

Your grade will be calculated as follows:

The data set:
You will use the "SPECTF Heart Data Set" provided by the University of California, Irvine. It uses 44 features to predict the presence of a heart condition; the first value in each row is the class indicator, and the remaining 44 values are the features. Read over the documentation for the data set on the Irvine web site. All feature values in our .mat file are the same as the feature values listed in the documentation.

You may download the data directly from the UC Irvine site, using the CSV files SPECTF.train and SPECTF.test. The CSV files will be available for download from our course site in the next week. To load into Matlab, you cannot use the standard "load" command. Instead use importdata, e.g.
trainData=importdata('SPECTF.train');
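If you work in Python instead, a comparable sketch using NumPy's loadtxt (the two inline sample rows below are hypothetical, just to show the comma-separated layout with the class label in the first column):

```python
import io
import numpy as np

# In practice you would load the file directly:
#   data = np.loadtxt('SPECTF.train', delimiter=',')
# Sketch with two hypothetical inline rows:
sample = io.StringIO("1,59,52,70\n0,67,71,64")
data = np.loadtxt(sample, delimiter=',')

labels = data[:, 0].astype(int)   # first column: class indicator
features = data[:, 1:]            # remaining columns: feature values
```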

The classifiers:
You must use at least two of the following:

You are also welcome (but not required) to explore one or more other methods not listed. You may convert numeric features to discrete category data if you wish.
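One simple way to discretize a numeric feature is to bin it against chosen thresholds. A minimal NumPy sketch (the feature values and thresholds here are hypothetical; in practice you would pick thresholds by inspecting the data's range):

```python
import numpy as np

x = np.array([12.0, 47.5, 63.2, 71.8, 55.0])  # one numeric feature (hypothetical values)
bins = np.array([40.0, 60.0])                  # thresholds chosen by inspecting the data
categories = np.digitize(x, bins)              # 0 = low, 1 = medium, 2 = high
```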

You must also explore classifying based on a subset of features and/or on dimensionality-reduction techniques such as:

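One common dimensionality-reduction technique is principal component analysis (PCA); whether you use it is up to you. A minimal NumPy sketch, assuming you keep the top k components (the random matrix stands in for the 44-feature data):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 44))          # stand-in for 100 examples of 44 features

# Center the data, then eigendecompose the covariance matrix
Xc = X - X.mean(axis=0)
cov = (Xc.T @ Xc) / (len(Xc) - 1)
eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
order = np.argsort(eigvals)[::-1]       # re-sort descending by explained variance

k = 5
W = eigvecs[:, order[:k]]               # projection matrix (44 x k)
Z = Xc @ W                              # reduced representation (100 x k)
```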
The experiments:
As we have discussed in class, each classification/learning method potentially has a variety of settings and hyper-parameters to manipulate. Possible settings and hyper-parameters include:

You are to experiment with these or related parameters and their effects on learning. Your experiments must be well thought out. In your report, you must explain your justification for the different parameter values you have tried --- e.g., based on your understanding of the learning methods and of the data.

For each classifier method, you should explore the effects of varying at least three settings/hyper-parameters, trying at least five different values per setting/hyper-parameter. For example, for logistic classification with gradient ascent, you could vary the step size ε and the weight λ of an L1-regularizer. You could evaluate learning accuracy for the following (ε, λ) settings:

ε:  0.1  0.2  0.3  0.4  0.5  0.1  0.1  0.1  0.1
λ:  10   10   10   10   10   20   30   40   50

This would constitute 5 different values for each of the two hyper-parameters.
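An experiment over those settings can be sketched as below. This is a minimal illustration, not a required implementation: it trains an L1-regularized logistic classifier by gradient ascent (using a subgradient for the penalty) on synthetic stand-in data, looping over the nine (ε, λ) pairs:

```python
import numpy as np

def train_logistic(X, y, step, lam, iters=200):
    """Gradient ascent on the L1-regularized log-likelihood (L1 handled via subgradient)."""
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ w))         # P(y=1 | x) under current weights
        grad = X.T @ (y - p) - lam * np.sign(w)  # likelihood gradient minus L1 subgradient
        w += step * grad / len(y)
    return w

def accuracy(w, X, y):
    return float(np.mean(((X @ w) > 0).astype(int) == y))

# Synthetic stand-in data (replace with the SPECTF training set)
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 10))
true_w = rng.normal(size=10)
y = (X @ true_w > 0).astype(int)

# The nine (step size, lambda) settings from the table above
pairs = list(zip([0.1, 0.2, 0.3, 0.4, 0.5, 0.1, 0.1, 0.1, 0.1],
                 [10, 10, 10, 10, 10, 20, 30, 40, 50]))
results = {(e, l): accuracy(train_logistic(X, y, e, l), X, y) for e, l in pairs}
```

In a real experiment you would evaluate each setting on held-out data (e.g., SPECTF.test or a validation split), not on the training set as this sketch does.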

Graded materials:
You must submit: (1) your complete Matlab/Python code, and (2) your 6–10 page report.

Your code must include:


Your report must include:

Time commitment: This project should take you at least 30 hours over a month span.

Due date: The project will be due December 13 (extended from December 8).