Workshop 2

Before the workshop

  • Watch lectures 5 to 8 and read the corresponding chapters in the course literature
  • Install TensorFlow and Keras on your laptop
  • Prepare questions (if you have any) on the contents in lectures 5 to 8

Aim of the workshop

The aim of this workshop is to discuss the contents of lectures 5 to 8 and to work on practical assignments based on what you have learned in the lectures.

Assignments

A1: Spiral dataset in the Weka tool

  • Classify the Spiral dataset in Weka using the Linear (functions/LibLINEAR), Neural Network (functions/MultilayerPerceptron) and Support Vector Machine (functions/LibSVM) algorithms
  • The Spiral dataset can be downloaded from the Datasets page
  • Why does the Linear classifier have much lower accuracy than the Neural Network and the Support Vector Machine?
  • Note that Weka automatically tries to determine the size of the hidden layer. It often uses too few hidden units to learn the concept accurately. Try changing the hiddenLayers field from a (the automatic setting) to 72.



A2: Diabetes dataset using the Weka library

  • Write Java code for classifying the Diabetes dataset in Weka using the Neural Network and Random Forest algorithms



A3: Spiral dataset in Scikit

  • Classify the Spiral dataset in Scikit using a Neural Network algorithm
  • You need to write code for loading CSV dataset files (use the Pandas library); see the sketch below
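
A minimal sketch for A3 using scikit-learn's MLPClassifier and Pandas. The file name spiral.csv and the label column name "class" are assumptions; adjust them to match the downloaded dataset.

    import pandas as pd
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier
    from sklearn.metrics import accuracy_score

    # Load the dataset (hypothetical file name and label column)
    data = pd.read_csv("spiral.csv")
    X = data.drop(columns=["class"]).values
    y = data["class"].values

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=1)

    # A fairly wide hidden layer, since the spiral is not linearly separable
    clf = MLPClassifier(hidden_layer_sizes=(72,), max_iter=2000, random_state=1)
    clf.fit(X_train, y_train)
    print("Test accuracy:", accuracy_score(y_test, clf.predict(X_test)))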



A4: Diabetes dataset in Scikit

  • Classify the Diabetes dataset in Scikit using the Neural Network and XGBoost algorithms
  • You need to write code for loading CSV dataset files (use the Pandas library); see the sketch below
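
A minimal sketch for A4 with scikit-learn's MLPClassifier and the XGBoost Python package. The file name diabetes.csv and the label column name "class" are assumptions; adjust them to match the downloaded dataset.

    import pandas as pd
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier
    from sklearn.metrics import accuracy_score
    from xgboost import XGBClassifier

    # Load the dataset (hypothetical file name and label column)
    data = pd.read_csv("diabetes.csv")
    X = data.drop(columns=["class"]).values
    y = pd.factorize(data["class"])[0]  # encode class labels as integers

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=1)

    for name, clf in [
        ("Neural Network", MLPClassifier(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=1)),
        ("XGBoost", XGBClassifier(n_estimators=200)),
    ]:
        clf.fit(X_train, y_train)
        print(name, "accuracy:", accuracy_score(y_test, clf.predict(X_test)))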



A5: Iris dataset in TensorFlow (optional)

  • Skip this unless you are interested in learning core TF (Keras is easier to use)
  • Write code for loading, training and evaluating the Iris dataset using premade Linear and DNN estimators in TensorFlow
  • See instructions at the TensorFlow page
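
A minimal sketch for A5, assuming a TensorFlow release that still ships the (now-deprecated) tf.estimator API. The CSV file names and column names follow the classic TensorFlow Iris tutorial and are assumptions here; the label column is assumed to be integer-encoded. The instructions on the TensorFlow page take precedence.

    import pandas as pd
    import tensorflow as tf

    COLUMNS = ["SepalLength", "SepalWidth", "PetalLength", "PetalWidth", "Species"]
    train = pd.read_csv("iris_training.csv", names=COLUMNS, header=0)  # hypothetical file names
    test = pd.read_csv("iris_test.csv", names=COLUMNS, header=0)
    y_train, y_test = train.pop("Species"), test.pop("Species")

    def input_fn(features, labels, training=True, batch_size=32):
        # Build a tf.data pipeline from the Pandas columns
        ds = tf.data.Dataset.from_tensor_slices((dict(features), labels))
        if training:
            ds = ds.shuffle(1000).repeat()
        return ds.batch(batch_size)

    feature_columns = [tf.feature_column.numeric_column(key=k) for k in train.keys()]

    # Train and evaluate both premade estimators
    for estimator in (
        tf.estimator.LinearClassifier(feature_columns=feature_columns, n_classes=3),
        tf.estimator.DNNClassifier(feature_columns=feature_columns, hidden_units=[10, 10], n_classes=3),
    ):
        estimator.train(input_fn=lambda: input_fn(train, y_train), steps=2000)
        print(estimator.evaluate(input_fn=lambda: input_fn(test, y_test, training=False)))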



A6: MNIST dataset in TensorFlow using linear classifier (optional)

  • Skip this unless you are interested in learning core TF (Keras is easier to use)
  • Write code for loading, training and evaluating the MNIST dataset using a Linear classifier in TensorFlow
  • See instructions at the TensorFlow page
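
A minimal sketch for A6, again assuming a TensorFlow release with the (now-deprecated) tf.estimator API available; hyperparameters are illustrative and the instructions on the TensorFlow page take precedence.

    import tensorflow as tf

    # MNIST ships with TensorFlow; flatten the images to 784-dimensional vectors
    (x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
    x_train = x_train.reshape(-1, 784).astype("float32") / 255.0
    x_test = x_test.reshape(-1, 784).astype("float32") / 255.0

    def input_fn(images, labels, training=True, batch_size=128):
        ds = tf.data.Dataset.from_tensor_slices(({"pixels": images}, labels.astype("int32")))
        if training:
            ds = ds.shuffle(10000).repeat()
        return ds.batch(batch_size)

    feature_columns = [tf.feature_column.numeric_column("pixels", shape=[784])]
    classifier = tf.estimator.LinearClassifier(feature_columns=feature_columns, n_classes=10)
    classifier.train(input_fn=lambda: input_fn(x_train, y_train), steps=2000)
    print(classifier.evaluate(input_fn=lambda: input_fn(x_test, y_test, training=False)))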



A7: MNIST dataset in TensorFlow using ConvNet (optional)

  • Skip this unless you are interested in learning core TF (Keras is easier to use)
  • Write code for loading, training and evaluating the MNIST dataset using a ConvNet classifier in TensorFlow
  • See instructions at the TensorFlow page
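
A minimal sketch for A7 showing one possible approach: training a ConvNet with a core-TF loop (tf.GradientTape) instead of Keras fit(). Keras layers are used only to define the model, and the architecture and epoch count are illustrative; the approach on the TensorFlow page may differ.

    import tensorflow as tf

    (x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
    x_train = x_train[..., None].astype("float32") / 255.0  # add a channel dimension
    x_test = x_test[..., None].astype("float32") / 255.0

    model = tf.keras.Sequential([
        tf.keras.layers.Conv2D(32, 3, activation="relu", input_shape=(28, 28, 1)),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(64, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(10),  # logits
    ])
    loss_fn = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)
    optimizer = tf.keras.optimizers.Adam()
    train_ds = tf.data.Dataset.from_tensor_slices((x_train, y_train)).shuffle(10000).batch(128)

    # Manual training loop
    for epoch in range(2):
        for images, labels in train_ds:
            with tf.GradientTape() as tape:
                loss = loss_fn(labels, model(images, training=True))
            grads = tape.gradient(loss, model.trainable_variables)
            optimizer.apply_gradients(zip(grads, model.trainable_variables))

    # Evaluate in batches on the test set
    test_ds = tf.data.Dataset.from_tensor_slices((x_test, y_test)).batch(256)
    correct = 0
    for images, labels in test_ds:
        preds = tf.argmax(model(images, training=False), axis=1)
        correct += int(tf.reduce_sum(tf.cast(preds == tf.cast(labels, tf.int64), tf.int32)))
    print("Test accuracy:", correct / len(x_test))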



A8: MNIST dataset in Keras

  • Write code for loading, training and evaluating the MNIST dataset using a Linear and a ConvNet classifier in Keras
  • See instructions at the Keras page
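
A minimal sketch for A8; architectures, epoch counts and batch sizes are illustrative, and the instructions on the Keras page take precedence.

    import tensorflow as tf
    from tensorflow.keras import layers, models

    (x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
    x_train = x_train.astype("float32") / 255.0
    x_test = x_test.astype("float32") / 255.0

    # "Linear" classifier: a single softmax layer on the flattened pixels
    linear = models.Sequential([
        layers.Flatten(input_shape=(28, 28)),
        layers.Dense(10, activation="softmax"),
    ])

    # Small ConvNet
    convnet = models.Sequential([
        layers.Reshape((28, 28, 1), input_shape=(28, 28)),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(10, activation="softmax"),
    ])

    for name, model in [("Linear", linear), ("ConvNet", convnet)]:
        model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
        model.fit(x_train, y_train, epochs=3, batch_size=128, verbose=0)
        _, acc = model.evaluate(x_test, y_test, verbose=0)
        print(name, "test accuracy:", acc)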



A9: Pre-trained models in Keras

  • Use the pre-trained models VGG16 and VGG19 in Keras to classify images
  • See instructions at the Keras page
  • Test on images other than the examples. Are they classified correctly?
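
A minimal sketch for A9 using the VGG16 model from keras.applications; VGG19 is used the same way. The image file name is a placeholder for your own test image, and the instructions on the Keras page take precedence.

    import numpy as np
    from tensorflow.keras.applications.vgg16 import VGG16, preprocess_input, decode_predictions
    from tensorflow.keras.preprocessing import image

    # Pre-trained ImageNet weights are downloaded on first use
    model = VGG16(weights="imagenet")

    img = image.load_img("my_image.jpg", target_size=(224, 224))  # placeholder file name
    x = np.expand_dims(image.img_to_array(img), axis=0)
    x = preprocess_input(x)

    preds = model.predict(x)
    print(decode_predictions(preds, top=3)[0])  # top-3 ImageNet labels with probabilities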



A10: Diabetes dataset in Keras using DNN

  • Write code for loading, training and evaluating the Diabetes dataset using a deep neural network classifier in Keras
  • See instructions at the Keras page
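
A minimal sketch for A10, assuming a Diabetes CSV whose last column is a binary class label; the file name and layout are assumptions, and the instructions on the Keras page take precedence.

    import pandas as pd
    import tensorflow as tf
    from sklearn.model_selection import train_test_split

    data = pd.read_csv("diabetes.csv")  # hypothetical file name
    X = data.iloc[:, :-1].values.astype("float32")
    y = pd.factorize(data.iloc[:, -1])[0]  # encode labels as 0/1

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=1)

    model = tf.keras.Sequential([
        tf.keras.layers.Dense(32, activation="relu", input_shape=(X.shape[1],)),
        tf.keras.layers.Dense(32, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    model.fit(X_train, y_train, epochs=50, batch_size=32, verbose=0)
    _, acc = model.evaluate(X_test, y_test, verbose=0)
    print("Test accuracy:", acc)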



A11: Banknote dataset in Web ML Experimenter

  • Download the Banknote dataset from the Datasets page
  • Upload the dataset in the Web ML Experimenter
  • Try classifying the Banknote dataset using different classifiers. Test with different hyperparameter settings.
  • Which classifier had the highest accuracy?



A12: Diabetes dataset in R

  • Classify the Diabetes dataset in R using Neural Networks, SVM and Random Forest
  • Split the dataset into 80% training and 20% testing, and evaluate accuracy on the test dataset
  • Which classifier had the highest accuracy?
