ML-prog-assig-1.docx

School: Portland State University
Course: 445
Subject: Industrial Engineering
Date: Apr 3, 2024
Pages: 2


# I ran my code through Google Colab; this same code can be used for all 3 experiments.
# Experiment 1: hidden units 20, 50, 100; momentum 0.9; learning rate 0.1.
# Experiment 2: momentum 0, 0.25, 0.5, 0.9; 100 hidden units; learning rate 0.1.
# Experiment 3: validation_split 0.25 and 0.5 to limit the training set; 100 hidden units; learning rate 0.1.
# By adjusting these parameters per experiment, you can reproduce the graphs and
# confusion matrices shown in the report.

import numpy as np                             # for argmax over predicted probabilities
import tensorflow as tf                        # TensorFlow / Keras
from sklearn.metrics import confusion_matrix   # for generating the confusion matrix
import matplotlib.pyplot as plt                # for generating the plots
from tensorflow.keras import initializers, optimizers

# Load the MNIST data set bundled with Keras
mnist_data = tf.keras.datasets.mnist
(X_train_data, Y_train_data), (X_test_data, Y_test_data) = mnist_data.load_data()

# Normalize the pixel data
X_train_data = tf.keras.utils.normalize(X_train_data, axis=1)
X_test_data = tf.keras.utils.normalize(X_test_data, axis=1)

# A basic feed-forward neural network
model = tf.keras.models.Sequential()
# Flatten each 28x28 image into a 784-element input vector
model.add(tf.keras.layers.Flatten())
# Hidden layer: use 20, 50, or 100 units depending on the experiment;
# sigmoid activation
model.add(tf.keras.layers.Dense(100, activation='sigmoid',
                                kernel_initializer='random_uniform',
                                bias_initializer=initializers.Constant(1)))
# Output layer: one unit per digit class
model.add(tf.keras.layers.Dense(10, activation='sigmoid',
                                kernel_initializer='random_uniform',
                                bias_initializer=initializers.Constant(1)))

# Set momentum to 0, 0.25, 0.5, or 0.9 according to experiment 2
sgd = optimizers.SGD(learning_rate=0.1, momentum=0.9)
# Pass the configured optimizer object, not the string 'sgd' (the string
# would create a fresh optimizer and ignore the learning rate and momentum set above)
model.compile(optimizer=sgd, loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

# Use validation_split=0.25 or 0.5 for one quarter / one half of the data
# held out, respectively, in experiment 3
k = model.fit(X_train_data, Y_train_data, validation_split=0.5,
              epochs=50, batch_size=50, verbose=1)

loss, acc = model.evaluate(X_test_data, Y_test_data)

predict_x = model.predict(X_test_data)
y_predicted = np.argmax(predict_x, axis=1)
confusion_matx = confusion_matrix(Y_test_data, y_predicted)
print(confusion_matx)

plt.plot(k.history['accuracy'])
plt.plot(k.history['val_accuracy'])
plt.ylabel('Accuracy')
plt.xlabel('Epochs')
plt.legend(['Train', 'Validation'], loc='upper left')
plt.show()
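The comments above describe editing the hidden-unit, momentum, and split values by hand before each run. One way to make the three experiments reproducible is to enumerate all the settings up front and loop over them. The sketch below does only that enumeration; the function name `experiment_configs` and the dictionary keys are my own choices, not part of the original assignment, and each config would still need to be fed into the model-building code above.

```python
# Sketch: enumerate the parameter settings for the three experiments so each
# run is driven by a config instead of a manual edit. Defaults (100 hidden
# units, momentum 0.9, validation_split 0.5, learning rate 0.1) match the
# code above; names here are illustrative, not from the assignment.

def experiment_configs():
    base = {"hidden_units": 100, "momentum": 0.9,
            "validation_split": 0.5, "learning_rate": 0.1}
    runs = []
    # Experiment 1: vary the number of hidden units
    for h in (20, 50, 100):
        runs.append({**base, "experiment": 1, "hidden_units": h})
    # Experiment 2: vary the momentum
    for m in (0, 0.25, 0.5, 0.9):
        runs.append({**base, "experiment": 2, "momentum": m})
    # Experiment 3: vary how much training data is held out
    for s in (0.25, 0.5):
        runs.append({**base, "experiment": 3, "validation_split": s})
    return runs

for cfg in experiment_configs():
    # Build, compile, and fit the model with these values here
    print(cfg)
```

This yields nine runs (3 + 4 + 2), one per row of the report's results, and keeps every hyperparameter choice visible in a single place.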