Tuesday, July 23, 2019

Tip 4 AI

  • Remove loss spikes

    • Options
      • Change the network architecture
        • e.g., use a 2-layer LSTM instead of a 1-layer LSTM
      • Increase the batch size
      • Remove outliers from the training data
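The two-layer LSTM swap above can be sketched in Keras; the layer width, sequence length, and feature count here are illustrative placeholders, not values from the notes:

```python
import tensorflow as tf

# Illustrative sizes: 20 timesteps, 8 features per step, 64 LSTM units.
# Stacking a second LSTM requires return_sequences=True on the first layer,
# so it emits its full output sequence rather than only the final step.
model = tf.keras.models.Sequential([
    tf.keras.layers.LSTM(64, return_sequences=True, input_shape=(20, 8)),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(1)
])
model.compile(optimizer='adam', loss='mse')
```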
  • number of steps (tensorflow)
    • num_steps = (len(traindf) / args['batch_size']) / args['learning_rate']
  • queue capacity (tensorflow)
    • queue_capacity = batch_size * 10
  • checkpoint (tensorflow)
    • save_checkpoints_steps = max(100, params["train_steps"] // 10)
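Plugging hypothetical numbers into the three formulas above (dataset size, batch size, learning rate, and train_steps are all made up for illustration):

```python
# Hypothetical values for illustration only.
train_rows = 100000      # len(traindf)
batch_size = 100
learning_rate = 0.01
train_steps = 10000

num_steps = (train_rows / batch_size) / learning_rate
queue_capacity = batch_size * 10
save_checkpoints_steps = max(100, train_steps // 10)

print(num_steps, queue_capacity, save_checkpoints_steps)
# prints: 100000.0 1000 1000
```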
  • tf.data.Dataset (tensorflow)
    • num_parallel_* arguments (e.g., num_parallel_calls), prefetch

    • dataset.apply(...)

      • tf.contrib.data.shuffle_and_repeat

      • tf.contrib.data.map_and_batch
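tf.contrib was removed in TensorFlow 2.x; shuffle_and_repeat and map_and_batch were folded into the core Dataset API, where map/batch fusion happens automatically. A minimal sketch of the equivalent pipeline (the element transform and all sizes are placeholders):

```python
import tensorflow as tf

def parse_fn(x):
    # Placeholder per-element transform; real code would parse records here.
    return x * 2

dataset = (tf.data.Dataset.range(100)
           .shuffle(buffer_size=100)                            # shuffle_and_repeat =
           .repeat(2)                                           #   shuffle + repeat
           .map(parse_fn, num_parallel_calls=tf.data.AUTOTUNE)  # map_and_batch =
           .batch(10)                                           #   map + batch
           .prefetch(tf.data.AUTOTUNE))  # overlap input pipeline with training
```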

  • Mirrored strategy (tensorflow)
    • distribution = tf.contrib.distribute.MirroredStrategy()

    • tf.estimator.RunConfig(train_distribute=distribution)
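In TensorFlow 2.x the same wiring drops the contrib prefix; a configuration sketch, assuming an Estimator-based training loop (my_model_fn is a placeholder name):

```python
import tensorflow as tf

# In TF 2.x, MirroredStrategy moved from tf.contrib.distribute to tf.distribute.
distribution = tf.distribute.MirroredStrategy()
config = tf.estimator.RunConfig(train_distribute=distribution)
# The config is then passed to the Estimator constructor, e.g.:
# estimator = tf.estimator.Estimator(model_fn=my_model_fn, config=config)
```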

  • Multi-label classification (tensorflow)
    • Use the candidate-sampling loss functions below to reduce computation when the label space is large
      • tf.nn.sampled_softmax_loss, tf.nn.nce_loss
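A minimal sketch of tf.nn.sampled_softmax_loss; all sizes here are hypothetical. Instead of normalizing over every class, the loss samples a small set of negative classes per batch:

```python
import tensorflow as tf

# Hypothetical sizes: 50,000 output classes, 128-dim hidden layer, batch of 32.
num_classes, dim, batch = 50000, 128, 32

weights = tf.Variable(tf.random.normal([num_classes, dim]))  # output-layer weights
biases = tf.Variable(tf.zeros([num_classes]))                # output-layer biases
hidden = tf.random.normal([batch, dim])                      # activations feeding the output layer
labels = tf.random.uniform([batch, 1], maxval=num_classes, dtype=tf.int64)

# Instead of computing the full 50,000-way softmax, sample 64 negative classes.
loss = tf.nn.sampled_softmax_loss(
    weights=weights, biases=biases, labels=labels, inputs=hidden,
    num_sampled=64, num_classes=num_classes)
loss = tf.reduce_mean(loss)
```

These sampled losses are a training-time shortcut; evaluation typically still uses the full softmax.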
  • Visualizing the convolutions and pooling (tensorflow)
    • Code

      import tensorflow as tf
      print(tf.__version__)
      mnist = tf.keras.datasets.fashion_mnist
      (training_images, training_labels), (test_images, test_labels) = mnist.load_data()
      training_images = training_images.reshape(60000, 28, 28, 1)
      training_images=training_images / 255.0
      test_images = test_images.reshape(10000, 28, 28, 1)
      test_images=test_images/255.0
      model = tf.keras.models.Sequential([
        tf.keras.layers.Conv2D(64, (3,3), activation='relu', input_shape=(28, 28, 1)),
        tf.keras.layers.MaxPooling2D(2, 2),
        tf.keras.layers.Conv2D(64, (3,3), activation='relu'),
        tf.keras.layers.MaxPooling2D(2,2),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(128, activation='relu'),
        tf.keras.layers.Dense(10, activation='softmax')
      ])
      model.compile(optimizer='adam', loss='sparse_categorical_crossentropy', metrics=['accuracy'])
      model.summary()
      model.fit(training_images, training_labels, epochs=5)
      test_loss, test_acc = model.evaluate(test_images, test_labels)
        
      print(test_labels[:100])
        
      import matplotlib.pyplot as plt
      f, axarr = plt.subplots(3,4)
      FIRST_IMAGE=1
      SECOND_IMAGE=3
      THIRD_IMAGE=4
      CONVOLUTION_NUMBER = 1
      from tensorflow.keras import models
      layer_outputs = [layer.output for layer in model.layers]
      activation_model = tf.keras.models.Model(inputs = model.input, outputs = layer_outputs)
      for x in range(0, 4):
        f1 = activation_model.predict(test_images[FIRST_IMAGE].reshape(1, 28, 28, 1))[x]
        axarr[0,x].imshow(f1[0, : , :, CONVOLUTION_NUMBER], cmap='inferno')
        axarr[0,x].grid(False)
        f2 = activation_model.predict(test_images[SECOND_IMAGE].reshape(1, 28, 28, 1))[x]
        axarr[1,x].imshow(f2[0, : , :, CONVOLUTION_NUMBER], cmap='inferno')
        axarr[1,x].grid(False)
        f3 = activation_model.predict(test_images[THIRD_IMAGE].reshape(1, 28, 28, 1))[x]
        axarr[2,x].imshow(f3[0, : , :, CONVOLUTION_NUMBER], cmap='inferno')
        axarr[2,x].grid(False)
  • Early stopping (tensorflow)
    • Code

      import tensorflow as tf
      print(tf.__version__)
       
      class myCallback(tf.keras.callbacks.Callback):
        def on_epoch_end(self, epoch, logs={}):
          # Stop once loss drops below 0.4 or accuracy exceeds 0.8.
          # (The metric key is 'acc' in older TF; newer versions report 'accuracy'.)
          if logs.get('loss') < 0.4:
            self.model.stop_training = True
          elif logs.get('acc', logs.get('accuracy', 0)) > 0.8:
            self.model.stop_training = True
       
      callbacks = myCallback()
      mnist = tf.keras.datasets.fashion_mnist
      (training_images, training_labels), (test_images, test_labels) = mnist.load_data()
      training_images=training_images/255.0
      test_images=test_images/255.0
      model = tf.keras.models.Sequential([
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(512, activation=tf.nn.relu),
        tf.keras.layers.Dense(10, activation=tf.nn.softmax)
      ])
      model.compile(optimizer='adam', loss='sparse_categorical_crossentropy', metrics=['accuracy'])
      model.fit(training_images, training_labels, epochs=5, callbacks=[callbacks])
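Keras also ships a built-in alternative to the hand-written callback above, tf.keras.callbacks.EarlyStopping, which stops training when a monitored metric stops improving (the monitor and patience values here are illustrative):

```python
import tensorflow as tf

# Built-in alternative to the custom callback: stop when the monitored
# metric has not improved for `patience` consecutive epochs.
early_stop = tf.keras.callbacks.EarlyStopping(monitor='loss', patience=2)
# model.fit(training_images, training_labels, epochs=50, callbacks=[early_stop])
```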
