What's up with the tensorflow.js MNIST example nextBatch implementation?

While taking inspiration from the tensorflow.js Handwritten digit recognition with CNNs tutorial, I stumbled upon the following implementation of the nextBatch function in mnist_data.js:

nextBatch(batchSize, data, index) {
  const batchImagesArray = new Float32Array(batchSize * IMAGE_SIZE);
  const batchLabelsArray = new Uint8Array(batchSize * NUM_CLASSES);

  for (let i = 0; i < batchSize; i++) {
    const idx = index();

    const image =
        data[0].slice(idx * IMAGE_SIZE, idx * IMAGE_SIZE + IMAGE_SIZE);
    batchImagesArray.set(image, i * IMAGE_SIZE);

    const label =
        data[1].slice(idx * NUM_CLASSES, idx * NUM_CLASSES + NUM_CLASSES); // weird part
    batchLabelsArray.set(label, i * NUM_CLASSES);
  }

  const xs = tf.tensor2d(batchImagesArray, [batchSize, IMAGE_SIZE]);
  const labels = tf.tensor2d(batchLabelsArray, [batchSize, NUM_CLASSES]);

  return {xs, labels};
}

I understood the point of this function to be selecting the images and their corresponding labels.
The problem with the provided implementation is that it correctly selects the corresponding label, but it also selects the NUM_CLASSES - 1 other seemingly random labels (10 elements in total) that just happen to come after the selected one.

Why is it not implemented like the following?

nextBatch(batchSize, data, index) {
  const batchImagesArray = new Float32Array(batchSize * IMAGE_SIZE);
  const batchLabelsArray = new Uint8Array(batchSize);

  for (let i = 0; i < batchSize; i++) {
    const idx = index();

    const image =
        data[0].slice(idx * IMAGE_SIZE, idx * IMAGE_SIZE + IMAGE_SIZE);
    batchImagesArray.set(image, i * IMAGE_SIZE);

    const label = new Uint8Array([data[1][idx]]); // weird part corrected
    batchLabelsArray.set(label, i);
  }

  const xs = tf.tensor2d(batchImagesArray, [batchSize, IMAGE_SIZE]);
  const labels = tf.tensor2d(batchLabelsArray, [batchSize, 1]);

  return {xs, labels};
}

I did try running the above implementation, but the model throws the following error:

Error when checking target: expected dense_Dense1 to have shape [,10], but got array with shape [1650,1].
    at new e (errors.ts:48)

The dense layer is implemented as follows:

// Our last layer is a dense layer which has 10 output units, one for each
// output class (i.e. 0, 1, 2, 3, 4, 5, 6, 7, 8, 9).
const NUM_OUTPUT_CLASSES = 10;
model.add(tf.layers.dense({
  units: NUM_OUTPUT_CLASSES,
  kernelInitializer: 'varianceScaling',
  activation: 'softmax'
}));

If I am correct, how should I fix the dense layer and the rest of the implementation?
If instead the provided implementation is correct, why does it work?


Answer

The issue is related to the shape of the label.

const labels = tf.tensor2d(batchLabelsArray, [batchSize, 1]);

The labels are created with the rightmost axis having size 1. It should instead be equal to the number of classes (i.e. 0, 1, …, 9), which is 10.

The error message is straightforward: it indicates that the expected shape is [, 10].
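For intuition, here is a minimal sketch (not taken from the codelab; the buffer contents are made up for illustration) of the layout that the codelab's slicing by NUM_CLASSES implies: data[1] stores one one-hot row of NUM_CLASSES entries per example, so each label already spans 10 values.

// Minimal sketch, not from the codelab: data[1] stores one one-hot row of
// NUM_CLASSES entries per example, so each label already spans 10 values.
const NUM_CLASSES = 10;

// Hypothetical label buffer for two examples whose digits are 3 and 7.
const labelsBuffer = new Uint8Array([
  0, 0, 0, 1, 0, 0, 0, 0, 0, 0,  // example 0 -> digit 3
  0, 0, 0, 0, 0, 0, 0, 1, 0, 0,  // example 1 -> digit 7
]);

// Slicing NUM_CLASSES entries per index returns the full one-hot row,
// i.e. exactly the [, 10] shape the final dense layer expects.
const idx = 1;
const label = labelsBuffer.slice(idx * NUM_CLASSES, idx * NUM_CLASSES + NUM_CLASSES);
console.log(label); // Uint8Array(10) [0, 0, 0, 0, 0, 0, 0, 1, 0, 0]

With that layout in mind, there are two options: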

  • Create a tensor with the shape [batchSize, 10]

Obviously, if the tensor is created with the shape [batchSize, 10] while batchLabelsArray has length batchSize, a shape error will be thrown. The array should instead have length batchSize * NUM_CLASSES.

The codelab uses

const batchLabelsArray = new Uint8Array(batchSize * NUM_CLASSES);

And then, to set the labels for each element of the batch, it uses the following:

for (let i = 0; i < batchSize; i++) {
  const idx = index();

  const image =
      data[0].slice(idx * IMAGE_SIZE, idx * IMAGE_SIZE + IMAGE_SIZE);
  batchImagesArray.set(image, i * IMAGE_SIZE);

  const label =
      data[1].slice(idx * NUM_CLASSES, idx * NUM_CLASSES + NUM_CLASSES);
  batchLabelsArray.set(label, i * NUM_CLASSES);
}
  • The other option is to use tf.oneHot (see the sketch after this list):

const labels = tf.oneHot(batchLabelsArray, 10); // batchLabelsArray is an array of length batchSize containing the class indices (0–9)
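As a sketch only (assuming NUM_CLASSES and IMAGE_SIZE are in scope, and recovering the class index from the stored one-hot row with indexOf(1) purely for illustration, since the codelab's data[1] does not store plain class indices), the tf.oneHot route could look like this inside nextBatch:

nextBatch(batchSize, data, index) {
  const batchImagesArray = new Float32Array(batchSize * IMAGE_SIZE);
  const classIndices = new Int32Array(batchSize); // one class index per example

  for (let i = 0; i < batchSize; i++) {
    const idx = index();

    const image =
        data[0].slice(idx * IMAGE_SIZE, idx * IMAGE_SIZE + IMAGE_SIZE);
    batchImagesArray.set(image, i * IMAGE_SIZE);

    // data[1] stores one-hot rows, so the class index (0-9) is recovered
    // here with indexOf(1) for illustration only.
    const oneHotRow =
        data[1].slice(idx * NUM_CLASSES, idx * NUM_CLASSES + NUM_CLASSES);
    classIndices[i] = oneHotRow.indexOf(1);
  }

  const xs = tf.tensor2d(batchImagesArray, [batchSize, IMAGE_SIZE]);
  // tf.oneHot expands [batchSize] int32 indices into shape [batchSize, 10],
  // matching the [, 10] shape expected by the final dense layer.
  const labels = tf.oneHot(tf.tensor1d(classIndices, 'int32'), NUM_CLASSES);

  return {xs, labels};
}

Either way, what the model consumes is a [batchSize, 10] labels tensor; the codelab simply stores the one-hot rows up front, which is why its slice by NUM_CLASSES is correct.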