Petals to the Metal - Flower Classification on TPU
- Gangavelli Ruthwik
- Nov 12, 2023
- 3 min read

There are over 5,000 species of mammals, 10,000 species of birds, 30,000 species of fish – and astonishingly, over 400,000 different types of flowers.
We are challenged to build a machine learning model that identifies the type of flowers in a dataset of images (for simplicity, we’re sticking to just over 100 types).
To achieve this, we will use a TPU (Tensor Processing Unit).
A Tensor Processing Unit (TPU) is an AI accelerator application-specific integrated circuit (ASIC) developed by Google for neural network machine learning, using Google's own TensorFlow software. Google began using TPUs internally in 2015, and in 2018 made them available for third-party use, both as part of its cloud infrastructure and by offering a smaller version of the chip for sale.
We use a TPU because, when building an image classifier, we are usually bottlenecked by the hardware available to us. A TPU vastly accelerates training, which produces results much faster.
I implemented this work on Kaggle, which hosts the competition, lets anyone participate for free, and provides compute resources to practice with. I chose Python as the programming language.
I carried out this work by building an image classifier in Keras and training it on a Tensor Processing Unit (TPU).
First, we import the Python packages that are needed.
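The original notebook cell is not shown; a minimal sketch of the imports a Kaggle TPU image-classification notebook like this typically needs might look like:

```python
import math                       # misc numeric helpers
import numpy as np                # array handling
import matplotlib.pyplot as plt   # plotting results later on
import tensorflow as tf           # model building and TPU support

print("TensorFlow version:", tf.__version__)
```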

A TPU has eight different cores and each of these cores acts as its own accelerator. (A TPU is sort of like having eight GPUs in one machine.) We tell TensorFlow how to make use of all these cores at once through a distribution strategy. Run the following cell to create the distribution strategy that we'll later apply to our model.
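The cell itself is not included in the post; the standard TPU-detection pattern it refers to looks roughly like this (the `except` fallback lets the same code run without a TPU attached):

```python
import tensorflow as tf

try:
    # Auto-detects the TPU attached to a Kaggle/Colab session.
    tpu = tf.distribute.cluster_resolver.TPUClusterResolver()
    tf.config.experimental_connect_to_cluster(tpu)
    tf.tpu.experimental.initialize_tpu_system(tpu)
    strategy = tf.distribute.TPUStrategy(tpu)
except ValueError:
    # No TPU found: fall back to the default (CPU/GPU) strategy.
    strategy = tf.distribute.get_strategy()

print("Number of replicas:", strategy.num_replicas_in_sync)
```

On a TPU, `num_replicas_in_sync` reports 8, one per core.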

When used with TPUs, datasets need to be stored in a Google Cloud Storage bucket. You can use data from any public GCS bucket by giving its path just like you would data from '/kaggle/input'. The following will retrieve the GCS path for this competition's dataset.
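On Kaggle this is done with the `KaggleDatasets` helper, which is only available inside Kaggle notebooks; the `ImportError` fallback below is an assumption added so the sketch runs elsewhere:

```python
try:
    from kaggle_datasets import KaggleDatasets
    # 'tpu-getting-started' is this competition's dataset slug on Kaggle.
    GCS_DS_PATH = KaggleDatasets().get_gcs_path('tpu-getting-started')
except ImportError:
    # Outside Kaggle: fall back to the local input path.
    GCS_DS_PATH = '/kaggle/input/tpu-getting-started'

print(GCS_DS_PATH)
```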

When used with TPUs, datasets are often serialized into TFRecords. This is a format convenient for distributing data to each of the TPU's cores. We've hidden the cell that reads the TFRecords for our dataset since the process is a bit long. You could come back to it later for guidance on using your own datasets with TPUs and creating data pipelines.
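Since that cell is hidden, here is a simplified sketch of what a TFRecord input pipeline for this dataset looks like. The feature names (`image`, `class`) match this competition's TFRecords; the 192x192 image size is one of several sizes the competition provides, and the batch/shuffle sizes are illustrative assumptions:

```python
import tensorflow as tf

AUTOTUNE = tf.data.AUTOTUNE
IMAGE_SIZE = [192, 192]  # the competition also ships 224, 331 and 512 px records

def parse_example(serialized):
    # Each training example holds a JPEG-encoded image and an integer label.
    features = {
        'image': tf.io.FixedLenFeature([], tf.string),
        'class': tf.io.FixedLenFeature([], tf.int64),
    }
    parsed = tf.io.parse_single_example(serialized, features)
    image = tf.io.decode_jpeg(parsed['image'], channels=3)
    image = tf.cast(image, tf.float32) / 255.0     # scale pixels to [0, 1]
    image = tf.reshape(image, [*IMAGE_SIZE, 3])    # TPUs need static shapes
    label = tf.cast(parsed['class'], tf.int32)
    return image, label

def make_dataset(filenames, batch_size=128):
    ds = tf.data.TFRecordDataset(filenames, num_parallel_reads=AUTOTUNE)
    ds = ds.map(parse_example, num_parallel_calls=AUTOTUNE)
    ds = ds.shuffle(2048).batch(batch_size).prefetch(AUTOTUNE)
    return ds
```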
Now we're ready to create a neural network for classifying images! We'll use what's known as transfer learning. With transfer learning, you reuse part of a pretrained model to get a head-start on a new dataset.
For this tutorial, we'll use a model called VGG16, pretrained on ImageNet.
The distribution strategy we created earlier contains a context manager, strategy.scope. This context manager tells TensorFlow how to divide the work of training among the eight TPU cores. When using TensorFlow with a TPU, it's important to define your model in a strategy.scope() context.

The 'sparse_categorical' versions of the loss and metrics are appropriate for a classification task with more than two labels, like this one.
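Putting the two points above together, a sketch of the model definition might look like the following. The head layers, optimizer choice, and 192x192 input size are assumptions for illustration; `strategy` stands in for the TPU distribution strategy created earlier (the default strategy below is a placeholder so the sketch runs anywhere):

```python
import tensorflow as tf

NUM_CLASSES = 104        # the competition has just over 100 flower classes
IMAGE_SIZE = [192, 192]  # assumed image size; the dataset offers several

# In the notebook, use the TPUStrategy created earlier instead.
strategy = tf.distribute.get_strategy()

with strategy.scope():
    # VGG16 pretrained on ImageNet, with its classifier head removed.
    base = tf.keras.applications.VGG16(
        weights='imagenet', include_top=False, input_shape=[*IMAGE_SIZE, 3])
    base.trainable = False  # keep the pretrained features frozen

    model = tf.keras.Sequential([
        base,
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(NUM_CLASSES, activation='softmax'),
    ])

    # 'sparse_categorical' loss/metric: integer labels, more than two classes.
    model.compile(
        optimizer='adam',
        loss='sparse_categorical_crossentropy',
        metrics=['sparse_categorical_accuracy'],
    )

model.summary()
```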

We'll train this network with a learning rate schedule, starting from a learning rate of 0.005.
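The post doesn't show the exact schedule, so here is one plausible version: an exponential decay starting from the 0.005 rate mentioned above. The 0.9 per-epoch decay factor is an assumption for illustration:

```python
import tensorflow as tf

START_LR = 0.005  # starting learning rate from the text
DECAY = 0.9       # assumed per-epoch decay factor

def lr_schedule(epoch, lr=None):
    # Multiply the starting rate by DECAY once per epoch.
    return START_LR * DECAY ** epoch

# Pass this callback to model.fit() to apply the schedule during training.
lr_callback = tf.keras.callbacks.LearningRateScheduler(lr_schedule, verbose=True)
```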
And now we're ready to train the model. After defining a few parameters, we can run training and get the results.
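A sketch of the training-and-plotting step follows. For illustration it uses a tiny stand-in model and random data so it is self-contained; in the notebook, the VGG16 model, the TFRecord datasets, and the learning-rate callback from the earlier cells take their place, and the epoch count here is an assumption:

```python
import numpy as np
import tensorflow as tf
import matplotlib
matplotlib.use('Agg')  # headless plotting
import matplotlib.pyplot as plt

# Tiny stand-in model; in the notebook this is the compiled VGG16 model.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(8, 8, 3)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(104, activation='softmax'),
])
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['sparse_categorical_accuracy'])

# Random stand-in data; in the notebook this is the TFRecord dataset.
x = np.random.rand(32, 8, 8, 3).astype('float32')
y = np.random.randint(0, 104, size=32)

history = model.fit(x, y, validation_split=0.25, epochs=3, verbose=0)

# Accuracy curves make it easy to compare training vs. validation progress.
plt.plot(history.history['sparse_categorical_accuracy'], label='train')
plt.plot(history.history['val_sparse_categorical_accuracy'], label='validation')
plt.xlabel('epoch')
plt.ylabel('accuracy')
plt.legend()
plt.savefig('accuracy.png')
```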
To better understand the results, let's plot a simple graph of them.

My Contribution:
After the initial model's predictions, I used the VGG16 model, computed its prediction values, and plotted them for reference to understand the difference.
Challenges Faced and Their Solutions:
While loading the data, it was initially in TFRecord format, so I used TensorFlow's methods to read it.
While training the model, the major problems were hardware limitations and long execution times. To solve this, I used the TPU accelerators built into the Kaggle work environment.