I was intrigued by the concept of neural networks, but intimidated by the immense fundamental knowledge that makes the technology possible. Coming from an academia-oriented background, I had the initial impression that I had to learn everything from the ground up. But I also recognized that we can use the tools that have already been developed and build from there. After all, nobody computes the square root of 3 by hand anymore; we use a tool called a calculator.
So, I looked for a tool that I could use to learn about neural networks. I encountered Fast.ai, which teaches top-down. Thus, I found myself using neural network functions, seeing their potential applications, and whetting my appetite to learn the underlying code.
Fast.ai provides teaching modules and a forum for support. However, as a newbie, I found that stitching the information together took quite a bit of effort. I am writing this blog to hopefully guide others who are starting after me.
Below is a series of code snippets, modified from Fast.ai’s Fastbook ‘02-production’ module, using Colab as the platform.
1. Necessary imports:
2. Assign your objects of interest and the paths/directories for downloading. The Fast.ai module dwelt on bears; I used flowers. There are loads of interesting subjects!
Tip: at this point, go to Runtime, click Change runtime type, and choose GPU as your hardware accelerator. This will make the process faster. But be conscious of your GPU use, because Colab monitors it. Colab is free to use, but the fine print says that the more you use it, the lower your priority access will be in the future.
3. Gathering the downloaded images, splitting them into train/validation sets, assigning labels, and resizing the images to one uniform size.
4. Exploring possibilities of transforming:
a. by cropping
b. by padding with zeros
c. by augmenting (shown in the next snippet)
5. Putting together the code for downloading, splitting, labelling, resizing and augmenting.
Note that the augmentation transformations work as batch transformations, not as item transformations. Augmentation is a useful technique for increasing the effective number of samples in the dataset.
Also, querying dls_fl.valid.show_batch() will show the validation set, which does not undergo augmentation; thus, with the parameter unique=True set, the repeated copies of an image will all look identical.
6. Applying the neural network algorithm.
This shows that after the 3rd epoch, the error rate corresponds to a model accuracy of 1 - 0.0833 = 0.9167. Quite good!
7. Checking the correctness of predictions.
8. Seeing which predictions were most out-of-this-world.
This shows you items in the dataset that you can clean to improve the accuracy.
9. Cleaning the dataset.
The widget lets you decide, for each image, whether to keep it, delete it, or reassign it to another class.
After cleaning, the model can be re-trained.
10. Save the model once you are happy with it.
11. Trying out the model.
a. Getting an image from the net.
b. Assigning the url and destination of the image to be tested.
c. Checking with my human eye.
d. Asking the model to predict.
The model came up with the right prediction of ‘rose’ with almost 100% probability.
12. Making a simple app for playing around.
a. Create a mechanism for uploading.
b. Specifying the output.
c. Specifying the predictions.
d. Creating a button for the classification output.
e. Defining a function to handle clicks.
f. And putting it all together!
With this simple application, you can select an image from your Photos folder, or download one from the internet, and play with predicting its classification. Of course, it only covers 3 classes for now, but nothing is stopping you from expanding it!
Happy learning! 🙂
My thanks to the founders of Fast.ai and all the wonderful contributors to the forum!
Credit to Crosswalk.com for the deep water image.
Disclaimer: I am not in the flower business 😉