
How to Overfit Your Model | by Muhammad Ardi | Jan 2021

January 9, 2021
in Neural Networks

Next up, let’s talk about how the number of features affects classification performance. Unlike the previous experiment, here I am going to use a potato leaf disease dataset (it’s here). What I basically want to do is train two CNN models (again), where the first one uses max-pooling while the second one doesn’t. Below is what the first model looks like.

Model: "sequential_6"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
conv2d_10 (Conv2D) (None, 254, 254, 16) 448
_________________________________________________________________
conv2d_11 (Conv2D) (None, 252, 252, 32) 4640
_________________________________________________________________
max_pooling2d_1 (MaxPooling2 (None, 63, 63, 32) 0
_________________________________________________________________
flatten_6 (Flatten) (None, 127008) 0
_________________________________________________________________
dense_18 (Dense) (None, 100) 12700900
_________________________________________________________________
dense_19 (Dense) (None, 50) 5050
_________________________________________________________________
dense_20 (Dense) (None, 3) 153
=================================================================
Total params: 12,711,191
Trainable params: 12,711,191
Non-trainable params: 0
_________________________________________________________________

As you can see in the model summary above, the max-pooling layer is placed right after the last convolution layer. The size of this pooling layer is 4×4, though that is not shown in the summary. Also, note that the number of neurons in the flatten layer is only 127,008.
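The article shows only the summary, but the first model can be sketched in Keras along these lines. The 256×256×3 input shape, the 3×3 kernels, and the activations are assumptions inferred from the output shapes and parameter counts in the summary, not details given in the original:

```python
# Sketch of the first model, reconstructed from its summary.
# Assumptions (not stated in the article): 256x256x3 inputs,
# 3x3 convolution kernels, ReLU/softmax activations.
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    keras.Input(shape=(256, 256, 3)),
    layers.Conv2D(16, (3, 3), activation="relu"),  # -> (254, 254, 16), 448 params
    layers.Conv2D(32, (3, 3), activation="relu"),  # -> (252, 252, 32), 4,640 params
    layers.MaxPooling2D(pool_size=(4, 4)),         # -> (63, 63, 32)
    layers.Flatten(),                              # -> 127,008 features
    layers.Dense(100, activation="relu"),          # 12,700,900 params
    layers.Dense(50, activation="relu"),           # 5,050 params
    layers.Dense(3, activation="softmax"),         # 3 leaf-disease classes, 153 params
])

print(model.count_params())  # 12,711,191, matching the summary
```

With these assumed kernel sizes, every per-layer parameter count matches the summary exactly, so the reconstruction is at least shape-consistent with the original model.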

The output showing the training progress looks like the following. Here you can see that the model performs very well, since the accuracy on the validation data reaches 93.89%.

Epoch 18/20
23/23 [==============================] - 1s 31ms/step - loss: 1.0573e-05 - acc: 1.0000 - val_loss: 0.3191 - val_acc: 0.9389
Epoch 19/20
23/23 [==============================] - 1s 31ms/step - loss: 9.6892e-06 - acc: 1.0000 - val_loss: 0.3215 - val_acc: 0.9389
Epoch 20/20
23/23 [==============================] - 1s 31ms/step - loss: 8.9155e-06 - acc: 1.0000 - val_loss: 0.3233 - val_acc: 0.9389

But remember that we want the model to be AS OVERFIT AS POSSIBLE. So my approach here is to modify the CNN so that it no longer contains a max-pooling layer. Below are the details of the modified CNN model.

Model: "sequential_4"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
conv2d_6 (Conv2D) (None, 254, 254, 16) 448
_________________________________________________________________
conv2d_7 (Conv2D) (None, 252, 252, 32) 4640
_________________________________________________________________
flatten_4 (Flatten) (None, 2032128) 0
_________________________________________________________________
dense_12 (Dense) (None, 100) 203212900
_________________________________________________________________
dense_13 (Dense) (None, 50) 5050
_________________________________________________________________
dense_14 (Dense) (None, 3) 153
=================================================================
Total params: 203,223,191
Trainable params: 203,223,191
Non-trainable params: 0
_________________________________________________________________

Now you can see in the summary above that the number of neurons in the flatten layer is 2,032,128. This is far more than in the CNN that contains a pooling layer. Remember, in a CNN the actual classification task is done by the dense layers, which basically means that the number of neurons in this flatten layer acts kind of like the number of input features.
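The flatten sizes and the dominant Dense-layer parameter counts in both summaries can be checked with a little arithmetic (all numbers below come straight from the two summaries):

```python
# Feature counts entering the first Dense(100) layer of each model.
# Both conv stacks output 252x252x32; only the 4x4 max-pooling differs.
conv_h, conv_w, channels = 252, 252, 32

# With 4x4 max-pooling (stride 4): spatial dims shrink to 252 // 4 = 63.
pooled_features = (conv_h // 4) * (conv_w // 4) * channels
assert pooled_features == 127_008

# Without pooling: the conv output is flattened directly.
unpooled_features = conv_h * conv_w * channels
assert unpooled_features == 2_032_128

# Dense(100) parameters = inputs * 100 weights + 100 biases.
assert pooled_features * 100 + 100 == 12_700_900      # first model
assert unpooled_features * 100 + 100 == 203_212_900   # second model

# Removing the pooling layer multiplies the input features by 4 * 4.
print(unpooled_features // pooled_features)  # 16
```

So a single 4×4 pooling layer cuts the effective number of input features, and with it the size of the first dense layer, by a factor of 16.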

Theoretically speaking, the absence of the pooling layer should cause the model to overfit more, since the number of features is far higher than in the previous CNN model. To prove it, let’s just fit the model and look at the result below.

Epoch 15/20
23/23 [==============================] - 2s 65ms/step - loss: 0.0016 - acc: 1.0000 - val_loss: 2.4898 - val_acc: 0.6667
Epoch 16/20
23/23 [==============================] - 2s 68ms/step - loss: 0.0014 - acc: 1.0000 - val_loss: 2.5055 - val_acc: 0.6667
Epoch 17/20
23/23 [==============================] - 2s 66ms/step - loss: 0.0013 - acc: 1.0000 - val_loss: 2.5193 - val_acc: 0.6667
Epoch 18/20
23/23 [==============================] - 1s 65ms/step - loss: 0.0011 - acc: 1.0000 - val_loss: 2.5339 - val_acc: 0.6611
Epoch 19/20
23/23 [==============================] - 2s 65ms/step - loss: 0.0010 - acc: 1.0000 - val_loss: 2.5476 - val_acc: 0.6611
Epoch 20/20
23/23 [==============================] - 2s 65ms/step - loss: 9.2123e-04 - acc: 1.0000 - val_loss: 2.5602 - val_acc: 0.6611

And yes, we’ve reached our goal. We can clearly see here that the validation accuracy is only 66% while, at the same time, the training accuracy is exactly 100%.
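One simple way to quantify this kind of overfitting is the generalization gap, the difference between training and validation accuracy. Plugging in the final-epoch numbers from the two training logs above:

```python
# Generalization gap = training accuracy - validation accuracy.
# Final-epoch (epoch 20/20) numbers taken from the two logs above.
def generalization_gap(train_acc: float, val_acc: float) -> float:
    return train_acc - val_acc

with_pooling = generalization_gap(1.0000, 0.9389)     # first model
without_pooling = generalization_gap(1.0000, 0.6611)  # second model

print(f"with pooling:    {with_pooling:.4f}")   # 0.0611
print(f"without pooling: {without_pooling:.4f}")  # 0.3389
```

The gap grows from about 6 percentage points to about 34 once the pooling layer is removed, which is exactly the "as overfit as possible" behavior we were after.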

Credit: BecomingHuman By: Muhammad Ardi
