“Someone on TV has only to say ‘Alexa’ and she lights up. She’s always ready for action, the perfect woman, never says ‘Not tonight, dear.’” — Sybil Sage, as quoted in the New York Times article “Alexa, Where Have You Been All My Life?”
Machine learning has changed many aspects of the modern world for the better. Self-driving cars, intelligent virtual assistants on smartphones, the recommendation systems used by companies like Amazon and Netflix, cybersecurity automation, and social media news feeds are all examples of how far the technology has come.
What is Machine Learning?
Machine learning is a data analytics technique that teaches computers to do what comes naturally to humans and animals: learn from experience. Machine learning algorithms use computational methods to “learn” information directly from data without relying on a predetermined equation as a model. The algorithms adaptively improve their performance as the number of samples available for learning increases.
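To make the last point concrete — that performance improves as the number of training samples grows — here is a rough sketch using scikit-learn. The model (logistic regression), the training sizes, and the 50/50 split are arbitrary choices for illustration, not part of any particular application:

```python
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

digits = load_digits()
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.5, random_state=0)

# Train on a small slice first, then on the full training split,
# and score both models on the same held-out test set.
accuracy = {}
for n in (50, len(X_train)):
    model = LogisticRegression(max_iter=1000)
    model.fit(X_train[:n], y_train[:n])
    accuracy[n] = model.score(X_test, y_test)
    print(n, 'training samples -> test accuracy %.2f' % accuracy[n])
```

The second run, with roughly eighteen times more data, scores noticeably higher on the same test set — the algorithm "learned from experience."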
There are many reasons why machine learning matters; here are a few of its applications in key industries:
In the financial services industry, machine learning helps track customer happiness. By analyzing user activity, smart systems can spot a potential account closure before it occurs. They can also track spending patterns and customer behavior to offer tailored financial advice.
If you have an Apple Watch, you know the device gets better every year. Apple Watches don’t have keyboards; they use machine learning for handwriting recognition, and the model needs to learn how to recognize the letters a user might draw.
One of my favorite uses of machine learning is online recommendation systems, which allow retailers to offer you personalized recommendations based on your previous activity.
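The core idea behind many of these systems can be sketched in a few lines: recommend items similar to the ones a user already liked. Below is a toy item-based collaborative filtering example with a made-up rating matrix — the numbers and the similarity measure (cosine) are illustrative assumptions, not any retailer's actual algorithm:

```python
import numpy as np

# Toy user-item rating matrix (rows = users, columns = items);
# 0 means "not yet rated". All values here are made up.
ratings = np.array([
    [5, 4, 0, 1],
    [4, 5, 1, 0],
    [1, 0, 5, 4],
    [0, 1, 4, 5],
], dtype=float)

# Cosine similarity between item columns.
norms = np.linalg.norm(ratings, axis=0)
sim = (ratings.T @ ratings) / np.outer(norms, norms)

# For user 0: score each item by its similarity to the items
# that user already rated, then exclude already-rated items.
user = ratings[0]
scores = sim @ user
scores[user > 0] = -np.inf
print('recommend item', int(np.argmax(scores)))
```

User 0 rated items 0 and 1 highly, so the only unrated item (item 2) is scored by how similar it is to those, and it becomes the recommendation.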
Below, I’m using Python’s machine learning library, scikit-learn, to predict handwritten digits. The result is pretty amazing!
#Importing Standard Scientific Python Library
import matplotlib.pyplot as plt
#Using a simple dataset of 8×8 gray level images of handwritten digits
from sklearn.datasets import load_digits
#Loading the dataset provided by scikit-learn
digits = load_digits()
#Analyzing a sample image, in this case the image at index 8
import pylab as pl
pl.gray()
pl.matshow(digits.images[8])
pl.show()
#Analyzing image pixels. Each element represents one pixel of the 8x8 grayscale image; in this dataset the values range from 0 to 16
print(digits.images[8])
#Visualizing first 15 images
images_and_labels = list(zip(digits.images, digits.target))
for index, (image, label) in enumerate(images_and_labels[:15]):
    plt.subplot(3, 5, index + 1)
    plt.imshow(image, cmap=plt.cm.gray_r, interpolation='nearest')
    plt.title('%i' % label)
plt.show()
from sklearn import ensemble
n_samples = len(digits.images)
x = digits.images.reshape((n_samples, -1))
y = digits.target
#Creating random indices for the training sample. Note the integer division (//) so the sample size is an int
import random
sample_index = random.sample(range(len(x)), len(x) // 5)
valid_index = [i for i in range(len(x)) if i not in sample_index]
#Sample and validation images
sample_images=[x[i] for i in sample_index]
valid_images=[x[i] for i in valid_index]
#Sample and validation targets
sample_target=[y[i] for i in sample_index]
valid_target=[y[i] for i in valid_index]
#Using the Random Forest Classifier
classifier = ensemble.RandomForestClassifier()
#Fit model with sample data
classifier.fit(sample_images, sample_target)
#Attempt to predict validation data
score = classifier.score(valid_images, valid_target)
print('Random Forest Classifier:\n')
print('Score\t' + str(score))
#Predicting a single validation image. Make sure to add an extra set of square brackets, since predict expects a 2D array
j = valid_index[0]
pl.gray()
pl.matshow(digits.images[j])
pl.show()
print(classifier.predict([x[j]]))
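Beyond predicting a single image, it is worth checking where the model goes wrong across the whole validation set. A common way to do this is a confusion matrix; the sketch below is self-contained and uses scikit-learn's built-in train/test split rather than the manual index sampling above, so the exact numbers will differ from run to run:

```python
from sklearn import ensemble, metrics
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split

digits = load_digits()
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.25, random_state=0)

clf = ensemble.RandomForestClassifier(random_state=0)
clf.fit(X_train, y_train)
predicted = clf.predict(X_test)

# Rows are the true digits, columns are the predicted digits;
# large values on the diagonal mean the classifier is usually right.
print(metrics.confusion_matrix(y_test, predicted))
```

Off-diagonal cells show which digits get confused with each other — for handwritten digits, visually similar pairs are the usual suspects.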
We can see that in this run the machine predicted the image was a 3. Just imagining the applications of handwriting recognition technology is mind-blowing!
You can find me on GitHub: