## Key Concepts

Review the core concepts you need to master this subject.

### Perceptron Bias Term

```python
weighted_sum = x1*w1 + x2*w2 + x3*w3 + 1*wbias
```

The bias term is an adjustable numerical value added to a perceptron’s weighted sum of inputs and weights; including it can increase the accuracy of a classification model.

The bias term is helpful because it serves as another model parameter (in addition to the weights) that can be tuned to fit the training data as well as possible.

The input paired with the bias weight is fixed at `1`, while the bias weight itself is adjustable, just like the other weights.
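As a minimal sketch of the formula above (the variable names and values here are illustrative, not from any specific library), the weighted sum with a bias term can be computed like this:

```python
# Illustrative sketch: a weighted sum with a bias term.
# Names (inputs, weights, bias_weight) and values are assumptions for the example.
inputs = [2.0, 3.0, 5.0]      # x1, x2, x3
weights = [0.5, -1.0, 0.25]   # w1, w2, w3
bias_weight = 2.0             # wbias, tuned during training

# The bias input is fixed at 1, so the bias contributes 1 * bias_weight.
weighted_sum = sum(x * w for x, w in zip(inputs, weights)) + 1 * bias_weight
print(weighted_sum)  # 2*0.5 + 3*(-1.0) + 5*0.25 + 1*2.0 = 1.25
```

Because the bias input is always `1`, tuning `bias_weight` shifts the weighted sum up or down independently of the inputs.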

### Perceptron

Lesson 1 of 1

1. Similar to how atoms are the building blocks of matter and how microprocessors are the building blocks of a computer, perceptrons are the building blocks of Neural Networks. If you look closely,…
2. So the perceptron is an artificial neuron that can make a simple decision. Let’s implement one from scratch in Python! The perceptron has three main components: * Inputs: Each input correspon…
3. Great! Now that you understand the structure of the perceptron, here’s an important question — how are the inputs and weights magically turned into an output? This is a two-step process, and…
4. After finding the weighted sum, the second step is to constrain the weighted sum to produce a desired output. Why is that important? Imagine if a perceptron had inputs in the range of 100-1000…
5. Our perceptron can now make a prediction given inputs, but how do we know if it gets those predictions right? Right now we expect the perceptron to be very bad because it has random weights. We h…
6. Now that we have our training set, we can start feeding inputs into the perceptron and comparing the actual outputs against the expected labels! Every time the output mismatches the expected labe…
7. What do we do once we have the errors for the perceptron? We slowly nudge the perceptron towards a better version of itself that eventually has zero error. The only way to do that is to change the…
8. But one question still remains — how do we tweak the weights optimally? We can’t just play around randomly with the weights until the correct combination magically pops up. There needs to be…
9. You have understood that the perceptron can be trained to produce correct outputs by tweaking the regular weights. However, there are times when a minor adjustment is needed for the perceptron to…
10. So far so good! The perceptron works as expected, but everything seems to be taking place behind the scenes. What if we could visualize the perceptron’s training process to gain a better understand…
11. Let’s recap what you just learned! The perceptron has inputs, weights, and an output. The weights are parameters that define the perceptron and they can be used to represent a line. In other words…
12. Congratulations! You have now built your own perceptron from scratch. Let’s step back and think about what you just accomplished and see if there are any limits to a single perceptron. Earlier,…
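The steps outlined in the lesson above (weighted sum, activation, comparing predictions to labels, and nudging the weights) can be sketched as a minimal from-scratch perceptron. The class and method names, the sign activation, and the learning-rate value are illustrative assumptions, not the lesson’s exact code:

```python
class Perceptron:
    """Minimal sketch: weighted sum + sign activation + error-driven weight updates."""

    def __init__(self, num_inputs=2, weights=None):
        # Starting weights; one extra weight for the bias input, which is fixed at 1.
        self.weights = weights if weights is not None else [1.0] * (num_inputs + 1)

    def weighted_sum(self, inputs):
        # Step 1: weighted sum of inputs and weights, bias input appended as 1.
        return sum(x * w for x, w in zip(inputs + [1], self.weights))

    def activation(self, weighted_sum):
        # Step 2: constrain the weighted sum to a desired output (+1 or -1).
        return 1 if weighted_sum >= 0 else -1

    def train(self, training_set, learning_rate=0.1, max_epochs=100):
        # Nudge the weights until every prediction matches its label (zero error).
        for _ in range(max_epochs):
            total_error = 0
            for inputs, label in training_set.items():
                prediction = self.activation(self.weighted_sum(list(inputs)))
                error = label - prediction
                total_error += abs(error)
                # Update each weight in proportion to its input and the error.
                for i, x in enumerate(list(inputs) + [1]):
                    self.weights[i] += learning_rate * error * x
            if total_error == 0:
                break

# Toy training set: points above the line y = x are labeled +1, below it -1.
data = {(0, 3): 1, (1, 5): 1, (3, 0): -1, (5, 1): -1}
p = Perceptron()
p.train(data)
```

Because this toy data is linearly separable, the weight updates converge and the trained perceptron classifies every training point correctly; a single perceptron cannot do the same for data that no line can separate.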
