
Quotes
Discimus (December 6, 2020)
Entropy is a measure of uncertainty about a random variable. It reaches its maximum when all values of the random variable are equiprobable, and its minimum when the random variable can take only one value.
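A minimal Python sketch (an annotation, not part of the quote) that illustrates both extremes of Shannon entropy:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = sum(p * log2(1/p)); zero-probability terms are skipped."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

print(entropy([0.25, 0.25, 0.25, 0.25]))  # equiprobable over 4 values -> 2.0 (the maximum)
print(entropy([0.7, 0.1, 0.1, 0.1]))      # skewed -> about 1.36, below the maximum
print(entropy([1.0, 0.0, 0.0, 0.0]))      # only one possible value -> 0.0 (the minimum)
```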
Discimus (December 6, 2020)
In many practical cases, you would prefer a learning algorithm that builds a less accurate model fast. Additionally, you might prefer a less accurate model that is much quicker at making predictions.
Discimus (December 6, 2020)
It may seem counter-intuitive that learning could benefit from adding more unlabeled examples, as if we were adding more uncertainty to the problem. However, when you add unlabeled examples, you add more information about your problem: a larger sample better reflects the probability distribution from which the labeled data came.
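A minimal Python sketch (an annotation, not part of the quote) of the underlying point: larger samples, labeled or not, give a better empirical picture of the distribution that generated the data. Here the sample mean and standard deviation of draws from a standard normal distribution approach the true values 0 and 1 as the sample grows:

```python
import numpy as np

rng = np.random.default_rng(0)
for n in (10, 100, 10_000):
    sample = rng.normal(loc=0.0, scale=1.0, size=n)  # true mean 0, true std 1
    print(f"n={n:>6}: mean={sample.mean():+.3f}, std={sample.std():.3f}")
```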
Discimus (December 6, 2020)
Machine learning can also be defined as the process of solving a practical problem by 1) gathering a dataset, and 2) algorithmically building a statistical model based on that dataset.
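A minimal Python sketch (an annotation, not part of the quote) of that two-step definition; it assumes scikit-learn is available and uses its bundled Iris dataset, but any dataset and model pair would do:

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)          # step 1: gather a dataset
model = LogisticRegression(max_iter=1000)  # step 2: a statistical model...
model.fit(X, y)                            # ...built algorithmically from the data
print(model.predict(X[:3]))                # the model now answers the practical problem
```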
Discimus (December 4, 2020)
Let’s start by telling the truth: machines don’t learn. What a typical “learning machine” does is find a mathematical formula which, when applied to a collection of inputs (called “training data”), produces the desired outputs... Why isn’t that learning? Because if you slightly distort the inputs, the output is very likely to become completely wrong.
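A minimal Python sketch (an annotation, not part of the quote) of the fragility being described, using an overfit polynomial as the memorized “formula”; the degree and dataset are arbitrary choices for illustration:

```python
import numpy as np
from numpy.polynomial import Polynomial

rng = np.random.default_rng(1)
x_train = np.linspace(0.0, 1.0, 10)
y_train = x_train + rng.normal(scale=0.1, size=10)  # the line y = x plus a little noise

formula = Polynomial.fit(x_train, y_train, deg=9)   # degree 9, 10 points: memorizes the sample

for x in (0.5, 1.1):  # an in-range input vs one pushed slightly past the training range
    print(f"x={x:.1f}: model={formula(x):+8.2f}, expected about {x:+.2f}")
```

On the training range the fitted polynomial tracks y = x closely, but just past it the output drifts far from the true line: the formula reproduced the sample rather than learning the relationship.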