We have built a machine learning system to categorize images as family friendly, borderline, or not family friendly. When you submit content to Alpha Coders, this system will look at your image and assign it one of those categories.
The system will get it right most of the time, but not all of the time. Below are some scenarios where it might get it wrong, and what you can do.
Content that our system marks as not family friendly will be limited to Image Abyss.
If you submit content that is family friendly, and our system marks it as not family friendly:
If you submit content that is not family friendly, and our system marks it as family friendly:
Our machine learning system will continue to learn and grow. Every week (Monday mornings) we re-train it on examples it got wrong, so you may notice it improving over time.
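For the technically curious, the flow described above can be sketched roughly as follows. This is a minimal illustration, not the actual Alpha Coders implementation; every name here is hypothetical, and the scoring function is a stand-in for the real model.

```python
# Hypothetical sketch of the moderation flow: categorize an image
# into one of three ratings, and fold corrections back in weekly.

RATINGS = ("family_friendly", "borderline", "not_family_friendly")

def score_image(image_bytes):
    # Placeholder for the real model: returns fixed probabilities
    # so the flow is runnable end to end.
    return {"family_friendly": 0.7, "borderline": 0.2, "not_family_friendly": 0.1}

def categorize(image_bytes):
    # Pick the highest-scoring rating.
    scores = score_image(image_bytes)
    return max(scores, key=scores.get)

def weekly_retrain(training_examples, review_corrections):
    # Each Monday, examples the model got wrong (surfaced by user
    # review requests) are added back into the training set.
    return training_examples + review_corrections
```

The review-request step is what feeds `review_corrections`: each time a user disputes a rating and the dispute is upheld, that image becomes a new labeled training example.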
|By DavidWest 1 year ago|
My image does not belong under "not family friendly." So family images are not suitable, or what?
|By Alexis Torres Luna 1 month ago|
Sometimes the machine learning models will get it wrong. You can always request a review. We then use that data to improve the model.
|By DavidWest 1 month ago|