About three months ago, around February, an open-source project appeared on GitHub: the studio Infinite Red, Inc. announced it was open-sourcing an NSFW-detection tool built on TensorFlow.js (tfjs).
The model is reportedly the result of machine learning over 15,000 images, which makes it a fairly smart little tool and well worth a try.
NSFW JS stands for Not Safe/Suitable For Work. Give NSFW JS an image element or canvas, then simply call classify, and you may get the following five classification results:
- Drawing: harmless art, or artistic drawings
- Hentai: pornographic art, unsuitable for most work environments
- Neutral: generally harmless content
- Porn: indecent content and acts, usually involving genitalia
- Sexy: provocative content, inappropriate for the workplace
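Before the full demo page below, here is the core of the API in isolation. A minimal sketch of a classify call and the shape of its result (the probability values in the comment are illustrative, not real output):

nsfwjs.load().then(model => {
  // classify() resolves to an array of {className, probability}
  // pairs, sorted from most to least likely, e.g.:
  // [ { className: 'Neutral', probability: 0.92 },
  //   { className: 'Drawing', probability: 0.05 },
  //   ... ]
  return model.classify(document.getElementById('myImg'));
}).then(predictions => console.log(predictions));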
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8">
  <meta name="viewport" content="width=device-width, initial-scale=1.0">
  <meta http-equiv="X-UA-Compatible" content="ie=edge">
  <title>Document</title>
  <!-- Load TensorFlow.js. This is required -->
  <script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs@1.0.4"></script>
  <!-- Load the NSFWJS library from AWS; it exposes a global `nsfwjs` object -->
  <script src="https://s3.amazonaws.com/ir_public/nsfwjscdn/bundle.js"></script>
  <!-- For testing: load from a local bundle built with `yarn scriptbundle` -->
  <!-- <script src="../../dist/bundle.js"></script> -->
</head>
<body>
  <input type="file" onchange="showImg()" accept="image/*">
  <br><br>
  <img id="myImg" src="" width="150" alt="Thumb preview...">
  <button onclick="judge()">Classify</button>
  <script>
    function judge() {
      // The CDN bundle already exposes `nsfwjs` as a global,
      // so no require() call is needed in the browser.
      const img = document.getElementById('myImg');
      // Load the model.
      nsfwjs.load().then(model => {
        // Classify the image.
        model.classify(img).then(predictions => {
          console.log('Predictions', predictions);
        });
      });
    }

    function showImg() {
      // Read the selected file and show it as a thumbnail preview.
      var demoImage = document.querySelector('img');
      var file = document.querySelector('input[type=file]').files[0];
      var reader = new FileReader();
      reader.onload = function (event) {
        demoImage.src = reader.result;
      };
      reader.readAsDataURL(file);
      console.log(file);
    }
  </script>
</body>
</html>
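Building on the demo above, a simple way to turn the raw probabilities into a yes/no verdict is to threshold the explicit categories. A minimal sketch; the isNSFW helper and the 0.7 cutoff are hypothetical choices for illustration, not part of the library:

// Hypothetical helper: sums the probabilities of the two explicit
// categories and compares against a chosen cutoff (0.7 here).
function isNSFW(predictions, threshold = 0.7) {
  const explicit = predictions
    .filter(p => p.className === 'Porn' || p.className === 'Hentai')
    .reduce((sum, p) => sum + p.probability, 0);
  return explicit >= threshold;
}

// Usage inside the then() callback from the demo:
// model.classify(img).then(predictions => {
//   if (isNSFW(predictions)) console.log('Blocked');
// });

A looser or stricter cutoff shifts the trade-off between false positives and false negatives, so in practice the threshold is something to tune against your own sample images.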