
Update Usage Section Of Toxicity Classifier Model #1336

Open
gaikwadrahul8 wants to merge 3 commits into tensorflow:master from gaikwadrahul8:patch-3

Conversation

Contributor

@gaikwadrahul8 commented on Jan 19, 2024

I have updated the usage section of the TF.js Toxicity Classifier model on this page. The result was not displaying as expected, as mentioned in the comment section, so I made some modifications to the code snippet so that the result prints correctly. I tested the updated snippet in a Node.js environment and in the Google Chrome browser, and in both it displays the model result as expected. These changes should improve the clarity of the model output and the usability of the model for community members.

Kindly review the updated section and provide any feedback or suggestions you may have. If the changes are satisfactory, please proceed with the merge. Thank you.

Here is a screenshot of the model result without the modifications to the current code snippet, in a Node.js project:

image

Here is the model result with the modifications to the current code snippet, in a Node.js project:

(base) gaikwadrahul-macbookpro:test-8145 gaikwadrahul$ node index.js

============================
Hi there 👋. Looks like you are running TensorFlow.js in Node.js. To speed things up dramatically, install our node backend, which binds to TensorFlow C++, by running npm i @tensorflow/tfjs-node, or npm i @tensorflow/tfjs-node-gpu if you have CUDA. Then call require('@tensorflow/tfjs-node'); (-gpu suffix for CUDA) at the start of your program. Visit https://github.com/tensorflow/tfjs-node for more details.
============================
Label: identity_attack
        - probabilities : Float32Array(2) [ 0.9659663438796997, 0.034033700823783875 ]
        - match : false

Label: insult
        - probabilities : Float32Array(2) [ 0.08124702423810959, 0.9187529683113098 ]
        - match : true

Label: obscene
        - probabilities : Float32Array(2) [ 0.3993152379989624, 0.6006847620010376 ]
        - match : null

Label: severe_toxicity
        - probabilities : Float32Array(2) [ 0.9970394968986511, 0.002960436511784792 ]
        - match : false

Label: sexual_explicit
        - probabilities : Float32Array(2) [ 0.7053251266479492, 0.2946748435497284 ]
        - match : null

Label: threat
        - probabilities : Float32Array(2) [ 0.9106737971305847, 0.08932614326477051 ]
        - match : false

Label: toxicity
        - probabilities : Float32Array(2) [ 0.031176716089248657, 0.9688233137130737 ]
        - match : true

(base) gaikwadrahul-macbookpro:test-8145 gaikwadrahul$ 
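The shape of that output can be reproduced with a small formatting helper along these lines. This is only a sketch of the kind of change described (the actual snippet is in the PR diff, which is not shown here); `formatPredictions` is a hypothetical name. The predictions structure it assumes matches what the toxicity model's `classify()` resolves to, as visible in the log above: an array of `{ label, results: [{ probabilities, match }] }` objects. Logging that array directly in Node.js collapses the nested objects to `[Object]`, which is the problem the PR addresses, so the helper walks the structure explicitly.

```javascript
// Hypothetical helper: render the toxicity model's predictions as readable
// text instead of letting console.log collapse nested objects to [Object].
// In real usage, `predictions` would come from:
//   const toxicity = require('@tensorflow-models/toxicity');
//   toxicity.load(threshold).then(model => model.classify(sentences))
function formatPredictions(predictions) {
  const lines = [];
  for (const { label, results } of predictions) {
    lines.push(`Label: ${label}`);
    for (const { probabilities, match } of results) {
      // probabilities is a Float32Array: [prob of no match, prob of match]
      lines.push(`\t- probabilities : ${Array.from(probabilities)}`);
      // match is true/false, or null when neither probability clears the threshold
      lines.push(`\t- match : ${match}`);
    }
    lines.push('');
  }
  return lines.join('\n');
}

// Mocked prediction in the same shape, so the formatting is demonstrable
// without loading the model:
const mock = [
  {
    label: 'insult',
    results: [{ probabilities: new Float32Array([0.08, 0.92]), match: true }],
  },
];
console.log(formatPredictions(mock));
```

With the real model, the same helper applied to the resolved `classify()` result would produce per-label blocks like those in the log above.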

@gbaned commented on Jun 7, 2024

Hi @gaikwadrahul8 Can you please rebase your branch and resolve the conflicts? Thank you!



3 participants