Code for our newly accepted MICCAI 2023 paper: "Instructive Feature Enhancement for Dichotomous Medical Image Segmentation."
The official implementation of the paper "Real-Time Anomaly Detection and Feature Analysis Based on Time Series for Surveillance Video."
Sort & Slice: A Simple and Superior Alternative to Hash-Based Folding for Extended-Connectivity Fingerprints (ECFPs)
A Wordle solver that maximizes information gain, compared against a heuristic player based on character frequencies.
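For context, a minimal sketch of the entropy-maximizing idea behind such a solver, in Python; the word list, function names, and the simplified feedback rule are illustrative, not taken from the repository above:

```python
# Hedged sketch: score each candidate guess by the Shannon entropy of the
# feedback distribution it would induce over the remaining answers.
from collections import Counter
from math import log2

def feedback(guess: str, answer: str) -> str:
    """Coarse per-letter pattern: g = green, y = yellow, x = gray.
    (Real Wordle feedback handles duplicate letters more carefully.)"""
    return "".join(
        "g" if g == a else ("y" if g in answer else "x")
        for g, a in zip(guess, answer)
    )

def expected_information(guess: str, candidates: list[str]) -> float:
    """Entropy (in bits) of the feedback-pattern distribution for `guess`."""
    counts = Counter(feedback(guess, answer) for answer in candidates)
    total = len(candidates)
    return -sum((c / total) * log2(c / total) for c in counts.values())

def best_guess(candidates: list[str]) -> str:
    """The guess whose feedback distribution is most informative."""
    return max(candidates, key=lambda g: expected_information(g, candidates))

print(best_guess(["crane", "slate", "pride", "mound", "baker"]))
```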
An HTML5/JavaScript scanner application that compares two quantitative measures of information: Shannon entropy (information entropy, IE) and self-sequence alignment (information content, IC).
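The Shannon-entropy half of that comparison reduces to a few lines. The sketch below (in Python rather than the app's JavaScript, with made-up input) shows the standard formula H = -Σ p_i log2 p_i over symbol frequencies:

```python
# Hedged sketch of Shannon entropy (IE) of a character sequence, in bits/symbol.
from collections import Counter
from math import log2

def shannon_entropy(text: str) -> float:
    """H = -sum(p * log2(p)) over the relative frequency p of each symbol."""
    n = len(text)
    return -sum((c / n) * log2(c / n) for c in Counter(text).values())

print(shannon_entropy("abracadabra"))  # ~2.04 bits; more uniform text scores higher
```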
Experimental classification algorithms on the German credit dataset, implemented with the scikit-learn library.
Dictionary (dict), forward maximum matching, backward maximum matching, and information entropy.
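As a rough illustration of the forward variant, a greedy longest-match segmenter over a toy dictionary; the vocabulary and window size below are invented for the example:

```python
# Hedged sketch of forward maximum matching (FMM): at each position, take the
# longest dictionary word; fall back to a single character when nothing matches.
def forward_maximum_matching(text: str, dictionary: set[str], max_len: int = 4) -> list[str]:
    tokens, i = [], 0
    while i < len(text):
        for length in range(min(max_len, len(text) - i), 0, -1):
            piece = text[i:i + length]
            if length == 1 or piece in dictionary:
                tokens.append(piece)
                i += length
                break
    return tokens

vocab = {"信息", "熵", "信息熵"}
print(forward_maximum_matching("信息熵", vocab))  # ['信息熵']
# Backward maximum matching is the mirror image: scan right-to-left from the end.
```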
A small suite of functions for calculating information entropy based measurements.
Decision trees made easy.
A sensitivity toolbox that is tailored to the design process in the presence of uncertainties
Entropy-Optimized Wordle Solver inspired by 3Blue1Brown's Modern Information Theory Video.
A game-dynamics-aware Hangman solver leveraging fine-tuned BERT, weighted N-grams, and information entropy. Achieves a 74.4% success rate on a disjoint dataset.
A decision tree implementation in Python that uses information gain to split attributes, without relying on any ML library.
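For reference, a hedged sketch of the information-gain criterion such a tree would use when choosing a split attribute; the record layout and column names below are hypothetical, not the repository's API:

```python
# Hedged sketch: information gain = parent entropy minus the size-weighted
# entropy of the children produced by splitting on one attribute.
from collections import Counter
from math import log2

def entropy(labels: list[str]) -> float:
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(rows: list[dict], attribute: str, target: str) -> float:
    labels = [r[target] for r in rows]
    gain = entropy(labels)
    for value in {r[attribute] for r in rows}:
        subset = [r[target] for r in rows if r[attribute] == value]
        gain -= len(subset) / len(rows) * entropy(subset)
    return gain

data = [
    {"outlook": "sunny", "play": "no"},
    {"outlook": "sunny", "play": "no"},
    {"outlook": "rain",  "play": "yes"},
]
print(information_gain(data, "outlook", "play"))  # ~0.918 bits
```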
Code developed for CSE 515 Multimedia Web Databases
Repository containing experimental pipelines for the paper "Learning Dynamics and Core of Learners".