This repo contains the code of "ConTNet: Why not use convolution and transformer at the same time?"
A project for downstream tasks based on LucaOne's embeddings.
Datasets and code for results presented in the ProbConserv paper
A Python implementation of "Self-Supervised Learning of Spatial Acoustic Representation with Cross-Channel Signal Reconstruction and Multi-Channel Conformer" [TASLP 2024]
Repository for SEPAL: Scalable Feature Learning on Huge Knowledge Graphs for Downstream Machine Learning
[ICLR24] AutoVP: An Automated Visual Prompting Framework and Benchmark
PLMFit: a platform for transfer learning (TL) on PLMs
An open-source implementation for fine-tuning Meta's DINOv2.
This repository contains the resources, code, and documentation for LlamaLens, a specialized multilingual large language model (LLM) designed to analyze news and social media content. LlamaLens supports multiple languages, including Arabic, English, and Hindi, and is tailored for diverse tasks such as sentiment analysis and misinformation detection.
This repository contains implementations of ELMo (Embeddings from Language Models) models trained on a news dataset. Additionally, it includes a classification task using ELMo embeddings.
An implementation of ELMo embeddings in PyTorch, featuring stacked Bi-LSTMs for contextualized word representations. Pretrained with a bidirectional language modeling objective and evaluated on the AG News text classification dataset, reporting accuracy, F1, and precision.
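The ELMo architecture described above can be sketched in a few lines of PyTorch: stacked bidirectional LSTMs whose per-layer outputs are mixed with learned scalar weights to form contextual embeddings. This is a minimal, hypothetical illustration; class names, dimensions, and the projection layer are assumptions, not code from the repository.

```python
# Minimal ELMo-style sketch (illustrative only, not the repo's actual code):
# stacked Bi-LSTM layers whose outputs are combined via learned scalar weights.
import torch
import torch.nn as nn

class ELMoSketch(nn.Module):
    def __init__(self, vocab_size=1000, embed_dim=64, hidden_dim=64, num_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # One bidirectional LSTM per layer so each layer's output is accessible.
        self.layers = nn.ModuleList([
            nn.LSTM(embed_dim if i == 0 else 2 * hidden_dim,
                    hidden_dim, batch_first=True, bidirectional=True)
            for i in range(num_layers)
        ])
        # Learned scalar mixing weights over (embedding layer + LSTM layers).
        self.scalars = nn.Parameter(torch.zeros(num_layers + 1))
        # Project token embeddings to the Bi-LSTM output size for mixing.
        self.proj = nn.Linear(embed_dim, 2 * hidden_dim)

    def forward(self, token_ids):
        x = self.embed(token_ids)                     # (B, T, E)
        reps = [self.proj(x)]                         # layer 0: token embeddings
        h = x
        for lstm in self.layers:
            h, _ = lstm(h)                            # (B, T, 2H)
            reps.append(h)
        w = torch.softmax(self.scalars, dim=0)        # normalized layer weights
        return sum(wi * r for wi, r in zip(w, reps))  # (B, T, 2H)

tokens = torch.randint(0, 1000, (2, 7))  # batch of 2 sequences, length 7
emb = ELMoSketch()(tokens)
print(emb.shape)  # torch.Size([2, 7, 128])
```

A downstream classifier (as in the AG News task) would typically pool these contextual embeddings over the time dimension and feed the result to a linear layer.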
This library is based on simpletransformers and HuggingFace's Transformers library.