Compact models that mimic the vector space of crisistransformers/CT-M1-Complete. Like BERT, these models should be fine-tuned on downstream tasks.
CrisisTransformers
University
AI & ML interests: Natural Language Processing, Social Computing, Crisis Informatics
CrisisTransformers | State-of-the-art contextual and semantically meaningful sentence embeddings for crisis-related social media texts
CrisisTransformers is a family of pre-trained language models and sentence encoders introduced in the following papers:
- Pre-trained models and sentence encoders: CrisisTransformers: Pre-trained language models and sentence encoders for crisis-related social media texts
- Multi-lingual sentence encoders: Semantically Enriched Cross-Lingual Sentence Embeddings for Crisis-related Social Media Texts
- Mini models: "Actionable Help" in Crises: A Novel Dataset and Resource-Efficient Models for Identifying Request and Offer Social Media Posts
The models were trained on a massive corpus of over 15 billion word tokens from tweets associated with more than 30 crisis events, such as disease outbreaks, natural disasters, and conflicts.
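The sentence encoders in this family map a crisis-related text to a single vector, and similarity between texts is then scored with cosine similarity. A common way to reduce a transformer's per-token outputs to one sentence vector is attention-mask-aware mean pooling. The sketch below illustrates that pooling step with toy NumPy arrays; the function names and data are illustrative, not the CrisisTransformers API.

```python
import numpy as np

def mean_pool(token_embeddings, attention_mask):
    """Average token embeddings, ignoring padding positions.

    token_embeddings: (seq_len, dim) array of per-token vectors
    attention_mask:   (seq_len,) array of 1s (real tokens) and 0s (padding)
    """
    mask = attention_mask[:, None].astype(float)    # (seq_len, 1)
    summed = (token_embeddings * mask).sum(axis=0)  # sum over real tokens
    count = max(mask.sum(), 1e-9)                   # avoid division by zero
    return summed / count

def cosine_similarity(a, b):
    """Cosine similarity between two 1-D vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy stand-in for an encoder's output: 6 tokens, 8-dim embeddings,
# with the last two positions being padding.
rng = np.random.default_rng(0)
tokens = rng.normal(size=(6, 8))
mask = np.array([1, 1, 1, 1, 0, 0])

sentence_vec = mean_pool(tokens, mask)
print(cosine_similarity(sentence_vec, sentence_vec))  # ~1.0 for identical vectors
```

In practice the token embeddings would come from one of the sentence encoders above (e.g. loaded through the standard Hugging Face interface), and the same pooling-plus-cosine recipe would compare two crisis-related posts.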
Models (14):
- crisistransformers/CT-XLMR-SE (Sentence Similarity)
- crisistransformers/CT-mBERT-SE (Sentence Similarity)
- crisistransformers/CT-M1-Complete (Fill-Mask)
- crisistransformers/CT-M2-OneLook (Fill-Mask)
- crisistransformers/CT-M2-BestLoss (Fill-Mask)
- crisistransformers/CT-M2-Complete (Fill-Mask)
- crisistransformers/CT-M3-OneLook (Fill-Mask)
- crisistransformers/CT-M3-BestLoss (Fill-Mask)
- crisistransformers/CT-M3-Complete (Fill-Mask)
- crisistransformers/CT-M1-BestLoss (Fill-Mask)
Datasets: none public yet.