# dytr - Multi-Task Transformer Demo
## ⚠️ Important Note

This is a demonstration model trained for educational purposes only.
### Model Limitations

- Trained for only 10 epochs
- Small, demonstration-scale dataset
### Expected Performance

- Predictions are not accurate enough for production use
- The model may produce incorrect output
- This demo only shows that dytr can successfully fine-tune BERT for multi-task learning
### Purpose

This demo proves that the dytr library can:
- Load pretrained BERT models
- Fine-tune them on multiple tasks simultaneously
- Handle different task types (classification, NER, generation, error detection)
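The shared-encoder / per-task-head pattern behind this kind of multi-task fine-tuning can be illustrated in miniature. The sketch below is purely hypothetical and does not use dytr's actual API: `shared_encode`, `HEADS`, and `predict` are illustrative stand-ins, with a toy feature extractor in place of a real BERT encoder.

```python
# Hypothetical sketch of multi-task routing over a shared encoder.
# None of these names come from dytr; they only illustrate the pattern.
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Example:
    task: str   # e.g. "classification", "ner", "error_detection"
    text: str

def shared_encode(text: str) -> List[float]:
    # Stand-in for a pretrained BERT trunk: produces a tiny fake
    # feature vector shared by every task head.
    return [len(text) / 100.0, text.count(" ") / 10.0]

# One lightweight head per task, all consuming the shared representation.
HEADS: Dict[str, Callable[[List[float]], int]] = {
    "classification":  lambda h: int(h[0] > 0.1),
    "ner":             lambda h: int(h[1] > 0.2),
    "error_detection": lambda h: int(h[0] + h[1] > 0.3),
}

def predict(example: Example) -> int:
    hidden = shared_encode(example.text)   # shared trunk, updated by all tasks
    return HEADS[example.task](hidden)     # task-specific head

batch = [
    Example("classification", "a short sentence"),
    Example("ner", "Alice met Bob in Paris"),
]
preds = [predict(ex) for ex in batch]
```

In a real setup the trunk would be a pretrained BERT whose weights receive gradients from every head, which is what lets the tasks be fine-tuned simultaneously.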
## Tasks

## Examples