# pretrained-language-models

Here are 15 public repositories matching this topic...

This research examines the performance of Large Language Models (GPT-3.5 Turbo and Gemini 1.5 Pro) in Bengali Natural Language Inference, comparing them with state-of-the-art models using the XNLI dataset. It explores zero-shot and few-shot scenarios to evaluate their efficacy in low-resource settings.

  • Updated May 8, 2024
  • Jupyter Notebook
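
The zero-shot setup this entry describes can be illustrated with a short prompting loop. Below is a minimal sketch assuming the OpenAI Python client; the prompt wording, label parsing, and the sample Bengali pair are illustrative assumptions, not the repository's actual protocol.

```python
# A minimal sketch of zero-shot NLI prompting in the style described above.
# The prompt template and fallback label are assumptions; swap in the Bengali
# XNLI premise/hypothesis pairs you actually evaluate.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

LABELS = ("entailment", "neutral", "contradiction")

def classify_nli(premise: str, hypothesis: str, model: str = "gpt-3.5-turbo") -> str:
    """Ask the model for an XNLI-style label in a zero-shot setting."""
    prompt = (
        "Decide the relationship between the premise and the hypothesis.\n"
        f"Premise: {premise}\n"
        f"Hypothesis: {hypothesis}\n"
        "Answer with exactly one word: entailment, neutral, or contradiction."
    )
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
        temperature=0,
    )
    answer = response.choices[0].message.content.strip().lower()
    # Fall back to "neutral" if the model strays from the label set.
    return answer if answer in LABELS else "neutral"

# Example pair ("The man is reading a book." / "The man is reading."):
print(classify_nli("লোকটি একটি বই পড়ছে।", "লোকটি পড়ছে।"))
```

A few-shot variant would prepend labeled premise/hypothesis examples to the same prompt before the test pair.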

Identified adverse drug events (ADEs) and associated terms in an annotated corpus using Named Entity Recognition (NER) models built with Flair and PyTorch. Fine-tuned pre-trained transformers such as XLM-RoBERTa, SpanBERT, and Bio_ClinicalBERT, achieving F1 scores of 0.73 and 0.77 for the BIOES and BIO tagging models, respectively.

  • Updated Dec 20, 2024
  • Jupyter Notebook
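
A minimal sketch of the Flair fine-tuning setup this entry describes, assuming a CoNLL-style BIO-tagged corpus; the data paths, column layout, and hyperparameters are placeholders, not the project's actual configuration.

```python
# Fine-tune XLM-RoBERTa for NER with Flair, as described above.
# Data paths and hyperparameters are assumptions for illustration.
from flair.datasets import ColumnCorpus
from flair.embeddings import TransformerWordEmbeddings
from flair.models import SequenceTagger
from flair.trainers import ModelTrainer

# CoNLL-style files: one token per line, token and BIO tag per column.
corpus = ColumnCorpus(
    "data/ade",  # hypothetical data folder
    column_format={0: "text", 1: "ner"},
    train_file="train.txt",
    dev_file="dev.txt",
    test_file="test.txt",
)
label_dict = corpus.make_label_dictionary(label_type="ner")

# Fine-tune the transformer embeddings end to end.
embeddings = TransformerWordEmbeddings("xlm-roberta-base", fine_tune=True)

tagger = SequenceTagger(
    hidden_size=256,
    embeddings=embeddings,
    tag_dictionary=label_dict,
    tag_type="ner",
    use_crf=True,
)

trainer = ModelTrainer(tagger, corpus)
trainer.fine_tune("models/ade-ner", learning_rate=5e-6,
                  mini_batch_size=4, max_epochs=10)
```

Switching between BIO and BIOES here is a matter of how the tag column in the corpus files is encoded, not a change to the training code.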

This study focuses on political sentiment analysis during Bangladeshi elections, using the "Motamot" dataset to evaluate how Pre-trained Language Models (PLMs) and Large Language Models (LLMs) capture complex sentiment characteristics. The research explores the effectiveness of various models and learning strategies in understanding public opinion.

  • Updated Aug 10, 2024
  • Jupyter Notebook
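
A minimal sketch of PLM fine-tuning for sentiment classification in the spirit of this entry, assuming Hugging Face transformers and a generic multilingual checkpoint; the sample texts, labels, and hyperparameters are placeholders, since the "Motamot" data loading details are not shown here.

```python
# Fine-tune a multilingual PLM for binary sentiment classification.
# Texts, labels, checkpoint, and hyperparameters are illustrative placeholders.
import torch
from torch.utils.data import Dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

texts = ["ভোটের পরিবেশ শান্তিপূর্ণ ছিল।",   # "The voting environment was peaceful."
         "ফলাফল নিয়ে জনমনে ক্ষোভ।"]        # "Public anger over the results."
labels = [1, 0]  # 1 = positive, 0 = negative (placeholder labels)

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")

class SentimentDataset(Dataset):
    """Wraps tokenized texts and labels for the Trainer."""
    def __init__(self, texts, labels):
        self.enc = tokenizer(texts, truncation=True, padding=True,
                             return_tensors="pt")
        self.labels = labels
    def __len__(self):
        return len(self.labels)
    def __getitem__(self, i):
        item = {k: v[i] for k, v in self.enc.items()}
        item["labels"] = torch.tensor(self.labels[i])
        return item

model = AutoModelForSequenceClassification.from_pretrained(
    "xlm-roberta-base", num_labels=2)
args = TrainingArguments(output_dir="out", num_train_epochs=3,
                         per_device_train_batch_size=8)
Trainer(model=model, args=args,
        train_dataset=SentimentDataset(texts, labels)).train()
```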
