
I-Comprehend Question Generation Model

Overview

The I-Comprehend Question Generation Model is a T5-based model that generates a question from a given context and answer: the answer span is highlighted inside the context, and the model produces a question whose answer is that span. It is particularly useful for educational tools, automated content creation, and reading comprehension applications.

Model Details

  • Model Architecture: T5 (Text-to-Text Transfer Transformer)
  • Model Type: Conditional Generation
  • Model Size: ~223M parameters (F32, Safetensors)
  • Training Data: [Specify the dataset or type of data used for training]
  • Use Cases: Question generation, educational tools, content creation

Installation

To use this model, you need the transformers library, along with torch and sentencepiece (required by T5Tokenizer). You can install them via pip:

pip install transformers torch sentencepiece

Usage

To use the model, load it with the appropriate tokenizer and model classes from the transformers library. Ensure you have the correct repository ID or local path.

from transformers import T5ForConditionalGeneration, T5Tokenizer

# Load the model and tokenizer
model = T5ForConditionalGeneration.from_pretrained("miiiciiii/I-Comprehend_qg")
tokenizer = T5Tokenizer.from_pretrained("miiiciiii/I-Comprehend_qg")

def get_question(context, answer, model, tokenizer):
    """Generate a question for the given answer and context."""
    # Highlight the answer span inside the context with <hl> markers so the
    # model knows which span the question should target.
    answer_span = context.replace(answer, f"<hl>{answer}<hl>", 1) + "</s>"
    inputs = tokenizer(answer_span, return_tensors="pt")
    question = model.generate(
        input_ids=inputs.input_ids,
        attention_mask=inputs.attention_mask,
        max_length=50,
    )[0]

    return tokenizer.decode(question, skip_special_tokens=True)

# Define the context and answer
context = "The Eiffel Tower is located in Paris and is one of the most famous landmarks in the world."
answer = "Eiffel Tower"

# Generate the question
question = get_question(context, answer, model, tokenizer)
print("Generated Question:", question)

Model Performance

  • Evaluation Metrics: [BLEU, ROUGE]
  • Performance Results: [Accuracy]
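Below is a minimal sketch of how the BLEU and ROUGE placeholders above could be filled in, assuming a held-out set of (context, answer, reference question) triples. The evaluate library, the extra pip packages, and the single example triple are assumptions for illustration, not part of the released evaluation.

# Evaluation sketch (assumes: pip install evaluate rouge_score).
import evaluate

# Hypothetical held-out triples; replace with your own evaluation data.
eval_set = [
    {
        "context": "The Eiffel Tower is located in Paris and is one of the most famous landmarks in the world.",
        "answer": "Paris",
        "reference": "Where is the Eiffel Tower located?",
    },
]

predictions = [get_question(s["context"], s["answer"], model, tokenizer) for s in eval_set]
references = [s["reference"] for s in eval_set]

bleu = evaluate.load("bleu")
rouge = evaluate.load("rouge")
print("BLEU:", bleu.compute(predictions=predictions, references=references))
print("ROUGE:", rouge.compute(predictions=predictions, references=references))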

Limitations

  • The model may not perform well on contexts that are significantly different from the training data.
  • It may generate questions that are too generic or not contextually relevant in some cases.
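One way to partially mitigate overly generic questions is to generate several candidates with beam search and select the most suitable one. The settings below are illustrative defaults, not the configuration the model was trained or tuned with; context and answer are the variables defined in the Usage section.

# Illustrative generation settings for more specific candidate questions.
highlighted = context.replace(answer, f"<hl>{answer}<hl>", 1) + "</s>"
inputs = tokenizer(highlighted, return_tensors="pt")
outputs = model.generate(
    input_ids=inputs.input_ids,
    attention_mask=inputs.attention_mask,
    max_length=50,
    num_beams=4,                # beam search instead of greedy decoding
    num_return_sequences=4,     # return several candidate questions
    early_stopping=True,
)
for candidate in outputs:
    print(tokenizer.decode(candidate, skip_special_tokens=True))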

Contributing

We welcome contributions to improve the model or expand its capabilities. Please feel free to open issues or submit pull requests.

License

[MIT License]

Acknowledgments

  • [Acknowledge any datasets, libraries, or collaborators that contributed to the model]

Contact

For any questions or issues, please contact [icomprehend.system@gmail.com].
