llmscore | LLM Hallucinations Predictions
Accuracy: N/A

Recent Predictions

Total: 1
Correct: 0
Incorrect: 0
Pending: 1
Unrated: 0
Prediction: Large Language Models (LLMs) have a 'hallucination problem,' where they can present incorrect information as fact and fabricate supporting sources.
Video: "LLMs suffer from the hallucination problem; they can present a factually incorrect answer as if it wer..."
Predicted at: Aug 4, 2024
Status: Pending