Ymcki gemma-2-2b-jpn-it-abliterated-17-ORPO

ymcki/gemma-2-2b-jpn-it-abliterated-17-ORPO

Extrinsic Performance (LLM Leaderboard)
  • Rank: N/A
  • Average Score: 0.37

Intrinsic Architecture
  • Architecture: Gemma2ForCausalLM
  • Hidden Layers: 26
  • Attention Heads: 8
  • Vocab Size: 256002


ymcki/gemma-2-2b-jpn-it-abliterated-17-ORPO is a causal language model built on the Gemma2ForCausalLM architecture, with a recorded average score of 0.37 on the Open LLM Leaderboard.
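
The architecture figures above correspond to fields exposed by the model's configuration. As a minimal sketch (assuming the repository is publicly available on the Hugging Face Hub and an installed transformers release supports Gemma 2), the model can be loaded and its configuration inspected as follows:

  from transformers import AutoModelForCausalLM, AutoTokenizer

  model_id = "ymcki/gemma-2-2b-jpn-it-abliterated-17-ORPO"

  # Load tokenizer and model; the model class resolves to Gemma2ForCausalLM.
  tokenizer = AutoTokenizer.from_pretrained(model_id)
  model = AutoModelForCausalLM.from_pretrained(model_id)

  # These config fields should match the figures listed above (assumed, not verified here).
  print(model.config.num_hidden_layers)    # expected: 26
  print(model.config.num_attention_heads)  # expected: 8
  print(model.config.vocab_size)           # expected: 256002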

Performance Metrics

  • ARC Score: N/A
  • HellaSwag Score: N/A
  • MMLU Score: N/A

This page was last updated automatically by the Almanac Ingestor bot.