# Hibernates-2B

A highly efficient 2B-parameter language model optimized for reasoning and dialogue tasks.

Hibernates-2B is a custom transformer architecture designed for advanced language understanding and generation. Built with performance and efficiency in mind, it leverages state-of-the-art techniques for natural language processing.
## Features

- 2B parameters
- 4,096-token context window
- Custom transformer architecture
- Optimized for CPU and GPU inference
- Multi-turn dialogue support
## Model Specifications

| Property | Value |
| --- | --- |
| Architecture | Custom transformer |
| Parameters | 2 billion |
| Context length | 4,096 tokens |
| Model type | Decoder-only |
| Tokenizer | Custom WordPiece |
| Format | SafeTensors |
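A minimal loading sketch, assuming the checkpoint follows the standard Transformers layout; the model id `Hibernates-2B` and the prompt are placeholders, and `trust_remote_code=True` may be needed if the custom architecture ships its own modeling code:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "Hibernates-2B"  # placeholder; use the actual hub id or local path

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.float16,  # fp16 keeps the 2B weights around 4 GB
    device_map="auto",          # places weights on GPU if present, else CPU
)

prompt = "Explain the difference between RAM and VRAM in two sentences."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```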
## Capabilities

- Efficient resource usage
- Strong reasoning capabilities
- Multi-turn dialogue (see the chat sketch after this list)
- Context awareness
- Instruction following
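A sketch of multi-turn use, assuming the tokenizer ships a chat template (this card does not confirm one); `model` and `tokenizer` are the objects from the loading example above:

```python
# Conversation history as role/content pairs; the final user turn is the
# one the model will answer.
messages = [
    {"role": "user", "content": "What is a transformer, briefly?"},
    {"role": "assistant", "content": "A neural network built around self-attention."},
    {"role": "user", "content": "And why does it scale well?"},
]

# apply_chat_template formats the history and appends the generation prompt.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=128)
# Slice off the prompt tokens so only the new reply is decoded.
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```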
## Limitations

- **Resource requirements:** 8 GB+ of GPU RAM recommended; the 2B weights alone take roughly 4 GB at fp16, before KV cache and activations.
- **Task specificity:** best suited for dialogue and reasoning tasks.
- **Language support:** primarily English.
- **Model size:** sized for a balance of performance and efficiency rather than peak capability.
## License

- Research and commercial use permitted
- Attribution appreciated but not required
- No warranty provided
## Citation

If you use this model in your research, please cite:
## Acknowledgements

Built using PyTorch and Hugging Face Transformers. Special thanks to the open-source AI community.
## Model Files

Due to file size limitations, the model files are hosted externally. Download them from:

Place these files in the project root before running.
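A minimal download sketch; the base URL and filenames below are placeholders for the external host, which this card does not specify:

```python
import urllib.request
from pathlib import Path

# Placeholders; substitute the real hosting URL and the actual file list.
BASE_URL = "https://example.com/hibernates-2b"
FILES = ["model.safetensors", "config.json", "tokenizer.json"]

root = Path(".")  # run from the project root, per the note above
for name in FILES:
    target = root / name
    if target.exists():
        continue  # skip files already in place
    print(f"Fetching {name} ...")
    urllib.request.urlretrieve(f"{BASE_URL}/{name}", str(target))
```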