
Extended Mind MPT-7B

Developed by normalcomputing
An extended-mind Transformer model built on the MosaicML MPT architecture, supporting retrieval over and attention to an external memory of key-value pairs
Downloads 111
Release Time: 10/20/2023

Model Overview

This model implements the extended-mind method described in the accompanying paper: it retrieves from and attends to an external store of key-value pairs (the memory) and works with the original model weights, without any fine-tuning
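A minimal sketch of how such a model might be loaded and queried with an external memory via Hugging Face transformers, based on the description above. The keyword `external_memories` and the memory text are assumptions, not the official example; check the model card for the exact interface.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "normalcomputing/extended-mind-mpt-7b"
tokenizer = AutoTokenizer.from_pretrained(model_id)

# External knowledge passed as a token-id sequence (illustrative text).
memory_text = "Alexander Grothendieck became a naturalized French citizen in 1971."
memory_ids = tokenizer(memory_text, return_tensors="pt")["input_ids"]

# The custom architecture caches key-value pairs for these tokens and lets
# the attention layers retrieve them at generation time.
# `external_memories` is an assumed keyword inferred from the overview.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    external_memories=memory_ids,
    trust_remote_code=True,
)

prompt = "When did Grothendieck become a French citizen?"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```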

Model Features

External memory integration
Accepts an external memory passed as a sequence of token ids and automatically generates and caches the corresponding key-value pairs
Dynamic memory update
Memory contents can be updated at runtime through the clear_memories() method and the memory_ids attribute (see the sketch after this list)
Reference retrieval
Can report the specific memory indices consulted during generation, improving the interpretability of results
Flexible configuration
Exposes options such as memory type selection, similarity masking, and special-token handling
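A hedged sketch of the dynamic-memory-update feature, continuing from the loading sketch above. The clear_memories() method and memory_ids attribute are named on this page; the configuration field names in the trailing comments are placeholders, not confirmed API.

```python
# Swap in fresh memory content at runtime without reloading the model.
new_memory = "MosaicML released the MPT-7B base model in May 2023."
new_memory_ids = tokenizer(new_memory, return_tensors="pt")["input_ids"]

model.clear_memories()             # drop the cached external memory
model.memory_ids = new_memory_ids  # attach the new token-id sequence

# Options such as memory type, similarity masking, and special-token handling
# are exposed through configuration; exact field names are assumptions:
# model.config.memory_type = "manual"
# model.config.mask_by_sim = True
```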

Model Capabilities

Text generation
External memory retrieval
Context-aware reasoning
Multi-turn dialogue support

Use Cases

Knowledge Q&A
Q&A based on external knowledge
Inject an external knowledge base such as Wikipedia to answer questions that require specialized domain knowledge
Examples show it can accurately answer specific questions, such as when the mathematician Alexander Grothendieck was naturalized
Research assistance
Academic literature analysis
Inject academic content such as paper abstracts to assist with literature reviews and knowledge linking, as sketched below
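An illustrative sketch of the reference-retrieval feature in a research-assistance setting: ask a question against an injected abstract and inspect which memory indices were consulted. The `output_retrieved_memory_idx` flag and the returned field name are assumptions inferred from the feature description; consult the model card for the actual names.

```python
# Inject an abstract as the external memory (continuing the earlier sketches).
abstract = (
    "We present extended mind transformers, which retrieve and attend to "
    "an external cache of key-value pairs without fine-tuning."
)
model.clear_memories()
model.memory_ids = tokenizer(abstract, return_tensors="pt")["input_ids"]

question = "What does the method retrieve and attend to?"
inputs = tokenizer(question, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=40,
    output_retrieved_memory_idx=True,  # assumed flag for citation output
    return_dict_in_generate=True,
)
print(tokenizer.decode(outputs.sequences[0], skip_special_tokens=True))
# An assumed field such as outputs.retrieved_memory_idx would map generated
# tokens back to the memory indices used, supporting interpretability.
```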