Capytessborosyi 34B 200K DARE Ties
This is a 34B-parameter large language model built on the Yi-34B-200K architecture (200K-token context) and merged with mergekit using the DARE Ties method, combining three fine-tunes: Nous-Capybara-34B, Tess-M-v1.3, and airoboros-3_1-yi-34b-200k. A sketch of such a merge configuration is shown below.
Large Language Model
Transformers · English
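
For illustration, the sketch below shows roughly how a DARE Ties merge of these three fine-tunes onto the Yi-34B-200K base could be specified and run with mergekit. The repository ids, the weight and density values, and the file and output paths are assumptions made for the example, not the recipe actually used to produce this model.

```python
# A minimal sketch of a DARE Ties merge recipe for mergekit; the repo ids and the
# weight/density values below are illustrative assumptions, not this model's config.
import subprocess
from pathlib import Path

MERGE_CONFIG = """\
merge_method: dare_ties
base_model: 01-ai/Yi-34B-200K
models:
  - model: NousResearch/Nous-Capybara-34B          # assumed repo id
    parameters:
      weight: 0.33    # illustrative contribution of this fine-tune
      density: 0.5    # fraction of delta weights kept after DARE's random drop
  - model: migtissera/Tess-M-v1.3                   # assumed repo id
    parameters:
      weight: 0.33
      density: 0.5
  - model: bhenrym14/airoboros-3_1-yi-34b-200k      # assumed repo id
    parameters:
      weight: 0.33
      density: 0.5
dtype: bfloat16
"""

# Write the config and run the merge through mergekit's command-line entry point
# (installed with `pip install mergekit`).
config_path = Path("dare_ties_yi34b.yml")
config_path.write_text(MERGE_CONFIG)
subprocess.run(
    ["mergekit-yaml", str(config_path), "./capytessborosyi-34b-200k"],
    check=True,
)
```

In a DARE Ties merge, density sets the fraction of each fine-tune's delta weights that survive DARE's random drop-and-rescale step, after which the TIES sign-consensus procedure resolves conflicts between the remaining deltas before they are summed onto the base model; weight scales each contributor's share of that sum.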