If you're fine-tuning IBM Granite 4.0-H-Micro (or similar hybrid Mamba-Transformer models) with LoRA, the standard target_modules configuration silently skips 90% of the model. This repo documents the ...
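To see why the default targeting misses so much of a hybrid model, here is a minimal, self-contained sketch. The layer layout and module names (`self_attn.q_proj`, `mamba.in_proj`, the 1-in-10 attention ratio) are illustrative assumptions, not the exact Granite 4.0-H-Micro names — the point is only that name-based `target_modules` matching (the default `q_proj`/`k_proj`/`v_proj`/`o_proj` set) touches the few attention blocks and skips every Mamba projection:

```python
# Hypothetical module layout for a hybrid Mamba-Transformer model.
# Assumption: 40 layers, 1 attention block per 10 layers, the rest Mamba.
modules = []
for i in range(40):
    if i % 10 == 9:  # sparse attention layers
        modules += [f"model.layers.{i}.self_attn.{p}"
                    for p in ("q_proj", "k_proj", "v_proj", "o_proj")]
    else:            # Mamba mixer layers (names are illustrative)
        modules += [f"model.layers.{i}.mamba.{p}"
                    for p in ("in_proj", "out_proj")]

# PEFT-style suffix matching with the common attention-only target set.
default_targets = {"q_proj", "k_proj", "v_proj", "o_proj"}
hit = [m for m in modules if m.rsplit(".", 1)[-1] in default_targets]

print(f"{len(hit)}/{len(modules)} linear modules matched")  # → 16/88 matched
```

Under these illustrative assumptions, roughly 80% of the linear modules get no adapter at all; adding the Mamba projection names (whatever they are in the actual checkpoint) to `target_modules` is what closes the gap.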