
1. ERAT-DLoRA: Parameter-efficient tuning with enhanced range adaptation in time and depth aware dynamic LoRA

Luo D. | Zheng K.... - 《Neurocomputing》 - 2025, 614 (Jan. 21) - 1.1~1.12 - 12 pages

Abstract: Despite their potential, the industrial deployment of large language models (LLMs) is constrained by traditional fine-tuning procedures that are both resource-intensive and time-consuming. Low-Rank Adaptation (LoRA) has emerged as a pioneering methodology ...
Keywords: Fine-tuning | LoRA | Parameter-efficient
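
The LoRA idea referenced in this abstract can be illustrated with a minimal sketch (plain LoRA only, not the ERAT-DLoRA variant above): a frozen linear layer is augmented with a trainable low-rank update, so only the two small factor matrices are tuned. The class name and hyperparameters below are illustrative assumptions.

```python
# Minimal LoRA sketch (illustrative, not ERAT-DLoRA): y = W0·x + (alpha/r)·B·A·x,
# where W0 is frozen and only the low-rank factors A and B are trained.
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    def __init__(self, in_features, out_features, r=8, alpha=16):
        super().__init__()
        self.base = nn.Linear(in_features, out_features)
        for p in self.base.parameters():
            p.requires_grad_(False)                      # freeze pre-trained weights
        self.lora_A = nn.Parameter(torch.randn(r, in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(out_features, r))  # zero init: start at W0
        self.scaling = alpha / r

    def forward(self, x):
        return self.base(x) + (x @ self.lora_A.T @ self.lora_B.T) * self.scaling

layer = LoRALinear(768, 768)
print(layer(torch.randn(4, 768)).shape)  # torch.Size([4, 768])
```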

2. EC-PEFT: An Expertise-Centric Parameter-Efficient Fine-Tuning Framework for Large Language Models

Yimeng Zhang | Xuelin Cheng... - 《PRICAI 2024, Part II》 - Pacific Rim International Conference on Artificial Intelligence - 2025 - 296~307 - 12 pages

Abstract: ... parameter-efficient fine-tuning (PEFT) methods have gained increasing attention. However, fine-tuning methods often ... Expertise-Centric Parameter-Efficient Fine-Tuning (EC-PEFT) ... parameter-efficient tuning modules as experts. By ... parameter-efficient tuning modules: EC-LoRA and EC-Adapter. We ...
Keywords: Parameter-efficient fine-tuning | Mixture-of-Experts | Contrastive learning
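
The abstract's idea of treating parameter-efficient tuning modules as experts can be sketched, very loosely, as a gated mixture of low-rank updates over one frozen layer. This is a hypothetical illustration of the general mixture-of-experts-over-PEFT-modules pattern; EC-PEFT's actual EC-LoRA/EC-Adapter designs and its contrastive objective are not reproduced, and all names and shapes below are assumptions.

```python
# Hypothetical mixture-of-PEFT-experts layer: several low-rank "experts" share a
# frozen base projection, and a softmax gate mixes their outputs per example.
import torch
import torch.nn as nn

class MoELoRALinear(nn.Module):
    def __init__(self, dim, num_experts=4, r=8):
        super().__init__()
        self.base = nn.Linear(dim, dim)
        for p in self.base.parameters():
            p.requires_grad_(False)                     # pre-trained weights stay frozen
        self.A = nn.Parameter(torch.randn(num_experts, r, dim) * 0.01)
        self.B = nn.Parameter(torch.zeros(num_experts, dim, r))
        self.gate = nn.Linear(dim, num_experts)         # routing over expert modules

    def forward(self, x):                               # x: (batch, dim)
        w = torch.softmax(self.gate(x), dim=-1)             # (batch, E) expert weights
        low = torch.einsum('bd,erd->ber', x, self.A)        # down-project per expert
        delta = torch.einsum('ber,edr->bed', low, self.B)   # up-project per expert
        return self.base(x) + torch.einsum('be,bed->bd', w, delta)
```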

3. Leveraging Parameter-Efficient Fine-Tuning for Multilingual Abstractive Summarization

Jialun Shen | Yusong Wang - 《Natural Language Processing and Chinese Computing, Part III》 - CCF International Conference on Natural Language Processing and Chinese Computing - 2025 - 293~303 - 11 pages

Abstract: In recent years, parameter-efficient fine-tuning methods have gained attention as an alternative ... performance of two representative parameter-efficient fine-tuning ... and benefits of parameter-efficient fine-tuning ... encourage the adoption of parameter-efficient fine-tuning ...
Keywords: Transfer learning | Parameter-efficient fine-tuning | Multilingual abstractive summarization | Pre-trained language models

4. Reparameterization-Based Parameter-Efficient Fine-Tuning Methods for Large Language Models: A Systematic Survey

Zezhou Chen | Zhaoxiang Liu... - 《Natural Language Processing and Chinese Computing, Part III》 - CCF International Conference on Natural Language Processing and Chinese Computing - 2025 - 107~118 - 12 pages

Abstract: ... exploit the potential of LLMs, fine-tuning LLMs on ..., traditional full fine-tuning methods pose significant computational challenges, prompting the emergence of Parameter-Efficient Fine-Tuning (PEFT) methods, especially ... parameter complexity, GPU memory consumption, training ...
Keywords: Large language models | Parameter-efficient fine-tuning | Reparameterization

5. Introducing Routing Functions to Vision-Language Parameter-Efficient Fine-Tuning with Low-Rank Bottlenecks

Tingyu Qu | Tinne Tuytelaars... - 《Computer Vision - ECCV 2024, Part LXXXVIII》 - European Conference on Computer Vision - 2025 - 291~308 - 18 pages

Abstract: Mainstream parameter-efficient fine-tuning (PEFT) methods, such as LoRA or Adapter, project a model's hidden states to a lower dimension, allowing pre-trained models to adapt to new data through this low ... fine-tuning a pre-trained multimodal model such as ...
Keywords: Vision and language | Parameter-efficient fine-tuning | Low-rank approximation
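
The low-rank bottleneck this abstract describes, projecting hidden states down to a lower dimension and back up, is the standard Adapter pattern; a minimal sketch follows. The routing functions proposed by the paper are not shown, and the module name and bottleneck size are illustrative assumptions.

```python
# Minimal bottleneck Adapter sketch: down-project hidden states, apply a
# nonlinearity, up-project, and add a residual connection.
import torch
import torch.nn as nn

class Adapter(nn.Module):
    def __init__(self, dim, bottleneck=64):
        super().__init__()
        self.down = nn.Linear(dim, bottleneck)   # project hidden states to low dimension
        self.act = nn.ReLU()
        self.up = nn.Linear(bottleneck, dim)     # project back to model dimension
        nn.init.zeros_(self.up.weight)           # zero init: adapter starts as identity
        nn.init.zeros_(self.up.bias)

    def forward(self, h):
        return h + self.up(self.act(self.down(h)))

print(Adapter(768)(torch.randn(2, 768)).shape)   # torch.Size([2, 768])
```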

6. Deconfounded Causality-Aware Parameter-Efficient Fine-Tuning for Problem-Solving Improvement of LLMs

Ruoyu Wang | Xiaoxuan Li... - 《Web Information Systems Engineering - WISE 2024, Part IV》 - International Conference on Web Information Systems Engineering - 2025 - 161~176 - 16 pages

Abstract: Large Language Models (LLMs) have demonstrated remarkable efficiency in tackling various tasks based on human instructions, but studies reveal that they often ... novel parameter-efficient fine-tuning (PEFT) method to ... results to other fine-tuning methods. This demonstrates ...
Keywords: Parameter-efficient fine-tuning (PEFT) | Causality | Large language models

7. VL-MPFT: Multitask Parameter-Efficient Fine-Tuning for Visual-Language Pre-trained Models via Task-Adaptive Masking

Min Zhu | Guanming Liu... - 《Pattern Recognition and Computer Vision, Part V》 - Chinese Conference on Pattern Recognition and Computer Vision - 2025 - 379~394 - 16 pages

Abstract: Parameter-efficient fine-tuning (PEFT) has ... selectively updating a small subset for fine-tuning. However, existing parameter-efficient fine-tuning methods for VLMs ... fine-tuning accordingly. In this paper, the proposed ... parameter-efficient fine-tuning for visual-language pre-trained ...
Keywords: Multimodal learning | Multitask learning | Parameter-efficient fine-tuning
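
The "selectively updating a small subset" idea in this abstract can be illustrated by masking parameter gradients: all parameters stay in place, but gradients are zeroed outside the selected subset. The random mask and keep_ratio below are assumptions for illustration only; VL-MPFT's task-adaptive mask construction is not reproduced.

```python
# Hypothetical sketch of mask-based selective fine-tuning: gradient hooks zero out
# updates for all but a small, fixed subset of each parameter tensor.
import torch
import torch.nn as nn

def apply_update_mask(module: nn.Module, keep_ratio: float = 0.05):
    """Keep gradients only for a random subset of entries (assumed selection rule)."""
    for p in module.parameters():
        mask = (torch.rand_like(p) < keep_ratio).float()
        p.register_hook(lambda grad, m=mask: grad * m)   # zero masked gradients

model = nn.Linear(768, 768)
apply_update_mask(model, keep_ratio=0.05)
model(torch.randn(2, 768)).sum().backward()
# Roughly 5% of the weight-gradient entries remain nonzero after backward().
```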

10. Enhancing text understanding of decoder-based model by leveraging parameter-efficient fine-tuning method

Wasif Feroze | Shaohuan Cheng... - 《Neural computing & applications》 - 2025, 37(9) - 6899~6913 - 15 pages

Abstract: Machine reading comprehension (MRC) is a fundamental natural language understanding task in natural language processing, which aims to comprehend the text of a given passage and answer questions based on it. ... propose a parameter-efficient fine-tuning framework that ...
Keywords: Natural language understanding | MRC question answering | Parameter-efficient fine-tuning | Large language models
Search query: Parameter-Efficient Tuning