Titlebook: Artificial Intelligence in HCI; 5th International Conference; Helmut Degen, Stavroula Ntoa (Eds.); Conference proceedings 2024

Thread starter: Diverticulum
51#
Posted on 2025-3-30 08:46:07
Enhancing Large Language Models Through External Domain Knowledge
…step, the artifact is developed based on requirements deduced from the literature. Finally, the functionality of the artifact is demonstrated as a proof of concept in a case study. The research contributes an initial approach to effective and grounded knowledge transfer, which minimizes the risk of hallucination in LLM-generated content.
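The chapter's actual artifact is not reproduced in this abstract fragment. Purely as a loose illustration of the general idea it alludes to, grounding an LLM prompt in retrieved external domain knowledge so the model is asked to answer only from supplied context, here is a minimal Python sketch; the document store, overlap scoring, and prompt wording are invented placeholders, not the authors' design.

```python
# Minimal sketch (not the chapter's artifact): retrieve domain snippets and
# build a prompt that restricts the LLM to the supplied context, reducing
# the room for hallucination. All data and wording here are illustrative.

def retrieve(query: str, documents: list[str], k: int = 2) -> list[str]:
    """Rank documents by naive keyword overlap with the query."""
    q_terms = set(query.lower().split())
    ranked = sorted(
        documents,
        key=lambda d: len(q_terms & set(d.lower().split())),
        reverse=True,
    )
    return ranked[:k]


def build_grounded_prompt(query: str, context: list[str]) -> str:
    """Assemble a prompt that limits the answer to the given context."""
    joined = "\n".join(f"- {c}" for c in context)
    return (
        "Answer the question using ONLY the context below. "
        "If the context is insufficient, say so instead of guessing.\n\n"
        f"Context:\n{joined}\n\nQuestion: {query}\nAnswer:"
    )


if __name__ == "__main__":
    domain_docs = [
        "Form FX-12 must be filed within 30 days of contract signature.",
        "Maintenance for pump type B is due every 2,000 operating hours.",
    ]
    question = "How often does pump type B need maintenance?"
    prompt = build_grounded_prompt(question, retrieve(question, domain_docs))
    print(prompt)  # pass this prompt to whichever LLM API you use
```

In a real system the keyword-overlap retriever would typically be replaced by embedding-based search, but the grounding pattern of the prompt stays the same.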
52#
Posted on 2025-3-30 14:39:07
53#
Posted on 2025-3-30 16:54:35
54#
Posted on 2025-3-30 21:06:38
You Got the Feeling: Attributing Affective States to Dialogical Social Robots
…adoption of a Large Language Model (i.e., ChatGPT in our case), whilst the simplest one has been based on a manual simplification of the generated text. We report the results obtained from a number of tests and standardized scales and highlight some possible future directions.
55#
Posted on 2025-3-31 04:53:15
Conference proceedings 2024
…Artificial Intelligence in HCI, AI-HCI 2024, held as part of the 26th International Conference, HCI International 2024, which took place in Washington, DC, USA, during June 29 - July 4, 2024. The total of 1271 papers and 309 posters included in the HCII 2024 proceedings was carefully reviewed and selected from 5108 submissions…
56#
Posted on 2025-3-31 07:40:50
https://doi.org/10.1007/978-3-031-60615-1
Keywords: Artificial Intelligence in HCI; Human-Centered Artificial Intelligence; Dialogue systems; Language mode…
57#
Posted on 2025-3-31 10:40:49
58#
Posted on 2025-3-31 14:28:39
Artificial Intelligence in HCI; 978-3-031-60615-1; Series ISSN 0302-9743; Series E-ISSN 1611-3349
59#
Posted on 2025-3-31 19:01:36
Parenting Roles and Relationships
…s that leverage LLMs: (1) relation extraction via in-context few-shot learning with LLMs, (2) enhancing sequence-to-sequence (seq2seq)-based fully fine-tuned relation extraction with CoT reasoning explanations generated by LLMs, (3) enhancing classification-based fully fine-tuned relation extraction…
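For orientation only, the following is a minimal sketch of what approach (1) from this fragment, in-context few-shot relation extraction with an LLM, can look like in practice. The relation schema, example sentences, and prompt layout are invented for illustration and are not taken from the chapter.

```python
# Minimal sketch of in-context few-shot relation extraction: compose a
# prompt with a few labeled (sentence, triple) examples and ask the model
# to complete the triple for a new sentence. Examples are illustrative.

FEW_SHOT_EXAMPLES = [
    ("Marie Curie was born in Warsaw.", "(Marie Curie, born_in, Warsaw)"),
    ("Tesla was founded by Elon Musk.", "(Tesla, founded_by, Elon Musk)"),
]


def build_re_prompt(sentence: str) -> str:
    """Compose a few-shot prompt asking for a (head, relation, tail) triple."""
    parts = ["Extract the relation triple from each sentence."]
    for text, triple in FEW_SHOT_EXAMPLES:
        parts.append(f"Sentence: {text}\nTriple: {triple}")
    parts.append(f"Sentence: {sentence}\nTriple:")
    return "\n\n".join(parts)


if __name__ == "__main__":
    prompt = build_re_prompt("Ada Lovelace worked with Charles Babbage.")
    print(prompt)  # send to an LLM; parse the returned triple downstream
```

The fine-tuning variants (2) and (3) mentioned in the fragment would instead train seq2seq or classification models, optionally augmenting the training data with LLM-generated CoT explanations; they are not sketched here.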