


Xiangnan He (editor); Zhaochun Ren (editor); Ruiming Tang (editor) - Information Retrieval

Information Retrieval 30th China Conference, CCIR 2024, Wuhan, China, October 18–20, 2024, Revised Selected Papers





Availability: Normally available within 15 days


Price: €54.98
Nice Price: €52.23
Discount: 5%



This product qualifies for FREE SHIPPING when the Corriere Veloce option is selected at checkout.


Also payable with the Carta della cultura giovani e del merito, Carta della Cultura, and Carta del Docente.





Details

Genre: Book
Language: English
Publisher: Springer
Publication: 02/2025





Description

This book constitutes the refereed proceedings of the 30th China Conference on Information Retrieval, CCIR 2024, held in Wuhan, China, during October 18–20, 2024.

The 11 full papers presented in this volume were carefully reviewed and selected from 26 submissions. As the flagship conference of CIPS, CCIR focuses on the development of China’s internet industry and provides a broad platform for the exchange of the latest academic and technological achievements in the field of information retrieval.





Contents

- Play to Your Strengths: Collaborative Intelligence of Conventional Recommender Models and Large Language Models
- A Dual-Aligned Model for Multimodal Recommendation
- CASINet: A Context-Aware Social Interaction Rumor Detection Network
- A Claim Decomposition Benchmark for Long-form Answer Verification
- Dual-granularity Hierarchical Fusion Network for Multimodal Humor Recognition on Memes
- Exploring the Potential of Dimension Reduction in Building Efficient Dense Retrieval Systems
- Relation Extraction Model Based on Overlap Rules and Abductive Learning
- Multi-task Instruction Tuning for Temporal Question Answering over Knowledge Graphs
- On the Capacity of Citation Generation by Large Language Models
- Are Large Language Models More Honest in Their Probabilistic or Verbalized Confidence?
- QUITO: Accelerating Long-Context Reasoning through Query-Guided Context Compression











Other Information

ISBN:

9789819617098

Condition: New
Series: Lecture Notes in Computer Science
Dimensions: 235 x 155 mm
Format: Paperback
Illustration notes: X, 149 p. 32 illus., 31 illus. in color.
Arabic-numbered pages: 149
Roman-numbered pages: x

