NTU-Exam Board



Course name: Information Retrieval and Extraction
Course type: Elective, Department of Computer Science and Information Engineering
Instructor: 陳信希 (Hsin-Hsi Chen)
College: College of Electrical Engineering and Computer Science
Department: Department of Computer Science and Information Engineering
Exam date (Y/M/D): 2022/01/06
Time limit (minutes): 180
Questions:

1. The following lists 5 tasks and 10 evaluation metrics.
   Tasks: Web Search, Question Answering, Named Entity Recognition, Relation Extraction, Entity Retrieval
   Evaluation Metrics: Accuracy, Precision, Recall, F1, F0.5, F2, MAP, NDCG, MRR, Kendall Tau Coefficient
   (a) Please discuss and explain which evaluation metrics are suitable for each task. If there are no suitable metrics for a task, please give your suggestions. (25 points)
   (b) If there are no suitable tasks for an evaluation metric, please discuss in what situation the evaluation metric would be adopted. (10 points)
   [A toy worked example of several of these metrics appears at the end of the post.]

2. Long documents may contain a mixture of topics, and query matches may be spread over the whole document. Please describe how a neural document ranking model aggregates the relevant matches from different parts of a long document. (10 points)
   [A passage-scoring sketch for this question appears at the end of the post.]

3. The Knowledge Base Acceleration (KBA) task is defined as follows. The task aims to filter a time-ordered corpus for documents that are highly relevant to a predefined list of entities. A total of 27 people and 2 organizations are selected. A stream corpus spanning 4,973 consecutive hours is constructed; it contains over 400M documents, and each document has a timestamp that places it in the stream. The 29 target entities were mentioned only infrequently in the corpus. Judgments for documents from before the stream-corpus construction time were provided as training data for filtering documents from the remaining hours. You are instructed to apply your system to each hourly directory of corpus data in chronological order. For each hour, before processing the next hour, systems are expected to emit a list of assertions connecting documents and entities. The goal is to identify only central-rated documents.
   (a) Please show your idea for dealing with the KBA task. (10 points)
   (b) Please discuss how this task is related to Knowledge Base Completion (KBC), which involves discovering missing facts. (5 points)
   [A schematic streaming-filter sketch appears at the end of the post.]

4. A knowledge base is useful for document retrieval. Please explain the latent factor modeling approach and the deep learning approach to introducing a knowledge base to enhance the performance of document retrieval. (14 points)

5. An entity relationship explanation is a textual description of how a given pair of entities is related. Please show how to deal with this task by using a knowledge graph. (6 points)
   [A toy knowledge-graph path sketch appears at the end of the post.]

6. Traditional IE predicts relations from a predefined set such as "BirthPlace" and "Spouse." By contrast, open information extraction (Open IE) aims to extract triples that consist of a pair of argument phrases and their relation phrase from textual data. For example, one can extract the following two triples from the sentence "Albert Einstein was born in Ulm and died in Princeton.":
   (Albert Einstein, was born in, Ulm)
   (Albert Einstein, died in, Princeton)
   Please answer the following questions about Open IE.
   (a) Compared with traditional IE, give an advantage and a disadvantage of Open IE. (6 points)
   (b) Give two downstream applications of Open IE. (6 points)
   (c) Given a collection of news articles, please provide a feasible method to construct an Open IE system without the need for labeled data. (8 points)
   [A toy extraction sketch follows this question.]
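Illustrative sketch (not part of the exam): a minimal, pattern-based Open IE extractor in Python that reproduces the two example triples above. The verb-phrase list and the clause-splitting heuristic are toy assumptions invented for this example; real Open IE systems rely on POS-tag or dependency patterns learned over large corpora rather than a fixed regular expression.

    import re

    # Toy Open IE sketch: extract (argument, relation phrase, argument) triples
    # from one simple sentence using a hand-written relation-phrase pattern.
    VERB_PHRASE = re.compile(
        r"\b(was born in|died in|was born|died|lives in|works at)\b",
        re.IGNORECASE,
    )

    def extract_triples(sentence):
        """Return (arg1, relation phrase, arg2) triples from one simple sentence."""
        sentence = sentence.rstrip(".")
        first = VERB_PHRASE.search(sentence)
        if first is None:
            return []
        # Naive assumption: the text before the first relation phrase is the
        # shared subject, and coordinated clauses are separated by " and ".
        subject = sentence[:first.start()].strip()
        triples = []
        for clause in sentence[first.start():].split(" and "):
            m = VERB_PHRASE.search(clause)
            if m is None:
                continue
            arg2 = clause[m.end():].strip()
            if arg2:
                triples.append((subject, m.group(0), arg2))
        return triples

    if __name__ == "__main__":
        for triple in extract_triples("Albert Einstein was born in Ulm and died in Princeton."):
            print(triple)

Running this prints the two triples from the question, ('Albert Einstein', 'was born in', 'Ulm') and ('Albert Einstein', 'died in', 'Princeton'); any sentence outside the hard-coded pattern list is simply skipped.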
7. The following lists the presentation topics presented by the team members.
   Team 1: Learning an End-to-End Structure for Retrieval in Large-Scale Recommendations
   Team 2:
     1. EmbedKGQA: Improving Multi-hop Question Answering over Knowledge Graphs using Knowledge Base Embeddings
     2. TransferNet: An Effective and Transparent Framework for Multi-hop Question Answering over Relation Graph
     3. Improving Multi-hop Knowledge Base Question Answering by Learning Intermediate Supervision Signals
   Team 3: Inductive Topic Variational Graph Auto-Encoder for Text Classification
   Team 4: Dense Passage Retrieval for Open-Domain Question Answering
   Team 5: "Did you buy it already?", Detecting Users Purchase-State From Their Product-Related Questions
   Team 6: UnitedQA: A Hybrid Approach for Open Domain Question Answering
   Team 7:
     1. A Reinforcement Learning Framework for Relevance Feedback
     2. Generating Images Instead of Retrieving Them: Relevance Feedback on Generative Adversarial Networks
   Team 8:
     1. AutoDebias: Learning to Debias for Recommendation
     2. Causal Intervention for Leveraging Popularity Bias in Recommendation
   Team 9: Self-Supervised Reinforcement Learning for Recommender Systems
   Team 10:
     1. Multi-behavior Recommendation with Graph Convolutional Networks
     2. Graph Heterogeneous Multi-Relational Recommendation
   Team 11: Self-supervised Graph Learning for Recommendation
   Team 12: Personalized Search-based Query Rewrite System for Conversational AI
   Team 13: Group based Personalized Search by Integrating Search Behaviour and Friend Network
   Team 14: Answering Any-hop Open-domain Questions with Iterative Document Reranking
   Team 15:
     1. Time Matters: Sequential Recommendation with Complex Temporal Information
     2. Motif-aware Sequential Recommendation
   Team 16:
     1. Recommending Podcasts for Cold-Start Users Based on Music Listening and Taste
     2. Fairness among New Items in Cold Start Recommender Systems
     3. A Heterogeneous Graph Neural Model for Cold-start Recommendation
   Team 17:
     1. Estimation-Action-Reflection: Towards Deep Interaction Between Conversational and Recommender Systems
     2. Time Interval Aware Self-Attention for Sequential Recommendation
   Please write down your team id first, and then select the most exciting topic you learned from another team. Write down that team's id and briefly specify the idea you learned from their presentation. (10 points)

--
Episode 01: It sounds like something I heard in class
Episode 02: That is truly despair-inducing
Episode 03: There is nothing left to hope for
Episode 04: Failing courses and the "1/2 rule" both really exist
Episode 05: How could it possibly be an all-pass
Episode 06: There is definitely something wrong with this exam paper
Episode 07: Can you face your real score?
Episode 08: I really am an idiot
Episode 09: With grades like these, the professor will never let me pass
Episode 10: I will never rely on past exams again
Episode 11: The make-up exam that remains at the end
Episode 12: My most beloved credits
--
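Illustrative sketch for Question 1 (not part of the exam): a toy Python computation of Precision, Recall, F1, F0.5, reciprocal rank, average precision (the per-query quantity behind MAP), and nDCG on a single made-up ranked list with made-up relevance judgments. MRR and MAP are simply the means of RR and AP over a query set; one query is shown here.

    import math

    # One made-up query: system output in rank order plus judged-relevant docs.
    ranked = ["d3", "d1", "d7", "d2", "d5"]
    relevant = {"d1", "d2", "d9"}
    graded = {"d1": 3, "d2": 1, "d9": 2}   # graded relevance, used only for nDCG

    def precision_recall_fbeta(retrieved, relevant, beta=1.0):
        """Set-based Precision, Recall, and F-beta over the retrieved list."""
        tp = len(set(retrieved) & relevant)
        p = tp / len(retrieved) if retrieved else 0.0
        r = tp / len(relevant) if relevant else 0.0
        if p == 0 and r == 0:
            return p, r, 0.0
        f = (1 + beta ** 2) * p * r / (beta ** 2 * p + r)
        return p, r, f

    def reciprocal_rank(ranked, relevant):
        """1 / rank of the first relevant document (0 if none retrieved)."""
        for i, d in enumerate(ranked, start=1):
            if d in relevant:
                return 1.0 / i
        return 0.0

    def average_precision(ranked, relevant):
        """Average of precision values at the ranks of relevant documents."""
        hits, total = 0, 0.0
        for i, d in enumerate(ranked, start=1):
            if d in relevant:
                hits += 1
                total += hits / i
        return total / len(relevant) if relevant else 0.0

    def ndcg(ranked, graded, k=5):
        """Discounted cumulative gain at k, normalized by the ideal ranking."""
        dcg = sum(graded.get(d, 0) / math.log2(i + 1)
                  for i, d in enumerate(ranked[:k], start=1))
        ideal = sorted(graded.values(), reverse=True)[:k]
        idcg = sum(g / math.log2(i + 1) for i, g in enumerate(ideal, start=1))
        return dcg / idcg if idcg > 0 else 0.0

    if __name__ == "__main__":
        p, r, f1 = precision_recall_fbeta(ranked, relevant, beta=1.0)
        _, _, f05 = precision_recall_fbeta(ranked, relevant, beta=0.5)
        print(f"P={p:.2f}  R={r:.2f}  F1={f1:.2f}  F0.5={f05:.2f}")
        print(f"RR={reciprocal_rank(ranked, relevant):.2f}  "
              f"AP={average_precision(ranked, relevant):.2f}  "
              f"nDCG@5={ndcg(ranked, graded):.2f}")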

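Illustrative sketch for Question 2 (not part of the exam): one common way to handle long documents is to split them into overlapping passages, score each passage against the query, and aggregate the passage scores (for example, max pooling or the sum of the top-k scores) into a single document score. The word-overlap scorer below is only a stand-in for a neural ranker such as a BERT cross-encoder; the function names and the toy document are invented.

    # Sketch of passage-level aggregation for long-document ranking.
    # `score_passage` is a placeholder; in a neural ranking model it would be the
    # relevance score produced by a cross-encoder for the (query, passage) pair.

    def split_into_passages(doc_tokens, size=100, stride=50):
        """Overlapping token windows, so matches near a window boundary are not lost."""
        last_start = max(len(doc_tokens) - size, 0)
        return [doc_tokens[i:i + size] for i in range(0, last_start + 1, stride)]

    def score_passage(query_tokens, passage_tokens):
        """Placeholder relevance score: number of query terms present in the passage."""
        passage = set(passage_tokens)
        return sum(1 for t in query_tokens if t in passage)

    def score_document(query_tokens, doc_tokens, k=3):
        """Aggregate passage scores into a document score: max and sum of top-k."""
        scores = [score_passage(query_tokens, p)
                  for p in split_into_passages(doc_tokens)]
        top_k = sorted(scores, reverse=True)[:k]
        return {"max": max(scores), "sum_top_k": sum(top_k)}

    if __name__ == "__main__":
        query = "neural document ranking".split()
        doc = ("long documents mix several topics " * 40 +
               "a neural ranking model scores each passage " * 5 +
               "and aggregates document level evidence " * 40).split()
        print(score_document(query, doc))

Max pooling rewards a single strongly matching passage, while summing the top-k passages rewards evidence spread across several parts of the document; both are common aggregation choices.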

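Illustrative sketch for Question 3 (not part of the exam): a schematic streaming filter that walks hourly batches in chronological order, scores each document against per-entity profiles (which in practice would be built from the training judgments), and emits (document, entity) assertions for the hour before touching the next one. The entities, profiles, documents, and threshold below are all invented placeholders, not the real KBA target entities.

    # Schematic KBA-style filtering loop: hours are processed in chronological
    # order, and the assertions for each hour are emitted before the next hour.

    ENTITY_PROFILES = {
        # entity -> surface names / related terms (invented toy profiles)
        "Jane Doe (person)": {"jane doe", "doe research lab"},
        "Acme Analytics (organization)": {"acme analytics", "acme labs"},
    }

    def score(doc_text, profile):
        """Toy centrality score: fraction of profile terms mentioned in the document."""
        text = doc_text.lower()
        return sum(term in text for term in profile) / len(profile)

    def process_stream(hourly_batches, threshold=0.5):
        """hourly_batches: iterable of (hour, [(doc_id, doc_text), ...]) in time order."""
        for hour, docs in hourly_batches:
            assertions = []
            for doc_id, text in docs:
                for entity, profile in ENTITY_PROFILES.items():
                    s = score(text, profile)
                    if s >= threshold:          # keep only likely central documents
                        assertions.append((doc_id, entity, round(s, 2)))
            # In the real task, this hour's assertions must be emitted before
            # any document from a later hour is read.
            yield hour, assertions

    if __name__ == "__main__":
        stream = [
            ("hour-0000", [("d1", "Jane Doe presented results from the Doe Research Lab."),
                           ("d2", "Local weather report.")]),
            ("hour-0001", [("d3", "Acme Analytics announced a partnership with Acme Labs.")]),
        ]
        for hour, assertions in process_stream(stream):
            print(hour, assertions)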

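Illustrative sketch for Question 5 (not part of the exam): one simple knowledge-graph approach is to find a short path between the two entities and verbalize the relations along it as the explanation. The mini graph and the output phrasing below are invented for illustration.

    from collections import deque

    # Tiny invented knowledge graph plus a BFS that turns the shortest path
    # between two entities into a textual relationship explanation.
    TRIPLES = [
        ("Albert Einstein", "born in", "Ulm"),
        ("Ulm", "located in", "Germany"),
        ("Albert Einstein", "won", "Nobel Prize in Physics"),
        ("Marie Curie", "won", "Nobel Prize in Physics"),
    ]

    def build_graph(triples):
        """Adjacency list; edges are stored in both directions so paths may traverse them either way."""
        graph = {}
        for subj, rel, obj in triples:
            graph.setdefault(subj, []).append((rel, obj, False))   # forward edge
            graph.setdefault(obj, []).append((rel, subj, True))    # inverse edge
        return graph

    def explain(src, dst, triples=TRIPLES):
        """Return a textual description of one shortest path from src to dst."""
        graph = build_graph(triples)
        queue, seen = deque([(src, [])]), {src}
        while queue:
            node, path = queue.popleft()
            if node == dst:
                return "; ".join(
                    f"{a} -[{rel}{' (inverse)' if inv else ''}]-> {b}"
                    for a, rel, inv, b in path
                ) or f"{src} and {dst} are the same entity"
            for rel, nxt, inv in graph.get(node, []):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, path + [(node, rel, inv, nxt)]))
        return f"No path found between {src} and {dst}."

    if __name__ == "__main__":
        print(explain("Albert Einstein", "Germany"))
        print(explain("Albert Einstein", "Marie Curie"))

The path edges ("born in", "located in", shared "won" relation) are what a real system would then feed into a template or a text generator to produce the final explanation sentence.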
※ Origin: PTT (ptt.cc), from: 111.249.65.236 (Taiwan)
※ Article URL: https://webptt.com/m.aspx?n=bbs/NTU-Exam/M.1767060936.A.BB4.html
1F → rod24574575: Archived to the CSIE collection! 12/30 22:55







