• Title/Summary/Keyword: Pointer Network

Bidirectional Stack Pointer Network for Korean Dependency Parsing (Bidirectional Stack Pointer Network를 이용한 한국어 의존 파싱)

  • Hong, Seung-Yean; Na, Seung-Hoon; Shin, Jong-Hoon; Kim, Young-Kil
    • Annual Conference on Human and Language Technology / 2018.10a / pp.19-22 / 2018
  • This paper proposes the Bi-Stack Pointer Network, an extension of the existing Stack Pointer Network dependency parsing model. The Stack Pointer Network adds an internal stack to the original Pointer Network and reads the whole sentence to build a dependency tree; the stack is maintained through a depth-first traversal of the tree, and the Pointer Network selects a child of the word at the top of the stack (the head). The proposed model adds, via biaffine attention, a child-to-head prediction direction to the head-to-child prediction of the original Stack Pointer Network, enabling bidirectional prediction. In experiments, the proposed Bi-Stack Pointer Network achieved a UAS of 91.53% and an LAS of 90.93%, improving on the previous best performance.

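The biaffine attention mentioned in the abstract above scores every (head, dependent) pair of encoder states. The following is a minimal numpy sketch of a generic biaffine scorer, not the authors' implementation; the function name, dimensions, and toy inputs are illustrative assumptions.

```python
import numpy as np

def biaffine_scores(heads, deps, U, W, b):
    """Score every (head, dependent) pair with a biaffine form.

    heads: (n, d) candidate head vectors
    deps:  (n, d) candidate dependent vectors
    U:     (d, d) bilinear weight
    W:     (2*d,) linear weight over the concatenated pair
    b:     scalar bias
    Returns an (n, n) matrix whose [i, j] entry scores head i for dependent j.
    """
    d = heads.shape[1]
    bilinear = heads @ U @ deps.T                                # (n, n)
    linear = (heads @ W[:d])[:, None] + (deps @ W[d:])[None, :]  # (n, n)
    return bilinear + linear + b

# toy usage: 4 words with 8-dimensional encoder states
rng = np.random.default_rng(0)
n, d = 4, 8
enc = rng.normal(size=(n, d))
scores = biaffine_scores(enc, enc, rng.normal(size=(d, d)),
                         rng.normal(size=2 * d), 0.0)
print(scores.argmax(axis=0))   # most plausible head index for each word
```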

Stack-Pointer Network for Korean Dependency Parsing (Stack-Pointer Network를 이용한 한국어 의존 구문 분석)

  • Cha, Da-Eun; Lee, Dong-Yub; Lim, Heui-Seok
    • Annual Conference on Human and Language Technology / 2018.10a / pp.685-688 / 2018
  • Dependency parsing, which analyzes the dependency relations between the words of a natural-language sentence, is one of the core technologies required by many natural-language understanding tasks. This study extends the input word representation of the encoder of the existing Stack-Pointer Network, which uses word and character features, and proposes a dependency parsing model augmented with syllable-tag, morpheme, and morpheme POS features so that it better suits morphologically rich languages such as Korean. In experiments, the proposed model achieved a UAS of 90.58% and an LAS of 88.35% on the Sejong treebank converted to dependency structures, and a UAS of 84.69% and an LAS of 82.02% on the evaluation data of the 2018 Korean Information Processing System Competition. In addition, the proposed model outperformed the existing Stack-Pointer Network on dependencies in long sentences, dependencies with a long distance between dependent and head, and dependencies whose dependent consists of many morphemes.

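One plausible way to combine the word, morpheme, POS, and syllable-level features described above is to pool each sub-word view and concatenate it with the word embedding. The sketch below is an assumption for illustration only: it uses simple mean pooling, whereas the paper's encoder may compose these features differently.

```python
import numpy as np

def word_representation(word_vec, morph_vecs, pos_vecs, syl_vecs):
    """Compose one encoder input vector from several feature views.

    word_vec:   (dw,)    word embedding
    morph_vecs: (m, dm)  embeddings of the word's morphemes
    pos_vecs:   (m, dp)  embeddings of those morphemes' POS tags
    syl_vecs:   (s, ds)  syllable(-tag) embeddings
    Each sub-word view is pooled (mean here) and concatenated to the word embedding.
    """
    pooled = [v.mean(axis=0) for v in (morph_vecs, pos_vecs, syl_vecs)]
    return np.concatenate([word_vec, *pooled])

# toy usage: one word with 3 morphemes and 5 syllables
rng = np.random.default_rng(1)
x = word_representation(rng.normal(size=16),
                        rng.normal(size=(3, 8)),
                        rng.normal(size=(3, 4)),
                        rng.normal(size=(5, 8)))
print(x.shape)   # (36,) = 16 + 8 + 4 + 8
```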

FPGA Implementation of a Pointer Interpreter for SDH/SONET Network Synchronization (SDH와 SONET망의 동기화를 위한 포인터 해석기의 FPGA 구현)

  • 이상훈; 박남천; 신위재
    • Journal of the Institute of Convergence Signal Processing / v.5 no.3 / pp.230-235 / 2004
  • This paper describes an FPGA implementation of a pointer interpreter that can support the synchronization of SDH (or SONET)-based transmission networks. The pointer interpreter consists of a pointer-word extractor and a pointer-word interpreter. The pointer-word extractor, composed of a mod-6480 counter, a shift register, and a pointer synchronizing block, finds the H1 and H2 pointer words in a 51.84 Mb/s AU-3/STS-1 data frame and synchronizes them to 6.48 Mb/s by dividing by 8. Based on the extracted pointer word, the pointer-word interpreter analyzes pointer states such as LOP, AIS, and NORM according to a pointer state-transition algorithm. It consists of a majority vote, a pointer-word valid/invalid check, a pointer justification, and a pointer state check. Simulation results for a Xilinx Virtex XCV200PQ240 FPGA chip show exact pointer-word extraction and correct decisions on the pointer status based on the extracted pointer word. The proposed pointer interpreter is suitable for pointer interpretation of 155 Mb/s STM-1/STS-3 frames.

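As a rough illustration of what a pointer state-transition algorithm decides, the sketch below models NORM/AIS/LOP transitions in software. It is a drastically simplified stand-in for the FPGA logic: the AIS encoding, the invalid-word threshold, and the omission of majority voting and justification handling are all simplifying assumptions, not the standard or the paper's design.

```python
from enum import Enum

class PtrState(Enum):
    NORM = "NORM"   # normal pointer operation
    AIS = "AIS"     # alarm indication signal
    LOP = "LOP"     # loss of pointer

AIS_WORD = 0xFFFF   # stand-in for the all-ones pointer word

def interpret(pointer_words, invalid_limit=8):
    """Return the state after each received pointer word.

    A valid AU-3/STS-1 offset (0..782) moves to NORM, the all-ones pattern
    moves to AIS, and `invalid_limit` consecutive invalid words move to LOP.
    Transitions and thresholds are simplified for illustration.
    """
    state, invalid_run, history = PtrState.LOP, 0, []
    for word in pointer_words:
        if word == AIS_WORD:
            state, invalid_run = PtrState.AIS, 0
        elif 0 <= word <= 782:
            state, invalid_run = PtrState.NORM, 0
        else:
            invalid_run += 1
            if invalid_run >= invalid_limit:
                state = PtrState.LOP
        history.append(state)
    return history

print([s.value for s in interpret([10, 10, AIS_WORD] + [9999] * 8)])
```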

Laser pointer detection using neural network for human computer interaction (인간-컴퓨터 상호작용을 위한 신경망 알고리즘기반 레이저포인터 검출)

  • Jung, Chan-Woong; Jeong, Sung-Moon; Lee, Min-Ho
    • Journal of Korea Society of Industrial Information Systems / v.16 no.1 / pp.21-30 / 2011
  • In this paper, an effective method is proposed for detecting a laser pointer spot on the screen using a neural network algorithm, for use in a human-computer interaction system. The neural network is trained on patches without a laser pointer taken from input camera images; the trained network then generates output values for input patches from a camera image. When a small variation is perceived in the input camera image, the method amplifies the variation and detects the laser pointer spot. The proposed system consists of a laser pointer, a low-priced web camera, and an image processing program, and it can detect the laser spot even when the background on the computer monitor has a color similar to the spot. The proposed technique can therefore contribute to improving the performance of human-computer interaction systems.

Coreference Resolution using Hierarchical Pointer Networks (계층적 포인터 네트워크를 이용한 상호참조해결)

  • Park, Cheoneum; Lee, Changki
    • KIISE Transactions on Computing Practices / v.23 no.9 / pp.542-549 / 2017
  • Sequence-to-sequence models, and pointer networks like them, suffer from performance degradation when the input is composed of multiple sentences or when the input is long. To solve this problem, this paper proposes a hierarchical pointer network model that encodes input sequences composed of several sentences at both the word level and the sentence level. We propose a hierarchical pointer network based coreference resolution model that resolves coreference for all mentions. Experimental results show that the proposed model achieves a precision of 87.07%, a recall of 65.39%, and a CoNLL F1 of 74.61%, an improvement of 21.83% compared to an existing rule-based model.
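The two-level encoding described above can be pictured as: encode the words of each sentence, pool them into sentence vectors, then treat those as a second sequence. The toy sketch below uses averaging in place of the recurrent and attention layers a real hierarchical pointer network would use; names and dimensions are illustrative assumptions.

```python
import numpy as np

def encode_hierarchically(document, embed):
    """Toy two-level encoding: word level, then sentence level.

    document: list of sentences, each a list of word strings
    embed:    dict mapping each word to a fixed-size vector
    Returns per-sentence word state matrices and a sentence state matrix.
    A real hierarchical pointer network would use RNN/attention layers
    instead of the mean pooling used here.
    """
    word_states = [np.stack([embed[w] for w in sent]) for sent in document]
    sentence_states = np.stack([ws.mean(axis=0) for ws in word_states])
    return word_states, sentence_states

rng = np.random.default_rng(2)
doc = [["she", "saw", "him"], ["he", "smiled"]]
vocab = {w: rng.normal(size=16) for sent in doc for w in sent}
word_states, sent_states = encode_hierarchically(doc, vocab)
print(len(word_states), sent_states.shape)   # 2 sentences, (2, 16)
```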

Pointer-Generator Networks for Community Question Answering Summarization (Pointer-Generator Networks를 이용한 cQA 시스템 질문 요약)

  • Kim, Won-Woo; Kim, Seon-Hoon; Jang, Heon-Seok; Kang, In-Ho; Park, Kwang-Hyun
    • Annual Conference on Human and Language Technology / 2018.10a / pp.126-131 / 2018
  • A cQA (Community-based Question Answering) system is a system in which users post questions and write answers. For user convenience, cQA services provide functions for searching previously accumulated questions or classifying them into categories. When questions are long, the accuracy of search and category classification drops, so a model that summarizes cQA questions is needed to overcome this limitation. Building such a model, however, requires a large amount of summarization data, which is difficult to obtain. To overcome this difficulty, this paper builds a summarization dataset from cQA question titles and bodies, refined by filtering. A deep-learning-based pointer-generator model is then used to perform abstractive summarization using representative words from the question body. In experiments, the deep-learning-based abstractive approach outperformed the existing extractive approach, and the pointer-generator model showed the better performance.

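The pointer-generator mechanism referred to above mixes, at each decoding step, a vocabulary distribution with a copy distribution over source tokens. A minimal numpy sketch of that mixture follows; the variable names and toy inputs are illustrative assumptions, not the authors' code.

```python
import numpy as np

def final_distribution(p_gen, p_vocab, attention, src_ids):
    """One decoding step of a pointer-generator mixture.

    p_gen:     probability in [0, 1] of generating from the fixed vocabulary
    p_vocab:   (V,) softmax distribution over the vocabulary
    attention: (L,) attention weights over the source tokens
    src_ids:   (L,) vocabulary ids of the source tokens
    The copy mass (1 - p_gen) is scattered onto the ids present in the source.
    """
    dist = p_gen * p_vocab
    np.add.at(dist, src_ids, (1.0 - p_gen) * attention)
    return dist

rng = np.random.default_rng(3)
V, L = 20, 5
p_vocab = np.exp(rng.normal(size=V)); p_vocab /= p_vocab.sum()
attn = np.exp(rng.normal(size=L)); attn /= attn.sum()
src_ids = rng.integers(0, V, size=L)
p = final_distribution(0.7, p_vocab, attn, src_ids)
print(round(p.sum(), 6))   # 1.0: still a valid probability distribution
```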

Jitter Analysis for Communication Systems Employing Pointer Scheme (포인터 기법을 사용한 통신 시스템에 대한 지터 해석)

  • Chang, Hoon; Lee, Byeong-Gi
    • Journal of the Korean Institute of Telematics and Electronics / v.27 no.1 / pp.1-9 / 1990
  • This paper investigates the significance and implications of the pointer scheme, which was recently adopted by CCITT as a standard synchronization method in the broadband network-node interface environment, and discusses its merits in comparison with the conventional positive justification method. It also analyzes the jitter performance of a communication system employing the pointer scheme, based on the fact that the pointer scheme corresponds to a multiple-bit positive/zero/negative justification.


Mention Detection with Pointer Networks (포인터 네트워크를 이용한 멘션탐지)

  • Park, Cheoneum; Lee, Changki
    • Journal of KIISE / v.44 no.8 / pp.774-781 / 2017
  • A mention takes a noun or noun phrase as its head and forms a chunk of text that carries a meaning, including any modifiers; mention detection is the extraction of such mentions from a document. Given the mentions, coreference resolution determines whether different mentions refer to the same thing. A pointer network is a model based on a recurrent neural network (RNN) encoder-decoder that outputs a list of elements corresponding to positions in the input sequence. In this paper, we propose mention detection using pointer networks. The proposed model can detect overlapping mentions, a problem that cannot be solved by sequence labeling. In experiments, the proposed mention detection model achieved an F1 of 80.07%, 7.65%p higher than rule-based mention detection; coreference resolution using this mention detection achieved a CoNLL F1 of 52.67% (mention boundary) and 60.11% (head boundary), which are 7.68%p and 1.5%p higher, respectively, than coreference resolution using rule-based mention detection.
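A pointer network produces, at each decoding step, a distribution over input positions rather than over a fixed vocabulary, which is what lets a decoder point directly at boundary words in the input. The following is a minimal sketch of one such pointing step in the style of the original pointer network; the additive scoring form, names, and dimensions are assumptions, not this paper's exact model.

```python
import numpy as np

def pointer_step(dec_state, enc_states, W1, W2, v):
    """One pointing step: a distribution over input positions.

    dec_state:  (d,)   current decoder state
    enc_states: (n, d) encoder states, one per input token
    W1, W2:     (d, d) projections; v: (d,) scoring vector
    The softmax over additive attention scores is used directly as the
    output distribution, so the model "points" at an input position.
    """
    scores = np.tanh(enc_states @ W1 + dec_state @ W2) @ v   # (n,)
    scores -= scores.max()                                   # numerical stability
    p = np.exp(scores)
    return p / p.sum()

rng = np.random.default_rng(4)
n, d = 6, 8
p = pointer_step(rng.normal(size=d), rng.normal(size=(n, d)),
                 rng.normal(size=(d, d)), rng.normal(size=(d, d)),
                 rng.normal(size=d))
print(p.argmax(), round(p.sum(), 6))   # pointed-to position and total mass 1.0
```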

Coreference Resolution for Korean Pronouns using Pointer Networks (포인터 네트워크를 이용한 한국어 대명사 상호참조해결)

  • Park, Cheoneum; Lee, Changki
    • Journal of KIISE / v.44 no.5 / pp.496-502 / 2017
  • Pointer Networks is a deep-learning model based on a recurrent neural network (RNN) that uses an attention mechanism to output a list of elements corresponding to positions in the input sequence. Pronoun coreference resolution is the natural language processing (NLP) task of finding, for the pronouns in a document, the antecedents that refer to the same entity. In this paper, a pronoun coreference resolution method that finds the relation between antecedents and pronouns using Pointer Networks is proposed; furthermore, input orderings for the Pointer Networks, that is, the chaining order between the words in an entity, are proposed. Among the proposed methods, the chaining order Coref2 showed the best performance, with a MUC F1 of 81.40%. This is 31.00%p and 19.28%p better than the rule-based (50.40%) and statistics-based (62.12%) coreference resolution systems for Korean pronouns, respectively.

Multi-task learning with contextual hierarchical attention for Korean coreference resolution

  • Cheoneum Park
    • ETRI Journal / v.45 no.1 / pp.93-104 / 2023
  • Coreference resolution is a task in discourse analysis that links the headwords in a document that refer to the same entity. We propose pointer network based coreference resolution for Korean using multi-task learning (MTL) with an attention mechanism over a hierarchical structure. As Korean is a head-final language, the head can easily be found. Our model learns the distribution over positions of the same entity and uses a pointer network to conduct coreference resolution for the input headword. Because the input is a document, the input sequence is very long; the core idea is therefore to learn the word- and sentence-level distributions in parallel with MTL, using a shared representation, to address the long-sequence problem. Word representations for Korean are generated from contextual information using pre-trained Korean language models. Under the same experimental conditions, our model performed roughly 1.8% better on CoNLL F1 than previous research without a hierarchical structure.
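A toy view of the multi-task setup described above: a word-level and a sentence-level pointer loss are computed from shared representations and mixed with a weight. The sketch below is illustrative only; the linear task heads, the mixing weight, and the random inputs are assumptions, not the model from the paper.

```python
import numpy as np

def cross_entropy(logits, target):
    """Softmax cross-entropy against a single target index."""
    z = logits - logits.max()
    return -(z[target] - np.log(np.exp(z).sum()))

def mtl_pointer_loss(word_states, sent_states, query, w_word, w_sent,
                     word_target, sent_target, alpha=0.5):
    """Mix a word-level and a sentence-level pointer loss.

    word_states: (n_words, d) shared word-level representations
    sent_states: (n_sents, d) shared sentence-level representations
    query:       (d,) decoder query vector shared by both tasks
    w_word, w_sent: (d, d) task-specific projections
    The two losses share the underlying representations and are combined
    with weight alpha, the usual shape of a multi-task objective.
    """
    word_logits = word_states @ (w_word @ query)   # (n_words,)
    sent_logits = sent_states @ (w_sent @ query)   # (n_sents,)
    return (alpha * cross_entropy(word_logits, word_target)
            + (1 - alpha) * cross_entropy(sent_logits, sent_target))

rng = np.random.default_rng(5)
d = 8
loss = mtl_pointer_loss(rng.normal(size=(10, d)), rng.normal(size=(3, d)),
                        rng.normal(size=d), rng.normal(size=(d, d)),
                        rng.normal(size=(d, d)), word_target=4, sent_target=1)
print(float(loss))
```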