Finite-state residual vector quantization using a tree-structured competitive neural network

Cited by: 6
Authors:
Rizvi, SA [1]
Nasrabadi, NM [1]
Affiliations:
[1] US Army Research Laboratory, Adelphi, MD 20783, USA
Keywords:
competitive neural network; image coding; joint optimization; multilayer perceptron; vector quantization
DOI:
10.1109/76.564114
Chinese Library Classification:
TM [Electrical Engineering]; TN [Electronics and Communication Technology]
Discipline codes:
0808; 0809
Abstract:
Finite-state vector quantization (FSVQ) is known to give better performance than memoryless vector quantization (VQ). This paper presents a new FSVQ scheme, called finite-state residual vector quantization (FSRVQ), in which each state uses a residual vector quantizer (RVQ) to encode the input vector. This scheme differs from conventional FSVQ in that the state-RVQ codebooks encode the residual vectors instead of the original vectors. A neural network predictor estimates the current block based on the four previously encoded blocks. The predicted vector is then used to identify the current state as well as to generate a residual vector (the difference between the current vector and the predicted vector). This residual vector is encoded using the current state-RVQ codebooks. A major task in designing our proposed FSRVQ is the joint optimization of the next-state codebook and the state-RVQ codebooks. This is achieved by introducing a novel tree-structured competitive neural network in which the first layer implements the next-state function and each branch of the tree implements the corresponding state RVQ. A joint training algorithm is also developed that mutually optimizes the next-state and the state-RVQ codebooks for the proposed FSRVQ. Joint optimization of the next-state function and the state-RVQ codebooks eliminates a large number of redundant states in the conventional FSVQ design; consequently, the memory requirements are substantially reduced in the proposed FSRVQ scheme. The proposed FSRVQ can be designed for high bit rates owing to its very low memory requirements and the low search complexity of the state RVQs. Simulation results show that the proposed FSRVQ scheme outperforms conventional FSVQ schemes in both memory requirements and the visual quality of the reconstructed image. The proposed FSRVQ scheme also outperforms JPEG (the current standard for still image compression) at low bit rates.
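The encode/decode pipeline described in the abstract can be sketched as follows. This is a minimal illustration under loud assumptions: the predictor here is a simple mean of the four neighboring blocks rather than the paper's multilayer-perceptron predictor, the next-state function is a nearest-representative match on the predicted vector, and all codebooks are random toy data rather than jointly trained ones.

```python
# Illustrative FSRVQ sketch (toy predictor and random codebooks; NOT the
# authors' trained system). Because both encoder and decoder derive the
# state from the *predicted* vector, no state index needs to be sent.
import numpy as np

rng = np.random.default_rng(0)
DIM, N_STATES, N_STAGES, CB_SIZE = 16, 4, 2, 8  # 4x4 blocks, toy sizes

# One representative vector per state (used by the next-state function).
state_reps = rng.normal(size=(N_STATES, DIM))
# Per-state residual VQ: N_STAGES stage codebooks of CB_SIZE codewords.
state_rvq = rng.normal(size=(N_STATES, N_STAGES, CB_SIZE, DIM))

def predict(neighbors):
    """Toy stand-in for the neural predictor: mean of the four
    previously encoded neighboring blocks."""
    return np.mean(neighbors, axis=0)

def next_state(pred):
    """Next-state function: state whose representative is nearest to
    the predicted vector."""
    return int(np.argmin(np.linalg.norm(state_reps - pred, axis=1)))

def encode(block, neighbors):
    pred = predict(neighbors)
    state = next_state(state_pred := pred)
    residual = block - pred          # residual, not the original vector
    indices = []
    for stage in range(N_STAGES):    # multistage residual quantization
        cb = state_rvq[state, stage]
        idx = int(np.argmin(np.linalg.norm(cb - residual, axis=1)))
        indices.append(idx)
        residual = residual - cb[idx]  # pass what's left to next stage
    return indices

def decode(indices, neighbors):
    pred = predict(neighbors)
    state = next_state(pred)         # recovered without side information
    recon = pred.copy()
    for stage, idx in enumerate(indices):
        recon += state_rvq[state, stage, idx]
    return recon

neighbors = rng.normal(size=(4, DIM))  # four previously encoded blocks
block = rng.normal(size=DIM)
idx = encode(block, neighbors)
recon = decode(idx, neighbors)
```

Note the design point the abstract emphasizes: the state is a deterministic function of the prediction, so only the stage indices are transmitted, and joint training of the next-state and stage codebooks (omitted here) is what removes redundant states.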
Pages: 377-390
Number of pages: 14