Embodiments
Preferred embodiments of the present invention are described hereinafter with reference to the accompanying drawings. In the following description, well-known prior-art functions and structures are not described in detail, since unnecessary detail would obscure the presentation of the present invention.
A typical DRA audio encoder 10 is shown in Figure 1A; it may be implemented in hardware, software and/or firmware. In brief, the technique underlying the DRA standard applies a number of processing modules to the source audio (for example, input PCM samples) so as to compress it such that the coding artifacts are barely audible. These modules include, but are not limited to: a transient analysis module 20, a multirate filter bank module 22, a linear scalar quantization module 30, a quantization index coding module 32, a codebook selection module 34, a human auditory model module 40, a global bit allocation module 42 and a multiplexing module 50. Under the provisions of the DRA standard, these modules are mandatory, and a standard-compliant DRA output bitstream (that is, a DRA standard bitstream) must have been processed by all of them. By function, the modules can be divided into four groups: a multiresolution analysis group (the transient analysis module 20 and the multirate filter bank module 22), a quantization group (the linear scalar quantization module 30, the quantization index coding module 32 and the codebook selection module 34), a psychoacoustic model group (the human auditory model module 40 and the global bit allocation module 42), and a multiplexing group (the multiplexing module 50). The present invention mainly concerns the quantization group.
A DRA audio decoder 110 is shown in Figure 1B; it decodes a DRA bitstream to obtain the DRA decoded signal (that is, output PCM samples). The flow relevant to DRA Huffman decoding is discussed below in connection with Figure 1B. First, the demultiplexing module 150 receives the DRA bitstream and extracts the control information and data from it. The codebook selection information is then passed to the codebook selection module 134, which controls the quantization index module 132 and the quantization-unit number module. The quantization index module 132 decodes the quantization indices from the data supplied by the demultiplexing module 150 under the control information from the codebook selection module 134. Finally, the inverse quantization module 130 dequantizes the decoded quantization indices according to the control information provided by the quantization-unit number module.
First embodiment: an optimized Huffman decoding technique
An improved Huffman decoding method 1000 according to the first embodiment of the present invention is explained in detail below in conjunction with Fig. 2. This decoding method can be used not only in a DRA audio codec but also in other audio and/or video codecs, and is therefore referred to hereinafter as the "general improved Huffman decoding method".
For convenience of discussion, an example Huffman codebook to be decoded is given in Table 1; it could be used in any known audio/video codec. The codebook is organized into two levels, and the nodes of each level consist of three variables: the general form of a first-level node is written herein as Structure1(X, Determin1, Determin2), and that of a second-level node as Structure2(X, Determin1, Determin2). For example, for the first-level node numbered 2 (that is, node (1, 3, 0)), X=1, Determin1=3 and Determin2=0. As another example, for the third second-level node under the first-level node numbered 4 (that is, node (5, 23, 4)), X=5, Determin1=23 and Determin2=4. Table 1: Huffman codebook example (parallel)
In Table 1, the nodes are divided into two parallel groups (that is, two columns), so the Huffman codebook shown in Table 1 may be called a parallel Huffman codebook. Correspondingly, there is also a serial arrangement, in which all nodes form a single sequence (that is, one group), with the first-level nodes first and the second-level nodes following them. The serial Huffman codebook corresponding to the parallel Huffman codebook of Table 1 is shown in Table 2. Table 2: Huffman codebook example (serial)
The arrangement of Table 2 is briefly described with reference to Table 1: the nodes numbered 0-15 in Table 2 correspond one-to-one to the first-level nodes numbered 0-15 in Table 1; the nodes numbered 16-27 in Table 2 correspond one-to-one to the 12 second-level nodes under the first-level node numbered 4 in Table 1; the nodes numbered 28-33 in Table 2 correspond one-to-one to the 6 second-level nodes under the first-level node numbered 6; and the nodes numbered 34-37 in Table 2 correspond one-to-one to the 4 second-level nodes under the first-level node numbered 11.
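The flattening of a parallel codebook into the serial layout just described can be sketched as follows. This is a minimal illustration, not part of the DRA standard; the node contents are hypothetical placeholders, and only the group sizes and the resulting index ranges follow the text.

```python
def serialize(level1, groups):
    """Flatten a parallel two-level codebook (Table 1 style) into a serial
    table (Table 2 style): the first-level nodes come first, then each group
    of second-level nodes; each root node is rewritten so that its first
    field becomes the jump_address of its group within the serial table."""
    serial = list(level1)
    for root_idx, subnodes in groups:
        jump = len(serial)                        # start index of this group
        _, det1, _ = serial[root_idx]
        serial[root_idx] = (jump, det1, len(subnodes))
        serial.extend(subnodes)
    return serial

# Hypothetical codebook with the group sizes quoted in the text:
# 16 first-level nodes; roots at indices 4, 6 and 11 with 12, 6 and 4
# second-level nodes respectively.
level1 = [(0, 0, 0)] * 16
groups = [(4, [("sub", k) for k in range(12)]),
          (6, [("sub", k) for k in range(6)]),
          (11, [("sub", k) for k in range(4)])]
serial = serialize(level1, groups)
```

For these group sizes, the second-level groups land at serial indices 16-27, 28-33 and 34-37, and the roots at first-level indices 4, 6 and 11 receive jump addresses 16, 28 and 34, matching the mapping described above.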
The general improved Huffman decoding method 1000 is now described in detail with reference to Tables 1-2 and Fig. 2. As shown in Fig. 2, the method 1000 starts at step 1100 and then, in step 1101, reads x bits from the data stream to be decoded (x ∈ N, 4 ≤ x ≤ 8; x is fixed at 4 in the first embodiment). In step 1102, these x bits, interpreted as a binary number, are converted to a decimal value, which serves as the linear index into the first-level codebook of Table 1. In step 1103, Table 1 is looked up with this linear index to obtain a search entry of the first-level codebook. The first-level search is now complete, and the first determination begins.
Next, in step 1104, it is determined whether Determin2 of this search entry is zero. If Determin2 = 0, Structure1 is a leaf node of the concrete form (symbol, bit_used1, 0); the decoding process proceeds to step 1110, where symbol is output as the decoded data and bit_used1 is output as the bit length of the Huffman codeword, and the decoding process then ends at step 1109. If Determin2 is non-zero, Structure1 is a root node of the concrete form (jump_address, Determin1, num_of_subentries), and the decoding process proceeds to step 1105, where the start position orig of the second-level Huffman search (that is, an index into Table 2) is computed. Specifically, from the three variables of Structure1 in the non-zero-Determin2 case, orig = jump_address; looking up Table 2 with orig in step 1105 yields the initial second-level Huffman node Structure2. The third variable of Structure1, num_of_subentries, gives the total number of corresponding second-level Huffman nodes, also called the maximum search depth (max_depth). Finally, still in step 1105, the loop variable i is initialized to orig. The first determination is now complete, and the second-level search begins (step 1106).
In the second-level search, every second-level Huffman node Structure2 is a leaf node, of the concrete form Structure2(codeword, symbol, bit_used2). In step 1106, bit_used2(i) bits of binary data are read in and converted to the candidate codeword C_cw, where bit_used2(i) denotes the value of bit_used2 of the node numbered i in Table 2. Next, in step 1107, C_cw is compared with codeword(i), the codeword value of the node numbered i in Table 2. If C_cw = codeword(i), the process proceeds to step 1108; otherwise i is incremented by 1 and steps 1106-1107 are repeated. In fact, step 1107 is repeated at most max_depth times. In step 1108, the symbol of the i-th node is output as the decoded data, and bit_used2 + x is output as the bit length of the Huffman codeword. The decoding process then ends at step 1109.
The two cases in which the first-level Huffman node is a leaf node or a root node are illustrated below with reference to Tables 1-2.
In the case where the first-level Huffman node is a leaf node: for example, the 4 bits 0001₂ read in step 1101 (herein, XYZ₂ denotes the data XYZ in binary representation) indicate a first-level codebook index of 1. In step 1103, Table 1 is looked up to obtain the node (2, 3, 0). In step 1104, Determin2 = 0, so node (2, 3, 0) is a leaf node; at step 1110, 2 is output as the decoded data and 3 as the bit length of the Huffman codeword, and the decoding process then ends at step 1109.
In the case where the first-level Huffman node is a root node: for example, the 4 bits 0110₂ read in step 1101 indicate a first-level codebook index of 6. In step 1103, Table 1 is looked up to obtain the node (28, 0, 6). In step 1104, Determin2 is non-zero, so node (28, 0, 6) is a root node. Further, in step 1105 it is computed that: the second-level search starts from the node numbered 28 in Table 2 (orig = 28); the second-level nodes corresponding to node (28, 0, 6) total 6 (that is, the nodes numbered 28-33 in Table 2); and the loop variable is initialized to 28. Then the bit_used2(28) = 2 bits of binary data 10₂ are read in and converted to the candidate codeword C_cw = 2. In step 1107 it is determined that C_cw = 2 ≠ codeword(28) = 3, so i is incremented by 1 (i = 29) and steps 1106-1107 are repeated, until at the fourth repetition bit_used2(31) = 3, the 3 bits read in are 100₂, the corresponding C_cw = 4, and codeword(31) = 4 = C_cw, indicating that the qualifying node (that is, the node numbered 31 in Table 2) has been found. The decoding process then enters step 1108, where symbol = 18 of the node numbered 31 in Table 2 is output as the decoded data, and bit_used2 + x = 3 + 4 = 7 is output as the bit length of the Huffman codeword. The decoding process then ends at step 1109.
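The decoding steps above, together with the two worked examples, can be sketched as follows. This is a minimal illustration under stated assumptions, not the DRA reference implementation; the second-level entries numbered 29 and 30, and the symbol of entry 28, are not given in the text and are filled with hypothetical values here.

```python
def huffman_decode(bits, level1, level2, x=4):
    """Decode one symbol from the bit string `bits` (e.g. "0110100")
    using a two-level codebook; returns (symbol, codeword_bit_length)."""
    idx = int(bits[:x], 2)                     # steps 1101-1102: x bits -> linear index
    X, det1, det2 = level1[idx]                # step 1103: first-level lookup
    if det2 == 0:                              # step 1104: leaf (symbol, bit_used1, 0)
        return X, det1                         # step 1110
    i = X                                      # root: X is jump_address
    for _ in range(det2):                      # at most max_depth iterations
        codeword, symbol, bit_used2 = level2[i]
        c_cw = int(bits[x:x + bit_used2], 2)   # step 1106: candidate codeword
        if c_cw == codeword:                   # step 1107
            return symbol, bit_used2 + x       # step 1108
        i += 1
    raise ValueError("no match within max_depth")  # correction-module case

# Entries quoted in the text (all others are hypothetical):
level1 = {1: (2, 3, 0),      # leaf: symbol 2, 3-bit codeword
          6: (28, 0, 6)}     # root: jump_address 28, 6 second-level nodes
level2 = {28: (3, 14, 2),    # codeword 3, 2 bits (symbol 14 is hypothetical)
          29: (5, 16, 3),    # hypothetical
          30: (6, 17, 3),    # hypothetical
          31: (4, 18, 3)}    # codeword 4, symbol 18, 3 bits (given in text)
```

With these tables, huffman_decode("0001", level1, level2) yields (2, 3) and huffman_decode("0110100", level1, level2) yields (18, 7), matching the two worked examples.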
An improved Huffman decoding device 2000 according to the first embodiment of the present invention is described in detail below in conjunction with Fig. 3. This decoding device can be used not only in a DRA audio codec but also in other audio and/or video codecs, and is therefore referred to hereinafter as the "general improved Huffman decoding device".
As before, the Huffman codebook to be decoded is organized into two levels, the nodes of each level consist of three variables, the general form of a first-level node is Structure1(X, Determin1, Determin2), and that of a second-level node is Structure2(X, Determin1, Determin2). The codebook is arranged as described for Tables 1-2.
The general improved Huffman decoding device 2000 is now described in detail with reference to Tables 1-2 and Fig. 3. As shown in Fig. 3, the device 2000 comprises: a buffer module 2101, which reads x bits from the bitstream (x ∈ N, 4 ≤ x ≤ 8; x is again fixed at 4 here) and outputs them; a linear index conversion module 2102, which receives the x output bits, converts them to a decimal value, and outputs that value as the linear index into the first-level codebook of Table 1; and a search module 2103, which receives the output linear index, looks up Table 1, and outputs the corresponding search entry of the first-level codebook. Together, the three modules 2101-2103 perform the first-level search.
The general improved Huffman decoding device 2000 further comprises a determination module 2104, which receives the search entry output above, determines whether its Determin2 is zero, and selects an output accordingly. The determination module 2104 shown in Fig. 3 is a single-input, one-of-two-outputs module, selecting one of its two output paths according to the result of the determination. Specifically, if Determin2 = 0, the module 2104 determines that the concrete form of Structure1 is (symbol, bit_used1, 0), and outputs symbol as the decoded data and bit_used1 as the Huffman codeword bit length to a first result output module 2110, from which the Huffman decoded data are then output. Otherwise, if Determin2 is non-zero, the module 2104 determines that the concrete form of Structure1 is (jump_address, Determin1, num_of_subentries), and outputs jump_address and num_of_subentries to a second-level Huffman decoding start position computation module 2105. The module 2105 computes the start position orig of the second-level Huffman search according to orig = jump_address, looks up Table 2 with this value to obtain the initial second-level Huffman node Structure2, and takes num_of_subentries as the total number of corresponding second-level Huffman nodes, i.e. the maximum search depth (max_depth). Finally, the module 2105 initializes the loop variable i to orig, and outputs the three quantities Structure2, max_depth and i. The modules 2104-2110 or 2104-2105 thus perform the first determination and, in the branch through module 2105, prepare for the second-level search.
For the branch through module 2105, the general improved Huffman decoding device 2000 further comprises a loop and codeword comparison module 2106, which receives the three quantities Structure2, max_depth and i, and determines that the concrete data format of Structure2 is Structure2(codeword, symbol, bit_used2). The module 2106 then reads in bit_used2(i) bits of binary data and converts them to the candidate codeword C_cw. Next, still in the module 2106, C_cw is compared with the codeword codeword(i) corresponding to i. If C_cw = codeword(i), the module 2106 outputs the symbol of the matching second-level Huffman node as the decoded data, and the bit_used2 of that node plus x as the Huffman codeword bit length, to a second result output module 2108, from which the Huffman decoded data are then output. Otherwise, if C_cw ≠ codeword(i), the module 2106 increments i by 1, continues the search with the i-th node of Table 2, and repeats the sequence of reading bit_used2(i) bits, converting them to the candidate codeword C_cw, and comparing C_cw with codeword(i), until C_cw = codeword(i).
Preferably, a correction module (not shown) may be added to the loop and codeword comparison module 2106; it counts the iterations of the loop in the module 2106 and reports an error when this count equals max_depth.
Compared with the one-level Huffman decoding methods and devices of the prior art, the Huffman decoding method and device according to the first embodiment of the present invention allow high-probability codewords to be found faster in the codebook, thereby improving decoding efficiency. Furthermore, among the second-level nodes of the Huffman codebook used by the method according to the first embodiment, bit_used2 is monotonically increasing and inversely correlated with the probability of occurrence of the codeword, so high-probability codewords are decoded sooner.
Those skilled in the art will understand that, for both the general improved Huffman decoding method 1000 and the general improved Huffman decoding device 2000, the Huffman codebook is preferably layered into two levels according to the first embodiment. The invention is not limited thereto, however: depending on the codebook characteristics, an improved Huffman codebook with more than two levels may be constructed. In addition, the number x of bits initially read from the bitstream is not limited to 4; it may be fixed at 5, 6, 7 or 8.
Second embodiment: an improved DRA Huffman decoding technique
The following embodiment further improves the "general improved Huffman decoding method" for the characteristics of the DRA technique, and is referred to herein as the "improved DRA Huffman decoding method".
First, with reference to the description of Figures 1A-1B and according to the DRA standard, DRA Huffman decoding is mainly used for: decoding the transient segment length (see DRA standard, Section 5.4.3), decoding the codebook application ranges (see DRA standard, Section 5.5.2, Table 20), decoding the codebook indices (see DRA standard, Section 5.5.3, Table 21), decoding the quantization indices of the sub-band samples based on the selected codebook index (see DRA standard, Section 5.6, Tables 22-23), and decoding the quantization step indices (see DRA standard, Section 5.7, Table 25). In addition, the quantization step is obtained from the quantization step index via Annex B.1 of the DRA standard. Finally, the sub-band samples are obtained from the quantization indices and quantization steps in the inverse quantization module 130 (see Figure 1B; DRA standard, Section 6.4). The DRA standard provides 27 Huffman codebooks, Hufftab01-Hufftab27, for the above decoding requirements. Based on the characteristics of each codebook, the Huffman decoding method described above can be optimized further.
For convenience of description, only the computed results for the 27 DRA Huffman codebooks are given here (see Table 3); for the codebook data themselves, see the corresponding appendix of the DRA standard. Those skilled in the art will understand that, by reading the description of this embodiment in conjunction with the 27 actual DRA Huffman codebooks, the optimization of DRA Huffman decoding can be realized.
Specifically, Fig. 4 shows an improved DRA Huffman decoding method 3000 according to the second embodiment of the present invention. The method is essentially identical to the general improved Huffman decoding method 1000 described above in conjunction with Fig. 2 (steps 3101-3110 correspond to steps 1101-1110, respectively), except that a step 3101A of reading in the value of x is added before step 3101: in this step, the value of x is read from the bitstream, and this value guides step 3101. The x-value computation step 3101', which is related to step 3101A, is explained in detail later in conjunction with Fig. 6.
In addition, Fig. 5 shows an improved DRA Huffman decoding device 4000. The decoding device is essentially identical to the general improved Huffman decoding device 2000 described above in conjunction with Fig. 3 (modules 4101-4110 correspond to modules 2101-2110, respectively), except that an input from an initial-bit reading module 4101A is added to the buffer module 4101. In the initial-bit reading module 4101A, the value of x is read from the bitstream, and this value guides the operation of the module 4101. The x-value computation module 4101', which is related to the module 4101A, is explained in detail later in conjunction with Fig. 7.
To further optimize Huffman decoding for the characteristics of the DRA codebooks, a trade-off must be made, according to the codebook characteristics, between the number x of bits read in the first-level search and the second-level maximum search depth (max_depth): provided the total number of entries of the two-level codebook corresponding to x does not grow too much (e.g., does not exceed a certain multiple of the total number of entries of the two-level codebook for x = 4), the knee point of the second-level maximum search depth is sought. The end goal is to improve Huffman decoding speed as much as possible without an excessive increase in the total number of codebook entries.
Step 3101' and module 4101' are described in detail below in conjunction with Figs. 6 and 7, respectively. Turning first to Fig. 6, step 3101' is performed at the encoding side, to compute x and store it in the DRA bitstream. As shown in Fig. 6, in step 3b, the loop variable m for a given Huffman codebook (e.g., the y-th codebook, corresponding to the row #Hufftaby of Table 3) is initialized to 4. Then, in step 3c, it is determined whether the size when the first level reads in m bits (size(m), e.g. the size value of the row labeled tab_mX.dat in Table 3) is greater than T1 times the size when the first level reads in 4 bits (size(4), e.g. the size value of the row labeled tab_4X.dat in Table 3). Here T1 is a first threshold, preferably 1.5-2, more preferably 1.8-2, still more preferably 1.9-2. If the determination in step 3c is affirmative, the process goes directly to step 3d, where x is set to 4; then in step 3h the value of x is stored in the DRA bitstream, and step 3101' ends.
If the determination in step 3c is negative, the process enters step 3e for a further determination of whether m equals 8. If the determination in step 3e is affirmative, the process goes directly to step 3d, where x is set to 4; then in step 3h the value of x is stored in the DRA bitstream, and step 3101' ends. If the determination in step 3e is negative, the process enters the third determination step 3f, where it is determined whether (MD(m) - MD(m+1))/MD(m) is greater than T2. Here MD(m) denotes the maximum search depth when the first level reads in m bits (e.g. the max_depth value of the row labeled tab_mX.dat in Table 3), and T2 is a second threshold, preferably 0.2-0.7, more preferably 0.2-0.5, still more preferably 0.25-0.3.
If the determination in step 3f is affirmative, the process enters step 3g, where x is set to m+1; then in step 3h the value of x is stored in the DRA bitstream, and step 3101' ends. If the determination in step 3f is negative, m is incremented by 1, and the process returns to step 3c for the next round of determination.
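The selection flow of steps 3b-3h can be sketched as follows. This is a minimal sketch under the stated thresholds; size and max_depth stand in for the per-codebook columns of Table 3, and the numeric values used below are hypothetical.

```python
def choose_x(size, max_depth, T1=2.0, T2=0.25):
    """Select the first-level bit count x for one codebook (Fig. 6 flow).
    size[m]: total two-level table entries when the first level reads m bits;
    max_depth[m]: second-level maximum search depth for that m."""
    m = 4                                # step 3b: initialize loop variable
    while True:
        if size[m] > T1 * size[4]:       # step 3c: table growth exceeds T1
            return 4                     # step 3d
        if m == 8:                       # step 3e: upper limit reached
            return 4
        knee = (max_depth[m] - max_depth[m + 1]) / max_depth[m]
        if knee > T2:                    # step 3f: depth drops sharply
            return m + 1                 # step 3g
        m += 1                           # otherwise: next round (back to 3c)

# Hypothetical Table-3-style columns for one codebook:
size = {4: 100, 5: 130, 6: 180, 7: 260, 8: 400}
max_depth = {4: 20, 5: 12, 6: 10, 7: 9, 8: 8}
```

For these values, (max_depth[4] - max_depth[5]) / max_depth[4] = 0.4 > T2 already in the first round, so choose_x(size, max_depth) returns 5; a codebook whose depth decreases only gradually falls back to x = 4 via step 3c or 3e.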
Turning next to Fig. 7, module 4101' is described in detail; this module is at the encoding side, for computing x and storing it in the DRA bitstream. As shown in Fig. 7, an assignment module 4b first initializes the loop variable m for a given Huffman codebook (e.g., the y-th codebook, corresponding to the row #Hufftaby of Table 3) to 4. A first determination module 4c then determines whether the size size(m) when the first level reads in m bits is greater than T1 times the size size(4) when the first level reads in 4 bits, where T1 is the first threshold, preferably 1.5-2, more preferably 1.8-2, still more preferably 1.9-2. If the determination in module 4c is affirmative, the process enters, via its first path, a first x-value determination module 4d, which sets x to 4; then a storage module 4h stores the value of x in the DRA bitstream, and the processing of the entire module 4101' ends.
If the determination in module 4c is negative, the process enters, via its alternate path, a second determination module 4e, which determines whether m equals 8. If the determination in module 4e is affirmative, the process enters, via its first path, the first x-value determination module 4d, which sets x to 4; then the storage module 4h stores the value of x in the DRA bitstream, and the processing of the entire module 4101' ends. If the determination in module 4e is negative, the process enters, via its alternate path, a third determination module 4f, which determines whether (MD(m) - MD(m+1))/MD(m) is greater than T2, where MD(m) denotes the maximum search depth when the first level reads in m bits (e.g. the max_depth value of the row labeled tab_mX.dat in Table 3), and T2 is the second threshold, preferably 0.2-0.7, more preferably 0.2-0.5, still more preferably 0.25-0.3.
If the determination in module 4f is affirmative, the process enters a second x-value determination module 4g, which sets x to m+1; then the storage module 4h stores the value of x in the DRA bitstream, and the processing of the entire module 4101' ends. Otherwise, m is incremented by 1, and the process returns to the first determination module 4c for the next round of determination. Table 3: Analysis of the 27 Huffman codebooks
Third embodiment: another improved DRA Huffman decoding technique
The following embodiment describes in detail another improved DRA Huffman decoding technique, again directed at the characteristics of the DRA technique. The 27 codebooks used in this embodiment are the same 27 DRA Huffman codebooks of the DRA standard described in conjunction with Figures 1A-1B for the second embodiment.
In a two-level Huffman decoding process, the first-level bit count x is positively correlated with the total number of entries (size) of the two-level codebook, and negatively correlated with the maximum depth (max_depth) of the second-level decoding. An optimized two-level codebook must trade the total number of entries off against the maximum depth to obtain a preferred value of x. A combined cost function can therefore be defined and used in an improved Huffman decoding method 5000 (not shown) and an improved Huffman decoding device 6000 (not shown), respectively.
The combined cost function is defined as: cost(x) = α × max_depth(x) + log₂(size(x)) × log₂(size(0)), where α is a weight parameter; x is the number of bits initially read in by the first level at the decoding side; max_depth(x) and size(x) denote the max_depth and size values of the row labeled tab_xX.dat in Table 3; and size(0) denotes the number of entries of the linear (one-level) codebook.
In the improved Huffman decoding method 5000, α is first given a value within a certain range, preferably 0.4-1.6, more preferably 0.6-1.4, still more preferably 0.8-1.2. Then, for the y-th codebook (that is, one of the 27 codebooks of the DRA algorithm), the value cost_y(x) is computed for each candidate value of x (y is the codebook number, ranging from 1 to 27). Next, the value x_yMin of x at which cost_y(x) attains its minimum is recorded. Finally, for the y-th DRA codebook, x_yMin is output, and this value subsequently guides the decoding of the y-th DRA codebook (with x_yMin as the number of bits initially read in by the first level).
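Taking the cost function cost(x) = α × max_depth(x) + log₂(size(x)) × log₂(size(0)) as printed, the selection of x_yMin for one codebook can be sketched as follows; the size/max_depth columns and the size(0) value below are hypothetical stand-ins for a Table 3 row.

```python
import math

def x_min(size, max_depth, size0, alpha=1.0, candidates=range(4, 9)):
    """Return the x among `candidates` minimizing
    cost(x) = alpha * max_depth(x) + log2(size(x)) * log2(size(0))."""
    def cost(x):
        return alpha * max_depth[x] + math.log2(size[x]) * math.log2(size0)
    return min(candidates, key=cost)

# Hypothetical per-codebook data (size0 = linear codebook entry count):
size = {4: 100, 5: 130, 6: 180, 7: 260, 8: 400}
max_depth = {4: 20, 5: 12, 6: 10, 7: 9, 8: 8}
```

With α = 1.0 and size(0) = 256, the minimum for these values falls at x = 5; increasing α weights max_depth more heavily and shifts the choice toward a deeper table with a smaller search depth.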
The improved Huffman decoding device 6000 comprises: an assignment module, for giving α a value, preferably within 0.4-1.6, more preferably 0.6-1.4, still more preferably 0.8-1.2; a computation module, for computing, for the y-th DRA codebook (that is, one of the 27 codebooks of the DRA algorithm), the value cost_y(x) for each candidate value of x (y is the codebook number, ranging from 1 to 27); a recording module, for recording the value x_yMin of x at which cost_y(x) attains its minimum; and an output module, for outputting x_yMin for the y-th DRA codebook, this value subsequently guiding the decoding of the y-th DRA codebook (with x_yMin as the number of bits initially read in by the first level).
In the improved Huffman decoding method 5000 and/or the improved Huffman decoding device 6000, a larger value of α means that the codebook corresponding to the chosen x has a smaller max_depth and a larger size, i.e. higher decoding speed at the cost of more codebook memory; conversely, a smaller value of α means that the codebook corresponding to the chosen x has a larger max_depth and a smaller size, i.e. lower decoding speed but less codebook memory.
According to an example of the present invention, with α = 1.0 and 4 ≤ m ≤ 8, the cost_y(x) and x_yMin values corresponding to the 27 codebooks of Table 3 are as shown in Table 4. Table 4: Selection of the first-level search bit count m
It will be appreciated by those skilled in the art that the second and third embodiments above are not limited to the DRA Huffman codebooks. By reading the description and claims of the present invention, those skilled in the art can apply the optimization methods and devices for the DRA Huffman codebooks mentioned in the second and third embodiments to the optimization of the first-level bit count of any two-level Huffman codebook.
Simulation results
The present inventors compared the performance of the three improved Huffman decoding methods proposed in this application with the existing linear-search Huffman decoding method on a fixed-point 16-bit digital signal processor, the Blackfin-533. All Huffman decoding programs were written in standard C, without platform-specific optimization, and were run embedded in an actual DRA audio decoder. Although the verification platform was the Blackfin-533, the programs themselves can run on general-purpose PCs and on most embedded processors and digital signal processors. The data given in Tables 4-1 to 4-3 are the processor clock cycles consumed by the Huffman decoding part in decoding one second of DRA bitstream, where the sample rate of the DRA bitstream is 48 kHz and the bit rate is 128 kbps. Table 4-1: Comparative test for the first embodiment of the present invention
For the first embodiment, the improved codebooks have 3112 entries against the original 2819, an increase of approximately 10%. Table 4-2: Comparative test for the second embodiment of the present invention
For the second embodiment, the improved codebooks have 4266 entries against the original 2819, an increase of approximately 51%. Table 4-3: Comparative test for the third embodiment of the present invention
For the third embodiment, the improved codebooks have 4324 entries against the original 2819, an increase of approximately 53%.
From Tables 4-1 to 4-3 the following conclusions can be drawn: without compiler optimization, the per-second processor clock cycle consumption of the optimized Huffman decoding module is 1/6 to 1/7 of the corresponding consumption of the linear search, a speed-up of 6-7 times; with compiler optimization, the per-second processor clock cycle consumption of the Huffman decoding module is 1/4 to 1/5 of the corresponding consumption of the linear search, a speed-up of 4-5 times. The method and device proposed in this application thus have a significant optimizing effect on the Huffman decoding module.
Although the present invention has been described in conjunction with what are presently considered the most practical and preferred embodiments, those skilled in the art should understand that the invention is not limited to the disclosed embodiments; on the contrary, the invention is intended to cover the various modifications and equivalent constructions falling within the spirit and scope of the appended claims. Those skilled in the art will understand that various variations and/or improvements may be applied to the invention as shown in the specific embodiments without departing from the spirit or scope of the invention described in broad terms. The embodiments herein are therefore to be considered in all respects as illustrative and not restrictive.