
Locality-constrained Linear Coding for Image Classification

Introduction

This paper presents a simple but effective coding scheme called Locality-constrained
Linear Coding (LLC) in place of the VQ coding in traditional SPM.

Feature Quantization Mechanism

LLC utilizes the locality constraints to project each descriptor into its local-coordinate system, and the projected coordinates are integrated by max pooling to generate the final representation. The locality constraint is the key addition here.

Compared with the sparse coding strategy, the objective function used by LLC has an analytical solution. In addition, the paper proposes a fast approximated LLC method by first performing a K-nearest-neighbor search and then solving a constrained least square fitting problem.

Compared with sparse coding, LLC has an analytical solution, lower computational cost, and faster encoding, so it can be used in real-time applications.


Pipeline

A typical flowchart of the SPM approach based on BoF is illustrated on the left of Figure 1. First, feature points are detected or densely located on the input image, and descriptors such as "SIFT" or "color moment" are extracted from each feature point (highlighted in blue circle in Figure 1). This obtains the "Descriptor" layer. Then, a codebook with M entries is applied to quantize each descriptor and generate the "Code" layer, where each descriptor is converted into an R^M code (highlighted in green circle). If hard vector quantization (VQ) is used, each code has only one non-zero element, while for soft-VQ, a small group of elements can be non-zero. Next in the "SPM" layer, multiple codes from inside each sub-region are pooled together by averaging and normalizing into a histogram. Finally, the histograms from all sub-regions are concatenated together to generate the final representation of the image for classification.

1. Detect interest points in the input image (or sample them densely).

2. Apply a feature descriptor at each interest point to obtain feature vectors.

3. Quantize the features to obtain a codebook.

4. Feature coding: with hard voting, each feature corresponds to a single code; with soft voting, each feature corresponds to a small group of codes; with SPM, the codes inside each sub-region are pooled by averaging and normalized into a histogram, and the histograms of all sub-regions are finally concatenated to form the image representation (see the sketch below).
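
To make steps 3-4 concrete, here is a minimal numpy sketch of the baseline pipeline (hard-VQ coding plus average pooling over the SPM sub-regions). The function name `spm_histogram`, the `levels` grid sizes, and the assumption that `positions` are normalized to [0, 1) are my own illustrative choices, not details from the paper.

```python
import numpy as np

def spm_histogram(descriptors, positions, codebook, levels=(1, 2, 4)):
    """Hard-VQ coding followed by average pooling over SPM sub-regions.

    descriptors: (N, D) local descriptors (e.g. SIFT)
    positions:   (N, 2) descriptor coordinates, normalized to [0, 1)
    codebook:    (M, D) codebook with M entries (e.g. from k-means)
    levels:      grid sizes of the spatial pyramid
    """
    # "Code" layer: hard VQ turns each descriptor into a 1-of-M indicator
    dists = ((descriptors[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
    assignment = dists.argmin(axis=1)          # index of the nearest codeword

    histograms = []
    for g in levels:                           # "SPM" layer
        # map every descriptor to one of the g x g sub-regions
        cell = (positions * g).astype(int).clip(0, g - 1)
        region = cell[:, 0] * g + cell[:, 1]
        for r in range(g * g):
            idx = np.where(region == r)[0]
            h = np.zeros(len(codebook))
            if len(idx):
                # average pooling of 1-of-M codes = a normalized histogram
                np.add.at(h, assignment[idx], 1.0)
                h /= h.sum()
            histograms.append(h)
    # concatenate all sub-region histograms into the final image representation
    return np.concatenate(histograms)
```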

However, to get good performance, SPM has to be paired with a nonlinear kernel SVM, which is computationally expensive, and even then the results are only moderate.

 

ScSPM uses sparse coding to achieve nonlinear coding (replacing the original k-means quantization).

During the ScSPM experiments it was observed that a descriptor tends to be reconstructed by codes that lie close to it. So why not add a locality constraint directly to encourage local codes: locality is more essential than sparsity.

Moreover, the LLC objective has an analytical solution, and the computational cost is delightfully low~~ it can be used for real-time tasks.

Locality-constrained Linear Coding


Sparse coding really is that impressive~~

 

locality is more essential than sparsity, as locality must lead to sparsity but not necessary vice versa

LLC replaces the sparsity constraint with a locality constraint and gains several nice properties.

$$\min_{C}\ \sum_{i=1}^{N} \left\| x_i - B c_i \right\|^2 + \lambda \left\| d_i \odot c_i \right\|^2 \quad \text{s.t.}\ \ \mathbf{1}^\top c_i = 1,\ \forall i \tag{3}$$

$$d_i = \exp\!\left( \frac{\mathrm{dist}(x_i, B)}{\sigma} \right), \qquad \mathrm{dist}(x_i, B) = \left[ \mathrm{dist}(x_i, b_1), \ldots, \mathrm{dist}(x_i, b_M) \right]^\top \tag{4}$$

In the formula above, B is the basis (the codebook), c is the coefficient vector, and d is the locality adaptor, with the same dimension as the number of basis vectors.

d and c are combined by element-wise multiplication.

Equation (4) is the construction of d: σ adjusts how fast the weights decay with distance, and d is finally normalized to lie in (0, 1].
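
As a rough illustration of the closed-form step, here is a small numpy sketch that codes a single descriptor with the LLC criterion in equations (3)-(4). The function name `llc_code` and the default values of `lam` (λ) and `sigma` (σ) are assumptions made for the example, not values prescribed by the paper.

```python
import numpy as np

def llc_code(x, B, lam=1e-4, sigma=1.0):
    """Analytical LLC coding of one descriptor.

    x: (D,) descriptor;  B: (M, D) codebook, one basis vector per row.
    Solves  min_c ||x - B^T c||^2 + lam * ||d * c||^2   s.t.  1^T c = 1.
    """
    # locality adaptor d (Eq. 4): far-away bases receive a larger penalty;
    # subtracting the max distance normalizes d into (0, 1]
    dist = np.linalg.norm(B - x, axis=1)
    d = np.exp((dist - dist.max()) / sigma)

    # "data covariance" of the bases shifted by the descriptor
    Z = B - x                                  # (M, D)
    C = Z @ Z.T                                # (M, M)

    # closed form under the sum-to-one constraint, then rescale
    c_tilde = np.linalg.solve(C + lam * np.diag(d ** 2), np.ones(len(B)))
    return c_tilde / c_tilde.sum()
```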

Properties of LLC

To achieve good classification performance, the coding scheme should generate similar codes for similar descriptors.

1. Better reconstruction: i.e., a smaller quantization error.

2. Local smooth sparsity: compared with sparse coding, the L1 regularization term in SC is not smooth, and because the SC basis is over-complete, similar descriptors may select quite different bases, which introduces error.

3. Analytical solution: solving SC requires a cumbersome iterative optimization.


Approximated LLC for Fast Encoding

LLC has an analytical solution, but we can also just pick the K codes nearest to each descriptor to build its local coordinate system, which speeds up the computation.

Moreover, this way LLC can use a huge codebook while each feature is still associated with only a handful of codes.
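
A hedged sketch of the fast encoder: a brute-force K-nearest-neighbour search followed by a small constrained least-squares fit on the selected bases only. The name `llc_code_approx` and the `beta` ridge term (added purely for numerical stability) are my own choices.

```python
import numpy as np

def llc_code_approx(x, B, K=5, beta=1e-4):
    """Approximated LLC: code a descriptor using only its K nearest bases.

    x: (D,) descriptor;  B: (M, D) codebook.  Returns an (M,) code with at
    most K non-zeros, so a very large codebook stays cheap to use.
    """
    # K-nearest-neighbour search (brute force here; any ANN index also works)
    dist = np.linalg.norm(B - x, axis=1)
    knn = np.argsort(dist)[:K]

    # constrained least squares on the local bases:
    #   min ||x - B_knn^T w||^2   s.t.   1^T w = 1
    Z = B[knn] - x
    G = Z @ Z.T
    G += beta * np.trace(G) * np.eye(K)        # small ridge for stability
    w = np.linalg.solve(G, np.ones(K))
    w /= w.sum()

    code = np.zeros(len(B))
    code[knn] = w
    return code
```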


Codebook Optimization

According to our experimental results in Subsection 5.4, the codebook generated by K-Means can produce satisfactory accuracy. In this work, we use the LLC coding criteria to train the codebook, which further improves the performance.
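
The paper learns the codebook incrementally by coding each descriptor with the current codebook and then nudging the selected bases to reduce the reconstruction error. Below is a deliberately simplified, hypothetical sketch of that loop (the names `refine_codebook`, `lr`, and `epochs` are mine, and it reuses `llc_code_approx` from the previous sketch); it is not the paper's exact Algorithm 4.1.

```python
import numpy as np

def refine_codebook(X, B, epochs=1, K=5, lr=0.01):
    """Simplified incremental refinement of an initial (e.g. k-means) codebook
    using the LLC coding criterion.

    X: (N, D) training descriptors;  B: (M, D) initial codebook.
    """
    B = B.copy()
    for _ in range(epochs):
        for x in np.random.permutation(X):
            code = llc_code_approx(x, B, K=K)     # coding with current codebook
            active = np.nonzero(code)[0]
            residual = x - code @ B               # reconstruction error, (D,)
            for j in active:
                # gradient step on ||x - B^T c||^2 w.r.t. the selected basis
                B[j] += lr * code[j] * residual
                B[j] /= max(np.linalg.norm(B[j]), 1e-12)   # keep unit norm
    return B
```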


Analysis

1. Codebook learning: k-means and the authors' LLC-trained codebook differ little in performance, but the recognition rate improves as the codebook size grows.

2. The number of local codes, i.e. the effect of K: the smaller K is, the higher the recognition rate~~ (but it should not drop below 5).

3. Performance was also tested under different constraints other than the shift-invariant constraint: the shift-invariant constraint leads to the best performance.

Summary

LLC replaces SC's sparsity constraint with a locality constraint, taking performance up another notch.

In particular, the proposed nearest-neighbor approximated encoding makes real-time applications feasible.

I think LLC is going to be hot in feature coding for a while~~
