
智能系统学报 (CAAI Transactions on Intelligent Systems), 2018, Vol. 13, Issue (2): 196-201. DOI: 10.11992/tis.201612012

### Cite this article

GUO Longwei, GUAN Xin, LI Qiang. Recognition of difficulty level of piano score based on metric learning support vector machine[J]. CAAI Transactions on Intelligent Systems, 2018, 13(2): 196-201. DOI: 10.11992/tis.201612012.


Recognition of difficulty level of piano score based on metric learning support vector machine
GUO Longwei, GUAN Xin, LI Qiang
Department of Electronic Information Engineering, Tianjin University, Tianjin 300072, China
Abstract: Existing classification of piano scores by difficulty level is done mainly by hand and is inefficient, while existing algorithms that automatically recognize the difficulty level of a score achieve a low classification fit. Therefore, unlike the traditional approach that treats recognition of a score's difficulty level as a regression problem, this paper models it directly as a classification problem based on the support vector machine. In addition, because score classification is highly subjective and its features are often interdependent, metric learning theory is employed. The prior knowledge carried by scores tagged with difficulty levels is fully exploited: according to each feature's contribution to discriminating difficulty, the Gauss radial basis kernel function is improved, yielding a metric learning support vector machine classification algorithm, the ML-SVM algorithm. On score datasets with nine and four difficulty levels, the ML-SVM algorithm was compared with logistic regression; with support vector machine algorithms based on the linear, polynomial, and Gauss radial basis (GRB) kernel functions; and with several support vector machine algorithms combined with principal component analysis. The results show that the proposed algorithm is considerably more accurate than the existing algorithms, reaching accuracy rates of 68.74% and 84.67%, respectively, and that it effectively improves the classification performance of the GRB-kernel SVM in this application.
Key words: digital piano score    recognition of difficulty level    classification algorithm    support vector machine    metric learning    Gauss radial basis kernel function

Shih-Chuan Chiu et al.[1] were the first to study difficulty-level recognition of piano scores. Because piano score difficulty recognition is a relatively new research problem, few existing symbolic music features apply directly to difficulty-level recognition. They therefore first defined eight features closely related to score difficulty and combined them with semantic features to form the feature space. The ReliefF feature-selection algorithm[2] was then used to assign weights according to feature importance (i.e., each feature's ability to discriminate between difficulty levels), and the ten features with the largest weights were selected as the feature space for subsequent difficulty-level recognition. Finally, three regression algorithms were used to perform the recognition: multiple linear regression, stepwise regression[3], and support vector regression[4]. In their experiments, support vector regression achieved the best result, with an R² statistic of 39.9%[3].

Véronique Sébastien et al.[5] also treated score difficulty-level recognition as a classification problem, implementing it with an unsupervised clustering algorithm. They first defined seven difficulty-related features, extracted from score files in the MusicXML format[6]. Principal component analysis (PCA) was then applied to project the features into a lower-dimensional space, reducing the feature dimensionality. Finally, hierarchical clustering[7] grouped the scores into three clusters, i.e., three difficulty classes.

1 Support vector machine theory

SVM is a machine learning algorithm grounded in the structural risk minimization principle of statistical learning theory[8]. It has been widely applied in fields such as texture classification[9], text classification[10], face recognition[11], and speech recognition[12]. Theory and practice have shown that SVM is robust to noise and outliers, generalizes well, and can be extended to handle multi-class problems[8].

The key to SVM classification is the kernel function. A kernel function[13] maps a problem that is not linearly separable in a low-dimensional space into a higher-dimensional space where it becomes linearly separable, while avoiding the heavy computation that the added dimensions would otherwise require. Commonly used kernels include the linear kernel function, the polynomial kernel function, and the Gauss radial basis (GRB) kernel function[13]. Because the number of features in this problem is far smaller than the number of samples, and to keep the parameter complexity of the classification model low, this paper adopts the Gauss radial basis kernel function.
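As a minimal sketch (not from the paper), the standard GRB kernel described above, $k(x_i, x_j) = \exp(-\|x_i - x_j\|^2 / 2\sigma^2)$, can be computed directly; the bandwidth `sigma` and the test vectors are illustrative assumptions.

```python
import numpy as np

def grb_kernel(x_i, x_j, sigma=1.0):
    """Standard Gauss radial basis (GRB) kernel: exp(-||x_i - x_j||^2 / (2 sigma^2))."""
    diff = np.asarray(x_i, dtype=float) - np.asarray(x_j, dtype=float)
    return float(np.exp(-np.dot(diff, diff) / (2.0 * sigma ** 2)))

# Identical points map to 1; the value decays toward 0 as distance grows.
print(grb_kernel([0.0, 0.0], [0.0, 0.0]))              # 1.0
print(grb_kernel([0.0, 0.0], [3.0, 4.0], sigma=1.0))   # exp(-25/2)
```

Note that every pairwise kernel value lies in (0, 1], which is why the metric used inside the exponent (rather than the kernel form itself) is the natural place to inject feature-importance information.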

2 Algorithm principle and description

Fig. 1 The framework of the ML-SVM algorithm
2.1 Construction of the feature space

2.2 Learning the distance metric $D_M$ to improve the GRB kernel function

$k_M(\boldsymbol{x}_i,\boldsymbol{x}_j) = \exp\left(-\dfrac{1}{2\sigma^2} D_M(\boldsymbol{x}_i,\boldsymbol{x}_j)\right)$ (1)

$D_M = (\boldsymbol{x}_i-\boldsymbol{x}_j)^{\rm T}\boldsymbol{M}(\boldsymbol{x}_i-\boldsymbol{x}_j)$ (2)

$\boldsymbol{x}_i' = \boldsymbol{L}\boldsymbol{x}_i$ (3)

$d_L(\boldsymbol{x}_i,\boldsymbol{x}_j) = \|\boldsymbol{L}(\boldsymbol{x}_i-\boldsymbol{x}_j)\|_2 = \sqrt{[\boldsymbol{L}(\boldsymbol{x}_i-\boldsymbol{x}_j)]^{\rm T}[\boldsymbol{L}(\boldsymbol{x}_i-\boldsymbol{x}_j)]}$ (4)

$D_M = d_L^2 = [\boldsymbol{L}(\boldsymbol{x}_i-\boldsymbol{x}_j)]^{\rm T}[\boldsymbol{L}(\boldsymbol{x}_i-\boldsymbol{x}_j)] = (\boldsymbol{x}_i-\boldsymbol{x}_j)^{\rm T}\boldsymbol{L}^{\rm T}\boldsymbol{L}(\boldsymbol{x}_i-\boldsymbol{x}_j) = (\boldsymbol{x}_i-\boldsymbol{x}_j)^{\rm T}\boldsymbol{M}(\boldsymbol{x}_i-\boldsymbol{x}_j)$ (5)

$\begin{array}{c} \min\limits_{\boldsymbol{M}\succeq 0}\sum\limits_{(i,j)\in S} D_M(\boldsymbol{x}_i,\boldsymbol{x}_j) \\ {\rm s.t.}\;\;\sum\limits_{(i,j,k)\in R}\left[D_M(\boldsymbol{x}_i,\boldsymbol{x}_k)-D_M(\boldsymbol{x}_i,\boldsymbol{x}_j)\right] \geqslant 1 \end{array}$ (6)
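To make Eqs. (3)-(5) concrete, the sketch below (not from the paper) computes the learned squared distance $D_M = (x_i-x_j)^{\rm T}L^{\rm T}L(x_i-x_j)$ for a given transformation matrix $L$; the matrices and vectors used are illustrative assumptions, not a learned metric.

```python
import numpy as np

def mahalanobis_sq(x_i, x_j, L):
    """Squared learned distance D_M = (x_i - x_j)^T L^T L (x_i - x_j), as in Eq. (5)."""
    d = L @ (np.asarray(x_i, dtype=float) - np.asarray(x_j, dtype=float))
    return float(d @ d)

# With L = I, D_M reduces to the ordinary squared Euclidean distance.
print(mahalanobis_sq([1.0, 2.0], [4.0, 6.0], np.eye(2)))           # 9 + 16 = 25.0

# A diagonal L re-weights each feature by its discriminative power:
# here the first feature counts 4x as much, the second only 0.25x.
print(mahalanobis_sq([1.0, 2.0], [4.0, 6.0], np.diag([2.0, 0.5])))  # 36 + 4 = 40.0
```

In the full algorithm, $L$ (equivalently $M = L^{\rm T}L$) would come from solving the optimization problem in Eq. (6) over the similarity pairs $S$ and relative-comparison triplets $R$ derived from the difficulty-tagged scores.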

2.3 The ML-SVM algorithm

$\begin{array}{c}\max\limits_{\alpha} f(\alpha) = \sum\limits_{i=1}^{p}\alpha_i - \dfrac{1}{2}\sum\limits_{i,j=1}^{p}\alpha_i\alpha_j y_i y_j\, k_M(\boldsymbol{x}_i,\boldsymbol{x}_j) \\ {\rm s.t.}\;\;\sum\limits_{i=1}^{p}\alpha_i y_i = 0 \\ 0\leqslant\alpha_i\leqslant C,\;\;i=1,2,\cdots,p\end{array}$ (7)

$b^* = y_j - \sum\limits_{i=1}^{p} y_i\alpha_i^*\, k_M(\boldsymbol{x}_i,\boldsymbol{x}_j)$ (8)

$f(\boldsymbol{x}) = \operatorname{sgn}\left(\sum\limits_{i=1}^{p} y_i\alpha_i^*\, k_M(\boldsymbol{x},\boldsymbol{x}_i) + b^*\right)$ (9)
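A common way to plug a custom kernel such as $k_M$ from Eq. (1) into a standard SVM solver is to precompute the Gram matrix. The sketch below shows this pattern with scikit-learn; the toy data, the hand-picked metric `M`, and `sigma` are all illustrative assumptions, not the paper's learned metric or datasets.

```python
import numpy as np
from sklearn.svm import SVC

def km_gram(Xa, Xb, M, sigma=1.0):
    """Gram matrix of the metric kernel k_M(x_i, x_j) = exp(-D_M(x_i, x_j) / (2 sigma^2))."""
    G = np.empty((len(Xa), len(Xb)))
    for i, xi in enumerate(Xa):
        for j, xj in enumerate(Xb):
            d = xi - xj
            G[i, j] = np.exp(-(d @ M @ d) / (2.0 * sigma ** 2))
    return G

# Toy two-class data; in practice M = L^T L would come from Eq. (6).
X = np.array([[0.0, 0.1], [0.2, 0.0], [1.0, 0.9], [0.9, 1.1]])
y = np.array([0, 0, 1, 1])
M = np.diag([1.0, 1.0])  # hypothetical metric (identity = plain GRB kernel)

clf = SVC(kernel="precomputed", C=1.0)
clf.fit(km_gram(X, X, M), y)          # train on the Gram matrix of the training set
pred = clf.predict(km_gram(X, X, M))  # rows: test points, columns: training points
print(pred)
```

For multi-class difficulty levels, scikit-learn's `SVC` applies a one-vs-one decomposition automatically, which matches the usual way binary SVMs are extended to multiple classes[16].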

3 Experiments

3.1 Experimental datasets

3.2 Data preprocessing

$\boldsymbol{x}^* = \dfrac{\boldsymbol{x} - \min}{\max - \min}$ (10)
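The min-max normalization of Eq. (10) is applied per feature, so that every feature spans [0, 1] before training. A minimal sketch (the sample matrix is an illustrative assumption):

```python
import numpy as np

def min_max_scale(X):
    """Column-wise min-max normalization, Eq. (10): x* = (x - min) / (max - min)."""
    X = np.asarray(X, dtype=float)
    mn, mx = X.min(axis=0), X.max(axis=0)
    return (X - mn) / (mx - mn)

# Two features on very different scales end up comparably scaled in [0, 1].
X = np.array([[10.0, 200.0],
              [20.0, 400.0],
              [30.0, 300.0]])
print(min_max_scale(X))  # column 0 -> 0, 0.5, 1; column 1 -> 0, 1, 0.5
```

Without this step, features with large numeric ranges would dominate the distance $D_M$ inside the kernel regardless of how discriminative they actually are.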

3.3 Experiments and result analysis

4 Conclusion

[1] CHIU S C, CHEN M S. A study on difficulty level recognition of piano sheet music[C]//IEEE International Symposium on Multimedia. Irvine, CA, USA: IEEE, 2012: 17-23.
[2] ROBNIK-ŠIKONJA M, KONONENKO I. Theoretical and empirical analysis of ReliefF and RReliefF[J]. Machine learning, 2003, 53(1/2): 23-69. DOI: 10.1023/A:1025667309714.
[3] JAMES G, WITTEN D, HASTIE T, et al. An introduction to statistical learning with applications in R[M]. New York: Springer, 2013: 59-102.
[4] SMOLA A J, SCHÖLKOPF B. A tutorial on support vector regression[J]. Statistics and computing, 2003, 14(3): 199-222.
[5] SÉBASTIEN V, RALAMBONDRAINY H, SÉBASTIEN O, et al. Score analyzer: automatically determining scores difficulty level for instrumental e-learning[C]//Proceedings of the 13th International Society for Music Information Retrieval Conference. Porto, Portugal: ISMIR, 2012: 571-576.
[6] CASTAN G, GOOD M, ROLAND P. Extensible markup language (XML) for music applications: an introduction, the virtual score: representation, retrieval, restoration[M]. Cambridge: MIT Press, 2001: 95-102.
[7] WARD J H, Jr. Hierarchical grouping to optimize an objective function[J]. Journal of the American statistical association, 1963, 58(301): 236-244. DOI: 10.1080/01621459.1963.10500845.
[8] 丁世飞, 齐丙娟, 谭红艳. 支持向量机理论与算法研究综述[J]. 电子科技大学学报, 2011, 40(1): 2-10. DING Shifei, QI Bingjuan, TAN Hongyan. An overview on theory and algorithm of support vector machines[J]. Journal of University of Electronic Science and Technology of China, 2011, 40(1): 2-10.
[9] LI Shutao, KWOK J T, ZHU Hailong, et al. Texture classification using the support vector machines[J]. Pattern recognition, 2003, 36(12): 2883-2893. DOI: 10.1016/S0031-3203(03)00219-X.
[10] SIMON T, KOLLER D. Support vector machine active learning with applications to text classification[J]. The journal of machine learning research, 2002, 2: 45-66.
[11] OSUNA E, FREUND R, GIROSIT F. Training support vector machines: an application to face detection[C]//IEEE Computer Society Conference on Computer Vision and Pattern Recognition. San Juan, Puerto Rico, USA: IEEE, 1997: 130-136.
[12] WAN V, CAMPBELL W M. Support vector machines for speaker verification and identification[C]//Neural Networks for Signal Processing X. Proceedings of the 2000 IEEE Signal Processing Society Workshop. Sydney, NSW, Australia: IEEE, 2000, 2: 775-784.
[13] SCHÖLKOPF B, SMOLA A J. Learning with kernels[M]. GMD-Forschungszentrum Informationstechnik, 1998: 5-93.
[14] KULIS B. Metric learning: a survey[J]. Foundations and trends in machine learning, 2012, 5(4): 287-364.
[15] WEINBERGER K Q, SAUL L K. Distance metric learning for large margin nearest neighbor classification[J]. Journal of machine learning research, 2009, 10: 207-244.
[16] HSU C W, LIN C J. A comparison of methods for multiclass support vector machines[J]. IEEE transactions on neural networks, 2002, 13(2): 415-425. DOI: 10.1109/72.991427.
[17] MIDI Manufacturers Association. An introduction to MIDI[M]. California: MIDI Manufacturers Association, 2009: 1-16.
[18] Fours set data sources[EB/OL]. [2015-07-24]. http://www.8notes.com.
[19] HOSMER D W, LEMESHOW S. Applied logistic regression[M]. New York: Wiley, 2000: 31-46.
[20] WESTON J, WATKINS C. Multi-class support vector machines, CSD-TR-98-04[R/OL]. Egham: Royal Holloway University of London, 1998: 1-10.
[21] CHANG C C, LIN C J. LIBSVM: a library for support vector machines[J/OL]. ACM transactions on intelligent systems and technology, 2011, 2(3): 27.
[22] 徐晓明. SVM参数寻优及其在分类中的应用[D]. 大连: 大连海事大学, 2014: 6-58. XU Xiaoming. SVM parameter optimization and its application in the classification[D]. Dalian: Dalian Maritime University, 2014: 6-58.