Detailed record for thesis/dissertation etd-0212107-164030
Title page for etd-0212107-164030
Title
Content-based MIDI Music Retrieval and Computer-aided Composition Based on Musical Whistling
Department
Year, semester
Language
Degree
Number of pages
98
Author
Hung-Che Shen
Advisor
Chung-Nan Lee
Convenor
Advisory Committee
Date of Exam
2007-01-05
Date of Submission
2007-02-12
Keywords
Harmonic Table, Whistle-to-Music, Whistle-for-Music, Melodic Form, Whistle-to-MIDI
Statistics
The thesis/dissertation has been browsed 5738 times and downloaded 1866 times.
Chinese Abstract
Content-based MIDI Music Retrieval and Computer-aided Composition Based on Musical Whistling
Hung-Che Shen*  Chung-Nan Lee+
ABSTRACT
This dissertation addresses two topics: retrieving MIDI music by whistled input ("Whistle-for-Music") and composing music by whistled input ("Whistle-to-Music"). The Whistle-for-Music system lets a user retrieve MIDI-format music by whistling a melodic fragment. It comprises three subsystems: (1) query processing, (2) MIDI preprocessing, and (3) an approximate search engine. In query processing, we achieve real-time and reliable conversion of a whistled melody into MIDI signals. MIDI preprocessing automatically extracts melodic features from MIDI files, covering individual notes, relative (local) note relations, and global note statistics. The proposed approximate search engine ranks results by how closely their melodic features match the query. To evaluate the Whistle-for-Music system, we measured query turnaround time, unique-prefix searching over the melodic features, the relation between error density and target hit rate, and the statistical distribution of the global melodic features; the experimental results demonstrate the feasibility and effectiveness of searching a music database by whistled input. The Whistle-to-Music subsystem lets a user create music from a whistled melodic fragment. First, the whistle-to-MIDI conversion derives a sequence of MIDI notes from the whistled melody according to pitch templates and rhythm templates; this fragment is then developed as a melodic motive. For music representation, we propose melodic form tables and harmonic tables to support computer-aided composition. The melodic form tables capture melodic organization and melodic development techniques, and serve as a template database for composition; the melodic forms covered are the parallel period, the contrasting period, the three-phrase period, the double period, non-period structures, and phrase groups. The harmonic tables specify chord progressions and form the basis of the automatic harmonization algorithm; we propose three harmonic tables, based respectively on the root-motion, functional, and scale-degree theories of harmony. Finally, for the Whistle-to-Music system, we present working examples that demonstrate the automatic composition process from whistling and its real-time interactive computer performance.
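The whistle-to-MIDI conversion described above maps a whistled melody to MIDI note events. As a rough illustration only (not the dissertation's implementation, which additionally applies pitch templates and rhythm templates), the following Python sketch assumes frame-level pitch estimates in Hz are already available from a front-end pitch tracker, quantizes them to MIDI note numbers, and groups them into note events; the function names and parameters here are hypothetical.

    # Minimal sketch of a whistle-to-MIDI style conversion (illustrative only).
    import math

    def hz_to_midi(freq_hz):
        """Convert a frequency in Hz to the nearest MIDI note number."""
        return int(round(69 + 12 * math.log2(freq_hz / 440.0)))

    def frames_to_notes(frame_pitches_hz, frame_hop=0.01, min_frames=3):
        """Group consecutive frames with the same quantized pitch into
        (onset_sec, duration_sec, midi_pitch) events; 0/None frames are rests,
        and runs shorter than min_frames are discarded as noise."""
        notes, run_pitch, run_start, run_len = [], None, 0, 0
        for i, f in enumerate(frame_pitches_hz + [None]):   # sentinel flushes the last run
            p = hz_to_midi(f) if f else None
            if p == run_pitch:
                run_len += 1
                continue
            if run_pitch is not None and run_len >= min_frames:
                notes.append((run_start * frame_hop, run_len * frame_hop, run_pitch))
            run_pitch, run_start, run_len = p, i, 1
        return notes

    # Example: a whistled A4 followed by C5 (440 Hz, then ~523 Hz)
    print(frames_to_notes([440.0] * 20 + [523.25] * 20))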
Abstract
Content-based MIDI Music Retrieval and Computer-aided Composition Based on Musical Whistling
Hung-Che Shen*  Chung-Nan Lee+
ABSTRACT
In this dissertation, we focus on content-based MIDI music retrieval and computer-aided composition based on musical whistling. For MIDI music retrieval, a prototype system called “Whistle-for-Music” is developed. This system enables users to retrieve MIDI-format music by whistling a melodic fragment. It consists of three essential components: query processing, MIDI preprocessing, and an approximate search engine. For query processing, we have built a real-time and robust whistle-to-MIDI converter. For MIDI preprocessing, the features extracted from MIDI files comprise individual, local, and global melodic descriptions. To match a query pattern against targets, we extend an existing search engine into a fast approximate melodic matching engine. We evaluate the “Whistle-for-Music” system systematically; the experiments cover query turnaround time, unique-prefix searching, error density versus match rank percent, and statistics of the global descriptions. The results show how retrieval behavior scales with the queries and targets. For computer-aided composition, a system called “Whistle-to-Music” is presented; it eases both melody input and musical composition. First, our “Whistle-to-MIDI” transformation translates a whistled tune into a sequence of notes, each defined by onset, duration, velocity, and pitch. This transcription is relative, since we apply pitch templates and rhythm templates. Based on the resulting melodic motives, we propose melodic form templates and harmonic tables for the tasks of motive development and automatic harmonization. We demonstrate that the proposed melodic form templates can produce various well-formed songs from one or a few basic motives, and that the harmonic tables produce classic-style harmonizations based on three theories. The process of the “Whistle-to-Music” system is illustrated with examples of melody transcription and motive development, and by demonstrating the music produced by automatic harmonization; the proposed harmonic tables yield well-formed harmonizations for a variety of whistled tunes.

* Author
+ Advisor
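To give a flavor of approximate melodic matching on relative descriptions, here is a generic Python sketch, not the extended search engine used in the dissertation: queries and targets are reduced to interval sequences (which are key-invariant), and targets are ranked by edit distance to the query. All names and data below are illustrative assumptions.

    # Illustrative approximate melodic matching by interval strings.

    def intervals(midi_pitches):
        """Relative (key-invariant) description: successive pitch differences."""
        return [b - a for a, b in zip(midi_pitches, midi_pitches[1:])]

    def edit_distance(a, b):
        """Classic dynamic-programming edit distance between two sequences."""
        prev = list(range(len(b) + 1))
        for i, x in enumerate(a, 1):
            cur = [i]
            for j, y in enumerate(b, 1):
                cur.append(min(prev[j] + 1,              # deletion
                               cur[j - 1] + 1,           # insertion
                               prev[j - 1] + (x != y)))  # substitution
            prev = cur
        return prev[-1]

    def rank_targets(query_pitches, target_db):
        """Rank (name, pitches) targets by interval edit distance to the query."""
        q = intervals(query_pitches)
        scored = [(edit_distance(q, intervals(p)), name) for name, p in target_db]
        return sorted(scored)

    db = [("Ode to Joy", [64, 64, 65, 67, 67, 65, 64, 62]),
          ("Frere Jacques", [60, 62, 64, 60, 60, 62, 64, 60])]
    print(rank_targets([66, 66, 67, 69], db))   # whistled fragment in another key

Because the matching works on intervals rather than absolute pitches, the transposed fragment in the example still ranks its source melody first, which is the behavior a relative melodic description is meant to provide.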
Table of Contents
List of Figures
List of Tables
Chinese Abstract
Abstract
1. Introduction
1.1 Related Work
1.1.1 Signal Processing
1.1.2 Computer Music Science
1.1.3 Information Retrieval
1.1.4 Computer-aided Composition
1.1.5 Automatic Harmonization
1.2 Motivation
1.3 Contributions of This Dissertation
2. Background Materials
2.1 The Spectrogram
2.2 Melody Transcription of Whistling
3. Whistle-for-Music System
3.1 Design Issues
3.2 “Whistle-for-Music” System Overview
3.3 Melody Descriptions
3.3.1 A Sight-whistling Tutor
3.4 MIDI File Preprocessing
3.4.1 MIDI File Disassembler
3.4.2 Melody Extraction
3.4.3 Melodic Segmentation
3.4.4 Melody Standardization
3.5 Melodic Approximate String Matching
3.5.1 Search Options
3.5.2 Combination of Pitch and Rhythm in Similarity Retrieval
3.6 Evaluation and Discussion
4. Whistle-to-Music System
4.1 Overview of “Whistle-to-Music”
4.2 Whistle-to-MIDI Transcription
4.2.1 Transcription Templates
4.2.2 The GUI of Whistle-to-MIDI
4.3 Motive Development
4.3.1 Melodic Units and Melodic Forms
4.3.2 A Database of Melodic Form Templates
4.3.3 Motivic Functions
4.3.4 Motivic Development Process
4.4 Automatic Harmonization
4.4.1 The Scale-degree Theory
4.4.2 The Root-motion Theory
4.4.3 Function Theory
4.4.4 Procedures of Automatic Harmonization
4.4.5 User Interface of a “Motive Editor”
5. Conclusion and Future Work
References
Appendix
Fulltext
This electronic full text is licensed for searching, reading, and printing by individual users for non-profit academic research purposes only. Please comply with the relevant provisions of the Copyright Act of the Republic of China; do not reproduce, distribute, adapt, repost, or broadcast it without authorization.
Thesis access permission: withheld for one year, then released for both on-campus and off-campus access.
Available:
On campus: available
Off campus: available


Printed copies
Availability information for printed copies is relatively complete from academic year 102 onward. To check the availability of printed copies from academic year 101 or earlier, please contact the printed-thesis service desk of the Office of Library and Information Services. We apologize for any inconvenience.
Available: available
