Title page for etd-0802107-114724
論文名稱 Title: 將資源使用分析應用於軟體測試 (Applying Resource Usage Analysis to Software Testing)
系所名稱 Department:
畢業學年期 Year, semester:
語文別 Language:
學位類別 Degree:
頁數 Number of pages: 62
研究生 Author:
指導教授 Advisor:
召集委員 Convenor:
口試委員 Advisory Committee:
口試日期 Date of Exam: 2007-06-25
繳交日期 Date of Submission: 2007-08-02
關鍵字 Keywords: resource-oriented testing, double free, memory leak, software testing
統計 Statistics: This thesis has been viewed 5954 times and downloaded 8 times.
中文摘要 Chinese Abstract
With the growth of software and networks, today's software has become increasingly complex, and attacks that exploit software weaknesses pose a severe challenge to software testing. CSI/FBI survey reports show that Denial-of-Service attacks have ranked among the top five most damaging network attacks over the past three years. Besides bandwidth-consuming attacks, exploiting weaknesses in a program's resource usage is one of the techniques attackers use most often. This study finds that traditional software testing methods cannot discover such weaknesses effectively, because their goal is to verify the logical correctness of a program; this mindset leaves vulnerabilities that are not logic errors, such as memory leaks, undetected.
On the other hand, new testing methods aimed at resource-exhaustion vulnerabilities have been proposed one after another, but the results they present are rather primitive. This study therefore attempts to give software testing a new definition from the perspective of resource usage and proposes three test criteria. Testers can combine these criteria with existing tools to test a program's resource usage, and with their guidance can find unhealthy resource-usage behavior more effectively.
Abstract
With the development of software and network environments, software has become more and more complex. Network attacks that exploit software vulnerabilities pose a severe challenge to traditional software testing. According to the CSI/FBI survey, losses caused by Denial-of-Service attacks have remained among the top five network attacks for the past three years. Besides consuming network bandwidth, the most common attack is to exploit vulnerabilities in how software uses its resources. In this research, I found that traditional testing techniques cannot find these vulnerabilities efficiently because they only verify the correctness of the software. This way of thinking bypasses many vulnerabilities that are not logic errors, such as memory leaks.
On the other hand, several testing techniques addressing resource usage vulnerabilities have been proposed in recent years, but their results are still primitive. Thus, I try to give software testing a new definition based on resource usage analysis and propose three test criteria in this thesis. Testers can combine these criteria with existing tools as a guide for testing the resource usage of a program. With these criteria, testers can find unhealthy usage of software resources more effectively.
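As a concrete illustration of the two defect classes named in the keywords, the short C program below contains a memory leak and a double free. The sketch is added here for clarity and is not code from the thesis; the helper names copy_message and handle_request are invented for the example. Both defects leave the program's observable output correct, so a purely logic-oriented test oracle does not flag them, which is exactly the gap the resource-oriented criteria aim at.

#include <stdlib.h>
#include <string.h>

/* Illustrative sketch only: two resource-usage defects that do not
 * change the program's visible behaviour for ordinary inputs. */

/* Memory leak: the early-return path forgets to free the buffer. */
static char *copy_message(const char *src)
{
    char *buf = malloc(64);
    if (buf == NULL)
        return NULL;
    if (strlen(src) >= 64)
        return NULL;               /* leak: buf is never freed on this path */
    strcpy(buf, src);
    return buf;                    /* caller is expected to free buf */
}

/* Double free: the retry path releases the same block twice. */
static void handle_request(int retry)
{
    char *session = malloc(32);
    if (session == NULL)
        return;
    if (retry)
        free(session);             /* first release */
    free(session);                 /* second release when retry != 0: double free */
}

int main(void)
{
    char *m = copy_message("a message longer than sixty-four characters ............................");
    free(m);                       /* m is NULL here; the 64-byte block is lost */
    handle_request(0);             /* passing 1 would execute the double free */
    return 0;
}

Running this sketch under a dynamic checker such as Valgrind (reference 8 below), for example with valgrind --leak-check=full ./a.out, should report the 64-byte block allocated in copy_message as definitely lost; observations of this kind are what the thesis's resource-oriented test criteria are meant to guide systematically.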
目次 Table of Contents
1. Introduction 1
1.1. The problem of existing methods 4
1.2. The importance of this work 5
1.3. Paper organization 6
2. Related Works 7
2.1. Software testing 7
2.1.1. Structural Testing 8
2.1.1.1. Statement Coverage 8
2.1.1.2. Branch Coverage 9
2.1.1.3. Basic Path Coverage 10
2.1.2. Functional testing 10
2.1.3. Performance testing 11
2.1.4. Load testing 12
2.1.5. Stress testing 12
2.2. Software aging 13
2.3. Memory management dilemma 13
2.3.1. The memory management solutions 14
2.3.1.1. Garbage Collection (reference counting, tracing collector) 15
2.3.1.2. Dynamic testing 16
2.3.1.3. Static analysis 18
3. Resource-Oriented testing 19
3.1. The memory management problems in C language 19
3.1.1. Buffer overflow 19
3.1.2. Memory leak 20
3.1.3. Double free 21
3.2. Resource usage cycle 22
3.3. The functions related to memory usage life cycle 23
3.3.1. Create Function and Dispose Function 23
3.3.2. Modify function 24
3.4. The difference between logical testing and resource-oriented testing 25
3.4.1. Unobvious error feature 25
3.4.2. Reduced flow graph and path amount 26
3.4.3. Memory effect testing: Single test run & multiple test run 26
3.5. Resource-oriented test criteria 28
3.5.1. Basic test criterion 29
3.5.2. Create-dispose-free pair test criterion 31
3.6. Handling memory blocks among the functions 32
3.7. Combine the test technique with the criteria 33
4. Experiment Result 36
4.1. Experiment process 37
4.2. Memory leak patterns and case analysis 38
4.3. Analysis of double free case 44
4.4. Special case 46
4.5. Path complexity analysis 47
5. Conclusion 49
6. Research Constraint and Future work 50
7. Reference 51
參考文獻 References
1. Gelperin, D. and B. Hetzel, The growth of software testing. 1988, ACM Press. p. 687-695.
2. Myers, G.J., et al., The Art of Software Testing. 2004: John Wiley and Sons.
3. Hetzel, B., The complete guide to software testing. 1988: QED Information Sciences, Inc. Wellesley, MA, USA.
4. United States Department of Justice. Available from: http://www.usdoj.gov/criminal/cybercrime/index.html.
5. Bush, W.R., J.D. Pincus, and D.J. Sielaff, A static analyzer for finding dynamic programming errors. 2000. p. 775-802.
6. Engler, D., et al., Bugs as deviant behavior: a general approach to inferring errors in systems code, in Proceedings of the eighteenth ACM symposium on Operating systems principles. 2001, ACM Press: Banff, Alberta, Canada.
7. Hastings, R. and B. Joyce, Purify: Fast detection of memory leaks and access errors. 1992. p. 125-136.
8. Valgrind. Available from: http://www.valgrind.org/.
9. Evans, D., Static detection of dynamic memory errors. 1996, ACM Press. p. 44-53.
10. Chen, P.-K., An Automated Method for Resource Testing. 2006.
11. Beizer, B., Software testing techniques. 1990: Van Nostrand Reinhold Co. New York, NY, USA.
12. Ntafos, S.C., A comparison of some structural testing strategies. Software Engineering, IEEE Transactions on, 1988. 14(6): p. 868-874.
13. Apache Foundation. Available from: http://www.apache.org/.
14. joedog.org. Available from: http://www.joedog.org/.
15. httperf home page. Available from: http://www.hpl.hp.com/research/linux/httperf/.
16. Standard Performance Evaluation Corporation. Available from: http://www.spec.org/.
17. Transaction Processing Performance Council. Available from: http://www.tpc.org/.
18. McGee, P. and C. Kaner, Experiments with High Volume Test Automation.
19. Kaner, C., W. Bond, and P. McGee, High Volume Test Automation. 2004.
20. Berndt, D.J. and A. Watkins, High Volume Software Testing using Genetic Algorithms. 2005, IEEE Computer Society Washington, DC, USA.
21. Briand, L., Y. Labiche, and M. Shousha, Using genetic algorithms for early schedulability analysis and stress testing in real-time systems. Genetic Programming and Evolvable Machines, 2006. 7(2): p. 145-170.
22. Zhang, J. and S.C. Cheung, Automated test case generation for the stress testing of multimedia systems. 2002. p. 1411-1435.
23. Garg, S., et al. A methodology for detection and estimation of software aging. 1998.
24. Huang, Y., et al. Software Rejuvenation: Analysis, Module and Applications. 1995.
25. Vaidyanathan, K. and K.S. Trivedi. A measurement-based model for estimation of resource exhaustion in operational software systems. 1999.
26. Chillarege, R., S. Biyani, and J. Rosenthal, Measurement of Failure Rate in Widely Distributed Software. 1995. p. 424–433.
27. Iyer, R.K. and D.J. Rossetti, Effect of System Workload on Operating System Reliability: A Study on IBM 3081. Software Engineering, IEEE Transactions on, 1985. SE-11(12): p. 1438-1448.
28. Lee, I., R.K. Iyer, and A. Mehta. Identifying software problems using symptoms. 1994.
29. Iyer, R.K., L.T. Young, and P.V.K. Iyer, Automatic recognition of intermittent failures: an experimental study of field data. Computers, IEEE Transactions on, 1990. 39(4): p. 525-537.
30. Tang, D. and R.K. Iyer, Dependability measurement and modeling of a multicomputer system. Computers, IEEE Transactions on, 1993. 42(1): p. 62-75.
31. Thakur, A. and R.K. Iyer, Analyze-NOW-an environment for collection and analysis of failures in a network of workstations. Reliability, IEEE Transactions on, 1996. 45(4): p. 561-570.
32. Common Vulnerabilities and Exposures. Available from: http://cve.mitre.org/.
33. Cowan, C., et al., StackGuard: Automatic Adaptive Detection and Prevention of Buffer-Overflow Attacks.
34. TWCERT/CC. Available from: http://www.cert.org.tw/.
35. Jones, R. and R. Lins, Garbage collection: algorithms for automatic dynamic memory management. 1996: John Wiley & Sons, Inc. New York, NY, USA.
36. Blackburn, S.M. and K.S. McKinley, Ulterior reference counting: fast garbage collection without a long wait, in Proceedings of the 18th annual ACM SIGPLAN conference on Object-oriented programming, systems, languages, and applications. 2003, ACM Press: Anaheim, California, USA.
37. Erickson, C., Memory leak detection in embedded systems. 2002, Specialized Systems Consultants, Inc. Seattle, WA, USA.
38. Evans, D., et al., LCLint: a tool for using specifications to check code. 1994, ACM Press New York, NY, USA. p. 87-96.
39. Cowan, C., et al., PointGuard™: Protecting Pointers From Buffer Overflow Vulnerabilities.
電子全文 Fulltext
This electronic fulltext is licensed to users only for personal, non-commercial searching, reading, and printing for the purpose of academic research. Please comply with the relevant provisions of the Copyright Act of the Republic of China (Taiwan); do not reproduce, distribute, adapt, repost, or broadcast it, so as to avoid infringement.
論文使用權限 Thesis access permission: open access on campus after one year; withheld off campus permanently.
開放時間 Available:
校內 Campus: available
校外 Off-campus: not available

紙本論文 Printed copies
Public-access information for printed copies is relatively complete from academic year 102 onward. To check the availability of a printed copy from academic year 101 or earlier, please contact the printed thesis service desk of the Office of Library and Information Services. We apologize for any inconvenience.
開放時間 Available: available
