A baseline study of potentially malicious activity across five network telescopes
- Authors: Irwin, Barry V W
- Date: 2013
- Subjects: To be catalogued
- Language: English
- Type: text , article
- Identifier: http://hdl.handle.net/10962/429714 , vital:72634 , https://ieeexplore.ieee.org/abstract/document/6568378
- Description: This paper explores the Internet Background Radiation (IBR) observed across five distinct network telescopes over a 15-month period. Each network telescope consists of a /24 netblock and is deployed in IP space administered by TENET, the tertiary education network in South Africa, covering three numerically distant /8 network blocks. The differences and similarities in the observed network traffic are explored. Two anecdotal case studies are presented relating to the MS08-067 and MS12-020 vulnerabilities in the Microsoft Windows platform. The first of these relates to the Conficker worm outbreak in 2008, and traffic targeting 445/tcp remains one of the top constituents of IBR as observed on the telescopes. The case of MS12-020 is of interest, as a long period of scanning activity targeting 3389/tcp, used by the Microsoft RDP service, was observed, with a significant drop in activity following the release of the security advisory and patch. Other areas of interest are highlighted, particularly where correlation in scanning activity was observed across the sensors. The paper concludes with some discussion on the application of network telescopes as part of a cyber-defence solution.
- Full Text:
Audio Device Representation, Control, and Monitoring Using SNMP
- Eales, Andrew, Foss, Richard
- Authors: Eales, Andrew , Foss, Richard
- Date: 2013
- Language: English
- Type: text , article
- Identifier: http://hdl.handle.net/10962/426855 , vital:72396 , https://www.aes.org/e-lib/browse.cfm?elib=17012
- Description: The Simple Network Management Protocol (SNMP) is widely used to configure and monitor networked devices. The architecture of complex audio devices can be elegantly represented using SNMP tables. Carefully considered table indexing schemes support a logical device model that can be accessed using standard SNMP commands. This paper examines the use of SNMP tables to represent the architecture of audio devices. A representational scheme that uses table indexes to provide direct access to context-sensitive SNMP data objects is presented. The monitoring of parameter values and the implementation of connection management using SNMP are also discussed.
- Full Text:
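The indexing scheme the abstract describes can be illustrated with a small sketch: a table whose composite index addresses each data object directly, so a manager can GET or SET a cell without walking the table. The index components (section, channel, parameter) and the gain values below are illustrative assumptions, not the MIB from the paper.

```python
# Conceptual model of an SNMP-style table for an audio device: the composite
# row index (section, channel, parameter) gives direct access to each
# context-sensitive data object, mirroring indexed table access in SNMP.
class AudioDeviceTable:
    def __init__(self):
        self._rows = {}

    def set(self, section, channel, param, value):
        # Analogue of an SNMP SET on a table cell addressed by its index.
        self._rows[(section, channel, param)] = value

    def get(self, section, channel, param):
        # Analogue of an SNMP GET: the index encodes the device context,
        # so no table walk is needed to locate the object.
        return self._rows[(section, channel, param)]

dev = AudioDeviceTable()
dev.set("input", 1, "gain", -6.0)
dev.set("input", 2, "gain", 0.0)
```

Because the index itself carries the device context, monitoring a parameter on a specific channel is a single indexed lookup rather than a search through the device model.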
Developing a virtualised testbed environment in preparation for testing of network based attacks
- Van Heerden, Renier, Pieterse, Heloise, Burke, Ivan, Irwin, Barry V W
- Authors: Van Heerden, Renier , Pieterse, Heloise , Burke, Ivan , Irwin, Barry V W
- Date: 2013
- Subjects: To be catalogued
- Language: English
- Type: text , article
- Identifier: http://hdl.handle.net/10962/429648 , vital:72629 , 10.1109/ICASTech.2013.6707509
- Description: Computer network attacks are difficult to simulate due to the damage they may cause to live networks and the complexity required to simulate a useful network. We constructed a virtualised network within a vSphere ESXi environment which is able to simulate thirty workstations, ten servers, three distinct network segments and the accompanying network traffic. The vSphere environment provided added benefits, such as the ability to pause, restart and snapshot virtual computers. These abilities enabled the authors to reset the simulation environment before each test and mitigated the damage that an attack could potentially inflict on the test network. Without simulated network traffic, the virtualised network was too sterile. This resulted in any network event being a simple task to detect, making network traffic simulation a requirement for an event detection test bed. Five main kinds of traffic were simulated: web browsing, file transfer, e-mail, version control and intranet file traffic. The simulated traffic volumes were pseudo-randomised to represent differing temporal patterns. By building a virtualised network with simulated traffic we were able to test IDSs and other network attack detection sensors in a much more realistic environment before moving them to a live network. The goal of this paper is to present a virtualised testbed environment in which network attacks can safely be tested.
- Full Text:
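Pseudo-randomised traffic volumes of the kind the abstract mentions can be sketched as a seeded schedule: the traffic classes below follow the paper, but the base rates and the jitter model are illustrative assumptions.

```python
# Sketch of pseudo-randomised traffic scheduling for a simulated network.
# Seeding the generator makes each run reproducible, so the testbed can be
# reset to an identical traffic pattern before every attack test.
import random

TRAFFIC_CLASSES = {
    "web-browsing": 120,      # assumed base flows per hour
    "file-transfer": 15,
    "email": 40,
    "version-control": 10,
    "intranet-file": 25,
}

def hourly_schedule(hours=24, seed=42):
    rng = random.Random(seed)
    schedule = []
    for hour in range(hours):
        # Jitter each base rate by +/-50% to vary temporal patterns.
        volumes = {cls: round(base * rng.uniform(0.5, 1.5))
                   for cls, base in TRAFFIC_CLASSES.items()}
        schedule.append((hour, volumes))
    return schedule

plan = hourly_schedule()
```

Re-running with the same seed reproduces the schedule exactly, which matches the snapshot-and-reset workflow described above.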
Implementation of AES-64 connection management for Ethernet Audio/Video Bridging devices
- Dibley, James, Foss, Richard
- Authors: Dibley, James , Foss, Richard
- Date: 2013
- Language: English
- Type: text , article
- Identifier: http://hdl.handle.net/10962/426875 , vital:72397 , https://www.aes.org/e-lib/online/browse.cfm?elib=17014
- Description: AES-64 is a standard for the discovery, enumeration, connection management, and control of multimedia network devices. This paper describes the implementation of an AES-64 protocol stack and control application on devices that support the IEEE Ethernet Audio/Video Bridging standards for streaming multimedia, enabling connection management of network audio streams.
- Full Text:
Insights from experimental economics on local cooperation in a small-scale fishery management system
- Aswani, Shankar, Gurney, Georgina G, Mulville, Sara, Matera, Jaime, Gurven, Michael
- Authors: Aswani, Shankar , Gurney, Georgina G , Mulville, Sara , Matera, Jaime , Gurven, Michael
- Date: 2013
- Language: English
- Type: text , article
- Identifier: http://hdl.handle.net/10962/145437 , vital:38438 , DOI: 10.1016/j.gloenvcha.2013.08.003
- Description: Cooperation is central to collective management of small-scale fisheries, including marine protected areas. Thus an understanding of the factors influencing stakeholders’ propensity to cooperate to achieve shared benefits is essential to accomplishing successful collective fisheries management. In this paper we study stakeholders’ cooperative behavioral disposition and elucidate the role of various socio-economic factors in influencing it in the Roviana Lagoon, Western Solomon Islands. We employed a Public Goods Game from experimental economics, tailored to mimic the problem of common pool fisheries management, to elucidate people’s cooperative behavior. Using Ostrom's framework for analyzing social-ecological systems to guide our analysis, we examined how individual-scale variables (e.g., age, education, family size, ethnicity, occupational status, personal norms), in the context of village-scale variables (e.g., village, governance institutions, group coercive action), influence cooperative behavior, as indexed by game contribution. Ostrom's framework provides an effective window for conceptually peeling back the various socio-economic and governance layers which influence cooperation within these communities.
- Full Text:
Prevalence of sustainability reporting practices of a sample of listed companies on established and emerging stock exchanges
- Turk, Brendan K, Shackleton, Charlie M, Whittington-Jones, Kevin J
- Authors: Turk, Brendan K , Shackleton, Charlie M , Whittington-Jones, Kevin J
- Date: 2013
- Language: English
- Type: article , text
- Identifier: http://hdl.handle.net/10962/60995 , vital:27908 , DOI: https://doi.org/10.4102/sajems.v16i1.234
- Description: The business sector has a substantial role in addressing current environmental issues and concerns. Consequently, there is a growing adoption of corporate sustainability principles and practices across all market sectors. This study examined four developed and four emerging stock markets and the sustainability reporting practices of the top 20 and bottom 20 companies in each. The results illustrate that the developed market sector was more advanced in its corporate sustainability reporting, both in the proportion of companies issuing a sustainability report (approximately 60 per cent) and the proportion of company webpages dedicated to sustainability reporting. This difference was largely due to the effect of the top 20 companies. There was little difference between developed and developing markets when only the bottom 20 companies were considered, of which less than one-third provided sustainability reports. These results show that sustainability reporting is prevalent in both developed and developing markets, especially among market leading companies, but that overall, most developing markets have some catching up to do.
- Full Text:
Synthesis and magnetic properties of a superparamagnetic nanocomposite pectin-magnetite nanocomposite
- Namanga, Jude, Foba, Josepha, Ndinteh, Derek T, Yufanyi, Divine M, Krause, Rui W M
- Authors: Namanga, Jude , Foba, Josepha , Ndinteh, Derek T , Yufanyi, Divine M , Krause, Rui W M
- Date: 2013
- Language: English
- Type: text , article
- Identifier: http://hdl.handle.net/10962/125075 , vital:35726 , https://doi.org/10.1155/2013/137275
- Description: Magnetic nanocomposites composed of superparamagnetic magnetite nanoparticles in a pectin matrix were synthesized by an in situ coprecipitation method. The pectin matrix acted as a stabilizer and size control host for the magnetite nanoparticles (MNPs), ensuring particle size homogeneity. The effects of the different reactant ratios and nanocomposite drying conditions on the magnetic properties were investigated. The nanocomposites were characterized by X-ray diffraction (XRD), scanning electron microscopy (SEM), transmission electron microscopy (TEM), energy-dispersive X-ray spectroscopy (EDX), Fourier-transform infrared (FT-IR) spectroscopy, and a superconducting quantum interference device (SQUID) magnetometer. Superparamagnetic magnetite nanoparticles with mean diameters of 9 and 13 nm were obtained, and the freeze-dried nanocomposites had a saturation magnetization of 54 and 53 emu/g, respectively.
- Full Text:
Visualization of a data leak
- Swart, Ignus, Grobler, Marthie, Irwin, Barry V W
- Authors: Swart, Ignus , Grobler, Marthie , Irwin, Barry V W
- Date: 2013
- Language: English
- Type: text , article
- Identifier: http://hdl.handle.net/10962/428584 , vital:72522 , 10.1109/ISSA.2013.6641046
- Description: The potential impact that data leakage can have on a country, both on a national level as well as on an individual level, can be wide reaching and potentially catastrophic. In January 2013, several South African companies became the target of a hack attack, resulting in the breach of security measures and the leaking of a claimed 700,000 records. The affected companies are spread across a number of domains, making the impact of the leak very wide. The aim of this paper is to analyze the data released from the South African breach and to visualize the extent of the loss by the companies affected. The value of this work lies in its connection to and interpretation of related South African legislation. The data extracted during the analysis is primarily personally identifiable information, as defined by the Electronic Communications and Transactions Act of 2002 and the Protection of Personal Information Bill of 2009.
- Full Text:
Walking The Other Side: Doung Anwar Jahangeer
- Authors: Simbao, Ruth K
- Date: 2013
- Language: English
- Type: text , article
- Identifier: http://hdl.handle.net/10962/147681 , vital:38660 , https://0-doi.org.wam.seals.ac.za/10.1080/09528822.2013.796204
- Description: While certain forms of mobility are romanticized in the privileged worlds of art and academia, the need to move is often triggered by vulnerability, and literal pathways on the ground reveal much about human engagement with place. This article considers the work of Mauritian-born architect/artist/performer Doung Anwar Jahangeer who is based in Durban, South Africa. Inspired by Michel de Certeau, Jahangeer argues that pathways reveal the characteristics of society and uses the act of walking to question the degree to which meaningful transformation has taken place in South Africa. His City Walk performances disclose to audiences how grounded ways of engaging with movement can challenge the metaphoric blindness that handicaps privileged movement. Focusing on his performances from the 2012 ‘Making Way’ exhibition the author interprets Jahangeer's work as challenging blind spots with regard to space, particularly partial spaces still marred by Apartheid. Through performative walking he encourages his audiences to read between the lines of road markings, cracks and signs, and to experience the power of corporeally engaging with the road by thoughtfully placing one foot in front of the other as a mode of seeing.
- Full Text:
An Analysis and Implementation of Methods for High Speed Lexical Classification of Malicious URLs
- Egan, Shaun P, Irwin, Barry V W
- Authors: Egan, Shaun P , Irwin, Barry V W
- Date: 2012
- Subjects: To be catalogued
- Language: English
- Type: text , article
- Identifier: http://hdl.handle.net/10962/429757 , vital:72637 , https://digifors.cs.up.ac.za/issa/2012/Proceedings/Research/58_ResearchInProgress.pdf
- Description: Several authors have put forward methods of using Artificial Neural Networks (ANN) to classify URLs as malicious or benign by using lexical features of those URLs. These methods have been compared to other methods of classification, such as blacklisting and spam filtering, and have been found to be comparably accurate; early attempts proved to be highly accurate. Fully featured classifiers use lexical features as well as lookups to classify URLs, including (but not limited to) blacklists, spam filters and reputation services. These classifiers are based on the Online Perceptron Model, using a single neuron as a linear combiner, and use lexical features that rely on the presence (or lack thereof) of words belonging to a bag-of-words. Several obfuscation-resistant features are also used to increase the positive classification rate of these perceptrons. Examples of these include URL length, number of directory traversals and length of arguments passed to the file within the URL. In this paper we describe how we implement the online perceptron model and the methods we used to try to increase the accuracy of this model through the use of hidden layers and training cost validation. We discuss our results in relation to those of other papers, as well as other analysis performed on the training data and the neural networks themselves, to best understand why they are so effective. Also described are the proposed model for developing these neural networks, how to implement them in the real world through the use of browser extensions, proxy plugins and spam filters for mail servers, and our current implementation. Finally, work that is still in progress is described, including other methods of increasing accuracy through the use of modern training techniques and testing in a real-world environment.
- Full Text:
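The core technique the abstract describes, a single neuron as a linear combiner over lexical URL features, can be sketched in a few lines. The feature set, bag-of-words, learning rate and training URLs below are illustrative assumptions, not the ones used in the paper.

```python
# Minimal online perceptron for lexical URL classification: one neuron,
# trained sample by sample, updating weights only on misclassification.

def features(url):
    # Obfuscation-resistant lexical features (length, directory depth),
    # scaled to similar magnitudes, plus bag-of-words presence indicators.
    bag = ["login", "secure", "update", "free", "click"]
    return [len(url) / 100.0, url.count("/") / 10.0] + \
           [1.0 if w in url.lower() else 0.0 for w in bag]

class OnlinePerceptron:
    def __init__(self, n_features, lr=0.1):
        self.w = [0.0] * n_features
        self.b = 0.0
        self.lr = lr

    def predict(self, x):
        # Linear combiner with a hard threshold: 1 = malicious, 0 = benign.
        s = self.b + sum(wi * xi for wi, xi in zip(self.w, x))
        return 1 if s > 0 else 0

    def update(self, x, y):
        # Standard perceptron rule: adjust weights only when the
        # prediction disagrees with the label.
        err = y - self.predict(x)
        if err:
            self.w = [wi + self.lr * err * xi for wi, xi in zip(self.w, x)]
            self.b += self.lr * err

training = [("http://example.com/index.html", 0),
            ("http://a.b/free-login/secure/update/click/now", 1)]
model = OnlinePerceptron(n_features=7)
for _ in range(20):
    for url, label in training:
        model.update(features(url), label)
```

Because training is incremental, such a classifier can keep learning from newly labelled URLs in deployment, which is what makes browser-extension and mail-filter integration plausible.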
Building a Graphical Fuzzing Framework
- Zeisberger, Sascha, Irwin, Barry V W
- Authors: Zeisberger, Sascha , Irwin, Barry V W
- Date: 2012
- Subjects: To be catalogued
- Language: English
- Type: text , article
- Identifier: http://hdl.handle.net/10962/429772 , vital:72638 , https://digifors.cs.up.ac.za/issa/2012/Proceedings/Research/59_ResearchInProgress.pdf
- Description: Fuzz testing is a robustness testing technique that sends malformed data to an application’s input. This is to test an application’s behaviour when presented with input beyond its specification. The main difference between traditional testing techniques and fuzz testing is that in most traditional techniques an application is tested according to a specification and rated on how well the application conforms to that specification. Fuzz testing tests beyond the scope of a specification by intelligently generating values that may be interpreted by an application in an unintended manner. The use of fuzz testing has been more prevalent in academic and security communities despite showing success in production environments. To measure the effectiveness of fuzz testing, an experiment was conducted where several publicly available applications were fuzzed. In some instances, fuzz testing was able to force an application into an invalid state and it was concluded that fuzz testing is a relevant testing technique that could assist in developing more robust applications. This success prompted a further investigation into fuzz testing in order to compile a list of requirements that makes an effective fuzzer. The aforementioned investigation assisted in the design of a fuzz testing framework, the goal of which is to make the process more accessible to users outside of an academic and security environment. Design methodologies and justifications of said framework are discussed, focusing on the graphical user interface components as this aspect of the framework is used to increase the usability of the framework.
- Full Text:
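The fuzzing loop the abstract describes, sending malformed input and watching for invalid states, can be sketched as a byte-flipping mutator driven against a target. The toy parser, seed input and mutation strategy below are illustrative assumptions, not the framework from the paper.

```python
# Minimal random-mutation fuzzer: flip bytes in a seed input and record
# every case that drives the target into an unhandled exception.
import random

def mutate(data, flips, rng):
    out = bytearray(data)
    for _ in range(flips):
        out[rng.randrange(len(out))] = rng.randrange(256)
    return bytes(out)

def fragile_parser(data):
    # Toy target: expects a 4-byte ASCII-digit header.
    header = data[:4].decode("ascii")   # raises on non-ASCII bytes
    return int(header)                  # raises on non-digit bytes

def fuzz(seed_input, iterations=200, seed=0):
    rng = random.Random(seed)
    failures = []
    for _ in range(iterations):
        case = mutate(seed_input, flips=2, rng=rng)
        try:
            fragile_parser(case)
        except Exception as exc:        # unhandled exception = invalid state
            failures.append((case, repr(exc)))
    return failures

crashes = fuzz(b"12345678")
```

A framework of the kind the paper proposes wraps this loop with smarter, specification-aware value generation and a GUI for configuring targets, but the detect-invalid-state feedback cycle is the same.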