A cyber security awareness and education framework for South Africa
- Authors: Kortjan, Noloxolo
- Date: 2013
- Subjects: Computer networks -- Security measures , Computer crimes -- Prevention , Computer security
- Language: English
- Type: Thesis , Masters , MTech
- Identifier: vital:9811 , http://hdl.handle.net/10948/d1014829
- Description: The Internet is becoming increasingly interwoven in the daily life of many individuals, organisations and nations. It has, to a large extent, had a positive effect on the way people communicate. It has also introduced new avenues for business and has offered nations an opportunity to govern online. Nevertheless, although cyberspace offers an endless list of services and opportunities, it is also accompanied by many risks. One of these risks is cybercrime. The Internet has given criminals a platform on which to grow and proliferate. As a result of the abstract nature of the Internet, it is easy for these criminals to go unpunished. Moreover, many who use the Internet are not aware of such threats; therefore they may themselves be at risk, together with businesses and governmental assets and infrastructure. In view of this, there is a need for cyber security awareness and education initiatives that will produce users who are well versed in the risks associated with the Internet. In this context, it is the role of the government to empower all levels of society by providing the necessary knowledge and expertise to act securely online. However, there is a definite lack in South Africa (SA) in this regard, as there are currently no government-led cyber security awareness and education initiatives. The primary research objective of this study, therefore, is to propose a cyber security awareness and education framework for SA that will assist in creating a cyber-secure culture among all of its users of the Internet.
- Full Text:
- Date Issued: 2013
A framework for assuring conformance of cloud-based email at higher education institutions
- Authors: Willett, Melanie
- Date: 2013
- Subjects: Cloud computing -- Security measures , Computer networks -- Security measures , Web services , Education, Higher -- Technological innovations
- Language: English
- Type: Thesis , Doctoral , PhD
- Identifier: vital:9815 , http://hdl.handle.net/10948/d1018664
- Description: Cloud computing is a relatively immature computing paradigm that could significantly benefit users. Cloud computing solutions are often associated with potential benefits such as cost reduction, less administrative hassle, flexibility and scalability. For organisations to realize such potential benefits, cloud computing solutions need to be chosen, implemented, managed and governed in a way that is secure, compliant with internal and external requirements and indicative of due diligence. This can be a challenge, given the many concerns and risks commonly associated with cloud computing solutions. One cloud computing solution that is being widely adopted around the world is cloud-based email. One of the foremost adopters of this cloud computing solution is higher education institutions. These higher education institutions stand to benefit greatly from using such services. Cloud-based email can be provisioned to staff and students at these institutions for free. Additionally, cloud service providers (CSPs) are able to provide a better email service than some higher education institutions would be able to provide if they were required to do so in-house. CSPs often provide larger inboxes and many extra services with cloud-based email. Cloud-based email is, therefore, clearly an example of a cloud computing solution that has the potential to benefit organisations. There are, however, risks and challenges associated with the use of this cloud computing solution. Two of these challenges relate to ensuring conformance to internal and external (legal, regulatory and contractual) requirements and to providing a mechanism of assuring that cloud-based email-related activities are sound. The lack of structured guidelines for assuring the conformance of cloud-based email is putting this service at risk at higher education institutions in South Africa. 
This work addresses this problem by promoting a best-practice-based approach to assuring the conformance of cloud-based email at higher education institutions. To accomplish this, components of applicable standards and best practice guidelines for IT governance, IT assurance and IT conformance are used to construct a framework for assuring the conformance of cloud-based email. The framework is designed and verified using sound design science principles. The utility and value of the framework have been demonstrated at a higher education institution in South Africa. This framework can be used to assist higher education institutions to demonstrate due diligence in assuring that they conform to legal and best practice requirements for the management and governance of cloud-based email. This is a significant contribution in the relatively new field of cloud computing governance.
- Full Text:
- Date Issued: 2013
A framework to mitigate phishing threats
- Authors: Frauenstein, Edwin Donald
- Date: 2013
- Subjects: Computer networks -- Security measures , Mobile computing -- Security measures , Online social networks -- Security measures
- Language: English
- Type: Thesis , Masters , MTech
- Identifier: vital:9832 , http://hdl.handle.net/10948/d1021208
- Description: We live today in the information age with users being able to access and share information freely by using both personal computers and their handheld devices. This, in turn, has been made possible by the Internet. However, this poses security risks as attempts are made to use this same environment in order to compromise the confidentiality, integrity and availability of information. Accordingly, there is an urgent need for users and organisations to protect their information resources from agents posing a security threat. Organisations typically spend large amounts of money as well as dedicating resources to improve their technological defences against general security threats. However, the agents posing these threats are adopting social engineering techniques in order to bypass the technical measures which organisations are putting in place. These social engineering techniques are often effective because they target human behaviour, something which the majority of researchers believe is a far easier alternative than hacking information systems. As such, phishing effectively makes use of a combination of social engineering techniques which involve crafty technical emails and website designs which gain the trust of their victims. Within an organisational context, there are a number of areas which phishers exploit. These areas include human factors, organisational aspects and technological controls. Ironically, these same areas serve simultaneously as security measures against phishing attacks. However, each of these three areas is characterised by gaps which arise as a result of human involvement. As a result, the current approach to mitigating phishing threats comprises a single-layer defence model only. However, this study proposes a holistic model that integrates the three areas by strengthening the human element in each of them by means of a security awareness, training and education programme.
- Full Text:
- Date Issued: 2013
Log analysis aided by latent semantic mapping
- Authors: Buys, Stephanus
- Date: 2013 , 2013-04-14
- Subjects: Latent semantic indexing , Data mining , Computer networks -- Security measures , Computer hackers , Computer security
- Language: English
- Type: Thesis , Masters , MSc
- Identifier: vital:4575 , http://hdl.handle.net/10962/d1002963 , Latent semantic indexing , Data mining , Computer networks -- Security measures , Computer hackers , Computer security
- Description: In an age of zero-day exploits and increased on-line attacks on computing infrastructure, operational security practitioners are becoming increasingly aware of the value of the information captured in log events. Analysis of these events is critical during incident response, forensic investigations related to network breaches, hacking attacks and data leaks. Such analysis has led to the discipline of Security Event Analysis, also known as Log Analysis. There are several challenges when dealing with events, foremost being the increased volumes at which events are often generated and stored. Furthermore, events are often captured as unstructured data, with very little consistency in the formats or contents of the events. In this environment, security analysts and implementers of Log Management (LM) or Security Information and Event Management (SIEM) systems face the daunting task of identifying, classifying and disambiguating massive volumes of events in order for security analysis and automation to proceed. Latent Semantic Mapping (LSM) is a proven paradigm shown to be an effective method of, among other things, enabling word clustering, document clustering, topic clustering and semantic inference. This research is an investigation into the practical application of LSM in the discipline of Security Event Analysis, showing the value of using LSM to assist practitioners in identifying types of events, classifying events as belonging to certain sources or technologies and disambiguating different events from each other. The culmination of this research presents adaptations to traditional natural language processing techniques that resulted in improved efficacy of LSM when dealing with Security Event Analysis. This research provides strong evidence supporting the wider adoption and use of LSM, as well as further investigation into Security Event Analysis assisted by LSM and other natural language or computer-learning processing techniques. 
, LaTeX with hyperref package , Adobe Acrobat 9.54 Paper Capture Plug-in
- Full Text:
- Date Issued: 2013
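The latent semantic approach described in the abstract above can be illustrated with a small sketch: a term-event count matrix is built from raw log lines, a truncated SVD projects the events into a low-rank latent space, and cosine similarity in that space groups events of the same type. This is an illustrative analogue only, not the thesis's implementation; the sample log lines and the choice of rank k = 2 are invented for demonstration, and numpy is assumed to be available.

```python
# Illustrative sketch of latent-semantic clustering of log events.
# The events, vocabulary handling and rank are invented for demonstration.
import numpy as np

events = [
    "sshd failed password for root from 10.0.0.1",
    "sshd failed password for admin from 10.0.0.2",
    "kernel firewall drop tcp packet from 10.0.0.3",
    "kernel firewall drop udp packet from 10.0.0.4",
]

# Build a term-event count matrix (rows: terms, columns: events).
vocab = sorted({w for e in events for w in e.split()})
index = {w: i for i, w in enumerate(vocab)}
M = np.zeros((len(vocab), len(events)))
for j, e in enumerate(events):
    for w in e.split():
        M[index[w], j] += 1

# Truncated SVD projects each event into a k-dimensional latent space.
U, s, Vt = np.linalg.svd(M, full_matrices=False)
k = 2
coords = (np.diag(s[:k]) @ Vt[:k]).T  # one k-dim vector per event

def cos(a, b):
    """Cosine similarity between two latent-space vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Events of the same type (both sshd, or both firewall) end up closer
# in the latent space than events of different types.
print(cos(coords[0], coords[1]) > cos(coords[0], coords[2]))
```

In a real deployment the matrix would be far larger and sparser, and the latent dimension would be tuned; the point here is only the mechanism of mapping unstructured events into a space where similarity reflects event type.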
A framework for information security governance in SMMEs
- Authors: Coertze, Jacques Jacobus
- Date: 2012
- Subjects: Business -- Data processing -- Security measures , Management information systems -- Security measures , Computer networks -- Security measures
- Language: English
- Type: Thesis , Masters , MTech
- Identifier: vital:9810 , http://hdl.handle.net/10948/d1014083
- Description: It has been found that many small, medium and micro-sized enterprises (SMMEs) do not comply with sound information security governance principles, specifically the principles involved in drafting information security policies and monitoring compliance, mainly as a result of restricted resources and expertise. Research suggests that this problem occurs worldwide and that the impact it has on SMMEs is great. The problem is further compounded by the fact that, in our modern-day information technology environment, many larger organisations are providing SMMEs with access to their networks. This results not only in SMMEs being exposed to security risks, but the larger organisations as well. In previous research an information security management framework and toolbox was developed to assist SMMEs in drafting information security policies. Although this research was of some help to SMMEs, further research has shown that an even greater problem exists with the governance of information security as a result of the advancements that have been identified in information security literature. The aim of this dissertation is therefore to establish an information security governance framework that requires minimal effort and little expertise to alleviate governance problems. It is believed that such a framework would be useful for SMMEs and would result in the improved implementation of information security governance.
- Full Text:
- Date Issued: 2012
GPF : a framework for general packet classification on GPU co-processors
- Authors: Nottingham, Alastair
- Date: 2012
- Subjects: Graphics processing units , Coprocessors , Computer network protocols , Computer networks -- Security measures , NVIDIA Corporation
- Language: English
- Type: Thesis , Masters , MSc
- Identifier: vital:4661 , http://hdl.handle.net/10962/d1006662 , Graphics processing units , Coprocessors , Computer network protocols , Computer networks -- Security measures , NVIDIA Corporation
- Description: This thesis explores the design and experimental implementation of GPF, a novel protocol-independent, multi-match packet classification framework. This framework is targeted and optimised for flexible, efficient execution on NVIDIA GPU platforms through the CUDA API, but should not be difficult to port to other platforms, such as OpenCL, in the future. GPF was conceived and developed in order to accelerate classification of large packet capture files, such as those collected by Network Telescopes. It uses a multiphase SIMD classification process which exploits both the parallelism of packet sets and the redundancy in filter programs, in order to classify packet captures against multiple filters at extremely high rates. The resultant framework, comprising classification, compilation and buffering components, efficiently leverages GPU resources to classify arbitrary protocols, and return multiple filter results for each packet. The classification functions described were verified and evaluated by testing an experimental prototype implementation against several filter programs, of varying complexity, on devices from three GPU platform generations. In addition to the significant speedup achieved in processing results, analysis indicates that the prototype classification functions perform predictably, and scale linearly with respect to both packet count and filter complexity. Furthermore, classification throughput (packets/s) remained essentially constant regardless of the underlying packet data, and thus the effective data rate when classifying a particular filter was heavily influenced by the average size of packets in the processed capture. For example: in the trivial case of classifying all IPv4 packets ranging in size from 70 bytes to 1KB, the observed data rate achieved by the GPU classification kernels ranged from 60Gbps to 900Gbps on a GTX 275, and from 220Gbps to 3.3Tbps on a GTX 480. 
In the less trivial case of identifying all ARP, TCP, UDP and ICMP packets for both IPv4 and IPv6 protocols, the effective data rates ranged from 15Gbps to 220Gbps (GTX 275), and from 50Gbps to 740Gbps (GTX 480), for 70B and 1KB packets respectively. , LaTeX with hyperref package
- Full Text:
- Date Issued: 2012
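The multi-match idea in the abstract above can be sketched with a simple sequential CPU analogue (not the GPF implementation itself, which evaluates packets in parallel on the GPU via CUDA): every packet is tested against every filter in a set, and each packet receives one result bit per filter, so a single pass yields multiple filter results per packet. The filter set and the decoded packet records below are invented for demonstration.

```python
# Hypothetical CPU analogue of multi-match classification: each packet is
# checked against all filters and gets a bitmask with one bit per filter.
# EtherType 0x0800 is IPv4, 0x0806 is ARP; IP protocol 6 is TCP, 17 is UDP.
FILTERS = [
    ("ipv4", lambda p: p["ethertype"] == 0x0800),
    ("arp",  lambda p: p["ethertype"] == 0x0806),
    ("tcp",  lambda p: p["ethertype"] == 0x0800 and p["proto"] == 6),
    ("udp",  lambda p: p["ethertype"] == 0x0800 and p["proto"] == 17),
]

def classify(packets):
    """Return, per packet, a bitmask with bit i set if filter i matched."""
    results = []
    for p in packets:
        mask = 0
        for i, (_, predicate) in enumerate(FILTERS):
            if predicate(p):
                mask |= 1 << i
        results.append(mask)
    return results

packets = [
    {"ethertype": 0x0800, "proto": 6},     # IPv4 TCP: matches ipv4 and tcp
    {"ethertype": 0x0806, "proto": None},  # ARP: matches arp only
]
print(classify(packets))  # prints [5, 2]
```

The data-parallel structure is what makes the GPU mapping natural: each packet's mask is computed independently, so in GPF one thread can classify one packet while the compiled filter program removes redundant field tests shared between filters.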
A framework for the development of a personal information security agent
- Authors: Stieger, Ewald Andreas
- Date: 2011
- Subjects: Computer networks -- Security measures , Information storage and retrieval systems , Artificial intelligence
- Language: English
- Type: Thesis , Masters , MTech
- Identifier: vital:9803 , http://hdl.handle.net/10948/d1012326 , Computer networks -- Security measures , Information storage and retrieval systems , Artificial intelligence
- Description: Nowadays information is everywhere. Organisations process, store and create information in unprecedented quantities to support their business processes. Similarly, people use, share and synthesise information to accomplish their daily tasks. Indeed, information and information technology are the core of business activities, and a part of daily life. Information has become a crucial resource in today's information age and any corruption, destruction or leakage of information can have a serious negative impact on an organisation. Thus, information should be kept safe. This requires the successful implementation of information security, which ensures that information assets are only used, modified and accessed by authorised people. Information security faces many challenges; and organisations still have not successfully addressed them. One of the main challenges is the human element. Information security depends to a large extent on people and their ability to follow and apply sound security practices. Unfortunately, people are often not very security-conscious in their behaviour; and this is the cause of many security breaches. There are a variety of reasons for this such as a lack of knowledge and a negative attitude to security. Many organisations are aware of this; and they attempt to remedy the situation by means of information security awareness programs. These programs aim to educate, train and increase the security awareness of individuals. However, information security awareness programs are not always successful. They are not a once-off remedy that can quickly cure information security. The programs need to be implemented effectively, and they require an ongoing effort. Unfortunately, this is where many organisations fail. Furthermore, changing individuals' security behaviour is difficult due to the complexity of factors that influence everyday behaviour. 
In view of the above, this research project proposes an alternative approach in the form of a personal information security agent. The goal of this agent is to influence individuals to adopt more secure behaviour. There are a variety of factors that need to be considered, in order to achieve this goal, and to positively influence security behaviour. Consequently, this research establishes criteria and principles for such an agent, based on the theory and practice. From a theoretical point of view, a variety of factors that influence human behaviour such as self-efficacy and normative beliefs were investigated. Furthermore, the field of persuasive technology has provided strategies that can be used by technology to influence individuals. On the practical side, a prototype of a personal information security agent was created and evaluated through a technical software review process. The evaluation of the prototype showed that the theoretical criteria have merit but their effectiveness is largely dependent on how they are implemented. The criteria were thus revised, based on the practical findings. The findings also suggest that a personal information security agent, based on the criteria, may be able to positively influence individuals to be more secure in their behaviour. The insights gained by the research are presented in the form of a framework that makes both theoretical and practical recommendations for developing a personal information security agent. One may, consequently, conclude that the purpose of this research is to provide a foundation for the development of a personal information security agent to positively influence computer users to be more security-conscious in their behaviour.
- Full Text:
- Date Issued: 2011
Digital forensic model for computer networks
- Authors: Sanyamahwe, Tendai
- Date: 2011
- Subjects: Computer crimes -- Investigation , Evidence, Criminal , Computer networks -- Security measures , Electronic evidence , Forensic sciences , Internet -- Security measures
- Language: English
- Type: Thesis , Masters , MCom (Information Systems)
- Identifier: vital:11127 , http://hdl.handle.net/10353/d1000968 , Computer crimes -- Investigation , Evidence, Criminal , Computer networks -- Security measures , Electronic evidence , Forensic sciences , Internet -- Security measures
- Description: The Internet has become important because information is now stored in digital form and transported in large volumes, both within and between organisations, through computer networks. Nevertheless, some individuals and groups exploit the Internet to harm businesses because they can remain relatively anonymous. To convict such cyber-criminals in a court of law, forensic practitioners have to follow a well-defined procedure. Log files provide significant digital evidence in computer networks when tracing cyber-criminals. Network log mining is an evolution of typical digital forensics that utilises evidence from network devices such as firewalls, switches and routers. It is a process supported by prevailing South African laws such as the Computer Evidence Act, 57 of 1983; the Electronic Communications and Transactions (ECT) Act, 25 of 2002; and the Electronic Communications Act, 36 of 2005. Internationally, supporting laws and regulations include the USA's Sarbanes-Oxley Act and Foreign Corrupt Practices Act (FCPA), and the UK's Bribery Act. A digital forensic model for computer networks focusing on network log mining has been developed based on the literature reviewed and critical thought. The development of the model followed the Design Science methodology. However, this research project argues that some important aspects are not fully addressed by the prevailing South African legislation supporting digital forensic investigations. With that in mind, this research project proposes some Forensic Investigation Precautions, developed as part of the proposed model. The Diffusion of Innovations (DOI) Theory is the framework underpinning the development of the model and its assimilation into the community.
The model was sent to IT experts for validation, which provided the qualitative element and the primary data of this research project. These experts found the proposed model to be unique and comprehensive, and judged that it adds new knowledge to the field of Information Technology. A paper was also produced from this research project.
- Full Text:
- Date Issued: 2011
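The network log mining described in the abstract above (extracting evidence from firewall and router logs while preserving its integrity) can be sketched as follows. This is a minimal illustration, not the thesis's model: the log-line format, field names and the use of a SHA-256 digest as an integrity check are all assumptions.

```python
import hashlib
import re

# Hypothetical firewall log format (real devices vary widely):
#   "2011-03-04 10:15:02 DENY TCP 192.0.2.44:5150 -> 198.51.100.7:22"
LINE = re.compile(
    r"(?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) "
    r"(?P<action>ALLOW|DENY) (?P<proto>\w+) "
    r"(?P<src>[\d.]+):(?P<sport>\d+) -> (?P<dst>[\d.]+):(?P<dport>\d+)"
)

def mine_denied(lines):
    """Extract denied-connection records from raw log lines, and return a
    SHA-256 digest of the raw evidence so later tampering with the source
    can be detected against the recorded hash."""
    records, digest = [], hashlib.sha256()
    for line in lines:
        digest.update(line.encode("utf-8"))  # hash every line, parsed or not
        m = LINE.match(line)
        if m and m.group("action") == "DENY":
            records.append(m.groupdict())
    return records, digest.hexdigest()

logs = [
    "2011-03-04 10:15:02 DENY TCP 192.0.2.44:5150 -> 198.51.100.7:22",
    "2011-03-04 10:15:03 ALLOW TCP 192.0.2.10:4433 -> 198.51.100.7:443",
]
records, evidence_hash = mine_denied(logs)
```

Here `records` holds the single denied connection, and `evidence_hash` would be recorded alongside the extract as part of the chain of custody.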
Educating users about information security by means of game play
- Authors: Monk, Thomas Philippus
- Date: 2011
- Subjects: Computer security , Educational games -- Design , Computer networks -- Security measures
- Language: English
- Type: Thesis , Masters , MTech
- Identifier: vital:9748 , http://hdl.handle.net/10948/1493 , Computer security , Educational games -- Design , Computer networks -- Security measures
- Description: Information is necessary for any business to function. However, if one does not manage one’s information assets properly, then one’s business is likely to be at risk. By implementing Information Security controls, procedures and/or safeguards, one can secure information assets against risks. The risks of an organisation can be mitigated if employees implement safety measures. However, employees are often unable to work securely due to a lack of knowledge. This dissertation evaluates the premise that a computer game could be used to educate employees about Information Security. A game was developed with the aim of educating employees in this regard. If people were motivated to play the game without external motivation from an organisation, then they would also, indirectly, be motivated to learn about Information Security. Therefore, a secondary aim of the game was to be self-motivating. An experiment was conducted to test whether or not these aims were met. It involved a play-test group and a control group: the play-test group played the game before completing a questionnaire that tested the information security knowledge of participants, while the control group simply completed the questionnaire. The two groups’ answers were compared to obtain results. This dissertation discusses the research design of the experiment and provides an analysis of the results. The game design is also discussed, providing guidelines for future game designers to follow. The experiment indicated that the game is motivational, but perhaps not educational enough. However, the results suggest that a computer game can still be used to teach users about Information Security. Factors involving consequence and repetition contributed towards the educational value of the game, whilst competitiveness and rewards contributed to its motivational aspect.
- Full Text:
- Date Issued: 2011
A framework towards effective control in information security governance
- Authors: Viljoen, Melanie
- Date: 2009
- Subjects: Data protection , Computer networks -- Security measures , Electronic data processing departments -- Security measures
- Language: English
- Type: Thesis , Masters , MTech
- Identifier: vital:9773 , http://hdl.handle.net/10948/887 , Data protection , Computer networks -- Security measures , Electronic data processing departments -- Security measures
- Description: The importance of information in business today has made the need to secure this asset properly evident. Information security has become a responsibility of all managers of an organization. To support more efficient management of information security, timely information security management information should be made available to all managers. Smaller organizations face special challenges with regard to information security management and reporting due to limited resources (Ross, 2008). This dissertation discusses a Framework for Information Security Management Information (FISMI) that aims to improve the visibility of, and contribute to better management of, information security throughout an organization by enabling the provision of summarized, comprehensive information security management information to all managers in an affordable manner.
- Full Text:
- Date Issued: 2009
A model to measure the maturity of smartphone security at software consultancies
- Authors: Allam, Sean
- Date: 2009
- Subjects: Computer networks -- Security measures , Capability maturity model (Computer software) , Smartphones , Wireless Internet , Mobile communication systems , Mobile computing
- Language: English
- Type: Thesis , Masters , MCom (Information Systems)
- Identifier: vital:11135 , http://hdl.handle.net/10353/281 , Computer networks -- Security measures , Capability maturity model (Computer software) , Smartphones , Wireless Internet , Mobile communication systems , Mobile computing
- Description: Smartphones are proliferating into the workplace at an ever-increasing rate, and the threats they pose are increasing similarly. In an era of constant connectivity and availability, information is freed from the constraints of time and place. This research project delves into the risks introduced by smartphones and, through multiple case studies, formulates a maturity measurement model. The model is based on recommendations from two leading information security frameworks: the COBIT 4.1 framework and the ISO 27002 code of practice. Ultimately, a combination of smartphone-specific risks is integrated with key control recommendations to provide a set of key measurable security maturity components. The subjective opinions of case study respondents are considered a key component in achieving a solution. The solution addresses the concerns not only of policy makers, but also of the employees subjected to the security policies. Nurturing security awareness into organisational culture through reinforcement and employee acceptance is highlighted in this research project. Software consultancies can use this model to mitigate risks while harnessing the potential strategic advantages of mobile computing through smartphone devices. In addition, this research project identifies the critical components of a smartphone security solution. As a result, a model is provided for software consultancies because of the intense reliance on information within these types of organisations. The model can be applied effectively to any information-intensive organisation.
- Full Text:
- Date Issued: 2009
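A maturity measurement of the kind the abstract above describes typically scores each measurable component and aggregates the scores onto a CMM-style capability scale, as COBIT does. The sketch below illustrates only that general mechanism; the component names are hypothetical, and the thesis derives its own components from COBIT 4.1 and ISO 27002, which are not reproduced here.

```python
from statistics import mean

# CMM-style capability levels as used by COBIT 4.1.
LEVELS = {0: "Non-existent", 1: "Initial", 2: "Repeatable",
          3: "Defined", 4: "Managed", 5: "Optimised"}

def maturity_level(scores):
    """Aggregate per-component scores (0-5) into a single rounded
    capability level plus its label."""
    level = round(mean(scores.values()))
    return level, LEVELS[level]

# Hypothetical component scores for one consultancy.
scores = {"device encryption": 3, "remote wipe": 2, "policy awareness": 4}
```

With these illustrative scores, `maturity_level(scores)` places the consultancy at level 3 ("Defined"); an averaging aggregate is one simple choice, and a weighted scheme could reflect the relative importance of controls.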
An investigation into interoperable end-to-end mobile web service security
- Authors: Moyo, Thamsanqa
- Date: 2008
- Subjects: Web services , Mobile computing , Smartphones , Internetworking (Telecommunication) , Computer networks -- Security measures , XML (Document markup language) , Microsoft .NET Framework , Java (Computer program language)
- Language: English
- Type: Thesis , Masters , MSc
- Identifier: vital:4595 , http://hdl.handle.net/10962/d1004838 , Web services , Mobile computing , Smartphones , Internetworking (Telecommunication) , Computer networks -- Security measures , XML (Document markup language) , Microsoft .NET Framework , Java (Computer program language)
- Description: The capacity to engage in web services transactions on smartphones is growing as these devices become increasingly powerful and sophisticated. This capacity for mobile web services is being realised through mobile applications that consume web services hosted on larger computing devices. This thesis investigates the effect that end-to-end web services security has on the interoperability between mobile web services requesters and traditional web services providers. SOAP web services are the preferred web services approach for this investigation. Although WS-Security is recognised as demanding on mobile hardware and network resources, the selection of appropriate WS-Security mechanisms lessens this burden. An attempt to implement such mechanisms on smartphones is carried out via an experiment. Smartphones are selected as the mobile device type used in the experiment. The experiment is conducted on the Java Micro Edition (Java ME) and the .NET Compact Framework (.NET CF) smartphone platforms. The experiment shows that the implementation of interoperable, end-to-end, mobile web services security on both platforms relies on third-party libraries. This reliance results in poor developer support and exposes developers to the complexity of cryptography. The experiment also shows that no standard message size optimisation libraries are available for either platform. The implementation carried out on the .NET CF is also shown to rely on the underlying operating system. It is concluded that standard WS-Security APIs must be provided on smartphone platforms to avoid the problems of poor developer support and the additional complexity of cryptography. It is recommended that these APIs include a message optimisation technique. It is further recommended that WS-Security APIs be completely operating system independent when they are implemented in managed code. 
This thesis contributes by: providing a snapshot of mobile web services security; identifying the smartphone platform state of readiness for end-to-end secure web services; and providing a set of recommendations that may improve this state of readiness. These contributions are of increasing importance as mobile web services evolve from a simple point-to-point environment to the more complex enterprise environment.
- Full Text:
- Date Issued: 2008
Towards a user centric model for identity and access management within the online environment
- Authors: Deas, Matthew Burns
- Date: 2008
- Subjects: Computers -- Access control , Computer networks -- Security measures
- Language: English
- Type: Thesis , Masters , MTech
- Identifier: vital:9780 , http://hdl.handle.net/10948/775 , Computers -- Access control , Computer networks -- Security measures
- Description: Today, one is expected to remember multiple user names and passwords for different domains when accessing services on the Internet. Identity management seeks to solve this problem by creating a digital identity that is exchangeable across organisational boundaries. Through the setup of collaboration agreements between multiple domains, users can easily switch across domains without being required to sign in again. However, use of this technology comes with the risk of user identity and personal information being compromised. Criminals make use of spoofed websites and social engineering techniques to gain illegal access to user information. Consequently, the need for users to be protected from online threats has increased. Two processes are required to protect the user at the time of sign-on: firstly, the user's login information must be protected; and secondly, the user requires a simple method for identifying the website. This treatise looks at the process of identifying and verifying user information, and at how the user can verify the system at sign-in. Three models for identity management are analysed, namely the Microsoft .NET Passport, Liberty Alliance Federated Identity for Single Sign-on and the Mozilla TrustBar for system authentication.
- Full Text:
- Date Issued: 2008
Assessing program code through static structural similarity
- Authors: Naude, Kevin Alexander
- Date: 2007
- Subjects: Computer networks -- Security measures , Internet -- Security measures
- Language: English
- Type: Thesis , Masters , MSc
- Identifier: vital:10478 , http://hdl.handle.net/10948/578 , Computer networks -- Security measures , Internet -- Security measures
- Description: Learning to write software requires much practice and frequent assessment. Consequently, the use of computers to assist in the assessment of computer programs has been important in supporting large classes at universities. The main approaches to the problem are dynamic analysis (testing student programs for expected output) and static analysis (direct analysis of the program code). The former is very sensitive to all kinds of errors in student programs, while the latter has traditionally only been used to assess quality, and not correctness. This research focusses on the application of static analysis, particularly structural similarity, to marking student programs. Existing traditional measures of similarity are limited in that they are usually only effective on tree structures. In this regard they do not easily support dependencies in program code. Contemporary measures of structural similarity, such as similarity flooding, usually rely on an internal normalisation of scores. The effect is that the scores only have relative meaning and cannot be interpreted in isolation, i.e. they are not meaningful for assessment. The SimRank measure is shown to have the same problem, but not because of normalisation. The problem with the SimRank measure arises from the fact that its scores depend on all possible mappings between the children of the vertices being compared. The main contribution of this research is a novel graph similarity measure, the Weighted Assignment Similarity measure. It is related to SimRank, but derives propagation scores from only the locally optimal mapping between child vertices. The resulting similarity scores may be regarded as the percentage of mutual coverage between graphs. The measure is proven to converge for all directed acyclic graphs, and an efficient implementation is outlined for this case. Attributes on graph vertices and edges are often used to capture domain-specific information which is not structural in nature. 
It has been suggested that these should influence the similarity propagation, but no clear method for doing this has been reported. The second important contribution of this research is a general method for incorporating these local attribute similarities into the larger similarity propagation method. An example of attributes in program graphs are identifier names. The choice of identifiers in programs is arbitrary, as they are purely symbolic. A problem facing any comparison between programs is that they are unlikely to use the same set of identifiers. This problem indicates that a mapping between the identifier sets is required. The third contribution of this research is a method for applying the structural similarity measure in a two-step process to find an optimal identifier mapping. This approach is both novel and valuable, as it cleverly reuses the similarity measure as an existing resource. In general, programming assignments allow a large variety of solutions. Assessing student programs through structural similarity is only feasible if the diversity in the solution space can be addressed. This study narrows program diversity through a set of semantics-preserving program transformations that convert programs into a normal form. The application of the Weighted Assignment Similarity measure to marking student programs is investigated, and strong correlations are found with the human marker. It is shown that the most accurate assessment requires that programs not only be compared with a set of good solutions, but rather with a mixed set of programs of varying levels of correctness. This research represents the first documented successful application of structural similarity to the marking of student programs.
- Full Text:
- Date Issued: 2007
- Authors: Naude, Kevin Alexander
- Date: 2007
- Subjects: Computer networks -- Security measures , Internet -- Security measures
- Language: English
- Type: Thesis , Masters , MSc
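The similarity propagation described in this abstract can be sketched in miniature. The toy code below is an illustrative reading of the idea, not the thesis's actual formulation: each pair's score is propagated from a one-to-one mapping between child vertices (here found greedily, as a simple stand-in for the locally optimal assignment), and normalised by the larger child count so that scores read as mutual coverage.

```python
# Illustrative sketch of SimRank-style propagation using a locally optimal
# child mapping (greedy stand-in), NOT the thesis's exact measure.

def child_mapping_score(sim, kids_a, kids_b):
    """Greedily pair children of two vertices by highest current similarity."""
    pairs = sorted(((sim[a, b], a, b) for a in kids_a for b in kids_b),
                   reverse=True)
    used_a, used_b, total = set(), set(), 0.0
    for score, a, b in pairs:
        if a not in used_a and b not in used_b:
            used_a.add(a)
            used_b.add(b)
            total += score
    return total

def structural_similarity(g1, g2, iterations=10):
    """g1, g2: adjacency dicts mapping vertex -> list of children (DAGs)."""
    sim = {(a, b): 1.0 for a in g1 for b in g2}  # optimistic start
    for _ in range(iterations):
        new = {}
        for a in g1:
            for b in g2:
                ka, kb = g1[a], g2[b]
                if not ka and not kb:
                    new[a, b] = 1.0   # two leaves match fully
                elif not ka or not kb:
                    new[a, b] = 0.0   # leaf vs internal vertex
                else:
                    # normalise by the larger child count: "mutual coverage"
                    new[a, b] = (child_mapping_score(sim, ka, kb)
                                 / max(len(ka), len(kb)))
        sim = new
    return sim
```

For two structurally identical trees the root pair converges to 1.0; a root with two children compared against a root with one converges to 0.5, i.e. half of the children are mutually covered.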
Evolving a secure grid-enabled, distributed data warehouse : a standards-based perspective
- Authors: Li, Xiao-Yu
- Date: 2007
- Subjects: Computational grids (Computer systems) , Computer networks -- Security measures , Electronic data processing -- Distributed processing
- Language: English
- Type: Thesis , Masters , MTech
- Identifier: vital:9738 , http://hdl.handle.net/10948/544 , Computational grids (Computer systems) , Computer networks -- Security measures , Electronic data processing -- Distributed processing
- Description: As digital data collections have increased in scale and number, they have become an important type of resource serving a wide community of researchers. Cross-institutional data-sharing and collaboration offer a suitable approach to supporting research institutions that lack data and the related IT infrastructure. Grid computing has become a widely adopted approach to enabling cross-institutional resource-sharing and collaboration. It integrates a distributed and heterogeneous collection of locally managed users and resources. This project proposes a distributed data warehouse system which uses Grid technology to enable data access and integration, and collaborative operations, across multiple distributed institutions in the context of HIV/AIDS research. This study is based on wider research into an OGSA-based Grid services architecture, comprising a data-analysis system which utilizes a data warehouse, data marts, and a near-line operational database hosted by distributed institutions. Within this framework, specific patterns for collaboration, interoperability, resource virtualization and security are included. The heterogeneous and dynamic nature of the Grid environment introduces a number of security challenges. This study also concerns a set of particular security aspects, including PKI-based authentication, single sign-on, dynamic delegation, and attribute-based authorization. These mechanisms, as supported by the Globus Toolkit’s Grid Security Infrastructure, are used to enable interoperability and establish trust relationships between the various security mechanisms and policies within different institutions; to manage credentials; and to ensure secure interactions.
- Full Text:
- Date Issued: 2007
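The dynamic delegation this abstract mentions rests on chains of short-lived proxy credentials: each hop signs a credential for the next, and a verifier walks the chain back to a trusted root. The sketch below illustrates that idea only; the class and function names are hypothetical and are not the Globus Toolkit API.

```python
# Hypothetical sketch of proxy-credential chain validation (GSI-style idea);
# names and structure are invented for illustration, not a real library API.
from datetime import datetime, timedelta

class Credential:
    def __init__(self, subject, issuer, expires):
        self.subject, self.issuer, self.expires = subject, issuer, expires

def chain_is_valid(chain, trusted_root, now=None):
    """chain[0] is the end credential; each issuer must match the next subject."""
    now = now or datetime.utcnow()
    for cred, parent in zip(chain, chain[1:]):
        if cred.issuer != parent.subject or cred.expires < now:
            return False
    # The top of the chain must be issued by a trusted root and still be valid.
    return chain[-1].issuer == trusted_root and chain[-1].expires >= now

root = "CN=Campus CA"
user = Credential("CN=Researcher", root,
                  datetime.utcnow() + timedelta(days=365))
proxy = Credential("CN=Researcher/proxy", "CN=Researcher",
                   datetime.utcnow() + timedelta(hours=12))  # short-lived proxy
print(chain_is_valid([proxy, user], root))
```

The short lifetime of the proxy is the point of the scheme: a compromised delegated credential expires quickly, while the long-lived user credential never leaves its owner.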
Governing information security using organisational information security profiles
- Authors: Tyukala, Mkhululi
- Date: 2007
- Subjects: Data protection , Computer security -- Management , Computer networks -- Security measures
- Language: English
- Type: Thesis , Masters , MTech
- Identifier: vital:9788 , http://hdl.handle.net/10948/626 , Data protection , Computer security -- Management , Computer networks -- Security measures
- Description: The corporate scandals of the last few years have changed the face of information security and its governance. Information security has been elevated to board-of-directors level due to the legislation and corporate governance regulations resulting from these scandals. Boards of directors now have a corporate responsibility to ensure that the information assets of an organisation are secure; they are forced to embrace information security and make it part of business strategy. This new support from the board of directors gives information security weight, a voice from the top, and the financial muscle that other business activities enjoy. However, as an area made up of specialist activities, information security may not be as easily comprehended at board level as other business-related activities. Yet the board of directors needs to provide oversight of information security; that is, to put an information security programme in place to ensure that information is adequately protected. This raises a number of challenges. One of these is how information security can be understood, and well-informed decisions about it made, at board level. This dissertation provides a mechanism to present information at board level on how information security is implemented according to the vision of the board of directors. This mechanism is built upon well-accepted and documented concepts of information security. The mechanism (termed an Organisational Information Security Profile, or OISP) will assist organisations with the initialisation, monitoring, measuring, reporting and reviewing of information security programmes. Ultimately, the OISP will make it possible to know whether the information security endeavours of the organisation are effective or not. If the information security programme is found to be ineffective, the OISP will facilitate the identification of the ineffective areas and of what caused the ineffectiveness.
This dissertation also presents how the effectiveness or ineffectiveness of information security can be presented at board level using well-known visualisation methods. Finally, the contribution, the limits, and the areas that need further investigation are provided.
- Full Text:
- Date Issued: 2007
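The monitoring-and-reporting role this abstract describes can be pictured as rolling per-area control measurements up into a single board-level indicator while flagging the areas that drag it down. The sketch below is only an illustration of that roll-up idea; the areas, scores and equal-weight average are invented, and the thesis's OISP is a richer construct.

```python
# Illustrative roll-up of control effectiveness into a board-level score.
# Areas, figures and the 0.5 threshold are made up for this sketch; they are
# not taken from the OISP as defined in the thesis.

controls = {  # area -> measured effectiveness, 0.0 .. 1.0
    "Access control": 0.9,
    "Incident response": 0.4,
    "Awareness training": 0.7,
}

# Equal-weight mean as the headline indicator for the board.
overall = sum(controls.values()) / len(controls)

# Areas below threshold are the ones the board should ask about.
weak = [area for area, score in controls.items() if score < 0.5]

print(f"overall effectiveness: {overall:.2f}")
print(f"areas needing attention: {weak}")
```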
Limiting vulnerability exposure through effective patch management: threat mitigation through vulnerability remediation
- Authors: White, Dominic Stjohn Dolin
- Date: 2007 , 2007-02-08
- Subjects: Computer networks -- Security measures , Computer viruses , Computer security
- Language: English
- Type: Thesis , Masters , MSc
- Identifier: vital:4629 , http://hdl.handle.net/10962/d1006510 , Computer networks -- Security measures , Computer viruses , Computer security
- Description: This document aims to provide a complete discussion of vulnerability and patch management. The first chapters look at the trends relating to vulnerabilities, exploits, attacks and patches. These trends describe the drivers of patch and vulnerability management and situate the discussion in the current security climate. The following chapters then aim to present both policy and technical solutions to the problem. The policies described lay out a comprehensive set of steps that can be followed by any organisation to implement its own patch management policy, including practical advice on integration with other policies, managing risk, identifying vulnerabilities, strategies for reducing downtime, and generating metrics to measure progress. Having covered the steps that can be taken by users, a strategy describing how best a vendor should implement a related patch release policy is provided. An argument is made that current monthly patch release schedules are inadequate to allow users to mitigate vulnerabilities most effectively and timeously. The final chapters discuss the technical aspect of automating parts of the policies described. In particular, the concept of 'defense in depth' is used to discuss additional strategies for 'buying time' during the patch process. The document then concludes that, in the face of increasing malicious activity and more complex patching, solid frameworks such as those provided in this document are required to ensure an organisation can fully manage the patching process. However, more research is required to fully understand vulnerabilities and exploits. In particular, more attention must be paid to threats, as little work has been done to fully understand threat-agent capabilities and activities on a day-to-day basis.
- Full Text:
- Date Issued: 2007
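One progress metric of the kind the policy chapters call for is the exposure window: the time between a vulnerability's disclosure and the completed deployment of its patch across the estate. The sketch below is a hypothetical illustration of that metric; the CVE identifiers and dates are invented, and this is not a metric definition taken from the thesis.

```python
# Hypothetical patch-management metric: mean exposure window between
# vulnerability disclosure and completed patch deployment. All data invented.
from datetime import date
from statistics import mean

def exposure_days(disclosed: date, patched: date) -> int:
    """Days during which the estate was exposed to a known vulnerability."""
    return (patched - disclosed).days

records = [  # (vulnerability id, disclosure date, date patch fully deployed)
    ("CVE-A", date(2007, 1, 2), date(2007, 1, 20)),
    ("CVE-B", date(2007, 2, 1), date(2007, 2, 10)),
]

windows = [exposure_days(d, p) for _, d, p in records]
print(f"mean exposure window: {mean(windows):.1f} days")
```

Tracked over successive reporting periods, a shrinking mean (and maximum) exposure window is direct evidence that the patch management policy is working.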
Securing media streams in an Asterisk-based environment and evaluating the resulting performance cost
- Authors: Clayton, Bradley
- Date: 2007 , 2007-01-08
- Subjects: Asterisk (Computer file) , Computer networks -- Security measures , Internet telephony -- Security measures
- Language: English
- Type: Thesis , Masters , MSc
- Identifier: vital:4647 , http://hdl.handle.net/10962/d1006606 , Asterisk (Computer file) , Computer networks -- Security measures , Internet telephony -- Security measures
- Description: When adding Confidentiality, Integrity and Availability (CIA) to a multi-user VoIP (Voice over IP) system, performance and quality are at risk. The aim of this study is twofold. Firstly, it describes current methods suitable for securing voice streams within a VoIP system and makes them available in an Asterisk-based VoIP environment. (Asterisk is a well-established, open-source TDM/VoIP PBX.) Secondly, this study evaluates the performance cost incurred after implementing each security method within the Asterisk-based system, using a special testbed suite, named DRAPA, which was developed expressly for this study. The three security methods implemented and studied were IPSec (Internet Protocol Security), SRTP (Secure Real-time Transport Protocol), and SIAX2 (Secure Inter-Asterisk eXchange 2 protocol). From the experiments, it was found that bandwidth and CPU usage were significantly affected by the addition of CIA. Ranking the three security methods in terms of these two resources, it was found that SRTP incurs the least bandwidth overhead, followed by SIAX2 and then IPSec. Where CPU utilisation is concerned, SIAX2 incurs the least overhead, followed by IPSec, and then SRTP.
- Full Text:
- Date Issued: 2007
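The bandwidth ranking reported in this abstract (SRTP cheapest, IPSec most expensive) follows from how each method wraps a voice packet, and a back-of-the-envelope calculation makes this concrete. The figures below are typical protocol header sizes from the relevant specifications (RFC 3711 for SRTP, RFC 4303 for ESP), not measurements from the thesis, and the 20-byte payload is an assumed single codec frame.

```python
# Back-of-the-envelope per-packet overhead: why SRTP tends to cost less
# bandwidth than IPSec for small VoIP packets. Illustrative figures only.

RTP_HEADER = 12   # bytes, standard RTP header
UDP_HEADER = 8
IP_HEADER = 20

def srtp_overhead(auth_tag=10):
    # SRTP encrypts the RTP payload in place and appends an auth tag,
    # so the only extra bytes on the wire are the tag itself.
    return auth_tag

def ipsec_esp_tunnel_overhead(iv=16, pad=2, auth=12):
    # ESP tunnel mode wraps the whole IP packet: new outer IP header,
    # ESP header (SPI + sequence number), IV, padding, pad-length and
    # next-header bytes (+2), and authentication data.
    esp_header = 8
    return IP_HEADER + esp_header + iv + pad + 2 + auth

payload = 20  # assumed: one small codec frame
base = payload + RTP_HEADER + UDP_HEADER + IP_HEADER
print(f"plain RTP packet: {base} bytes")
print(f"with SRTP:  +{srtp_overhead()} bytes per packet")
print(f"with IPSec: +{ipsec_esp_tunnel_overhead()} bytes per packet")
```

Because VoIP sends small packets at a high rate, fixed per-packet overhead dominates: here IPSec roughly doubles the packet size while SRTP adds only a few percent.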
Securing softswitches from malicious attacks
- Authors: Opie, Jake Weyman
- Date: 2007
- Subjects: Internet telephony -- Security measures , Computer networks -- Security measures , Digital telephone systems , Communication -- Technological innovations , Computer network protocols , TCP/IP (Computer network protocol) , Switching theory
- Language: English
- Type: Thesis , Masters , MSc
- Identifier: vital:4683 , http://hdl.handle.net/10962/d1007714 , Internet telephony -- Security measures , Computer networks -- Security measures , Digital telephone systems , Communication -- Technological innovations , Computer network protocols , TCP/IP (Computer network protocol) , Switching theory
- Description: Traditionally, real-time communication, such as voice calls, has run on separate, closed networks. Of all the limitations these networks had, the vulnerability of communication to being crippled by malicious attacks was not a crucial one. This situation has changed radically now that real-time communication and data have merged onto the same network. The objective of this project is to investigate the securing of softswitches with functionality similar to Private Branch Exchanges (PBXs) from malicious attacks. The focus of the project is a practical investigation of how to secure ILANGA, an Asterisk-based system under development at Rhodes University. This practical investigation is based on performing six varied experiments on the different components of ILANGA. Before the six experiments are performed, basic preliminary security measures and the restrictions placed on access to the database are discussed. The outcomes of these experiments are discussed, and the precise reasons why these attacks were either successful or unsuccessful are given. Suggestions of a theoretical nature on how to defend against the successful attacks are also presented.
- Full Text:
- Date Issued: 2007
A holistic approach to network security in OGSA-based grid systems
- Authors: Loutsios, Demetrios
- Date: 2006
- Subjects: Computer networks -- Security measures
- Language: English
- Type: Thesis , Masters , MTech
- Identifier: vital:9736 , http://hdl.handle.net/10948/550 , Computer networks -- Security measures
- Description: Grid computing technologies facilitate complex scientific collaborations between globally dispersed parties which make use of heterogeneous technologies and computing systems. In recent years, however, the commercial sector has developed a growing interest in Grid technologies. Prominent Grid researchers have predicted that Grids will grow into the commercial mainstream, even though their origins were in scientific research, in much the same way as the Internet started as a vehicle for research collaboration between universities and government institutions and grew into a technology with large commercial applications. Grids facilitate complex trust relationships between globally dispersed business partners, research groups, and non-profit organizations. Almost any dispersed “virtual organization” willing to share computing resources can make use of Grid technologies. Grid computing facilitates the networking of shared services; the inter-connection of a potentially unlimited number of computing resources within a “Grid” is possible. Grid technologies leverage a range of open standards and technologies to provide interoperability between heterogeneous computing systems. Newer Grids build on key capabilities of Web Service technologies to provide easy and dynamic publishing and discovery of Grid resources. Due to the inter-organisational nature of Grid systems, there is a need to provide adequate security to Grid users and to Grid resources. This research proposes a framework, using a specific brokered pattern, which addresses several common Grid security challenges, including: providing secure and consistent cross-site authentication and authorization; single sign-on capabilities for Grid users; underlying platform and runtime security; and Grid network communications and messaging security. These Grid security challenges can be viewed as comprising two proposed logical layers of a Grid.
These layers are a Common Grid Layer (higher-level Grid interactions) and a Local Resource Layer (lower-level technology security concerns). This research is concerned with providing a generic and holistic security framework to secure both layers. It makes extensive use of STRIDE, Microsoft's acronym for six classes of security threat, as part of a holistic Grid security framework. STRIDE and key Grid-related standards, such as the Open Grid Services Architecture (OGSA), the Web Services Resource Framework (WSRF), and the Globus Toolkit, are used to formulate the proposed framework.
- Full Text:
- Date Issued: 2006
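The STRIDE categories this abstract leans on are a fixed enumeration, and pairing each with a Grid-relevant control shows how the acronym drives a threat analysis. The category names below are Microsoft's standard ones; the control pairings are illustrative examples, not the framework the thesis proposes.

```python
# The six STRIDE threat categories, each paired with an example of a
# Grid-relevant control. Pairings are illustrative only.

STRIDE = {
    "Spoofing":               "PKI-based mutual authentication",
    "Tampering":              "message-level integrity (signed messages)",
    "Repudiation":            "audit logging of delegated credential use",
    "Information disclosure": "transport- or message-level encryption",
    "Denial of service":      "resource quotas and request throttling",
    "Elevation of privilege": "attribute-based authorization checks",
}

for threat, control in STRIDE.items():
    print(f"{threat}: {control}")
```

A threat-modelling pass then amounts to asking, for every component in each layer (Common Grid Layer and Local Resource Layer), which of the six rows applies and whether a control is in place.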