Securing software development using developer access control
- Authors: Ongers, Grant
- Date: 2020
- Subjects: Computer software -- Development , Computers -- Access control , Computer security -- Software , Computer networks -- Security measures , Source code (Computer science) , Plug-ins (Computer programs) , Data encryption (Computer science) , Network Access Control , Data Loss Prevention , Google’s BeyondCorp , Confidentiality, Integrity and Availability (CIA) triad
- Language: English
- Type: Thesis , Masters , MSc
- Identifier: http://hdl.handle.net/10962/149022 , vital:38796
- Description: This research is aimed at software development companies and highlights the unique information security concerns in the context of a non-malicious software developer’s work environment; it furthermore explores an application-driven solution which focuses specifically on providing developer environments with access control for source code repositories. In order to achieve that, five goals were defined, as discussed in section 1.3. The application designed to provide the developer environment with access control to source code repositories was modelled on lessons taken from the principles of Network Access Control (NAC), Data Loss Prevention (DLP), and Google’s BeyondCorp (GBC) for zero-trust end-user computing. The intention of this research is to provide software developers with maximum access to source code without compromising Confidentiality, as per the Confidentiality, Integrity and Availability (CIA) triad. Employing data gleaned from examining the characteristics of DLP, NAC, and BeyondCorp, proof-of-concept code was developed to regulate access to the developer’s environment and source code. The system required sufficient flexibility to support the diversity of software development environments; in order to achieve this, a modular design was selected. The system comprised a client-side agent and a plug-in-ready server component. The client-side agent mounts and dismounts encrypted volumes containing source code, and provides the server with the client information demanded by plug-ins. The server-side service provided encryption keys to facilitate the mounting of the volumes and, through plug-ins, queried the client agent to determine whether access should be granted (a minimal sketch of this plug-in-driven access check follows this record). The solution was then tested with integration and system testing. There were plans to have it used by development teams, who would then be surveyed on their view of the proof of concept, but this proved impossible. The conclusion provides a basis by which organisations that develop software can better balance the two corners of the CIA triad most often in conflict: Confidentiality of their source code against the Availability of the same to developers.
- Full Text:
- Date Issued: 2020
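As context for the record above: the plug-in-driven access check it describes can be outlined with a short, hypothetical Python sketch. The names used here (PolicyServer, register_plugin, request_key, and the two example plug-ins) are illustrative assumptions, not the thesis's actual implementation; the sketch only shows the pattern of a server that interrogates the client, via plug-ins, before releasing the key needed to mount an encrypted source-code volume.

```python
# Hypothetical sketch of the plug-in-driven access check described above.
# Names and structure are illustrative, not taken from the thesis code.

class PolicyServer:
    """Holds volume keys and asks plug-ins whether a client may have them."""

    def __init__(self):
        self.plugins = []          # callables: client_info -> bool
        self.volume_keys = {}      # repository name -> encryption key

    def register_plugin(self, plugin):
        self.plugins.append(plugin)

    def request_key(self, repo, client_info):
        # Every registered plug-in must approve before the key is released.
        if all(plugin(client_info) for plugin in self.plugins):
            return self.volume_keys.get(repo)
        return None


def on_corporate_network(client_info):
    # Example plug-in: a NAC-style check on the client's reported network.
    return client_info.get("network") == "corp-lan"


def disk_encryption_enabled(client_info):
    # Example plug-in: a device-posture check in the spirit of BeyondCorp.
    return client_info.get("full_disk_encryption") is True


if __name__ == "__main__":
    server = PolicyServer()
    server.volume_keys["payments-service"] = "base64-key-material"
    server.register_plugin(on_corporate_network)
    server.register_plugin(disk_encryption_enabled)

    # The client agent gathers this information and sends it with its request;
    # on success it would use the returned key to mount the encrypted volume.
    client_info = {"network": "corp-lan", "full_disk_encryption": True}
    key = server.request_key("payments-service", client_info)
    print("access granted" if key else "access denied")
```

Requiring every plug-in to approve mirrors the deny-by-default stance of NAC and zero-trust approaches.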
A comparative study of CERBER, MAKTUB and LOCKY Ransomware using a Hybridised-Malware analysis
- Authors: Schmitt, Veronica
- Date: 2019
- Subjects: Microsoft Windows (Computer file) , Data protection , Computer crimes -- Prevention , Computer security , Computer networks -- Security measures , Computers -- Access control , Malware (Computer software)
- Language: English
- Type: text , Thesis , Masters , MSc
- Identifier: http://hdl.handle.net/10962/92313 , vital:30702
- Description: There has been a significant increase in the prevalence of Ransomware attacks over the preceding four years. This indicates that the battle to defend against this class of malware has not yet been won. This research proposes that by identifying the similarities within the operational framework of Ransomware strains, a better overall understanding of their operation and function can be achieved. This, in turn, will aid in a quicker response to future attacks. With the average Ransomware attack taking two hours to be identified, there is evidently not yet a clear understanding as to why these attacks are so successful. Research into Ransomware is limited by what is currently known on the topic. Due to the limitations of the research, the decision was taken to examine only three samples of Ransomware from different families. This was decided due to the complexities and comprehensive nature of the research. The in-depth nature of the research and the time constraints associated with it did not allow the proof of concept of this framework to be tested on more than three families, but the exploratory work was promising and should be further explored in future research. The aim of the research is to follow the Hybrid-Malware analysis framework, which consists of both static and dynamic analysis phases in addition to the digital forensic examination of the infected system. This allows for signature-based findings, along with behavioural and forensic findings, all in one. This information allows for a better understanding of how this malware is designed and how it infects and remains persistent on a system. The operating system chosen is Microsoft Windows 7, which is still utilised by a significant proportion of Windows users, especially in the corporate environment. The experimental process was designed to enable the researcher to collect information regarding the Ransomware and every aspect of its behaviour and communication on a target system. The results can be compared across the three strains to identify the commonalities. The initial hypothesis was that Ransomware variants are all much like an instant cake mix: they consist of specific building blocks which remain the same, with the flavouring being the unique feature.
- Full Text:
- Date Issued: 2019
Towards understanding and mitigating attacks leveraging zero-day exploits
- Authors: Smit, Liam
- Date: 2019
- Subjects: Computer crimes -- Prevention , Data protection , Hacking , Computer security , Computer networks -- Security measures , Computers -- Access control , Malware (Computer software)
- Language: English
- Type: text , Thesis , Masters , MSc
- Identifier: http://hdl.handle.net/10962/115718 , vital:34218
- Description: Zero-day vulnerabilities are unknown and therefore not addressed, with the result that they can be exploited by attackers to gain unauthorised system access. In order to understand and mitigate attacks leveraging zero-days or unknown techniques, it is necessary to study the vulnerabilities, exploits and attacks that make use of them. In recent years there have been a number of leaks publishing such attacks using various methods to exploit vulnerabilities. This research seeks to understand what types of vulnerabilities exist, why and how these are exploited, and how to defend against such attacks by mitigating either the vulnerabilities or the method / process of exploiting them. By moving beyond merely remedying the vulnerabilities to defences that are able to prevent or detect the actions taken by attackers, the security of the information system will be better positioned to deal with future unknown threats. An interesting finding is how attackers move beyond the observable bounds of a system to circumvent security defences, for example by compromising syslog servers, or by going down to lower system rings to gain access. However, defenders can counter this by employing defences that are external to the system, preventing attackers from disabling them or removing collected evidence after gaining system access. Attackers are able to defeat air-gaps via the leakage of electromagnetic radiation, as well as misdirect attribution by planting false artefacts for forensic analysis and by attacking from third-party information systems. They analyse the methods of other attackers to learn new techniques. An example of this is the Umbrage project, whereby malware is analysed to decide whether it should be implemented as a proof of concept. Another important finding is that attackers respect defence mechanisms such as: remote syslog (e.g. firewall), core dump files, database auditing, and Tripwire (e.g. SlyHeretic). These defences all have the potential to result in the attacker being discovered. Attackers must either negate the defence mechanism or find unprotected targets. Defenders can use technologies such as encryption to defend against interception and man-in-the-middle attacks. They can also employ honeytokens and honeypots to alarm, misdirect, slow down and learn from attackers. By employing various tactics, defenders are able to increase their chance of detecting attacks, and the time available to react to them, even attacks exploiting hitherto unknown vulnerabilities. To summarize the information presented in this thesis and to show its practical importance, an examination is presented of the NSA's network intrusion into the SWIFT organisation. It shows that the firewalls were exploited with remote code execution zero-days. This attack has a striking parallel in the approach used in the recent VPNFilter malware. If nothing else, the leaks provide information to other actors on how to attack and what to avoid. However, by studying state actors, we can gain insight into what other actors with fewer resources can do in the future.
- Full Text:
- Date Issued: 2019
Users’ perceptions regarding password policies
- Authors: Fredericks, Damian Todd
- Date: 2018
- Subjects: Computers -- Access control , Computer networks -- Security measures , Computer security
- Language: English
- Type: Thesis , Masters , MIT
- Identifier: http://hdl.handle.net/10948/30205 , vital:30896
- Description: Information is considered a valuable asset to most organisations and is often exposed to various threats which compromise its confidentiality, integrity and availability (CIA). Identification and Authentication are commonly used to help ensure the CIA of information. This research study specifically focused on password-based authentication. Passwords are used to log into personal computers, company computers, email accounts, bank accounts and various software systems and mobile applications. Passwords act like a protective barrier between a user and their personal and company information, and remain the most cost-effective and most efficient method to control access to computer systems. An extensive content analysis was conducted regarding the security of passwords, as well as users’ password management coping strategies. It was determined that very little research has been conducted in relation to users’ perceptions towards password policies. The problem identified by this research is that organisations often implement password policy guidelines without taking into consideration users’ perceptions regarding such guidelines. This could result in users adopting various password management coping strategies. This research therefore aimed to determine users’ perceptions with regard to current password-related standards and best practices (password policy guidelines). Standards and best practices such as ISO/IEC 27002 (2013), NIST SP 800-118 (2009), NIST SP 800-63-2 (2013), NIST SP 800-63B (2016) and the SANS Password Protection Policy (2014b) were studied in order to determine the common elements of password policies. This research argued that before organisations implement password policy guidelines, they need to determine users’ perceptions towards such guidelines. It was identified that certain human factors such as human memory, attitude and apathy often cause users to adopt insecure coping strategies such as Reusing Passwords, Writing Down Passwords and Not Changing Passwords. This research included a survey which took the form of a questionnaire. The aim of the survey was to determine users’ perceptions towards common elements of password policies and to determine the coping strategies users commonly adopt. The survey included questions related to the new NIST SP 800-63B (2016) that sought to determine users’ perceptions towards these new NIST password policy guidelines. Findings from the survey indicated that respondents found the new NIST guidelines to be helpful, secure and easier to adhere to. Finally, recommendations regarding password policies were presented based on the common elements of password policies and users’ perceptions of the new NIST password guidelines. These recommendations could help policy makers in the implementation of new password policies or the revision of current password policies. (A brief, illustrative sketch of NIST SP 800-63B style checks follows this record.)
- Full Text:
- Date Issued: 2018
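For readers unfamiliar with the NIST SP 800-63B guidance that the survey asked about, the hypothetical sketch below applies two of its memorized-secret recommendations: a minimum length of eight characters and a check against a list of compromised or commonly used passwords. The function name and the tiny in-line word list are assumptions for illustration only and are not part of the study.

```python
# Rough sketch of two NIST SP 800-63B style checks for memorized secrets.
# The small word list here stands in for a real compromised-password corpus.

COMPROMISED_PASSWORDS = {"password", "123456", "qwerty", "letmein"}

def password_acceptable(candidate: str):
    """Return (ok, reason) for a candidate memorized secret."""
    if len(candidate) < 8:
        return False, "shorter than the 8-character minimum"
    if candidate.lower() in COMPROMISED_PASSWORDS:
        return False, "appears in a list of compromised passwords"
    # Notably, SP 800-63B advises against mandatory composition rules and
    # routine expiry, so no such checks are made here.
    return True, "acceptable"

if __name__ == "__main__":
    for pw in ("letmein", "correct horse battery staple"):
        print(pw, "->", password_acceptable(pw))
```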
Information security assurance model for an examination paper preparation process in a higher education institution
- Authors: Mogale, Miemie
- Date: 2016
- Subjects: Computer security -- Management -- Examinations , Computers -- Access control
- Language: English
- Type: Thesis , Masters , MTech
- Identifier: http://hdl.handle.net/10948/8509 , vital:26377
- Description: In today’s business world, information has become the driving force of organizations. With organizations transmitting large amounts of information to various geographical locations, it is imperative that organizations ensure the protection of their valuable commodity. Organizations should ensure that only authorized individuals receive, view and alter the information. This is also true for Higher Education Institutions (HEIs), which need to protect their examination papers, amongst other valuable information. With various threats waiting to take advantage of the examination papers, HEIs need to be prepared by equipping themselves with an information security management system (ISMS), in order to ensure that the process of setting examination papers is secure and protects the examination papers within the process. An ISMS will ensure that all information security aspects are considered and addressed in order to provide appropriate and adequate protection for the examination papers. With the assistance of information security concepts and information security principles, the ISMS can be developed in order to secure the process of preparing examination papers and to protect the examination papers from potential risks. Risk assessment forms part of the ISMS and is at the centre of any security effort, the reason being that knowing and understanding the risks is imperative to securing an information environment. Risks pertaining to that particular environment need to be assessed in order to deal with them appropriately. In addition, it is very important to any security effort to ensure that employees working with the valuable information are made aware of these risks and are able to protect the information. Therefore, the role players (within the examination paper preparation process (EPPP)) who handle the examination papers on a daily basis have to be equipped with means of handling valuable information in a secure manner. Some of the role players’ behaviour and practices while handling the information could be seen as vulnerabilities that could be exploited by threats, resulting in the compromise of the CIA of the information. Therefore, it is imperative that role players are made aware of their practices and behaviour that could result in a negative impact for the institution. This awareness forms part of, and is addressed in, the ISMS.
- Full Text:
- Date Issued: 2016
An investigation of issues of privacy, anonymity and multi-factor authentication in an open environment
- Authors: Miles, Shaun Graeme
- Date: 2012-06-20
- Subjects: Electronic data processing departments -- Security measures , Electronic data processing departments , Privacy, Right of , Computer security , Data protection , Computers -- Access control
- Language: English
- Type: Thesis , Masters , MSc
- Identifier: vital:4656 , http://hdl.handle.net/10962/d1006653 , Electronic data processing departments -- Security measures , Electronic data processing departments , Privacy, Right of , Computer security , Data protection , Computers -- Access control
- Description: This thesis performs an investigation into issues concerning the broad area of Identity and Access Management, with a focus on open environments. Through literature research the issues of privacy, anonymity and access control are identified. The issue of privacy is an inherent problem due to the nature of the digital network environment. Information can be duplicated and modified regardless of the wishes and intentions of the owner of that information unless proper measures are taken to secure the environment. Once information is published or divulged on the network, there is very little way of controlling the subsequent usage of that information. To address this issue a model for privacy is presented that follows the user-centric paradigm of meta-identity. The lack of anonymity, where security measures can be thwarted through the observation of the environment, is a concern for users and systems. By observing the communication channel and monitoring the interactions between users and systems over a long enough period of time, an attacker can infer knowledge about the users and systems. This knowledge is used to build an identity profile of potential victims to be used in subsequent attacks. To address the problem, mechanisms for providing an acceptable level of anonymity while maintaining adequate accountability (from a legal standpoint) are explored. In terms of access control, the inherent weakness of single-factor authentication mechanisms is discussed. The typical mechanism is the user-name and password pair, which provides a single point of failure. By increasing the factors used in authentication, the amount of work required to compromise the system increases non-linearly. Within an open network, several aspects hinder wide-scale adoption and use of multi-factor authentication schemes, such as token management and the impact on usability. The framework is developed from a Utopian point of view, with the aim of being applicable to many situations as opposed to a single specific domain. The framework incorporates multi-factor authentication over multiple paths using mobile phones and GSM networks, and explores the usefulness of such an approach; a simplified sketch of this multi-path idea follows this record. The models are in turn analysed, providing a discussion of the assumptions made and the problems faced by each model. , Adobe Acrobat Pro 9.5.1 , Adobe Acrobat 9.51 Paper Capture Plug-in
- Full Text:
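The multi-path, multi-factor idea mentioned in the abstract can be illustrated with a hypothetical sketch: a password is verified over the primary channel, and a one-time code is delivered over a second path, standing in here for the GSM/SMS channel the thesis explores. All names, the delivery function, and the hashing scheme are illustrative assumptions rather than the framework's actual design (a production system would use a salted password-hashing function rather than bare SHA-256).

```python
# Illustrative two-factor, two-path login: password over the primary channel,
# one-time code over a secondary channel (standing in for SMS over GSM).
import hashlib
import hmac
import secrets

USERS = {"alice": hashlib.sha256(b"s3cret-passphrase").hexdigest()}
pending_codes = {}

def send_over_gsm(user: str, code: str) -> None:
    # Placeholder for delivery over the out-of-band channel.
    print(f"(out-of-band) code for {user}: {code}")

def start_login(user: str, password: str) -> bool:
    digest = hashlib.sha256(password.encode()).hexdigest()
    if not hmac.compare_digest(digest, USERS.get(user, "")):
        return False
    code = f"{secrets.randbelow(10**6):06d}"   # six-digit one-time code
    pending_codes[user] = code
    send_over_gsm(user, code)
    return True

def finish_login(user: str, code: str) -> bool:
    expected = pending_codes.pop(user, None)
    return expected is not None and hmac.compare_digest(expected, code)

if __name__ == "__main__":
    if start_login("alice", "s3cret-passphrase"):
        entered = pending_codes["alice"]   # in reality typed in by the user
        print("logged in:", finish_login("alice", entered))
```

Splitting the two factors across independent paths means an attacker must compromise both channels, which is the non-linear increase in attacker effort the abstract refers to.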
Enabling e-learning 2.0 in information security education: a semantic web approach
- Authors: Goss, Ryan Gavin
- Date: 2009
- Subjects: Data protection , Computers -- Access control , Electronic data processing -- Security measures , Electronic data processing departments -- Security measures
- Language: English
- Type: Thesis , Masters , MTech
- Identifier: vital:9771 , http://hdl.handle.net/10948/909 , Data protection , Computers -- Access control , Electronic data processing -- Security measures , Electronic data processing departments -- Security measures
- Description: The motivation for this study argued that current information security education systems are inadequate for educating all users of computer systems worldwide in acting securely during their operations with information systems. There is, therefore, a pervasive need for information security knowledge in all aspects of modern life. E-Learning 2.0 could possibly contribute to solving this problem; however, little or no knowledge currently exists regarding the suitability and practicality of using such systems to impart information security knowledge to learners.
- Full Text:
- Date Issued: 2009
Network-layer reservation TDM for ad-hoc 802.11 networks
- Authors: Duff, Kevin Craig
- Date: 2008
- Subjects: Computer networks -- Access control , Computers -- Access control , Computer networks -- Management , Time division multiple access , Ad hoc networks (Computer networks)
- Language: English
- Type: Thesis , Masters , MSc
- Identifier: vital:4574 , http://hdl.handle.net/10962/d1002773 , Computer networks -- Access control , Computers -- Access control , Computer networks -- Management , Time division multiple access , Ad hoc networks (Computer networks)
- Description: Ad-hoc mesh networks offer great promise. Low-cost ad-hoc mesh networks can be built using popular IEEE 802.11 equipment, but such networks are unable to guarantee each node a fair share of bandwidth. Furthermore, hidden node problems cause collisions which can cripple the throughput of a network. This research proposes a novel mechanism which is able to overcome hidden node problems and provide fair bandwidth sharing among nodes on ad-hoc 802.11 networks, and which can be implemented on existing network devices. The scheme uses TDM (time division multiplexing) with slot reservation. A distributed beacon packet latency measurement mechanism is used to achieve node synchronisation. The distributed nature of the mechanism makes it applicable to ad-hoc 802.11 networks, which can either grow or fragment dynamically. (A simplified sketch of the slot-reservation idea follows this record.)
- Full Text:
- Date Issued: 2008
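A simplified sketch of the slot-reservation idea referenced in the record above: it assumes a fixed frame of equal-length slots and shows how a node might claim a free slot and decide, from a beacon-synchronised clock, when it may transmit. The slot count, slot length and function names are illustrative assumptions, not the parameters or code used in the thesis.

```python
# Simplified TDM slot reservation: a frame of fixed-length slots, with each
# node transmitting only during its reserved slot.  Parameters are illustrative.

SLOTS_PER_FRAME = 10
SLOT_LENGTH_MS = 20
FRAME_LENGTH_MS = SLOTS_PER_FRAME * SLOT_LENGTH_MS

def reserve_slot(reservations, node):
    """Give the node the first free slot in the frame, if any."""
    for slot in range(SLOTS_PER_FRAME):
        if slot not in reservations:
            reservations[slot] = node
            return slot
    return None  # frame is full

def may_transmit(now_ms, my_slot):
    """True while the synchronised network time falls inside our slot."""
    position_in_frame = now_ms % FRAME_LENGTH_MS
    return my_slot * SLOT_LENGTH_MS <= position_in_frame < (my_slot + 1) * SLOT_LENGTH_MS

if __name__ == "__main__":
    reservations = {}
    slot_a = reserve_slot(reservations, "node-A")
    slot_b = reserve_slot(reservations, "node-B")
    # now_ms would come from the beacon-latency-based shared clock.
    print(slot_a, slot_b, may_transmit(now_ms=5.0, my_slot=slot_a))
```

Because each node transmits only in its own slot, two synchronised nodes that cannot hear each other no longer collide at a common receiver, which is the hidden-node situation the mechanism targets.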
Towards a user centric model for identity and access management within the online environment
- Authors: Deas, Matthew Burns
- Date: 2008
- Subjects: Computers -- Access control , Computer networks -- Security measures
- Language: English
- Type: Thesis , Masters , MTech
- Identifier: vital:9780 , http://hdl.handle.net/10948/775 , Computers -- Access control , Computer networks -- Security measures
- Description: Today, one is expected to remember multiple user names and passwords for different domains when one wants to access services on the Internet. Identity management seeks to solve this problem by creating a digital identity that is exchangeable across organisational boundaries. Through the setup of collaboration agreements between multiple domains, users can easily switch across domains without being required to sign in again. However, use of this technology comes with risks of user identity and personal information being compromised. Criminals make use of spoofed websites and social engineering techniques to gain illegal access to user information. Due to this, the need for users to be protected from online threats has increased. Two processes are required to protect the user’s login information at the time of sign-on: firstly, the user’s information must be protected at the time of sign-on, and secondly, the user requires a simple method for identifying the website. This treatise looks at the process for identifying and verifying user information, and at how the user can verify the system at sign-in. Three models for identity management are analysed, namely the Microsoft .NET Passport, Liberty Alliance Federated Identity for Single Sign-on and the Mozilla TrustBar for system authentication.
- Full Text:
- Date Issued: 2008
A critical review of the IFIP TC11 Security Conference Series
- Authors: Gaadingwe, Tshepo Gaadingwe
- Date: 2007
- Subjects: Database security , Data protection , Computers -- Access control
- Language: English
- Type: Thesis , Masters , MTech
- Identifier: vital:9795 , http://hdl.handle.net/10948/507 , Database security , Data protection , Computers -- Access control
- Description: Over the past few decades the field of computing has grown and evolved. In this time, information security research has experienced the same type of growth. The increase in the importance of and interest in information security research is reflected by the sheer number of research efforts being produced by different types of organizations around the world. One such organization is the International Federation for Information Processing (IFIP), more specifically the IFIP Technical Committee 11 (IFIP TC11). The IFIP TC11 community has had a rich history of producing high-quality information security articles for over 20 years now. Therefore, IFIP TC11 found it necessary to reflect on this history, mainly to try to discover where it came from and where it may be going. The 20th anniversary of its main conference presented an opportunity to begin such a study of its history, the core belief driving the study being that the future can only be realized and appreciated if the past is well understood. The main area of interest was to identify topics which were prevalent in the past or could be considered "hot" topics. To achieve this, the author developed a systematic process for the study, the underpinning element being the creation of a classification scheme which was used to aid the analysis of IFIP TC11's 20 years' worth of articles. Major themes were identified and trends in the series highlighted. Further discussion and reflection on these trends were given. It was found that, not surprisingly, the series covered a wide variety of topics over the 20 years. However, it was discovered that there has been a notable move towards technically focused papers. Furthermore, topics such as business continuity had just about disappeared from the series, while topics related to networking and cryptography continue to gain more prevalence.
- Full Text:
- Date Issued: 2007
Implementing the CoSaWoE models in a commercial workflow product
- Authors: Erwee, Carmen
- Date: 2005
- Subjects: Computers -- Access control , Workflow , Computer security , Data protection
- Language: English
- Type: Thesis , Masters , MTech
- Identifier: vital:9732 , http://hdl.handle.net/10948/169 , Computers -- Access control , Workflow , Computer security , Data protection
- Description: Workflow systems have gained popularity not only as a research topic, but also as a key component of Enterprise Resource Planning packages and e-business. Comprehensive workflow products that automate intra- as well as inter-organizational information flow are now available for commercial use. Standardization efforts have centered mostly around the interoperability of these systems; however, a standard access control model has yet to be adopted. The research community has developed several models for access control to be included as part of workflow functionality. Commercial systems, however, are still implementing access control functionality in a proprietary manner. This dissertation investigates whether a comprehensive model for gaining context-sensitive access control, namely CoSAWoE, can be purposefully implemented in a commercial workflow product. Using methods such as an exploratory prototype, various aspects of the model were implemented to gain an understanding of the difficulties developers face when attempting to map the model to existing proprietary software. Oracle Workflow was chosen as an example of a commercial workflow product. An investigation of the features of this product, together with the prototype, revealed the ability to effect access control in a similar manner to the model: by specifying access control constraints during administration and design, and then enforcing those constraints dynamically during run-time. However, only certain components within these two aspects of the model directly affected the commercial workflow product. It was argued that the first two requirements of context-sensitive access control, order of events and strict least privilege, addressed by the object design, role engineering and session control components of the model, can be simulated if such capabilities are not pertinently available as part of the product. As such, guidelines were provided for how this can be achieved in Oracle Workflow. However, most of the implementation effort focussed on the last requirement of context-sensitive access control, namely separation of duties. The CoSAWoE model proposes SoD administration steps that include expressing various business rules through a set of conflicting entities which are maintained outside the scope of the workflow system. This component was implemented easily enough through tables created within a relational database. Evaluating these conflicts during run-time to control worklist generation proved more difficult. First, a thorough understanding of the way in which workflow history is maintained was necessary. A re-usable function was developed to prune user lists according to user involvement in previous tasks in the workflow and the conflicts specified for those users and tasks; a rough sketch of this pruning step follows this record. However, due to the lack of a central access control service, this re-usable function must be included in the appropriate places in the workflow process model. Furthermore, the dissertation utilized a practical example to develop a prototype. This prototype served a dual purpose: firstly, to aid the author's understanding of the features and principles involved, and secondly, to illustrate and explore the implementation of the model as described in the previous paragraphs. In conclusion, the dissertation summarized the CoSAWoE model's components, which were found to be product-agnostic, directly or indirectly implementable, or not implemented in the chosen workflow product. The lessons learnt and issues surrounding the implementation effort were also discussed, before further research into XML documents as data containers for the workflow process was suggested.
- Full Text:
- Date Issued: 2005
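The worklist-pruning step referenced in the record above can be sketched roughly as follows: a set of conflicting task pairs records which duties may not be performed by the same user within one workflow case, and the pruning function removes from the candidate list any user whose history in that case conflicts with the task being assigned. The data layout and names are illustrative assumptions, not the Oracle Workflow tables or code used in the dissertation.

```python
# Rough sketch of pruning a worklist using separation-of-duty conflicts.
# Each pair below names two tasks that may not be handled by the same user
# within a single workflow case.

CONFLICTS = {("raise_invoice", "approve_invoice"),
             ("approve_invoice", "release_payment")}

def conflicting_tasks(task):
    """All tasks that conflict with the given task."""
    return {a for a, b in CONFLICTS if b == task} | {b for a, b in CONFLICTS if a == task}

def prune_worklist(candidates, task, history):
    """Remove users who already performed a task that conflicts with `task`.

    `history` maps each user to the tasks they completed in this workflow case.
    """
    blocked = conflicting_tasks(task)
    return [u for u in candidates if not (history.get(u, set()) & blocked)]

if __name__ == "__main__":
    history = {"carmen": {"raise_invoice"}, "jabu": set()}
    print(prune_worklist(["carmen", "jabu"], "approve_invoice", history))
    # -> ['jabu']: the user who raised the invoice may not also approve it.
```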
- Authors: Erwee, Carmen
- Date: 2005
- Subjects: Computers -- Access control , Workflow , Computer security , Data protection
- Language: English
- Type: Thesis , Masters , MTech
- Identifier: vital:9732 , http://hdl.handle.net/10948/169 , Computers -- Access control , Workflow , Computer security , Data protection
- Description: Workflow systems have gained popularity not only as a research topic, but also as a key component of Enterprise Resource Planning packages and e-business. Comprehensive workflow products that automate intra- as well as inter-organizational information flow are now available for commercial use. Standardization efforts have centered mostly on the interoperability of these systems; however, a standard access control model has yet to be adopted. The research community has developed several models for access control to be included as part of workflow functionality. Commercial systems, however, still implement access control functionality in a proprietary manner. This dissertation investigates whether a comprehensive model for gaining context-sensitive access control, namely CoSAWoE, can be purposefully implemented in a commercial workflow product. Using methods such as an exploratory prototype, various aspects of the model were implemented to gain an understanding of the difficulties developers face when attempting to map the model to existing proprietary software. Oracle Workflow was chosen as an example of a commercial workflow product. An investigation of the features of this product, together with the prototype, revealed the ability to effect access control in a similar manner to the model: by specifying access control constraints during administration and design, and then enforcing those constraints dynamically during run-time. However, only certain components within these two aspects of the model directly affected the commercial workflow product. It was argued that the first two requirements of context-sensitive access control, order of events and strict least privilege, addressed by the object design, role engineering and session control components of the model, can be simulated if such capabilities are not pertinently available as part of the product. As such, guidelines were provided for how this can be achieved in Oracle Workflow. However, most of the implementation effort focused on the last requirement of context-sensitive access control, namely separation of duties. The CoSAWoE model proposes SoD administration steps that include expressing various business rules through a set of conflicting entities which are maintained outside the scope of the workflow system. This component was implemented easily enough through tables created in a relational database. Evaluating these conflicts during run-time to control worklist generation proved more difficult. First, a thorough understanding of the way in which workflow history is maintained was necessary. A reusable function was developed to prune user lists according to user involvement in previous tasks in the workflow and the conflicts specified for those users and tasks (a minimal sketch of this pruning idea appears after this record). However, due to the lack of a central access control service, this reusable function must be included in the appropriate places in the workflow process model. Furthermore, the dissertation utilized a practical example to develop a prototype. This prototype served a dual purpose: firstly, to aid the author's understanding of the features and principles involved, and secondly, to illustrate and explore the implementation of the model as described in the previous paragraphs. In conclusion, the dissertation summarized the CoSAWoE model's components which were found to be product agnostic, directly or indirectly implementable, or not implemented in the chosen workflow product. The lessons learnt and issues surrounding the implementation effort were also discussed, before further research in terms of XML documents as data containers for the workflow process was suggested.
- Full Text:
- Date Issued: 2005
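The worklist-pruning component described in the abstract above lends itself to a small illustration. The sketch below is hypothetical and not taken from CoSAWoE or Oracle Workflow: it assumes conflicting task pairs are held in a relational table outside the workflow engine, and a reusable function filters a candidate user list by checking who performed a conflicting task earlier in the same process instance. Table, column, and function names are illustrative.

```python
# Hypothetical sketch of SoD-aware worklist pruning: conflict rules live in a
# relational table, and candidates who already performed a conflicting task in
# the same process instance are removed before the worklist is generated.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE task_conflicts (task_a TEXT, task_b TEXT);
    CREATE TABLE workflow_history (instance_id TEXT, task TEXT, performer TEXT);
""")
con.executemany("INSERT INTO task_conflicts VALUES (?, ?)",
                [("raise_order", "approve_order")])
con.executemany("INSERT INTO workflow_history VALUES (?, ?, ?)",
                [("inst-1", "raise_order", "alice")])

def prune_worklist(con, instance_id, task, candidates):
    """Drop candidates who already performed a task that conflicts with `task`."""
    rows = con.execute(
        """SELECT h.performer
             FROM workflow_history h
             JOIN task_conflicts c
               ON (h.task = c.task_a AND c.task_b = ?)
               OR (h.task = c.task_b AND c.task_a = ?)
            WHERE h.instance_id = ?""",
        (task, task, instance_id)).fetchall()
    blocked = {r[0] for r in rows}
    return [u for u in candidates if u not in blocked]

print(prune_worklist(con, "inst-1", "approve_order", ["alice", "bob"]))  # ['bob']
```

Because there is no central access control service in this sketch, such a function would have to be invoked explicitly at each point in the process model where a worklist is built, mirroring the limitation noted in the abstract.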
SoDA : a model for the administration of separation of duty requirements in workflow systems
- Authors: Perelson, Stephen
- Date: 2001
- Subjects: Electronic data processing departments -- Security measures , Computers -- Access control
- Language: English
- Type: Thesis , Masters , MTech (Information Technology)
- Identifier: vital:10796 , http://hdl.handle.net/10948/68 , Electronic data processing departments -- Security measures , Computers -- Access control
- Description: The increasing reliance on information technology to support business processes has emphasised the need for information security mechanisms. This, however, has resulted in an ever-increasing workload in terms of security administration. Security administration encompasses the activity of ensuring the correct enforcement of access control within an organisation. Access rights and their allocation are dictated by the security policies within an organisation. As such, security administration can be seen as a policy-based approach. Policy-based approaches promise to lighten the workload of security administrators. Separation of duties is one of the principles cited as a criterion when setting up these policy-based mechanisms. Different types of separation of duty policies exist. They can be categorised into policies that can be enforced at administration time, viz. static separation of duty requirements, and policies that can be enforced only at execution time, viz. dynamic separation of duty requirements. This dissertation deals with the specification of both static and dynamic separation of duty requirements in role-based workflow environments. It proposes a model for the specification of separation of duty requirements, the expressions of which are based on set theory. The model focuses, furthermore, on the enforcement of static separation of duty. The enforcement of static separation of duty requirements is modelled in terms of invariant conditions. The invariant conditions specify restrictions upon the elements allowed in the sets representing access control requirements. The sets are themselves expressed as database tables within a relational database management system. Algorithms that stipulate how to verify the additions or deletions of elements within these sets can then be performed within the database management system (a brief sketch of such an invariant check follows this record). A prototype was developed in order to demonstrate the concepts of this model. This prototype helps demonstrate how the proposed model could function and demonstrates its effectiveness.
- Full Text:
- Date Issued: 2001
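To make the invariant idea concrete, the following is a minimal, hypothetical sketch of a static separation of duty check of the kind the model expresses in set terms: two roles declared conflicting may never both be held by the same user, and the invariant is verified whenever an assignment is added. The names used (CONFLICTING_ROLES, assign_role) are illustrative and not drawn from SoDA.

```python
# Hypothetical static SoD invariant: the two roles in each conflicting pair may
# never both be assigned to one user; the check runs before any new assignment
# is committed, i.e. at administration time.
CONFLICTING_ROLES = {frozenset({"order_clerk", "order_approver"})}

user_roles = {"alice": {"order_clerk"}}

def assign_role(user: str, role: str) -> None:
    """Add a role only if every static SoD invariant still holds afterwards."""
    proposed = user_roles.get(user, set()) | {role}
    for pair in CONFLICTING_ROLES:
        if pair <= proposed:  # both conflicting roles would be present
            raise ValueError(f"static SoD violation: {user} would hold {sorted(pair)}")
    user_roles.setdefault(user, set()).add(role)

assign_role("alice", "invoice_clerk")        # allowed
try:
    assign_role("alice", "order_approver")   # conflicts with order_clerk
except ValueError as err:
    print(err)
```

In the dissertation's design the sets themselves are relational tables, so an equivalent check could equally be expressed as constraints or triggers inside the database management system rather than in application code.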
The development of a technique to establish the security requirements of an organization
- Authors: Gerber, Mariana
- Date: 2001
- Subjects: Computer security -- Management , Electronic data processing departments -- Security measures , Businesses -- Data processing -- Security measures , Computers -- Access control
- Language: English
- Type: Thesis , Masters , MTech (Information Technology)
- Identifier: vital:10789 , http://hdl.handle.net/10948/89 , Computer security -- Management , Electronic data processing departments -- Security measures , Businesses -- Data processing -- Security measures , Computers -- Access control
- Description: To perform their business activities effectively, organizations rely heavily on the use of information (ISO/IEC TR 13335-2, 1996, p 1). Owens (1998) reiterates this by claiming that all organizations depend on information for their everyday operation and without it business will fail to operate (Owens, 1998, p 1-2). For an organization it means that if the right information is not available at the right time, it can make the difference between profit and loss or success and failure (Royds, 2000, p 2). Information is an asset and, just like other important business assets within the organization, it has extreme value to an organization (BS 7799-1, 1999, p 1; Humphreys, Moses & Plate, 1998, p 8). For this reason it has become very important that business information is sufficiently protected. There are many different ways in which information can exist. Information can be printed or written on paper, stored electronically, transmitted electronically or by post, even spoken in conversation or conveyed in any other way in which knowledge and ideas can be shared (URN 99/703, 1999, p 2; Humphreys, Moses & Plate, 1998, p 8; URN 96/702, 1996, p 3). It is, therefore, critical to protect information, and to ensure that the security of IT (Information Technology) systems within organizations is properly managed. This requirement to protect information is even more important today, since many organizations are internally and externally connected by networks of IT systems (ISO/IEC TR 13335-2, 1996, p 1). Information security is therefore required to assist in the process of controlling and securing information against accidental or malicious changes, deletions or unauthorized disclosure (Royds, 2000, p 2; URN 96/702, 1996, p 3). By preventing and minimizing the impact of security incidents, information security can ensure business continuity and reduce business damage (Owens, 1998, p 7). Information security in an organization can be regarded as a management opportunity and should become an integral part of the whole management activity of the organization. Obtaining commitment from management is therefore extremely important for effective information security. One way in which management can show its commitment to ensuring information security is to adopt and enforce a security policy. A security policy ensures that people understand exactly what important role they play in securing information assets.
- Full Text:
- Date Issued: 2001
Distributed authentication for resource control
- Authors: Burdis, Keith Robert
- Date: 2000
- Subjects: Computers -- Access control , Data protection , Computer networks -- Security measures , Electronic data processing departments -- Security measures
- Language: English
- Type: Thesis , Masters , MSc
- Identifier: vital:4630 , http://hdl.handle.net/10962/d1006512 , Computers -- Access control , Data protection , Computer networks -- Security measures , Electronic data processing departments -- Security measures
- Description: This thesis examines distributed authentication in the process of controlling computing resources. We investigate user sign-on and two of the main authentication technologies that can be used to control a resource through authentication and to provide additional security services. The problems with the existing sign-on scenario are that users have too much credential information to manage and are prompted for this information too often. Single Sign-On (SSO) is a viable solution to this problem if physical procedures are introduced to minimise the risks associated with its use. The Generic Security Services API (GSS-API) provides security services in a manner independent of the environment in which these security services are used, encapsulating security functionality and insulating users from changes in security technology. The underlying security functionality is provided by GSS-API mechanisms. We developed the Secure Remote Password GSS-API Mechanism (SRPGM) to provide a mechanism that has low infrastructure requirements, is password-based and does not require the use of long-term asymmetric keys (a generic sketch of the SRP verifier idea follows this record). We provide implementations of the Java GSS-API bindings and the LIPKEY and SRPGM GSS-API mechanisms. The Simple Authentication and Security Layer (SASL) provides security to connection-based Internet protocols. After finding deficiencies in existing SASL mechanisms we developed the Secure Remote Password SASL mechanism (SRP-SASL) that provides strong password-based authentication and countermeasures against known attacks, while still being simple and easy to implement. We provide implementations of the Java SASL binding and several SASL mechanisms, including SRP-SASL.
- Full Text:
- Date Issued: 2000
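As context for the SRP-based mechanisms mentioned above, the sketch below shows the standard Secure Remote Password verifier setup (following the RFC 2945 pattern, with SHA-256 substituted for SHA-1 and toy group parameters for readability): the server stores only a salt and a verifier derived from the password, never the password itself and no long-term asymmetric key. This is a generic SRP illustration, not code from SRPGM or SRP-SASL.

```python
import hashlib
import os

# Toy group parameters for readability only; real deployments use the large
# safe-prime groups standardised for SRP (e.g. the RFC 5054 groups).
N = 23
g = 5

def make_verifier(username: str, password: str, salt: bytes | None = None):
    """Return (salt, verifier) where v = g^x mod N and x = H(salt || H(user:pass))."""
    salt = salt or os.urandom(16)
    inner = hashlib.sha256(f"{username}:{password}".encode()).digest()
    x = int.from_bytes(hashlib.sha256(salt + inner).digest(), "big")
    return salt, pow(g, x, N)

salt, v = make_verifier("keith", "correct horse battery staple")
# During authentication the client proves knowledge of the password without
# ever sending it; the server checks the proof against v, so a stolen
# verifier alone does not reveal the password.
print(v)
```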