Mapping the location of 2.4 GHz transmitters to achieve optimal usage of an IEEE 802.11 network
- Wells, David D, Siebörger, Ingrid G, Irwin, Barry V W
- Authors: Wells, David D , Siebörger, Ingrid G , Irwin, Barry V W
- Date: 2008
- Language: English
- Type: Conference paper
- Identifier: vital:6605 , http://hdl.handle.net/10962/d1009325
- Description: This paper describes the use of a low-cost 2.4 GHz spectrum analyser, the MetaGeek WiSpy device, in conjunction with custom-developed client-server software for the accurate identification of 2.4 GHz transmitters within a given area. The WiSpy dongle, together with the custom-developed software, allows the positions of Wi-Fi transmitters to be determined to within a few meters, which can reduce the workload of physical searches when surveying the Wi-Fi network and geographical area. This paper describes the tool and a site-survey methodology that organisations can use to audit their environments for Wi-Fi networks. The tool produced from this project, the WiSpy Signal Source Mapping Tool, is a three-part application based on a client-server architecture. One part interfaces with a low-cost 2.4 GHz spectrum analyser, another stores the data collected from all the spectrum analysers, and the third interprets the data to provide a graphical overview of the Wi-Fi network being analysed. The locations of the spectrum analysers are entered as GPS points, and the tool can interface with a GPS device to update its geographical location automatically. The graphical representation of the 2.4 GHz spectrum populated with Wi-Fi devices (the Wi-Fi network) provided a fairly accurate method for locating and tracking 2.4 GHz devices. The accuracy of the WiSpy Signal Source Mapping Tool is hindered by obstructions, interference within the area, and non-line-of-sight conditions.
- Full Text:
- Date Issued: 2008
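As a rough illustration of the multi-sensor localisation the abstract describes, the sketch below estimates a transmitter's position from signal strengths measured by spectrum analysers at known coordinates. It uses a simple weighted centroid; this is a hypothetical simplification, not necessarily the algorithm the WiSpy Signal Source Mapping Tool implements.

```python
# Hypothetical sketch: estimate a 2.4 GHz transmitter's position from
# signal strengths measured by spectrum analysers at known coordinates.
# Weighted centroid only; the paper's actual method may differ.

def weighted_centroid(sensors):
    """sensors: list of (x, y, rssi_dbm) tuples; returns an (x, y) estimate."""
    # Convert dBm to linear power so stronger readings dominate the estimate.
    weighted = [(x, y, 10 ** (rssi / 10.0)) for x, y, rssi in sensors]
    total = sum(w for _, _, w in weighted)
    x_est = sum(x * w for x, _, w in weighted) / total
    y_est = sum(y * w for _, y, w in weighted) / total
    return x_est, y_est

# Three sensors around a transmitter; readings in dBm (illustrative values).
readings = [(0.0, 0.0, -40.0), (2.0, 0.0, -50.0), (0.0, 2.0, -50.0)]
print(weighted_centroid(readings))
```

The estimate is pulled toward the sensor with the strongest reading, which matches the abstract's point that obstructions and non-line-of-sight conditions (which distort signal strength) degrade accuracy.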
Location and mapping of 2.4 GHz RF transmitters
- Wells, David D, Siebörger, Ingrid G, Irwin, Barry V W
- Authors: Wells, David D , Siebörger, Ingrid G , Irwin, Barry V W
- Date: 2008
- Language: English
- Type: Conference paper
- Identifier: vital:6604 , http://hdl.handle.net/10962/d1009324
- Description: This paper describes the use of a MetaGeek WiSpy dongle in conjunction with custom-developed client-server software for the accurate identification of wireless nodes within an organisation. The MetaGeek WiSpy dongle, together with the custom-developed software, allows the positions of Wi-Fi transceivers to be determined to within a few meters, which can help reduce the area for physical searches in the event of rogue units. This paper describes the tool and a site-survey methodology that organisations can use to audit their environments for wireless networks. The tool produced from this project, the WiSpy Signal Source Mapping Tool, is a three-part application based on a client-server architecture. One part interfaces with a low-cost 2.4 GHz spectrum analyser, another stores the data collected from all the spectrum analysers, and the last interprets the data to provide a graphical overview of the Wi-Fi network being analysed. The locations of the spectrum analysers are entered as GPS points, and the tool can interface with a GPS device to update its geographical location automatically. The graphical representation of the 2.4 GHz spectrum populated with Wi-Fi devices (the Wi-Fi network) provided a fairly accurate method for locating and tracking 2.4 GHz devices. The accuracy of the WiSpy Signal Source Mapping Tool is hindered by obstructions, interference within the area, and non-line-of-sight conditions.
- Full Text:
- Date Issued: 2008
Program management : Rhodes University experience
- Authors: Vanda, Pelisa
- Date: 2014
- Language: English
- Type: Conference paper
- Identifier: vital:6982 , http://hdl.handle.net/10962/d1020652
- Description: The Program Management Module was implemented a year after Sierra was introduced in the SEALS libraries. Rhodes University had no online system for booking group study rooms: students queued early in the morning outside the library to sign up for room bookings on a paper form. Aspects of the implementation are discussed, including the creation of user guidelines, training of library staff, and publicity for the booking system. Procedures were developed and made available on RUConnected and on the library webpage. , Paper delivered at the IUG-SA Conference, 19-21 November 2014.
- Full Text:
- Date Issued: 2014
Classifying network attack scenarios using an ontology
- Van Heerden, Renier, Irwin, Barry V W, Burke, I D
- Authors: Van Heerden, Renier , Irwin, Barry V W , Burke, I D
- Date: 2012
- Language: English
- Type: Conference paper
- Identifier: vital:6606 , http://hdl.handle.net/10962/d1009326
- Description: This paper presents a methodology using a network attack ontology to classify computer-based attacks. Computer network attacks differ in motivation, execution and end result. Because attacks are diverse, no standard classification exists. If an attack could be classified, it could be mitigated accordingly. A taxonomy of computer network attacks forms the basis of the ontology. Most published taxonomies present an attack from either the attacker's or the defender's point of view. This taxonomy presents both views. The main taxonomy classes are: Actor, Actor Location, Aggressor, Attack Goal, Attack Mechanism, Attack Scenario, Automation Level, Effects, Motivation, Phase, Scope and Target. The "Actor" class is the entity executing the attack. The "Actor Location" class is the Actor's country of origin. The "Aggressor" class is the group instigating an attack. The "Attack Goal" class specifies the attacker's goal. The "Attack Mechanism" class defines the attack methodology. The "Automation Level" class indicates the level of human interaction. The "Effects" class describes the consequences of an attack. The "Motivation" class specifies incentives for an attack. The "Scope" class describes the size and utility of the target. The "Target" class is the physical device or entity targeted by an attack. The "Vulnerability" class describes a target vulnerability used by the attacker. The "Phase" class represents an attack model that subdivides an attack into different phases. The ontology was developed using an "Attack Scenario" class, which draws from other classes and can be used to characterise and classify computer network attacks. An "Attack Scenario" consists of phases, has a scope, and is attributed to an actor and aggressor which have a goal. The "Attack Scenario" thus represents different classes of attacks. High-profile computer network attacks such as Stuxnet and the Estonia attacks can now be classified through the "Attack Scenario" class.
- Full Text:
- Date Issued: 2012
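The class structure the abstract enumerates can be sketched as a simple record type whose fields correspond to the taxonomy classes. The field names follow the listed classes; the Stuxnet values below are illustrative placeholders, not the paper's actual classification.

```python
# Hypothetical sketch of the taxonomy: an AttackScenario record drawing on
# the other classes, as the abstract describes. Values are illustrative.

from dataclasses import dataclass, field

@dataclass
class AttackScenario:
    actor: str              # entity executing the attack
    actor_location: str     # the Actor's country of origin
    aggressor: str          # group instigating the attack
    attack_goal: str        # the attacker's goal
    attack_mechanism: str   # the attack methodology
    automation_level: str   # level of human interaction
    effects: str            # consequences of the attack
    motivation: str         # incentives for the attack
    phases: list = field(default_factory=list)  # attack model phases
    scope: str = ""         # size and utility of the target
    target: str = ""        # device or entity targeted
    vulnerability: str = "" # target vulnerability used by the attacker

stuxnet = AttackScenario(
    actor="Autonomous malware (worm)",
    actor_location="Unknown",
    aggressor="Nation state (alleged)",
    attack_goal="Sabotage of industrial control systems",
    attack_mechanism="Zero-day exploits and removable-media propagation",
    automation_level="Fully automated",
    effects="Physical damage to centrifuges",
    motivation="Political/strategic",
    phases=["Reconnaissance", "Delivery", "Exploitation", "Action"],
    scope="Targeted industrial facility",
    target="PLC-controlled centrifuges",
    vulnerability="Unpatched Windows and PLC software flaws",
)
print(stuxnet.attack_goal)
```

An instance of this record is what the abstract means by classifying an attack "through the Attack Scenario class": the scenario aggregates the other taxonomy classes into one characterisation.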
Media consumption and everyday life
- Authors: Strelitz, Larry N
- Date: 2008
- Language: English
- Type: text , Conference paper
- Identifier: vital:6324 , http://hdl.handle.net/10962/d1008550
- Description: I see this inaugural lecture as an opportunity to trace my journey into the field of media studies, showing how significant youthful experiences with the media set me off on my particular research trajectory. In this address I will use the umbrella term ‘mass media’ to cover both the traditional news media as well as forms of popular culture such as soap operas, popular music and so on. Sometimes we use the terms ‘popular culture’ and ‘mass media’ interchangeably as they both constitute the cultural life of ordinary people.
- Full Text:
- Date Issued: 2008
Taxation and electronic commerce
- Stelloh, Marcus M, Stack, Elizabeth M
- Authors: Stelloh, Marcus M , Stack, Elizabeth M
- Date: 2008
- Subjects: To be catalogued
- Language: English
- Type: Conference paper
- Identifier: vital:6066 , http://hdl.handle.net/10962/d1004611
- Description: Transactions conducted using the Internet have expanded dramatically in the past few years and countries and their governments have become concerned about the consequences that electronic commerce may have on their tax revenues. Because of this many organisations and inter-governmental agencies have met to try to design a solution that will be compatible with the systems of the various countries and achieve tax neutrality. A number of proposals were made and discussed to try to design a fair and efficient e-tax system. The proposed system that is ultimately adopted must consider different tax bases and systems in order to achieve this. In this research the impact of e-commerce on the imposition of income tax was briefly referred to and four different proposals for levying value-added tax or sales tax were analysed in order to compare the advantages and disadvantages of each and to determine which system would most adequately address the needs of e-commerce. Certain modifications and additions to the proposed systems have been suggested in order to satisfy the specific needs of the South African tax system, while still taking other countries’ tax systems into account. Using Amazon.com Inc. and Skype Technologies South Africa Limited as examples, it is demonstrated how the new amended system will work. It was found that the proposed systems and the system adapted to meet the South African needs would, with a few relatively minor changes to the Value-Added Tax legislation, be suitable for the purposes of imposing value-added tax on e-commerce transactions.
- Full Text:
- Date Issued: 2008
Reflections on the supervision of postgraduate research in Accounting Departments
- Authors: Stack, Elizabeth M
- Date: 2008
- Subjects: To be catalogued
- Language: English
- Type: Conference paper
- Identifier: vital:6067 , http://hdl.handle.net/10962/d1004614
- Description: The need to enhance the research profile of accounting departments and schools of accounting at South African universities and to increase the number of students engaging in postgraduate studies mirrors the challenges faced by universities in Australia and the United Kingdom two decades ago. Coupled with these imperatives is the recognition of the need for supervisor training in accounting departments and schools of accounting and the lack of opportunities for gaining experience in postgraduate research supervision due to the small number of students in the accounting field wishing to undertake research-based studies. This article reviews relevant literature on training for the supervisors of postgraduate research students, documents the personal experience and observations of the writer and, drawing on these sources, makes recommendations for the training of supervisors. The recommendations include a model for the training of supervisors reflecting two perspectives: “on-the-job” training and the introduction of a departmental supervision guide setting out aspects of best practice. Issues to be addressed in the training of supervisors include training in research methodology, technical expertise, managing the supervision relationship, quality control, providing constructive criticism and feedback, and ethical concerns.
- Full Text:
- Date Issued: 2008
Precision of tristimulus chromameter results from corticosteroid-induced skin blanching
- Smith, Eric W, Haigh, John M
- Authors: Smith, Eric W , Haigh, John M
- Date: 1998
- Language: English
- Type: Conference paper
- Identifier: vital:6342 , http://hdl.handle.net/10962/d1006609
- Description: The human skin blanching (vasoconstriction) assay has been in use for 3 decades as a tool for the assessment of the release of corticosteroids from topical dosage forms. Application of corticosteroids produces a whitening (blanching) of the skin, the intensity of which is directly related to the clinical efficacy of the formulation. Assessment of the intensity of the induced blanching has classically been, and continues to be, performed by visual grading, a method which has been criticised because of the subjective nature of the assessment. Recently there has been considerable discussion in the literature regarding the use of the chromameter as an objective instrumental method of monitoring corticosteroid-induced skin blanching for bioequivalence assessment purposes. The FDA has released a Guidance document recommending the use of the chromameter for this purpose. The chromameter measures colour in terms of three indices: the L-scale (light-dark), the a-scale (red-green) and the b-scale (yellow-blue). Any colour can be expressed absolutely in terms of these three values. The Guidance protocol suggests the use of only the a-scale values in quantifying the blanching response after correction of the data, which includes subtraction of baseline and unmedicated site values. One of the unresolved issues in the FDA Guidance document is this suggested method of data manipulation, since the instrument should be capable of assigning an absolute colour value to each site during the vasoconstriction period. The purpose of this study was to manipulate the instrumental data from a typical blanching study in a number of ways to investigate the appropriateness of these suggested procedures.
- Full Text:
- Date Issued: 1998
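The a-scale correction the abstract mentions can be sketched as follows: subtract each site's pre-application baseline, then subtract the corresponding change at an unmedicated control site. This is a hedged illustration of the idea; the exact arithmetic in the FDA Guidance may differ.

```python
# Illustrative sketch of the a-scale data correction described in the
# abstract: subtract the baseline reading and the unmedicated-site change.
# The exact Guidance arithmetic may differ from this simplification.

def corrected_a_scale(medicated, medicated_baseline,
                      unmedicated, unmedicated_baseline):
    """All arguments are chromameter a-scale (red-green) readings.
    Blanching lowers the a-value, so a corrected value below zero
    indicates vasoconstriction relative to the control site."""
    site_change = medicated - medicated_baseline
    control_change = unmedicated - unmedicated_baseline
    return site_change - control_change

# Example: the medicated site drops from 10.0 to 7.5 while the control
# site drifts from 10.0 to 9.8 over the same period.
print(corrected_a_scale(7.5, 10.0, 9.8, 10.0))  # approximately -2.3
```

The double subtraction is what the study questions: if the chromameter assigns an absolute colour value to each site, the baseline and control corrections may be redundant or may add noise, which is why the authors manipulate the data in several ways.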
Firewalls at Rhodes
- Authors: Siebörger, David
- Date: 2006
- Language: English
- Type: Conference paper
- Identifier: vital:6610 , http://hdl.handle.net/10962/d1009522
- Description: A presentation on the use of Internet Firewalls at Rhodes University. This formed part of the International Network for the Availability of Scientific Publications' Bandwidth Management and Optimisation Open Source Tools and Solutions project, being a series of workshops conducted throughout the developing world.
- Full Text:
- Date Issued: 2006
Using a FreeBSD "cluster" to provide network services
- Authors: Siebörger, David
- Date: 2005
- Language: English
- Type: Conference paper
- Identifier: vital:6611 , http://hdl.handle.net/10962/d1009524
- Description: A presentation on how FreeBSD might be used in a load-balancing cluster, using work done at Rhodes University as a case study. Presented to a community of higher education IT practitioners in September 2005.
- Full Text:
- Date Issued: 2005
Comparison of visual CR-200 and CR-300 chromameter data obtained from the corticosteroid-induced skin-blanching assay
- Schwarb, Fabian P, Smith, Eric W, Haigh, John M, Surber, Christian
- Authors: Schwarb, Fabian P , Smith, Eric W , Haigh, John M , Surber, Christian
- Date: 1998
- Language: English
- Type: Conference paper , text
- Identifier: vital:6344 , http://hdl.handle.net/10962/d1006611
- Description: In a recent Guidance document the American FDA recommended the use of a chromameter rather than the human eye for the assessment of the pharmacodynamic blanching response produced after topical application of corticosteroids. The purpose of this study was to investigate the appropriateness of the human eye and two types of chromameter for the estimation of skin blanching.
- Full Text:
- Date Issued: 1998
Chromametry: measuring precision of diurnal and local variation of human forearm skin colour
- Schwarb, Fabian P, Smith, Eric W, Haigh, John M, Surber, Christian
- Authors: Schwarb, Fabian P , Smith, Eric W , Haigh, John M , Surber, Christian
- Date: 1998
- Language: English
- Type: Conference paper
- Identifier: vital:6343 , http://hdl.handle.net/10962/d1006610
- Description: Chromameters are compact portable instruments used for the assessment of surface colour based on the tristimulus analysis of a reflected xenon light pulse, and have been used for the quantification of erythema in the study of irritant dermatitis, and corticosteroid-induced skin blanching in the vasoconstriction assay. The variability and the reproducibility of chromameter results were investigated since it is known that the location and application force of the measuring head on the skin and the orthostatic maneuver of the arms influence the colour measurement. Furthermore the diurnal variation and the homogeneity of forearm skin colour were investigated.
- Full Text:
- Date Issued: 1998
Bioequivalence testing of topical dermatological formulations, the gap between science and legislation
- Schwarb, Fabian P, Smith, Eric W, Haigh, John M, Surber, Christian
- Authors: Schwarb, Fabian P , Smith, Eric W , Haigh, John M , Surber, Christian
- Date: 1998
- Language: English
- Type: Conference paper
- Identifier: vital:6341 , http://hdl.handle.net/10962/d1006608
- Description: Bioavailability concerns for topical dermatological products are complex, and it is especially difficult to determine the bioequivalence of similar topical formulations. Since only small amounts of drug dispersed in an appropriate vehicle are applied to the skin, the amount of drug that actually reaches the systemic circulation is often too small to be easily quantified. Additionally, it can be argued that the relevance of any serum/plasma concentration-time curve of a topical agent is questionable, since the curve reflects the amount of drug after the active moiety has left the site of action. For some topical drugs, e.g. topical corticosteroids, it is possible to perform a pharmacodynamic bioassay to obtain acceptable bioequivalence data. In this case, the intensity of the side effect of blanching (vasoconstriction) in the skin caused by topical corticosteroids can be measured. The response is directly proportional to the clinical efficacy, and the skin blanching assay has proved to be a reliable procedure for the determination of topical corticosteroid bioavailability. Recently, we had sight of the results of a topical bioequivalence study which was conducted for the registration of a new generic corticosteroid cream formulation. In this trial the new formulation was compared to two equivalent products from the local market, and bioequivalence was demonstrated by the investigators for all three products. These results were examined with interest, as the respective reference products have been used repeatedly as standard formulations in our laboratory. However, one of these reference formulations has consistently shown superior bioavailability in our trials, but was not demonstrated to be superior in the study results examined. In the present publication an overview of topical bioequivalence testing in general is given, and the difficulties occurring in practice, for topical corticosteroid formulations in particular, are demonstrated.
- Full Text:
- Date Issued: 1998
Refining lecturers’ assessment practices through formal professional development at Rhodes University, Grahamstown
- Authors: Sayigh, L
- Date: 2003
- Language: English
- Type: Conference paper
- Identifier: vital:6080 , http://hdl.handle.net/10962/d1008584
- Description: In recent years, the so-called Accreditation and Registration of Assessors has given rise to much debate in the Higher Education sector. The idea that anyone assessing student learning should be required to train in order to gain a formal qualification and register as an assessor originated with the South African Qualifications Authority (SAQA) and was soon challenged within the higher education community. The Study Team appointed to investigate the implementation of the NQF (National Qualifications Framework) in 2001 recommended that registration of assessors should not be required of individuals teaching in the higher education sector if employed by an accredited institution and this recommendation was later accepted by the Department of Education and the Department of Labour in their joint consultative document entitled ‘An Interdependent National Qualifications Framework System’ (Department of Education, Department of Labour 2003). The waiving of the requirement to register assessors has been welcomed within the public higher education sector. But despite this, the need to train and qualify assessors of students’ learning remains important due to the emphasis placed on assessment by the HEQC (Higher Education Quality Committee) in its ‘Criteria for Institutional Audits’ (2004). The central issue has become how higher education institutions are to successfully train lecturers as assessors in higher education.
- Full Text:
- Date Issued: 2003
New Frontiers of Librarianship
- Authors: Satgoor, Ujala
- Date: 2013
- Language: English
- Type: Conference paper , text
- Identifier: vital:6976 , http://hdl.handle.net/10962/d1007308
- Description: Paper delivered at the Sabinet Client Conference, 6 September 2013
- Full Text:
- Date Issued: 2013
Integrating environmental flow requirements into a stakeholder driven catchment management process
- Rowntree, Kate M, Birkholz, Sharon A, Burt, Jane C, Fox, Helen E
- Authors: Rowntree, Kate M , Birkholz, Sharon A , Burt, Jane C , Fox, Helen E
- Date: 2009
- Language: English
- Type: Conference paper
- Identifier: vital:6670 , http://hdl.handle.net/10962/d1006804
- Description: South Africa's National Water Act (NWA no 36 of 1998) recognizes the need for environmental protection through the ecological Reserve, defined in the Act as the quantity and quality of water required to protect aquatic ecosystems in order to secure ecologically sustainable development through the constrained use of the relevant water resource. Furthermore, the NWA stipulates that the allocation of licenses to new water users, or the granting of increased water use to established water users, can only take place once the Reserve for the river has been determined and approved by the Minister. This means that water users' needs (beyond those required for basic human needs) take second place behind the environment. Whether or not the inclusion of the ecological Reserve in South Africa's water legislation leads to sustainable use of South Africa's water resources depends on its successful implementation. This in turn depends on the will of both the Department of Water Affairs and Forestry (DWAF), the implementing agent, and the end water users, who need to be convinced of the priority given to environmental needs. In this paper we look at the process of implementing the ecological Reserve in the Kat Valley in the Eastern Cape of South Africa as part of a stakeholder driven process of developing a water allocation plan for the catchment that prioritized participation by water users. The extent to which DWAF and the water users expedited or thwarted the process is examined in the light of national and international calls for local-level participation in water resource management processes.
- Full Text:
- Date Issued: 2009
Active learning for understanding land degradation : African Catchment Game and Riskmap
- Rowntree, Kate M, Fox, Roddy C
- Authors: Rowntree, Kate M , Fox, Roddy C
- Date: 2006
- Language: English
- Type: Conference paper
- Identifier: vital:6669 , http://hdl.handle.net/10962/d1006793
- Description: Land degradation is the result of the intersection of a complex set of biophysical and socio-economic factors. The capacity of an individual or community to address land degradation is likewise constrained. While it is quite possible for professionals and learners to grasp the main issues around land degradation from a theoretical perspective, internalizing the reality of what it means to be the resource degrader is more difficult. We have developed two active learning methods that aim to address this problem. The first is the African Catchment Game, a role-playing game based on Graham Chapman’s Green Revolution Game, adapted for the southern Africa context and incorporating a land degradation component. In this game participants play out the complex dynamics of rural-urban-global linkages against a background of environmental hazards. The second is based on Save the Children Fund’s RiskMap computer simulation that models risk in terms of rural livelihoods for different income groups. Ethiopia is used as the example. This paper evaluates the two active learning techniques as tools for exploring the relationships between land degradation and poverty through an evaluation of participants’ experiences.
- Full Text:
- Date Issued: 2006
Exploring risk related to future climates through role-playing games: the African catchment game
- Rowntree, Kate M, Fraenkel, Linda A, Fox, Roddy C
- Authors: Rowntree, Kate M , Fraenkel, Linda A , Fox, Roddy C
- Date: 2009
- Language: English
- Type: Conference paper
- Identifier: vital:6671 , http://hdl.handle.net/10962/d1006807
- Description: Risk is the result of two interacting components: hazard and vulnerability. Climatic hazards are related to extrinsic factors such as drought or severe storms. Vulnerability is the result of intrinsic factors that often arise from the socio-political-economic context. The interplay of risk and vulnerability is difficult to predict. Although computer models have been widely used to forecast climate related risk, albeit with considerable uncertainty, they can never capture sufficiently the vulnerability of human systems to these hazards. Role-playing games can be used more realistically to simulate possible outcomes of different climate change scenarios, and allow players to reflect on their significance. The authors have developed the African Catchment Game to simulate a water scarce African country. Risk can be modelled mechanistically by changing the nature of the annual rainfall input. Vulnerability can in part be modelled by changing the starting parameters (such as access to land and resources) and, secondly, through the unpredictable response of players to game dynamics. Players’ reflections demonstrate that through the game they become more aware of the concept of risk and the complex response of individuals and societies that determine their vulnerability to climatic hazards. This paper reflects on the potential for developing the game further as a tool for participatory learning around climate change, based on the authors’ experience of playing the game with participants from South Africa.
- Full Text:
- Date Issued: 2009
Impediments to the delivery of socioeconomic rights in South Africa
- Authors: Roodt, Monty J
- Date: 2008
- Language: English
- Type: Conference paper
- Identifier: vital:6318 , http://hdl.handle.net/10962/d1011315
- Description: [from the Introduction] The purpose of including Second and Third Generation (STG) rights in a constitution is to provide guidelines to lawmakers to formulate policy and to enable the courts to intervene where these policies are not being implemented satisfactorily. In theory these rights allow citizens to demand from the state access to basic needs, such as adequate land, housing, education, health care, nutrition, and social security. However, this inclusion of rights in the constitution often does not translate into action. The first reason for this is that Second and Third Generation rights may clash with First Generation rights. For example the right to private property may, and in South Africa does, contradict the need for land for the majority. The major problem is whether the policies flowing out of Second and Third Generation rights are pursued with enough vigour by governments, the private sector, primary groups and individuals to overcome this contradiction. In many countries in the world it is the poorest sections of the population, and as Mamdani (1996) pointed out, migrant non-citizens, that bear the brunt of administrative and bureaucratic bungling and neglect.
- Full Text:
- Date Issued: 2008
Learning science through two languages in South Africa
- Authors: Probyn, Margie J
- Date: 2005
- Language: English
- Type: Conference paper
- Identifier: vital:7015 , http://hdl.handle.net/10962/d1007208
- Description: [From the introduction]: South Africa is a multilingual country with eleven national languages - nine indigenous languages and the two former colonial languages of English and Afrikaans - recognised as official languages in the Constitution of 1996 (Constitution of the Republic of South Africa, 1996). Despite these provisions, since the democratic elections of 1994 English has expanded its position as the language of access and power with the relative influence of Afrikaans shrinking, and African languages effectively confined to functions of ‘home and hearth’. McLean and McCormick (1996: 329 in Mazrui 2002: 269) suggest that the constitutional recognition of 11 official languages in South Africa is largely 'intended and perceived as a symbolic statement and that for instrumental purposes, English remains the dominant language in South Africa'.
- Full Text:
- Date Issued: 2005