A combinatorial analysis of barred preferential arrangements
- Authors: Nkonkobe, Sithembele
- Date: 2016
- Language: English
- Type: Thesis , Doctoral , PhD
- Identifier: http://hdl.handle.net/10962/36228 , vital:24530
- Description: For a non-negative integer n, an ordered partition of a set Xn with n distinct elements is called a preferential arrangement (PA). A barred preferential arrangement (BPA) is a preferential arrangement with bars inserted between the blocks of the partition. The integer sequence an associated with counting the PAs of Xn has been intensely studied for over a century and a half in many different contexts. In this thesis we develop a unified combinatorial framework for studying the enumeration of BPAs and a special subclass of BPAs. The results of the study lead to a positive settlement of an open problem and a conjecture of Nelsen. Using generatingfunctionology we derive a few important identities pertaining to the numbers of BPAs and restricted BPAs of an n-element set. We then show that the number of restricted BPAs of Xn is intricately related to well-known numbers such as the Eulerian numbers, Bell numbers, poly-Bernoulli numbers, and the number of equivalence classes of fuzzy subsets of Xn under a certain equivalence relation.
- Full Text:
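The abstract's counting objects have simple recurrences: the number of preferential arrangements of an n-set (the Fubini numbers) follows from conditioning on the first block, and a BPA with a given number of bars is just a sequence of (possibly empty) PAs, one per section between bars. A minimal sketch of both counts (function names are my own, not from the thesis):

```python
from math import comb
from functools import lru_cache

@lru_cache(maxsize=None)
def fubini(n: int) -> int:
    """Number of preferential arrangements (ordered set partitions) of an n-set."""
    if n == 0:
        return 1
    # Condition on the size k of the first block.
    return sum(comb(n, k) * fubini(n - k) for k in range(1, n + 1))

@lru_cache(maxsize=None)
def bpa(n: int, bars: int) -> int:
    """Number of barred preferential arrangements of an n-set with `bars` bars.

    The bars split the arrangement into bars + 1 (possibly empty) sections,
    each of which is itself a preferential arrangement.
    """
    if bars == 0:
        return fubini(n)
    # Choose the k elements placed in the section before the first bar.
    return sum(comb(n, k) * fubini(k) * bpa(n - k, bars - 1)
               for k in range(n + 1))
```

For example, fubini gives the classical sequence 1, 1, 3, 13, 75, 541, ..., and with one bar the two elements of a 2-set can be arranged in 8 ways (both before the bar in 3 ways, both after in 3 ways, or split across the bar in 2 ways).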
Cosmological structure formation using spectral methods
- Authors: Funcke, Michelle
- Date: 2016
- Language: English
- Type: Thesis , Masters , MSc
- Identifier: http://hdl.handle.net/10962/2969 , vital:20348
- Description: Numerical simulations are becoming an increasingly important tool for understanding the growth and development of structure in the universe. Common practice is to discretize the space-time using physical variables. The discreteness is embodied by considering the dynamical variables as fields on a fixed spatial and time resolution, or by constructing the matter fields from a large number of particles which interact gravitationally (N-body methods). Recognizing that the physical quantities of interest are related to the spectrum of perturbations, we propose an alternative discretization in the frequency domain, using standard spectral methods. This approach is further aided by periodic boundary conditions, which allow a straightforward decomposition of variables in a Fourier basis. Fixed resources require a high-frequency cut-off, which leads to aliasing effects in non-linear equations such as the ones considered here. This thesis describes the implementation of a 3D cosmological model based on Newtonian hydrodynamic equations in an expanding background. Initial data is constructed as a spectrum of perturbations, and evolved in the frequency domain using a pseudo-spectral evolution scheme and an explicit Runge-Kutta time integrator. The code is found to converge for both linear and non-linear evolutions, and the convergence rate is determined. The correct growth rates expected from analytical calculations are recovered in the linear case. In the non-linear model, we observe close correspondence with linear growth and are able to monitor the growth of features associated with the non-linearity.
High-frequency aliasing effects were evident in the non-linear evolutions, leading to a study of two potential resolutions to this problem: a boxcar filter which adheres to “Orszag’s two-thirds rule”, and an exponential window function: the exponential filter suggested by Hou and Li [1], together with a shifted version of that filter, which has the potential to alleviate high-frequency ripples resulting from the Gibbs phenomenon. We found that the filters were somewhat successful at reducing aliasing effects, but that the Gibbs phenomenon could not be entirely removed by the choice of filters.
- Full Text:
Observational cosmology with imperfect data
- Authors: Bester, Hertzog Landman
- Date: 2016
- Language: English
- Type: Thesis , Doctoral , PhD
- Identifier: http://hdl.handle.net/10962/463 , vital:19961
- Description: We develop a formalism suitable for inferring the background geometry of a general spherically symmetric dust universe directly from data on the past lightcone. This direct observational approach makes minimal assumptions about inaccessible parts of the Universe. The non-parametric and Bayesian framework we propose provides a very direct way to test one of the most fundamental underlying assumptions of concordance cosmology, viz. the Copernican principle. We present the Copernicus algorithm for this purpose. By applying the algorithm to currently available data, we demonstrate that it is not yet possible to confirm or refute the validity of the Copernican principle within the proposed framework. This is followed by an investigation which aims to determine which future data will best be able to test the Copernican principle. Our results on simulated data suggest that, besides the need to improve the current data, it will be important to identify additional model-independent observables for this purpose. The main difficulty with current data is their inability to constrain the value of the cosmological constant. We show how redshift drift data could be used to infer its value with minimal assumptions about the nature of the early Universe. We also discuss some alternative applications of the algorithm.
- Full Text:
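For context on the redshift drift observable mentioned in the abstract: in the homogeneous FLRW limit (the thesis itself works with more general spherically symmetric models) the drift of a source's redshift with observer proper time reduces to the standard Sandage-Loeb relation, which ties the drift directly to the expansion history and hence to the cosmological constant:

```latex
\frac{\mathrm{d}z}{\mathrm{d}t_0} = (1+z)\,H_0 - H(z)
```

Since H(z) carries the cosmological constant through the Friedmann equation, a measured drift at even a single redshift constrains it with no assumptions about the early Universe.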