A Systematic Visualisation Framework for Radio-Imaging Pipelines
- Authors: Andati, Lexy Acherwa Livoyi
- Date: 2021-04
- Subjects: Radio interferometers , Radio astronomy -- Data processing , Radio astronomy -- Data processing -- Software , Jupyter
- Language: English
- Type: thesis , text , Masters , MSc
- Identifier: http://hdl.handle.net/10962/177338 , vital:42812
- Description: Pipelines for calibration and imaging of radio interferometric data produce many intermediate images and other data products (gain tables, etc.). These often contain valuable information about the quality of the data and the calibration, and can provide the user with useful insights, if only visualised in the right way. However, the deluge of data that we’re experiencing with modern instruments means that most of these products are never looked at, and only the final images and data products are examined. Furthermore, the variety of imaging algorithms currently available, and the range of their options, means that very different results can be produced from the same set of original data. Proper understanding of this requires a systematic comparison that can be carried out both by individual users locally, and by the community globally. We address both problems by developing a systematic visualisation framework based around Jupyter notebooks, enriched with interactive plots based on the Bokeh and Datashader visualisation libraries (a minimal sketch of this plotting stack follows this record). , Thesis (MSc) -- Faculty of Science, Department of Physics and Electronics, 2021
- Date Issued: 2021-04
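As a hedged illustration of the plotting stack named in the abstract above (not code from the thesis itself), the following sketch rasterises a large synthetic point cloud with Datashader and shows an interactive Bokeh view of a subsample inside a Jupyter notebook; the synthetic data and column names are assumptions.

```python
# A minimal sketch, assuming synthetic data: Datashader rasterises a
# million-point cloud (too large to hand to a browser), while Bokeh
# provides interactive pan/zoom on a decimated subsample.
import numpy as np
import pandas as pd
import datashader as ds
import datashader.transfer_functions as tf
from bokeh.io import output_notebook
from bokeh.plotting import figure, show

output_notebook()  # render Bokeh plots inline in the notebook

# Synthetic stand-in for a large table of amplitudes vs. time
n = 1_000_000
df = pd.DataFrame({
    "time": np.random.uniform(0.0, 8 * 3600.0, n),       # seconds
    "amplitude": np.abs(np.random.normal(1.0, 0.1, n)),
})

# Datashader: aggregate all points into a fixed-size raster image; in a
# notebook, `raster` displays as a static image when it is the last
# expression in a cell.
canvas = ds.Canvas(plot_width=800, plot_height=300)
raster = tf.shade(canvas.points(df, "time", "amplitude"), how="log")

# Bokeh: an interactive view of a subsample small enough for the browser
sub = df.sample(5_000)
p = figure(width=800, height=300, x_axis_label="Time [s]",
           y_axis_label="Amplitude", tools="pan,wheel_zoom,box_zoom,reset")
p.scatter(sub["time"], sub["amplitude"], size=1, alpha=0.3)
show(p)
```

A production framework would typically embed the Datashader raster directly inside a Bokeh figure so that interactivity survives at full data volume; the split shown here simply keeps each API call easy to follow.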
A 150 MHz all sky survey with the Precision Array to Probe the Epoch of Reionization
- Authors: Chege, James Kariuki
- Date: 2020
- Subjects: Epoch of reionization -- Research , Astronomy -- Observations , Radio interferometers
- Language: English
- Type: text , Thesis , Masters , MSc
- Identifier: http://hdl.handle.net/10962/117733 , vital:34556
- Description: The Precision Array to Probe the Epoch of Reionization (PAPER) was built to measure the redshifted 21 cm line of hydrogen from cosmic reionization. Such low frequency observations promise to be the best means of understanding the cosmic dawn, when the first galaxies in the universe formed, and the Epoch of Reionization, when the intergalactic medium changed from neutral to ionized. The major challenge to these observations is the presence of astrophysical foregrounds that are much brighter than the cosmological signal. Here, I present an all-sky survey at 150 MHz obtained from the analysis of 300 hours of PAPER observations. Particular focus is given to the calibration and imaging techniques needed to deal with the wide field of view of a non-tracking instrument. The survey covers ~7000 square degrees of the southern sky. From a sky area of 4400 square degrees out of the total survey area, I extract a catalogue of sources brighter than 4 Jy, whose accuracy was tested against the published GLEAM catalogue, yielding a fractional difference rms better than 20% (this comparison is sketched after this record). The catalogue provides an accurate all-sky model of the extragalactic foreground to be used for the calibration of future Epoch of Reionization observations, and to be subtracted from the PAPER observations themselves in order to mitigate the foreground contamination.
- Date Issued: 2020
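As a hedged sketch of the accuracy test described above, and not the thesis code, the following compares matched flux densities against GLEAM and computes the rms fractional difference; the numbers are invented stand-ins.

```python
# Toy check of catalogue accuracy: rms of the fractional flux-density
# difference between matched sources. Values are illustrative only.
import numpy as np

s_paper = np.array([5.2, 8.1, 12.4, 4.6, 30.2])   # hypothetical, Jy
s_gleam = np.array([5.0, 8.6, 11.9, 4.9, 28.8])   # hypothetical, Jy

frac = (s_paper - s_gleam) / s_gleam
rms = np.sqrt(np.mean(frac ** 2))
print(f"fractional difference rms: {rms:.1%}")    # ~5%, under the 20% bound
```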
Finite precision arithmetic in Polyphase Filterbank implementations
- Authors: Myburgh, Talon
- Date: 2020
- Subjects: Radio interferometers , Interferometry , Radio telescopes , Gate array circuits , Floating-point arithmetic , Python (Computer program language) , Polyphase Filterbank , Finite precision arithmetic , MeerKAT
- Language: English
- Type: text , Thesis , Masters , MSc
- Identifier: http://hdl.handle.net/10962/146187 , vital:38503
- Description: MeerKAT is the most sensitive radio telescope in its class, and it is important that systematic effects do not limit the dynamic range of the instrument, preventing this sensitivity from being harnessed for deep integrations. During commissioning, spurious artefacts were noted in the MeerKAT passband, and the root cause was attributed to systematic errors in the digital signal path. Finite precision arithmetic used by the Polyphase Filterbank (PFB) was one of the main factors contributing to the spurious responses, together with bugs in the firmware. This thesis describes a software PFB simulator that was built to mimic the MeerKAT PFB and allow investigation into the origin and mitigation of the effects seen on the telescope. This simulator was used to investigate the effects on signal integrity of various rounding techniques, overflow strategies and dual polarisation processing in the PFB. Using the simulator to investigate a number of different signal levels, bit-widths and algorithmic scenarios gave insight into how the periodic dips occurring in the MeerKAT passband were the result of an implementation using an inappropriate rounding strategy (a toy illustration of rounding bias follows this record). It further indicated how to select the best strategy for preventing overflow while maintaining high quantization efficiency in the FFT. This practice of simulating the design behaviour of the PFB independently of the tools used to design the DSP firmware is a step towards an end-to-end simulation of the MeerKAT system (or any radio telescope using finite precision digital signal processing systems). This would be useful for design, diagnostics, signal analysis and prototyping of the overall instrument.
- Date Issued: 2020
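To make the rounding-strategy point concrete, here is a minimal sketch (my construction, not the MeerKAT simulator): requantising a signal by truncation injects a systematic half-LSB bias, while round-half-to-even averages to zero. The bit width and test signal are illustrative assumptions.

```python
# Fixed-point requantisation with two rounding strategies. Truncation
# (floor) is cheap in hardware but biases every sample by about half an
# LSB; round-half-to-even is unbiased on average.
import numpy as np

def quantize(x, frac_bits, mode):
    scaled = x * 2.0 ** frac_bits
    q = np.floor(scaled) if mode == "truncate" else np.rint(scaled)
    return q / 2.0 ** frac_bits          # back to real units

rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * 0.01 * np.arange(8192)) + rng.normal(0, 1e-3, 8192)

for mode in ("truncate", "round_even"):
    err = quantize(x, frac_bits=8, mode=mode) - x
    print(f"{mode:>10}: mean error {err.mean():+.2e}, rms {err.std():.2e}")
# truncate shows a mean offset near -2**-9; round_even is centred on zero
```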
Observations of diffuse radio emission in the Perseus Galaxy Cluster
- Authors: Mungwariri, Clemence
- Date: 2020
- Subjects: Galaxies -- Clusters , Radio sources (Astronomy) , Radio interferometers , Perseus Galaxy Cluster , Diffuse radio emission
- Language: English
- Type: Thesis , Masters , MSc
- Identifier: http://hdl.handle.net/10962/143325 , vital:38233
- Description: In this thesis we analysed Westerbork observations of the Perseus Galaxy Cluster at 1380 MHz. The observations consist of two different pointings, covering a total of ∼0.5 square degrees: one including the known mini halo and the source 3C 84, the other centred on the source 3C 83.1B. We obtained images with 83 μJy beam⁻¹ and 240 μJy beam⁻¹ rms noise for the two pointings respectively, and achieved a 60 000:1 dynamic range in the image containing the bright source 3C 84. We imaged the mini halo surrounding 3C 84 at high sensitivity, measuring its diameter to be ∼140 kpc and its power to be 4 × 10²⁴ W Hz⁻¹ (a back-of-the-envelope check of this figure follows this record). Its morphology agrees well with that observed at 240 MHz (e.g. Gendron-Marsolais et al., 2017). We measured the flux density of 3C 84 to be 20.5 ± 0.4 Jy at the 2007 epoch, consistent with a factor of ∼2 increase since the 1960s.
- Date Issued: 2020
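As a hedged back-of-the-envelope check of the quoted mini-halo power (my arithmetic, with an assumed Perseus distance of roughly 75 Mpc that is not taken from the thesis), inverting P = 4πD²S implies an integrated flux density of a few Jy at 1380 MHz:

```python
# Invert P = 4*pi*D^2*S to get the implied integrated flux density.
# K-corrections are ignored (Perseus is at z ~ 0.018).
import numpy as np

MPC_M = 3.086e22            # metres per megaparsec
JY = 1e-26                  # W m^-2 Hz^-1 per jansky

P = 4e24                    # W/Hz, mini-halo power quoted in the abstract
D = 75 * MPC_M              # assumed distance to Perseus

S = P / (4 * np.pi * D**2) / JY
print(f"implied integrated flux density ~ {S:.1f} Jy")   # ~6 Jy
```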
Observing cosmic reionization with PAPER: polarized foreground simulations and all sky images
- Authors: Nunhokee, Chuneeta Devi
- Date: 2019
- Subjects: Cosmic background radiation , Astronomy -- Observations , Epoch of reionization -- Research , Hydrogen -- Spectra , Radio interferometers
- Language: English
- Type: text , Thesis , Doctoral , PhD
- Identifier: http://hdl.handle.net/10962/68203 , vital:29218
- Description: The Donald C. Backer Precision Array to Probe the Epoch of Reionization (PAPER, Parsons et al., 2010) was built with the aim of detecting the redshifted 21 cm hydrogen line, which is likely the best probe of the thermal evolution of the intergalactic medium and the reionization of neutral hydrogen in our Universe. Observations of the 21 cm signal are challenged by bright astrophysical foregrounds and systematics that require precise modeling in order to extract the cosmological signal. In particular, the instrumental leakage of polarized foregrounds may contaminate the 21 cm power spectrum. In this work, we developed a formalism to describe the leakage due to instrumental widefield effects in visibility-based power spectra and used it to predict contamination in observations (a toy illustration of polarized leakage follows this record). We find the leakage due to a population of point sources to be higher than that due to diffuse Galactic emission, for which we predict minimal contamination at k > 0.3 h Mpc⁻¹. We also analyzed data from the last observing season of PAPER via all-sky imaging with a view to characterizing the foregrounds. We generated an all-sky catalogue of 88 sources down to a flux density of 5 Jy. Moreover, we measured both the polarized point-source emission and the diffuse Galactic emission, and used these measurements to constrain our model of polarization leakage. We find the leakage due to a population of point sources to be 12% lower than the prediction from our polarized model.
- Date Issued: 2019
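As a toy illustration of polarized leakage (my own construction, far simpler than the visibility-based power-spectrum formalism developed in the thesis), an uncorrected differential gain between the two linear feeds mixes Stokes Q into Stokes I:

```python
# A 2x2 Jones toy model: with linear feeds, the coherency matrix of a
# source is B = diag(I+Q, I-Q). A relative gain error g between the
# feeds leaks Q into the recovered Stokes I at the 2*g level.
import numpy as np

g = 0.05                               # assumed 5% X/Y gain imbalance
E = np.diag([1.0 + g, 1.0 - g])        # toy Jones matrix

I_src, Q_src = 0.0, 1.0                # purely polarized test source
B = np.diag([I_src + Q_src, I_src - Q_src])

B_obs = E @ B @ E.conj().T
I_obs = 0.5 * (B_obs[0, 0] + B_obs[1, 1])
print(f"Stokes I leakage from Q = 1 Jy: {I_obs:.3f} Jy")   # 2*g = 0.10
```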
Statistical Analysis of the Radio-Interferometric Measurement Equation, a derived adaptive weighting scheme, and applications to LOFAR-VLBI observation of the Extended Groth Strip
- Authors: Bonnassieux, Etienne
- Date: 2019
- Subjects: Radio astronomy , Astrophysics , Astrophysics -- Instruments -- Calibration , Imaging systems in astronomy , Radio interferometers , Radio telescopes , Astronomy -- Observations
- Language: English
- Type: text , Thesis , Doctoral , PhD
- Identifier: http://hdl.handle.net/10962/93789 , vital:30942
- Description: J.R.R. Tolkien wrote, in his Mythopoeia, that “He sees no stars who does not see them first, of living silver made that sudden burst, to flame like flowers beneath the ancient song”. In his defense of myth-making, he formulates the argument that the attribution of meaning is an act of creation - that “trees are not ‘trees’ until so named and seen” - and that this capacity for creation defines the human creature. The scientific endeavour, in this context, can be understood as a social expression of a fundamental feature of humanity, and from this endeavour flows much understanding. This thesis, one thread among many, focuses on the study of astronomical objects as seen by the radio waves they emit. What are radio waves? Electromagnetic waves were theorised by James Clerk Maxwell (Maxwell 1864) in his great theoretical contribution to modern physics, their speed matching the speed of light as measured by Ole Christensen Rømer and, later, James Bradley. It was not until Heinrich Rudolf Hertz’s 1887 experiment that these waves were measured in a laboratory, leading to the dawn of radio communications - and, later, radio astronomy. The link between radio waves and light was one of association: light is known to behave as a wave (the Young double-slit experiment), with the same propagation speed as electromagnetic radiation. Light “proper” is also known to exist beyond the optical regime: Herschel’s experiment showed that when sunlight is diffracted through a prism, it warms even those parts of a desk which are not observed to be lit (the first evidence of infrared light). The link between optical light and unseen electromagnetic radiation is then an easy step to make, and one confirmed through countless technological applications (e.g. optical fibre, to name but one). And as soon as this link is established, a question immediately comes to the mind of the astronomer: what does the sky, our Universe, look like to the radio “eye”? Radio astronomy has a short but storied history: from Karl Jansky’s serendipitous observation in 1933 of the centre of the Milky Way, which outshines our Sun in the radio regime, to Grote Reber’s hand-built back-yard radio antenna in 1937, which successfully detected radio emission from the Milky Way itself, to such monumental projects as the Square Kilometre Array and its multiple pathfinders, it has led to countless discoveries and the opening of a truly new window on the Universe. The work presented in this thesis is a contribution to this discipline - the culmination of three years of study, which is a rather short time to get a firm grasp of radio interferometry both in theory and in practice. The need for robust, automated methods - which are improving daily, thanks to the tireless labour of the scientists in the field - is becoming ever stronger as the SKA approaches, looming large on the horizon; but even today, in the precursor era of LOFAR, MeerKAT and other pathfinders, it is keenly felt. When I started my doctorate, the sheer scale of the task at hand felt overwhelming - to actually be able to contribute to its resolution seemed daunting indeed! Thankfully, as the saying goes, no society sets for itself material goals which it cannot achieve.
This thesis took place at an exciting time for radio interferometry: at the start of my doctorate, the LOFAR international stations were - to my knowledge - only beginning to be used, and even then, only tentatively; MeerKAT had not yet seen first light; the techniques used throughout my work were still being developed. At the time of writing, great strides have been made. One of the greatest technical challenges of LOFAR - imaging using the international stations - is starting to become reality. This technical challenge is the key problem that this thesis set out to address. While we have only achieved partial success so far, it is a testament to the difficulty of the task that it is not yet truly resolved. One of the major results of this thesis is a model of a bright resolved source near a famous extragalactic field: properly modeling this source not only allows the use of the international LOFAR stations, but also grants deeper access to the extragalactic field itself, which is otherwise polluted by the 3C source’s sidelobes. This result was only achieved thanks to the other major result of this thesis: the development of a theoretical framework with which to better understand the effect of calibration errors on images made from interferometric data, and an algorithm to strongly mitigate them. The structure of this manuscript is as follows: we begin with an introduction to radio interferometry, LOFAR, and the emission mechanisms which dominate in our field of interest. These introductions are primarily intended to give a brief overview of the technical aspects of the data reduced in this thesis. We follow with an overview of the Measurement Equation formalism, which underpins our theoretical work and is the keystone of this thesis (its common form is sketched after this record). We then present the theoretical work developed during the doctorate, which was published in Astronomy & Astrophysics. Its practical application - a quality-based weighting scheme - is used throughout our data reduction. This data reduction is the next topic of this thesis: we contextualise the scientific interest of the data we reduce, and explain both the methods we use and the results we achieve.
- Date Issued: 2019
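For reference, the Measurement Equation formalism mentioned above is commonly written in the 2×2 Jones form below; the notation here is the generic one, not necessarily the thesis's own.

```latex
% RIME in 2x2 Jones form: V_pq is the visibility measured on baseline
% (p,q); B_s the brightness matrix of source s; E the direction-dependent
% and G the direction-independent Jones terms; ^H the conjugate transpose.
V_{pq} \;=\; G_p \left( \sum_{s} E_{p,s}\, B_s\, E_{q,s}^{H} \right) G_q^{H}
```

Calibration solves for the G (and possibly E) terms; the thesis's statistical analysis concerns how errors in those solutions propagate into the image, which motivates its quality-based weighting scheme.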
Advanced radio interferometric simulation and data reduction techniques
- Authors: Makhathini, Sphesihle
- Date: 2018
- Subjects: Interferometry , Radio interferometers , Algorithms , Radio telescopes , Square Kilometre Array (Project) , Very Large Array (Observatory : N.M.) , Radio astronomy
- Language: English
- Type: text , Thesis , Doctoral , PhD
- Identifier: http://hdl.handle.net/10962/57348 , vital:26875
- Description: This work shows how legacy and novel radio interferometry software packages and algorithms can be combined to produce high-quality reductions from modern telescopes, as well as end-to-end simulations for upcoming instruments such as the Square Kilometre Array (SKA) and its pathfinders. We first use a MeqTrees-based simulation framework to quantify how artefacts due to direction-dependent effects accumulate with time, and the consequences of this accumulation when observing the same field multiple times in order to reach the survey depth. Our simulations suggest that a survey like LADUMA (Looking at the Distant Universe with the MeerKAT Array), which aims to achieve its survey depth of 16 µJy/beam in a 72 kHz channel at 1.42 GHz by observing the same field for 1000 hours, will be able to reach its target depth in the presence of these artefacts. We also present stimela, a system-agnostic scripting framework for simulating, processing and imaging radio interferometric data. This framework is then used to write an end-to-end simulation pipeline in order to quantify the resolution and sensitivity of the SKA1-MID telescope (the first phase of the SKA mid-frequency telescope) as a function of frequency, as well as the scale-dependent sensitivity of the telescope. Finally, a stimela-based reduction pipeline is used to process data of the field around the source 3C147, taken by the Karl G. Jansky Very Large Array (VLA). The reconstructed image from this reduction has a typical 1σ noise level of 2.87 µJy/beam, and consequently a dynamic range of 8 × 10⁶:1, given the 22.58 Jy/beam flux density of the source 3C147 (this arithmetic is checked after this record).
- Date Issued: 2018
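A quick arithmetic check of the dynamic-range figure quoted above, using nothing beyond the numbers already in the abstract:

```python
# Dynamic range = peak flux density / image rms noise.
peak_jy = 22.58        # 3C147, Jy/beam
rms_jy = 2.87e-6       # 1-sigma noise, Jy/beam

print(f"dynamic range ~ {peak_jy / rms_jy:.1e} : 1")   # ~7.9e6 : 1
```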
Data compression, field of interest shaping and fast algorithms for direction-dependent deconvolution in radio interferometry
- Authors: Atemkeng, Marcellin T
- Date: 2017
- Subjects: Radio astronomy , Solar radio emission , Radio interferometers , Signal processing -- Digital techniques , Algorithms , Data compression (Computer science)
- Language: English
- Type: Thesis , Doctoral , PhD
- Identifier: http://hdl.handle.net/10962/6324 , vital:21089
- Description: In radio interferometry, observed visibilities are intrinsically sampled at some interval in time and frequency. Modern interferometers are capable of producing data at very high time and frequency resolution; practical limits on storage and computation costs require that some form of data compression be imposed. The traditional form of compression is simple averaging of the visibilities over coarser time and frequency bins. This has an undesired side effect: the resulting averaged visibilities “decorrelate”, and do so differently depending on the baseline length and averaging interval. This translates into a non-trivial signature in the image domain known as “smearing”, which manifests itself as an attenuation in amplitude towards off-centre sources. With the increasing fields of view and/or longer baselines employed in modern and future instruments, the trade-off between data rate and smearing becomes increasingly unfavourable. Averaging also results in a baseline-length- and position-dependent point spread function (PSF). In this work, we investigate alternative approaches to low-loss data compression. We show that averaging of the visibility data can be understood as a form of convolution by a boxcar-like window function (illustrated after this record), and that by employing alternative baseline-dependent window functions a more favourable interferometer smearing response may be induced. Specifically, we can improve amplitude response over a chosen field of interest and attenuate sources outside the field of interest. The main cost of this technique is a reduction in nominal sensitivity; we investigate the smearing vs. sensitivity trade-off and show that in certain regimes a favourable compromise can be achieved. We show the application of this technique to simulated data from the Jansky Very Large Array and the European Very Long Baseline Interferometry Network. Furthermore, we show that the position-dependent PSF shape induced by averaging can be approximated using linear algebraic properties to effectively reduce the computational complexity of evaluating the PSF at each sky position. We conclude by implementing a position-dependent PSF deconvolution in an imaging and deconvolution framework. Using the Low-Frequency Array radio interferometer, we show that deconvolution with position-dependent PSFs results in higher image fidelity compared to a simple CLEAN algorithm and its derivatives.
- Date Issued: 2017
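The boxcar-averaging picture in the abstract has a standard closed form: averaging a fringe of rate f over an interval T scales its amplitude by sinc(fT), so longer baselines (faster fringes) decorrelate more. A minimal sketch of that relation (textbook result, not the thesis code):

```python
# Amplitude attenuation of a complex fringe after boxcar time averaging.
import numpy as np

def boxcar_attenuation(fringe_rate_hz, t_avg_s):
    # np.sinc(x) = sin(pi*x)/(pi*x), so this is |sin(pi f T)/(pi f T)|
    return np.abs(np.sinc(fringe_rate_hz * t_avg_s))

for rate in (0.001, 0.01, 0.1):        # fringe rate grows with baseline
    att = boxcar_attenuation(rate, t_avg_s=10.0)
    print(f"fringe rate {rate:5.3f} Hz, 10 s average: amplitude x {att:.3f}")
# 1.000, 0.984, 0.000 -> the fastest fringes are smeared away entirely
```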