Design patterns and software techniques for large-scale, open and reproducible data reduction
- Authors: Molenaar, Gijs Jan
- Date: 2021
- Subjects: Radio astronomy -- Data processing , Radio astronomy -- Data processing -- Software , Radio astronomy -- South Africa , ASTRODECONV2019 dataset , Radio telescopes -- South Africa , KERN (Computer software)
- Language: English
- Type: text , Thesis , Doctoral , PhD
- Identifier: http://hdl.handle.net/10962/172169 , vital:42172 , 10.21504/10962/172169
- Description: The preparation for the construction of the Square Kilometre Array, and the introduction of its operational precursors such as LOFAR and MeerKAT, mark the beginning of an exciting era for astronomy. New data containing valuable science just waiting for discovery is already being generated, and these instruments will produce far more data than has ever been collected before. However, with every new instrument, data rates grow to unprecedented levels, requiring novel data-processing tools. In addition, creating science-grade data from the raw data still requires significant expert knowledge. The software used is often developed by scientists who lack formal training in software development, so it rarely progresses beyond prototype quality. In the first chapter, we explore various organisational and technical approaches to address these issues by providing a historical overview of the development of radio astronomy pipelines since the inception of the field in the 1940s. In it, the steps required to create a radio image are investigated. We used the lessons learned to identify patterns in the challenges experienced, and the solutions created to address these over the years. The second chapter describes the mathematical foundations that are essential for radio imaging. In the third chapter, we discuss the production of the KERN Linux distribution, a set of software packages containing most radio astronomy software currently in use. Considerable effort was put into making sure that the contained software installs correctly, with all packages coexisting on the same system. Where required and possible, bugs and portability issues were fixed and reported to the upstream maintainers. The KERN project also has a website and issue tracker, where users can report bugs and maintainers can coordinate the packaging effort and new releases. 
The software packages can be used inside Docker and Singularity containers, enabling their installation on a wide variety of platforms. In the fourth and fifth chapters, we discuss methods and frameworks for combining the available data reduction tools into recomposable pipelines, and introduce the Kliko specification and software. This framework was created to enable end-user astronomers to chain and containerise operations of software in KERN packages. Next, we discuss the Common Workflow Language (CommonWL), a similar but more advanced and mature pipeline framework originating in bioinformatics. CommonWL is already supported by a wide range of tools, among them schedulers, visualisers and editors. Consequently, a pipeline written in CommonWL can be deployed and manipulated with a wide range of tools. In the final chapter, we attempt something unconventional: applying a generative adversarial network based on deep learning techniques to the task of sky brightness reconstruction. Since deep learning methods often require a large number of training samples, we constructed a CommonWL simulation pipeline for creating dirty images and corresponding sky models. This simulated dataset has been made publicly available as the ASTRODECONV2019 dataset. We show that this method can perform the restoration and matches the performance of a single clean cycle. In addition, we incorporated domain knowledge by adding the point spread function to the network and by utilising a custom loss function during training. Although it was not possible to improve on the cleaning performance of commonly used existing tools, the computational time performance of the approach looks very promising. We suggest that further studies start from a smaller scope, and that optimising the training of the neural network could produce the desired results.
- Full Text:
- Date Issued: 2021
CubiCal: a fast radio interferometric calibration suite exploiting complex optimisation
- Authors: Kenyon, Jonathan
- Date: 2019
- Subjects: Interferometry , Radio astronomy , Python (Computer program language) , Square Kilometre Array (Project)
- Language: English
- Type: text , Thesis , Doctoral , PhD
- Identifier: http://hdl.handle.net/10962/92341 , vital:30711
- Description: The advent of the Square Kilometre Array and its precursors marks the start of an exciting era for radio interferometry. However, with new instruments producing unprecedented quantities of data, many existing calibration algorithms and implementations will be hard-pressed to keep up. Fortunately, it has recently been shown that the radio interferometric calibration problem can be expressed concisely using the ideas of complex optimisation. The resulting framework exposes properties of the calibration problem which can be exploited to accelerate traditional non-linear least-squares algorithms. We extend the existing work on the topic by considering the more general problem of calibrating a Jones chain: the product of several unknown gain terms. We also derive specialised solvers for performing phase-only, delay and pointing-error calibration. In doing so, we devise a method for determining update rules for arbitrary, real-valued parametrisations of a complex gain. The solvers are implemented in an optimised Python package called CubiCal. CubiCal makes use of Cython to generate fast C and C++ routines for computationally demanding tasks, whilst leveraging multiprocessing and shared memory to take advantage of modern, parallel hardware. The package is fully compatible with the measurement set, the most common format for interferometer data, and is well integrated with Montblanc, a third-party package which implements optimised model visibility prediction. CubiCal's calibration routines are applied successfully to both simulated and real data for the field surrounding the source 3C147. These tests include direction-independent and direction-dependent calibration, as well as tests of the specialised solvers. Finally, we conduct extensive performance benchmarks and verify that CubiCal convincingly outperforms its most comparable competitor.
- Full Text:
- Date Issued: 2019
Observing cosmic reionization with PAPER: polarized foreground simulations and all sky images
- Authors: Nunhokee, Chuneeta Devi
- Date: 2019
- Subjects: Cosmic background radiation , Astronomy -- Observations , Epoch of reionization -- Research , Hydrogen -- Spectra , Radio interferometers
- Language: English
- Type: text , Thesis , Doctoral , PhD
- Identifier: http://hdl.handle.net/10962/68203 , vital:29218
- Description: The Donald C. Backer Precision Array to Probe the Epoch of Reionization (PAPER, Parsons et al., 2010) was built with the aim of detecting the redshifted 21 cm hydrogen line, which is likely the best probe of the thermal evolution of the intergalactic medium and the reionization of neutral hydrogen in our Universe. Observations of the 21 cm signal are challenged by bright astrophysical foregrounds and systematics that require precise modeling in order to extract the cosmological signal. In particular, the instrumental leakage of polarized foregrounds may contaminate the 21 cm power spectrum. In this work, we developed a formalism to describe the leakage due to instrumental wide-field effects in visibility-based power spectra and used it to predict contamination in observations. We find the leakage due to a population of point sources to be higher than that due to diffuse Galactic emission, for which we predict minimal contamination at k > 0.3 h Mpc⁻¹. We also analyzed data from the last observing season of PAPER via all-sky imaging with a view to characterizing the foregrounds. We generated an all-sky catalogue of 88 sources down to a flux density of 5 Jy. Moreover, we measured both the polarized point-source emission and the diffuse Galactic emission, and used these measurements to constrain our model of polarization leakage. We find the leakage due to a population of point sources to be 12% lower than the prediction from our polarized model.
- Full Text:
- Date Issued: 2019
Statistical Analysis of the Radio-Interferometric Measurement Equation, a derived adaptive weighting scheme, and applications to LOFAR-VLBI observation of the Extended Groth Strip
- Authors: Bonnassieux, Etienne
- Date: 2019
- Subjects: Radio astronomy , Astrophysics , Astrophysics -- Instruments -- Calibration , Imaging systems in astronomy , Radio interferometers , Radio telescopes , Astronomy -- Observations
- Language: English
- Type: text , Thesis , Doctoral , PhD
- Identifier: http://hdl.handle.net/10962/93789 , vital:30942
- Description: J.R.R. Tolkien wrote, in his Mythopoeia, that “He sees no stars who does not see them first, of living silver made that sudden burst, to flame like flowers beneath the ancient song”. In his defense of myth-making, he formulates the argument that the attribution of meaning is an act of creation - that “trees are not ‘trees’ until so named and seen” - and that this capacity for creation defines the human creature. The scientific endeavour, in this context, can be understood as a social expression of a fundamental feature of humanity, and from this endeavour flows much understanding. This thesis, one thread among many, focuses on the study of astronomical objects as seen by the radio waves they emit. What are radio waves? Electromagnetic waves were theorised by James Clerk Maxwell (Maxwell 1864) in his great theoretical contribution to modern physics, their speed matching the speed of light as measured by Ole Christensen Rømer and, later, James Bradley. It was not until Heinrich Rudolf Hertz’s 1887 experiment that these waves were measured in a laboratory, leading to the dawn of radio communications - and, later, radio astronomy. The link between radio waves and light was one of association: light is known to behave as a wave (Young double-slit experiment), with the same propagation speed as electromagnetic radiation. Light “proper” is also known to exist beyond the optical regime: Herschel’s experiment shows that when diffracted through a prism, sunlight warms even those parts of a desk which are not observed to be lit (first evidence of infrared light). The link between optical light and unseen electromagnetic radiation is then an easy step to make, and one confirmed through countless technological applications (e.g. optical fiber, to name but one). And as soon as this link is established, a question immediately comes to the mind of the astronomer: what does the sky, our Universe, look like to the radio “eye”? 
Radio astronomy has a short but storied history: from Karl Jansky’s serendipitous 1933 observation of the centre of the Milky Way, which outshines our Sun in the radio regime, to Grote Reber’s hand-built back-yard radio antenna in 1937, which successfully detected radio emission from the Milky Way itself, to such monumental projects as the Square Kilometre Array and its multiple pathfinders, it has led to countless discoveries and the opening of a truly new window on the Universe. The work presented in this thesis is a contribution to this discipline - the culmination of three years of study, which is a rather short time to get a firm grasp of radio interferometry both in theory and in practice. The need for robust, automated methods - which are improving daily, thanks to the tireless labour of the scientists in the field - is becoming ever stronger as the SKA approaches, looming large on the horizon; but even today, in the precursor era of LOFAR, MeerKAT and other pathfinders, it is keenly felt. When I started my doctorate, the sheer scale of the task at hand felt overwhelming - to actually be able to contribute to its resolution seemed daunting indeed! Thankfully, as the saying goes, no society sets for itself material goals which it cannot achieve. This thesis took place at an exciting time for radio interferometry: at the start of my doctorate, the LOFAR international stations were - to my knowledge - only beginning to be used, and even then, only tentatively; MeerKAT had not yet shown its first light; the techniques used throughout my work were still being developed. At the time of writing, great strides have been made. One of the greatest technical challenges of LOFAR - imaging using the international stations - is starting to become reality. This technical challenge is the key problem that this thesis set out to address. While we only achieved partial success so far, it is a testament to the difficulty of the task that it is not yet truly resolved. 
One of the major results of this thesis is a model of a bright resolved source near a famous extragalactic field: properly modeling this source not only allows the use of international LOFAR stations, but also grants deeper access to the extragalactic field itself, which is otherwise polluted by the 3C source’s sidelobes. This result was only achieved thanks to the other major result of this thesis: the development of a theoretical framework with which to better understand the effect of calibration errors on images made from interferometric data, and an algorithm to strongly mitigate them. The structure of this manuscript is as follows: we begin with an introduction to radio interferometry, LOFAR, and the emission mechanisms which dominate for our field of interest. These introductions are primarily intended to give a brief overview of the technical aspects of the data reduced in this thesis. We follow with an overview of the Measurement Equation formalism, which underpins our theoretical work. This is the keystone of this thesis. We then show the theoretical work that was developed as part of the research work done during the doctorate - which was published in Astronomy & Astrophysics. Its practical application - a quality-based weighting scheme - is used throughout our data reduction. This data reduction is the next topic of this thesis: we contextualise the scientific interest of the data we reduce, and explain both the methods and the results we achieve.
- Full Text:
- Date Issued: 2019
Advanced radio interferometric simulation and data reduction techniques
- Authors: Makhathini, Sphesihle
- Date: 2018
- Subjects: Interferometry , Radio interferometers , Algorithms , Radio telescopes , Square Kilometre Array (Project) , Very Large Array (Observatory : N.M.) , Radio astronomy
- Language: English
- Type: text , Thesis , Doctoral , PhD
- Identifier: http://hdl.handle.net/10962/57348 , vital:26875
- Description: This work shows how legacy and novel radio interferometry software packages and algorithms can be combined to produce high-quality reductions from modern telescopes, as well as end-to-end simulations for upcoming instruments such as the Square Kilometre Array (SKA) and its pathfinders. We first use a MeqTrees-based simulations framework to quantify how artefacts due to direction-dependent effects accumulate with time, and the consequences of this accumulation when observing the same field multiple times in order to reach the survey depth. Our simulations suggest that a survey like LADUMA (Looking At the Distant Universe with the MeerKAT Array), which aims to achieve its survey depth of 16 µJy/beam in a 72 kHz channel at 1.42 GHz by observing the same field for 1000 hours, will be able to reach its target depth in the presence of these artefacts. We also present stimela, a system-agnostic scripting framework for simulating, processing and imaging radio interferometric data. This framework is then used to write an end-to-end simulation pipeline in order to quantify the resolution and sensitivity of the SKA1-MID telescope (the first phase of the SKA mid-frequency telescope) as a function of frequency, as well as the scale-dependent sensitivity of the telescope. Finally, a stimela-based reduction pipeline is used to process data of the field around the source 3C147, taken by the Karl G. Jansky Very Large Array (VLA). The reconstructed image from this reduction has a typical 1σ noise level of 2.87 µJy/beam, and consequently a dynamic range of 8×10⁶:1, given the 22.58 Jy/beam flux density of the source 3C147.
- Full Text:
- Date Issued: 2018
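The dynamic-range figure quoted in the abstract above is simply the ratio of the peak flux density to the 1σ image noise, expressed in the same units. A minimal sketch of that arithmetic, using only the values stated in the abstract:

```python
# Dynamic range = peak flux density / image noise (both in Jy/beam).
peak_jy = 22.58        # flux density of 3C147, Jy/beam
noise_jy = 2.87e-6     # 1-sigma image noise, Jy/beam (2.87 uJy/beam)

dynamic_range = peak_jy / noise_jy
print(f"dynamic range ~ {dynamic_range:.2e}:1")  # ~ 7.87e+06:1, i.e. roughly 8x10^6:1
```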