Speaker
Description
Imaging the sky at the sensitivity and resolution afforded by current and future interferometric telescopes requires applying computationally expensive algorithms to relatively large data volumes. Therefore, in addition to their numerical performance, imaging algorithms must also be designed with computational complexity and runtime performance in mind. As a result, algorithms R&D involves complex interactions between evolving telescope capabilities, scientific use cases, and computing hardware and software technologies.
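As a rough illustration of the kind of computation involved (this is a textbook-style sketch, not the ARDG/ngVLA pipeline), the snippet below forms a "dirty image" by gridding simulated visibilities onto a uv-grid and inverse-FFTing it. The cost grows with the number of visibilities and the image size, and production imagers add convolutional gridding kernels, calibration, w-term corrections, wide-band effects, and deconvolution on top of this. All function names and parameters here (dirty_image, npix, cell) are illustrative assumptions.

```python
# Illustrative sketch only: make a "dirty image" from simulated visibilities
# by nearest-neighbour gridding onto a uv-grid followed by an inverse FFT.
import numpy as np

def dirty_image(u, v, vis, npix=512, cell=1.0):
    """Grid visibilities and inverse-FFT to the image plane.

    u, v : baseline coordinates in units of the uv-cell size `cell`.
    vis  : complex visibilities measured on those baselines.
    """
    grid = np.zeros((npix, npix), dtype=complex)
    iu = np.round(u / cell).astype(int) % npix   # wrap negative uv coordinates
    iv = np.round(v / cell).astype(int) % npix
    np.add.at(grid, (iv, iu), vis)               # accumulate visibilities per uv cell
    # DC sits at index (0, 0), so ifft2 then fftshift centres the image.
    return np.fft.fftshift(np.fft.ifft2(grid)).real

# Toy usage: 10^4 random baselines observing a point source at the phase centre
# (such a source has unit visibility on every baseline).
rng = np.random.default_rng(42)
u = rng.uniform(-200, 200, 10_000)
v = rng.uniform(-200, 200, 10_000)
vis = np.ones(u.shape, dtype=complex)
img = dirty_image(u, v, vis)
print(img.shape, np.unravel_index(img.argmax(), img.shape))  # peak near the centre
```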
In this talk I will briefly describe the working of a radio interferometric telescope and highlight the resulting data processing challenges for imaging with next-generation telescopes like the ngVLA. I will then discuss the general data processing landscape, and the algorithms and computing architecture developed by the NRAO Algorithms R&D Group (ARDG) to navigate this landscape, with a focus on (near-)future needs and on hardware/software technology projections. Recently, in collaboration with the Center for High Throughput Computing, we deployed this architecture on the OSG, PATh, San Diego Supercomputer Center (SDSC), and National Research Platform (NRP) resources to process a large database for the first time. This produced the deepest image of the Hubble Ultra-Deep Field (HUDF) ever made at radio frequencies. I will also briefly discuss this work, the lessons learnt, and the work in progress for the challenges ahead.