2021-10-28, 20:30–21:00, Grand Ballroom
In an exploratory field such as observational astronomy, it is often non-trivial to evaluate the absolute correctness of a scientific result. It is therefore important to establish a high degree of trust in the algorithms and software used to produce that result. Scientific software developers and their users must be creative, combining simulations, controlled tests and predictions of behavior in new situations to assess the operational and numerical readiness of their tools. When available, accuracy metrics derived from science goals are of immense use, but for a variety of practical reasons they are not always an option.
This talk will use radio interferometric data analysis and image reconstruction as a case study to illustrate the practical complexities of establishing the absolute correctness of scientific software in radio astronomy. Technical, operational and sociological perspectives will be considered, along with lessons learnt and strategies currently being explored to build the requisite trust in the software product. The ideas of reconstruction uncertainty, measurement noise and error propagation across an entire data analysis sequence will be described, alongside key challenges in designing, producing and testing algorithms and software that must run on a variety of operational platforms and operate on data for which the absolute truth is always unknown.
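One of the strategies mentioned above, a controlled test against simulated data, can be sketched in miniature: generate measurements from a known "truth", run the reconstruction, and check that the residual is consistent with the injected measurement noise rather than with an algorithmic error. This toy uses a 1-D sky and full Fourier sampling, so a direct inverse transform stands in for the (much more involved) deconvolution of a real interferometric pipeline; all numbers and names here are illustrative, not from the talk.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 256

# Known 1-D "sky": two point sources of known flux on an N-pixel grid.
true_sky = np.zeros(N)
true_sky[60] = 1.0
true_sky[180] = 0.5

# Simulated "visibilities": the Fourier transform of the sky, perturbed by
# Gaussian noise standing in for thermal measurement noise.
sigma = 0.01
vis = np.fft.fft(true_sky)
vis = vis + sigma * (rng.normal(size=N) + 1j * rng.normal(size=N))

# "Reconstruction": with complete uv-coverage in this toy, the inverse
# transform recovers the sky directly (no deconvolution step needed).
recovered = np.fft.ifft(vis).real

# Correctness check: the per-pixel residual noise has standard deviation
# sigma / sqrt(N), so deviations far beyond that would signal a software
# error rather than measurement noise.
residual = recovered - true_sky
assert np.max(np.abs(residual)) < 5 * sigma
```

The value of such a test is that the truth is known by construction, which is exactly what real sky data never provides; the check separates "wrong because of noise" from "wrong because of the code".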
Solutions for workflow management and reproducibility, Image processing for the public and scientists, Big data: How to deal with the 5 Vs (volume, velocity, variety, veracity, value)