The ability to visualize and manipulate individual atoms makes scanning tunnelling microscopy (STM) a powerful tool for studying surface reconstruction, in-situ reactions of single molecules, and the construction of quantum devices. However, the complexity of STM operation and the presence of noise and artifacts in STM images make both data collection and interpretation time-consuming and heavily reliant on the expertise of well-trained professionals. Moreover, the large volumes of data generated by STM can be overwhelming and challenging to analyse with traditional methods. This motivates the development of highly automated and effective image analysis methods. In recent years, the field of data science has advanced rapidly and has been shown to accelerate data collection and analysis in various scientific domains. This thesis represents a pioneering exploration into the integration of data science methodologies with STM. The research journey commences with the collection of a substantial 40-gigabyte dataset from STM experiments conducted on silicon carbide (SiC)-covered graphene. Subsequently, a Python-based ecosystem is developed to manage and analyse this extensive dataset.
Year
2024
Thesis type
Doctoral thesis
Faculty/School
School of Physics
Language
English
Disclaimer
Unless otherwise indicated, the views expressed in this thesis are those of the author and do not necessarily represent the views of the University of Wollongong.