Modelado y simulación de nanoestructuras 3D [Modelling and simulation of 3D nanostructures]
- Scavello, Giovanni
- Pedro Luis Galindo Riaño (Director)
- Joaquín Pizarro Junquera (Co-director)
Defence university: Universidad de Cádiz
Defence date: 24 July 2014
- Zineb Saghi (Chair)
- Elisa Guerrero Vázquez (Secretary)
- José Manuel Jerez Aragonés (Committee member)
Type: Thesis
Abstract
The design of tools for structural analysis is a key issue in many scientific fields. Electron microscopy can almost be regarded as a stand-alone discipline, since it deals with exploiting the capabilities of the electron microscope to obtain quantitative information about the sample under study. A simulation-oriented approach should take into account all the theoretical aspects of the acquisition and image formation process, as well as the interaction of the electron beam with the sample. Nevertheless, it is almost impossible to include every theoretical aspect in the description of complex physical phenomena, and simplifications are usually introduced to manage non-linearities. Such approximations must be carefully assessed in order to keep the mathematical model consistent with reality. When simulation is included in a quantitative analysis methodology, the amount of data to be processed is often huge, especially when a realistic structure is to be modelled and analysed. For this reason, parallel computing plays a central role in this methodology.

The relationships between theoretical tomographic models and the experimental data obtained from electron microscopes have been investigated, focusing on the HAADF-STEM imaging mode. The whole reconstruction process has been analysed in depth and all of its stages assessed. Although the reconstruction of tomographic series can nowadays be considered a standard procedure, a thorough knowledge of the details of image formation and of the underlying mathematical models leads to better results, especially when compared with the available automatic tools. Since reconstruction algorithms have proved reliable and their effectiveness has been widely investigated, the focus has been moved towards the processing of the input data. A new normalization method has been proposed, with the aim of recovering a better match between the linear models used for the reconstruction and the strongly non-linear behaviour present in the acquired data. It has been shown that this mismatch can lead to improper segmentation results, and that the ideal segmentation step should not require any user intervention and should depend only on the actual values of the reconstructed volume.

Classical alignment methods based on cross-correlation techniques are effective but depend on choices made by the user. Needle-shaped samples are usually aligned under the assumption that the needle tilts around its own axis. This is in fact a very strong assumption because, during placement in the holder, the sample can suffer shifts or movements that invalidate it. By designing an ad-hoc filtering process, the informative content of the sample can be recovered and used to align the series; a generic cross-correlation alignment step is sketched below. In this way the time needed for the alignment is greatly reduced and, above all, the uncertainty in the determination of the tilt axis is minimized.

The problem of the atomic-scale reconstruction of thin-layer materials has also been studied. Instead of estimating the position of the atoms after a continuous reconstruction, the original series has been pre-processed and the reconstruction carried out by exploiting a well-known alignment algorithm. The results show the accuracy and precision of the method, provided that good-quality images are available.
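To illustrate the cross-correlation alignment step discussed above, the following is a minimal, generic sketch in Python/NumPy. It is not the ad-hoc filtering pipeline developed in the thesis; the function names, the sequential choice of reference image, and the integer-pixel accuracy are illustrative assumptions.

```python
import numpy as np

def cross_correlation_shift(ref, img):
    """Return the (row, col) integer shift that, applied with np.roll,
    registers `img` onto `ref`, estimated by FFT-based cross-correlation."""
    corr = np.fft.ifft2(np.fft.fft2(ref) * np.conj(np.fft.fft2(img))).real
    peak = np.array(np.unravel_index(np.argmax(corr), corr.shape))
    size = np.array(corr.shape)
    # Peaks past half the image size correspond to negative (wrapped) shifts.
    wrap = peak > size // 2
    peak[wrap] -= size[wrap]
    return peak

def align_tilt_series(series):
    """Align each projection of a tilt series to the previous, already
    aligned one; `series` has shape (n_tilts, ny, nx)."""
    aligned = [series[0]]
    for img in series[1:]:
        dy, dx = cross_correlation_shift(aligned[-1], img)
        aligned.append(np.roll(img, (dy, dx), axis=(0, 1)))
    return np.stack(aligned)
```

In practice, pre-filtering the projections (as advocated in the thesis) and sub-pixel interpolation of the correlation peak make the estimated shifts considerably more robust than this plain version.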
Overall, it has been shown that using a special-purpose approach instead of a standard one can greatly improve the results of the reconstruction.
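For context, the kind of standard iterative reconstruction against which such special-purpose approaches are compared can be sketched as a plain SIRT loop. The scikit-image radon/iradon projectors, the relaxation value, and the non-negativity constraint below are illustrative assumptions, not the implementation used in the thesis.

```python
import numpy as np
from skimage.transform import radon, iradon

def sirt(sinogram, angles, n_iter=30, relax=0.25):
    """SIRT-style reconstruction of one slice: repeatedly back-project the
    residual between the measured sinogram and the reprojected estimate.
    `sinogram` has shape (detector_width, n_angles); `angles` is in degrees."""
    size = sinogram.shape[0]            # detector width = reconstructed image size
    recon = np.zeros((size, size))
    for _ in range(n_iter):
        residual = sinogram - radon(recon, theta=angles, circle=True)
        # Unfiltered back-projection of the residual drives the update.
        update = iradon(residual, theta=angles, filter_name=None,
                        output_size=size, circle=True)
        recon = np.clip(recon + relax * update, 0, None)   # enforce non-negativity
    return recon
```

A single `iradon` call with the default ramp filter would give the filtered back-projection baseline; the point of the iterative residual loop is that it allows constraints such as non-negativity to be imposed on the reconstructed volume.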