Solving partial differential equations on surfaces has many applications, including modeling chemical diffusion, pattern formation, geophysics, and texture mapping. This dissertation presents two techniques for solving time-dependent partial differential equations on various surfaces using the partition of unity method. A novel spectral cubed-sphere method that utilizes the windowed Fourier technique is presented and used both for approximating functions on spherical domains and for solving partial differential equations. The spectral cubed-sphere method is applied to solve the transport equation as well as the diffusion equation on the unit sphere. The second approach is a partition of unity method with local radial basis function approximations. This technique is also used to explore the effect of the node distribution, as it is well known that node choice plays an important role in the accuracy and stability of an approximation. A greedy algorithm based on the column-pivoted QR factorization is implemented to generate good interpolation nodes. The partition of unity radial basis function method is applied to solve the diffusion equation on the sphere as well as a system of reaction-diffusion equations on multiple surfaces, including the surface of a red blood cell, a torus, and the Stanford bunny. The accuracy and stability of both methods are investigated.
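The dissertation's node-selection code is not reproduced here, but the greedy column-pivoted QR idea can be sketched as follows: build an RBF interpolation matrix on a large candidate set, let the pivoted QR rank the columns (candidate centers) by how much new information each contributes, and keep the first k pivots as interpolation nodes. The Gaussian kernel, the shape parameter, and the random candidate set below are illustrative assumptions, not the dissertation's actual choices.

```python
import numpy as np
from scipy.linalg import qr

# Candidate nodes on the unit sphere: random points projected to the sphere
# (a hypothetical candidate set for illustration).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
X /= np.linalg.norm(X, axis=1, keepdims=True)

# Gaussian RBF matrix on the candidate set (shape parameter is a guess).
eps = 2.0
d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
A = np.exp(-eps**2 * d2)

# Column-pivoted QR orders the columns (candidate centers) greedily by the
# norm of their residual against the span of already-chosen columns; the
# first k pivot indices give a well-conditioned set of interpolation nodes.
k = 40
_, _, piv = qr(A, pivoting=True)
nodes = X[piv[:k]]
```

The same ranking can be reused at different k, since the pivot order is computed once for the whole candidate set.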
Date Created
The date the item was originally created (prior to any relationship with the ASU Digital Repositories).
High-dimensional systems are difficult to model and predict. The underlying mechanisms of such systems are too complex to be fully understood with limited theoretical knowledge and/or physical measurements. Nevertheless, reduced-order models have been widely used to study high-dimensional systems because they are practical and efficient to develop and implement. Although model errors (biases) are inevitable in reduced-order models, these models can still prove useful for developing real-world applications. Evaluating and validating such idealized models is indispensable to that end. Data assimilation and uncertainty quantification provide a way to assess the performance of a reduced-order model: real data and a dynamical model are combined in a data assimilation framework to generate corrected model forecasts of a system, and uncertainties in model forecasts and observations are quantified in each data assimilation cycle to provide optimal updates that are representative of the real dynamics. In this research, data assimilation is applied to assess the performance of two reduced-order models. The first model is developed to predict prostate cancer treatment response under intermittent androgen suppression therapy. A sequential data assimilation scheme, the ensemble Kalman filter (EnKF), is used to quantify uncertainties in model predictions using clinical data of individual patients provided by the Vancouver Prostate Center. The second model is developed to study what causes changes in the state of the stratospheric polar vortex. Two data assimilation schemes, the EnKF and ES-MDA (ensemble smoother with multiple data assimilation), are used to validate the qualitative properties of the model using ECMWF (European Centre for Medium-Range Weather Forecasts) reanalysis data. In both studies, the reduced-order model is able to reproduce the data patterns and provides insight into the underlying mechanism.
However, significant model errors are also diagnosed for both models from the results of the data assimilation schemes, suggesting specific improvements to the reduced-order models.
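As an illustration of the sequential scheme used in both studies, here is a minimal stochastic (perturbed-observation) EnKF analysis step on a toy two-state system. The function name, toy dimensions, and noise levels are assumptions for the sketch; the dissertation's filters operate on the actual prostate cancer and polar vortex models.

```python
import numpy as np

def enkf_update(Xf, y, H, R, rng):
    """One stochastic EnKF analysis step.

    Xf : (n, N) forecast ensemble; y : (m,) observation;
    H : (m, n) observation operator; R : (m, m) obs-error covariance.
    """
    n, N = Xf.shape
    xbar = Xf.mean(axis=1, keepdims=True)
    A = Xf - xbar                                   # ensemble anomalies
    Pf = A @ A.T / (N - 1)                          # sample forecast covariance
    K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)  # Kalman gain
    # Perturbed observations keep the analysis spread statistically consistent.
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, size=N).T
    return Xf + K @ (Y - H @ Xf)

rng = np.random.default_rng(1)
Xf = rng.normal(size=(2, 50))      # toy 2-state, 50-member forecast ensemble
H = np.array([[1.0, 0.0]])         # observe the first state only
R = np.array([[0.1]])
Xa = enkf_update(Xf, np.array([0.5]), H, R, rng)
```

The analysis ensemble mean is pulled toward the observation in proportion to the gain, while the unobserved state is updated through the sample cross-covariance.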
Atmospheric turbulence distorts the path of light passing through the air. When capturing images at long range, this turbulence can cause substantial geometric distortion and blur in images and videos, degrading image quality. These effects become more pronounced with greater turbulence, scaling with the refractive index structure constant, Cn2. Removing the effects of atmospheric turbulence from images has a range of applications, from astronomical imaging to surveillance. Thus, there is great utility in transforming a turbulent image into a ``clean image'' undegraded by turbulence. However, because the turbulence is space- and time-variant and statistically random, no closed-form solution exists for a function that performs this transformation. Prior attempts to approximate the solution include spatio-temporal models and lucky-frame models, which require many images to provide a good approximation, and supervised neural networks, which rely on large amounts of simulated or difficult-to-acquire real training data and can struggle to generalize. The first contribution of this thesis is an unsupervised neural-network-based model that performs image restoration for atmospheric turbulence with state-of-the-art performance. The model consists of a grid deformer, which produces an estimated distortion field, and an image generator, which estimates the distortion-free image. The model is transferable across different datasets; its efficacy is demonstrated on multiple datasets and on both air and water turbulence. The second contribution is a supervised neural network that predicts Cn2 directly from the warp field. This network was trained on a wide range of Cn2 values and estimates Cn2 with relatively good accuracy. When applied to the warp field produced by the unsupervised model, it yields a Cn2 estimate that requires only a few images and no prior knowledge of the ground truth or of the turbulence.
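To give a concrete sense of the geometric distortion being removed, the following toy sketch warps an image with a smooth random displacement field whose magnitude plays the role of Cn2. This is a crude stand-in with assumed names and parameters: real turbulence has structured spatio-temporal statistics, and the thesis's grid deformer estimates, rather than simulates, the warp.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, map_coordinates

def warp_image(img, strength, rng, smooth=5.0):
    """Apply a smooth random geometric distortion to a grayscale image.

    Filtered white noise gives a spatially smooth displacement field;
    `strength` scales the displacement, loosely analogous to increasing Cn2.
    """
    h, w = img.shape
    dy = gaussian_filter(rng.normal(size=(h, w)), smooth) * strength
    dx = gaussian_filter(rng.normal(size=(h, w)), smooth) * strength
    yy, xx = np.mgrid[0:h, 0:w].astype(float)
    # Bilinear resampling of the image at the displaced grid locations.
    return map_coordinates(img, [yy + dy, xx + dx], order=1, mode='reflect')

rng = np.random.default_rng(0)
img = np.zeros((64, 64))
img[16:48, 16:48] = 1.0                 # toy square target
warped = warp_image(img, strength=20.0, rng=rng)
```

Restoration methods like the thesis's model attempt to invert exactly this kind of resampling, recovering both the displacement field and the undistorted image.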
The NBA yields billions of dollars each year and serves as a pastime and hobby for millions of Americans. However, many people do not have the time to watch several two-hour games every week, especially when only a fraction of each game is actually exciting footage. The goal of Sports Summary is to take the ``fluff'' out of these games and create a distilled summary that includes only the most exciting and relevant events. The Sports Summary model records visual and auditory data, camera angles, and game-clock readings and correlates them with the game's play-by-play data. On average, a game of more than two hours is shortened to a summary of less than twenty minutes. This summary is then uploaded to the Sports Summary website, where users can filter by the type of event, giving them more autonomy and a more comprehensive viewing experience than highlight reels. Additionally, the website allows users to submit footage they would like processed for later viewing. Sports Summary creates an enjoyable and accessible way to watch games.
Dengue is a mosquito-borne arboviral disease that imposes a significant public health burden in many tropical and subtropical parts of the world (where dengue is endemic). This dissertation uses mathematical modeling approaches, coupled with rigorous analysis and computation, to study the transmission dynamics and control of dengue disease. In Chapter 2, a new deterministic model was designed and used to assess the impact of local temperature fluctuation and vertical (transovarial) transmission in mosquitoes on the population abundance of dengue mosquitoes and the disease burden in a population. The model, which takes the form of a deterministic system of nonlinear differential equations, was parametrized using data from the Chiang Mai province of Thailand. The disease-free equilibrium of the model was shown to be globally asymptotically stable when a certain epidemiological quantity is less than unity. Vertical transmission was shown to have only a marginal impact on the disease dynamics, and its effect is temperature-dependent. Dengue burden in the province is maximized when the mean monthly temperature lies in the range [26–28]°C. A new deterministic model was designed in Chapter 3 to assess the impact of releasing Wolbachia-infected mosquitoes on curtailing the mosquito population and dengue disease in a population. The model, which stratifies the mosquito population by sex and Wolbachia-infection status, was rigorously analysed to characterize the bifurcation properties of the model as well as the asymptotic stability of its various disease-free equilibria.
Simulations, using Wolbachia-based mosquito control data from Queensland, Australia, showed that the frequent release of mosquitoes infected with the bacterium can lead to effective control of the local wild mosquito population, and that this control becomes more effective as the number of Wolbachia-infected mosquitoes released increases (up to a 90% reduction in the wild mosquito population, from its baseline value, can be achieved). It was also shown that the well-known feature of cytoplasmic incompatibility has very little effect on the effectiveness of Wolbachia-based mosquito control.
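The dissertation's sex-structured model is not reproduced here, but the release-threshold behavior it studies can be illustrated with the classical single-frequency Wolbachia invasion model (Turelli/Hoffmann style), augmented with a crude release term. All parameter values and the release mechanism below are illustrative assumptions, not the dissertation's.

```python
import numpy as np

def wolbachia_frequency(p0, sf, sh, gens, release=0.0):
    """Iterate a classical Wolbachia infection-frequency model.

    p : infection frequency; sf : fecundity cost of infection;
    sh : strength of cytoplasmic incompatibility (CI).
    `release` is the fraction of the population replaced each generation
    by released infected mosquitoes (a crude stand-in for the release
    programs simulated in Chapter 3).
    """
    p = p0
    traj = [p]
    for _ in range(gens):
        wbar = 1 - sf * p - sh * p * (1 - p)   # mean fitness under CI
        p = p * (1 - sf) / wbar                # selection step
        p = p + release * (1 - p)              # supplemental releases
        traj.append(p)
    return np.array(traj)

# Below the unstable threshold sf/sh the infection dies out on its own;
# sustained releases push the frequency past it, after which it fixes.
no_rel = wolbachia_frequency(0.05, sf=0.1, sh=0.8, gens=60)
with_rel = wolbachia_frequency(0.05, sf=0.1, sh=0.8, gens=60, release=0.05)
```

This bistability (an unstable interior threshold between extinction and fixation) is the frequency-level analogue of the bifurcation structure analysed for the full model.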
The goal of this thesis is to explore and present a range of approaches to “algorithmic choreography.” In the context of this thesis, algorithmic choreography is defined as choreography with computational influence or elements. Traditionally, algorithmic choreography has been used as an umbrella term for all works that involve computation, despite covering works that use computation in very different ways. This thesis intends to show that the diversity of algorithmic choreography can be organized into more specific categories. Since algorithmic choreography is fundamentally intertwined with the concept of computation, it is natural to place algorithmic works on a spectrum defined by the extent of computation's involvement in each piece. This thesis outlines three primary categories into which algorithmic works can fall: pieces that involve minimal computational influence, pieces that are entirely computationally generated, and pieces that lie in between. Three original works were created, one reflecting each of these categories; they provide examples of the various methods by which computation can influence and enhance choreography. The first piece, entitled Rαinwater, displays a minimal amount of computational influence: the use of space in the piece was limited to random, computationally generated paths, from which the dancers extracted a narrative element. The result is a piece that explores the dancers' emotional interactions within the context of a rainy environment. The second piece, entitled Mymec, utilizes an intermediate amount of computation. The piece sees a dancer interact with a projected display of an Ant Colony Optimization (ACO) algorithm, taking direct inspiration from the movement of the virtual ants and embodying the visualization of the algorithm. The final piece, entitled nSkeleton, exhibits maximal computational influence.
Kinect position data was manipulated using iterative methods from computational mathematics to create computer-generated movement to be performed by a dancer on stage. Each piece was originally intended to be presented to the public as part of an evening-length show. However, due to the rise of the COVID-19 pandemic caused by the novel coronavirus, all public campus events have been canceled and the government has recommended that gatherings of more than 10 people be avoided entirely. The pieces will therefore instead be presented in a video published online, encompassing information about the creation of each piece as well as clips of choreography.
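The thesis does not specify the path-generation algorithm used in Rαinwater; one plausible minimal sketch of "random, computationally generated paths" is a smoothed random walk confined to the stage. The function name, stage dimensions, and step sizes below are entirely hypothetical.

```python
import numpy as np

def random_stage_path(steps, stage=(10.0, 8.0), step_len=0.8, seed=None):
    """Generate a random path across a rectangular stage.

    Headings drift smoothly rather than jumping, so the path is danceable,
    and positions are clipped to the stage boundary.
    """
    rng = np.random.default_rng(seed)
    w, h = stage
    pos = np.array([w / 2, h / 2])            # start at center stage
    heading = rng.uniform(0, 2 * np.pi)
    path = [pos.copy()]
    for _ in range(steps):
        heading += rng.normal(scale=0.5)      # small random turn each step
        pos = pos + step_len * np.array([np.cos(heading), np.sin(heading)])
        pos = np.clip(pos, [0.0, 0.0], [w, h])  # stay on stage
        path.append(pos.copy())
    return np.array(path)

path = random_stage_path(100, seed=3)
```

A generated path like this could then be handed to dancers as a spatial score, leaving the narrative interpretation to them, as described above.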