The Rhythm Harmonizer is a drum machine synthesizer with an experimental twist. The rhythm programming is achieved through square wave oscillator pulses. This may seem limiting, as it only allows for even, metronome-like patterns, but by adjusting the speed of each voice independently, anything from triplet grooves to four-on-the-floor rock beats can be created. Furthermore, the use of pulse divisions encourages polymeter, a technique that is uncommon in popular music. Another advantage of the pulse programming method is that the tempo of the drums can be pushed so fast that the hits blur into sustained notes. Thus, the device can seamlessly transition from drum machine to synthesizer drone.
Date Created
The date the item was originally created (prior to any relationship with the ASU Digital Repositories).
The post-industrial era ushered in significant advancements in global living standards, largely driven by technological innovations. The events of the 20th century shaped how these innovations were integrated into American culture, particularly influencing consumption habits. The broad shift to reliance on single-use materials led to concerns about resource exploitation and environmental sustainability. Recycling stands as a vital tool in mitigating these concerns while advancing sustainability goals and circular material life cycles. Although recycling is an important concept in material reuse, the United States recycling infrastructure faces major inefficiencies that prevent it from achieving its optimal benefits. Investigating the growth of curbside recycling and the consequences of China's ban on imported recycling materials reveals failures within the recycling system. Once identified, further analysis of these failures emphasizes the use of concepts such as industrial ecology to visualize how industrial materials are influenced by broader multi-dimensional systems. One such level of analysis involves investigating the shortcomings of current recycling technologies and their implementation. However, to provide a fuller explanation of these inefficiencies, analysis of cultural, economic, and political dimensions is necessary. Case studies of recycling systems in different types of U.S. cities, such as San Francisco and Surprise, provide insights into the effectiveness of these dimensions at highlighting core failures. Analysis of these failures also provides a framework in which to engineer possible solutions for recycling systems, ones that emphasize the growth of cohesive recycling infrastructure and leverage legislation to improve recycling rates and the production of more renewable materials.
The recent popularity of ChatGPT has brought into question the future of many lines of work, among them psychotherapy. This thesis aims to determine whether AI chatbots should be used by undergraduates with depression as a form of mental healthcare. Because of barriers to care such as understaffed campus counseling centers, stigma, and issues of accessibility, AI chatbots could perhaps bridge the gap between this demographic and the help they need. This research includes findings from studies, meta-analyses, reports, and Reddit posts from threads documenting people's experiences using ChatGPT as a therapist. Based on these findings, only AI chatbots built specifically for mental health can be considered appropriate for psychotherapeutic purposes. Chatbots that are designed purposefully to discuss mental health with users can provide support to undergraduates with mild to moderate symptoms of depression. AI chatbots that promise companionship should never be used as a form of mental healthcare. ChatGPT should generally be avoided as a form of mental healthcare, except perhaps to ask for referrals to resources. Chatbots not focused on mental health should be trained to respond with referrals to mental health resources and emergency services when they detect inputs related to mental health, and especially to suicidality. In the future, AI chatbots could be used to notify mental health professionals of reported symptom changes in their patients, and to detect patterns that help individuals with depression understand fluctuations in their symptoms. AI more broadly could also be used to enhance therapist training.