The Performance Based Studies Research Group (PBSRG) has developed industry-tested leadership and management techniques that have been proven to increase organizational performance. The Leadership Society of Arizona (LSA) has worked closely with PBSRG to develop an educational framework that introduces these leadership concepts to college students. LSA is now endeavoring to make this curriculum more accessible to K-12 students and educators. As part of a thesis creative project, the author has developed a strategy to connect with local high schools and enable teachers and students to engage with professional industry and higher education. This strategy will allow LSA to connect with up to 150 high school students over the summer of 2016. By making this education easily accessible, the author has accomplished a milestone in the larger effort encompassed by LSA. The course chosen for high school students is an abridged variation of the Barrett Honors College course "Deductive Logic: Leadership and Management Techniques". The class framework is designed to establish a self-sustaining program for future summer school courses. The summer school course will allow high school students to learn, understand, and apply college-level concepts in their education, work, and personal lives. The development of the program framework encompasses networking and partnering efforts, creation of a marketing package, and delivery of the summer school course over June and July of 2016.
Date Created
The date the item was originally created (prior to any relationship with the ASU Digital Repositories).
Billions of people around the world deal with the struggles of poverty every day. Consequently, many others have committed themselves to helping alleviate it. A wide variety of methods are used, and there is no current consensus on the best way to alleviate poverty. Generally, the methods used or researched fall somewhere on the spectrum between top-down and bottom-up approaches. This paper analyzes a specific method proposed by C.K. Prahalad known as the Bottom of the Pyramid solution. Its premise is that large multinational corporations should tap, through business ventures, the large aggregate pool of money that exists among poor people (large because of the sheer number of poor people), while the poor in turn benefit from the company's entrance. This method has received theoretical acclaim but still needs empirical evidence to prove its practicality. This paper compares the approach with other approaches, considers international development data trends, and analyzes case studies of actual attempts that provide insight into its potential for success. The market of poor people at the bottom of the pyramid is extremely segmented, which makes it very difficult for large companies to prosper financially; it is even harder to establish mutual benefit between the large corporation and the poor. It was found that although aspects of the Bottom of the Pyramid method hold merit, the potential for alleviating poverty is higher when small companies venture into this space rather than large multinational corporations: small companies can conform to a single community and niche economy, a flexibility that large companies lack. Moving forward, analyzing actual attempts provides the only empirical insight; hence it will be important to evaluate new approaches in developing economies as they materialize.
Access to clean drinking water has been identified by the National Academy of Engineering as one of the Grand Challenges of the 21st century. This thesis investigated clean drinking water access in the greater Phoenix area, specifically with regard to drinking water quality standards and management strategies. This research report provides an introduction to water quality, treatment, and management; a background on the Salt River Project; and an analysis of the source water mix and drinking water quality indicators for water delivered to water treatment facilities in Tempe, Arizona.
To compete with fossil fuel electricity generation, higher-efficiency solar cells are needed to produce renewable energy. Currently, this is the best way to lower generation costs and the price of energy [1]. The goal of this Barrett Honors Thesis is to design an optical coating model that has five or fewer layers (with varying thickness and refractive index, within the above range) and that has the maximum possible reflectance between 950 and 1200 nanometers for normally incident light. Manipulating silicon monolayers to become efficient inversion layers for use in solar cells aligns with the Ira A. Fulton Schools of Engineering research themes of energy and sustainability [2]. Silicon monolayers could be specifically designed for different doping substrates, which could range from commonly used dopants such as boron and phosphorus to rare-earth-doped zinc oxides or even fullerene blends. Exploring how the doping material, and in what quantity, affects solar cell energy output could revolutionize current production methods and the commercial market. If solar cells can be manufactured more economically yet still retain high efficiency, then more people will have access to alternative, "green" energy that does not deplete nonrenewable resources.
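The reflectance objective described above can be prototyped with the standard characteristic-matrix (transfer-matrix) method for thin films. The sketch below is a minimal illustration, not the thesis's actual model; the layer indices, substrate index, and quarter-wave layout are assumed values chosen only to demonstrate the calculation:

```python
import numpy as np

def stack_reflectance(wavelength_nm, layers, n_ambient=1.0, n_substrate=3.5):
    """Reflectance of a thin-film stack at normal incidence using the
    characteristic (transfer) matrix method.  `layers` is a sequence of
    (refractive_index, thickness_nm) pairs ordered from the ambient side."""
    M = np.eye(2, dtype=complex)
    for n, d in layers:
        delta = 2 * np.pi * n * d / wavelength_nm      # phase thickness
        M = M @ np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
    B, C = M @ np.array([1.0, n_substrate])
    r = (n_ambient * B - C) / (n_ambient * B + C)      # amplitude reflectance
    return abs(r) ** 2

# Five-layer quarter-wave stack centred at 1075 nm, the midpoint of the
# 950-1200 nm target band (indices 2.4 / 1.45 are illustrative assumptions,
# as is the silicon-like substrate index of 3.5)
lam0, n_hi, n_lo = 1075.0, 2.4, 1.45
stack = [(n, lam0 / (4 * n)) for n in (n_hi, n_lo, n_hi, n_lo, n_hi)]
R = stack_reflectance(lam0, stack)
```

A real design pass would sweep the five thicknesses and indices with an optimizer and average reflectance over the full 950-1200 nm band rather than evaluating a single wavelength.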
Overall energy consumption in the United States has not been reduced even with the technological advances of the past decades, and deficiencies exist between designed and actual energy performance. Energy Infrastructure Systems (EIS) are impacted when the amount of energy production cannot be forecasted accurately and efficiently, and inaccurate engineering assumptions can result when there is a lack of understanding of how energy systems operate in real-world applications. Energy systems are complex, which leads to unknown system behaviors due to an unknown structural system model. Currently, data mining techniques for reverse engineering, which are needed to develop efficient structural system models, are lacking. In this project, a new type of reverse engineering algorithm was applied to a year's worth of energy data collected from MacroTechnology Works, an ASU research building, to identify the structural system model. Developing and understanding structural system models is the first step in creating accurate predictive analytics for energy production. The associative network of the building's data is highlighted to accurately depict the structural model. This structural model will enhance the energy efficiency of energy infrastructure systems, reduce energy waste, and narrow the gaps between energy infrastructure design, planning, operation, and management (DPOM).
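The abstract does not specify the reverse-engineering algorithm itself, so as a rough stand-in for the idea of recovering an associative network from raw building data, the following sketch links two sensor channels whenever their time series are strongly correlated. The threshold and the synthetic data are assumptions, not details from the thesis:

```python
import numpy as np

def association_network(data, threshold=0.8):
    """Connect two sensor channels when the absolute Pearson correlation
    of their time series exceeds `threshold`.  `data` has shape
    (n_samples, n_channels); returns a list of channel-index pairs."""
    corr = np.corrcoef(data, rowvar=False)
    n = corr.shape[0]
    return [(i, j) for i in range(n) for j in range(i + 1, n)
            if abs(corr[i, j]) >= threshold]

# Synthetic check: channel 2 is a noisy copy of channel 0, channel 1 is
# independent, so only the (0, 2) edge should be recovered
rng = np.random.default_rng(0)
x = rng.normal(size=(8760, 1))               # one year of hourly readings
data = np.hstack([x,
                  rng.normal(size=(8760, 1)),
                  x + 0.01 * rng.normal(size=(8760, 1))])
edges = association_network(data)
```

The recovered edge list is one simple form a structural system model could take before being used for predictive analytics.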
Workplace productivity is a result of many factors, among them the setup of the office and its resultant noise level. The conversations and interruptions that come with converting an office to an open plan can foster innovation and creativity, or they can be distracting and harm employee performance. Through simulation, the impact of different types of office noise was studied along with other changing conditions, such as the number of people in the office. When productivity per person, defined in terms of mood and focus, was measured, the effect of noise was found to be positive in some scenarios and negative in others. In simulations where employees were performing very similar tasks, noise (and its correlates, such as the number of employees) was beneficial. On the other hand, when employees were engaged in a variety of different types of tasks, noise had a negative overall effect. This indicates that workplaces that group their employees by common job function may be more productive than workplaces where the problems and products employees work on are varied throughout the workspace.
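The thesis's simulation model is not described in the abstract; the toy agent model below only mirrors the qualitative finding, and every parameter (noise scaling, task-overlap probability, payoff sizes) is an invented assumption:

```python
import random

def office_productivity(n_people, similar_tasks, trials=200, seed=1):
    """Toy model: ambient noise scales with headcount; overheard talk
    helps an agent whose task matches the speaker's and distracts one
    whose task differs.  All parameters are invented for illustration."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        noise = n_people / 10.0                  # more people, more noise
        for _ in range(n_people):
            base = rng.uniform(0.5, 1.0)         # intrinsic focus/mood
            overlap = similar_tasks or rng.random() < 0.2
            total += base + noise * (0.2 if overlap else -0.2)
    return total / (trials * n_people)           # productivity per person
```

With these assumed parameters, the model reproduces the qualitative result: the same noise level raises per-person productivity when tasks are similar and lowers it when they are varied.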
In this study, the implementation of educational technology and its effect on learning and user experience were measured. A demographic survey, a pretest/posttest, and an educational experience survey were used to collect data on the control and experimental groups. The experimental group used different learning material than the control group: the Elements 4D mobile application by Daqri, used to learn basic chemical elements and compounds. The control group's learning material provided exactly the same information as the application, but in the 2D form of a printed packet. It was expected that the experimental group would outperform the control group and have a more enjoyable experience. After data analysis, it was concluded that, contrary to the hypothesis, the control group outperformed the experimental group and both groups had similar experiences. Once the limitations (differing study durations, the need to learn the application beforehand, and memorization-only questions) are addressed, the study can be conducted again. Application improvements may also alter future results and hopefully lead to full implementation into a curriculum.
The first step in process improvement is to scope the problem; the next is to measure the current process. If data are not readily available and cannot be collected manually, a measurement system must be implemented. General Dynamics Mission Systems (GDMS) is a lean company that is always seeking to improve. One of its current bottlenecks is the incoming inspection department, which is responsible for finding defects on purchased parts and is critical to the high-reliability products GDMS produces. To stay competitive and hold their market share, a decision was made to optimize incoming inspection. This proved difficult because no data were being collected. Early steps in many process improvement methodologies, such as Define, Measure, Analyze, Improve and Control (DMAIC), include data collection; however, no measurement system was in place, so no data were available for improvement. The solution was to design and implement a Management Information System (MIS) to track a variety of data, providing the company with data for analysis and improvement. The first stage of the MIS was developed in Microsoft Excel with Visual Basic for Applications because of the low cost and overall effectiveness of the software; Excel allows updates to be made quickly and lets GDMS begin collecting data immediately. Stage two would move the MIS to a more scalable platform, such as Access or MySQL. This thesis focuses only on stage one of the MIS; GDMS will proceed with stage two.
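As a rough sketch of the stage-one measurement system's data layer (Python standing in for the Excel/VBA implementation, with illustrative field names rather than GDMS's actual schema), each incoming inspection could be logged as a row and then summarized for DMAIC-style analysis:

```python
import csv
import io
from datetime import date

# Illustrative schema for an incoming-inspection log (assumed field names)
FIELDS = ["date", "part_number", "qty_inspected", "qty_defective"]

def log_inspection(writer, part_number, qty_inspected, qty_defective):
    """Append one incoming-inspection record."""
    writer.writerow({"date": date.today().isoformat(),
                     "part_number": part_number,
                     "qty_inspected": qty_inspected,
                     "qty_defective": qty_defective})

def defect_rate(rows):
    """Overall defect rate across logged inspections."""
    inspected = defective = 0
    for r in rows:
        inspected += int(r["qty_inspected"])
        defective += int(r["qty_defective"])
    return defective / inspected if inspected else 0.0

# In-memory demo: two inspections, then a summary for the Measure phase
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=FIELDS)
writer.writeheader()
log_inspection(writer, "PN-1001", 50, 2)
log_inspection(writer, "PN-1002", 30, 0)
rate = defect_rate(csv.DictReader(io.StringIO(buf.getvalue())))
```

The same record layout translates directly to an Excel worksheet in stage one or to an Access/MySQL table in stage two.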
Chess has been a common research topic for expert-novice studies, and thus for learning science as a whole, because of its limited framework and longevity as a game. One factor is that chess studies are good at measuring how expert chess players use their memory and skills to approach a new chessboard configuration. Studies have shown that chess skill is based on memory, specifically "chunks" of chess piece positions that players have previously encountered. However, debate exists over how these chunks are constructed in players' memory: they could be built from the proximity of pieces on the chessboard and their precise locations, or from attack-defense relations. The primary objective of this study is to determine which account, proximity or attack/defense, is more in line with chess players' actual memory-based chess abilities. This study replicates and extends an experiment conducted by McGregor and Howe (2002), which explored the argument that pieces are primed more by attack and defense relations than by proximity. Like their study, the present study examined novice and expert chess players' response times for correct and error responses by showing slides of game configurations. In addition to these metrics, the present study incorporated an eye-tracker to measure visual attention and EEG to measure affective and cognitive states, added to allow comparison of the subtle and unconscious behaviors of novice and expert chess players. Overall, most of McGregor and Howe's (2002) results were replicated, supporting their theory of chess expertise.
This included statistical significance for skill in the error rates (mean error rates on the piece recognition tests were 70.1% for novices and 87.9% for experts), as well as significance for the two-way interaction of relatedness and proximity, with error rates of 22.4% for unrelated/far, 18.8% for related/far, 15.8% for unrelated/near, and 29.3% for related/near. Unfortunately, there was no statistical significance for any of the response-time effects, including the interaction between skill and proximity that McGregor and Howe found. Although the eye-tracking and EEG data neither supported nor confirmed McGregor and Howe's theory of how chess players memorize chessboard configurations, these metrics did help build a secondary theory: that novices typically rely on proximity when approaching chess, and new visual problems in general. This was exemplified by the statistically significant result for short-term excitement in the two-way interaction of skill and proximity, where the largest short-term excitement score was for novices on near-proximity slides. This may indicate that novices, because they lean toward using proximity to recall pieces, experience a short burst of excitement when pieces are close to each other, since they are more likely to recall such configurations.
Based on findings of previous studies, there was speculation that two well-known experimental design software packages, JMP and Design Expert, produced different power outputs given the same design and user inputs. For context and scope, another popular experimental design software package, Minitab® Statistical Software version 17, was added to the comparison. The study compared multiple test cases run on the three software packages, focusing on 2^k and 3^k factorial designs and adjusting the standard-deviation effect size, the number of categorical factors, the number of levels, the number of factors, and the number of replicates. All six cases were run on all three programs, attempted at one, two, and three replicates each. There was an issue at the one-replicate stage, however: Minitab does not allow one-replicate full factorial designs, and Design Expert will not provide power outputs for one replicate unless there are three or more factors. From the analysis of these results, it was concluded that the differences between JMP 13 and Design Expert 10 were well within the margin of error and likely caused by rounding. The differences between JMP 13 and Design Expert 10 on the one hand and Minitab 17 on the other indicated a fundamental difference in how Minitab performs power calculations compared to the latest versions of JMP and Design Expert. This is likely because Minitab defaults to dummy-variable coding rather than the orthogonal coding default of the other two. Although dummy-variable and orthogonal coding for factorial designs do not produce different model results, the coding method does affect the power calculations. All three programs can be adjusted to use either method of coding, but the exact instructions are difficult to find, so a follow-up guide on changing the coding for factorial variables would address this issue.
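For reference, a generic textbook power calculation for a main effect in a replicated 2^k factorial can be sketched as follows. This uses a normal (z-test) approximation and is not a reproduction of the internals of JMP, Design Expert, or Minitab; the packages' discrepancies enter through how the effect size and model matrix are parameterized, which this simplified formula glosses over:

```python
from math import sqrt
from statistics import NormalDist

def factorial_power_approx(k, reps, delta_over_sigma, alpha=0.05):
    """Approximate power to detect a main effect in a replicated 2**k
    full factorial, where the effect is the mean difference (in sigma
    units) between a factor's two levels.  Generic z-approximation;
    not any specific package's algorithm."""
    n = reps * 2 ** k                        # total runs
    lam = sqrt(n) * delta_over_sigma / 2     # standardized effect estimate
    z = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided critical value
    nd = NormalDist()
    return nd.cdf(lam - z) + nd.cdf(-lam - z)
```

As expected, power rises with the number of replicates, which is why the one-replicate cases that Minitab and Design Expert refuse to compute are the least informative in the comparison.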