Description
Software systems can exacerbate and cause contemporary social inequities. As such, scholars and activists have scrutinized sociotechnical systems like those used in facial recognition technology or predictive policing using the frameworks of algorithmic bias and dataset bias. However, these conversations are incomplete without study of data models: the structural, epistemological, and technical frameworks that shape data. In Modeling Power: Data Models and the Production of Social Inequality, I elucidate the connections between relational data modeling techniques and manifestations of systems of power in the United States, specifically white supremacy and cisgender normativity. This project has three distinct parts. First, I historicize early publications by E. F. Codd, Peter Chen, Miles Smith & Diane Smith, and J. R. Abrial to demonstrate that now-taken-for-granted data modeling techniques were products of their social and technical moments and, as such, reinforced dominant systems of power. I further connect database reification techniques to contemporary racial analyses of reification via the work of Cheryl Harris. Second, I reverse engineer Android applications (with Jadx and apktool) to uncover the relational data models within. I analyze DAO annotations, create entity-relationship diagrams, and then examine those resultant models, again linking them back to systems of race and gender power. I craft a method for performing a reverse engineering investigation within a specific sociotechnical context -- a situated analysis of the contextual epistemological frames embedded within relational paradigms. Finally, I develop a relational data model that integrates insights from the project’s reverse and historical engineering phases. In my speculative engineering process, I suggest that the temporality of modern digital computing is incommensurate with the temporality of modern transgender lives. 
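The DAO-annotation analysis described above can be loosely illustrated in code. The sketch below is not the dissertation's actual tooling: it assumes a hypothetical fragment of Jadx-decompiled Java from an app using Android's Room persistence library, and uses a simple regular-expression scan to recover table and column names of the kind that would feed an entity-relationship diagram.

```python
import re

# Hypothetical fragment of Jadx-decompiled Java from an Android app
# using the Room persistence library (all names are illustrative).
DECOMPILED = """
@Entity(tableName = "users")
public class User {
    @PrimaryKey
    public int id;
    @ColumnInfo(name = "legal_name")
    public String legalName;
    @ColumnInfo(name = "gender")
    public String gender;
}
"""

def extract_entities(source: str) -> dict:
    """Map each Room @Entity table name to its @ColumnInfo column names."""
    entities = {}
    # Find each @Entity(...) declaration and the class body that follows it.
    for m in re.finditer(
            r'@Entity\(tableName\s*=\s*"(\w+)"\).*?class\s+\w+\s*\{(.*?)\}',
            source, re.DOTALL):
        table, body = m.group(1), m.group(2)
        entities[table] = re.findall(r'@ColumnInfo\(name\s*=\s*"(\w+)"\)', body)
    return entities

print(extract_entities(DECOMPILED))
# {'users': ['legal_name', 'gender']}
```

Even this toy scan surfaces the kind of modeling decision the project interrogates: here, `gender` appears as a fixed string column on the user record rather than as a relation of its own.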
Following this, I speculatively design and build a trans-inclusive data model that demonstrates uses of reification to actively subvert systems of racialized and gendered power. By promoting aspects of social identity to first-order objects within a data model, I show that additional “intellectual manageability” is possible through reification. Through each part, I argue that contemporary approaches to the social impacts of software systems are incomplete without data models. Data models structure algorithmic opportunities. As algorithms continue to reinforce systems of inequality, data models provide opportunities for intervention and subversion.
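As a minimal sketch of the reification move described above (schema and names are illustrative, not the dissertation's actual model), gender identity can be promoted from a fixed column on a person row to a first-order table with its own temporal validity, so that a change in identity accumulates history rather than overwriting it:

```python
import sqlite3

# Illustrative schema: reify gender identity as its own relation,
# with a validity interval, instead of a single overwritable column.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE person (
    id   INTEGER PRIMARY KEY,
    name TEXT NOT NULL
);
CREATE TABLE gender_identity (
    id         INTEGER PRIMARY KEY,
    person_id  INTEGER NOT NULL REFERENCES person(id),
    label      TEXT NOT NULL,   -- self-described, not an enumeration
    valid_from TEXT NOT NULL,   -- ISO date this identity takes effect
    valid_to   TEXT             -- NULL means currently valid
);
""")
conn.execute("INSERT INTO person (id, name) VALUES (1, 'A.')")
conn.executemany(
    "INSERT INTO gender_identity (person_id, label, valid_from, valid_to) "
    "VALUES (?, ?, ?, ?)",
    [(1, "man", "2010-01-01", "2018-06-01"),
     (1, "nonbinary", "2018-06-01", None)])

# The current identity is the row whose validity interval is still open.
current = conn.execute(
    "SELECT label FROM gender_identity "
    "WHERE person_id = 1 AND valid_to IS NULL").fetchone()
print(current[0])  # nonbinary
```

The design choice is the point: once identity is a first-order object, queries can reason over its history and change, rather than treating the most recent value as the only truth.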
Details
Title
- Modeling Power: Data Models and the Production of Social Inequality
Contributors
- Stevens, Nikki Lane (Author)
- Wernimont, Jacqueline D (Thesis advisor)
- Michael, Katina (Thesis advisor)
- Richter, Jennifer (Committee member)
- Duarte, Marisa E. (Committee member)
- Arizona State University (Publisher)
Date Created
2022
Note
- Partial requirement for: Ph.D., Arizona State University, 2022
- Field of study: Human and Social Dimensions of Science and Technology