The theory behind setup maps: a computational tool to position parts for machining

Description

When manufacturing large or complex parts, a rough operation such as casting is often used to create the majority of the part geometry. Because the casting process is highly variable, mechanical components that require precision surfaces for functionality or for assembly with other parts have their important features machined to specification. Depending on the relative locations of the as-cast, to-be-machined features and the amount of material at each, the part may be positioned, or ‘set up’, on a fixture in a configuration that ensures the pre-specified machining operations will clean up the rough surfaces and produce a part that conforms to its assigned tolerances. For a part whose features incur excessive deviation in the casting process, it may be that no setup would yield an acceptable final part. The proposed Setup-Map (S-Map) describes the positions and orientations of a part that allow it to be machined successfully, and it can determine when a particular part cannot be made to specification.

The S-Map is a point space in six dimensions, where each of the six orthogonal coordinates corresponds to one of the rigid-body displacements in three-dimensional space: three rotations and three translations. Any point within the boundaries of the S-Map corresponds to a small displacement of the part that satisfies the condition that each feature will lie within its associated tolerance zone after machining. Creating the S-Map involves representing the constraints imposed by the tolerances in a simple coordinate system for each to-be-machined feature. The constraints are then transformed to a single coordinate system, where their intersection reveals the common allowable ‘setup’ points. Should an intersection of the six-dimensional constraints exist, an optimization scheme chooses a single setup that gives the best chance for machining to be completed successfully. Should no intersection exist, the particular part cannot be machined to specification unless it is re-worked with weld metal added at specific locations.
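
As a minimal illustration of this feasibility test, the sketch below linearizes each tolerance constraint as a half-space in the six-dimensional small-displacement space and uses linear programming to look for an interior setup point. The constraint matrix, the numeric bounds, and the use of scipy.optimize.linprog are illustrative assumptions, not the thesis's method or code.

```python
import numpy as np
from scipy.optimize import linprog

# 6D small-displacement vector d = (rx, ry, rz, tx, ty, tz).
# Each to-be-machined feature contributes linearized half-space
# constraints A @ d <= b, expressed in a common part coordinate system.
A = np.array([
    [  0.0, 0.0, 0.0,  1.0, 0.0,  0.0],   # +tx bounded by stock on one face
    [  0.0, 0.0, 0.0, -1.0, 0.0,  0.0],   # -tx bounded likewise
    [ 50.0, 0.0, 0.0,  0.0, 0.0,  1.0],   # rotation rx couples into tz at a
    [-50.0, 0.0, 0.0,  0.0, 0.0, -1.0],   # feature 50 mm from the origin
])
b = np.array([0.4, 0.4, 0.6, 0.6])        # available stock / zone sizes (mm)

# Does any feasible setup exist? Maximize a uniform margin s such that
# A @ d + s <= b; s > 0 means the constraint set has a non-empty interior.
c = np.zeros(7); c[6] = -1.0              # maximize s == minimize -s
A_ub = np.hstack([A, np.ones((A.shape[0], 1))])
res = linprog(c, A_ub=A_ub, b_ub=b,
              bounds=[(-1, 1)] * 6 + [(0, None)])

if res.success and res.x[6] > 0:
    print("feasible max-margin setup point:", res.x[:6])
else:
    print("no setup can machine this part to specification")
```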
Date Created
2016

Automating fixture setups based on point cloud data & CAD model

Description

Metal castings are selectively machined based on dimensional control requirements. To ensure that all the finished surfaces are fully machined, each as-cast part must be measured and then adjusted optimally in its fixture. This thesis addresses two parts of this process: data translation, and fitting features to clouds of points measured on each cast part. For the first, a CAD model of the finished part must be communicated to the machine shop for performing the various machining operations on the metal casting. The data flow must include GD&T specifications along with any special notes that need to be communicated to the machinist. Current data exchange among digital applications is limited to the translation of CAD geometry alone, via STEP AP203. Therefore, an algorithm is developed to read, store, and translate the data from a CAD file (for example, SolidWorks or CREO) to a standard, machine-readable format (the ACIS format, *.sat). Second, the geometry of cast parts varies from piece to piece, so the fixture set-up parameters for each part must be adjusted individually. To determine these adjustments predictively, the datum surfaces and to-be-machined surfaces are scanned individually and the point clouds reduced to feature fits. The scanned data are stored as separate point cloud files. The labels associated with the datum and to-be-machined (TBM) features are extracted from the *.sat file; these labels are then matched with the file names of the point cloud data to identify the data for the respective features. The point cloud data and the CAD model are used to fit the appropriate features (features at maximum material condition (MMC) for datums and features at least material condition (LMC) for TBM features) using the existing normative feature fitting (nFF) algorithm. Once feature fitting is complete, a global datum reference frame (GDRF) is constructed based on the locating method that will be used to machine the part; the locating method is extracted from a fixture library that specifies the type of fixturing used. All entities are transformed from their local coordinate systems into the GDRF. The nominal geometry, the fitted features, and the GD&T information are then stored in a neutral file format called the Constraint Tolerance Feature (CTF) graph. The final outputs identify the locations of the critical features on each part, and these are used, in another module that is not part of this thesis, to establish the adjustments for its setup prior to machining.
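
To make the feature-fitting step concrete, here is a minimal sketch of a least-squares plane fit to a scanned point cloud, the kind of reduction from points to a datum feature described above. It is a generic SVD-based fit, not the thesis's nFF algorithm, and the synthetic cloud is an assumption for illustration.

```python
import numpy as np

def fit_plane_lsq(points):
    """Least-squares plane through a scanned point cloud.

    Returns (centroid, unit normal). The normal is the singular vector
    for the smallest singular value of the centered data, i.e., the
    direction of least variance.
    """
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    return centroid, vt[-1]

# Illustrative use on a noisy, nominally z = 0 datum face.
rng = np.random.default_rng(0)
cloud = np.column_stack([rng.uniform(0, 100, 500),
                         rng.uniform(0, 60, 500),
                         rng.normal(0.0, 0.02, 500)])
origin, normal = fit_plane_lsq(cloud)
print("datum plane normal ~", np.round(normal, 4))
```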
Date Created
2016

Automated iterative tolerance value allocation and analysis

Description

Tolerance specification for manufacturing components from 3D models is a tedious task and often requires the expertise of “detailers”. The work presented here is part of a larger ongoing project aimed at automating tolerance specification to aid less experienced designers by producing consistent geometric dimensioning and tolerancing (GD&T). Tolerance specification can be separated into two major tasks: tolerance schema generation and tolerance value specification. This thesis focuses on the latter, namely tolerance value allocation and analysis. The tolerance schema (sans values) required prior to these tasks has already been generated by the auto-tolerancing software; this information is communicated through a Constraint Tolerance Feature graph file developed previously at the Design Automation Lab (DAL) and is consistent with the ASME Y14.5 standard.

The objective of this research is to allocate tolerance values that ensure the assemblability conditions are satisfied. Assemblability refers to “the ability to assemble/fit a set of parts in a specified configuration given a nominal geometry and its corresponding tolerances”. Assemblability is determined by the clearances between the mating features. These clearances are affected by the accumulation of tolerances in tolerance loops, so the tolerance loops are extracted first. Once the tolerance loops have been identified, initial tolerance values are allocated to the contributors in these loops. It is highly unlikely that the initial allocation will satisfy the assemblability requirements, and overlapping loops must be satisfied simultaneously and progressively; hence, tolerances need to be re-allocated iteratively. This is done with the help of the tolerance analysis module.

The tolerance allocation and analysis module receives the constraint graph, which contains all basic dimensions and mating constraints from the generated schema. The tolerance loops are detected by traversing this constraint graph. The initial allocation distributes the tolerance budget, computed from the clearance available in each loop, among the loop's contributors in proportion to their associated nominal dimensions. The analysis module subjects the loops to 3D parametric variation analysis and estimates the variation parameters for the clearances. The re-allocation module uses hill-climbing heuristics derived from the distribution parameters to select a loop; re-allocation of the tolerance values is then done using the sensitivities and weights associated with the contributors in the stack.
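
A toy version of the allocation logic described above might look like the sketch below: an initial split of the loop's budget in proportion to nominal dimensions, followed by one re-allocation pass driven by sensitivities and weights. The function names, the scoring rule, and the numbers are illustrative assumptions, not the module's actual heuristics.

```python
def allocate_proportional(budget, nominals):
    """Initial allocation: split the loop's tolerance budget among its
    contributors in proportion to their nominal dimensions."""
    total = sum(nominals)
    return [budget * n / total for n in nominals]

def reallocate(tols, sensitivities, weights, excess, step=0.1):
    """One re-allocation pass: tighten contributors in proportion to
    sensitivity * weight until the loop's variation excess is consumed.
    A placeholder for the hill-climbing loop described above."""
    scores = [s * w for s, w in zip(sensitivities, weights)]
    total = sum(scores)
    return [max(t - step * excess * sc / total, 0.0)
            for t, sc in zip(tols, scores)]

# A one-loop example: 10 mm and 40 mm contributors, 0.25 mm clearance budget.
tols = allocate_proportional(0.25, [10.0, 40.0])
print(tols)                                  # [0.05, 0.2]
tols = reallocate(tols, sensitivities=[1.0, 1.0],
                  weights=[1.0, 2.0], excess=0.06)
print([round(t, 4) for t in tols])           # [0.048, 0.196]
```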

Several test cases have been run with this software, and the desired acceptance rates for the user inputs were achieved. Three test cases are presented, and the output of each module is discussed.
Date Created
2016

Automating GD&T schema for mechanical assemblies

Description

Parts are always manufactured with deviations from their nominal geometry, for reasons such as inherent inaccuracies in the machine tools and environmental conditions. It is the designer's job to devise a proper tolerance scheme that allows the manufacturer reasonable freedom for imperfections without compromising performance. It takes years of experience and strong practical knowledge of the device's function, the manufacturing process, and GD&T standards for a designer to create a good tolerance scheme, and there is almost no theoretical resource to help designers in GD&T synthesis. As a result, designers often create inconsistent and incomplete tolerance schemes that lead to high assembly scrap rates. The Auto-Tolerancing project was started in the Design Automation Lab (DAL) to investigate the degree to which tolerance synthesis can be automated. Tolerance synthesis includes tolerance schema generation (sans tolerance values) and tolerance value allocation; this thesis addresses tolerance schema generation. To develop an automated tolerance schema synthesis toolset, the to-be-toleranced features must be identified, the required tolerance types determined, a scheme for computer representation of the GD&T information developed, the sequence of control identified, and a procedure for creating datum reference frames (DRFs) developed. The first three steps define the architecture of the tolerance schema generation module, while the last two set up a basis for creating a proper tolerance scheme with the help of GD&T good-practice rules obtained from experts. The GD&T scheme recommended by this module is used by the tolerance value allocation/analysis module to complete the process of automated tolerance synthesis. Various test cases are studied to verify the suitability of this module. The results show that the software-generated schemas are adequate to address assemblability issues (first-order tolerancing). Since this novel technology is at an early stage of development, further research and case studies will help improve the software toward more comprehensive tolerance schemas that cover design intent (second-order tolerancing) and cost optimization (third-order tolerancing).
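
To suggest how expert good-practice rules might drive schema generation, the sketch below maps a feature's geometric type and functional role to candidate tolerance classes with a small lookup table. The rule entries and feature triples are invented for illustration and are not the DAL rule base.

```python
# A toy rule table in the spirit of the schema-generation step above:
# given a feature's type and role, propose the tolerance classes to apply.
RULES = {
    ("plane", "datum"):  ["flatness"],
    ("plane", "mating"): ["size", "parallelism"],
    ("hole",  "mating"): ["size", "position"],
    ("pin",   "mating"): ["size", "position"],
}

def propose_schema(features):
    """Return {feature id: tolerance types} for a list of
    (id, geometric type, functional role) triples."""
    return {fid: RULES.get((ftype, role), [])
            for fid, ftype, role in features}

part = [("F1", "plane", "datum"),
        ("F2", "hole", "mating"),
        ("F3", "plane", "mating")]
print(propose_schema(part))
# {'F1': ['flatness'], 'F2': ['size', 'position'], 'F3': ['size', 'parallelism']}
```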
Date Created
2016

Interoperability of geometric dimension & tolerance data between CAD systems through ISO STEP AP 242

Description

There is very little in the way of prescriptive procedures to guide designers in tolerance specification. This shortcoming motivated the group at the Design Automation Lab to automate the tolerancing of mechanical assemblies. GD&T data generated by the Auto-Tolerancing software is semantically represented using a neutral Constraint Tolerance Feature (CTF) graph file format that is consistent with the ASME Y14.5 standard and the ISO STEP Part 21 file format. The primary objective of this research is to communicate GD&T information from the CTF file to a neutral, machine-readable format. The latest STEP AP 242 (ISO 10303-242), “Managed model based 3D engineering”, aims to support smart manufacturing by capturing semantic Product Manufacturing Information (PMI) within the 3D model and by helping with long-term archiving of product information. In line with the recommended practices published by the CAx Implementor Forum, this research discusses the implementation of a CTF-to-AP 242 translator. The input geometry, available in STEP AP 203 format, is pre-processed using the STEP-NC DLL and 3D InterOp: the former attaches persistent IDs to the topological entities in STEP, and the latter retains those IDs during translation to ACIS entities for consumption by other modules of the Auto-Tolerancing software. GD&T in the CTF file is associated with the input geometry through these persistent IDs. The C++ libraries used for the translation to STEP AP 242 are provided by StepTools Inc through the STEP-NC DLL. Finally, the output STEP file is tested using the available AP 242 readers and shows full conformance with the STEP standard. Using the output AP 242 file, semantic GD&T data can now be consumed automatically by downstream applications such as Computer Aided Process Planning (CAPP), Computer Aided Inspection (CAI), Computer Aided Tolerance Systems (CATS), and Coordinate Measuring Machines (CMM).
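
The persistent-ID associativity described above can be pictured with a small bookkeeping sketch: GD&T callouts keyed by ID are re-attached to translated geometry by looking up the same keys. The record layout, entity names, and IDs are invented for illustration; they are not the CTF or AP 242 data structures.

```python
# GD&T records from the CTF file, keyed by persistent ID (illustrative).
ctf_gdt = {
    101: {"type": "position", "value": 0.2, "datums": ["A", "B"]},
    102: {"type": "flatness", "value": 0.05, "datums": []},
}

# IDs survive the AP 203 -> ACIS -> AP 242 pipeline, so the translated
# faces can be looked up by the same keys (names are hypothetical).
translated_faces = {101: "advanced_face#55", 102: "advanced_face#73"}

for pid, callout in ctf_gdt.items():
    face = translated_faces.get(pid)
    if face is None:
        raise KeyError(f"persistent ID {pid} lost during translation")
    print(f"{face}: {callout['type']} {callout['value']} "
          f"| datums {callout['datums'] or '-'}")
```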
Date Created
2016

Problem map: a framework for investigating the role of problem formulation in creative design

Description

Design problem formulation is believed to influence creativity, yet it has received only modest attention in the research community. Past studies of problem formulation are scarce and often have small sample sizes. The main objective of this research is to understand how problem formulation affects creative outcome. Three research areas are investigated: the development of a model that captures the differences among designers' problem formulations; the representation and implications of those differences; and the relation between problem formulation and creativity.

This dissertation proposes the Problem Map (P-maps) ontological framework. P-maps represent designers' problem formulation in terms of six groups of entities (requirement, use scenario, function, artifact, behavior, and issue). Entities have hierarchies within each group and links among groups. Variables extracted from P-maps characterize problem formulation.
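
A skeletal data structure for the framework just described might look like the sketch below: entities tagged with one of the six groups, a parent reference for within-group hierarchy, and cross-group links. The class layout and example entities are assumptions for illustration, not the dissertation's P-map implementation.

```python
from dataclasses import dataclass, field

# The six P-map entity groups named above.
GROUPS = {"requirement", "use_scenario", "function",
          "artifact", "behavior", "issue"}

@dataclass
class Entity:
    name: str
    group: str
    parent: "Entity | None" = None                        # hierarchy within a group
    links: list["Entity"] = field(default_factory=list)  # links across groups

    def link(self, other):
        assert self.group != other.group, "links connect different groups"
        self.links.append(other)

hold = Entity("hold workpiece", "function")
clamp = Entity("toggle clamp", "artifact")
slip = Entity("workpiece slips under load", "issue")
clamp.link(hold)
slip.link(clamp)
print(clamp.links[0].name, "|", slip.links[0].group)
```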

Three experiments were conducted. The first experiment was to study the similarities and differences between novice and expert designers. Results show that experts use more abstraction than novices do and novices are more likely to add entities in a specific order. Experts also discover more issues.

The second experiment examined how problem formulation relates to creativity, using ideation metrics to characterize creative outcome. Among the results: adding more issues in an unorganized way correlates positively with quantity and variety; more use scenarios and functions with novelty; more behaviors and identified conflicts with quality; and depth-first exploration with all ideation metrics. Fewer hierarchies in use scenarios lower novelty, and fewer links to requirements and issues lower the quality of ideas.

The third experiment tested whether problem formulation can predict creative outcome. Models based on one problem were used to predict the creativity of another, and the predicted scores were compared with the assessments of independent judges. Quality and novelty are predicted more accurately than variety and quantity. Backward elimination improves model fit, though it reduces prediction accuracy.
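
For readers unfamiliar with the backward elimination mentioned above, the sketch below shows one generic variant: greedily dropping the predictor whose removal most improves adjusted R². The criterion and the synthetic data are assumptions for illustration; the dissertation's models and variables are not reproduced here.

```python
import numpy as np

def backward_eliminate(X, y, threshold=0.0):
    """Greedy backward elimination on adjusted R^2: drop the predictor
    whose removal improves adjusted R^2 the most, until no drop helps."""
    def adj_r2(cols):
        A = np.column_stack([np.ones(len(y)), X[:, cols]])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        ss_res, ss_tot = resid @ resid, ((y - y.mean()) ** 2).sum()
        n, k = len(y), len(cols)
        return 1 - (ss_res / (n - k - 1)) / (ss_tot / (n - 1))

    keep = list(range(X.shape[1]))
    while len(keep) > 1:
        base = adj_r2(keep)
        trials = [(adj_r2([c for c in keep if c != d]), d) for d in keep]
        best, drop = max(trials)
        if best <= base + threshold:
            break
        keep.remove(drop)
    return keep

rng = np.random.default_rng(1)
X = rng.normal(size=(60, 4))
y = 2 * X[:, 0] - X[:, 2] + rng.normal(scale=0.1, size=60)
print(backward_eliminate(X, y))    # typically keeps columns 0 and 2
```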

P-maps provide a theoretical framework for formalizing, tracing, and quantifying conceptual design strategies. Other potential applications are developing a test of problem formulation skill, tracking students' learning of formulation skills in a course, and reproducing other researchers’ observations about designer thinking.
Date Created
2015

3-D conformance analysis of manufacturing plans using M-maps, by explicating formal GD&T schema from the process plan

Description

A process plan is an instruction set for the manufacture of parts, generated from detailed design drawings or CAD models. While these plans are highly detailed about machines, tools, fixtures, and operation parameters, tolerances typically show up in a less formal manner in such plans, if at all; it is not uncommon to see only dimensional plus/minus values on rough sketches accompanying the instructions. Design drawings, on the other hand, use standard GD&T (Geometric Dimensioning and Tolerancing) symbols with datums and DRFs (Datum Reference Frames) clearly specified. This is not to say that process planners do not consider tolerances; tolerances are implied by the choices of fixtures, tools, machines, and operations. When converting design tolerances to the manufacturing datum flow, process planners do tolerance charting based on the operation sequence, but the resulting plans cannot be audited for conformance to the design specification.

In this thesis, I present a framework for explicating the GD&T schema implied by machining process plans. The first step is to derive the DRFs from the fixturing method in each set-up. Then the basic dimensions for the features to be machined in each set-up are determined with respect to the extracted DRF. Using shop data for the machines and operations involved, the range of possible geometric variations is estimated for each type of tolerance (form, size, orientation, and position). The sequence of manufacturing operations determines the datum flow chain. Once we have a formal manufacturing GD&T schema, we can analyze it and compare it to the tolerance specifications from design using the T-map math model. Since this model is based on the manufacturing process plan, it is called the resulting T-map, or m-map. The process plan can then be validated by adjusting parameters so that the m-map lies within the T-map created for the design drawing. How the m-map is created so that it can be compared with the T-map is the focus of this research.
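
The validation step above amounts to a containment test. As a deliberately simplified stand-in for polytope containment, the sketch below approximates both maps by per-parameter intervals and checks that every manufacturing interval fits inside the corresponding design interval. Real T-maps are convex point sets, so this interval version and its numbers are only an illustrative assumption.

```python
def conforms(m_map, t_map):
    """True if every manufacturing interval lies inside the
    corresponding design interval."""
    return all(t_lo <= m_lo and m_hi <= t_hi
               for (m_lo, m_hi), (t_lo, t_hi)
               in zip(m_map, t_map))

# (size, orientation, position) variation ranges in mm, illustrative values.
m_map = [(-0.05, 0.05), (-0.02, 0.03), (-0.10, 0.08)]
t_map = [(-0.10, 0.10), (-0.05, 0.05), (-0.12, 0.12)]
print(conforms(m_map, t_map))   # True: the plan's variation fits the design
```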
Date Created
2015

An improved framework for design concept generation based on experiential and intuitive methods

Description

The conceptual design stage plays a critical role in product development, yet few systematic methods and tools exist to support conceptual design. The long-term aim of this project is to develop a tool for facilitating holistic ideation for conceptual design. This research continues past efforts in the ASU Design Automation Lab. In past research, an interactive software test bed (Holistic Ideation Tool, version 1) was developed to explore logical ideation methods: ideation states were identified, and ideation strategies were developed to overcome common ideation blocks. The next version (version 2) of the holistic ideation tool added the Cascading Evolutionary Morphological Chart (CEMC) framework and intuitive ideation strategies (reframing, restructuring, random connection, and forced connection).

Despite these contributions, the previous versions (1 and 2) of the holistic ideation tool have shortcomings. First, new ideation methods need to be added to the tool. Second, the organizational framework provided by the previous versions needs to be improved, and a holistic approach needs to be devised instead of separate logical and intuitive approaches. The main objective of this thesis is therefore to make these improvements and to resolve the technical issues involved in implementing them.

Towards this objective, a new web-based holistic ideation tool (version 3) has been created. The new tool adds knowledge bases of mechanisms and Components Off-The-Shelf (COTS) and integrates them into the logical ideation methods. Additionally, an improved CEMC framework has been devised for organizing ideas efficiently, and the usability of the tool has been improved by designing and implementing a new, more user-friendly graphical user interface (GUI). It is hoped that these new features will give designers a platform not only to generate creative ideas but also to organize and store them effectively during the conceptual design stage. By placing the test bed on the web for public use, it also has the potential to support research on the ideation process by collecting large amounts of data from designers.
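
To illustrate the morphological-chart idea underlying the CEMC framework, the sketch below enumerates candidate concepts as combinations across a function-to-solutions table. The chart contents and the simple cross-product enumeration are illustrative assumptions; the CEMC framework itself organizes and cascades such charts in ways not reproduced here.

```python
from itertools import product, islice

# A morphological chart as a dict: function -> candidate solution ideas.
chart = {
    "store energy":   ["spring", "battery", "flywheel"],
    "transmit power": ["gear train", "belt", "linkage"],
    "control motion": ["cam", "servo", "escapement"],
}

# Every combination of one idea per function is a candidate concept.
functions = list(chart)
concepts = product(*(chart[f] for f in functions))
for combo in islice(concepts, 5):          # 27 total; show the first 5
    print(dict(zip(functions, combo)))
```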
Date Created
2014

Reconciling the differences between tolerance specification and measurement methods

Description

Dimensional metrology is the branch of science that determines length, angular, and geometric relationships within manufactured parts and compares them with the required tolerances. The measurements can be made using either manual methods or sampled coordinate metrology (coordinate measuring machines, CMMs). Manual measurement methods have been in practice for a long time and are well accepted in industry, but they are slow for present-day manufacturing. CMMs, on the other hand, are relatively fast, but their methods are not yet well established. The major problem that needs to be addressed is the type of feature-fitting algorithm used for evaluating tolerances: in a CMM, applying different feature-fitting algorithms to the same feature gives different values, and no standard describes which algorithm to use for a specific tolerance. Our research focuses on identifying the feature-fitting algorithm best suited to each type of tolerance. Each algorithm is chosen to best represent the interpretation of geometric control as defined by the ASME Y14.5 standard and the manual methods used for measuring that tolerance type. Using these algorithms, normative procedures for verifying tolerances with CMMs are proposed. The proposed normative procedures are implemented as software, and the procedures are then verified by comparing the results from the software with those of manual measurements.

To aid this research, a library of feature-fitting algorithms was developed in parallel. The library consists of least-squares, Chebyshev, and one-sided fits applied to lines, planes, circles, and cylinders. The proposed normative procedures are useful for evaluating tolerances in CMMs: the evaluated results will be in accordance with the standard, and the ambiguity in choosing algorithms is removed. The software developed can be used in quality control for inspection purposes.
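
As one generic member of such a library, the sketch below computes a Chebyshev (minimax) line fit by linear programming: minimize the largest absolute residual. The formulation is standard; the use of scipy.optimize.linprog and the synthetic data are assumptions for illustration, not the thesis code.

```python
import numpy as np
from scipy.optimize import linprog

def chebyshev_line_fit(x, y):
    """Minimax (Chebyshev) fit of y ~ a*x + b: minimize the largest
    absolute residual. Variables [a, b, t]; minimize t subject to
    |y_i - a*x_i - b| <= t for all points."""
    n = len(x)
    c = np.array([0.0, 0.0, 1.0])
    A_ub = np.block([[np.column_stack([ x,  np.ones(n)]), -np.ones((n, 1))],
                     [np.column_stack([-x, -np.ones(n)]), -np.ones((n, 1))]])
    b_ub = np.concatenate([y, -y])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] * 2 + [(0, None)])
    assert res.success
    a, b, t = res.x
    return a, b, t

rng = np.random.default_rng(2)
xs = np.linspace(0, 10, 40)
ys = 0.5 * xs + 1.0 + rng.uniform(-0.05, 0.05, 40)
a, b, t = chebyshev_line_fit(xs, ys)
print(round(a, 3), round(b, 3), round(t, 4))   # ~0.5, ~1.0, t <= 0.05
```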
Date Created
2014

Computer support for preliminary concept completion & evaluation/analysis of design concepts

Description

Creative design lies at the intersection of novelty and technical feasibility. These objectives can be achieved through cycles of divergence (idea generation) and convergence (idea evaluation) in conceptual design. The focus of this thesis is on the latter aspect. The evaluation may involve any aspect of technical feasibility and may be desired at the component, sub-system, or full-system level. Two issues are considered in this work:

1. Information about design ideas is incomplete, informal, and sketchy.
2. Designers often work at multiple levels; different aspects or subsystems may be at different levels of abstraction.

Thus, high-fidelity analysis and simulation tools are not appropriate for this purpose. This thesis looks at the requirements for a simulation tool and how it could facilitate concept evaluation. The specific tasks reported in this thesis are:

1. Identifying the typical types of information available after an ideation session.
2. Identifying the typical types of technical evaluations done in the early stages.
3. Determining how to conduct low-fidelity design evaluation given a well-defined feasibility question.

A computational tool for supporting idea evaluation was designed and implemented. It is assumed that the results of the ideation session are represented as a morphological chart, with each entry expressed as some combination of a sketch, text, and references to physical effects and machine components. Approximately 110 physical effects were identified and represented in terms of algebraic equations, physical variables, and a textual description. A common ontology of physical variables was created so that physical effects can be networked together when they share variables; this allows users to synthesize complex behaviors from simple ones without assuming any solution sequence. A library of 16 machine elements was also created, and users were given instructions for incorporating them. To support quick analysis, differential equations are transformed into algebraic equations (by replacing differential terms with steady-state differences), only steady-state behavior is considered, and interval arithmetic is used for modeling. The tool is implemented in MATLAB, and a number of case studies show how it works.
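
The interval-arithmetic modeling step above can be sketched in a few lines: physical variables carry [lo, hi] ranges, and algebraic effects that share a variable network together by propagating those ranges. The tiny Interval class, the single multiplication operator, and the Ohm's-law example are illustrative assumptions, not the tool's 110-effect library.

```python
class Interval:
    """A closed interval [lo, hi] with interval multiplication."""
    def __init__(self, lo, hi):
        self.lo, self.hi = min(lo, hi), max(lo, hi)
    def __mul__(self, other):
        ps = [self.lo * other.lo, self.lo * other.hi,
              self.hi * other.lo, self.hi * other.hi]
        return Interval(min(ps), max(ps))
    def __repr__(self):
        return f"[{self.lo:g}, {self.hi:g}]"

# Two effects share the variable I (current), so they network together:
# V = I * R (Ohm's law) and P = V * I (electrical power).
I = Interval(0.9, 1.1)      # amperes, with uncertainty
R = Interval(9.5, 10.5)     # ohms
V = I * R                   # volts
P = V * I                   # watts
print(V, P)                 # [8.55, 11.55] [7.695, 12.705]
```

Note that naive interval arithmetic ignores variable dependency (computing P directly as I²·R would give a tighter range), which is one reason such quick estimates are conservative rather than exact.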
Date Created
2014