
ASU Electronic Theses and Dissertations


This collection includes most of the ASU Theses and Dissertations from 2011 to present. ASU Theses and Dissertations are available in downloadable PDF format; however, a small percentage of items are under embargo. Information about the dissertations/theses includes degree information, committee members, an abstract, and supporting data or media.

In addition to the electronic theses found in the ASU Digital Repository, ASU Theses and Dissertations can be found in the ASU Library Catalog.

Dissertations and Theses granted by Arizona State University are archived and made available through a joint effort of the ASU Graduate College and the ASU Libraries. For more information or questions about this collection, visit the Digital Repository ETD Library Guide or contact the ASU Graduate College at gradformat@asu.edu.




Public health surveillance is a special case of the general problem where counts (or rates) of events are monitored for changes. Modern data complements event counts with many additional measurements (such as geographic, demographic, and others) that comprise high-dimensional covariates. This leads to an important challenge to detect a change that only occurs within a region, initially unspecified, defined by these covariates. Current methods are typically limited to spatial and/or temporal covariate information and often fail to use all the information available in modern data that can be paramount in unveiling these subtle changes. Additional complexities associated with modern health …

Contributors
Davila, Saylisse, Runger, George C, Montgomery, Douglas C, et al.
Created Date
2010

In mixture-process variable experiments, it is common that the number of runs is greater than in mixture-only or process-variable experiments. These experiments have to estimate the parameters for the mixture components, the process variables, and the interactions between the two. In some of these experiments there are variables that are hard to change or that cannot be controlled under normal operating conditions. These situations often prohibit complete randomization of the experimental runs due to practical and economic considerations. Furthermore, the process variables can be categorized into two types: variables that are controllable and directly affect the response, and variables that are uncontrollable …

Contributors
Cho, Tae-Yeon, Montgomery, Douglas C, Borror, Connie M, et al.
Created Date
2010

Yield is a key process performance characteristic in the capital-intensive semiconductor fabrication process. In an industry where machines cost millions of dollars and cycle times are a number of months, predicting and optimizing yield are critical to process improvement, customer satisfaction, and financial success. Semiconductor yield modeling is essential to identifying processing issues, improving quality, and meeting customer demand in the industry. However, the complicated fabrication process, the massive amount of data collected, and the number of models available make yield modeling a complex and challenging task. This work presents modeling strategies to forecast yield using generalized linear models (GLMs) …
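
As a rough illustration of the modeling strategy this abstract describes, the sketch below fits a binomial GLM relating per-wafer yield to process covariates. The data and the covariate names (temp, pressure) are simulated stand-ins, not the dissertation's data.

```python
# A minimal binomial-GLM sketch for yield modeling, assuming yield is
# recorded as good dies out of total dies per wafer.  All variables
# here are simulated illustrations.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
temp = rng.normal(0, 1, n)                    # standardized covariates
pressure = rng.normal(0, 1, n)
total_dies = rng.integers(80, 120, n)
p = 1 / (1 + np.exp(-(1.0 + 0.5 * temp - 0.3 * pressure)))
good_dies = rng.binomial(total_dies, p)

X = sm.add_constant(np.column_stack([temp, pressure]))
# statsmodels accepts a (successes, failures) response for Binomial
y = np.column_stack([good_dies, total_dies - good_dies])
fit = sm.GLM(y, X, family=sm.families.Binomial()).fit()
print(fit.summary())
```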

Contributors
Krueger, Dana Cheree, Montgomery, Douglas C., Fowler, John, et al.
Created Date
2011

Manufacturing tolerance charts are widely used for manufacturing tolerance transfer, but they have the limitation of being one-dimensional only. Some research has been undertaken on three-dimensional geometric tolerances, but it is too theoretical and not yet ready for operator-level use. In this research, a new three-dimensional model for tolerance transfer in manufacturing process planning is presented that is user-friendly in the sense that it is built upon the Coordinate Measuring Machine (CMM) readings that are readily available in any decent manufacturing facility. This model can take care of datum reference change …

Contributors
Khan, M Nadeem Shafi, Phelan, Patrick E, Montgomery, Douglas, et al.
Created Date
2011

In many classification problems data samples cannot be collected easily, for example in drug trials, biological experiments, and studies on cancer patients. In many situations the data set size is small and there are many outliers. When classifying such data, for example cancer versus normal patients, the consequences of misclassification are probably more important than for any other data type, because the data point could be a cancer patient, or the classification decision could help determine which gene might be over-expressed and perhaps a cause of cancer. These misclassifications are typically more frequent in the presence of outlier data points. The aim of …

Contributors
Gupta, Sidharth, Kim, Seungchan, Welfert, Bruno, et al.
Created Date
2011

By the von Neumann min-max theorem, a two person zero sum game with finitely many pure strategies has a unique value for each player (summing to zero) and each player has a non-empty set of optimal mixed strategies. If the payoffs are independent, identically distributed (iid) uniform (0,1) random variables, then with probability one, both players have unique optimal mixed strategies utilizing the same number of pure strategies with positive probability (Jonasson 2004). The pure strategies with positive probability in the unique optimal mixed strategies are called saddle squares. In 1957, Goldman evaluated the probability of a saddle point (a …
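
Goldman's 1957 result referenced here states that an m x n game with iid continuous payoffs has a saddle point with probability m!n!/(m+n-1)!. A quick Monte Carlo sketch (my own illustration, not material from the dissertation) checks the 2 x 2 case, where the value is 2/3:

```python
# Monte Carlo check of Goldman's saddle-point probability
# m! n! / (m+n-1)! for an m x n game with iid Uniform(0,1) payoffs.
import numpy as np
from math import factorial

def has_saddle_point(A):
    # A saddle point is an entry that is the minimum of its row and
    # the maximum of its column.
    row_min = A.min(axis=1, keepdims=True)
    col_max = A.max(axis=0, keepdims=True)
    return bool(((A == row_min) & (A == col_max)).any())

rng = np.random.default_rng(1)
m, n, reps = 2, 2, 100_000
hits = sum(has_saddle_point(rng.random((m, n))) for _ in range(reps))
print("simulated :", hits / reps)                       # ~ 0.667
print("Goldman   :", factorial(m) * factorial(n) / factorial(m + n - 1))
```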

Contributors
Manley, Michael, Kadell, Kevin W. J., Kao, Ming-Hung, et al.
Created Date
2011

Although the issue of factorial invariance has received increasing attention in the literature, the focus is typically on differences in factor structure across groups that are directly observed, such as those denoted by sex or ethnicity. While establishing factorial invariance across observed groups is a requisite step in making meaningful cross-group comparisons, failure to attend to possible sources of latent class heterogeneity in the form of class-based differences in factor structure has the potential to compromise conclusions with respect to observed groups and may result in misguided attempts at instrument development and theory refinement. The present studies examined the sensitivity …

Contributors
Blackwell, Kimberly Carol, Millsap, Roger E, Aiken, Leona S, et al.
Created Date
2011

Designing studies that use latent growth modeling to investigate change over time calls for optimal approaches for conducting power analysis for a priori determination of required sample size. This investigation (1) studied the impacts of variations in specified parameters, design features, and model misspecification in simulation-based power analyses and (2) compared power estimates across three common power analysis techniques: the Monte Carlo method; the Satorra-Saris method; and the method developed by MacCallum, Browne, and Cai (MBC). Choice of sample size, effect size, and slope variance parameters markedly influenced power estimates; however, level-1 error variance and number of repeated measures (3 …
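
The Monte Carlo method compared here estimates power as the rejection rate across simulated data sets. The sketch below illustrates the idea on a deliberately simplified growth setting (per-subject OLS slopes and a one-sample t-test rather than a full latent growth model); all parameter values are illustrative assumptions.

```python
# Simulation-based power analysis, simplified: power for detecting a
# nonzero mean slope across subjects measured at several waves.
import numpy as np
from scipy import stats

def power(n_subjects, n_waves=4, mean_slope=0.2, slope_sd=0.5,
          error_sd=1.0, reps=2000, alpha=0.05, seed=0):
    rng = np.random.default_rng(seed)
    t = np.arange(n_waves)
    rejections = 0
    for _ in range(reps):
        slopes = rng.normal(mean_slope, slope_sd, n_subjects)
        y = slopes[:, None] * t + rng.normal(0, error_sd,
                                             (n_subjects, n_waves))
        # per-subject OLS slope, then a one-sample t-test on the slopes
        b = ((t - t.mean()) * (y - y.mean(axis=1, keepdims=True))).sum(1) \
            / ((t - t.mean()) ** 2).sum()
        rejections += stats.ttest_1samp(b, 0.0).pvalue < alpha
    return rejections / reps

print(power(n_subjects=100))   # estimated power at this sample size
```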

Contributors
Van Vleet, Bethany L., Thompson, Marilyn S., Green, Samuel B., et al.
Created Date
2011

It is common in the analysis of data to provide a goodness-of-fit test to assess the performance of a model. In the analysis of contingency tables, goodness-of-fit statistics are frequently employed when modeling social science, educational or psychological data where the interest is often directed at investigating the association among multi-categorical variables. Pearson's chi-squared statistic is well-known in goodness-of-fit testing, but it is sometimes considered to produce an omnibus test as it gives little guidance to the source of poor fit once the null hypothesis is rejected. However, its components can provide powerful directional tests. In this dissertation, orthogonal components …
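
For a one-way table, Pearson's X² equals the sum of squared standardized residuals, and any orthonormal set of contrasts orthogonal to sqrt(p) splits it into asymptotically independent one-degree-of-freedom components. The sketch below builds such a split generically via a QR factorization; it illustrates the general idea, not the particular components developed in the dissertation.

```python
# Decomposing Pearson's chi-squared into orthogonal components for a
# one-way multinomial table.
import numpy as np
from scipy import stats

observed = np.array([18, 25, 30, 27])        # illustrative counts
p0 = np.full(4, 0.25)                        # hypothesized probabilities
expected = observed.sum() * p0
z = (observed - expected) / np.sqrt(expected)   # standardized residuals
X2 = (z ** 2).sum()                             # Pearson's statistic

# Complete sqrt(p0) to an orthonormal basis; the remaining rows act as
# orthogonal contrasts of the residuals.
k = len(p0)
Q, _ = np.linalg.qr(np.column_stack([np.sqrt(p0), np.eye(k)[:, 1:]]))
components = Q[:, 1:].T @ z       # each asymptotically N(0,1) under H0

print("X2 =", X2, "= sum of squared components =",
      (components ** 2).sum())
print("directional p-values:", 2 * stats.norm.sf(np.abs(components)))
```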

Contributors
Milovanovic, Jelena, Young, Dennis, Reiser, Mark, et al.
Created Date
2011

Real-world environments are characterized by non-stationary and continuously evolving data. Learning a classification model on this data would require a framework that is able to adapt itself to newer circumstances. Under such circumstances, transfer learning has come to be a dependable methodology for improving classification performance with reduced training costs and without the need for explicit relearning from scratch. In this thesis, a novel instance transfer technique that adapts a "Cost-sensitive" variation of AdaBoost is presented. The method capitalizes on the theoretical and functional properties of AdaBoost to selectively reuse outdated training instances obtained from a "source" domain to effectively …

Contributors
Venkatesan, Ashok, Panchanathan, Sethuraman, Li, Baoxin, et al.
Created Date
2011

Sparse learning is a technique in machine learning for feature selection and dimensionality reduction, to find a sparse set of the most relevant features. In any machine learning problem, there is a considerable amount of irrelevant information, and separating relevant information from the irrelevant information has been a topic of focus. In supervised learning such as regression, the data consist of many features and only a subset of the features may be responsible for the result. Also, the features might be subject to special structural requirements, which introduces additional complexity for feature selection. The sparse learning package provides a set of algorithms for …
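
The abstract refers to a dedicated sparse learning package with structured-sparsity algorithms; as a generic stand-in, the lasso below shows the basic idea of recovering a sparse set of relevant features. The data are synthetic.

```python
# Sparse feature selection with the lasso: only 3 of 50 features are
# truly relevant, and the fitted model should zero out most of the rest.
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))
beta = np.zeros(50)
beta[[3, 17, 42]] = [2.0, -1.5, 1.0]          # the sparse ground truth
y = X @ beta + rng.normal(scale=0.5, size=200)

model = LassoCV(cv=5).fit(X, y)               # penalty chosen by CV
print("selected features:", np.flatnonzero(model.coef_))
```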

Contributors
Thulasiram, Ramesh L., Ye, Jieping, Xue, Guoliang, et al.
Created Date
2011

Value-added models (VAMs) are used by many states to assess contributions of individual teachers and schools to students' academic growth. The generalized persistence VAM, one of the most flexible in the literature, estimates the "value added" by individual teachers to their students' current and future test scores by employing a mixed model with a longitudinal database of test scores. There is concern, however, that missing values that are common in the longitudinal student scores can bias value-added assessments, especially when the models serve as a basis for personnel decisions -- such as promoting or dismissing teachers -- as they are …

Contributors
Karl, Andrew Thomas, Lohr, Sharon L, Yang, Yan, et al.
Created Date
2012

This dissertation presents methods for addressing research problems that currently can only adequately be solved using Quality Reliability Engineering (QRE) approaches, especially accelerated life testing (ALT) of electronic printed wiring boards, with applications to avionics circuit boards. The methods presented in this research are generally applicable to circuit boards, but the data generated and their analysis are for high-performance avionics. Aircraft equipment manufacturers typically require a 20-year expected life for avionics equipment, and therefore ALT is the only practical way of obtaining life test estimates. Both thermal and vibration ALT are performed, and the induced failures are analyzed to resolve industry …
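
For the thermal portion of such testing, the Arrhenius model is the standard way to translate life at an elevated stress temperature to life at use conditions. A sketch of the usual acceleration factor follows; the activation energy is an illustrative value, not one taken from this work.

```python
# Arrhenius acceleration factor for thermal ALT: ratio of expected
# life at the use temperature to life at the stress temperature.
import math

BOLTZMANN_EV = 8.617e-5                  # eV per kelvin

def arrhenius_af(t_use_c, t_stress_c, ea_ev=0.7):
    # ea_ev (activation energy) is an assumed illustrative value
    t_use = t_use_c + 273.15             # Celsius -> kelvin
    t_stress = t_stress_c + 273.15
    return math.exp((ea_ev / BOLTZMANN_EV) * (1 / t_use - 1 / t_stress))

print(arrhenius_af(55, 125))  # e.g., testing at 125 C for use at 55 C
```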

Contributors
Juarez, Joseph Moses, Montgomery, Douglas C., Borror, Connie M., et al.
Created Date
2012

Coarsely grouped counts or frequencies are commonly used in the behavioral sciences. Grouped count and grouped frequency (GCGF) variables that are used as outcomes often violate the assumptions of linear regression as well as of models designed for categorical outcomes; there is no analytic model that is designed specifically to accommodate GCGF outcomes. The purpose of this dissertation was to compare the statistical performance of four regression models (linear regression, Poisson regression, ordinal logistic regression, and beta regression) that can be used when the outcome is a GCGF variable. A simulation study was used to determine the power, type I error, …
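
Two of the four candidate models can be contrasted in a few lines: below, a grouped-count outcome is simulated and fitted with linear regression and with Poisson regression. The grouping scheme is an illustrative assumption, not the one used in the dissertation.

```python
# Linear vs. Poisson regression on a coarsely grouped count outcome.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
x = rng.normal(size=500)
counts = rng.poisson(np.exp(0.5 + 0.4 * x))   # underlying count process
# coarse grouping: 0, 1-2, 3-5, 6+ recorded as categories 0..3
y = np.digitize(counts, [1, 3, 6])

X = sm.add_constant(x)
ols = sm.OLS(y, X).fit()
poisson = sm.GLM(y, X, family=sm.families.Poisson()).fit()
print("OLS slope    :", ols.params[1])
print("Poisson slope:", poisson.params[1])    # on the log-mean scale
```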

Contributors
Coxe, Stefany Jean, Aiken, Leona S, West, Stephen G, et al.
Created Date
2012

The purpose of this study was to examine under which conditions "good" data characteristics can compensate for "poor" characteristics in Latent Class Analysis (LCA), as well as to set forth guidelines regarding the minimum sample size and the ideal number and quality of indicators. In particular, we studied the extent to which including a larger number of high-quality indicators can compensate for a small sample size in LCA. The results suggest that in general, larger sample size, more indicators, higher quality of indicators, and a larger covariate effect correspond to more converged and proper replications, as well as fewer boundary estimates …

Contributors
Wurpts, Ingrid Carlson, Geiser, Christian, Aiken, Leona, et al.
Created Date
2012

The living world we inhabit and observe is extraordinarily complex. From the perspective of a person analyzing data about the living world, complexity is most commonly encountered in two forms: 1) in the sheer size of the datasets that must be analyzed and the physical number of mathematical computations necessary to obtain an answer and 2) in the underlying structure of the data, which does not conform to classical normal theory statistical assumptions and includes clustering and unobserved latent constructs. Until recently, the methods and tools necessary to effectively address the complexity of biomedical data were not ordinarily available. The …

Contributors
Brown, Justin Reed, Dinu, Valentin, Johnson, William, et al.
Created Date
2012

Photovoltaic (PV) modules are typically rated at three test conditions: STC (standard test conditions), NOCT (nominal operating cell temperature) and Low E (low irradiance). The current thesis deals with the power rating of PV modules at twenty-three test conditions as per the recent International Electrotechnical Commission (IEC) standard IEC 61853-1. In the current research, an automation software tool developed by a previous researcher of ASU-PRL (ASU Photovoltaic Reliability Laboratory) is validated at various stages. Also in the current research, the power rating of PV modules for four different manufacturers is carried out according to IEC …

Contributors
Vemula, Meena Gupta, Tamizhmani, Govindasamy, Macia, Narcio F., et al.
Created Date
2012

This thesis examines the application of statistical signal processing approaches to data arising from surveys intended to measure psychological and sociological phenomena underpinning human social dynamics. The use of signal processing methods for analysis of signals arising from measurement of social, biological, and other non-traditional phenomena has been an important and growing area of signal processing research over the past decade. Here, we explore the application of statistical modeling and signal processing concepts to data obtained from the Global Group Relations Project, specifically to understand and quantify the effects and interactions of social psychological factors related to intergroup conflicts. We …

Contributors
Liu, Hui, Taylor, Thomas, Cochran, Douglas, et al.
Created Date
2012

A least total area of triangle method was proposed by Teissier (1948) for fitting a straight line to data from a pair of variables without treating either variable as the dependent variable while allowing each of the variables to have measurement errors. This method is commonly called Reduced Major Axis (RMA) regression and is often used instead of Ordinary Least Squares (OLS) regression. Results for confidence intervals, hypothesis testing and asymptotic distributions of coefficient estimates in the bivariate case are reviewed. A generalization of RMA to more than two variables for fitting a plane to data is obtained by minimizing …
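
In the bivariate case the RMA fit has a closed form: the slope is sign(r) times the ratio of standard deviations, which treats the two variables symmetrically. A short sketch comparing RMA with OLS when both variables carry measurement error:

```python
# Bivariate reduced major axis (RMA) regression vs. OLS.
import numpy as np

rng = np.random.default_rng(0)
latent = rng.normal(size=300)
x = latent + rng.normal(scale=0.5, size=300)   # both variables noisy
y = 2.0 * latent + rng.normal(scale=0.5, size=300)

r = np.corrcoef(x, y)[0, 1]
b_rma = np.sign(r) * y.std(ddof=1) / x.std(ddof=1)
b_ols = r * y.std(ddof=1) / x.std(ddof=1)      # attenuated by error in x
print("RMA slope:", b_rma, "  OLS slope:", b_ols)
```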

Contributors
Li, Jingjin, Young, Dennis, Eubank, Randall, et al.
Created Date
2012

When analyzing longitudinal data it is essential to account both for the correlation inherent from the repeated measures of the responses as well as the correlation realized on account of the feedback created between the responses at a particular time and the predictors at other times. A generalized method of moments (GMM) for estimating the coefficients in longitudinal data is presented. The appropriate and valid estimating equations associated with the time-dependent covariates are identified, thus providing substantial gains in efficiency over generalized estimating equations (GEE) with the independent working correlation. Identifying the estimating equations for computation is of utmost importance. …

Contributors
Yin, Jianqiong, Wilson, Jeffrey, Reiser, Mark, et al.
Created Date
2012

This dissertation involves three problems that are all related by the use of the singular value decomposition (SVD) or generalized singular value decomposition (GSVD). The specific problems are (i) derivation of a generalized singular value expansion (GSVE), (ii) analysis of the properties of the chi-squared method for regularization parameter selection in the case of nonnormal data and (iii) formulation of a partial canonical correlation concept for continuous time stochastic processes. The finite dimensional SVD has an infinite dimensional generalization to compact operators. However, the form of the finite dimensional GSVD developed in, e.g., Van Loan does not extend directly to …

Contributors
Huang, Qing, Eubank, Randall, Renaut, Rosemary, et al.
Created Date
2012

Temporal data are increasingly prevalent and important in analytics. Time series (TS) data are chronological sequences of observations and an important class of temporal data. Fields such as medicine, finance, learning science and multimedia naturally generate TS data. Each series provides a high-dimensional data vector that challenges the learning of the relevant patterns. This dissertation proposes TS representations and methods for supervised TS analysis. The approaches combine new representations that handle translations and dilations of patterns with bag-of-features strategies and tree-based ensemble learning. This provides flexibility in handling time-warped patterns in a computationally efficient way. The ensemble learners provide a …
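
A stripped-down version of the bag-of-features idea is sketched below: each series is summarized by per-interval mean, spread, and slope, and a random forest is trained on those features. This is a generic simplification for illustration, not the representation proposed in the dissertation.

```python
# Interval features + tree ensemble for time series classification.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

def make_series(label, length=96):
    # two classes differing in the location of a bump (illustrative)
    t = np.arange(length)
    center = 30 if label == 0 else 60
    return np.exp(-0.5 * ((t - center) / 5) ** 2) \
        + rng.normal(scale=0.3, size=length)

def interval_features(x, n_intervals=8):
    feats = []
    for seg in np.array_split(x, n_intervals):
        slope = np.polyfit(np.arange(len(seg)), seg, 1)[0]
        feats += [seg.mean(), seg.std(), slope]
    return feats

labels = rng.integers(0, 2, 200)
X = np.array([interval_features(make_series(c)) for c in labels])
clf = RandomForestClassifier(n_estimators=200, oob_score=True,
                             random_state=0).fit(X, labels)
print("OOB accuracy:", clf.oob_score_)
```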

Contributors
Baydogan, Mustafa Gokce, Runger, George C, Atkinson, Robert, et al.
Created Date
2012

Estimating cointegrating relationships requires specific techniques. Canonical correlations are used to determine the rank and space of the cointegrating matrix. The vectors used to transform the data into canonical variables have an eigenvector representation, and the associated canonical correlations have an eigenvalue representation. The number of cointegrating relations is chosen based upon a theoretical difference in the convergence rates of the eigenvalues. The number of cointegrating relations is consistently estimated using a threshold function which places a lower bound on the eigenvalues associated with cointegrating relations and an upper bound on the eigenvalues not associated with cointegrating …
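
As a point of reference for this eigenvalue-based rank selection, the Johansen procedure in statsmodels computes the same kind of eigenvalues and the associated trace statistics (the dissertation's threshold function is an alternative to comparing these against critical values). A minimal sketch on simulated cointegrated series:

```python
# Cointegrating-rank selection via eigenvalues (Johansen procedure).
import numpy as np
from statsmodels.tsa.vector_ar.vecm import coint_johansen

rng = np.random.default_rng(0)
trend = np.cumsum(rng.normal(size=500))       # one shared random walk
y1 = trend + rng.normal(size=500)
y2 = 0.5 * trend + rng.normal(size=500)       # y1, y2 cointegrated
data = np.column_stack([y1, y2])

res = coint_johansen(data, det_order=0, k_ar_diff=1)
print("eigenvalues:", res.eig)
print("trace stats:", res.lr1)   # compare against res.cvt critical values
```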

Contributors
Nowak, Adam Daniel, Ahn, Seung C, Liu, Crocker, et al.
Created Date
2012

With the increase in computing power and availability of data, there has never been a greater need to understand data and make decisions from it. Traditional statistical techniques may not be adequate to handle the size of today's data or the complexities of the information hidden within the data. Thus knowledge discovery by machine learning techniques is necessary if we want to better understand information from data. In this dissertation, we explore the topics of asymmetric loss and asymmetric data in machine learning and propose new algorithms as solutions to some of the problems in these topics. We also studied …

Contributors
Koh, Derek, Runger, George, Wu, Tong, et al.
Created Date
2013

The use of bias indicators in psychological measurement has been contentious, with some researchers questioning whether they actually suppress or moderate the ability of substantive psychological indicators to discriminate (McGrath, Mitchell, Kim, & Hough, 2010). Bias indicators on the MMPI-2-RF (F-r, Fs, FBS-r, K-r, and L-r) were tested for suppression or moderation of the ability of the RC1 and NUC scales to discriminate between Epileptic Seizures (ES) and Non-epileptic Seizures (NES, a conversion disorder that is often misdiagnosed as ES). RC1 and NUC had previously been found to be the best scales on the MMPI-2-RF to differentiate between ES and …

Contributors
Wershba, Rebecca Eve, Lanyon, Richard I, Barrera, Manuel, et al.
Created Date
2013

Statistical process control (SPC) and predictive analytics have been used in industrial manufacturing and design, but up until now have not been applied to threshold data of vital sign monitoring in remote care settings. In this study of 20 elders with COPD and/or CHF, extended months of peak flow monitoring (FEV1) using telemedicine are examined to determine when an earlier or later clinical intervention may have been advised. This study demonstrated that SPC may bring less than a 2.0% increase in clinician workload while providing more robust statistically-derived thresholds than clinician-derived thresholds. Using a random K-fold model, FEV1 output was …
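
One standard way to derive such statistical thresholds is a Shewhart individuals chart, whose limits come from the average moving range (the factor 2.66 is 3/d2 with d2 = 1.128). The readings below are simulated, not study data, and whether this matches the exact SPC scheme used in the study is an assumption.

```python
# Individuals (I-MR) control-chart thresholds for a monitored vital sign.
import numpy as np

rng = np.random.default_rng(0)
fev1 = rng.normal(loc=2.1, scale=0.15, size=60)   # liters, illustrative

mr_bar = np.abs(np.diff(fev1)).mean()             # average moving range
center = fev1.mean()
lcl, ucl = center - 2.66 * mr_bar, center + 2.66 * mr_bar
print(f"thresholds: ({lcl:.2f}, {ucl:.2f}) L")
print("out-of-control readings:",
      np.flatnonzero((fev1 < lcl) | (fev1 > ucl)))
```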

Contributors
Fralick, Celeste Rachelle, Muthuswamy, Jitendran, O'Shea, Terrance, et al.
Created Date
2013

Parallel Monte Carlo applications require the pseudorandom numbers used on each processor to be independent in a probabilistic sense. The TestU01 software package is the standard testing suite for detecting stream dependence and other properties that make certain pseudorandom generators ineffective in parallel (as well as serial) settings. TestU01 employs two basic schemes for testing parallel generated streams. The first applies serial tests to the individual streams and then tests the resulting P-values for uniformity. The second turns all the parallel generated streams into one long vector and then applies serial tests to the resulting concatenated stream. Various forms of …
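
The first scheme can be mimicked in a few lines: run a serial uniformity test on each stream, then test the resulting p-values for uniformity, since under independence they should themselves be Uniform(0,1). The chi-squared test below is a stand-in for TestU01's far larger battery.

```python
# Two-stage testing of parallel streams: per-stream serial test,
# then a uniformity test on the collected p-values.
import numpy as np
from scipy import stats

streams = [np.random.default_rng(seed).random(10_000)
           for seed in range(64)]                 # 64 parallel streams

def serial_pvalue(u, bins=20):
    observed, _ = np.histogram(u, bins=bins, range=(0, 1))
    return stats.chisquare(observed).pvalue      # equal expected counts

pvals = np.array([serial_pvalue(u) for u in streams])
print("KS test that per-stream p-values are uniform:",
      stats.kstest(pvals, "uniform").pvalue)
```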

Contributors
Ismay, Chester Ivan, Eubank, Randall, Young, Dennis, et al.
Created Date
2013

Dimensionality assessment is an important component of evaluating item response data. Existing approaches to evaluating common assumptions of unidimensionality, such as DIMTEST (Nandakumar & Stout, 1993; Stout, 1987; Stout, Froelich, & Gao, 2001), have been shown to work well under large-scale assessment conditions (e.g., large sample sizes and item pools; see e.g., Froelich & Habing, 2007). It remains to be seen how such procedures perform in the context of small-scale assessments characterized by relatively small sample sizes and/or short tests. The fact that some procedures come with minimum allowable values for characteristics of the data, such as the number of …

Contributors
Reichenberg, Ray E., Levy, Roy, Thompson, Marilyn S., et al.
Created Date
2013

This work presents two complementary studies that propose heuristic methods to capture characteristics of data using the ensemble learning method of random forest. The first study is motivated by the problem in education of determining teacher effectiveness in student achievement. Value-added models (VAMs), constructed as linear mixed models, use students’ test scores as outcome variables and teachers’ contributions as random effects to ascribe changes in student performance to the teachers who have taught them. The VAM teacher score is the empirical best linear unbiased predictor (EBLUP). This approach is limited by the adequacy of the assumed model specification with respect …

Contributors
Valdivia, Arturo, Eubank, Randall, Young, Dennis, et al.
Created Date
2013

Many longitudinal studies, especially in clinical trials, suffer from missing data issues. Most estimation procedures assume that the missing values are ignorable or missing at random (MAR). However, this assumption leads to unrealistic simplification and is implausible for many cases. For example, an investigator is examining the effect of treatment on depression. Subjects are scheduled with doctors on a regular basis and asked questions about recent emotional situations. Patients who are experiencing severe depression are more likely to miss an appointment and leave the data missing for that particular visit. Data that are not missing at random may produce bias …

Contributors
Zhang, Jun, Reiser, Mark, Barber, Jarrett, et al.
Created Date
2013

Product reliability is now a top concern of manufacturers, and customers prefer products that perform well over long periods. Because most products can last years or even decades, accelerated life testing (ALT) is used to estimate product lifetime. Much research has been done in the ALT area, and optimal design for ALT is a major topic. This dissertation consists of three main studies. First, a methodology for finding optimal designs for ALT with right censoring and interval censoring has been developed; it employs the proportional hazards (PH) model and generalized …

Contributors
Yang, Tao, Pan, Rong, Montgomery, Douglas, et al.
Created Date
2013

Statistics is taught at every level of education, yet teachers often have to assume their students have no knowledge of statistics and start from scratch each time they set out to teach statistics. The motivation for this experimental study comes from interest in exploring educational applications of augmented reality (AR) delivered via mobile technology that could potentially provide rich, contextualized learning for understanding concepts related to statistics education. This study examined the effects of AR experiences for learning basic statistical concepts. Using a 3 x 2 research design, this study compared learning gains of 252 undergraduate and graduate students from …

Contributors
Conley, Quincy, Atkinson, Robert K, Nguyen, Frank, et al.
Created Date
2013

This simulation study compared the utility of various discrepancy measures within a posterior predictive model checking (PPMC) framework for detecting different types of data-model misfit in multidimensional Bayesian network (BN) models. The investigated conditions were motivated by an applied research program utilizing an operational complex performance assessment within a digital-simulation educational context grounded in theories of cognition and learning. BN models were manipulated along two factors: latent variable dependency structure and number of latent classes. Distributions of posterior predicted p-values (PPP-values) served as the primary outcome measure and were summarized in graphical presentations, by median values across replications, and by …
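
The PPP-value machinery itself is compact: draw from the posterior, generate replicated data, and compare a discrepancy on replicated versus observed data. The beta-binomial sketch below shows the computation; the discrepancy is a deliberately trivial placeholder for the richer measures compared in the study.

```python
# Posterior predictive p-value (PPP) for a simple beta-binomial model.
import numpy as np

rng = np.random.default_rng(0)
y_obs = rng.binomial(1, 0.7, size=50)          # observed binary data

# posterior of the success probability under a Beta(1,1) prior
post = rng.beta(1 + y_obs.sum(), 1 + len(y_obs) - y_obs.sum(), size=4000)

def discrepancy(y):
    return y.sum()                              # placeholder measure

d_obs = discrepancy(y_obs)
d_rep = np.array([discrepancy(rng.binomial(1, p, size=len(y_obs)))
                  for p in post])
print("PPP-value:", (d_rep >= d_obs).mean())   # ~0.5 suggests adequate fit
```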

Contributors
Crawford, Aaron Vaughn, Levy, Roy, Green, Samuel, et al.
Created Date
2014

Extraordinary medical advances have led to significant reductions in the burden of infectious diseases in humans. However, infectious diseases still account for more than 13 million annual deaths. This large burden is partly due to some pathogens having found suitable conditions to emerge and spread in denser and more connected host populations, and others having evolved to escape the pressures imposed by the rampant use of antimicrobials. It is then critical to improve our understanding of how diseases spread in these modern landscapes, characterized by new host population structures and socio-economic environments, as well as containment measures such as the …

Contributors
Patterson-Lomba, Oscar, Castillo-Chavez, Carlos, Towers, Sherry, et al.
Created Date
2014

In the field of infectious disease epidemiology, the assessment of model robustness outcomes plays a significant role in the identification, reformulation, and evaluation of preparedness strategies aimed at limiting the impact of catastrophic events (pandemics or the deliberate release of biological agents), in the management of disease prevention strategies, and in the identification and evaluation of control or mitigation measures. The research work in this dissertation focuses on: the comparison and assessment of the role of exponentially distributed waiting times versus the use of generalized non-exponential parametric distributed waiting times of infectious periods on the quantitative and …

Contributors
Morale Butler, Emmanuel Jesús, Castillo-Chavez, Carlos, Aparicio, Juan P, et al.
Created Date
2014

The objective of this thesis is to investigate the various types of energy end-uses to be expected in future high-efficiency single-family residences. For this purpose, this study has analyzed monitored data from 14 houses in the 2013 Solar Decathlon competition, and segregates the energy consumption patterns into various residential end-uses (such as lights, refrigerators, washing machines, ...). The analysis was not straightforward since these homes were operated according to schedules previously determined by the contest rules. The analysis approach allowed the isolation of the comfort energy use by the heating, ventilation, and cooling (HVAC) systems. HVAC are the …

Contributors
Garkhail, Rahul, Reddy, T Agami, Bryan, Harvey, et al.
Created Date
2014

Urban scaling analysis has introduced a new scientific paradigm to the study of cities. With it, the notions of size, heterogeneity and structure have taken a leading role. These notions are assumed to be behind the causes for why cities differ from one another, sometimes wildly. However, the mechanisms by which size, heterogeneity and structure shape the general statistical patterns that describe urban economic output are still unclear. Given the rapid rate of urbanization around the globe, we need precise and formal mathematical understandings of these matters. In this context, I perform in this dissertation probabilistic, distributional and computational explorations …

Contributors
Gomez-Lievano, Andres, Lobo, José, Muneepeerakul, Rachata, et al.
Created Date
2014

Obtaining high-quality experimental designs to optimize statistical efficiency and data quality is quite challenging for functional magnetic resonance imaging (fMRI). The primary fMRI design issue is on the selection of the best sequence of stimuli based on a statistically meaningful optimality criterion. Some previous studies have provided some guidance and powerful computational tools for obtaining good fMRI designs. However, these results are mainly for basic experimental settings with simple statistical models. In this work, a type of modern fMRI experiments is considered, in which the design matrix of the statistical model depends not only on the selected design, but also …

Contributors
Zhou, Lin, Kao, Ming-hung, Reiser, Mark, et al.
Created Date
2014

Technological advances have enabled the generation and collection of various data from complex systems, thus, creating ample opportunity to integrate knowledge in many decision making applications. This dissertation introduces holistic learning as the integration of a comprehensive set of relationships that are used towards the learning objective. The holistic view of the problem allows for richer learning from data and, thereby, improves decision making. The first topic of this dissertation is the prediction of several target attributes using a common set of predictor attributes. In a holistic learning approach, the relationships between target attributes are embedded into the learning algorithm …

Contributors
Azarnoush, Bahareh, Runger, George C, Bekki, Jennifer, et al.
Created Date
2014

This is a two-part thesis: Part 1 determines the most dominant failure modes of field-aged photovoltaic (PV) modules using experimental data, statistical analysis, and FMECA (Failure Mode, Effects, and Criticality Analysis). The failure and degradation modes of about 5900 crystalline-Si glass/polymer modules fielded for 6 to 16 years in three different photovoltaic (PV) power plants with different mounting systems under the hot-dry desert climate of Arizona are evaluated. A statistical reliability tool, FMECA, that uses the Risk Priority Number (RPN) is performed for each PV power plant to determine the dominant failure modes in the modules …

Contributors
Shrestha, Sanjay Mohan, Tamizhmani, Govindasamy, Srinivasan, Devarajan, et al.
Created Date
2014

This thesis presents a meta-analysis of lead-free solder reliability. The qualitative analyses of the failure modes of lead-free solder under different stress tests, including drop test, bend test, thermal test and vibration test, are discussed. The main cause of failure of lead-free solder is fatigue crack, and the speed of propagation of the initial crack can differ under different test conditions and for different solder materials. A quantitative analysis of the fatigue behavior of SAC lead-free solder under a thermal preconditioning process is conducted. This thesis presents a method for predicting the failure life of solder alloy by building …

Contributors
Xu, Xinyue, Pan, Rong, Montgomery, Douglas, et al.
Created Date
2014

The main objective of this research is to develop an approach to PV module lifetime prediction. In doing so, the aim is to move from empirical generalizations to a formal predictive science based on data-driven case studies of crystalline silicon PV systems. The evaluation of PV systems aged 5 to 30 years results in a systematic predictive capability that is absent today. The warranty period provided by manufacturers typically ranges from 20 to 25 years for crystalline silicon modules. The end of lifetime (for example, the time to degrade by 20% from rated power) of PV modules is usually …

Contributors
Kuitche, Joseph Mathurin, Pan, Rong, TamizhMani, Govindasamy, et al.
Created Date
2014

In this era of fast computational machines and new optimization algorithms, there have been great advances in experimental design. We focus our research on design issues in generalized linear models (GLMs) and functional magnetic resonance imaging (fMRI). The first part of our research tackles the challenging problem of constructing exact designs for GLMs that are robust against parameter, link and model uncertainties, by improving an existing algorithm and providing a new one based on continuous particle swarm optimization (PSO) and spectral clustering. The proposed algorithm is sufficiently versatile to accommodate most popular design selection criteria, and we …

Contributors
Temkit, M'Hamed, Kao, Jason, Reiser, Mark, et al.
Created Date
2014

Smoking remains the leading cause of preventable death in the United States, and early initiation is associated with greater difficulty quitting. Among adolescent smokers, those with attention-deficit hyperactivity disorder (ADHD), characterized by difficulties associated with impulsivity, hyperactivity, and inattention, smoke at nearly twice the rate of their peers. Although cigarette smoking is highly addictive, nicotine is a relatively weak primary reinforcer, spurring research on other potential targets that may maintain smoking, including the potential benefits of nicotine on attention, inhibition, and reinforcer efficacy. The present study employs the most prevalent rodent model of ADHD, the spontaneously hypertensive rat (SHR) and …

Contributors
Mazur, Gabriel Joseph, Sanabria, Federico, Killeen, Peter R, et al.
Created Date
2014

Many methodological approaches have been utilized to predict student retention and persistence over the years, yet few have utilized a Bayesian framework. It is believed this is due in part to the absence of an established process for guiding educational researchers reared in a frequentist perspective into the realms of Bayesian analysis and educational data mining. The current study aimed to address this by providing a model-building process for developing a Bayesian network (BN) that leveraged educational data mining, Bayesian analysis, and traditional iterative model-building techniques in order to predict whether community college students will stop out at the completion …

Contributors
Arcuria, Phil, Levy, Roy, Green, Samuel B, et al.
Created Date
2015

Missing data are common in psychology research and can lead to bias and reduced power if not properly handled. Multiple imputation is a state-of-the-art missing data method recommended by methodologists. Multiple imputation methods can generally be divided into two broad categories: joint model (JM) imputation and fully conditional specification (FCS) imputation. JM draws missing values simultaneously for all incomplete variables using a multivariate distribution (e.g., multivariate normal). FCS, on the other hand, imputes variables one at a time, drawing missing values from a series of univariate distributions. In the single-level context, these two approaches have been shown to be equivalent …
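
In the single-level case, FCS is easy to demonstrate: scikit-learn's IterativeImputer cycles through the incomplete variables, imputing each from a univariate conditional model. This is a generic single-level illustration; the multilevel comparisons in the dissertation are beyond it.

```python
# Fully conditional specification (FCS) imputation, single-level data.
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

rng = np.random.default_rng(0)
cov = [[1.0, 0.5, 0.3], [0.5, 1.0, 0.4], [0.3, 0.4, 1.0]]
X = rng.multivariate_normal([0, 0, 0], cov, size=300)
X_missing = np.where(rng.random(X.shape) < 0.15, np.nan, X)  # 15% MCAR

imputer = IterativeImputer(sample_posterior=True, random_state=0)
X_completed = imputer.fit_transform(X_missing)
print("remaining missing values:", np.isnan(X_completed).sum())
```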

Contributors
Mistler, Stephen Andrew, Enders, Craig K, Aiken, Leona, et al.
Created Date
2015

Methods to test hypotheses of mediated effects in the pretest-posttest control group design are understudied in the behavioral sciences (MacKinnon, 2008). Because many studies aim to answer questions about mediating processes in the pretest-posttest control group design, there is a need to determine which model is most appropriate to test hypotheses about mediating processes and what happens to estimates of the mediated effect when model assumptions are violated in this design. The goal of this project was to outline estimator characteristics of four longitudinal mediation models and the cross-sectional mediation model. Models were compared on Type I error rates, statistical …

Contributors
Valente, Matthew, MacKinnon, David, West, Stephen, et al.
Created Date
2015

Complex systems are pervasive in science and engineering. Some examples include complex engineered networks such as the internet, the power grid, and transportation networks. The complexity of such systems arises not just from their size, but also from their structure, operation (including control and management), evolution over time, and that people are involved in their design and operation. Our understanding of such systems is limited because their behaviour cannot be characterized using traditional techniques of modelling and analysis. As a step in model development, statistically designed screening experiments may be used to identify the main effects and interactions most significant …

Contributors
Aldaco-Gastelum, Abraham Netzahualcoyotl, Syrotiuk, Violet R., Colbourn, Charles J., et al.
Created Date
2015

Tracking targets in the presence of clutter is inevitable, and presents many challenges. Additionally, rapid, drastic changes in clutter density between different environments or scenarios can make it even more difficult for tracking algorithms to adapt. A novel approach to target tracking in such dynamic clutter environments is proposed using a particle filter (PF) integrated with Interacting Multiple Models (IMMs) to compensate and adapt to the transition between different clutter densities. This model was implemented for the case of a monostatic sensor tracking a single target moving with constant velocity along a two-dimensional trajectory, which crossed between regions of drastically …

Contributors
Dutson, Karl J, Papandreou-Suppappola, Antonia, Kovvali, Narayan, et al.
Created Date
2015

This is a two-part thesis: Part 1 characterizes soiling losses using various techniques to understand the effect of soiling on photovoltaic modules. The higher the angle of incidence (AOI), the lower will be the photovoltaic (PV) module performance. Our research group has already reported the AOI investigation for cleaned modules of five different technologies with air/glass interface. However, the modules that are installed in the field would invariably develop a soil layer with varying thickness depending on the site condition, rainfall and tilt angle. The soiled module will have the air/soil/glass interface rather than air/glass interface. This study investigates the …

Contributors
Boppana, Sravanthi, Tamizhmani, Govindasamy, Srinivasan, Devarajan, et al.
Created Date
2015