Bibliographic details
Year of defense: 2015
Main author: França, Regina Luana Santos de
Advisor: Oliveira Júnior, Antônio Martins de
Defense committee: Not informed by the institution
Document type: Master's dissertation
Access type: Open access
Language: Portuguese
Defending institution: Not informed by the institution
Graduate program: Pós-Graduação em Engenharia Química (Graduate Program in Chemical Engineering)
Department: Not informed by the institution
Country: Not informed by the institution
Keywords in Portuguese:
CNPq knowledge area:
Access link: http://ri.ufs.br/jspui/handle/riufs/17086
Abstract:
Robust estimator functions belong to families that can mitigate the influence of errors present in measurements. The literature reports numerous theoretical works on the use of robust functions for data reconciliation, almost all directed at steady-state processes represented by linear systems; few in-depth studies address nonlinear steady-state processes or real data from industrial plants. In addition, issues related to (i) the choice of the function, (ii) the solution of the reconciliation problem, and (iii) the prediction capacity of robust functions in the presence of gross errors still represent challenges to be explored, and these motivated this study, which aimed to assess the suitability of several functions for solving robust data reconciliation problems in steady-state chemical processes represented by linear and nonlinear systems. Initially, the traditional robust functions Cauchy, Fair, Contaminated Normal, and Logistic were applied to the reconciliation problem, and their estimates were compared with those obtained with more recent functions, such as New Target and Alarm. The numerical method used was nonlinear programming, in particular Sequential Quadratic Programming (SQP), implemented in a computational environment. The performance criteria were the average relative error, the number of iterations of the objective function, and the fit to real data contaminated with errors. The results showed that the Weighted Least Squares function required a reduced number of iterations in almost all of the case studies performed. The Cauchy and Contaminated Normal functions also gave good results in terms of the number of iterations, including in the case study with real data; however, in one of the tested cases the Contaminated Normal function presented a convergence problem, and the Alarm function showed a convergence error in one of the estimated variables of the case study with real data. In nonlinear systems containing a single gross error, the New Target, Alarm, and Cauchy functions performed well, especially in terms of the average relative error, with the last requiring the fewest iterations. In the reconciliation of industrial data with systematic deviations applied to one variable to compare the efficiency of the functions, the Alarm and Contaminated Normal functions exhibited the best fits.
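To make the problem setting of the abstract concrete, the sketch below states robust steady-state data reconciliation as minimizing a robust loss on the standardized measurement adjustments subject to a process balance, here with a Cauchy estimator. It is a minimal illustration, not the dissertation's actual case studies: the three-stream mass balance, the measured values, the standard deviations, and the Cauchy tuning constant are all hypothetical, and SciPy's SLSQP routine stands in for the SQP solver mentioned in the abstract.

```python
# Minimal sketch of robust steady-state data reconciliation (illustrative only).
# Hypothetical setup: one splitter with balance x1 = x2 + x3; measured values,
# standard deviations, and the Cauchy tuning constant c are assumed, not taken
# from the dissertation.
import numpy as np
from scipy.optimize import minimize

y = np.array([100.0, 64.0, 45.0])   # measured flows (x1 carries a gross error)
sigma = np.array([2.0, 1.5, 1.5])    # measurement standard deviations
c = 2.385                            # Cauchy tuning constant (assumed value)

def cauchy_objective(x):
    """Sum of Cauchy robust losses, rho(r) = (c^2/2) ln(1 + (r/c)^2),
    evaluated on the standardized adjustments r = (y - x) / sigma."""
    r = (y - x) / sigma
    return np.sum((c**2 / 2.0) * np.log(1.0 + (r / c)**2))

# Steady-state mass-balance constraint: x1 - x2 - x3 = 0.
constraints = [{"type": "eq", "fun": lambda x: x[0] - x[1] - x[2]}]

# SLSQP is SciPy's sequential quadratic programming routine.
result = minimize(cauchy_objective, y, method="SLSQP", constraints=constraints)
print("reconciled flows:", result.x)
print("iterations:", result.nit)
```

Because the Cauchy loss grows much more slowly than the quadratic loss of Weighted Least Squares, the large residual on the stream with the gross error is downweighted and the remaining measurements are adjusted only slightly, which is the behavior the study evaluates through the average relative error and the iteration count.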