Bibliographic details
Year of defense: 2019
Main author: Kirchoff, Dionatrã Folle
Advisor: De Rose, Cesar Augusto Fonticielha
Examination committee: Not informed by the institution
Document type: Master's thesis (Dissertação)
Access type: Open access
Language: Portuguese (por)
Degree-granting institution: Pontifícia Universidade Católica do Rio Grande do Sul
Graduate program: Programa de Pós-Graduação em Ciência da Computação
Department: Escola Politécnica
Country: Brazil
Keywords (Portuguese):
Keywords (English):
CNPq knowledge area:
Access link: http://tede2.pucrs.br/tede2/handle/tede/8797
Abstract:
Cloud computing has transformed the means of resource provisioning in recent years, with several benefits over traditional systems, such as scalability and high availability. However, there are still opportunities for improvement, especially in proactive resource allocation and scaling. Since demand may fluctuate heavily in certain environments, over-provisioning is a common practice to avoid abrupt Quality of Service (QoS) drops that may result in Service Level Agreement (SLA) violations, but at the price of increased provisioning costs and energy consumption. Workload prediction is one of the strategies by which the efficiency and operational cost of a cloud can be improved. Knowing demand in advance allows the allocation of sufficient resources to maintain QoS and avoid SLA violations. This work presents the advantages and disadvantages of three workload prediction techniques usually applied in the context of cloud computing. We compare ARIMA, MLP and GRU under different configurations, and although all three strategies have similar accuracy results in this context, they present important differences in preparation and execution. This work helps system administrators choose the most appropriate and efficient predictive model for their specific problem.
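As an illustration of the kind of workload-prediction workflow the abstract describes, the sketch below fits an ARIMA model to a synthetic demand trace and forecasts a short horizon. The choice of library (statsmodels), the (p, d, q) order, the synthetic workload, and the evaluation metric are illustrative assumptions only; they are not taken from the dissertation, which compares ARIMA against MLP and GRU models under its own configurations.

```python
# Minimal sketch (not from the dissertation): one-model workload forecasting.
# The synthetic "requests/min" trace and the ARIMA order (2, 1, 2) are
# assumptions for illustration; real traces and tuned orders would differ.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)

# Hypothetical workload: a periodic demand pattern plus noise.
t = np.arange(500)
workload = 100 + 20 * np.sin(2 * np.pi * t / 50) + rng.normal(0, 5, t.size)

# Hold out the last 50 points as the prediction horizon.
train, test = workload[:450], workload[450:]

# Fit ARIMA on the training portion of the trace.
model = ARIMA(train, order=(2, 1, 2))
fitted = model.fit()

# Forecast the held-out horizon and report a simple accuracy measure.
forecast = fitted.forecast(steps=len(test))
mae = np.mean(np.abs(forecast - test))
print(f"MAE over the 50-step horizon: {mae:.2f} requests/min")
```

In a comparison like the one the abstract outlines, the same train/forecast/score loop would be repeated for the MLP and GRU models, so that all three predictors are evaluated on identical traces and horizons.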