Entropy according to Claude Shannon: the development of the fundamental concept of information theory

Bibliographic details
Year of defense: 2006
Main author: Pineda, José Octávio de Carvalho
Advisor: Goldfarb, José Luiz
Defense committee: Not provided by the institution
Document type: Master's dissertation
Access type: Open access
Language: Portuguese
Defending institution: Pontifícia Universidade Católica de São Paulo
Graduate program: Programa de Estudos Pós-Graduados em História da Ciência
Department: History of Science
Country: Brazil
Access link: https://tede2.pucsp.br/handle/handle/13330
Abstract: This dissertation's objective is to investigate the origins of the concept of entropy as defined by Claude Shannon in the development of Information Theory, as well as the influence that this concept and others from the same theory have had on other sciences, especially Physics. Starting from its origin in Statistical Mechanics, the concept of entropy was transformed by Shannon into a measure of the amount of information. Since then, the approach proposed by Information Theory has influenced other areas of knowledge, and there have been many attempts to integrate it with physical theories. A historical analysis of the works of the main authors of Information Theory, combined with an analysis of proposals for its integration with Physics, shows that this integration currently operates at the level of approaches to physical problems rather than at the more fundamental level that some scientists expected.