
Development of statistics for cultural concepts measured by linguistic changes / Mirtha Haydee Pari Ruiz.

Material type: Text
Publisher: Valparaíso : Universidad de Valparaíso, 2019
Description: 95 leaves
Content type: text
Media type: unmediated
Carrier type: volume
Other classification: M
Dissertation note: Doctor of Statistics. Summary: In the digital era, given the many available data analysis techniques, it is important to select an adequate one that ensures the quality of information. This thesis focuses on the distribution of quasi-distances of frequencies of linguistic objects selected from two historical corpora of the Nican Mopohua. Probability theory allows us to make statements in the presence of uncertainty; information theory allows us to quantify the amount of uncertainty in given data. Information theory has been applied with success to statistical and probabilistic problems in many research areas, including linguistics and text comparison; e.g., Bigi (2003) indicates, in the context of linguistics, that the Kullback-Leibler divergence is a measure of relative entropy that tells us how two probability distributions of linguistic objects differ. Shannon, known as "the father of information theory," along with Warren Weaver, contributed to the culmination and settlement of the 1949 Mathematical Theory of Communication (now known as information theory). Divergences were widely studied, e.g., by Kullback, Leibler and Rényi, among others. Divergences have multiple applications in signal and image processing, medical image analysis, texture classification, natural language processing, etc. There exist divergence measures, i.e., similarity functions or quasi-distances between two probability distributions, such as the Kullback-Leibler (KL) divergence, Jensen-Shannon (JS) divergence, skew divergence, and the Euclidean, cosine, L1 and confusion measures, among others....
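The abstract names the KL and JS divergences as quasi-distances between probability distributions of linguistic objects, where D(P‖Q) = Σ_w p(w) log(p(w)/q(w)) and JS(P,Q) = ½D(P‖M) + ½D(Q‖M) with the mixture M = ½(P+Q). As a minimal illustrative sketch only (not the thesis's actual method or corpora; the toy texts and function names below are hypothetical), the following Python computes both divergences over word-frequency distributions built from two short token lists, using a shared vocabulary and a small epsilon so the KL term stays defined when a word is absent from one distribution:

```python
import math
from collections import Counter

def freq_dist(tokens, vocab):
    """Relative frequency of each vocabulary word in a token list."""
    counts = Counter(tokens)
    total = len(tokens)
    return {w: counts[w] / total for w in vocab}

def kl_divergence(p, q, eps=1e-12):
    """Kullback-Leibler divergence D(P || Q); eps guards log of zero."""
    return sum(pw * math.log((pw + eps) / (q[w] + eps))
               for w, pw in p.items() if pw > 0)

def js_divergence(p, q):
    """Jensen-Shannon divergence: symmetrized KL against the mixture M."""
    m = {w: 0.5 * (p[w] + q[w]) for w in p}
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

# Hypothetical toy stand-ins for two historical versions of a text.
text_a = "the dove sang on the hill and the flowers listened".split()
text_b = "the dove sang and sang on the green hill".split()

vocab = set(text_a) | set(text_b)
p = freq_dist(text_a, vocab)
q = freq_dist(text_b, vocab)

print(f"KL(P||Q) = {kl_divergence(p, q):.4f}")
print(f"KL(Q||P) = {kl_divergence(q, p):.4f}")  # differs: KL is asymmetric
print(f"JS(P,Q)  = {js_divergence(p, q):.4f}")
```

The sketch illustrates the design point the abstract alludes to: KL is asymmetric and undefined when Q assigns zero probability to a word that P uses, which is why smoothed or symmetrized variants such as the skew and JS divergences are common choices for corpus comparison.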
Holdings
Item type: Tesis
Current library: Postgrado
Collection: Tesis Postgrado Ciencias
Call number: Tesis M P231d 2019
Status: Available
Barcode: 00421859
Total holds: 0

