1.
Psicothema; 21(4): 509-14, 2009 Nov.
Article in Spanish | MEDLINE | ID: mdl-19861090

ABSTRACT

Criteria for assessing needs when planning training programs are not usually defined explicitly in organizational contexts. We propose scaling methods as a feasible and useful procedure for setting training-program priorities, and we identify the most suitable method for this intervention context. From 2004 to 2006, 404 employees of a public organization completed an ad hoc questionnaire assessing training needs in different areas; specifically, 117, 75, and 286 stimuli were scaled in those years, respectively. Four scaling methods were then compared: Dunn-Rankin's method and three methods derived from Thurstone's Law of Categorical Judgment (ranking, successive intervals, and equal-appearing intervals). The feasibility and utility of these scaling methods for solving the problems described are shown. Among the most accurate of the compared methods, we propose ranking as the most parsimonious (in terms of procedural simplicity). Future research directions are described.
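
A minimal sketch of the kind of ranking-based scaling the abstract refers to, assuming a Thurstone Case V-style solution computed from pairwise ordering proportions; the function name, the example data, and the clipping threshold are illustrative assumptions, not details taken from the article.

import numpy as np
from scipy.stats import norm

def thurstone_scale_from_ranks(ranks):
    """Estimate Thurstone-style scale values from rank data.

    ranks: (n_respondents, n_items) array where ranks[r, i] is the rank
    respondent r assigned to item i (1 = highest priority).
    Returns one scale value per item (higher = greater training need).
    """
    n_resp, n_items = ranks.shape
    # Proportion of respondents who ranked item i above (better than) item j.
    p = np.zeros((n_items, n_items))
    for i in range(n_items):
        for j in range(n_items):
            if i != j:
                p[i, j] = np.mean(ranks[:, i] < ranks[:, j])
    # Clip to avoid infinite normal deviates, then convert proportions to z-scores.
    z = norm.ppf(np.clip(p, 0.01, 0.99))
    np.fill_diagonal(z, 0.0)
    # Scale value of each item: mean z across its comparisons (Case V solution).
    return z.mean(axis=1)

# Hypothetical usage: 5 respondents ranking 4 training areas.
ranks = np.array([[1, 2, 3, 4],
                  [1, 3, 2, 4],
                  [2, 1, 3, 4],
                  [1, 2, 4, 3],
                  [1, 2, 3, 4]])
print(thurstone_scale_from_ranks(ranks))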


Subjects
Administrative Personnel/psychology , Algorithms , Decision Support Techniques , Education/organization & administration , Goals , Models, Theoretical , Psychometrics/methods , Education, Continuing/methods , Education, Continuing/organization & administration , Education, Continuing/statistics & numerical data , Feasibility Studies , Government Agencies , Humans , Judgment , Spain , Surveys and Questionnaires
2.
Neural Netw; 9(9): 1561-1567, 1996 Dec.
Article in English | MEDLINE | ID: mdl-12662553

ABSTRACT

The high-order Boltzmann machine (HOBM) approximates probability distributions defined on a set of binary variables through a learning algorithm that uses Monte Carlo methods. The approximation distribution is a normalized exponential of a consensus function formed by high-degree terms, and the structure of the HOBM is given by its set of weighted connections. We prove the convexity of the Kullback-Leibler divergence between the distribution to be learned and the approximation distribution of the HOBM. We prove the convergence of the learning algorithm to the strict global minimum of the divergence, which corresponds to the maximum likelihood estimate of the connection weights, establishing the uniqueness of the solution. These theoretical results do not hold for the conventional Boltzmann machine, where the consensus function has first- and second-degree terms and hidden units are used. Copyright 1996 Elsevier Science Ltd.
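
A minimal sketch of the setup the abstract describes, assuming binary units in {0, 1} and exact enumeration of all states in place of the paper's Monte Carlo estimation; the connection sets, learning rate, and target distribution below are illustrative assumptions, not taken from the paper.

import itertools
import numpy as np

def consensus(x, weights):
    # Consensus function: sum of weighted products of the units in each connection.
    return sum(w * np.prod(x[list(conn)]) for conn, w in weights.items())

def hobm_distribution(weights, n):
    # Normalized exponential of the consensus function over all 2^n binary states.
    states = np.array(list(itertools.product([0, 1], repeat=n)))
    unnorm = np.exp([consensus(x, weights) for x in states])
    return states, unnorm / unnorm.sum()

def kl_divergence(q, p):
    # Kullback-Leibler divergence D(q || p); states with q = 0 contribute nothing.
    mask = q > 0
    return float(np.sum(q[mask] * np.log(q[mask] / p[mask])))

def learning_step(weights, states, q, lr=0.5):
    # Exact gradient step on D(q || p): for each connection's product term phi,
    # the gradient is E_p[phi] - E_q[phi] (the paper estimates these expectations
    # with Monte Carlo; here every state is enumerated).
    _, p = hobm_distribution(weights, states.shape[1])
    updated = {}
    for conn, w in weights.items():
        phi = np.array([np.prod(x[list(conn)]) for x in states])
        updated[conn] = w + lr * (np.dot(q, phi) - np.dot(p, phi))
    return updated

# Hypothetical example: fit a 3-unit HOBM with one pairwise and one third-order
# connection to a random target distribution q.
n = 3
states, _ = hobm_distribution({}, n)
rng = np.random.default_rng(0)
q = rng.random(2 ** n)
q /= q.sum()
weights = {(0, 1): 0.0, (0, 1, 2): 0.0}
for _ in range(300):
    weights = learning_step(weights, states, q)
_, p = hobm_distribution(weights, n)
print(kl_divergence(q, p))  # decreases toward the minimum reachable with these connections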
