Open Access
Weighted Markov chains and graphic state nodes for information retrieval
Author(s) - Benoit G.
Publication year - 2002
Publication title - Proceedings of the American Society for Information Science and Technology
Language(s) - English
Resource type - Journals
eISSN - 1550-8390
pISSN - 0044-7870
DOI - 10.1002/meet.1450390113
Subject(s) - markov chain , computer science , markov decision process , node (physics) , markov process , state (computer science) , data mining , process (computing) , markov model , information retrieval , theoretical computer science , machine learning , algorithm , mathematics , statistics , engineering , structural engineering , operating system
Decision‐making in uncertain environments, such as data mining, involves a computer user navigating through multiple steps: from the initial submission of a query, through evaluating retrieval results and determining their degree of acceptability, to a terminal state of judging whether the interaction was successful. This paper describes iterative information seeking (IS) as a Markov process in which users advance through states, or “nodes”. Nodes are graphic objects on a computer screen that represent both the state of the system and a group of users' or an individual user's degree of confidence in that node. After a user examines a node to establish a confidence level, the system records the decision as a weight affecting the probability of the transition paths between nodes. By training the system in this way, the model incorporates users' decisions into the underlying Markov process as a means of reducing uncertainty. The Markov chain becomes a weighted one, whereby the IS system can make justified suggestions.
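
The abstract describes transition weights that are adjusted by users' recorded confidence in nodes, with the weighted chain then used to suggest a next state. The sketch below is a minimal illustration of that idea, not the author's implementation: the state names, the learning-rate parameter, and the additive update rule are all illustrative assumptions.

```python
# Minimal sketch of a confidence-weighted Markov chain over
# information-seeking states (hypothetical states and update rule).

class WeightedMarkovChain:
    def __init__(self, states, learning_rate=0.1):
        self.states = list(states)
        self.lr = learning_rate
        # Start with uniform transition weights between distinct states.
        self.weights = {
            s: {t: 1.0 for t in self.states if t != s} for s in self.states
        }

    def transition_probs(self, state):
        """Normalize outgoing weights into a probability distribution."""
        row = self.weights[state]
        total = sum(row.values())
        return {t: w / total for t, w in row.items()}

    def record_confidence(self, current, chosen, confidence):
        """Fold a user's confidence (0..1) in the chosen node into the
        weight of the transition current -> chosen."""
        self.weights[current][chosen] += self.lr * confidence

    def suggest(self, state):
        """Suggest the next node: the highest-probability transition."""
        probs = self.transition_probs(state)
        return max(probs, key=probs.get)


if __name__ == "__main__":
    # Hypothetical information-seeking states, from query to terminal evaluation.
    chain = WeightedMarkovChain(["query", "results", "assess", "refine", "accept"])
    # Simulated session: the user leaves "results" for "assess" with high confidence.
    chain.record_confidence("results", "assess", confidence=0.9)
    chain.record_confidence("results", "refine", confidence=0.2)
    print(chain.suggest("results"))           # -> "assess"
    print(chain.transition_probs("results"))  # updated, normalized weights
```

In this reading, each recorded confidence nudges one transition weight upward, so repeated training sessions gradually bias the chain toward paths users have judged acceptable, which is how the weighted chain can offer "justified suggestions".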
