back to life, ProbLog-I

parent a64c6772fc
commit d7bc8f80ce
@@ -347,7 +347,9 @@ This predicate returns the lower and upper bound of the probability of achieving
This predicate returns the lower bound of the probability of achieving the goal G, obtained by cutting the SLD tree at the given probability for each branch.
*/
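For context, the bounded inference described in the documentation above is exposed in ProbLog-I through a query predicate; the sketch below assumes the `problog_low/4` name and argument order from the ProbLog-I interface, which does not itself appear in this diff, and the graph is invented for illustration:

```prolog
% Hedged sketch: probabilistic reachability with ProbLog-I.
0.8::edge(1,2).        % probabilistic facts, Prob::Fact syntax
0.6::edge(2,3).

path(X,Y) :- edge(X,Y).
path(X,Y) :- edge(X,Z), path(Z,Y).

% Query a lower bound on P(path(1,3)), cutting every SLD branch whose
% accumulated probability falls below the 0.01 threshold:
% ?- problog_low(path(1,3), 0.01, Low, Status).
```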

/**
### ProbLog Parameter Learning Predicates
*/

/**
* @pred example(+N, +Q, +Prob)
@@ -370,6 +372,7 @@ Test examples are ignored during learning but are used afterwards to check the p
*
Starts the learning algorithm with N iterations.
paragraph{}
*/
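The `example(+N, +Q, +Prob)` training-example format documented above, combined with a learning run, might look like the following sketch; the coin queries and target probabilities are invented for illustration:

```prolog
% Training examples: example(ID, Query, TargetProbability).
example(1, heads(coin1), 0.6).
example(2, heads(coin2), 0.5).
example(3, twoheads, 0.3).

% Start the learning algorithm with 20 iterations:
% ?- do_learning(20).
```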
/**
* @pred do_learning(+N, +Epsilon).
@@ -377,7 +380,6 @@ paragraph{}
The output is created in the output subfolder of the folder where YAP was started. There you will find the file log.dat, which contains the MSE on the training and test set for every iteration, the timings, and some metrics on the gradient, in CSV format. The files factprobs_N.pl contain the fact probabilities after the Nth iteration, and the files predictions_N.pl contain the estimated probabilities for each training and test example; by default, these files are generated only every 5th iteration.
Starts the learning algorithm. The learning will stop after N iterations or when the difference in the Mean Squared Error (MSE) between two iterations falls below Epsilon, whichever happens first.
*/

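A hedged usage sketch of the two-argument variant documented above, with an invented iteration cap and tolerance:

```prolog
% Stop after at most 100 iterations, or earlier once the MSE difference
% between two consecutive iterations drops below 0.001:
% ?- do_learning(100, 0.001).
```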
### Miscellaneous

@@ -1764,7 +1766,7 @@ export_facts(Filename) :-

is_mvs_aux_fact(A) :-
	functor(A,B,_),
-	atomic_concat(mvs_fact_,_,B).
+	atom_concat(mvs_fact_,_,B).

% code for printing the compiled ADs
print_ad_intern(Handle,(Head<--Body),_ID,Facts) :-
@@ -222,7 +222,7 @@
:- use_module(library(rbtrees)).

% load our own modules
-:- use_module(problog).
+:- reexport(problog).
:- use_module('problog/logger').
:- use_module('problog/flags').
:- use_module('problog/os').