\input texinfo @c -*- mode: texinfo; coding: utf-8; -*-
@documentencoding UTF-8
@c %**start of header
@setfilename yap.info
@setcontentsaftertitlepage
@settitle YAP Prolog User's Manual
@c For double-sided printing, uncomment:
@c @setchapternewpage odd
@c %**end of header
@set VERSION 6.3.4
@set EDITION 4.2.9
@set UPDATED Oct 2010
@c Index for C-Prolog compatible predicate
@defindex cy
@c Index for predicates not in C-Prolog
@defindex cn
@c Index for predicates sort of (almost) in C-Prolog
@defindex ca
@c Index for SICStus Prolog compatible predicate
@defindex sy
@c Index for predicates not in SICStus Prolog
@defindex sn
@c Index for predicates sort of (almost) in SICStus Prolog
@defindex sa
@alias pl_ example=example
@alias c_ example=example
@setchapternewpage odd
@c @smallbook
@comment %** end of header
@ifnottex
@format
@dircategory The YAP Prolog System
@direntry
* YAP: (yap). YAP Prolog User's Manual.
@end direntry
@end format
@end ifnottex
@titlepage
@title YAP User's Manual
@subtitle Version @value{ VERSION}
@author Vítor Santos Costa,
@author Luís Damas,
@author Rogério Reis,
@author Rúben Azevedo
@page
@vskip 2pc
@copyright{ } 1989-2014 L. Damas, V. Santos Costa and Universidade
do Porto.
Permission is granted to make and distribute verbatim copies of
this manual provided the copyright notice and this permission notice
are preserved on all copies.
Permission is granted to copy and distribute modified versions of this
manual under the conditions for verbatim copying, provided that the entire
resulting derived work is distributed under the terms of a permission
notice identical to this one.
Permission is granted to copy and distribute translations of this manual
into another language, under the above conditions for modified versions.
@end titlepage
@ifnottex
@node Top, , , (dir)
@top YAP Prolog
This file documents the YAP Prolog System version @value{ VERSION} , a
high-performance Prolog compiler developed at LIACC, Universidade do
Porto. YAP is based on David H. D. Warren's WAM (Warren Abstract
Machine), with several optimizations for better performance. YAP follows
the Edinburgh tradition, and is largely compatible with DEC-10 Prolog,
Quintus Prolog, and especially with C-Prolog.
@ifplaintext
+ @subpage Install discusses how to download, compile and install YAP for your platform.
+ @subpage Run describes how to invoke YAP.
+ @subpage Syntax describes the syntax of YAP.
+ @subpage Loading_Programs presents the main predicates and
directives available to load files and to control the Prolog environment.
+ @subpage abs_file_name explains how to find a file's full path.
+ Built-Ins describes predicates providing core YAP functionality:
+ @subpage page_arithmetic describes how arithmetic works in YAP.
+ @subpage Control describes the predicates for controlling the execution of Prolog programs.
+ @subpage Testing_Terms describes the main predicates on terms.
+ @subpage Input_Output goes into Input/Output.
+ @subpage Database discusses the clausal database.
+ @subpage Grammars presents grammar rules in Prolog, which are
both a convenient way to express definite clause grammars and
an extension of the well-known context-free grammars.
+ @subpage OS discusses access to Operating System functionality.
+ Libraries
+ @ref maplist introduces macros to apply an operation over
all elements of a list.
@end ifplaintext
This file contains extracts of the SWI-Prolog manual, as written by Jan
Wielemaker. Our thanks to the author for kindly allowing us to include
his text in this document.
@menu
* Intro:: Introduction
* Install:: Installation
* Run:: Running YAP
* Syntax:: The syntax of YAP
* Loading Programs:: Loading Prolog programs
* Modules:: Using Modules in YAP
* Built-ins:: Built In Predicates
* Library:: Library Predicates
* SWI-Prolog:: SWI-Prolog emulation
* Global Variables :: Global Variables for Prolog
* Extensions:: Extensions to Standard YAP
* Rational Trees:: Working with Rational Trees
* Co-routining:: Changing the Execution of Goals
* Attributed Variables:: Using attributed Variables
* CLPR:: The CLP(R) System
* CHR:: The CHR System
* Logtalk:: The Logtalk Object-Oriented System
* MYDDAS:: The YAP Database Interface
* Threads:: Thread Library
* Parallelism:: Running in Or-Parallel
* Tabling:: Storing Intermediate Solutions of programs
* Low Level Profiling:: Profiling Abstract Machine Instructions
* Low Level Tracing:: Tracing at Abstract Machine Level
* Debugging:: Using the Debugger
* Efficiency:: Efficiency Considerations
* C-Interface:: Interfacing predicates written in C
* YAPLibrary:: Using YAP as a library in other programs
* Compatibility:: Compatibility with other Prolog systems
* Predicate Index:: An item for each predicate
* Concept Index:: An item for each concept
Built In Predicates
* Control:: Controlling the execution of Prolog programs
* Undefined Procedures:: Handling calls to Undefined Procedures
* Messages:: Message Handling in YAP
* Testing Terms:: Predicates on Terms
* Predicates on Atoms:: Manipulating Atoms
* Predicates on Characters:: Manipulating Characters
* Comparing Terms:: Comparison of Terms
* Arithmetic:: Arithmetic in YAP
* Input/Output:: Input/Output with YAP
* Database:: Modifying Prolog's Database
* Sets:: Finding All Possible Solutions
* Grammars:: Grammar Rules
* Preds:: Predicate Information
* OS:: Access to Operating System Functionality
* Term Modification:: Updating Prolog Terms
* Global Variables:: Manipulating Global Variables
* Profiling:: Profiling Prolog Execution
* Call Counting:: Limiting the Maximum Number of Reductions
* Arrays:: Supporting Global and Local Arrays
* Misc:: Miscellaneous Predicates
Subnodes of Running
* Running YAP Interactively:: Interacting with YAP
* Running Prolog Files:: Running Prolog files as scripts
Subnodes of Syntax
* Formal Syntax:: Syntax of Terms
* Tokens:: Syntax of Prolog tokens
* Encoding:: How characters are encoded and Wide Character Support
Subnodes of Tokens
* Numbers:: Integer and Floating-Point Numbers
* Strings:: Sequences of Characters
* Atoms:: Atomic Constants
* Variables:: Logical Variables
* Punctuation Tokens:: Tokens that separate other tokens
* Layout:: Comments and Other Layout Rules
Subnodes of Numbers
* Integers:: How Integers are read and represented
* Floats:: Floating Point Numbers
Subnodes of Encoding
* Stream Encoding:: How Prolog Streams can be coded
* BOM:: The Byte Order Mark
Subnodes of Loading Programs
* Compiling:: Program Loading and Updating
* Setting the Compiler:: Changing the compiler's parameters
* Conditional Compilation:: Compiling program fragments
* Saving:: Saving and Restoring Programs
Subnodes of Modules
* Module Concepts:: The Key Ideas in Modules
* Defining Modules:: How To Define a New Module
* Using Modules:: How to Use a Module
* Meta-Predicates in Modules:: How to Handle New Meta-Predicates
* Re-Exporting Modules:: How to Re-export Predicates From Other Modules
Subnodes of Input/Output
* Streams and Files:: Handling Streams and Files
* C-Prolog File Handling:: C-Prolog Compatible File Handling
* Input/Output of Terms:: Input/Output of terms
* Input/Output of Characters:: Input/Output of Characters
* Input/Output for Streams:: Input/Output using Streams
* C-Prolog to Terminal:: C-Prolog compatible Character Input/Output to terminal
* Input/Output Control:: Controlling your Input/Output
* Sockets:: Using Sockets from YAP
Subnodes of Database
* Modifying the Database:: Asserting and Retracting
* Looking at the Database:: Finding out what is in the Data Base
* Database References:: Using Data Base References
* Internal Database:: YAP's Internal Database
* BlackBoard:: Storing and Fetching Terms in the BlackBoard
Subnodes of Library
* Aggregate :: SWI and SICStus compatible aggregate library
* Apply:: SWI-Compatible Apply library.
* Association Lists:: Binary Tree Implementation of Association Lists.
* AVL Trees:: Predicates to add and lookup balanced binary trees.
* BDDs:: Predicates to manipulate BDDs using the CUDD libraries
* Exo Intervals:: Play with the UDI and exo-compilation
* Gecode:: Interface to the gecode constraint library
* Heaps:: Labelled binary tree where the key of each node is less
than or equal to the keys of its children.
* Lambda:: Ulrich Neumerkel's Lambda Library
* DBUsage:: Information about database usage.
* LineUtilities:: Line Manipulation Utilities
* Lists:: List Manipulation
* MapArgs:: Apply on Arguments of Compound Terms.
* MapList:: SWI-Compatible Apply library.
* matrix:: Matrix Objects
* MATLAB:: Matlab Interface
* Non-Backtrackable Data Structures:: Queues, Heaps, and Beams.
* Ordered Sets:: Ordered Set Manipulation
* Pseudo Random:: Pseudo Random Numbers
* Queues:: Queue Manipulation
* Random:: Random Numbers
* Read Utilities:: SWI inspired utilities for fast stream scanning.
* Red-Black Trees:: Predicates to add, lookup and delete in red-black binary trees.
* RegExp:: Regular Expression Manipulation
* shlib:: SWI Prolog shlib library
* Splay Trees:: Splay Trees
* String Input/Output:: Writing To and Reading From Strings
* System:: System Utilities
* Terms:: Utilities on Terms
* Cleanup:: Call With registered Cleanup Calls
* Timeout:: Call With Timeout
* Trees:: Updatable Binary Trees
* Tries:: Trie Data Structure
* UGraphs:: Unweighted Graphs
* DGraphs:: Directed Graphs Implemented With Red-Black Trees
* UnDGraphs:: Undirected Graphs Using DGraphs
* LAM:: LAM MPI
* Block Diagram:: Block Diagrams of Prolog code
Subnodes of Debugging
* Deb Preds:: Debugging Predicates
* Deb Interaction:: Interacting with the debugger
Subnodes of Compatibility
* C-Prolog:: Compatibility with the C-Prolog interpreter
* SICStus Prolog:: Compatibility with the Quintus and SICStus Prolog systems
* ISO Prolog:: Compatibility with the ISO Prolog standard
Subnodes of Attributes
* Attribute Declarations:: Declaring New Attributes
* Attribute Manipulation:: Setting and Reading Attributes
* Attributed Unification:: Tuning the Unification Algorithm
* Displaying Attributes:: Displaying Attributes in User-Readable Form
* Projecting Attributes:: Obtaining the Attributes of Interest
* Attribute Examples:: Two Simple Examples of how to use Attributes.
Subnodes of SWI-Prolog
* Invoking Predicates on all Members of a List :: maplist and friends
* SWI-Prolog Global Variables :: Emulating SWI-like attributed variables
Subnodes of Gecode
* The Gecode Interface:: calling gecode from YAP
* Gecode and CLP(FD) :: using gecode in a CLP(FD) style
@c Subnodes of CLP(Q,R)
@c * Introduction to CLPQ:: The CLP(Q,R) System
@c * Referencing CLPQR:: How to Reference CLP(Q,R)
@c * CLPQR Acknowledgments:: Acknowledgments for CLP(Q,R)
@c * Solver Interface:: Using the CLP(Q,R) System
@c * Notational Conventions:: The CLP(Q,R) Notation
@c * Solver Predicates:: The CLP(Q,R) Interface Predicates
@c * Unification:: Unification and CLP(Q,R)
@c * Feedback and Bindings:: Information flow in CLP(Q,R)
@c * Linearity and Nonlinear Residues:: Linear and Nonlinear Constraints
@c * How Nonlinear Residues are made to disappear:: Handling Nonlinear Residues
@c * Isolation Axioms:: Isolating the Variable to be Solved
@c * Numerical Precision and Rationals:: Reals and Rationals
@c * Projection and Redundancy Elimination:: Presenting Bindings for Query Variables
@c * Variable Ordering:: Linear Relationships between Variables
@c * Turning Answers into Terms:: using @code{ call_ residue/2}
@c * Projecting Inequalities:: How to project linear inequations
@c * Why Disequations:: Using Disequations in CLP(Q,R)
@c * Syntactic Sugar:: An easier syntax
@c * Monash Examples:: The Monash Library
@c * Compatibility Notes:: CLP(Q,R) and the clp(R) interpreter
@c * A Mixed Integer Linear Optimization Example:: MIP models
@c * Implementation Architecture:: CLP(Q,R) Components
@c * Fragments and Bits:: Final Last Words on CLP(Q,R)
@c * CLPQR Bugs:: Bugs in CLP(Q,R)
@c * CLPQR References:: References for CLP(Q,R)
Subnodes of CLPR
* CLPR Solver Predicates::
* CLPR Syntax::
* CLPR Unification::
* CLPR Non-linear Constraints::
Subnodes of CHR
* CHR Introduction::
* CHR Syntax and Semantics::
* CHR in YAP Programs::
* CHR Debugging::
* CHR Examples::
* CHR Compatibility::
* CHR Guidelines::
Subnodes of C-Interface
* Manipulating Terms:: Primitives available to the C programmer
* Unifying Terms:: How to Unify Two Prolog Terms
* Manipulating Strings:: From character arrays to Lists of codes and back
* Memory Allocation:: Stealing Memory From YAP
* Controlling Streams:: Control How YAP sees Streams
* Utility Functions:: From character arrays to Lists of codes and back
* Calling YAP From C:: From C to YAP to C to YAP
* Module Manipulation in C:: Create and Test Modules from within C
* Miscellaneous C-Functions:: Other Helpful Interface Functions
* Writing C:: Writing Predicates in C
* Loading Objects:: Loading Object Files
* Save& Rest:: Saving and Restoring
* YAP4 Notes:: Changes in Foreign Predicates Interface
Subnodes of C-Prolog
* Major Differences with C-Prolog:: Major Differences between YAP and C-Prolog
* Fully C-Prolog Compatible:: YAP predicates fully compatible with
C-Prolog
* Not Strictly C-Prolog Compatible:: YAP predicates not strictly as C-Prolog
* Not in C-Prolog:: YAP predicates not available in C-Prolog
* Not in YAP:: C-Prolog predicates not available in YAP
Subnodes of SICStus Prolog
* Major Differences with SICStus:: Major Differences between YAP and SICStus Prolog
* Fully SICStus Compatible:: YAP predicates fully compatible with
SICStus Prolog
* Not Strictly SICStus Compatible:: YAP predicates not strictly as
SICStus Prolog
* Not in SICStus Prolog:: YAP predicates not available in SICStus Prolog
Tables
* Operators:: Predefined operators
@end menu
@end ifnottex
@node Intro, Install, , Top
@chapter Introduction
This document provides user information on version @value{ VERSION} of
YAP (@emph{ Yet Another Prolog} ). The YAP Prolog System is a
high-performance Prolog compiler developed at LIACC, Universidade do
Porto. YAP provides several important features:
@itemize @bullet
@item Speed: YAP is widely considered one of the fastest available
Prolog systems.
@item Functionality: it supports stream Input/Output, sockets, modules,
exceptions, Prolog debugger, C-interface, dynamic code, internal
database, DCGs, saved states, co-routining, arrays, threads.
@item We explicitly allow both commercial and non-commercial use of YAP.
@end itemize
YAP is based on David H. D. Warren's WAM (Warren Abstract Machine),
with several optimizations for better performance. YAP follows the
Edinburgh tradition, and was originally designed to be largely
compatible with DEC-10 Prolog, Quintus Prolog, and especially with
C-Prolog.
YAP implements most of the ISO-Prolog standard. We are striving for
full compatibility, and the manual describes what is still
missing. The manual also includes a (largely incomplete) comparison
with SICStus Prolog.
This document is intended neither as an introduction to Prolog nor as a
description of the implementation aspects of the compiler. A good introduction to
programming in Prolog is the book @cite{ TheArtOfProlog} , by
L. Sterling and E. Shapiro, published by "The MIT Press, Cambridge
MA". Other references should include the classical @cite{ ProgrammingInProlog} , by W.F. Clocksin and C.S. Mellish, published by
Springer-Verlag.
YAP 4.3 is known to build with many versions of gcc (<= gcc-2.7.2, >=
gcc-2.8.1, >= egcs-1.0.1, gcc-2.95.*) and on a variety of Unixen:
SunOS 4.1, Solaris 2.*, Irix 5.2, HP-UX 10, Dec Alpha Unix, Linux 1.2
and Linux 2.* (RedHat 4.0 thru 5.2, Debian 2.*) in both the x86 and
alpha platforms. It has been built on Windows NT 4.0 using Cygwin from
Cygnus Solutions (see @file{ README.nt} ) and using Visual C++ 6.0.
The overall copyright and permission notice for YAP4.3 can be found in
the Artistic file in this directory. YAP follows the Perl Artistic
license, and it is thus non-copylefted freeware.
If you have a question about this software, desire to add code, found a
bug, want to request a feature, or wonder how to get further assistance,
please send e-mail to @email{ yap-users AT lists.sourceforge.net} . To
subscribe to the mailing list, visit the page
@url{ https://lists.sourceforge.net/lists/listinfo/yap-users} .
On-line documentation is available for YAP at:
@url{ http://www.ncc.up.pt/~vsc/YAP/}
Recent versions of YAP, including both source and selected binaries,
can be found at this same URL.
This manual was written by Vítor Santos Costa,
Luís Damas, Rogério Reis, and Rúben Azevedo. The
manual is largely based on the DECsystem-10 Prolog User's Manual by
D.L. Bowen, L. Byrd, F. C. N. Pereira, L. M. Pereira, and
D. H. D. Warren. We have also used comments from the Edinburgh Prolog
library written by R. O'Keefe and from the SWI-Prolog manual written by
Jan Wielemaker. We would also like to gratefully
acknowledge the contributions from Ashwin Srinivasan.
We are happy to include in YAP several excellent packages developed
under separate licenses. Our thanks to the authors for their kind
authorization to include these packages.
The packages are, in alphabetical order:
@itemize @bullet
@item The CHR package developed by Tom Schrijvers,
Christian Holzbaur, and Jan Wielemaker.
@item The CLP(R) package developed by Leslie De Koninck, Bart Demoen, Tom
Schrijvers, and Jan Wielemaker, based on the CLP(Q,R) implementation
by Christian Holzbaur.
@item The Logtalk Object-Oriented system is developed at the University
of Beira Interior, Portugal, by Paulo Moura:
@url{ http://logtalk.org/}
Logtalk is no longer distributed with YAP. Please use the Logtalk standalone
installer for a smooth integration with YAP.
@item The Pillow WEB library developed at Universidad Politecnica de
Madrid by the CLIP group. This package is distributed under the FSF's
LGPL. Documentation on this package is distributed separately from
yap.tex.
@item The @file{ yap2swi} library implements some of the functionality of
SWI's PL interface. Please do refer to the SWI-Prolog home page:
@url{ http://www.swi-prolog.org}
for more information on SWI-Prolog and for a detailed description of its
foreign language interface.
@end itemize
@include install.tex
@include run.tex
@include syntax.tex
@include load.tex
@include builtins.tex
@node Library, SWI-Prolog, Built-ins, Top
@chapter Library Predicates
Library files reside in the library_directory path (set by the
@code{ LIBDIR} variable in the Makefile for YAP). Currently,
most files in the library are from the Edinburgh Prolog library.
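
For instance, a library module is made available with a
@code{use_module/1} directive (a minimal sketch; the @code{lists}
library and the query shown are just an illustration):

@example
%% load the lists library and call one of its exported predicates
:- use_module(library(lists)).

?- last([a,b,c], X).
X = c
@end example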
@menu
* Aggregate :: SWI and SICStus compatible aggregate library
* Apply:: SWI-Compatible Apply library.
* Association Lists:: Binary Tree Implementation of Association Lists.
* AVL Trees:: Predicates to add and lookup balanced binary trees.
* BDDs:: Predicates to manipulate BDDs using the CUDD libraries
* Block Diagram:: Block Diagrams of Prolog code
* Cleanup:: Call With registered Cleanup Calls
* DGraphs:: Directed Graphs Implemented With Red-Black Trees
* Exo Intervals:: Play with the UDI and exo-compilation
* Gecode:: Interface to the gecode constraint library
* Heaps:: Labelled binary tree where the key of each node is less
than or equal to the keys of its children.
* LAM:: LAM MPI
* Lambda:: Ulrich Neumerkel's Lambda Library
* DBUsage:: Information about database usage.
* Lists:: List Manipulation
* LineUtilities:: Line Manipulation Utilities
* MapArgs:: Apply on Arguments of Compound Terms.
* MapList:: SWI-Compatible Apply library.
* matrix:: Matrix Objects
* MATLAB:: Matlab Interface
* Non-Backtrackable Data Structures:: Queues, Heaps, and Beams.
* Ordered Sets:: Ordered Set Manipulation
* Pseudo Random:: Pseudo Random Numbers
* Queues:: Queue Manipulation
* Random:: Random Numbers
* Read Utilities:: SWI inspired utilities for fast stream scanning.
* Red-Black Trees:: Predicates to add, lookup and delete in red-black binary trees.
* RegExp:: Regular Expression Manipulation
* shlib:: SWI Prolog shlib library
* Splay Trees:: Splay Trees
* String Input/Output:: Writing To and Reading From Strings
* System:: System Utilities
* Terms:: Utilities on Terms
* Timeout:: Call With Timeout
* Trees:: Updatable Binary Trees
* Tries:: Trie Data Structure
* UGraphs:: Unweighted Graphs
* UnDGraphs:: Undirected Graphs Using DGraphs
@end menu
@node Aggregate, Apply, , Library
@section Aggregate
@cindex aggregate
This is the SWI-Prolog library, based on the Quintus and SICStus 4
libraries. @c To be done - Analysing the aggregation template
@c and compiling a predicate for the list aggregation can be done at
@c compile time. - aggregate_ all/3 can be rewritten to run in constant
@c space using non-backtrackable assignment on a term.
This library provides aggregating operators over the solutions of a
predicate. The operations are a generalisation of the @code{ bagof/3} ,
@code{ setof/3} and @code{ findall/3} built-in predicates. The defined
aggregation operations are counting, computing the sum, minimum,
maximum, a bag of solutions and a set of solutions. We first give a
simple example, computing the country with the smallest area:
@pl_ example
smallest_country(Name, Area) :-
    aggregate(min(A, N), country(N, A), min(Area, Name)).
@end pl_ example
There are four aggregation predicates, distinguished on two properties.
@table @code
@item aggregate vs. aggregate_all
The aggregate predicates use setof/3 (aggregate/4) or bagof/3
(aggregate/3), dealing with existentially qualified variables
(@var{ Var} ^@var{ Goal} ) and providing multiple solutions for the
remaining free variables in @var{ Goal} . The aggregate_all/3
predicate uses findall/3, implicitly qualifying all free variables
and providing exactly one solution, while aggregate_all/4 uses
sort/2 over solutions to @var{ Distinguish} (see below), generated using
findall/3.
@item The @var{ Distinguish} argument
The versions with 4 arguments provide a @var{ Distinguish} argument
that allows for keeping duplicate bindings of a variable in the
result. For example, if we wish to compute the total population of
all countries we do not want to lose results because two countries
have the same population. Therefore we use:
@pl_ example
aggregate(sum(P), Name, country(Name, P), Total)
@end pl_ example
@end table
All aggregation predicates support the operators below in
@var{ Template} . In addition, they allow for an arbitrary named compound
term where each of the arguments is a term from the list below. I.e. the
term @code{ r(min(X), max(X))} computes both the minimum and maximum
binding for @var{ X} .
@table @code
@item count
Count number of solutions. Same as @code{ sum(1)} .
@item sum(@var{ Expr} )
Sum of @var{ Expr} for all solutions.
@item min(@var{ Expr} )
Minimum of @var{ Expr} for all solutions.
@item min(@var{ Expr} , @var{ Witness} )
A term min(@var{ Min} , @var{ Witness} ), where @var{ Min} is the minimal version of @var{ Expr}
over all solutions and @var{ Witness} is any other template applied to
the solution that produced @var{ Min} . If multiple solutions provide the same
minimum, @var{ Witness} corresponds to the first solution.
@item max(@var{ Expr} )
Maximum of @var{ Expr} for all solutions.
@item max(@var{ Expr} , @var{ Witness} )
As min(@var{ Expr} , @var{ Witness} ), but producing the maximum result.
@item set(@var{ X} )
An ordered set with all solutions for @var{ X} .
@item bag(@var{ X} )
A list of all solutions for @var{ X} .
@end table
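
For example, the compound-template form mentioned above can collect
several aggregates in one pass over the solutions (a sketch; the
@code{age/2} facts are purely illustrative):

@example
age(anna, 30).   age(bruno, 22).   age(carla, 41).

%% minimum and maximum age computed in a single aggregation
age_range(Min, Max) :-
    aggregate_all(r(min(A), max(A)), age(_Name, A), r(Min, Max)).
@end example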
The predicates are:
@table @code
@item [nondet]aggregate(+@var{ Template} , :@var{ Goal} , -@var{ Result} )
@findex aggregate/3
@syindex aggregate/3
@cnindex aggregate/3
Aggregate bindings in @var{ Goal} according to @var{ Template} . The
aggregate/3 version performs bagof/3 on @var{ Goal} .
@item [nondet]aggregate(+@var{ Template} , +@var{ Discriminator} , :@var{ Goal} , -@var{ Result} )
@findex aggregate/4
@syindex aggregate/4
@cnindex aggregate/4
Aggregate bindings in @var{ Goal} according to @var{ Template} . The
aggregate/4 version performs setof/3 on @var{ Goal} .
@item [semidet]aggregate_ all(+@var{ Template} , :@var{ Goal} , -@var{ Result} )
@findex aggregate_ all/3
@syindex aggregate_ all/3
@cnindex aggregate_ all/3
Aggregate bindings in @var{ Goal} according to @var{ Template} . The
aggregate_ all/3 version performs findall/3 on @var{ Goal} .
@item [semidet]aggregate_ all(+@var{ Template} , +@var{ Discriminator} , :@var{ Goal} , -@var{ Result} )
@findex aggregate_ all/4
@syindex aggregate_ all/4
@cnindex aggregate_ all/4
Aggregate bindings in @var{ Goal} according to @var{ Template} . The
aggregate_all/4 version performs findall/3 followed by sort/2 on
@var{ Goal} .
@item foreach(:Generator, :@var{ Goal} )
@findex foreach/2
@syindex foreach/2
@cnindex foreach/2
True if the conjunction of instances of @var{ Goal} using the
bindings from Generator is true. Unlike forall/2, which runs a
failure-driven loop that proves @var{ Goal} for each solution of
Generator, foreach creates a conjunction. Each member of the
conjunction is a copy of @var{ Goal} , where the variables it shares
with Generator are filled with the values from the corresponding
solution.
The implementation executes forall/2 if @var{ Goal} does not contain
any variables that are not shared with Generator.
Here is an example:
@pl_ example
?- foreach(between(1,4,X), dif(X,Y)), Y = 5.
Y = 5
?- foreach(between(1,4,X), dif(X,Y)), Y = 3.
No
@end pl_ example
Notice that @var{ Goal} is copied repeatedly, which may cause
problems if attributed variables are involved.
@item [det]free_ variables(:Generator, +@var{ Template} , +VarList0, -VarList)
@findex free_ variables/4
@syindex free_ variables/4
@cnindex free_ variables/4
In order to handle variables properly, we have to find all the universally quantified variables in the Generator. All variables as yet unbound are universally quantified, unless
@enumerate
@item they occur in the template
@item they are bound by X^P, setof, or bagof
@end enumerate
@code{ free_ variables(Generator, Template, OldList, NewList)} finds this set, using OldList as an accumulator.
@end table
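
The following sketch (with illustrative @code{salary/2} facts) shows the
practical difference between the two families: aggregate/3 keeps
bagof/3 semantics and backtracks over the free variable @code{Dept},
while aggregate_all/3 quantifies it away and produces a single total:

@example
salary(sales, 1000).   salary(sales, 1200).   salary(tech, 2000).

?- aggregate(sum(S), salary(Dept, S), Sum).
Dept = sales, Sum = 2200 ;
Dept = tech, Sum = 2000

?- aggregate_all(sum(S), salary(_Dept, S), Sum).
Sum = 4200
@end example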
The original author of this code was Richard O'Keefe. Jan Wielemaker
made some SWI-Prolog enhancements, sponsored by SecuritEase,
http://www.securitease.com. The code is public domain (from DEC10 library).
@c To be done
@c - Distinguish between control-structures and data terms.
@c - Exploit our built-in term_ variables/2 at some places?
@node Apply, Association Lists, Aggregate, Library
@section Apply Macros
@cindex apply
This library provides a SWI-compatible set of utilities for applying a
predicate to all elements of a list. The library just forwards
definitions from the @code{ maplist} library.
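
A brief sketch of the kind of goal this makes available (the helper
@code{plus_one/2} is purely illustrative):

@example
:- use_module(library(apply)).

%% add one to each element of a list
plus_one(X, Y) :- Y is X + 1.

?- maplist(plus_one, [1,2,3], L).
L = [2,3,4]
@end example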
@node Association Lists, AVL Trees, Apply, Library
@section Association Lists
@cindex association list
The following association list manipulation predicates are available
once included with the @code{ use_module(library(assoc))} command. The
original library used Richard O'Keefe's implementation, on top of
unbalanced binary trees. The current code utilises code from the
red-black trees library and emulates the SICStus Prolog interface.
@table @code
@item assoc_ to_ list(+@var{ Assoc} ,?@var{ List} )
@findex assoc_ to_ list/2
@syindex assoc_ to_ list/2
@cnindex assoc_ to_ list/2
Given an association list @var{ Assoc} unify @var{ List} with a list of
the form @var{ Key-Val} , where the elements @var{ Key} are in ascending
order.
@item del_ assoc(+@var{ Key} , +@var{ Assoc} , ?@var{ Val} , ?@var{ NewAssoc} )
@findex del_ assoc/4
@syindex del_ assoc/4
@cnindex del_ assoc/4
Succeeds if @var{ NewAssoc} is an association list, obtained by removing
the element with @var{ Key} and @var{ Val} from the list @var{ Assoc} .
@item del_ max_ assoc(+@var{ Assoc} , ?@var{ Key} , ?@var{ Val} , ?@var{ NewAssoc} )
@findex del_ max_ assoc/4
@syindex del_ max_ assoc/4
@cnindex del_ max_ assoc/4
Succeeds if @var{ NewAssoc} is an association list, obtained by removing
the largest element of the list, with @var{ Key} and @var{ Val} from the
list @var{ Assoc} .
@item del_ min_ assoc(+@var{ Assoc} , ?@var{ Key} , ?@var{ Val} , ?@var{ NewAssoc} )
@findex del_ min_ assoc/4
@syindex del_ min_ assoc/4
@cnindex del_ min_ assoc/4
Succeeds if @var{ NewAssoc} is an association list, obtained by removing
the smallest element of the list, with @var{ Key} and @var{ Val}
from the list @var{ Assoc} .
@item empty_ assoc(+@var{ Assoc} )
@findex empty_ assoc/1
@syindex empty_ assoc/1
@cnindex empty_ assoc/1
Succeeds if association list @var{ Assoc} is empty.
@item gen_ assoc(+@var{ Assoc} ,?@var{ Key} ,?@var{ Value} )
@findex gen_ assoc/3
@syindex gen_ assoc/3
@cnindex gen_ assoc/3
Given the association list @var{ Assoc} , unify @var{ Key} and @var{ Value}
with two associated elements. It can be used to enumerate all elements
in the association list.
@item get_ assoc(+@var{ Key} ,+@var{ Assoc} ,?@var{ Value} )
@findex get_ assoc/3
@syindex get_ assoc/3
@cnindex get_ assoc/3
If @var{ Key} is one of the elements in the association list @var{ Assoc} ,
return the associated value.
@item get_ assoc(+@var{ Key} ,+@var{ Assoc} ,?@var{ Value} ,+@var{ NAssoc} ,?@var{ NValue} )
@findex get_ assoc/5
@syindex get_ assoc/5
@cnindex get_ assoc/5
If @var{ Key} is one of the elements in the association list @var{ Assoc} ,
return the associated value @var{ Value} and a new association list
@var{ NAssoc} where @var{ Key} is associated with @var{ NValue} .
@item get_ prev_ assoc(+@var{ Key} ,+@var{ Assoc} ,?@var{ Next} ,?@var{ Value} )
@findex get_ prev_ assoc/4
@syindex get_ prev_ assoc/4
@cnindex get_ prev_ assoc/4
If @var{ Key} is one of the elements in the association list @var{ Assoc} ,
return the previous key, @var{ Next} , and its value, @var{ Value} .
@item get_ next_ assoc(+@var{ Key} ,+@var{ Assoc} ,?@var{ Next} ,?@var{ Value} )
@findex get_ next_ assoc/4
@syindex get_ next_ assoc/4
@cnindex get_ next_ assoc/4
If @var{ Key} is one of the elements in the association list @var{ Assoc} ,
return the next key, @var{ Next} , and its value, @var{ Value} .
@item is_ assoc(+@var{ Assoc} )
@findex is_ assoc/1
@syindex is_ assoc/1
@cnindex is_ assoc/1
Succeeds if @var{ Assoc} is an association list, that is, if it is a
red-black tree.
@item list_ to_ assoc(+@var{ List} ,?@var{ Assoc} )
@findex list_ to_ assoc/2
@syindex list_ to_ assoc/2
@cnindex list_ to_ assoc/2
Given a list @var{ List} such that each element of @var{ List} is of the
form @var{ Key-Val} , and all the @var{ Keys} are unique, @var{ Assoc} is
the corresponding association list.
@item map_ assoc(+@var{ Pred} ,+@var{ Assoc} )
@findex map_ assoc/2
@syindex map_ assoc/2
@cnindex map_ assoc/2
Succeeds if the unary predicate name @var{ Pred} (@var{ Val} ) holds for every
element in the association list.
@item map_ assoc(+@var{ Pred} ,+@var{ Assoc} ,?@var{ New} )
@findex map_ assoc/3
@syindex map_ assoc/3
@cnindex map_ assoc/3
Given the binary predicate name @var{ Pred} and the association list
@var{ Assoc} , @var{ New} is an association list with keys in @var{ Assoc} ,
and such that if @var{ Key-Val} is in @var{ Assoc} , and @var{ Key-Ans} is in
@var{ New} , then @var{ Pred} (@var{ Val} ,@var{ Ans} ) holds.
@item max_ assoc(+@var{ Assoc} ,-@var{ Key} ,?@var{ Value} )
@findex max_ assoc/3
@syindex max_ assoc/3
@cnindex max_ assoc/3
Given the association list
@var{ Assoc} , @var{ Key} is the largest key in the list, and @var{ Value}
the associated value.
@item min_ assoc(+@var{ Assoc} ,-@var{ Key} ,?@var{ Value} )
@findex min_ assoc/3
@syindex min_ assoc/3
@cnindex min_ assoc/3
Given the association list
@var{ Assoc} , @var{ Key} is the smallest key in the list, and @var{ Value}
the associated value.
@item ord_ list_ to_ assoc(+@var{ List} ,?@var{ Assoc} )
@findex ord_ list_ to_ assoc/2
@syindex ord_ list_ to_ assoc/2
@cnindex ord_ list_ to_ assoc/2
Given an ordered list @var{ List} such that each element of @var{ List} is
of the form @var{ Key-Val} , and all the @var{ Keys} are unique, @var{ Assoc} is
the corresponding association list.
@item put_ assoc(+@var{ Key} ,+@var{ Assoc} ,+@var{ Val} ,+@var{ New} )
@findex put_ assoc/4
@syindex put_ assoc/4
@cnindex put_ assoc/4
The association list @var{ New} includes an element associating key
@var{ Key} with @var{ Val} , and all elements of @var{ Assoc} that did not
have key @var{ Key} .
@end table
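
A short sketch of typical use of the predicates above:

@example
:- use_module(library(assoc)).

?- list_to_assoc([a-1,b-2], A0),
   put_assoc(c, A0, 3, A),
   get_assoc(b, A, V),
   assoc_to_list(A, L).
V = 2,
L = [a-1,b-2,c-3]
@end example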
@node AVL Trees, Exo Intervals, Association Lists, Library
@section AVL Trees
@cindex AVL trees
AVL trees are balanced binary search trees. They are named after their
inventors, Adelson-Velskii and Landis, and they were the first
dynamically balanced trees to be proposed. The YAP AVL tree manipulation
predicates library uses code originally written by Martin van Emden and
published in the Logic Programming Newsletter, Autumn 1981. A bug in
this code was fixed by Philip Vasey, in the Logic Programming
Newsletter, Summer 1982. The library currently only includes routines to
insert and lookup elements in the tree. Please try red-black trees if
you need deletion.
@table @code
@item avl_ new(+@var{ T} )
@findex avl_ new/1
@snindex avl_ new/1
@cnindex avl_ new/1
Create a new tree.
@item avl_ insert(+@var{ Key} ,?@var{ Value} ,+@var{ T0} ,-@var{ TF} )
@findex avl_ insert/4
@snindex avl_ insert/4
@cnindex avl_ insert/4
Add an element with key @var{ Key} and @var{ Value} to the AVL tree
@var{ T0} creating a new AVL tree @var{ TF} . Duplicated elements are
allowed.
@item avl_ lookup(+@var{ Key} ,-@var{ Value} ,+@var{ T} )
@findex avl_ lookup/3
@snindex avl_ lookup/3
@cnindex avl_ lookup/3
Lookup an element with key @var{ Key} in the AVL tree
@var{ T} , returning the value @var{ Value} .
@end table
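
A small sketch of building and querying a tree with these predicates:

@example
:- use_module(library(avl)).

?- avl_new(T0),
   avl_insert(k1, v1, T0, T1),
   avl_insert(k2, v2, T1, T2),
   avl_lookup(k2, V, T2).
V = v2
@end example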
@node Exo Intervals, Gecode, AVL Trees, Library
@section Exo Intervals
@cindex Indexing Numeric Intervals in Exo-predicates
This package assumes you use exo-compilation, that is, that you loaded
the predicate using the @code{ exo} option to @code{ load_files/2} . In this
case, YAP includes a package for improved search on intervals of
integers.
The package is activated by @code{ udi} declarations that state which
argument is of interest:
@pl_ example
:- udi(diagnoses(exo_interval,?,?)).
:- load_files(db, [consult(exo)]).
@end pl_ example
It is designed to optimise the following type of queries:
@pl_ example
?- max(X, diagnoses(X, 9, Y), X).
?- min(X, diagnoses(X, 9, 36211117), X).
?- X #< Y, min(X, diagnoses(X, 9, 36211117), X), diagnoses(Y, 9, _).
@end pl_ example
The first argument gives the time, the second the patient, and the
third the condition code. The first query should find the last time
the patient 9 had any code reported, the second looks for the first
report of code 36211117, and the last searches for reports after this
one. All queries run in constant or log(n) time.
@node Gecode, Heaps, Exo Intervals, Library
@section Gecode Interface
@cindex gecode
The gecode library interface was designed and implemented by Denys
Duchier, with recent work by Vítor Santos Costa to port it to version 4
of gecode and to provide a higher level interface.
@menu
* The Gecode Interface:: calling gecode from YAP
* Gecode and CLP(FD) :: using gecode in a CLP(FD) style
@end menu
@node The Gecode Interface, Gecode and CLP(FD), , Gecode
@subsection The Gecode Interface
This text is due to Denys Duchier. The gecode interface requires
@pl_ example
:- use_module(library(gecode)).
@end pl_ example
Several example programs are available with the distribution.
@table @code
@item CREATING A SPACE
A space is gecode's data representation for a store of constraints:
@pl_ example
Space := space
@end pl_ example
@item CREATING VARIABLES
Unlike in Gecode, variable objects are not bound to a specific Space. Each one
actually contains an index with which it is possible to access a Space-bound
Gecode variable. Variables can be created using the following expressions:
@pl_ example
IVar := intvar(Space,SPEC...)
BVar := boolvar(Space)
SVar := setvar(Space,SPEC...)
@end pl_ example
where SPEC... is the same as in Gecode. For creating lists of variables use
the following variants:
@pl_ example
IVars := intvars(Space,N,SPEC...)
BVars := boolvars(Space,N,SPEC...)
SVars := setvars(Space,N,SPEC...)
@end pl_ example
where N is the number of variables to create (just like for XXXVarArray in
Gecode). Sometimes an IntSet is necessary:
@pl_ example
ISet := intset([SPEC...])
@end pl_ example
where each SPEC is either an integer or a pair (I,J) of integers. An IntSet
describes a set of ints by providing either intervals, or integers (which stand
for an interval of themselves). It might be tempting to simply represent an
IntSet as a list of specs, but this would be ambiguous with IntArgs which,
here, are represented as lists of ints.
@pl_ example
Space += keep(Var)
Space += keep(Vars)
@end pl_ example
Variables can be marked as "kept". In this case, only such variables will be
explicitly copied during search. This could bring substantial benefits in
memory usage. Of course, in a solution, you can then only look at variables
that have been "kept". If no variable is marked as "kept", then they are all
kept. Thus marking variables as "kept" is purely an optimization.
@item CONSTRAINTS AND BRANCHINGS
All constraint and branching posting functions are available just like in
Gecode. Wherever a XXXArgs or YYYSharedArray is expected, simply use a list.
At present, there is no support for minimodel-like constraint posting.
Constraints and branchings are added to a space using:
@pl_ example
Space += CONSTRAINT
Space += BRANCHING
@end pl_ example
For example:
@pl_ example
Space += rel(X,'IRT_EQ',Y)
@end pl_ example
arrays of variables are represented by lists of variables, and constants are
represented by atoms with the same name as the Gecode constant
(e.g. 'INT_VAR_SIZE_MIN').
@item SEARCHING FOR SOLUTIONS
@pl_ example
SolSpace := search(Space)
@end pl_ example
This is a backtrackable predicate that enumerates all solution spaces
(SolSpace). It may also take options:
@pl_ example
SolSpace := search(Space,Options)
@end pl_ example
Options is a list whose elements may be:
@table @code
@item restart
to select the Restart search engine
@item threads=N
to activate the parallel search engine and control the number of
workers (see Gecode doc)
@item c_d=N
to set the commit distance for recomputation
@item a_d=N
to set the adaptive distance for recomputation
@end table
@item EXTRACTING INFO FROM A SOLUTION
An advantage of non-Space-bound variables is that you can use them both to
post constraints in the original space AND to consult their values in
solutions. Below are methods for looking up information about variables. Each
of these methods can either take a variable as argument, or a list of
variables, and returns resp. either a value, or a list of values:
@pl_ example
Val := assigned(Space,X)
Val := min(Space,X)
Val := max(Space,X)
Val := med(Space,X)
Val := val(Space,X)
Val := size(Space,X)
Val := width(Space,X)
Val := regret_min(Space,X)
Val := regret_max(Space,X)
Val := glbSize(Space,V)
Val := lubSize(Space,V)
Val := unknownSize(Space,V)
Val := cardMin(Space,V)
Val := cardMax(Space,V)
Val := lubMin(Space,V)
Val := lubMax(Space,V)
Val := glbMin(Space,V)
Val := glbMax(Space,V)
Val := glb_ranges(Space,V)
Val := lub_ranges(Space,V)
Val := unknown_ranges(Space,V)
Val := glb_values(Space,V)
Val := lub_values(Space,V)
Val := unknown_values(Space,V)
@end pl_ example
@item DISJUNCTORS
Disjunctors provide support for disjunctions of clauses, where each clause is a
conjunction of constraints:
@pl_ example
C1 or C2 or ... or Cn
@end pl_ example
Each clause is executed "speculatively": this means it does not affect the main
space. When a clause becomes failed, it is discarded. When only one clause
remains, it is committed: this means that it now affects the main space.
Example:
Consider the problem where either X=Y=0 or X=Y+(1 or 2) for variable X and Y
that take values in 0..3.
@pl_ example
Space := space,
[X,Y] := intvars(Space,2,0,3),
@end pl_ example
First, we must create a disjunctor as a manager for our 2 clauses:
@pl_ example
Disj := disjunctor(Space),
@end pl_ example
We can now create our first clause:
@pl_ example
C1 := clause(Disj),
@end pl_ example
This clause wants to constrain X and Y to 0. However, since it must be
executed "speculatively", it must operate on new variables X1 and Y1 that
shadow X and Y:
@pl_ example
[X1,Y1] := intvars(C1,2,0,3),
C1 += forward([X,Y],[X1,Y1]),
@end pl_ example
The forward(...) stipulation indicates which global variable is shadowed by
which clause-local variable. Now we can post the speculative clause-local
constraints for X=Y=0:
@pl_ example
C1 += rel(X1,'IRT_EQ',0),
C1 += rel(Y1,'IRT_EQ',0),
@end pl_ example
We now create the second clause which uses X2 and Y2 to shadow X and Y:
@pl_ example
C2 := clause(Disj),
[X2,Y2] := intvars(C2,2,0,3),
C2 += forward([X,Y],[X2,Y2]),
@end pl_ example
However, this clause also needs a clause-local variable Z2 taking values 1 or
2 in order to post the clause-local constraint X2=Y2+Z2:
@pl_ example
Z2 := intvar(C2,1,2),
C2 += linear([-1,1,1],[X2,Y2,Z2],'IRT_EQ',0),
@end pl_ example
Finally, we can branch and search:
@pl_ example
Space += branch([X,Y],'INT_VAR_SIZE_MIN','INT_VAL_MIN'),
SolSpace := search(Space),
@end pl_ example
and lookup values of variables in each solution:
@pl_ example
[X_,Y_] := val(SolSpace,[X,Y]).
@end pl_ example
@end table
@node Gecode and CLP(FD), , The Gecode Interface, Gecode
@subsection Programming Finite Domain Constraints in YAP/Gecode
The gecode/clp(fd) interface is designed to use the GECODE functionality
in a more CLP-like style. It requires
@pl_ example
:- use_module(library(gecode/clpfd)).
@end pl_ example
Several example programs are available with the distribution.
2013-09-30 00:20:00 +01:00
Integer variables are declared as:
@table @code
@item @var{ V} in @var{ A} ..@var{ B}
declares an integer variable @var{ V} with range @var{ A} to @var{ B} .
@item @var{ Vs} ins @var{ A} ..@var{ B}
declares a set of integer variables @var{ Vs} with range @var{ A} to @var{ B} .
@item boolvar(@var{ V} )
declares a boolean variable.
@item boolvars(@var{ Vs} )
declares a set of boolean variables.
@end table
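
A short sketch of these declarations (the variable names are
illustrative):

@example
?- X in 1..10, [Y,Z] ins 0..5, boolvar(B), boolvars([B1,B2]).
@end example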
Constraints supported are:
@table @code
@item @var{ X} #= @var{ Y}
equality
@item @var{ X} #\= @var{ Y}
disequality
@item @var{ X} #> @var{ Y}
larger
@item @var{ X} #>= @var{ Y}
larger or equal
@item @var{ X} #=< @var{ Y}
smaller or equal
@item @var{ X} #< @var{ Y}
smaller
Arguments to this constraint may be an arithmetic expression with @t{ +} ,
@t{ -} , @t{ *} , integer division @t{ /} , @t{ min} , @t{ max} , @t{ sum} ,
@t{ count} , and
@t{ abs} . Boolean variables support conjunction (/\), disjunction (\/),
implication (=>), equivalence (<=>), and xor. The @t{ sum} constraint allows a two-argument version using the
@code{ where} conditional, in Zinc style.
The send more money equation may be written as:
@pl_ example
1000*S + 100*E + 10*N + D +
1000*M + 100*O + 10*R + E #=
10000*M + 1000*O + 100*N + 10*E + Y,
@end pl_ example
This example uses @code{ where} to select from
column @var{ I} the elements that have value under @var{ M} :
@pl_ example
OutFlow[I] #= sum(J in 1..N where D[J,I]<M, X[J,I])
@end pl_ example
The @t{ count} constraint counts the number of elements that match a
certain constant or variable (integer sets are not available).
@item all_different(@var{ Vs} )
@item all_distinct(@var{ Vs} )
@item all_different(@var{ Cs} , @var{ Vs} )
@item all_distinct(@var{ Cs} , @var{ Vs} )
verifies whether all elements of a list are different. In the second
case, tests if all the sums between a list of constants and a list of
variables are different.
This is a formulation of the queens problem that uses both versions of @code{ all_distinct} :
@pl_ example
queens(N, Queens) :-
    length(Queens, N),
    Queens ins 1..N,
    all_distinct(Queens),
    foldl(inc, Queens, Inc, 0, _), % [0, 1, 2, .... ]
    foldl(dec, Queens, Dec, 0, _), % [0, -1, -2, ... ]
    all_distinct(Inc,Queens),
    all_distinct(Dec,Queens),
    labeling([], Queens).

inc(_, I0, I0, I) :-
    I is I0+1.

dec(_, I0, I0, I) :-
    I is I0-1.
@end pl_ example
The next example uses @code{ all_different/1} and the functionality of the matrix package to verify that all squares in
sudoku have a different value:
@pl_ example
foreach( [I,J] ins 0..2 ,
    all_different(M[I*3+(0..2),J*3+(0..2)]) ),
@end pl_ example
@item scalar_product(+@var{ Cs} , +@var{ Vs} , +@var{ Rel} , ?@var{ V} )
The product of constant @var{ Cs} by @var{ Vs} must be in relation
@var{ Rel} with @var{ V} .
@item @var{ X} #=
all elements of @var{ X} must take the same value
@item @var{ X} #\=
not all elements of @var{ X} take the same value
@item @var{ X} #>
elements of @var{ X} must be increasing
@item @var{ X} #>=
elements of @var{ X} must be increasinga or equal
@item @var{ X} #=<
elements of @var{ X} must be decreasing
@item @var{ X} #<
elements of @var{ X} must be decreasing or equal
2013-09-29 11:31:18 +01:00
@item @var{ X} #<==> @var{ B}
reified equivalence
@item @var{ X} #==> @var{ B}
reified implication
@item @var{ X} #<== @var{ B}
reified implication (right to left)
As an example, consider finding out which people wanted to sit
next to a friend and are actually sitting together:
@pl_ example
preference_satisfied(X-Y, B) :-
    abs(X - Y) #= 1 #<==> B.
@end pl_ example
Note that not all constraints can be reified.
@item element(@var{ X} , @var{ Vs} )
@var{ X} is an element of list @var{ Vs}
@item clause(@var{ Type} , @var{ Ps} , @var{ Ns} , @var{ V} )
If @var{ Type} is @code{ and} , the conjunction of the boolean variables
@var{ Ps} and of the negations of the boolean variables @var{ Ns} must have
result @var{ V} . If @var{ Type} is @code{ or} , it is a disjunction.
@item DFA
the interface allows creating and manipulating deterministic finite
automata. A DFA has a set of states, represented as integers,
and is initialised with an initial state, a set of transitions (each one
going from its first argument to its last argument while emitting the
middle argument), and a list of final states.
The Swedish drinkers protocol is represented as follows:
@pl_ example
A = [X,Y,Z],
dfa( 0, [t(0,0,0),t(0,1,1),t(1,0,0),t(-1,0,0)], [0], C),
in_dfa( A, C ),
@end pl_ example
This code will enumerate the valid tuples of three emissions.
@item extensional constraints
Constraints can also be represented as lists of tuples.
The previous example
would be written as:
@pl_ example
extensional_constraint([[0,0,0],[0,1,0],[1,0,0]], C),
in_relation( A, C ),
@end pl_ example
@item minimum(@var{ X} , @var{ Vs} )
@item min(@var{ X} , @var{ Vs} )
The first argument is the least element of the list @var{ Vs} .
@item maximum(@var{ X} , @var{ Vs} )
@item max(@var{ X} , @var{ Vs} )
The first argument is the greatest element of the list @var{ Vs} .
@item lex_ order(@var{ Vs} )
All elements of @var{ Vs} must be in lexicographic order.
@end table
The following predicates control search:
@table @code
@item labeling(@var{ Opts} , @var{ Xs} )
performs labeling; several variable and value selection options are
available. The defaults are @code{ min} and @code{ min_ step} .
Variable selection options are as follows:
@table @code
@item leftmost
choose the first variable
@item min
choose one of the variables with the smallest minimum value
@item max
choose one of the variables with the greatest maximum value
@item ff
choose one of the most constrained variables, that is, with the smallest
domain.
@end table
Given that we selected a variable, the values chosen for branching may
be:
@table @code
@item min_ step
smallest value
@item max_ step
largest value
@item bisect
median
@item enum
all values, starting from the minimum.
@end table
@item maximize(@var{ V} )
maximise variable @var{ V}
@item minimize(@var{ V} )
minimise variable @var{ V}
@end table
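As an illustration, here is a sketch (using only declarations and options documented above; the variable names are arbitrary) that labels five pairwise-distinct variables, choosing the most constrained variable first and branching on the domain median:
@pl_ example
?- length(Vs, 5),
   Vs ins 1..10,
   all_different(Vs),
   labeling([ff, bisect], Vs).
@end pl_ example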
@node Heaps, Lists, Gecode, Library
@section Heaps
@cindex heap
A heap is a labelled binary tree where the key of each node is less than
or equal to the keys of its sons. The point of a heap is that we can
keep on adding new elements to the heap and we can keep on taking out
the minimum element. If there are N elements in total, the total time is
O(N lg N). If you know all the elements in advance, you are better off
doing a merge-sort, but this library is for when you want to do, say, a
best-first search, and have no idea when you start how many elements
there will be, let alone what they are.
The following heap manipulation routines are available once included
with the @code{ use_ module(library(heaps))} command.
@table @code
@item add_ to_ heap(+@var{ Heap} ,+@var{ Key} ,+@var{ Datum} ,-@var{ NewHeap} )
@findex add_ to_ heap/4
@syindex add_ to_ heap/4
@cnindex add_ to_ heap/4
Inserts the new @var{ Key-Datum} pair into the heap. The insertion is not
stable, that is, if you insert several pairs with the same @var{ Key} it
is not defined which of them will come out first, and it is possible for
any of them to come out first depending on the history of the heap.
@item empty_ heap(?@var{ Heap} )
@findex empty_ heap/1
@syindex empty_ heap/1
@cnindex empty_ heap/1
Succeeds if @var{ Heap} is an empty heap.
@item get_ from_ heap(+@var{ OldHeap} ,-@var{ Key} ,-@var{ Datum} ,-@var{ NewHeap} )
@findex get_ from_ heap/4
@syindex get_ from_ heap/4
@cnindex get_ from_ heap/4
Returns the @var{ Key-Datum} pair in @var{ OldHeap} with the smallest
@var{ Key} , and also @var{ NewHeap} , which is @var{ OldHeap} with that
pair deleted.
@item heap_ size(+@var{ Heap} , -@var{ Size} )
@findex heap_ size/2
@syindex heap_ size/2
@cnindex heap_ size/2
Reports the number of elements currently in the heap.
@item heap_ to_ list(+@var{ Heap} , -@var{ List} )
@findex heap_ to_ list/2
@syindex heap_ to_ list/2
@cnindex heap_ to_ list/2
Returns the current set of @var{ Key-Datum} pairs in the @var{ Heap} as a
@var{ List} , sorted into ascending order of @var{ Keys} .
@item list_ to_ heap(+@var{ List} , -@var{ Heap} )
@findex list_ to_ heap/2
@syindex list_ to_ heap/2
@cnindex list_ to_ heap/2
Takes a list of @var{ Key-Datum} pairs (such as keysort could be used to sort)
and forms them into a heap.
@item min_ of_ heap(+@var{ Heap} , -@var{ Key} , -@var{ Datum} )
@findex min_ of_ heap/3
@syindex min_ of_ heap/3
@cnindex min_ of_ heap/3
Returns the Key-Datum pair at the top of the heap (which is of course
the pair with the smallest Key), but does not remove it from the heap.
@item min_ of_ heap(+@var{ Heap} , -@var{ Key1} , -@var{ Datum1} ,
-@var{ Key2} , -@var{ Datum2} )
@findex min_ of_ heap/5
@syindex min_ of_ heap/5
@cnindex min_ of_ heap/5
Returns the smallest (Key1) and second smallest (Key2) pairs in the
heap, without deleting them.
@end table
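As a brief illustration (a sketch; variable names are arbitrary), a heap can be built from a list of @var{ Key-Datum} pairs, extended, and then queried for its minimum without removing it:
@pl_ example
?- list_to_heap([1-b,2-c,3-a], H0),
   add_to_heap(H0, 0, d, H1),
   min_of_heap(H1, K, D).
K = 0,
D = d
@end pl_ example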
@node Lists, LineUtilities, Heaps, Library
@section List Manipulation
@cindex list manipulation
The following list manipulation routines are available once included
with the @code{ use_ module(library(lists))} command.
@table @code
@item append(?@var{ Prefix} ,?@var{ Suffix} ,?@var{ Combined} )
@findex append/3
@syindex append/3
@cnindex append/3
True when all three arguments are lists, and the members of
@var{ Combined} are the members of @var{ Prefix} followed by the members of @var{ Suffix} .
It may be used to form @var{ Combined} from a given @var{ Prefix} , @var{ Suffix} or to take
a given @var{ Combined} apart.
@item append(?@var{ Lists} ,?@var{ Combined} )
@findex append/2
@syindex append/2
@cnindex append/2
True when @var{ Lists} is a list of lists and @var{ Combined} is the
concatenation of those lists.
@item delete(+@var{ List} , ?@var{ Element} , ?@var{ Residue} )
@findex delete/3
@syindex delete/3
@cnindex delete/3
True when @var{ List} is a list, in which @var{ Element} may or may not
occur, and @var{ Residue} is a copy of @var{ List} with all elements
identical to @var{ Element} deleted.
@item flatten(+@var{ List} , ?@var{ FlattenedList} )
@findex flatten/2
@syindex flatten/2
@cnindex flatten/2
Flatten a list of lists @var{ List} into a single list
@var{ FlattenedList} .
@pl_ example
?- flatten([[1],[2,3],[4,[5,6],7,8]],L).
L = [1,2,3,4,5,6,7,8] ? ;
no
@end pl_ example
@item last(+@var{ List} ,?@var{ Last} )
@findex last/2
@syindex last/2
@cnindex last/2
True when @var{ List} is a list and @var{ Last} is identical to its last element.
@item list_ concat(+@var{ Lists} ,?@var{ List} )
@findex list_ concat/2
@snindex list_ concat/2
@cnindex list_ concat/2
True when @var{ Lists} is a list of lists and @var{ List} is the
concatenation of @var{ Lists} .
@item member(?@var{ Element} , ?@var{ Set} )
@findex member/2
@syindex member/2
@cnindex member/2
True when @var{ Set} is a list, and @var{ Element} occurs in it. It may be used
to test for an element or to enumerate all the elements by backtracking.
@item memberchk(+@var{ Element} , +@var{ Set} )
@findex memberchk/2
@syindex memberchk/2
@cnindex memberchk/2
As @code{ member/2} , but may only be used to test whether a known
@var{ Element} occurs in a known Set. In return for this limited use, it
is more efficient when it is applicable.
@item nth0(?@var{ N} , ?@var{ List} , ?@var{ Elem} )
@findex nth0/3
@syindex nth0/3
@cnindex nth0/3
True when @var{ Elem} is the Nth member of @var{ List} ,
counting the first as element 0. (That is, throw away the first
N elements and unify @var{ Elem} with the next.) It can only be used to
select a particular element given the list and index. For that
task it is more efficient than @code{ member/2} .
@item nth1(?@var{ N} , ?@var{ List} , ?@var{ Elem} )
@findex nth1/3
@syindex nth1/3
@cnindex nth1/3
The same as @code{ nth0/3} , except that it counts from
1, that is @code{ nth(1, [H|_ ], H)} .
@item nth(?@var{ N} , ?@var{ List} , ?@var{ Elem} )
@findex nth/3
@syindex nth/3
@cnindex nth/3
The same as @code{ nth1/3} .
@item nth0(?@var{ N} , ?@var{ List} , ?@var{ Elem} , ?@var{ Rest} )
@findex nth0/4
@syindex nth0/4
@cnindex nth0/4
Unifies @var{ Elem} with the Nth element of @var{ List} ,
counting from 0, and @var{ Rest} with the other elements. It can be used
to select the Nth element of @var{ List} (yielding @var{ Elem} and @var{ Rest} ), or to
insert @var{ Elem} before the Nth (counting from 1) element of @var{ Rest} , when
it yields @var{ List} , e.g. @code{ nth0(2, List, c, [a,b,d,e])} unifies List with
@code{ [a,b,c,d,e]} . @code{ nth/4} is the same except that it counts from 1. @code{ nth0/4}
can be used to insert @var{ Elem} after the Nth element of @var{ Rest} .
@item nth1(?@var{ N} , ?@var{ List} , ?@var{ Elem} , ?@var{ Rest} )
@findex nth1/4
@syindex nth1/4
@cnindex nth1/4
Unifies @var{ Elem} with the Nth element of @var{ List} , counting from 1,
and @var{ Rest} with the other elements. It can be used to select the
Nth element of @var{ List} (yielding @var{ Elem} and @var{ Rest} ), or to
insert @var{ Elem} before the Nth (counting from 1) element of
@var{ Rest} , when it yields @var{ List} , e.g. @code{ nth(3, List, c, [a,b,d,e])} unifies List with @code{ [a,b,c,d,e]} . @code{ nth/4}
can be used to insert @var{ Elem} after the Nth element of @var{ Rest} .
@item nth(?@var{ N} , ?@var{ List} , ?@var{ Elem} , ?@var{ Rest} )
@findex nth/4
@syindex nth/4
@cnindex nth/4
Same as @code{ nth1/4} .
@item permutation(+@var{ List} ,?@var{ Perm} )
@findex permutation/2
@syindex permutation/2
@cnindex permutation/2
True when @var{ List} and @var{ Perm} are permutations of each other.
@item remove_ duplicates(+@var{ List} , ?@var{ Pruned} )
@findex remove_ duplicates/2
@syindex remove_ duplicates/2
@cnindex remove_ duplicates/2
Removes duplicated elements from @var{ List} . Beware: if the @var{ List} has
non-ground elements, the result may surprise you.
@item reverse(+@var{ List} , ?@var{ Reversed} )
@findex reverse/2
@syindex reverse/2
@cnindex reverse/2
True when @var{ List} and @var{ Reversed} are lists with the same elements
but in opposite orders.
@item same_ length(?@var{ List1} , ?@var{ List2} )
@findex same_ length/2
@syindex same_ length/2
@cnindex same_ length/2
True when @var{ List1} and @var{ List2} are both lists and have the same number
of elements. No relation between the values of their elements is
implied.
Modes @code{ same_ length(-,+)} and @code{ same_ length(+,-)} generate either list given
the other; mode @code{ same_ length(-,-)} generates two lists of the same length,
in which case the arguments will be bound to lists of length 0, 1, 2, ...
@item select(?@var{ Element} , ?@var{ List} , ?@var{ Residue} )
@findex select/3
@syindex select/3
@cnindex select/3
True when @var{ List} is a list, @var{ Element} occurs in @var{ List} , and
@var{ Residue} is everything in @var{ List} except @var{ Element} (things
stay in the same order).
@item selectchk(?@var{ Element} , ?@var{ List} , ?@var{ Residue} )
@findex selectchk/3
@snindex selectchk/3
@cnindex selectchk/3
Semi-deterministic selection from a list. Steadfast: defined as
@pl_ example
selectchk(Elem, List, Residue) :-
    select(Elem, List, Residue0), !,
    Residue = Residue0.
@end pl_ example
@item sublist(?@var{ Sublist} , ?@var{ List} )
@findex sublist/2
@syindex sublist/2
@cnindex sublist/2
True when both @code{ append(_ ,Sublist,S)} and @code{ append(S,_ ,List)} hold.
@item suffix(?@var{ Suffix} , ?@var{ List} )
@findex suffix/2
@syindex suffix/2
@cnindex suffix/2
Holds when @code{ append(_ ,Suffix,List)} holds.
@item sum_ list(?@var{ Numbers} , ?@var{ Total} )
@findex sum_ list/2
@syindex sum_ list/2
@cnindex sum_ list/2
True when @var{ Numbers} is a list of numbers, and @var{ Total} is their sum.
@item sum_ list(?@var{ Numbers} , +@var{ SoFar} , ?@var{ Total} )
@findex sum_ list/3
@syindex sum_ list/3
@cnindex sum_ list/3
True when @var{ Numbers} is a list of numbers, and @var{ Total} is their sum plus @var{ SoFar} .
@item sumlist(?@var{ Numbers} , ?@var{ Total} )
@findex sumlist/2
@syindex sumlist/2
@cnindex sumlist/2
2001-04-26 15:44:43 +01:00
True when @var{ Numbers} is a list of integers, and @var{ Total} is their
sum. The same as @code{ sum_ list/2} , please do use @code{ sum_ list/2}
instead.
2001-04-09 20:54:03 +01:00
2005-01-29 04:43:14 +00:00
@item max_ list(?@var{ Numbers} , ?@var{ Max} )
@findex max_ list/2
@syindex max_ list/2
@cnindex max_ list/2
True when @var{ Numbers} is a list of numbers, and @var{ Max} is the maximum.
@item min_ list(?@var{ Numbers} , ?@var{ Min} )
@findex min_ list/2
@syindex min_ list/2
@cnindex min_ list/2
True when @var{ Numbers} is a list of numbers, and @var{ Min} is the minimum.
@item numlist(+@var{ Low} , +@var{ High} , ?@var{ List} )
@findex numlist/3
@syindex numlist/3
@cnindex numlist/3
If @var{ Low} and @var{ High} are integers with @var{ Low} =<
@var{ High} , unify @var{ List} with the list @code{ [Low, Low+1, ..., High]} . See
also @code{ between/3} .
@item intersection(+@var{ Set1} , +@var{ Set2} , ?@var{ Set3} )
@findex intersection/3
@syindex intersection/3
@cnindex intersection/3
Succeeds if @var{ Set3} unifies with the intersection of @var{ Set1} and
@var{ Set2} . @var{ Set1} and @var{ Set2} are lists without duplicates. They
need not be ordered.
@item subtract(+@var{ Set} , +@var{ Delete} , ?@var{ Result} )
@findex subtract/3
@syindex subtract/3
@cnindex subtract/3
Delete all elements from @var{ Set} that occur in @var{ Delete} (a set)
and unify the result with @var{ Result} . Deletion is based on
unification using @code{ memberchk/2} . The complexity is
@code{ |Delete|*|Set|} .
See @code{ ord_ subtract/3} .
@end table
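The following queries sketch typical uses of some of these predicates (the bindings shown are the expected answers; the exact answer formatting may differ):
@pl_ example
?- numlist(1, 5, L), last(L, X).
L = [1,2,3,4,5], X = 5
?- nth0(1, [a,b,c], E), nth1(1, [a,b,c], F).
E = b, F = a
?- subtract([a,b,c,d], [b,d], R).
R = [a,c]
@end pl_ example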
@node LineUtilities, MapArgs, Lists, Library
@section Line Manipulation Utilities
@cindex Line Utilities Library
This package provides a set of useful predicates to manipulate
sequences of character codes, usually first read in as a line. It is
available by loading the library @code{ library(lineutils)} .
@table @code
@item search_ for(+@var{ Char} ,+@var{ Line} )
@findex search_ for/2
@snindex search_ for/2
@cnindex search_ for/2
Search for a character @var{ Char} in the list of codes @var{ Line} .
@item search_ for(+@var{ Char} ,+@var{ Line} ,-@var{ RestOfLine} )
@findex search_ for/3
@snindex search_ for/3
@cnindex search_ for/3
Search for a character @var{ Char} in the list of codes @var{ Line} ;
@var{ RestOfLine} is the part of the line to the right of @var{ Char} .
@item scan_ natural(?@var{ Nat} ,+@var{ Line} ,+@var{ RestOfLine} )
@findex scan_ natural/3
@snindex scan_ natural/3
@cnindex scan_ natural/3
Scan the list of codes @var{ Line} for a natural number @var{ Nat} , zero
or a positive integer, and unify @var{ RestOfLine} with the remainder
of the line.
@item scan_ integer(?@var{ Int} ,+@var{ Line} ,+@var{ RestOfLine} )
@findex scan_ integer/3
@snindex scan_ integer/3
@cnindex scan_ integer/3
Scan the list of codes @var{ Line} for an integer @var{ Int} , either a
positive, zero, or negative integer, and unify @var{ RestOfLine} with
the remainder of the line.
@item split(+@var{ Line} ,+@var{ Separators} ,-@var{ Split} )
@findex split/3
@snindex split/3
@cnindex split/3
Unify @var{ Split} with a list of strings obtained from @var{ Line} by
using the character codes in @var{ Separators} as separators. As an
example, consider:
@pl_ example
?- split("Hello * I am free"," *",S).
S = ["Hello","I","am","free"] ?
no
@end pl_ example
@item split(+@var{ Line} ,-@var{ Split} )
@findex split/2
@snindex split/2
@cnindex split/2
Unify @var{ Split} with a list of strings obtained from @var{ Line} by
using the blank characters as separators.
@item fields(+@var{ Line} ,+@var{ Separators} ,-@var{ Split} )
@findex fields/3
@snindex fields/3
@cnindex fields/3
Unify @var{ Split} with a list of strings obtained from @var{ Line} by
using the character codes in @var{ Separators} as separators for
fields. If two separators occur in a row, the field between them is considered
empty. As an example, consider:
@pl_ example
?- fields("Hello I am free"," *",S).
S = ["Hello","","I","am","","free"] ?
@end pl_ example
@item fields(+@var{ Line} ,-@var{ Split} )
@findex fields/2
@snindex fields/2
@cnindex fields/2
Unify @var{ Split} with a list of strings obtained from @var{ Line} by
using the blank characters as field separators.
@item glue(+@var{ Words} ,+@var{ Separator} ,-@var{ Line} )
@findex glue/3
@snindex glue/3
@cnindex glue/3
Unify @var{ Line} with string obtained by glueing @var{ Words} with
the character code @var{ Separator} .
@item copy_ line(+@var{ StreamInput} ,+@var{ StreamOutput} )
@findex copy_ line/2
@snindex copy_ line/2
@cnindex copy_ line/2
Copy a line from @var{ StreamInput} to @var{ StreamOutput} .
@item process(+@var{ StreamInp} , +@var{ Goal} )
@findex process/2
@snindex process/2
@cnindex process/2
For every line @var{ LineIn} in stream @var{ StreamInp} , call
@code{ call(Goal,LineIn)} .
@item filter(+@var{ StreamInp} , +@var{ StreamOut} , +@var{ Goal} )
@findex filter/3
@snindex filter/3
@cnindex filter/3
For every line @var{ LineIn} in stream @var{ StreamInp} , execute
@code{ call(Goal,LineIn,LineOut)} , and output @var{ LineOut} to
stream @var{ StreamOut} .
@item file_ filter(+@var{ FileIn} , +@var{ FileOut} , +@var{ Goal} )
@findex file_ filter/3
@snindex file_ filter/3
@cnindex file_ filter/3
For every line @var{ LineIn} in file @var{ FileIn} , execute
@code{ call(Goal,LineIn,LineOut)} , and output @var{ LineOut} to file
@var{ FileOut} .
@item file_ filter_ with_ init(+@var{ FileIn} , +@var{ FileOut} , +@var{ Goal} ,
+@var{ FormatCommand} , +@var{ Arguments} )
@findex file_ filter_ with_ init/5
@snindex file_ filter_ with_ init/5
@cnindex file_ filter_ with_ init/5
Same as @code{ file_ filter/3} , but before starting the filter execute
@code{ format/3} on the output stream, using @var{ FormatCommand} and
@var{ Arguments} .
@end table
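As a sketch of how the file filters are used (the predicate @code{ add_ gt} and the file names are hypothetical; lines are lists of character codes, and double-quoted strings are assumed to denote code lists), one could prefix every line of a file with @code{ "> "} :
@pl_ example
add_gt(LineIn, LineOut) :-
    append("> ", LineIn, LineOut).

?- file_filter('input.txt', 'output.txt', add_gt).
@end pl_ example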
@texinfo
@node MapArgs, MapList, LineUtilities, Library
@section Mapargs
@cindex macros
This library provides a set of utilities for applying a predicate to
all arguments of a term. They make it easy to
perform the most common do-loop constructs in Prolog. To avoid
performance degradation due to @code{ apply/2} , each call creates an
equivalent Prolog program, without meta-calls, which is executed by
the Prolog engine instead.
@table @code
@item mapargs(+@var{ Pred} , +@var{ TermIn} )
@findex mapargs/2
@snindex mapargs/2
@cnindex mapargs/2
Applies the predicate @var{ Pred} to all
arguments of @var{ TermIn}
@item mapargs(+@var{ Pred} , +@var{ TermIn} , ?@var{ TermOut} )
@findex mapargs/3
@snindex mapargs/3
@cnindex mapargs/3
Creates @var{ TermOut} by applying the predicate @var{ Pred} to all
arguments of @var{ TermIn}
@item mapargs(+@var{ Pred} , +@var{ TermIn} , ?@var{ TermOut1} , ?@var{ TermOut2} )
@findex mapargs/4
@snindex mapargs/4
@cnindex mapargs/4
Creates @var{ TermOut1} and @var{ TermOut2} by applying the predicate @var{ Pred} to all
arguments of @var{ TermIn}
@item mapargs(+@var{ Pred} , +@var{ TermIn} , ?@var{ TermOut1} , ?@var{ TermOut2} , ?@var{ TermOut3} )
@findex mapargs/5
@snindex mapargs/5
@cnindex mapargs/5
Creates @var{ TermOut1} , @var{ TermOut2} and @var{ TermOut3} by applying the predicate @var{ Pred} to all
arguments of @var{ TermIn}
@item mapargs(+@var{ Pred} , +@var{ TermIn} , ?@var{ TermOut1} , ?@var{ TermOut2} , ?@var{ TermOut3} , ?@var{ TermOut4} )
@findex mapargs/6
@snindex mapargs/6
@cnindex mapargs/6
Creates @var{ TermOut1} , @var{ TermOut2} , @var{ TermOut3} and @var{ TermOut4} by applying the predicate @var{ Pred} to all
arguments of @var{ TermIn}
@item foldargs(+@var{ Pred} , +@var{ Term} , ?@var{ AccIn} , ?@var{ AccOut} )
@findex foldargs/4
@snindex foldargs/4
@cnindex foldargs/4
Calls the predicate @var{ Pred} on all arguments of @var{ Term} and
collects a result in @var{ Accumulator}
@item foldargs(+@var{ Pred} , +@var{ Term} , +@var{ Term1} , ?@var{ AccIn} , ?@var{ AccOut} )
@findex foldargs/5
@snindex foldargs/5
@cnindex foldargs/5
Calls the predicate @var{ Pred} on all arguments of @var{ Term} and @var{ Term1} and
collects a result in @var{ Accumulator}
@item foldargs(+@var{ Pred} , +@var{ Term} , +@var{ Term1} , +@var{ Term2} , ?@var{ AccIn} , ?@var{ AccOut} )
@findex foldargs/6
@snindex foldargs/6
@cnindex foldargs/6
Calls the predicate @var{ Pred} on all arguments of @var{ Term} , @var{ Term1} and @var{ Term2} and
collects a result in @var{ Accumulator}
@item foldargs(+@var{ Pred} , +@var{ Term} , +@var{ Term1} , +@var{ Term2} , +@var{ Term3} , ?@var{ AccIn} , ?@var{ AccOut} )
@findex foldargs/7
@snindex foldargs/7
@cnindex foldargs/7
Calls the predicate @var{ Pred} on all arguments of @var{ Term} , @var{ Term1} , @var{ Term2} and @var{ Term3} and
collects a result in @var{ Accumulator}
@item sumargs(+@var{ Pred} , +@var{ Term} , ?@var{ AccIn} , ?@var{ AccOut} )
@findex sumargs/4
@snindex sumargs/4
@cnindex sumargs/4
Calls the predicate @var{ Pred} on all arguments of @var{ Term} and
collects a result in @var{ Accumulator} (uses reverse order than foldargs).
@end table
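As a quick sketch of the argument-mapping predicates (the helpers @code{ add1/2} and @code{ add_ acc/3} are defined just for this example, and we assume @code{ foldargs/4} threads the accumulator through the last two arguments of the goal):
@pl_ example
add1(X, Y) :- Y is X + 1.
add_acc(X, A0, A) :- A is A0 + X.

?- mapargs(add1, s(1,2,3), T).
T = s(2,3,4)
?- foldargs(add_acc, s(1,2,3), 0, Sum).
Sum = 6
@end pl_ example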
@texinfo
@node MapList, matrix, MapArgs, Library
@section Maplist
@cindex macros
This library provides a set of utilities for applying a predicate to
all elements of a list or to all sub-terms of a term. They make it easy
to perform the most common do-loop constructs in Prolog. To avoid
performance degradation due to @code{ apply/2} , each call creates an
equivalent Prolog program, without meta-calls, which is executed by
the Prolog engine instead. Note that if the equivalent Prolog program
already exists, it will simply be reused. The library is based on code
by Joachim Schimpf and on code from SWI-Prolog.
The following routines are available once included with the
@code{ use_ module(library(apply_ macros))} command.
@table @code
@item maplist(:@var{ Pred} , ?@var{ ListIn} , ?@var{ ListOut} )
@findex maplist/3
@snindex maplist/3
@cnindex maplist/3
Creates @var{ ListOut} by applying the predicate @var{ Pred} to all
elements of @var{ ListIn} .
@item maplist(:@var{ Pred} , ?@var{ ListIn} )
@findex maplist/2
@snindex maplist/2
@cnindex maplist/2
Succeeds if the predicate @var{ Pred} succeeds on all
elements of @var{ ListIn} .
@item maplist(:@var{ Pred} , ?@var{ L1} , ?@var{ L2} , ?@var{ L3} )
@findex maplist/4
@snindex maplist/4
@cnindex maplist/4
@var{ L1} , @var{ L2} , and @var{ L3} are such that
@code{ call(@var{ Pred} ,@var{ A1} ,@var{ A2} ,@var{ A3} )} holds for every
corresponding element in lists @var{ L1} , @var{ L2} , and @var{ L3} .
@item maplist(:@var{ Pred} , ?@var{ L1} , ?@var{ L2} , ?@var{ L3} , ?@var{ L4} )
@findex maplist/5
@snindex maplist/5
@cnindex maplist/5
@var{ L1} , @var{ L2} , @var{ L3} , and @var{ L4} are such that
@code{ call(@var{ Pred} ,@var{ A1} ,@var{ A2} ,@var{ A3} ,@var{ A4} )} holds
for every corresponding element in lists @var{ L1} , @var{ L2} , @var{ L3} , and
@var{ L4} .
@item checklist(:@var{ Pred} , +@var{ List} )
@findex checklist/2
@snindex checklist/2
@cnindex checklist/2
Succeeds if the predicate @var{ Pred} succeeds on all elements of @var{ List} .
@item selectlist(:@var{ Pred} , +@var{ ListIn} , ?@var{ ListOut} )
@findex selectlist/3
@snindex selectlist/3
@cnindex selectlist/3
Creates @var{ ListOut} with all the elements of @var{ ListIn} that pass the test @var{ Pred} .
@item selectlist(:@var{ Pred} , +@var{ ListIn} , +@var{ ListInAux} , ?@var{ ListOut} )
@findex selectlist/4
@snindex selectlist/4
@cnindex selectlist/4
Creates @var{ ListOut} with all the elements of @var{ ListIn} that
pass the test @var{ Pred} using @var{ ListInAux} as an
auxiliary argument.
@item convlist(:@var{ Pred} , +@var{ ListIn} , ?@var{ ListOut} )
@findex convlist/3
@snindex convlist/3
@cnindex convlist/3
A combination of @code{ maplist} and @code{ selectlist} : creates @var{ ListOut} by
applying the predicate @var{ Pred} to all list elements on which
@var{ Pred} succeeds.
@item sumlist(:@var{ Pred} , +@var{ List} , ?@var{ AccIn} , ?@var{ AccOut} )
@findex sumlist/4
@snindex sumlist/4
@cnindex sumlist/4
Calls @var{ Pred} on all elements of @var{ List} and collects a result in
@var{ Accumulator} . Same as @code{ foldl/4} .
@item foldl(:@var{ Pred} , +@var{ List} , ?@var{ AccIn} , ?@var{ AccOut} )
@findex foldl/4
@snindex foldl/4
@cnindex foldl/4
Calls @var{ Pred} on all elements of @code{ List} and collects a result in
@var{ Accumulator} .
@item foldl(:@var{ Pred} , +@var{ List1} , +@var{ List2} , ?@var{ AccIn} , ?@var{ AccOut} )
@findex foldl/5
@snindex foldl/5
@cnindex foldl/5
Calls @var{ Pred} on all elements of @code{ List1} and
@code{ List2} and collects a result in @var{ Accumulator} .
@item foldl(:@var{ Pred} , +@var{ List1} , +@var{ List2} , +@var{ List3} , ?@var{ AccIn} , ?@var{ AccOut} )
@findex foldl/6
@snindex foldl/6
@cnindex foldl/6
Calls @var{ Pred} on all elements of @code{ List1} , @code{ List2} , and
@code{ List3} and collects a result in @var{ Accumulator} .
@item foldl(:@var{ Pred} , +@var{ List1} , +@var{ List2} , +@var{ List3} , +@var{ List4} , ?@var{ AccIn} , ?@var{ AccOut} )
@findex foldl/7
@snindex foldl/7
@cnindex foldl/7
Calls @var{ Pred} on all elements of @code{ List1} , @code{ List2} , @code{ List3} , and
@code{ List4} and collects a result in @var{ Accumulator} .
@item foldl2(:@var{ Pred} , +@var{ List} , ?@var{ X0} , ?@var{ X} , ?@var{ Y0} , ?@var{ Y} )
@findex foldl2/6
@snindex foldl2/6
@cnindex foldl2/6
Calls @var{ Pred} on all elements of @code{ List} and collects a result in
@var{ X} and @var{ Y} .
@item foldl2(:@var{ Pred} , +@var{ List} , ?@var{ List1} , ?@var{ X0} , ?@var{ X} , ?@var{ Y0} , ?@var{ Y} )
@findex foldl2/7
@snindex foldl2/7
@cnindex foldl2/7
Calls @var{ Pred} on all elements of @var{ List} and @var{ List1} and collects a result in
@var{ X} and @var{ Y} .
@item foldl2(:@var{ Pred} , +@var{ List} , ?@var{ List1} , ?@var{ List2} , ?@var{ X0} , ?@var{ X} , ?@var{ Y0} , ?@var{ Y} )
@findex foldl2/8
@snindex foldl2/8
@cnindex foldl2/8
Calls @var{ Pred} on all elements of @var{ List} , @var{ List1} and @var{ List2} and collects a result in
@var{ X} and @var{ Y} .
@item foldl3(:@var{ Pred} , +@var{ List1} , ?@var{ List2} , ?@var{ X0} , ?@var{ X} , ?@var{ Y0} , ?@var{ Y} , ?@var{ Z0} , ?@var{ Z} )
@findex foldl3/6
@snindex foldl3/6
@cnindex foldl3/6
Calls @var{ Pred} on all elements of @code{ List} and collects a
result in @var{ X} , @var{ Y} and @var{ Z} .
@item foldl4(:@var{ Pred} , +@var{ List1} , ?@var{ List2} , ?@var{ X0} , ?@var{ X} , ?@var{ Y0} , ?@var{ Y} , ?@var{ Z0} , ?@var{ Z} , ?@var{ W0} , ?@var{ W} )
@findex foldl4/8
@snindex foldl4/8
@cnindex foldl4/8
Calls @var{ Pred} on all elements of @code{ List} and collects a
result in @var{ X} , @var{ Y} , @var{ Z} and @var{ W} .
@item scanl(:@var{ Pred} , +@var{ List} , +@var{ V0} , ?@var{ Values} )
@findex scanl/4
@snindex scanl/4
@cnindex scanl/4
Left scan of list. The scanl family of higher order list
operations is defined by:
@pl_ example
scanl(P, [X11,...,X1n], ..., [Xm1,...,Xmn], V0, [V0,V1,...,Vn]) :-
    P(X11, ..., Xm1, V0, V1),
    ...
    P(X1n, ..., Xmn, Vn-1, Vn).
@end pl_ example
@item scanl(:@var{ Pred} , +@var{ List1} , +@var{ List2} , ?@var{ V0} , ?@var{ Vs} )
@findex scanl/5
@snindex scanl/5
@cnindex scanl/5
Left scan of list.
@item scanl(:@var{ Pred} , +@var{ List1} , +@var{ List2} , +@var{ List3} , ?@var{ V0} , ?@var{ Vs} )
@findex scanl/6
@snindex scanl/6
@cnindex scanl/6
Left scan of list.
@item scanl(:@var{ Pred} , +@var{ List1} , +@var{ List2} , +@var{ List3} , +@var{ List4} , ?@var{ V0} , ?@var{ Vs} )
@findex scanl/7
@snindex scanl/7
@cnindex scanl/7
Left scan of list.
@item mapnodes(+@var{ Pred} , +@var{ TermIn} , ?@var{ TermOut} )
@findex mapnodes/3
@snindex mapnodes/3
@cnindex mapnodes/3
Creates @var{ TermOut} by applying the predicate @var{ Pred}
to all sub-terms of @var{ TermIn} (depth-first and left-to-right order)
@item checknodes(+@var{ Pred} , +@var{ Term} )
@findex checknodes/2
@snindex checknodes/2
@cnindex checknodes/2
Succeeds if the predicate @var{ Pred} succeeds on all sub-terms of
@var{ Term} (depth-first and left-to-right order)
@item sumnodes(+@var{ Pred} , +@var{ Term} , ?@var{ AccIn} , ?@var{ AccOut} )
@findex sumnodes/4
@snindex sumnodes/4
@cnindex sumnodes/4
Calls the predicate @var{ Pred} on all sub-terms of @var{ Term} and
collect a result in @var{ Accumulator} (depth-first and left-to-right
order)
@item include(+@var{ Pred} , +@var{ ListIn} , ?@var{ ListOut} )
@findex include/3
@snindex include/3
@cnindex include/3
Same as @code{ selectlist/3} .
@item exclude(+@var{ Goal} , +@var{ List1} , ?@var{ List2} )
@findex exclude/3
@snindex exclude/3
@cnindex exclude/3
Filter elements for which @var{ Goal} fails. True if @var{ List2} contains
those elements @var{ Xi} of @var{ List1} for which @code{ call(Goal, Xi)} fails.
@item partition(+@var{ Pred} , +@var{ List1} , ?@var{ Included} , ?@var{ Excluded} )
@findex partition/4
@snindex partition/4
@cnindex partition/4
Filter elements of @var{ List} according to @var{ Pred} . True if
@var{ Included} contains all elements for which @code{ call(Pred, X)}
succeeds and @var{ Excluded} contains the remaining elements.
@item partition(+@var{ Pred} , +@var{ List1} , ?@var{ Lesser} , ?@var{ Equal} , ?@var{ Greater} )
@findex partition/5
@snindex partition/5
@cnindex partition/5
Filter list according to @var{ Pred} in three sets. For each element
@var{ Xi} of @var{ List} , its destination is determined by
@code{ call(Pred, Xi, Place)} , where @var{ Place} must be unified to one
of @code{ <} , @code{ =} or @code{ >} . @code{ Pred} must be deterministic.
@end table
Examples:
@pl_ example
%given
plus(X,Y,Z) :- Z is X + Y.
plus_if_pos(X,Y,Z) :- Y > 0, Z is X + Y.
vars(X, Y, [X|Y]) :- var(X), !.
vars(_, Y, Y).
trans(TermIn, TermOut) :-
    nonvar(TermIn),
    TermIn =.. [p|Args],
    TermOut =.. [q|Args], !.
trans(X,X).
%success
maplist(plus(1), [1,2,3,4], [2,3,4,5]).
checklist(var, [X,Y,Z]).
selectlist(<(0), [-1,0,1], [1]).
convlist(plus_if_pos(1), [-1,0,1], [2]).
sumlist(plus, [1,2,3,4], 1, 11).
mapargs(number_atom, s(1,2,3), s('1','2','3')).
sumargs(vars, s(1,X,2,Y), [], [Y,X]).
mapnodes(trans, p(a,p(b,a),c), q(a,q(b,a),c)).
checknodes(\==(T), p(X,p(Y,X),Z)).
sumnodes(vars, [c(X), p(X,Y), q(Y)], [], [Y,Y,X,X]).
% another one
maplist(mapargs(number_atom), [c(1),s(1,2,3)], [c('1'),s('1','2','3')]).
@end pl_ example
@end texinfo
@node matrix, MATLAB, MapList, Library
@section Matrix Library
@cindex Matrix Library
This package provides a fast implementation of multi-dimensional
matrices of integers and floats. In contrast to dynamic arrays, these
matrices are multi-dimensional and compact. In contrast to static
arrays, these arrays are allocated on the stack. Matrices are available
by loading the library @code{ library(matrix)} . They are multi-dimensional
objects of type:
@itemize
@item @t{ terms} : Prolog terms
@item @t{ ints} : bounded integers, represented as an opaque term. The
maximum integer depends on hardware, but should be obtained from the
natural size of the machine.
@item @t{ floats} : floating-point numbers, represented as an opaque term.
@end itemize
Matrix elements can be accessed through the @code{ matrix_ get/2}
predicate or through an @t{ R} -inspired access notation (that uses the Ciao-style
extension to @code{ []} ). Examples include:
@table @code
@item @var{ E} <== @var{ X} [2,3]
Access the element at row 2, column 3 of matrix @t{ X} . Indices start from
@code{ 0} .
@item @var{ L} <== @var{ X} [2,_ ]
Access all of row 2; the output is a list of elements.
@item @var{ L} <== @var{ X} [2..4,_ ]
Access all of rows 2, 3 and 4; the output is a list of elements.
@item @var{ L} <== @var{ X} [2..4+3,_ ]
Access all of rows 5, 6 and 7; the output is a list of elements.
@end table
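For instance, here is a sketch of building a small matrix from a list and reading one cell back with the access notation (the option syntax follows the @code{ matrix/2} right-hand-side operator documented below; the exact option list is an assumption):
@pl_ example
?- X <== matrix([1,2,3,4,5,6], [dim=[2,3]]),
   E <== X[1,2].
E = 6
@end pl_ example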
The matrix library also supports a B-Prolog/ECLiPSe inspired @code{ foreach} iterator to iterate over
elements of a matrix:
@table @code
@item foreach(I in 0..N1, X[I] <== Y[I])
Copies a vector, element by element.
@item foreach([I in 0..N1, J in I..N1], Z[I,J] <== X[I,J] - X[J,I])
The lower-triangular matrix @var{ Z} is the difference between the
lower-triangular and upper-triangular parts of @var{ X} .
@item foreach([I in 0..N1, J in 0..N1], plus(X[I,J]), 0, Sum)
Add all elements of a matrix by using @var{ Sum} as an accumulator.
@end table
Notice that the library does not support all known matrix operations. Please
contact the YAP maintainers if you require extra functionality.
@table @code
@item @var{ X} = array[@var{ Dim1} ,...,@var{ Dimn} ] of @var{ Objects}
@findex of/2
@snindex of/2
@cnindex of/2
The @code{ of/2} operator can be used to create a new array of
@var{ Objects} . The objects supported are:
@table @code
@item Unbound Variable
create an array of free variables
@item ints
create an array of integers
@item floats
create an array of floating-point numbers
@item @var{ I} :@var{ J}
create an array with integers from @var{ I} to @var{ J}
@item [..]
create an array from the values in a list
@end table
The dimensions can be given as an integer, and the matrix will be
indexed @code{ C} -style from @code{ 0..(@var{ Max} -1)} , or can be given
as an interval @code{ @var{ Base} ..@var{ Limit} } . In the latter case,
matrices of integers and of floating-point numbers should have the same
@var{ Base} on every dimension.
@item ?@var{ LHS} <== @var{ RHS}
@findex <==/2
@snindex <==/2
@cnindex <==/2
General matrix assignment operation. It evaluates the right-hand side
and then acts differently according to the
left-hand side and to the matrix:
@itemize @bullet
@item if @var{ LHS} is part of an integer or floating-point matrix,
perform non-backtrackable assignment.
@item otherwise, unify the left-hand side with the right-hand side.
@end itemize
The right-hand side supports the following operators:
@table @code
@item []/2
written as @var{ M} [@var{ Offset} ]: obtain an element or list of elements
of matrix @var{ M} at offset @var{ Offset} .
@item matrix/1
create a vector from a list
@item matrix/2
create a matrix from a list. Options are:
@table @code
@item dim=
a list of dimensions
@item type=
integers, floating-point or terms
@item base=
a list of base offsets per dimension (all must be the same for arrays of
integers and floating-point numbers)
@end table
@item matrix/3
create matrix giving two options
@item dim/1
list with matrix dimensions
@item nrow/1
number of rows in bi-dimensional matrix
@item ncol/1
number of columns in bi-dimensional matrix
@item length/1
size of a matrix
@item size/1
size of a matrix
@item max/1
maximum element of a numeric matrix
@item maxarg/1
argument of maximum element of a numeric matrix
@item min/1
minimum element of a numeric matrix
@item minarg/1
argument of minimum element of a numeric matrix
@item list/1
represent matrix as a list
@item lists/2
represent matrix as list of embedded lists
@item ../2
@var{ I} ..@var{ J} generates a list with all integers from @var{ I} to
@var{ J} , included.
@item +/2
add two numbers, add two matrices element-by-element, or add a number to
all elements of a matrix or list
@item -/2
subtract two numbers, subtract two matrices or lists element-by-element, or subtract a number from
all elements of a matrix or list
@item * /2
multiply two numbers, multiply two matrices or lists element-by-element, or multiply
all elements of a matrix or list by a number
@item log/1
natural logarithm of a number, matrix or list
@item exp/1
natural exponentiation of a number, matrix or list
@end table
@item foreach(@var{ Sequence} , @var{ Goal} )
@findex foreach_ matrix/2
@snindex foreach_ matrix/2
@cnindex foreach_ matrix/2
Deterministic iterator. The ranges are given by @var{ Sequence} , which is
either @code{ @var{ I} in @var{ M} ..@var{ N} } , or of the form
@code{ [@var{ I} ,@var{ J} ] ins @var{ M} ..@var{ N} } , or a list of the above conditions.
Variables in the goal are assumed to be global, i.e., they share a single value
in the execution. The exceptions are the iteration indices. Moreover, if
the goal is of the form @code{ @var{ Locals} ^ @var{ G} } all variables
occurring in @var{ Locals} are marked as local. As an example:
@pl_ example
foreach([I,J] ins 1..N, A^ (A <== M[I,J], N[I] <== N[I] + A*A) )
@end pl_ example
the variables @var{ I} , @var{ J} and @var{ A} are duplicated for every
call (local), whereas the matrices @var{ M} and @var{ N} are shared
throughout the execution (global).
@item foreach(@var{ Sequence} , @var{ Goal} , @var{ Acc0} , @var{ AccF} )
@findex foreach/4
@snindex foreach/4
@cnindex foreach/4
Deterministic iterator with accumulator style arguments.
@item matrix_ new(+@var{ Type} ,+@var{ Dims} ,-@var{ Matrix} )
@findex matrix_ new/3
@snindex matrix_ new/3
@cnindex matrix_ new/3
Create a new matrix @var{ Matrix} of type @var{ Type} , which may be one of
@code{ ints} or @code{ floats} , and with a list of dimensions @var{ Dims} .
The matrix will be initialised to zeros.
@example
?- matrix_new(ints,[2,3],Matrix).
Matrix = @{ ..@}
@end example
Notice that currently YAP will always write a matrix of numbers as @code{ @{ ..@} } .
@item matrix_ new(+@var{ Type} ,+@var{ Dims} ,+@var{ List} ,-@var{ Matrix} )
@findex matrix_ new/4
@snindex matrix_ new/4
@cnindex matrix_ new/4
Create a new matrix @var{ Matrix} of type @var{ Type} , which may be one of
@code{ ints} or @code{ floats} , with dimensions @var{ Dims} , and
initialised from list @var{ List} .
@item matrix_ new_ set(?@var{ Dims} ,+@var{ OldMatrix} ,+@var{ Value} ,-@var{ NewMatrix} )
@findex matrix_ new_ set/4
@snindex matrix_ new_ set/4
@cnindex matrix_ new_ set/4
Create a new matrix @var{ NewMatrix} of type @var{ Type} , with dimensions
@var{ Dims} . The elements of @var{ NewMatrix} are set to @var{ Value} .
@item matrix_ dims(+@var{ Matrix} ,-@var{ Dims} )
@findex matrix_ dims/2
@snindex matrix_ dims/2
@cnindex matrix_ dims/2
Unify @var{ Dims} with a list of dimensions for @var{ Matrix} .
@item matrix_ ndims(+@var{ Matrix} ,-@var{ NDims} )
@findex matrix_ ndims/2
@snindex matrix_ ndims/2
@cnindex matrix_ ndims/2
Unify @var{ NDims} with the number of dimensions for @var{ Matrix} .
@item matrix_ size(+@var{ Matrix} ,-@var{ NElems} )
@findex matrix_ size/2
@snindex matrix_ size/2
@cnindex matrix_ size/2
Unify @var{ NElems} with the number of elements for @var{ Matrix} .
@item matrix_ type(+@var{ Matrix} ,-@var{ Type} )
@findex matrix_ type/2
@snindex matrix_ type/2
@cnindex matrix_ type/2
Unify @var{ Type} with the type of the elements in @var{ Matrix} .
@item matrix_ to_ list(+@var{ Matrix} ,-@var{ Elems} )
@findex matrix_ to_ list/2
@snindex matrix_ to_ list/2
@cnindex matrix_ to_ list/2
Unify @var{ Elems} with the list including all the elements in @var{ Matrix} .
@item matrix_ get(+@var{ Matrix} ,+@var{ Position} ,-@var{ Elem} )
@findex matrix_ get/3
@snindex matrix_ get/3
@cnindex matrix_ get/3
Unify @var{ Elem} with the element of @var{ Matrix} at position
@var{ Position} .
@item matrix_ get(+@var{ Matrix} [+@var{ Position} ],-@var{ Elem} )
@findex matrix_ get/2
@snindex matrix_ get/2
@cnindex matrix_ get/2
Unify @var{ Elem} with the element @var{ Matrix} [@var{ Position} ].
@item matrix_ set(+@var{ Matrix} ,+@var{ Position} ,+@var{ Elem} )
@findex matrix_ set/3
@snindex matrix_ set/3
@cnindex matrix_ set/3
Set the element of @var{ Matrix} at position
@var{ Position} to @var{ Elem} .
@item matrix_ set(+@var{ Matrix} [+@var{ Position} ],+@var{ Elem} )
@findex matrix_ set/2
@snindex matrix_ set/2
@cnindex matrix_ set/2
Set the element of @var{ Matrix} [@var{ Position} ] to @var{ Elem} .
@item matrix_ set_ all(+@var{ Matrix} ,+@var{ Elem} )
@findex matrix_ set_ all/2
@snindex matrix_ set_ all/2
@cnindex matrix_ set_ all/2
Set all elements of @var{ Matrix} to @var{ Elem} .
@item matrix_ add(+@var{ Matrix} ,+@var{ Position} ,+@var{ Operand} )
@findex matrix_ add/3
@snindex matrix_ add/3
@cnindex matrix_ add/3
Add @var{ Operand} to the element of @var{ Matrix} at position
@var{ Position} .
@item matrix_ inc(+@var{ Matrix} ,+@var{ Position} )
@findex matrix_ inc/2
@snindex matrix_ inc/2
@cnindex matrix_ inc/2
Increment the element of @var{ Matrix} at position @var{ Position} .
@item matrix_ inc(+@var{ Matrix} ,+@var{ Position} ,-@var{ Element} )
@findex matrix_ inc/3
@snindex matrix_ inc/3
@cnindex matrix_ inc/3
Increment the element of @var{ Matrix} at position @var{ Position} and
unify with @var{ Element} .
@item matrix_ dec(+@var{ Matrix} ,+@var{ Position} )
@findex matrix_ dec/2
@snindex matrix_ dec/2
@cnindex matrix_ dec/2
Decrement the element of @var{ Matrix} at position @var{ Position} .
@item matrix_ dec(+@var{ Matrix} ,+@var{ Position} ,-@var{ Element} )
@findex matrix_ dec/3
@snindex matrix_ dec/3
@cnindex matrix_ dec/3
Decrement the element of @var{ Matrix} at position @var{ Position} and
unify with @var{ Element} .
@item matrix_ arg_ to_ offset(+@var{ Matrix} ,+@var{ Position} ,-@var{ Offset} )
@findex matrix_ arg_ to_ offset/3
@snindex matrix_ arg_ to_ offset/3
@cnindex matrix_ arg_ to_ offset/3
Given matrix @var{ Matrix} return what is the numerical @var{ Offset} of
the element at @var{ Position} .
@item matrix_ offset_ to_ arg(+@var{ Matrix} ,-@var{ Offset} ,+@var{ Position} )
@findex matrix_ offset_ to_ arg/3
@snindex matrix_ offset_ to_ arg/3
@cnindex matrix_ offset_ to_ arg/3
Given a position @var{ Position } for matrix @var{ Matrix} return the
corresponding numerical @var{ Offset} from the beginning of the matrix.
@item matrix_ max(+@var{ Matrix} ,+@var{ Max} )
@findex matrix_ max/2
@snindex matrix_ max/2
@cnindex matrix_ max/2
Unify @var{ Max} with the maximum in matrix @var{ Matrix} .
@item matrix_ maxarg(+@var{ Matrix} ,+@var{ Maxarg} )
@findex matrix_ maxarg/2
@snindex matrix_ maxarg/2
@cnindex matrix_ maxarg/2
Unify @var{ Maxarg} with the position of the maximum in matrix @var{ Matrix} .
@item matrix_ min(+@var{ Matrix} ,+@var{ Min} )
@findex matrix_ min/2
@snindex matrix_ min/2
@cnindex matrix_ min/2
Unify @var{ Min} with the minimum in matrix @var{ Matrix} .
@item matrix_ minarg(+@var{ Matrix} ,+@var{ Minarg} )
@findex matrix_ minarg/2
@snindex matrix_ minarg/2
@cnindex matrix_ minarg/2
Unify @var{ Minarg} with the position of the minimum in matrix @var{ Matrix} .
@item matrix_ sum(+@var{ Matrix} ,+@var{ Sum} )
@findex matrix_ sum/2
@snindex matrix_ sum/2
@cnindex matrix_ sum/2
Unify @var{ Sum} with the sum of all elements in matrix @var{ Matrix} .
@c @item matrix_ add_ to_ all(+@var{ Matrix} ,+@var{ Element} )
@c @findex matrix_ add_ to_ all/2
@c @snindex matrix_ add_ to_ all/2
@c @cnindex matrix_ add_ to_ all/2
@c Add @var{ Element} to all elements of matrix @var{ Matrix} .
@item matrix_ agg_ lines(+@var{ Matrix} ,+@var{ Aggregate} )
@findex matrix_ agg_ lines/2
@snindex matrix_ agg_ lines/2
@cnindex matrix_ agg_ lines/2
If @var{ Matrix} is a n-dimensional matrix, unify @var{ Aggregate} with
the n-1 dimensional matrix where each element is obtained by adding all
Matrix elements with same last n-1 index.
@item matrix_ agg_ cols(+@var{ Matrix} ,+@var{ Aggregate} )
@findex matrix_ agg_ cols/2
@snindex matrix_ agg_ cols/2
@cnindex matrix_ agg_ cols/2
If @var{ Matrix} is a n-dimensional matrix, unify @var{ Aggregate} with
the one dimensional matrix where each element is obtained by adding all
Matrix elements with same first index.
@item matrix_ op(+@var{ Matrix1} ,+@var{ Matrix2} ,+@var{ Op} ,-@var{ Result} )
@findex matrix_ op/4
@snindex matrix_ op/4
@cnindex matrix_ op/4
@var{ Result} is the result of applying @var{ Op} to matrix @var{ Matrix1}
and @var{ Matrix2} . Currently, only addition (@code{ +} ) is supported.
@item matrix_ op_ to_ all(+@var{ Matrix1} ,+@var{ Op} ,+@var{ Operand} ,-@var{ Result} )
@findex matrix_ op_ to_ all/4
@snindex matrix_ op_ to_ all/4
@cnindex matrix_ op_ to_ all/4
@var{ Result} is the result of applying @var{ Op} to all elements of
@var{ Matrix1} , with @var{ Operand} as the second argument. Currently,
only addition (@code{ +} ), multiplication (@code{ *} ), and division
(@code{ /} ) are supported.
@item matrix_ op_ to_ lines(+@var{ Matrix1} ,+@var{ Lines} ,+@var{ Op} ,-@var{ Result} )
@findex matrix_ op_ to_ lines/4
@snindex matrix_ op_ to_ lines/4
@cnindex matrix_ op_ to_ lines/4
@var{ Result} is the result of applying @var{ Op} to all elements of
@var{ Matrix1} , with the corresponding element in @var{ Lines} as the
second argument. Currently, only division (@code{ /} ) is supported.
@item matrix_ op_ to_ cols(+@var{ Matrix1} ,+@var{ Cols} ,+@var{ Op} ,-@var{ Result} )
@findex matrix_ op_ to_ cols/4
@snindex matrix_ op_ to_ cols/4
@cnindex matrix_ op_ to_ cols/4
@var{ Result} is the result of applying @var{ Op} to all elements of
@var{ Matrix1} , with the corresponding element in @var{ Cols} as the
second argument. Currently, only addition (@code{ +} ) is
supported. Notice that @var{ Cols} will have n-1 dimensions.
@item matrix_ shuffle(+@var{ Matrix} ,+@var{ NewOrder} ,-@var{ Shuffle} )
@findex matrix_ shuffle/3
@snindex matrix_ shuffle/3
@cnindex matrix_ shuffle/3
Shuffle the dimensions of matrix @var{ Matrix} according to
@var{ NewOrder} . The list @var{ NewOrder} must have all the dimensions of
@var{ Matrix} , starting from 0.
@item matrix_ transpose(+@var{ Matrix} ,-@var{ Transpose} )
@findex matrix_ transpose/2
@snindex matrix_ transpose/2
@cnindex matrix_ transpose/2
Transpose matrix @var{ Matrix} to @var{ Transpose} . Equivalent to:
@example
matrix_transpose(Matrix,Transpose) :-
    matrix_shuffle(Matrix,[1,0],Transpose).
@end example
@item matrix_ expand(+@var{ Matrix} ,+@var{ NewDimensions} ,-@var{ New} )
@findex matrix_ expand/3
@snindex matrix_ expand/3
@cnindex matrix_ expand/3
Expand @var{ Matrix} to occupy new dimensions. The elements in
@var{ NewDimensions} are either 0, for an existing dimension, or a
positive integer with the size of the new dimension.
@item matrix_ select(+@var{ Matrix} ,+@var{ Dimension} ,+@var{ Index} ,-@var{ New} )
@findex matrix_ select/4
@snindex matrix_ select/4
@cnindex matrix_ select/4
Select from @var{ Matrix} the elements that have @var{ Index} at
@var{ Dimension} .
@item matrix_ row(+@var{ Matrix} ,+@var{ Column} ,-@var{ NewMatrix} )
@findex matrix_ row/3
@snindex matrix_ row/3
@cnindex matrix_ row/3
Select from @var{ Matrix} the row matching @var{ Column} as the new matrix @var{ NewMatrix} . @var{ Column} must have one dimension less than the original matrix.
@end table
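A short sketch of the predicate-style interface (we assume positions are given as lists of indices, one per dimension, and rely on new matrices being zero-initialised, as stated above for @code{ matrix_ new/3} ):
@pl_ example
?- matrix_new(ints, [2,3], M),
   matrix_set(M, [0,2], 5),
   matrix_get(M, [0,2], E),
   matrix_to_list(M, L).
E = 5,
L = [0,0,5,0,0,0]
@end pl_ example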
@node MATLAB, Non-Backtrackable Data Structures, matrix, Library
@section MATLAB Package Interface
@cindex Matlab Interface
The MathWorks MATLAB is a widely used package for array
processing. YAP now includes a straightforward interface to MATLAB. To
actually use it, you need to install YAP calling @code{ configure} with
the @code{ --with-matlab=DIR} option, and you need to call the
@code{ use_ module(library(matlab))} command.
Accessing the matlab dynamic libraries can be complicated. On Linux
machines, to use this interface, you may have to set the environment
variable @t{ LD_ LIBRARY_ PATH} . Next follows an example using bash on a
64-bit Linux PC:
@example
export LD_LIBRARY_PATH="$MATLAB_HOME/sys/os/glnxa64:$MATLAB_HOME/bin/glnxa64:$LD_LIBRARY_PATH"
@end example
where @code{ MATLAB_ HOME} is the directory where matlab is installed.
Please replace @code{ glnxa64} by @code{ glnx86} on a 32-bit PC.
@table @code
@item start_ matlab(+@var{ Options} )
@findex start_ matlab/1
@snindex start_ matlab/1
@cnindex start_ matlab/1
Start a matlab session. The argument @var{ Options} may either be the
empty string/atom or the command to call matlab. The command may fail.
@item close_ matlab
@findex close_ matlab/0
@snindex close_ matlab/0
@cnindex close_ matlab/0
Stop the current matlab session.
@item matlab_ on
@findex matlab_ on/0
@snindex matlab_ on/0
@cnindex matlab_ on/0
Holds if a matlab session is on.
@item matlab_ eval_ string(+@var{ Command} )
@findex matlab_ eval_ string/1
@snindex matlab_ eval_ string/1
@cnindex matlab_ eval_ string/1
Holds if matlab evaluated successfully the command @var{ Command} .
@item matlab_ eval_ string(+@var{ Command} , -@var{ Answer} )
@findex matlab_ eval_ string/2
@snindex matlab_ eval_ string/2
@cnindex matlab_ eval_ string/2
MATLAB will evaluate the command @var{ Command} and unify @var{ Answer}
with a string reporting the result.
@item matlab_ cells(+@var{ Size} , ?@var{ Array} )
@findex matlab_ cells/2
@snindex matlab_ cells/2
@cnindex matlab_ cells/2
MATLAB will create an empty vector of cells of size @var{ Size} , and if
@var{ Array} is bound to an atom, store the array in the matlab
variable with name @var{ Array} . Corresponds to the MATLAB command @code{ cells} .
@item matlab_ cells(+@var{ SizeX} , +@var{ SizeY} , ?@var{ Array} )
@findex matlab_ cells/3
@snindex matlab_ cells/3
@cnindex matlab_ cells/3
MATLAB will create an empty array of cells of size @var{ SizeX} and
@var{ SizeY} , and if @var{ Array} is bound to an atom, store the array
in the matlab variable with name @var{ Array} . Corresponds to the
MATLAB command @code{ cells} .
@item matlab_ initialized_ cells(+@var{ SizeX} , +@var{ SizeY} , +@var{ List} , ?@var{ Array} )
@findex matlab_ initialized_ cells/4
@snindex matlab_ initialized_ cells/4
@cnindex matlab_ initialized_ cells/4
MATLAB will create an array of cells of size @var{ SizeX} and
@var{ SizeY} , initialized from the list @var{ List} , and if @var{ Array}
is bound to an atom, store the array in the matlab variable with name
@var{ Array} .
@item matlab_ matrix(+@var{ SizeX} , +@var{ SizeY} , +@var{ List} , ?@var{ Array} )
@findex matlab_ matrix/4
@snindex matlab_ matrix/4
@cnindex matlab_ matrix/4
MATLAB will create an array of floats of size @var{ SizeX} and @var{ SizeY} ,
initialized from the list @var{ List} , and if @var{ Array} is bound to
an atom, store the array in the matlab variable with name @var{ Array} .
@item matlab_ set(+@var{ MatVar} , +@var{ X} , +@var{ Y} , +@var{ Value} )
@findex matlab_ set/4
@snindex matlab_ set/4
@cnindex matlab_ set/4
Call MATLAB to set element @var{ MatVar} (@var{ X} , @var{ Y} ) to
@var{ Value} . Notice that this command uses the MATLAB array access
convention.
@item matlab_ get_ variable(+@var{ MatVar} , -@var{ List} )
@findex matlab_ get_ variable/2
@snindex matlab_ get_ variable/2
@cnindex matlab_ get_ variable/2
Unify MATLAB variable @var{ MatVar} with the List @var{ List} .
@item matlab_ item(+@var{ MatVar} , +@var{ X} , ?@var{ Val} )
@findex matlab_ item/3
@snindex matlab_ item/3
@cnindex matlab_ item/3
Read or set MATLAB @var{ MatVar} (@var{ X} ) from/to @var{ Val} . Use
@code{ C} notation for matrix access (ie, starting from 0).
@item matlab_ item(+@var{ MatVar} , +@var{ X} , +@var{ Y} , ?@var{ Val} )
@findex matlab_ item/4
@snindex matlab_ item/4
@cnindex matlab_ item/4
Read or set MATLAB @var{ MatVar} (@var{ X} ,@var{ Y} ) from/to @var{ Val} . Use
@code{ C} notation for matrix access (ie, starting from 0).
@item matlab_ item1(+@var{ MatVar} , +@var{ X} , ?@var{ Val} )
@findex matlab_ item1/3
@snindex matlab_ item1/3
@cnindex matlab_ item1/3
Read or set MATLAB @var{ MatVar} (@var{ X} ) from/to @var{ Val} . Use
MATLAB notation for matrix access (ie, starting from 1).
@item matlab_ item1(+@var{ MatVar} , +@var{ X} , +@var{ Y} , ?@var{ Val} )
@findex matlab_ item1/4
@snindex matlab_ item1/4
@cnindex matlab_ item1/4
Read or set MATLAB @var{ MatVar} (@var{ X} ,@var{ Y} ) from/to @var{ Val} . Use
MATLAB notation for matrix access (ie, starting from 1).
@item matlab_ sequence(+@var{ Min} , +@var{ Max} , ?@var{ Array} )
@findex matlab_ sequence/3
@snindex matlab_ sequence/3
@cnindex matlab_ sequence/3
MATLAB will create a sequence going from @var{ Min} to @var{ Max} , and
if @var{ Array} is bound to an atom, store the sequence in the matlab
variable with name @var{ Array} .
@item matlab_ vector(+@var{ Size} , +@var{ List} , ?@var{ Array} )
@findex matlab_ vector/4
@snindex matlab_ vector/4
@cnindex matlab_ vector/4
MATLAB will create a vector of floats of size @var{ Size} , initialized
from the list @var{ List} , and if @var{ Array} is bound to an atom,
store the array in the matlab variable with name @var{ Array} .
@item matlab_ zeros(+@var{ Size} , ?@var{ Array} )
@findex matlab_ zeros/2
@snindex matlab_ zeros/2
@cnindex matlab_ zeros/2
MATLAB will create a vector of zeros of size @var{ Size} , and if
@var{ Array} is bound to an atom, store the array in the matlab
variable with name @var{ Array} . Corresponds to the MATLAB command
@code{ zeros} .
@item matlab_ zeros(+@var{ SizeX} , +@var{ SizeY} , ?@var{ Array} )
@findex matlab_ zeros/3
@snindex matlab_ zeros/3
@cnindex matlab_ zeros/3
MATLAB will create an array of zeros of size @var{ SizeX} and
@var{ SizeY} , and if @var{ Array} is bound to an atom, store the array
in the matlab variable with name @var{ Array} . Corresponds to the
MATLAB command @code{ zeros} .
@item matlab_ zeros(+@var{ SizeX} , +@var{ SizeY} , +@var{ SizeZ} , ?@var{ Array} )
@findex matlab_ zeros/4
@snindex matlab_ zeros/4
@cnindex matlab_ zeros/4
MATLAB will create an array of zeros of size @var{ SizeX} , @var{ SizeY} ,
and @var{ SizeZ} . If @var{ Array} is bound to an atom, store the array
in the matlab variable with name @var{ Array} . Corresponds to the
MATLAB command @code{ zeros} .
@end table
@node Non-Backtrackable Data Structures, Ordered Sets, MATLAB, Library
@section Non-Backtrackable Data Structures
The following routines implement well-known data structures using global
non-backtrackable variables (implemented on the Prolog stack). The
data structures currently supported are Queues, Heaps, and Beams for beam
search. They are made available through @code{ library(nb)} .
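For instance, a queue can be filled and then closed into a plain list (a
minimal sketch; the predicates used are described below):
@example
?- use_module(library(nb)).
?- nb_queue(Q),
   nb_queue_enqueue(Q, a),
   nb_queue_enqueue(Q, b),
   nb_queue_size(Q, Size),
   nb_queue_close(Q, List, []).
@end example
Here @code{ Size} is unified with 2, and closing the queue with an empty
tail turns the internal difference list into the proper list of enqueued
elements.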
@table @code
@item nb_ queue(-@var{ Queue} )
@findex nb_ queue/1
@snindex nb_ queue/1
@cnindex nb_ queue/1
Create a @var{ Queue} .
@item nb_ queue_ close(+@var{ Queue} , -@var{ Head} , ?@var{ Tail} )
@findex nb_ queue_ close/3
@snindex nb_ queue_ close/3
@cnindex nb_ queue_ close/3
Unify the queue @var{ Queue} with a difference list
@var{ Head} -@var{ Tail} . The queue will now be empty and no further
elements can be added.
@item nb_ queue_ enqueue(+@var{ Queue} , +@var{ Element} )
@findex nb_ queue_ enqueue/2
@snindex nb_ queue_ enqueue/2
@cnindex nb_ queue_ enqueue/2
Add @var{ Element} to the front of the queue @var{ Queue} .
@item nb_ queue_ dequeue(+@var{ Queue} , -@var{ Element} )
@findex nb_ queue_ dequeue/2
@snindex nb_ queue_ dequeue/2
@cnindex nb_ queue_ dequeue/2
Remove @var{ Element} from the front of the queue @var{ Queue} . Fail if
the queue is empty.
@item nb_ queue_ peek(+@var{ Queue} , -@var{ Element} )
@findex nb_ queue_ peek/2
@snindex nb_ queue_ peek/2
@cnindex nb_ queue_ peek/2
@var{ Element} is the front of the queue @var{ Queue} . Fail if
the queue is empty.
@item nb_ queue_ size(+@var{ Queue} , -@var{ Size} )
@findex nb_ queue_ size/2
@snindex nb_ queue_ size/2
@cnindex nb_ queue_ size/2
Unify @var{ Size} with the number of elements in the queue @var{ Queue} .
@item nb_ queue_ empty(+@var{ Queue} )
@findex nb_ queue_ empty/1
@snindex nb_ queue_ empty/1
@cnindex nb_ queue_ empty/1
Succeeds if @var{ Queue} is empty.
@item nb_ heap(+@var{ DefaultSize} ,-@var{ Heap} )
@findex nb_ heap/2
@snindex nb_ heap/2
@cnindex nb_ heap/2
Create a @var{ Heap} with default size @var{ DefaultSize} . Note that size
will expand as needed.
@item nb_ heap_ close(+@var{ Heap} )
@findex nb_ heap_ close/1
@snindex nb_ heap_ close/1
@cnindex nb_ heap_ close/1
Close the heap @var{ Heap} : no further elements can be added.
@item nb_ heap_ add(+@var{ Heap} , +@var{ Key} , +@var{ Value} )
@findex nb_ heap_ add/3
@snindex nb_ heap_ add/3
@cnindex nb_ heap_ add/3
Add @var{ Key} -@var{ Value} to the heap @var{ Heap} . Entries are sorted on
@var{ Key} only.
@item nb_ heap_ del(+@var{ Heap} , -@var{ Key} , -@var{ Value} )
@findex nb_ heap_ del/3
@snindex nb_ heap_ del/3
@cnindex nb_ heap_ del/3
Remove the element @var{ Key} -@var{ Value} with the smallest @var{ Key} in heap
@var{ Heap} . Fail if the heap is empty.
@item nb_ heap_ peek(+@var{ Heap} , -@var{ Key} , -@var{ Value} )
@findex nb_ heap_ peek/3
@snindex nb_ heap_ peek/3
@cnindex nb_ heap_ peek/3
@var{ Key} -@var{ Value} is the element with smallest @var{ Key} in the heap
@var{ Heap} . Fail if the heap is empty.
@item nb_ heap_ size(+@var{ Heap} , -@var{ Size} )
@findex nb_ heap_ size/2
@snindex nb_ heap_ size/2
@cnindex nb_ heap_ size/2
Unify @var{ Size} with the number of elements in the heap @var{ Heap} .
@item nb_ heap_ empty(+@var{ Heap} )
@findex nb_ heap_ empty/1
@snindex nb_ heap_ empty/1
@cnindex nb_ heap_ empty/1
Succeeds if @var{ Heap} is empty.
@item nb_ beam(+@var{ DefaultSize} ,-@var{ Beam} )
@findex nb_ beam/2
@snindex nb_ beam/2
@cnindex nb_ beam/2
Create a @var{ Beam} with default size @var{ DefaultSize} . Note that size
is fixed throughout.
@item nb_ beam_ close(+@var{ Beam} )
@findex nb_ beam_ close/1
@snindex nb_ beam_ close/1
@cnindex nb_ beam_ close/1
Close the beam @var{ Beam} : no further elements can be added.
@item nb_ beam_ add(+@var{ Beam} , +@var{ Key} , +@var{ Value} )
@findex nb_ beam_ add/3
@snindex nb_ beam_ add/3
@cnindex nb_ beam_ add/3
Add @var{ Key} -@var{ Value} to the beam @var{ Beam} . Entries are sorted on
@var{ Key} only.
@item nb_ beam_ del(+@var{ Beam} , -@var{ Key} , -@var{ Value} )
@findex nb_ beam_ del/3
@snindex nb_ beam_ del/3
@cnindex nb_ beam_ del/3
Remove the element @var{ Key} -@var{ Value} with the smallest @var{ Key} in beam
@var{ Beam} . Fail if the beam is empty.
@item nb_ beam_ peek(+@var{ Beam} , -@var{ Key} , -@var{ Value} )
@findex nb_ beam_ peek/3
@snindex nb_ beam_ peek/3
@cnindex nb_ beam_ peek/3
@var{ Key} -@var{ Value} is the element with smallest @var{ Key} in the beam
@var{ Beam} . Fail if the beam is empty.
@item nb_ beam_ size(+@var{ Beam} , -@var{ Size} )
@findex nb_ beam_ size/2
@snindex nb_ beam_ size/2
@cnindex nb_ beam_ size/2
Unify @var{ Size} with the number of elements in the beam @var{ Beam} .
@item nb_ beam_ empty(+@var{ Beam} )
@findex nb_ beam_ empty/1
@snindex nb_ beam_ empty/1
@cnindex nb_ beam_ empty/1
Succeeds if @var{ Beam} is empty.
@end table
@node Ordered Sets, Pseudo Random, Non-Backtrackable Data Structures, Library
@section Ordered Sets
@cindex ordered set
The following ordered set manipulation routines become available by
loading the library with the @code{ use_ module(library(ordsets))} command. An
ordered set is represented by a list with unique, ordered
elements. Output arguments are guaranteed to be ordered sets if the
relevant inputs are. This is a slightly patched version of Richard
O'Keefe's original library.
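For example:
@example
?- use_module(library(ordsets)).
?- list_to_ord_set([c,a,b,a], Set).
Set = [a,b,c]
?- ord_union([a,c], [b,c], Union), ord_subtract([a,b,c], [b], Diff).
Union = [a,b,c],
Diff = [a,c]
@end example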
@table @code
@item list_ to_ ord_ set(+@var{ List} , ?@var{ Set} )
@findex list_ to_ ord_ set/2
@syindex list_ to_ ord_ set/2
@cnindex list_ to_ ord_ set/2
Holds when @var{ Set} is the ordered representation of the set
represented by the unordered representation @var{ List} .
@item merge(+@var{ List1} , +@var{ List2} , -@var{ Merged} )
@findex merge/3
@syindex merge/3
@cnindex merge/3
Holds when @var{ Merged} is the stable merge of the two given lists.
Notice that @code{ merge/3} will not remove duplicates, so merging
ordered sets will not necessarily result in an ordered set. Use
@code{ ord_ union/3} instead.
@item ord_ add_ element(+@var{ Set1} , +@var{ Element} , ?@var{ Set2} )
@findex ord_ add_ element/3
@syindex ord_ add_ element/3
@cnindex ord_ add_ element/3
Inserting @var{ Element} in @var{ Set1} returns @var{ Set2} . It should give
exactly the same result as @code{ merge(Set1, [Element], Set2)} , but a
bit faster, and certainly more clearly. The same as @code{ ord_ insert/3} .
@item ord_ del_ element(+@var{ Set1} , +@var{ Element} , ?@var{ Set2} )
@findex ord_ del_ element/3
@syindex ord_ del_ element/3
@cnindex ord_ del_ element/3
Removing @var{ Element} from @var{ Set1} returns @var{ Set2} .
@item ord_ disjoint(+@var{ Set1} , +@var{ Set2} )
@findex ord_ disjoint/2
@syindex ord_ disjoint/2
@cnindex ord_ disjoint/2
Holds when the two ordered sets have no element in common.
@item ord_ member(+@var{ Element} , +@var{ Set} )
@findex ord_ member/2
@syindex ord_ member/2
@cnindex ord_ member/2
Holds when @var{ Element} is a member of @var{ Set} .
@item ord_ insert(+@var{ Set1} , +@var{ Element} , ?@var{ Set2} )
@findex ord_ insert/3
@syindex ord_ insert/3
@cnindex ord_ insert/3
Inserting @var{ Element} in @var{ Set1} returns @var{ Set2} . It should give
exactly the same result as @code{ merge(Set1, [Element], Set2)} , but a
bit faster, and certainly more clearly. The same as @code{ ord_ add_ element/3} .
@item ord_ intersect(+@var{ Set1} , +@var{ Set2} )
@findex ord_ intersect/2
@syindex ord_ intersect/2
@cnindex ord_ intersect/2
Holds when the two ordered sets have at least one element in common.
@item ord_ intersection(+@var{ Set1} , +@var{ Set2} , ?@var{ Intersection} )
@findex ord_ intersection/3
@syindex ord_ intersection/3
@cnindex ord_ intersection/3
Holds when @var{ Intersection} is the ordered representation of the
intersection of @var{ Set1} and @var{ Set2} .
@item ord_ intersection(+@var{ Set1} , +@var{ Set2} , ?@var{ Intersection} , ?@var{ Diff} )
@findex ord_ intersection/4
@syindex ord_ intersection/4
@cnindex ord_ intersection/4
Holds when @var{ Intersection} is the ordered representation of the
intersection of @var{ Set1} and @var{ Set2} . @var{ Diff} is the difference
between @var{ Set2} and @var{ Set1} .
@item ord_ seteq(+@var{ Set1} , +@var{ Set2} )
@findex ord_ seteq/2
@syindex ord_ seteq/2
@cnindex ord_ seteq/2
Holds when the two arguments represent the same set.
@item ord_ setproduct(+@var{ Set1} , +@var{ Set2} , -@var{ Set} )
@findex ord_ setproduct/3
@syindex ord_ setproduct/3
@cnindex ord_ setproduct/3
If @var{ Set1} and @var{ Set2} are ordered sets, @var{ Set} will be an ordered
set of @var{ X1} -@var{ X2} pairs, the cartesian product of the two sets.
@item ord_ subset(+@var{ Set1} , +@var{ Set2} )
@findex ord_ subset/2
@syindex ord_ subset/2
@cnindex ord_ subset/2
Holds when every element of the ordered set @var{ Set1} appears in the
ordered set @var{ Set2} .
@item ord_ subtract(+@var{ Set1} , +@var{ Set2} , ?@var{ Difference} )
@findex ord_ subtract/3
@syindex ord_ subtract/3
@cnindex ord_ subtract/3
Holds when @var{ Difference} contains all and only the elements of @var{ Set1}
which are not also in @var{ Set2} .
@item ord_ symdiff(+@var{ Set1} , +@var{ Set2} , ?@var{ Difference} )
@findex ord_ symdiff/3
@syindex ord_ symdiff/3
@cnindex ord_ symdiff/3
Holds when @var{ Difference} is the symmetric difference of @var{ Set1}
and @var{ Set2} .
@item ord_ union(+@var{ Sets} , ?@var{ Union} )
@findex ord_ union/2
@syindex ord_ union/2
@cnindex ord_ union/2
Holds when @var{ Union} is the union of the lists @var{ Sets} .
@item ord_ union(+@var{ Set1} , +@var{ Set2} , ?@var{ Union} )
@findex ord_ union/3
@syindex ord_ union/3
@cnindex ord_ union/3
Holds when @var{ Union} is the union of @var{ Set1} and @var{ Set2} .
@item ord_ union(+@var{ Set1} , +@var{ Set2} , ?@var{ Union} , ?@var{ Diff} )
@findex ord_ union/4
@syindex ord_ union/4
@cnindex ord_ union/4
Holds when @var{ Union} is the union of @var{ Set1} and @var{ Set2} and
@var{ Diff} is the difference.
@end table
@node Pseudo Random, Queues, Ordered Sets, Library
@section Pseudo Random Number Integer Generator
@cindex pseudo random
The following routines produce random non-negative integers in the range
0 .. 2^ (w-1) -1, where w is the word size available for integers, e.g.
32 for Intel machines and 64 for Alpha machines. Note that the numbers
generated by this random number generator are repeatable. This generator
was originally written by Allen Van Gelder and is based on Knuth Vol 2.
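As a small sketch, assuming the library providing these predicates has
already been loaded, seeding the generator with a fixed value makes a run
reproducible:
@example
?- ranstart(2423),
   ranunif(100, X),   % X is a uniformly distributed integer in 0..99
   ranunif(100, Y).
@end example
Calling @code{ ranstart/1} again with the same seed reproduces the same
sequence of values.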
@table @code
@item rannum(-@var{ I} )
@findex rannum/1
@snindex rannum/1
@cnindex rannum/1
Produces a random non-negative integer @var{ I} whose low bits are not
all that random, so it should be scaled to a smaller range in general.
The integer @var{ I} is in the range 0 .. 2^ (w-1) - 1. You can use:
@example
ranfloat(X) :- yap_flag(max_integer, MI), rannum(R), X is R/MI.
@end example
to obtain a floating point number uniformly distributed between 0 and 1.
@item ranstart
@findex ranstart/0
@snindex ranstart/0
@cnindex ranstart/0
Initialize the random number generator using a built-in seed. The
@code{ ranstart/0} built-in is always called by the system when loading
the package.
@item ranstart(+@var{ Seed} )
@findex ranstart/1
@snindex ranstart/1
@cnindex ranstart/1
Initialize the random number generator with user-defined @var{ Seed} . The
same @var{ Seed} always produces the same sequence of numbers.
@item ranunif(+@var{ Range} ,-@var{ I} )
@findex ranunif/2
@snindex ranunif/2
@cnindex ranunif/2
@code{ ranunif/2} produces a uniformly distributed non-negative random
integer @var{ I} over a caller-specified range @var{ Range} . The result
is in 0 .. @var{ Range} -1.
@end table
@node Queues, Random, Pseudo Random, Library
@section Queues
@cindex queue
The following queue manipulation routines become available by loading
the library with the @code{ use_ module(library(queues))} command. Queues are
implemented with difference lists.
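For example, a minimal sketch of the predicates described below:
@example
?- use_module(library(queues)).
?- make_queue(Q0),
   join_queue(a, Q0, Q1),
   join_queue(b, Q1, Q2),
   serve_queue(Q2, First, Q3),
   queue_to_list(Q3, Rest).
@end example
@code{ First} is unified with @code{ a} , the element that entered the queue
first, and @code{ Rest} with the list of remaining elements.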
@table @code
@item make_ queue(-@var{ Queue} )
@findex make_ queue/1
@syindex make_ queue/1
@cnindex make_ queue/1
Creates a new empty queue. It should only be used to create a new queue.
@item join_ queue(+@var{ Element} , +@var{ OldQueue} , -@var{ NewQueue} )
@findex join_ queue/3
@syindex join_ queue/3
@cnindex join_ queue/3
Adds the new element at the end of the queue.
@item list_ join_ queue(+@var{ List} , +@var{ OldQueue} , -@var{ NewQueue} )
@findex list_ join_ queue/3
@syindex list_ join_ queue/3
@cnindex list_ join_ queue/3
Adds the new elements at the end of the queue.
@item jump_ queue(+@var{ Element} , +@var{ OldQueue} , -@var{ NewQueue} )
@findex jump_ queue/3
@syindex jump_ queue/3
@cnindex jump_ queue/3
Adds the new element at the front of the queue.
@item list_ jump_ queue(+@var{ List} , +@var{ OldQueue} , -@var{ NewQueue} )
@findex list_ jump_ queue/3
@syindex list_ jump_ queue/3
@cnindex list_ jump_ queue/3
Adds all the elements of @var{ List} at the front of the queue.
@item head_ queue(+@var{ Queue} , ?@var{ Head} )
@findex head_ queue/2
@syindex head_ queue/2
@cnindex head_ queue/2
Unifies @var{ Head} with the first element of the queue.
@item serve_ queue(+@var{ OldQueue} , +@var{ Head} , -@var{ NewQueue} )
@findex serve_ queue/3
@syindex serve_ queue/3
@cnindex serve_ queue/3
Removes the first element of the queue for service.
@item empty_ queue(+@var{ Queue} )
@findex empty_ queue/1
@syindex empty_ queue/1
@cnindex empty_ queue/1
Tests whether the queue is empty.
@item length_ queue(+@var{ Queue} , -@var{ Length} )
@findex length_ queue/2
@syindex length_ queue/2
@cnindex length_ queue/2
Counts the number of elements currently in the queue.
@item list_ to_ queue(+@var{ List} , -@var{ Queue} )
@findex list_ to_ queue/2
@syindex list_ to_ queue/2
@cnindex list_ to_ queue/2
Creates a new queue with the same elements as @var{ List} .
@item queue_ to_ list(+@var{ Queue} , -@var{ List} )
@findex queue_ to_ list/2
@syindex queue_ to_ list/2
@cnindex queue_ to_ list/2
Creates a new list with the same elements as @var{ Queue} .
@end table
@node Random, Read Utilities, Queues, Library
@section Random Number Generator
@cindex random
The following random number operations are included with the
@code{ use_ module(library(random))} command. Since YAP-4.3.19 YAP uses
the O'Keefe public-domain algorithm, based on the "Applied Statistics"
algorithm AS183.
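For example (the comments state the ranges rather than concrete values,
which naturally differ from run to run):
@example
?- use_module(library(random)).
?- random(X),            % float in [0...1)
   random(1, 10, I),     % integer in [1...10)
   randset(3, 100, Set). % three distinct integers in [1...100)
@end example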
@table @code
@item getrand(-@var{ Key} )
@findex getrand/1
@syindex getrand/1
@cnindex getrand/1
Unify @var{ Key} with a term of the form @code{ rand(X,Y,Z)} describing the
current state of the random number generator.
@item random(-@var{ Number} )
@findex random/1
@syindex random/1
@cnindex random/1
Unify @var{ Number} with a floating-point number in the range @code{ [0...1)} .
@item random(+@var{ LOW} , +@var{ HIGH} , -@var{ NUMBER} )
@findex random/3
@syindex random/3
@cnindex random/3
Unify @var{ Number} with a number in the range
@code{ [LOW...HIGH)} . If both @var{ LOW} and @var{ HIGH} are
integers then @var{ NUMBER} will also be an integer, otherwise
@var{ NUMBER} will be a floating-point number.
@item randseq(+@var{ LENGTH} , +@var{ MAX} , -@var{ Numbers} )
@findex randseq/3
@syindex randseq/3
@cnindex randseq/3
Unify @var{ Numbers} with a list of @var{ LENGTH} unique random integers
in the range @code{ [1...@var{ MAX} )} .
@item randset(+@var{ LENGTH} , +@var{ MAX} , -@var{ Numbers} )
@findex randset/3
@syindex randset/3
@cnindex randset/3
Unify @var{ Numbers} with an ordered list of @var{ LENGTH} unique random
integers in the range @code{ [1...@var{ MAX} )} .
@item setrand(+@var{ Key} )
@findex setrand/1
@syindex setrand/1
@cnindex setrand/1
Use a term of the form @code{ rand(X,Y,Z)} to set a new state for the
random number generator. The integer @code{ X} must be in the range
@code{ [1...30269)} , the integer @code{ Y} must be in the range
@code{ [1...30307)} , and the integer @code{ Z} must be in the range
@code{ [1...30323)} .
@end table
@node Read Utilities, Red-Black Trees, Random, Library
@section Read Utilities
The @code{ readutil} library contains primitives to read lines, files,
multiple terms, etc.
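For instance, the first line of a file can be read as follows (a sketch
that assumes a non-empty file called @code{ data.txt} exists; the file name
is only an example):
@example
?- use_module(library(readutil)).
?- open('data.txt', read, Stream),
   read_line_to_codes(Stream, Line),
   close(Stream),
   atom_codes(Atom, Line).
@end example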
@table @code
@item read_ line_ to_ codes(+@var{ Stream} , -@var{ Codes} )
@findex read_ line_ to_ codes/2
@snindex read_ line_ to_ codes/2
@cnindex read_ line_ to_ codes/2
Read the next line of input from @var{ Stream} and unify the result with
@var{ Codes} @emph{ after} the line has been read. A line is ended by a
newline character or end-of-file. Unlike @code{ read_ line_ to_ codes/3} ,
this predicate removes the trailing newline character.
On end-of-file the atom @code{ end_ of_ file} is returned. See also
@code{ at_ end_ of_ stream/[0,1]} .
@item read_ line_ to_ codes(+@var{ Stream} , -@var{ Codes} , ?@var{ Tail} )
@findex read_ line_ to_ codes/3
@snindex read_ line_ to_ codes/3
@cnindex read_ line_ to_ codes/3
Difference-list version to read an input line to a list of character
codes. Reading stops at the newline or end-of-file character, but
unlike @code{ read_ line_ to_ codes/2} , the newline is retained in the
output. This predicate is especially useful for reading a block of
lines up to some delimiter. The following example reads an HTTP header
ended by a blank line:
@example
read_header_data(Stream, Header) :-
        read_line_to_codes(Stream, Header, Tail),
        read_header_data(Header, Stream, Tail).

read_header_data("\r\n", _, _) :- !.
read_header_data("\n", _, _) :- !.
read_header_data("", _, _) :- !.
read_header_data(_, Stream, Tail) :-
        read_line_to_codes(Stream, Tail, NewTail),
        read_header_data(Tail, Stream, NewTail).
@end example
@item read_ stream_ to_ codes(+@var{ Stream} , -@var{ Codes} )
@findex read_ stream_ to_ codes/2
@snindex read_ stream_ to_ codes/2
@cnindex read_ stream_ to_ codes/2
Read all input until end-of-file and unify the result to @var{ Codes} .
@item read_ stream_ to_ codes(+@var{ Stream} , -@var{ Codes} , ?@var{ Tail} )
@findex read_ stream_ to_ codes/3
@snindex read_ stream_ to_ codes/3
@cnindex read_ stream_ to_ codes/3
Difference-list version of @code{ read_ stream_ to_ codes/2} .
@item read_ file_ to_ codes(+@var{ Spec} , -@var{ Codes} , +@var{ Options} )
@findex read_ file_ to_ codes/3
@snindex read_ file_ to_ codes/3
@cnindex read_ file_ to_ codes/3
Read a file to a list of character codes. Currently ignores
@var{ Options} .
@c @var{ Spec} is a
@c file-specification for absolute_ file_ name/3. @var{ Codes} is the
@c resulting code-list. @var{ Options} is a list of options for
@c absolute_ file_ name/3 and open/4. In addition, the option
@c \term { tail} { Tail} is defined, forming a difference-list.
@item read_ file_ to_ terms(+@var{ Spec} , -@var{ Terms} , +@var{ Options} )
@findex read_ file_ to_ terms/3
@snindex read_ file_ to_ terms/3
@cnindex read_ file_ to_ terms/3
Read a file to a list of Prolog terms (see read/1). @c @var{ Spec} is a
@c file-specification for absolute_ file_ name/3. @var{ Terms} is the
@c resulting list of Prolog terms. @var{ Options} is a list of options for
@c absolute_ file_ name/3 and open/4. In addition, the option
@c \term { tail} { Tail} is defined, forming a difference-list.
@c \end { description}
@end table
@node Red-Black Trees, RegExp, Read Utilities, Library
@section Red-Black Trees
@cindex Red-Black Trees
Red-Black trees are balanced binary search trees. They are so named because
each node is classified as either red or black. The code we include is
based on "Introduction to Algorithms", second edition, by Cormen,
Leiserson, Rivest and Stein. The library includes routines to insert,
lookup and delete elements in the tree.
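A typical interaction looks as follows (a minimal sketch; the library is
loaded here as @code{ library(rbtrees)} ):
@example
?- use_module(library(rbtrees)).
?- rb_new(T0),
   rb_insert(T0, one, 1, T1),
   rb_insert(T1, two, 2, T2),
   rb_lookup(two, Value, T2),
   rb_visit(T2, Pairs).
@end example
@code{ Value} is unified with 2 and @code{ Pairs} with the key-value pairs
of the tree in key order.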
@table @code
@item rb_ new(?@var{ T} )
@findex rb_ new/1
@snindex rb_ new/1
@cnindex rb_ new/1
Create a new tree.
@item rb_ empty(?@var{ T} )
@findex rb_ empty/1
@snindex rb_ empty/1
@cnindex rb_ empty/1
Succeeds if tree @var{ T} is empty.
@item is_ rbtree(+@var{ T} )
@findex is_ rbtree/1
@snindex is_ rbtree/1
@cnindex is_ rbtree/1
Check whether @var{ T} is a valid red-black tree.
@item rb_ insert(+@var{ T0} ,+@var{ Key} ,?@var{ Value} ,+@var{ TF} )
@findex rb_ insert/4
@snindex rb_ insert/4
@cnindex rb_ insert/4
Add an element with key @var{ Key} and @var{ Value} to the tree
@var{ T0} creating a new red-black tree @var{ TF} . Duplicated elements are not
allowed.
@item rb_ insert_ new(+@var{ T0} ,+@var{ Key} ,?@var{ Value} ,-@var{ TF} )
@findex rb_ insert_ new/4
@snindex rb_ insert_ new/4
@cnindex rb_ insert_ new/4
Add a new element with key @var{ Key} and @var{ Value} to the tree
@var{ T0} creating a new red-black tree @var{ TF} . Fails if an element
with @var{ Key} exists in the tree.
@item rb_ lookup(+@var{ Key} ,-@var{ Value} ,+@var{ T} )
@findex rb_ lookup/3
@snindex rb_ lookup/3
@cnindex rb_ lookup/3
Backtrack through all elements with key @var{ Key} in the red-black tree
@var{ T} , returning for each the value @var{ Value} .
@item rb_ lookupall(+@var{ Key} ,-@var{ Value} ,+@var{ T} )
@findex rb_ lookupall/3
@snindex rb_ lookupall/3
@cnindex rb_ lookupall/3
Lookup all elements with key @var{ Key} in the red-black tree
@var{ T} , returning the value @var{ Value} .
@item rb_ delete(+@var{ T} ,+@var{ Key} ,-@var{ TN} )
@findex rb_ delete/3
@snindex rb_ delete/3
@cnindex rb_ delete/3
Delete element with key @var{ Key} from the tree @var{ T} , returning a new
tree @var{ TN} .
@item rb_ delete(+@var{ T} ,+@var{ Key} ,-@var{ Val} ,-@var{ TN} )
@findex rb_ delete/4
@snindex rb_ delete/4
@cnindex rb_ delete/4
Delete element with key @var{ Key} from the tree @var{ T} , returning the
value @var{ Val} associated with the key and a new tree @var{ TN} .
@item rb_ del_ min(+@var{ T} ,-@var{ Key} ,-@var{ Val} ,-@var{ TN} )
@findex rb_ del_ min/4
@snindex rb_ del_ min/4
@cnindex rb_ del_ min/4
Delete the least element from the tree @var{ T} , returning the key
@var{ Key} , the value @var{ Val} associated with the key and a new tree
@var{ TN} .
@item rb_ del_ max(+@var{ T} ,-@var{ Key} ,-@var{ Val} ,-@var{ TN} )
@findex rb_ del_ max/4
@snindex rb_ del_ max/4
@cnindex rb_ del_ max/4
Delete the largest element from the tree @var{ T} , returning the key
@var{ Key} , the value @var{ Val} associated with the key and a new tree
@var{ TN} .
@item rb_ update(+@var{ T} ,+@var{ Key} ,+@var{ NewVal} ,-@var{ TN} )
@findex rb_ update/4
@snindex rb_ update/4
@cnindex rb_ update/4
Tree @var{ TN} is tree @var{ T} , but with value for @var{ Key} associated
with @var{ NewVal} . Fails if it cannot find @var{ Key} in @var{ T} .
@item rb_ apply(+@var{ T} ,+@var{ Key} ,+@var{ G} ,-@var{ TN} )
@findex rb_ apply/4
@snindex rb_ apply/4
@cnindex rb_ apply/4
If the value associated with key @var{ Key} is @var{ Val0} in @var{ T} , and
if @code{ call(G,Val0,ValF)} holds, then @var{ TN} differs from
@var{ T} only in that @var{ Key} is associated with value @var{ ValF} in
tree @var{ TN} . Fails if it cannot find @var{ Key} in @var{ T} , or if
@code{ call(G,Val0,ValF)} is not satisfiable.
@item rb_ visit(+@var{ T} ,-@var{ Pairs} )
@findex rb_ visit/2
@snindex rb_ visit/2
@cnindex rb_ visit/2
@var{ Pairs} is an infix visit of tree @var{ T} , where each element of
@var{ Pairs} is of the form @var{ K} -@var{ Val} .
@item rb_ size(+@var{ T} ,-@var{ Size} )
@findex rb_ size/2
@snindex rb_ size/2
@cnindex rb_ size/2
@var{ Size} is the number of elements in @var{ T} .
@item rb_ keys(+@var{ T} ,+@var{ Keys} )
@findex rb_ keys/2
@snindex rb_ keys/2
@cnindex rb_ keys/2
@var{ Keys} is an infix visit with all keys in tree @var{ T} . Keys will be
sorted, but may contain duplicates.
@item rb_ map(+@var{ T} ,+@var{ G} ,-@var{ TN} )
@findex rb_ map/3
@snindex rb_ map/3
@cnindex rb_ map/3
For all nodes @var{ Key} in the tree @var{ T} , if the value associated with
key @var{ Key} is @var{ Val0} in tree @var{ T} , and if
@code{ call(G,Val0,ValF)} holds, then the value associated with @var{ Key}
in @var{ TN} is @var{ ValF} . Fails if @code{ call(G,Val0,ValF)} is not
satisfiable for all @var{ Val0} .
@item rb_ partial_ map(+@var{ T} ,+@var{ Keys} ,+@var{ G} ,-@var{ TN} )
@findex rb_ partial_ map/4
@snindex rb_ partial_ map/4
@cnindex rb_ partial_ map/4
For all nodes @var{ Key} in @var{ Keys} , if the value associated with key
@var{ Key} is @var{ Val0} in tree @var{ T} , and if @code{ call(G,Val0,ValF)}
holds, then the value associated with @var{ Key} in @var{ TN} is
@var{ ValF} . Fails if @code{ call(G,Val0,ValF)} is not satisfiable
for all @var{ Val0} . Assumes keys are not repeated.
@item rb_ fold(+@var{ T} ,+@var{ G} ,+@var{ Acc0} , -@var{ AccF} )
@findex rb_ fold/4
@snindex rb_ fold/4
@cnindex rb_ fold/4
For all nodes @var{ Key} in the tree @var{ T} , if the value
associated with key @var{ Key} is @var{ V} in tree @var{ T} , if
@code{ call(G,V,Acc1,Acc2)} holds, then if @var{ VL} is value of the
previous node in inorder, @code{ call(G,VL,_ ,Acc0)} must hold, and if
@var{ VR} is the value of the next node in inorder,
@code{ call(G,VR,Acc1,_ )} must hold.
@item rb_ key_ fold(+@var{ T} ,+@var{ G} ,+@var{ Acc0} , -@var{ AccF} )
@findex rb_ key_ fold/4
@snindex rb_ key_ fold/4
@cnindex rb_ key_ fold/4
For all nodes @var{ Key} in the tree @var{ T} , if the value
associated with key @var{ Key} is @var{ V} in tree @var{ T} , if
@code{ call(G,Key,V,Acc1,Acc2)} holds, then if @var{ VL} is value of the
previous node in inorder, @code{ call(G,KeyL,VL,_ ,Acc0)} must hold, and if
@var{ VR} is the value of the next node in inorder,
@code{ call(G,KeyR,VR,Acc1,_ )} must hold.
@item rb_ clone(+@var{ T} ,+@var{ NT} ,+@var{ Nodes} )
@findex rb_ clone/3
@snindex rb_ clone/3
@cnindex rb_ clone/3
``Clone'' the red-black tree into a new tree with the same keys as the
original but with all values set to unbound values. Nodes is a list
containing all new nodes as pairs @var{ K-V} .
@item rb_ min(+@var{ T} ,-@var{ Key} ,-@var{ Value} )
@findex rb_ min/3
@snindex rb_ min/3
@cnindex rb_ min/3
@var{ Key} is the minimum key in @var{ T} , and is associated with @var{ Value} .
@item rb_ max(+@var{ T} ,-@var{ Key} ,-@var{ Value} )
@findex rb_ max/3
@snindex rb_ max/3
@cnindex rb_ max/3
@var{ Key} is the maximal key in @var{ T} , and is associated with @var{ Value} .
@item rb_ next(+@var{ T} , +@var{ Key} ,-@var{ Next} ,-@var{ Value} )
@findex rb_ next/4
@snindex rb_ next/4
@cnindex rb_ next/4
@var{ Next} is the next element after @var{ Key} in @var{ T} , and is
associated with @var{ Value} .
@item rb_ previous(+@var{ T} , +@var{ Key} ,-@var{ Previous} ,-@var{ Value} )
@findex rb_ previous/4
@snindex rb_ previous/4
@cnindex rb_ previous/4
@var{ Previous} is the element preceding @var{ Key} in @var{ T} , and is
associated with @var{ Value} .
@item ord_ list_ to_ rbtree(+@var{ L} , -@var{ T} )
@findex ord_ list_ to_ rbtree/2
@snindex ord_ list_ to_ rbtree/2
@cnindex ord_ list_ to_ rbtree/2
@var{ T} is the red-black tree corresponding to the mapping in ordered
list @var{ L} .
@end table
@node RegExp, shlib, Red-Black Trees, Library
@section Regular Expressions
@cindex regular expressions
This library includes routines to determine whether a regular expression
matches part or all of a string. The routines can also return which
parts parts of the string matched the expression or subexpressions of
it. This library relies on Henry Spencer's @code{ C} -package and is only
available in operating systems that support dynamic loading. The
@code{ C} -code has been obtained from the sources of FreeBSD-4.0 and is
protected by copyright from Henry Spencer and from the Regents of the
University of California (see the file library/regex/COPYRIGHT for
further details).
Much of the description of regular expressions below is copied verbatim
from Henry Spencer's manual page.
A regular expression is zero or more branches, separated by ``|''. It
matches anything that matches one of the branches.
A branch is zero or more pieces, concatenated. It matches a match for
the first, followed by a match for the second, etc.
A piece is an atom possibly followed by ``*'', ``+'', or ``?''. An atom
followed by ``*'' matches a sequence of 0 or more matches of the atom.
An atom followed by ``+'' matches a sequence of 1 or more matches of the
atom. An atom followed by ``?'' matches a match of the atom, or the
null string.
An atom is a regular expression in parentheses (matching a match for the
regular expression), a range (see below), ``.'' (matching any single
character), ``^ '' (matching the null string at the beginning of the
input string), ``$ '' ( matching the null string at the end of the input
string), a ``\'' followed by a single character (matching that
character), or a single character with no other significance (matching
that character).
A range is a sequence of characters enclosed in ``[]''. It normally
matches any single character from the sequence. If the sequence begins
with ``^ '', it matches any single character not from the rest of the
sequence. If two characters in the sequence are separated by ``-'',
this is shorthand for the full list of ASCII characters between them
(e.g. ``[0-9]'' matches any decimal digit). To include a literal ``]''
in the sequence, make it the first character (following a possible
``^ ''). To include a literal ``-'', make it the first or last
character.
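For example, assuming the library has been loaded with
@code{ use_ module(library(regexp))} :
@example
?- regexp("b+", "abbbc", []).          % succeeds: "b+" matches part of the string
?- regexp("(a*)(b*)", "aabb", [], [A,B]).
@end example
In the second call @code{ A} is unified with the characters that matched
the first parenthesized subexpression and @code{ B} with those matching
the second.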
@table @code
@item regexp(+@var{ RegExp} ,+@var{ String} ,+@var{ Opts} )
@findex regexp/3
@snindex regexp/3
@cnindex regexp/3
Match regular expression @var{ RegExp} to input string @var{ String}
according to options @var{ Opts} . The options may be:
@itemize @bullet
@item @code{ nocase} : Causes upper-case characters in @var{ String} to
be treated as lower case during the matching process.
@end itemize
@item regexp(+@var{ RegExp} ,+@var{ String} ,+@var{ Opts} ,?@var{ SubMatchVars} )
@findex regexp/4
@snindex regexp/4
@cnindex regexp/4
Match regular expression @var{ RegExp} to input string @var{ String}
according to options @var{ Opts} . The variable @var{ SubMatchVars} should
be originally unbound or a list of unbound variables that will contain a
sequence of matches, that is, the head of @var{ SubMatchVars} will
contain the characters in @var{ String} that matched the leftmost
parenthesized subexpression within @var{ RegExp} , the next element of the list
will contain the characters that matched the next parenthesized
subexpression to the right in @var{ RegExp} , and so on.
The options may be:
@itemize @bullet
@item @code{ nocase} : Causes upper-case characters in @var{ String} to
be treated as lower case during the matching process.
@item @code{ indices} : Changes what is stored in
@var{ SubMatchVars} . Instead of storing the matching characters from
@var{ String} , each variable will contain a term of the form @var{ IO-IF}
giving the indices in @var{ String} of the first and last characters in
the matching range of characters.
@end itemize
In general there may be more than one way to match a regular expression
to an input string. For example, consider the command
@example
regexp("(a*)b*","aabaaabb", [], [X,Y])
@end example
Considering only the rules given so far, @var{ X} and @var{ Y} could end up
with the values @code{ "aabb"} and @code{ "aa"} , @code{ "aaab"} and
@code{ "aaa"} , @code{ "ab"} and @code{ "a"} , or any of several other
combinations. To resolve this potential ambiguity @code{ regexp} chooses among
alternatives using the rule ``first then longest''. In other words, it
considers the possible matches in order working from left to right
across the input string and the pattern, and it attempts to match longer
pieces of the input string before shorter ones. More specifically, the
following rules apply in decreasing order of priority:
@enumerate
@item If a regular expression could match two different parts of an
input string then it will match the one that begins earliest.
@item If a regular expression contains "|" operators then the leftmost matching sub-expression is chosen.
@item In *, +, and ? constructs, longer matches are chosen in preference to shorter ones.
@item In sequences of expression components the components are considered from left to right.
@end enumerate
In the example from above, @code{ "(a*)b*"} matches @code{ "aab"} : the
@code{ "(a*)"} portion of the pattern is matched first and it consumes
the leading @code{ "aa"} ; then the @code{ "b*"} portion of the pattern
consumes the next @code{ "b"} . Or, consider the following example:
@example
regexp("(ab|a)(b*)c", "abc", [], [X,Y,Z])
@end example
After this command @var{ X} will be @code{ "abc"} , @var{ Y} will be
@code{ "ab"} , and @var{ Z} will be an empty string. Rule 4 specifies that
@code{ "(ab|a)"} gets first shot at the input string and Rule 2 specifies
that the @code{ "ab"} sub-expression is checked before the @code{ "a"}
sub-expression. Thus the @code{ "b"} has already been claimed before the
@code{ "(b*)"} component is checked and @code{ (b*)} must match an empty string.
@end table
@node shlib, Splay Trees, RegExp, Library
@section SWI-Prolog's shlib library
@cindex SWI-Compatible foreign file loading
This section discusses the functionality of the (autoload)
@code{ library(shlib)} , providing an interface to manage shared
libraries.
The C file must provide a global function @code{ install_ mylib()} that
initialises the module using calls to @code{ PL_ register_ foreign()} . Here is a
simple example file @code{ mylib.c} , which creates a Windows MessageBox:
@c_ example
#include <windows.h>
#include <SWI-Prolog.h>

static foreign_t
pl_say_hello(term_t to)
@{ char *a;
  if ( PL_get_atom_chars(to, &a) )
  @{ MessageBox(NULL, a, "DLL test", MB_OK|MB_TASKMODAL);
    PL_succeed;
  @}
  PL_fail;
@}

install_t
install_mylib()
@{ PL_register_foreign("say_hello", 1, pl_say_hello, 0);
@}
@end c_ example
Now write a file mylib.pl:
@example
:- module(mylib, [ say_hello/1 ]).
:- use_foreign_library(foreign(mylib)).
@end example
The file mylib.pl can be loaded as a normal Prolog file and provides the predicate defined in C.
@table @code
@item load_ foreign_ library(:@var{ FileSpec} ) is det
@findex load_ foreign_ library/1
@snindex load_ foreign_ library/1
@cnindex load_ foreign_ library/1
@item load_ foreign_ library(:@var{ FileSpec} , +@var{ Entry} :atom) is det
@findex load_ foreign_ library/2
@snindex load_ foreign_ library/2
@cnindex load_ foreign_ library/2
Load a shared object or DLL. After loading, the @var{ Entry} function is
called without arguments. The default entry function is composed
from @code{ install_ } , followed by the file base-name. E.g., the
load-call below calls the function @code{ install_ mylib()} . If the platform
prefixes extern functions with @code{ _ } , this prefix is added before
calling.
@example
...
load_foreign_library(foreign(mylib)),
...
@end example
@var{ FileSpec} is a specification for
@code{ absolute_ file_ name/3} . If searching the file fails, the plain
name is passed to the OS to try the default method of the OS for
locating foreign objects. The default definition of
@code{ file_ search_ path/2} searches <prolog home>/lib/Yap.
See also
@code{ use_ foreign_ library/1,2} , which are intended for use in
directives.
@item [det] use_ foreign_ library(+@var{ FileSpec} ), use_ foreign_ library(+@var{ FileSpec} , +@var{ Entry} :atom)
@findex use_ foreign_ library/1
@snindex use_ foreign_ library/1
@cnindex use_ foreign_ library/1
@findex use_ foreign_ library/2
@snindex use_ foreign_ library/2
@cnindex use_ foreign_ library/2
Load and install a foreign library as @code{ load_ foreign_ library/1}
and @code{ load_ foreign_ library/2} do, and
register the installation using @code{ initialization/2} with the option
now. This is similar to using:
@example
:- initialization(load_foreign_library(foreign(mylib))).
@end example
but using the @code{ initialization/1} wrapper causes the library to
be loaded after loading of the file in which it appears is
completed, while @code{ use_ foreign_ library/1} loads the library
immediately. I.e. the difference is only relevant if the remainder
of the file uses functionality of the @code{ C} -library.
@item [det] unload_ foreign_ library(+@var{ FileSpec} )
@item [det] unload_ foreign_ library(+@var{ FileSpec} , +@var{ Exit} :atom)
@findex unload_ foreign_ library/1
@snindex unload_ foreign_ library/1
@cnindex unload_ foreign_ library/1
@findex unload_ foreign_ library/2
@snindex unload_ foreign_ library/2
@cnindex unload_ foreign_ library/2
Unload a shared
object or DLL. After calling the @var{ Exit} function, the shared object is
removed from the process. The default exit function is composed from
@code{ uninstall_ } , followed by the file base-name.
@item current_ foreign_ library(?@var{ File} , ?@var{ Public} )
@findex current_ foreign_ library/2
@snindex current_ foreign_ library/2
@cnindex current_ foreign_ library/2
Query currently
loaded shared libraries.
@c @item reload_ foreign_ libraries
@c @findex reload_ foreign_ libraries/0
@c @snindex reload_ foreign_ libraries/0
@c @cnindex reload_ foreign_ libraries/0
@c Reload all foreign
@c libraries loaded (after restore of a state created using
@c @code{ qsave_ program/2} ).
@end table
@node Splay Trees, String Input/Output, shlib, Library
@section Splay Trees
@cindex splay trees
Splay trees are explained in the paper "Self-adjusting Binary Search
Trees", by D.D. Sleator and R.E. Tarjan, JACM, vol. 32, No.3, July 1985,
p. 668. They are designed to support fast insertions, deletions and
lookups in binary search trees without the complexity of traditional
balanced trees. The key idea is to allow the tree to become
unbalanced. To make up for this, whenever we find a node, we move it up
to the top. We use code by Vijay Saraswat originally posted to the Prolog
mailing-list.
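A short sketch of typical use, assuming the library is loaded as
@code{ library(splay)} :
@example
?- use_module(library(splay)).
?- splay_init(T0),
   splay_insert(3, three, T0, T1),
   splay_insert(1, one, T1, T2),
   splay_access(Found, 3, Val, T2, T3).
@end example
@code{ Found} is unified with @code{ true} and @code{ Val} with
@code{ three} ; @code{ T3} is the readjusted tree.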
@table @code
@item splay_ access(-@var{ Return} ,+@var{ Key} ,?@var{ Val} ,+@var{ Tree} ,-@var{ NewTree} )
@findex splay_ access/5
@snindex splay_ access/5
@cnindex splay_ access/5
If item @var{ Key} is in tree @var{ Tree} , return its @var{ Val} and
unify @var{ Return} with @code{ true} . Otherwise unify @var{ Return} with
@code{ null} . The variable @var{ NewTree} unifies with the new tree.
@item splay_ delete(+@var{ Key} ,?@var{ Val} ,+@var{ Tree} ,-@var{ NewTree} )
@findex splay_ delete/4
@snindex splay_ delete/4
@cnindex splay_ delete/4
Delete item @var{ Key} from tree @var{ Tree} , assuming that it is present
already. The variable @var{ Val} unifies with a value for key @var{ Key} ,
and the variable @var{ NewTree} unifies with the new tree. The predicate
will fail if @var{ Key} is not present.
@item splay_ init(-@var{ NewTree} )
@findex splay_ init/1
@snindex splay_ init/1
@cnindex splay_ init/1
Initialize a new splay tree.
@item splay_ insert(+@var{ Key} ,?@var{ Val} ,+@var{ Tree} ,-@var{ NewTree} )
@findex splay_ insert/4
@snindex splay_ insert/4
@cnindex splay_ insert/4
Insert item @var{ Key} in tree @var{ Tree} , assuming that it is not
there already. The variable @var{ Val} unifies with a value for key
@var{ Key} , and the variable @var{ NewTree} unifies with the new
tree. In our implementation, @var{ Key} is not inserted if it is
already there: rather it is unified with the item already in the tree.
@item splay_ join(+@var{ LeftTree} ,+@var{ RightTree} ,-@var{ NewTree} )
@findex splay_ join/3
@snindex splay_ join/3
@cnindex splay_ join/3
Combine trees @var{ LeftTree} and @var{ RightTree} into a single
tree @var{ NewTree} containing all items from both trees. This operation
assumes that all items in @var{ LeftTree} are less than all those in
@var{ RightTree} and destroys both @var{ LeftTree} and @var{ RightTree} .
@item splay_ split(+@var{ Key} ,?@var{ Val} ,+@var{ Tree} ,-@var{ LeftTree} ,-@var{ RightTree} )
@findex splay_ split/5
@snindex splay_ split/5
@cnindex splay_ split/5
Construct and return two trees @var{ LeftTree} and @var{ RightTree} ,
where @var{ LeftTree} contains all items in @var{ Tree} less than
@var{ Key} , and @var{ RightTree} contains all items in @var{ Tree}
greater than @var{ Key} . This operation destroys @var{ Tree} .
@end table
@node String Input/Output, System, Splay Trees, Library
@section Reading From and Writing To Strings
@cindex string Input/Output
From Version 4.3.2 onwards YAP implements SICStus Prolog compatible
String Input/Output. The library allows users to read from and write to a memory
buffer as if it was a file. The memory buffer is built from or converted
to a string of character codes by the routines in the library. Therefore, if
one wants to read from a string, the string must be fully instantiated
before the library built-in opens the string for reading. These predicates
are made available through the @code{ use_ module(library(charsio))} command.
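For example (note the terminating dot followed by a blank in the string
given to @code{ read_ from_ chars/2} ):
@example
?- use_module(library(charsio)).
?- format_to_chars("~w-~d", [hello, 42], Codes),
   atom_codes(Atom, Codes).               % Atom = 'hello-42'
?- read_from_chars("foo(X, bar). ", Term).  % Term = foo(_A,bar)
@end example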
@table @code
@item format_ to_ chars(+@var{ Form} , +@var{ Args} , -@var{ Result} )
@findex format_ to_ chars/3
@syindex format_ to_ chars/3
@cnindex format_ to_ chars/3
Execute the built-in procedure @code{ format/2} with form @var{ Form} and
arguments @var{ Args} outputting the result to the string of character
codes @var{ Result} .
@item format_ to_ chars(+@var{ Form} , +@var{ Args} , -@var{ Result} , -@var{ Result0} )
@findex format_ to_ chars/4
@syindex format_ to_ chars/4
@cnindex format_ to_ chars/4
Execute the built-in procedure @code{ format/2} with form @var{ Form} and
arguments @var{ Args} outputting the result to the difference list of
character codes @var{ Result-Result0} .
@item write_ to_ chars(+@var{ Term} , -@var{ Result} )
@findex write_ to_ chars/2
@syindex write_ to_ chars/2
@cnindex write_ to_ chars/2
Execute the built-in procedure @code{ write/1} with argument @var{ Term}
outputting the result to the string of character codes @var{ Result} .
@item write_ to_ chars(+@var{ Term} , -@var{ Result0} , -@var{ Result} )
@findex write_ to_ chars/3
@syindex write_ to_ chars/3
@cnindex write_ to_ chars/3
Execute the built-in procedure @code{ write/1} with argument @var{ Term}
outputting the result to the difference list of character codes
@var{ Result-Result0} .
@item atom_ to_ chars(+@var{ Atom} , -@var{ Result} )
@findex atom_ to_ chars/2
@syindex atom_ to_ chars/2
@cnindex atom_ to_ chars/2
Convert the atom @var{ Atom} to the string of character codes
@var{ Result} .
@item atom_ to_ chars(+@var{ Atom} , -@var{ Result0} , -@var{ Result} )
@findex atom_ to_ chars/3
@syindex atom_ to_ chars/3
@cnindex atom_ to_ chars/3
Convert the atom @var{ Atom} to the difference list of character codes
@var{ Result-Result0} .
@item number_ to_ chars(+@var{ Number} , -@var{ Result} )
@findex number_ to_ chars/2
@syindex number_ to_ chars/2
@cnindex number_ to_ chars/2
Convert the number @var{ Number} to the string of character codes
@var{ Result} .
@item number_ to_ chars(+@var{ Number} , -@var{ Result0} , -@var{ Result} )
@findex number_ to_ chars/3
@syindex number_ to_ chars/3
@cnindex number_ to_ chars/3
Convert the number @var{ Number} to the difference list of character codes
@var{ Result-Result0} .
@item atom_ to_ term(+@var{ Atom} , -@var{ Term} , -@var{ Bindings} )
@findex atom_ to_ term/3
@syindex atom_ to_ term/3
@cnindex atom_ to_ term/3
Use @var{ Atom} as input to @code{ read_ term/2} using the option @code{ variable_ names} and return the read term in @var{ Term} and the variable bindings in @var{ Bindings} . @var{ Bindings} is a list of @code{ Name = Var} couples, thus providing access to the actual variable names. See also @code{ read_ term/2} . If Atom has no valid syntax, a syntax_ error exception is raised.
@item term_ to_ atom(?@var{ Term} , ?@var{ Atom} )
@findex term_ to_ atom/2
@syindex term_ to_ atom/2
@cnindex term_ to_ atom/2
True if @var{ Atom} describes a term that unifies with @var{ Term} . When
@var{ Atom} is instantiated @var{ Atom} is converted and then unified with
@var{ Term} . If @var{ Atom} has no valid syntax, a syntax_ error exception
is raised. Otherwise @var{ Term} is ``written'' on @var{ Atom} using
@code{ write_ term/2} with the option quoted(true).
@item read_ from_ chars(+@var{ Chars} , -@var{ Term} )
@findex read_ from_ chars/2
@syindex read_ from_ chars/2
@cnindex read_ from_ chars/2
Parse the list of character codes @var{ Chars} and return the result in
the term @var{ Term} . The character codes to be read must terminate with
a dot character such that either (i) the dot character is followed by
blank characters; or (ii) the dot character is the last character in the
string.
@item open_ chars_ stream(+@var{ Chars} , -@var{ Stream} )
@findex open_ chars_ stream/2
@syindex open_ chars_ stream/2
@cnindex open_ chars_ stream/2
Open the list of character codes @var{ Chars} as a stream @var{ Stream} .
@item with_ output_ to_ chars(?@var{ Goal} , -@var{ Chars} )
@findex with_ output_ to_ chars/2
@syindex with_ output_ to_ chars/2
@cnindex with_ output_ to_ chars/2
Execute goal @var{ Goal} such that its standard output will be sent to a
memory buffer. After successful execution the contents of the memory
buffer will be converted to the list of character codes @var{ Chars} .
@item with_ output_ to_ chars(?@var{ Goal} , ?@var{ Chars0} , -@var{ Chars} )
@findex with_ output_ to_ chars/3
@syindex with_ output_ to_ chars/3
@cnindex with_ output_ to_ chars/3
Execute goal @var{ Goal} such that its standard output will be sent to a
memory buffer. After successful execution the contents of the memory
buffer will be converted to the difference list of character codes
@var{ Chars-Chars0} .
@item with_ output_ to_ chars(?@var{ Goal} , -@var{ Stream} , ?@var{ Chars0} , -@var{ Chars} )
@findex with_ output_ to_ chars/4
@syindex with_ output_ to_ chars/4
@cnindex with_ output_ to_ chars/4
Execute goal @var{ Goal} such that its standard output will be sent to a
memory buffer. After successful execution the contents of the memory
buffer will be converted to the difference list of character codes
@var{ Chars-Chars0} and @var{ Stream} receives the stream corresponding to
the memory buffer.
@end table
The implementation of the character IO operations relies on three YAP
built-ins:
@table @code
@item charsio:open_ mem_ read_ stream(+@var{ String} , -@var{ Stream} )
Store a string in a memory buffer and output a stream that reads from this
memory buffer.
@item charsio:open_ mem_ write_ stream(-@var{ Stream} )
Create a new memory buffer and output a stream that writes to it.
@item charsio:peek_ mem_ write_ stream(-@var{ Stream} , L0, L)
Convert the memory buffer associated with stream @var{ Stream} to the
difference list of character codes @var{ L-L0} .
@end table
@noindent
These built-ins are initialized to belong to the module @code{ charsio} in
@code{ init.yap} . Novel procedures for manipulating strings can be defined by
explicitly importing these built-ins.
YAP does not currently support opening a @code{ charsio} stream in
@code{ append} mode, or seeking in such a stream.
@node System, Terms, String Input/Output, Library
@section Calling The Operating System from YAP
@cindex Operating System Utilities
YAP now provides a library of system utilities compatible with the
SICStus Prolog system library. This library extends and to some extent
replaces the functionality of Operating System access routines. The
library includes Unix/Linux and Win32 @code{ C} code. They
@table @code
@item datime(datime(-@var{ Year} , -@var{ Month} , -@var{ DayOfTheMonth} ,
-@var{ Hour} , -@var{ Minute} , -@var{ Second} ))
@findex datime/1
@syindex datime/1
@cnindex datime/1
The @code{ datime/1} procedure returns the current date and time, with
information on @var{ Year} , @var{ Month} , @var{ DayOfTheMonth} ,
@var{ Hour} , @var{ Minute} , and @var{ Second} . The @var{ Hour} is returned
on local time. This function uses the WIN32
@code{ GetLocalTime} function or the Unix @code{ localtime} function.
@example
?- datime(X).
X = datime(2001,5,28,15,29,46) ?
@end example
@item mktime(datime(+@var{ Year} , +@var{ Month} , +@var{ DayOfTheMonth} ,
+@var{ Hour} , +@var{ Minute} , +@var{ Second} ), -@var{ Seconds} )
@findex mktime/2
@snindex mktime/2
@cnindex mktime/2
The @code{ mktime/2} procedure returns the number of @var{ Seconds}
elapsed since 00:00:00 on January 1, 1970, Coordinated Universal Time
(UTC). The user provides information on @var{ Year} , @var{ Month} ,
@var{ DayOfTheMonth} , @var{ Hour} , @var{ Minute} , and @var{ Second} . The
@var{ Hour} is given on local time. This function uses the WIN32
@code{ GetLocalTime} function or the Unix @code{ mktime} function.
@example
?- mktime(datime(2001,5,28,15,29,46),X).
X = 991081786 ? ;
@end example
@item delete_ file(+@var{ File} )
@findex delete_ file/1
@syindex delete_ file/1
@cnindex delete_ file/1
The @code{ delete_ file/1} procedure removes file @var{ File} . If
@var{ File} is a directory, remove the directory @emph{ and all its subdirectories} .
@example
?- delete_file(x).
@end example
@item delete_ file(+@var{ File} ,+@var{ Opts} )
@findex delete_ file/2
@syindex delete_ file/2
@cnindex delete_ file/2
The @code{ delete_ file/2} procedure removes file @var{ File} according to
options @var{ Opts} . These options are @code{ directory} if one should
remove directories, @code{ recursive} if one should remove directories
recursively, and @code{ ignore} if errors are not to be reported.
This example is equivalent to using the @code{ delete_ file/1} predicate:
@example
?- delete_file(x, [recursive]).
@end example
@item directory_ files(+@var{ Dir} ,-@var{ List} )
@findex directory_ files/2
@syindex directory_ files/2
@cnindex directory_ files/2
Given a directory @var{ Dir} , @code{ directory_ files/2} produces a
listing of all files and directories in the directory:
@example
?- directory_files('.',L), writeq(L).
['Makefile.~1~','sys.so','Makefile','sys.o',x,..,'.']
@end example
The predicate uses the @code{ dirent} family of routines in Unix
environments, and @code{ findfirst} in WIN32.
@item file_ exists(+@var{ File} )
@findex file_ exists/1
@syindex file_ exists/1
@cnindex file_ exists/1
The atom @var{ File} corresponds to an existing file.
@item file_ exists(+@var{ File} ,+@var{ Permissions} )
@findex file_ exists/2
@syindex file_ exists/2
@cnindex file_ exists/2
The atom @var{ File} corresponds to an existing file with permissions
compatible with @var{ Permissions} . YAP currently only accepts
permissions described as a number. The actual meaning of this
number is Operating System dependent.
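For illustration, assuming a file @code{ data} exists in the current
directory, and that on the host system the permission number @code{ 4}
denotes read access, the following queries might succeed:
@example
?- file_ exists(data).
?- file_ exists(data, 4).
@end example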
@item file_ property(+@var{ File} ,?@var{ Property} )
@findex file_ property/2
@syindex file_ property/2
@cnindex file_ property/2
The atom @var{ File} corresponds to an existing file, and @var{ Property}
will be unified with a property of this file. The properties are of the
form @code{ type(@var{ Type} )} , which gives whether the file is a regular
file, a directory, a fifo file, or of unknown type;
@code{ size(@var{ Size} )} , which gives the size of the file;
@code{ mod_ time(@var{ Time} )} , which gives the last time the file was
modified, according to some Operating System dependent
timestamp; @code{ mode(@var{ Mode} )} , which gives the permission flags for the
file; and @code{ linkto(@var{ FileName} )} , which gives the file pointed to by a
symbolic link. Properties can be obtained through backtracking:
@example
?- file_ property('Makefile',P).
P = type(regular) ? ;
P = size(2375) ? ;
P = mod_ time(990826911) ? ;
no
@end example
@item make_ directory(+@var{ Dir} )
@findex make_ directory/1
@syindex make_ directory/1
@cnindex make_ directory/1
Create a directory @var{ Dir} . The name of the directory must be an atom.
@item rename_ file(+@var{ OldFile} ,+@var{ NewFile} )
@findex rename_ file/2
@syindex rename_ file/2
@cnindex rename_ file/2
Rename file @var{ OldFile} to @var{ NewFile} . This predicate uses the
@code{ C} built-in function @code{ rename} .
@item environ(?@var{ EnvVar} ,+@var{ EnvValue} )
2014-04-21 11:14:18 +01:00
@findex sys_ environ/2
@syindex sys_ environ/2
@cnindex sys_ environ/2
2001-05-28 20:54:53 +01:00
Unify environment variable @var{ EnvVar} with its value @var{ EnvValue} ,
if there is one. This predicate is backtrackable in Unix systems, but
not currently in Win32 configurations.
@example
?- environ('HOME',X).
X = 'C:\\ cygwin\\ home\\ administrator' ?
@end example
2001-06-29 13:45:54 +01:00
@item host_ id(-@var{ Id} )
@findex host_ id/1
@syindex host_ id/1
@cnindex host_ id/1
2007-02-18 00:26:36 +00:00
Unify @var{ Id} with an identifier of the current host. YAP uses the
@code{ hostid} function when available.
@item host_ name(-@var{ Name} )
@findex host_ name/1
@syindex host_ name/1
@cnindex host_ name/1
2007-02-18 00:26:36 +00:00
Unify @var{ Name} with a name for the current host. YAP uses the
@code{ hostname} function in Unix systems when available, and the
@code{ GetComputerName} function in WIN32 systems.
@item kill(@var{ Id} ,+@var{ SIGNAL} )
@findex kill/2
@syindex kill/2
@cnindex kill/2
Send signal @var{ SIGNAL} to process @var{ Id} . In Unix this predicate is
a direct interface to @code{ kill} so one can send signals to groups of
processes. In WIN32 the predicate is an interface to
@code{ TerminateProcess} , so it kills @var{ Id} independently of @var{ SIGNAL} .
2001-06-29 13:45:54 +01:00
@item mktemp(@var{ Spec} ,-@var{ File} )
@findex mktemp/2
@syindex mktemp/2
@cnindex mktemp/2
Direct interface to @code{ mktemp} : given a @var{ Spec} , that is, a file
name ending in six @code{ X} characters, create a unique file name
@var{ File} . Use @code{ tmpnam/1} instead.
@item pid(-@var{ Id} )
@findex pid/1
@syindex pid/1
@cnindex pid/1
Unify @var{ Id} with the process identifier for the current
process. An interface to the @t{ getpid} function.
@item tmpnam(-@var{ File} )
@findex tmpnam/1
@syindex tmpnam/1
@cnindex tmpnam/1
2008-05-23 00:25:21 +01:00
Interface with @var{ tmpnam} : obtain a new, unique file name @var{ File} .
@item tmp_ file(+@var{ Base} ,-@var{ TmpName} )
@findex tmp_ file/2
@snindex tmp_ file/2
@cnindex tmp_ file/2
Create a name for a temporary file. @var{ Base} is a user-provided
identifier for the category of file. The @var{ TmpName} is guaranteed to
be unique. If the system halts, it will automatically remove all created
temporary files.
@item exec(+@var{ Command} ,[+@var{ InputStream} ,+@var{ OutputStream} ,+@var{ ErrorStream} ],-@var{ PID} )
2001-05-28 20:54:53 +01:00
@findex exec/3
@syindex exec/3
@cnindex exec/3
Execute command @var{ Command} with its streams connected to
@var{ InputStream} , @var{ OutputStream} , and @var{ ErrorStream} . The
process that executes the command is returned as @var{ PID} . The
command is executed by the default shell @code{ bin/sh -c} in Unix.
2001-06-29 13:45:54 +01:00
The following example demonstrates the use of @code{ exec/3} to send a
command and process its output:
@example
exec(ls,[std,pipe(S),null],P),repeat, get0(S,C), (C = -1, close(S), ! ; put(C)).
@end example
The streams may be one of standard stream, @code{ std} , null stream,
@code{ null} , or @code{ pipe(S)} , where @var{ S} is a pipe stream. Note
that it is up to the user to close the pipe.
@item popen(+@var{ Command} , +@var{ TYPE} , -@var{ Stream} )
@findex popen/3
@syindex popen/3
@cnindex popen/3
Interface to the @t{ popen} function. It opens a process by creating a
pipe, forking and invoking @var{ Command} on the current shell. Since a
pipe is by definition unidirectional the @var{ Type} argument may be
@code{ read} or @code{ write} , not both. The stream should be closed
using @code{ close/1} , there is no need for a special @code{ pclose}
command.
The following example demonstrates the use of @code{ popen/3} to process
the output of a command, as @code{ exec/3} would do:
2014-04-21 11:14:18 +01:00
@pl_ example
?- popen(ls,read,X),repeat, get0(X,C), (C = -1, ! ; put(C)).
@end pl_ example
2001-06-29 13:45:54 +01:00
The WIN32 implementation of @code{ popen/3} relies on @code{ exec/3} .
2001-06-08 15:52:54 +01:00
@item shell
@findex shell/0
@syindex shell/0
@cnindex shell/0
2007-02-18 00:26:36 +00:00
Start a new shell and leave YAP in background until the shell
completes. YAP uses the shell given by the environment variable
@code{ SHELL} . In WIN32 environments YAP will use @code{ COMSPEC} if
@code{ SHELL} is undefined.
@item shell(+@var{ Command} )
@findex shell/1
@syindex shell/1
@cnindex shell/1
2007-02-18 00:26:36 +00:00
Execute command @var{ Command} under a new shell. YAP will be in
background until the command completes. In Unix environments YAP uses
the shell given by the environment variable @code{ SHELL} with the option
@code{ " -c "} . In WIN32 environments YAP will use @code{ COMSPEC} if
@code{ SHELL} is undefined, in this case with the option @code{ " /c "} .
@item shell(+@var{ Command} ,-@var{ Status} )
2014-04-21 11:14:18 +01:00
@findex shell/2
@syindex shell/2
@cnindex shell/2
2001-06-08 15:52:54 +01:00
Execute command @var{ Command} under a new shell and unify @var{ Status}
with the exit status of the command. YAP will be in background until the
command completes. In Unix environments YAP uses the shell given by the
environment variable @code{ SHELL} with the option @code{ " -c "} . In
WIN32 environments YAP will use @code{ COMSPEC} if @code{ SHELL} is
undefined, in this case with the option @code{ " /c "} .
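For instance, the exit status of a command that completes successfully
is normally @code{ 0} , so a query such as the following might be used as
a sketch (the command itself is only an example):
@example
?- shell('ls > /dev/null', Status).
Status = 0
@end example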
2001-06-07 18:54:29 +01:00
@item sleep(+@var{ Time} )
@findex sleep/1
@syindex sleep/1
@cnindex sleep/1
2006-12-30 11:29:30 +00:00
Block the current thread for @var{ Time} seconds. When YAP is compiled
without multi-threading support, this predicate blocks the YAP process.
The number of seconds must be a positive number, and it may be an integer
or a float. The Unix implementation uses @code{ usleep} if the number of
seconds is below one, and @code{ sleep} if it is over a second. The WIN32
implementation uses @code{ Sleep} for both cases.
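For instance, to pause the current thread for half a second one may call:
@example
?- sleep(0.5).
@end example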
2001-06-07 18:54:29 +01:00
2001-06-29 13:45:54 +01:00
@item system
@findex system/0
@syindex system/0
@cnindex system/0
2007-02-18 00:26:36 +00:00
Start a new default shell and leave YAP in background until the shell
completes. YAP uses @code{ /bin/sh} in Unix systems and @code{ COMSPEC} in
WIN32.
@item system(+@var{ Command} ,-@var{ Res} )
@findex system/2
@syindex system/2
@cnindex system/2
Interface to @code{ system} : execute command @var{ Command} and unify
@var{ Res} with the result.
2001-06-07 18:54:29 +01:00
@item wait(+@var{ PID} ,-@var{ Status} )
@findex wait/2
@syindex wait/2
@cnindex wait/2
Wait until process @var{ PID} terminates, and return its exit @var{ Status} .
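As an illustrative sketch combining @code{ exec/3} and @code{ wait/2} ,
the following query runs a command with the standard streams and then
waits for its termination (the command is only an example):
@example
?- exec(ls, [std,std,std], PID), wait(PID, Status).
@end example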
2001-05-28 20:54:53 +01:00
@end table
2007-09-16 21:27:57 +01:00
@node Terms, Tries, System, Library
2001-04-09 20:54:03 +01:00
@section Utilities On Terms
@cindex utilities on terms
The next routines provide a set of commonly used utilities to manipulate
terms. Most of these utilities have been implemented in @code{ C} for
efficiency. They are available through the
@code{ use_ module(library(terms))} command.
@table @code
2002-03-07 05:13:21 +00:00
@item cyclic_ term(?@var{ Term} )
@findex cyclic_ term/1
@syindex cyclic_ term/1
@cnindex cyclic_ term/1
2012-10-03 09:11:37 +01:00
Succeed if the argument @var{ Term} is not a cyclic term.
2002-03-07 05:13:21 +00:00
2001-04-09 20:54:03 +01:00
@item term_ hash(+@var{ Term} , ?@var{ Hash} )
@findex term_ hash/2
@syindex term_ hash/2
@cnindex term_ hash/2
If @var{ Term} is ground unify @var{ Hash} with a positive integer
calculated from the structure of the term. Otherwise the argument
@var{ Hash} is left unbound. The range of the positive integer is from
@code{ 0} to, but not including, @code{ 33554432} .
@item term_ hash(+@var{ Term} , +@var{ Depth} , +@var{ Range} , ?@var{ Hash} )
@findex term_ hash/4
@syindex term_ hash/4
@cnindex term_ hash/4
Unify @var{ Hash} with a positive integer calculated from the structure
of the term. The range of the positive integer is from @code{ 0} to, but
not including, @var{ Range} . If @var{ Depth} is @code{ -1} the whole term
is considered. Otherwise, the term is considered only up to depth
@code{ 1} , where the constants and the principal functor have depth
@code{ 1} , and an argument of a term with depth @var{ I} has depth @var{ I+1} .
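As an illustration, in the following hypothetical query the first term
is ground and is therefore hashed, while the second contains a variable
and leaves its hash unbound (the actual integer returned is
implementation dependent):
@example
?- term_ hash(foo(a,b), H1), term_ hash(foo(a,Y), H2).
@end example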
2009-03-10 16:21:05 +00:00
@item variables_ within_ term(+@var{ Variables} ,?@var{ Term} , -@var{ OutputVariables} )
@findex variables_ within_ term/3
@snindex variables_ within_ term/3
@cnindex variables_ within_ term/3
Unify @var{ OutputVariables} with the subset of the variables @var{ Variables} that occurs in @var{ Term} .
@item new_ variables_ in_ term(+@var{ Variables} ,?@var{ Term} , -@var{ OutputVariables} )
@findex new_ variables_ in_ term/3
@snindex new_ variables_ in_ term/3
@cnindex new_ variables_ in_ term/3
Unify @var{ OutputVariables} with all variables occurring in @var{ Term} that are not in the list @var{ Variables} .
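For illustration, the following queries sketch the difference between
the two predicates:
@example
?- variables_ within_ term([X,Y], f(Y,Z), Vs).
Vs = [Y]
?- new_ variables_ in_ term([X,Y], f(Y,Z), Vs).
Vs = [Z]
@end example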
2001-04-09 20:54:03 +01:00
@item variant(?@var{ Term1} , ?@var{ Term2} )
@findex variant/2
@syindex variant/2
@cnindex variant/2
Succeed if @var{ Term1} and @var{ Term2} are variant terms.
@item subsumes(?@var{ Term1} , ?@var{ Term2} )
@findex subsumes/2
@syindex subsumes/2
@cnindex subsumes/2
Succeed if @var{ Term1} subsumes @var{ Term2} . Variables in term
@var{ Term1} are bound so that the two terms become equal.
@item subsumes_ chk(?@var{ Term1} , ?@var{ Term2} )
@findex subsumes_ chk/2
@syindex subsumes_ chk/2
@cnindex subsumes_ chk/2
Succeed if @var{ Term1} subsumes @var{ Term2} but does not bind any
variable in @var{ Term1} .
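As a sketch of the difference between @code{ subsumes/2} and
@code{ subsumes_ chk/2} :
@example
?- subsumes(f(X,Y), f(a,b)).
X = a,
Y = b
?- subsumes_ chk(f(X,Y), f(a,b)).
@end example
In the first query the variables of the more general term are bound,
while in the second they are left untouched.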
2002-03-07 05:13:21 +00:00
@item variable_ in_ term(?@var{ Term} ,?@var{ Var} )
@findex variable_ in_ term/2
@snindex variable_ in_ term/2
@cnindex variable_ in_ term/2
Succeed if the second argument @var{ Var} is a variable and occurs in
term @var{ Term} .
2008-03-13 18:41:52 +00:00
@item unifiable(?@var{ Term1} , ?@var{ Term2} , -@var{ Bindings} )
@findex unifiable/3
@syindex unifiable/3
@cnindex unifiable/3
Succeed if @var{ Term1} and @var{ Term2} are unifiable with substitution
@var{ Bindings} .
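For illustration, the following query succeeds, with @var{ Bindings}
describing the substitution that maps @var{ X} to @code{ b} and @var{ Y}
to @code{ a} (the concrete representation of the list is implementation
dependent):
@example
?- unifiable(f(X,a), f(b,Y), Bindings).
@end example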
2002-10-11 04:39:11 +01:00
@end table
2007-09-16 21:27:57 +01:00
@node Tries, Cleanup, Terms, Library
@section Trie DataStructure
@cindex tries
The next routines provide a set of utilities to create and manipulate
prefix trees of Prolog terms. Tries were originally proposed to
implement tabling in Logic Programming, but can be used for other
purposes. The tries are stored in the Prolog database and can be seen
as an alternative to the @code{ assert} and @code{ record} families of
primitives. Most of these utilities have been implemented in @code{ C}
for efficiency. They are available through the
@code{ use_ module(library(tries))} command.
@table @code
@item trie_ open(-@var{ Id} )
@findex trie_ open/1
@snindex trie_ open/1
2008-11-05 13:28:44 +00:00
@cnindex trie_ open/1
2007-09-16 21:27:57 +01:00
Open a new trie with identifier @var{ Id} .
@item trie_ close(+@var{ Id} )
@findex trie_ close/1
@snindex trie_ close/1
2008-11-05 13:28:44 +00:00
@cnindex trie_ close/1
2007-09-16 21:27:57 +01:00
Close trie with identifier @var{ Id} .
@item trie_ close_ all
@findex trie_ close_ all/0
@snindex trie_ close_ all/0
@cnindex trie_ close_ all/0
Close all available tries.
@item trie_ mode(?@var{ Mode} )
@findex trie_ mode/1
@snindex trie_ mode/1
@cnindex trie_ mode/1
Unify @var{ Mode} with trie operation mode. Allowed values are either
@code{ std} (default) or @code{ rev} .
@item trie_ put_ entry(+@var{ Trie} ,+@var{ Term} ,-@var{ Ref} )
@findex trie_ put_ entry/3
@snindex trie_ put_ entry/3
@cnindex trie_ put_ entry/3
Add term @var{ Term} to trie @var{ Trie} . The handle @var{ Ref} gives
a reference to the term.
@item trie_ check_ entry(+@var{ Trie} ,+@var{ Term} ,-@var{ Ref} )
@findex trie_ check_ entry/3
@snindex trie_ check_ entry/3
@cnindex trie_ check_ entry/3
Succeeds if a variant of term @var{ Term} is in trie @var{ Trie} . The handle
@var{ Ref} gives a reference to the term.
@item trie_ get_ entry(+@var{ Ref} ,-@var{ Term} )
@findex trie_ get_ entry/2
@snindex trie_ get_ entry/2
@cnindex trie_ get_ entry/2
Unify @var{ Term} with the entry for handle @var{ Ref} .
@item trie_ remove_ entry(+@var{ Ref} )
@findex trie_ remove_ entry/1
@snindex trie_ remove_ entry/1
@cnindex trie_ remove_ entry/1
Remove entry for handle @var{ Ref} .
@item trie_ remove_ subtree(+@var{ Ref} )
@findex trie_ remove_ subtree/1
@snindex trie_ remove_ subtree/1
@cnindex trie_ remove_ subtree/1
Remove subtree rooted at handle @var{ Ref} .
@item trie_ save(+@var{ Trie} ,+@var{ FileName} )
@findex trie_ save/2
@snindex trie_ save/2
@cnindex trie_ save/2
Dump trie @var{ Trie} into file @var{ FileName} .
@item trie_ load(+@var{ Trie} ,+@var{ FileName} )
@findex trie_ load/2
@snindex trie_ load/2
@cnindex trie_ load/2
Load trie @var{ Trie} from the contents of file @var{ FileName} .
@item trie_ stats(-@var{ Memory} ,-@var{ Tries} ,-@var{ Entries} ,-@var{ Nodes} )
@findex trie_ stats/4
@snindex trie_ stats/4
@cnindex trie_ stats/4
Give generic statistics on tries, including the amount of memory,
@var{ Memory} , the number of tries, @var{ Tries} , the number of entries,
@var{ Entries} , and the total number of nodes, @var{ Nodes} .
@item trie_ max_ stats(-@var{ Memory} ,-@var{ Tries} ,-@var{ Entries} ,-@var{ Nodes} )
@findex trie_ max_ stats/4
@snindex trie_ max_ stats/4
@cnindex trie_ max_ stats/4
Give maximal statistics on tries, including the amount of memory,
@var{ Memory} , the number of tries, @var{ Tries} , the number of entries,
@var{ Entries} , and the total number of nodes, @var{ Nodes} .
@item trie_ usage(+@var{ Trie} ,-@var{ Entries} ,-@var{ Nodes} ,-@var{ VirtualNodes} )
@findex trie_ usage/4
@snindex trie_ usage/4
@cnindex trie_ usage/4
Give statistics on trie @var{ Trie} , the number of entries,
@var{ Entries} , and the total number of nodes, @var{ Nodes} , and the
number of @var{ VirtualNodes} .
@item trie_ print(+@var{ Trie} )
@findex trie_ print/1
@snindex trie_ print/1
@cnindex trie_ print/1
Print trie @var{ Trie} on standard output.
@end table
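As an illustrative sketch of the basic workflow, a term may be stored
in a trie and recovered through its handle (the answer shown is only
indicative, since variables are renamed on retrieval):
@example
?- trie_ open(T),
   trie_ put_ entry(T, f(a,g(X)), R),
   trie_ get_ entry(R, Term).
Term = f(a,g(_ A))
@end example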
@node Cleanup, Timeout, Tries, Library
2005-10-28 18:55:30 +01:00
@section Call Cleanup
2002-10-11 04:39:11 +01:00
@cindex cleanup
@t{ call_ cleanup/1} and @t{ call_ cleanup/2} allow predicates to register
code for execution after the call is finished. Predicates can be
declared to be @t{ fragile} to ensure that @t{ call_ cleanup} is called
for any Goal which needs it. This library is loaded with the
@code{ use_ module(library(cleanup))} command.
2001-04-09 20:54:03 +01:00
2002-10-11 04:39:11 +01:00
@table @code
2002-10-27 18:11:01 +00:00
@item :- fragile @var{ P} ,....,@var{ Pn}
@findex fragile
@syindex fragile
@cnindex fragile
Declares the predicate @var{ P} =@t{ [module:]name/arity} as a fragile
predicate; the module is optional and defaults to the current typein
module. Whenever such a fragile predicate is used in a query
it will be called through @code{ call_ cleanup/1} .
2014-04-21 11:14:18 +01:00
@pl_ example
2002-10-27 18:11:01 +00:00
:- fragile foo/1,bar:baz/2.
2014-04-21 11:14:18 +01:00
@end pl_ example
2002-10-27 18:11:01 +00:00
2009-05-20 07:53:14 +01:00
@item call_ cleanup(:@var{ Goal} )
2002-10-11 04:39:11 +01:00
@findex call_ cleanup/1
@syindex call_ cleanup/1
@cnindex call_ cleanup/1
Execute goal @var{ Goal} within a cleanup-context. Called predicates
might register cleanup Goals which are called right after the end of
the call to @var{ Goal} . Cuts and exceptions inside Goal do not prevent the
execution of the cleanup calls. @t{ call_ cleanup} might be nested.
2009-05-20 07:53:14 +01:00
@item call_ cleanup(:@var{ Goal} , :@var{ CleanUpGoal} )
2002-10-11 04:39:11 +01:00
@findex call_ cleanup/2
@syindex call_ cleanup/2
@cnindex call_ cleanup/2
This is similar to @t{ call_ cleanup/1} with an additional
@var{ CleanUpGoal} which gets called after @var{ Goal} is finished.
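For instance, assuming a user-defined predicate @code{ process_ file/1}
(a hypothetical example), a message can be printed once the call is
finished, whether it succeeds, fails, or throws an exception:
@example
?- call_ cleanup(process_ file('data'), write(done)).
@end example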
2009-05-20 07:53:14 +01:00
@item setup_ call_ cleanup(:@var{ Setup} ,:@var{ Goal} , :@var{ CleanUpGoal} )
@findex setup_ call_ cleanup/3
@snindex setup_ call_ cleanup/3
@cnindex setup_ call_ cleanup/3
Calls @code{ (Setup, Goal)} . For each successful execution of @var{ Setup} , calling @var{ Goal} , the
cleanup handler @var{ Cleanup} is guaranteed to be called exactly once.
This will happen after @var{ Goal} completes, either through failure,
deterministic success, commit, or an exception. @var{ Setup} will
contain the goals that need to be protected from asynchronous interrupts
such as the ones received from @code{ call_ with_ time_ limit/2} or @code{ thread_ signal/2} . In
most uses, @var{ Setup} will perform temporary side-effects required by
@var{ Goal} that are finally undone by @var{ Cleanup} .
2009-05-20 07:53:14 +01:00
Success or failure of @var{ Cleanup} is ignored and choice-points it
created are destroyed (as @code{ once/1} ). If @var{ Cleanup} throws an exception,
the exception is processed as usual.
Typically, this predicate is used to cleanup permanent data storage
required to execute @var{ Goal} , close file-descriptors, etc. The example
below provides a non-deterministic search for a term in a file, closing
the stream as needed.
2014-04-21 11:14:18 +01:00
@pl_ example
2009-05-20 07:53:14 +01:00
term_ in_ file(Term, File) :-
2013-09-29 11:31:18 +01:00
setup_ call_ cleanup(open(File, read, In),
term_ in_ stream(Term, In),
close(In) ).
2009-05-20 07:53:14 +01:00
term_ in_ stream(Term, In) :-
2013-09-29 11:31:18 +01:00
repeat,
read(In, T),
( T == end_ of_ file
-> !, fail
; T = Term
).
2014-04-21 11:14:18 +01:00
@end pl_ example
2009-05-20 07:53:14 +01:00
Note that it is impossible to implement this predicate in Prolog other than
by reading all terms into a list, closing the file and calling @code{ member/2} .
Without @code{ setup_ call_ cleanup/3} there is no way to gain control if the
choice-point left by @code{ repeat} is removed by a cut or an exception.
2009-05-20 07:53:14 +01:00
@code{ setup_ call_ cleanup/3} can also be used to test determinism of a goal:
@example
?- setup_ call_ cleanup(true,(X=1;X=2), Det=yes).
X = 1 ;
X = 2,
Det = yes ;
@end example
This predicate is under consideration for inclusion into the ISO standard.
For compatibility with other Prolog implementations see @code{ call_ cleanup/2} .
2009-05-20 17:14:48 +01:00
@item setup_ call_ catcher_ cleanup(:@var{ Setup} ,:@var{ Goal} , +@var{ Catcher} ,:@var{ CleanUpGoal} )
@findex setup_ call_ catcher_ cleanup/4
@snindex setup_ call_ catcher_ cleanup/4
@cnindex setup_ call_ catcher_ cleanup/4
2009-05-26 23:57:59 +01:00
Similar to @code{ setup_ call_ cleanup(@var{ Setup} , @var{ Goal} , @var{ Cleanup} )} with
additional information on the reason of calling @var{ Cleanup} . Prior
to calling @var{ Cleanup} , @var{ Catcher} unifies with the termination
code. If this unification fails, @var{ Cleanup} is
@strong{ not} called.
2002-10-11 04:39:11 +01:00
@item on_ cleanup(+@var{ CleanUpGoal} )
@findex on_ cleanup/1
@syindex on_ cleanup/1
@cnindex on_ cleanup/1
Any predicate might register a @var{ CleanUpGoal} . The
@var{ CleanUpGoal} is put onto the current cleanup context. All such
CleanUpGoals are executed in reverse order of their registration when
the surrounding cleanup-context ends. This call will throw an exception
if a predicate tries to register a @var{ CleanUpGoal} outside of any
cleanup-context.
@item cleanup_ all
@findex cleanup_ all/0
@syindex cleanup_ all/0
@cnindex cleanup_ all/0
Calls all pending CleanUpGoals and resets the cleanup-system to an
initial state. Should only be used as one of the last calls in the
main program.
2001-04-09 20:54:03 +01:00
@end table
2002-10-11 04:39:11 +01:00
There are some private predicates which could be used in special
cases, such as manually setting up cleanup-contexts and registering
CleanUpGoals for other than the current cleanup-context.
Read the Source Luke.
2005-10-31 18:12:51 +00:00
@node Timeout, Trees, Cleanup, Library
2001-04-09 20:54:03 +01:00
@section Calls With Timeout
@cindex timeout
The @t{ time_ out/3} command relies on the @t{ alarm/3} built-in to
implement a call with a maximum time of execution. The command is
available with the @code{ use_ module(library(timeout))} command.
@table @code
2005-10-31 18:12:51 +00:00
2001-04-09 20:54:03 +01:00
@item time_ out(+@var{ Goal} , +@var{ Timeout} , -@var{ Result} )
@findex time_ out/3
@syindex time_ out/3
@cnindex time_ out/3
2002-10-11 04:39:11 +01:00
Execute goal @var{ Goal} with a time limit of @var{ Timeout} , where
@var{ Timeout} is measured in milliseconds. If the goal succeeds, unify
@var{ Result} with @t{ success} . If the timer expires before the goal
terminates, unify @var{ Result} with @t{ time_ out} .
2001-04-09 20:54:03 +01:00
This command is implemented by activating an alarm at procedure
entry. If the timer expires before the goal completes, the alarm will
throw an exception @var{ timeout} .
2001-04-09 20:54:03 +01:00
One should note that @code{ time_ out/3} is not reentrant, that is, a goal
called from @code{ time_ out} should never itself call
@code{ time_ out/3} . Moreover, @code{ time_ out/3} will deactivate any previous
alarms set by @code{ alarm/3} and vice-versa, hence only one of these
calls should be used in a program.
Last, even though the timer is set in milliseconds, the current
implementation relies on @t{ alarm/3} , and therefore can only offer
precision on the scale of seconds.
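As an illustrative sketch, the following goal terminates well within
its one-second limit and therefore unifies @var{ Result} with
@t{ success} :
@example
?- time_ out(append(X, Y, [1,2,3]), 1000, Result).
Result = success,
X = [],
Y = [1,2,3]
@end example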
@end table
@node Trees, UGraphs, Timeout, Library
@section Updatable Binary Trees
@cindex updatable tree
The following tree manipulation routines are available once
included with the @code{ use_ module(library(trees))} command.
@table @code
@item get_ label(+@var{ Index} , +@var{ Tree} , ?@var{ Label} )
@findex get_ label/3
@syindex get_ label/3
@cnindex get_ label/3
Treats the tree as an array of @var{ N} elements and returns the
@var{ Index} -th.
@item list_ to_ tree(+@var{ List} , -@var{ Tree} )
@findex list_ to_ tree/2
@syindex list_ to_ tree/2
@cnindex list_ to_ tree/2
Takes a given @var{ List} of @var{ N} elements and constructs a binary
@var{ Tree} .
@item map_ tree(+@var{ Pred} , +@var{ OldTree} , -@var{ NewTree} )
@findex map_ tree/3
@syindex map_ tree/3
@cnindex map_ tree/3
Holds when @var{ OldTree} and @var{ NewTree} are binary trees of the same shape
and @code{ Pred(Old,New)} is true for corresponding elements of the two trees.
@item put_ label(+@var{ Index} , +@var{ OldTree} , +@var{ Label} , -@var{ NewTree} )
@findex put_ label/4
@syindex put_ label/4
@cnindex put_ label/4
Constructs a new tree with the same shape as the old one, and with the
same elements, except that the @var{ Index} -th one is @var{ Label} .
@item tree_ size(+@var{ Tree} , -@var{ Size} )
@findex tree_ size/2
@syindex tree_ size/2
@cnindex tree_ size/2
Calculates the number of elements in the @var{ Tree} .
@item tree_ to_ list(+@var{ Tree} , -@var{ List} )
@findex tree_ to_ list/2
@syindex tree_ to_ list/2
@cnindex tree_ to_ list/2
The converse operation to @code{ list_ to_ tree/2} .
@end table
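As an illustrative sketch of how these predicates fit together
(assuming 1-based indices):
@example
?- list_ to_ tree([a,b,c,d,e], T), get_ label(3, T, L).
L = c
?- list_ to_ tree([a,b,c], T0), put_ label(2, T0, x, T),
   tree_ to_ list(T, L).
L = [a,x,c]
@end example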
2006-04-10 20:24:52 +01:00
@node UGraphs, DGraphs, Trees, Library
2001-04-09 20:54:03 +01:00
@section Unweighted Graphs
@cindex unweighted graphs
2006-04-10 20:24:52 +01:00
The following graph manipulation routines are based on code originally
written by Richard O'Keefe. The code was then extended to be compatible
with the SICStus Prolog ugraphs library. The routines assume directed
graphs, undirected graphs may be implemented by using two edges. Graphs
are represented in one of two ways:
@itemize @bullet
@item The P-representation of a graph is a list of (from-to) vertex
pairs, where the pairs can be in any old order. This form is
convenient for input/output.
2002-10-11 04:39:11 +01:00
@item The S-representation of a graph is a list of (vertex-neighbors)
pairs, where the pairs are in standard order (as produced by keysort)
and the neighbors of each vertex are also in standard order (as
produced by sort). This form is convenient for many calculations.
@end itemize
2006-02-08 19:13:11 +00:00
These built-ins are available once included with the
@code{ use_ module(library(ugraphs))} command.
@table @code
@item vertices_ edges_ to_ ugraph(+@var{ Vertices} , +@var{ Edges} , -@var{ Graph} )
@findex vertices_ edges_ to_ ugraph/3
@syindex vertices_ edges_ to_ ugraph/3
@cnindex vertices_ edges_ to_ ugraph/3
Given a graph with a set of vertices @var{ Vertices} and a set of edges
@var{ Edges} , @var{ Graph} must unify with the corresponding
s-representation. Note that the vertices without edges will appear in
@var{ Vertices} but not in @var{ Edges} . Moreover, it is sufficient for a
vertex to appear in @var{ Edges} .
2014-04-21 11:14:18 +01:00
@pl_ example
2001-04-16 17:41:04 +01:00
?- vertices_ edges_ to_ ugraph([],[1-3,2-4,4-5,1-5],L).
2001-04-09 20:54:03 +01:00
L = [1-[3,5],2-[4],3-[],4-[5],5-[]] ?
2014-04-21 11:14:18 +01:00
@end pl_ example
2001-04-09 20:54:03 +01:00
In this case all edges are defined implicitly. The next example shows
three unconnected edges:
2014-04-21 11:14:18 +01:00
@pl_ example
2001-04-16 17:41:04 +01:00
?- vertices_ edges_ to_ ugraph([6,7,8],[1-3,2-4,4-5,1-5],L).
2001-04-09 20:54:03 +01:00
L = [1-[3,5],2-[4],3-[],4-[5],5-[],6-[],7-[],8-[]] ?
2014-04-21 11:14:18 +01:00
@end pl_ example
2001-04-09 20:54:03 +01:00
@item vertices(+@var{ Graph} , -@var{ Vertices} )
@findex vertices/2
@syindex vertices/2
@cnindex vertices/2
Unify @var{ Vertices} with all vertices appearing in graph
@var{ Graph} . In the next example:
2014-04-21 11:14:18 +01:00
@pl_ example
2001-04-16 17:41:04 +01:00
?- vertices([1-[3,5],2-[4],3-[],4-[5],5-[]], V).
V = [1,2,3,4,5]
2014-04-21 11:14:18 +01:00
@end pl_ example
2001-04-09 20:54:03 +01:00
@item edges(+@var{ Graph} , -@var{ Edges} )
2006-04-10 20:24:52 +01:00
@findex edges/2
@syindex edges/2
@cnindex edges/2
2001-04-09 20:54:03 +01:00
Unify @var{ Edges} with all edges appearing in graph
@var{ Graph} . In the next example:
2014-04-21 11:14:18 +01:00
@pl_ example
?- edges([1-[3,5],2-[4],3-[],4-[5],5-[]], Edges).
Edges = [1-3,1-5,2-4,4-5]
2014-04-21 11:14:18 +01:00
@end pl_ example
2001-04-09 20:54:03 +01:00
2001-04-26 15:44:43 +01:00
@item add_ vertices(+@var{ Graph} , +@var{ Vertices} , -@var{ NewGraph} )
2001-04-09 20:54:03 +01:00
@findex add_ vertices/3
@syindex add_ vertices/3
@cnindex add_ vertices/3
Unify @var{ NewGraph} with a new graph obtained by adding the list of
vertices @var{ Vertices} to the graph @var{ Graph} . In the next example:
2014-04-21 11:14:18 +01:00
@pl_ example
2001-04-26 15:44:43 +01:00
?- add_ vertices([1-[3,5],2-[4],3-[],4-[5],
5-[],6-[],7-[],8-[]],
[0,2,9,10,11],
NG).
2001-04-09 20:54:03 +01:00
2001-04-16 17:41:04 +01:00
NG = [0-[],1-[3,5],2-[4],3-[],4-[5],5-[],
6-[],7-[],8-[],9-[],10-[],11-[]]
2014-04-21 11:14:18 +01:00
@end pl_ example
2001-04-09 20:54:03 +01:00
2007-12-05 12:17:25 +00:00
@item del_ vertices(+@var{ Graph} , +@var{ Vertices} , -@var{ NewGraph} )
2001-04-09 20:54:03 +01:00
@findex del_ vertices/3
@syindex del_ vertices/3
@cnindex del_ vertices/3
Unify @var{ NewGraph} with a new graph obtained by deleting the list of
vertices @var{ Vertices} and all the edges that start from or go to a
vertex in @var{ Vertices} from the graph @var{ Graph} . In the next example:
2014-04-21 11:14:18 +01:00
@pl_ example
2001-04-16 17:41:04 +01:00
?- del_ vertices([2,1],[1-[3,5],2-[4],3-[],
4-[5],5-[],6-[],7-[2,6],8-[]],NL).
2001-04-09 20:54:03 +01:00
NL = [3-[],4-[5],5-[],6-[],7-[6],8-[]]
2014-04-21 11:14:18 +01:00
@end pl_ example
2001-04-09 20:54:03 +01:00
@item add_ edges(+@var{ Graph} , +@var{ Edges} , -@var{ NewGraph} )
@findex add_ edges/3
@syindex add_ edges/3
@cnindex add_ edges/3
Unify @var{ NewGraph} with a new graph obtained by adding the list of
edges @var{ Edges} to the graph @var{ Graph} . In the next example:
2014-04-21 11:14:18 +01:00
@pl_ example
2001-04-16 17:41:04 +01:00
?- add_ edges([1-[3,5],2-[4],3-[],4-[5],5-[],6-[],
7-[],8-[]],[1-6,2-3,3-2,5-7,3-2,4-5],NL).
2001-04-09 20:54:03 +01:00
NL = [1-[3,5,6],2-[3,4],3-[2],4-[5],5-[7],6-[],7-[],8-[]]
2014-04-21 11:14:18 +01:00
@end pl_ example
2001-04-09 20:54:03 +01:00
2006-04-10 20:24:52 +01:00
@item del_ edges(+@var{ Graph} , +@var{ Edges} , -@var{ NewGraph} )
@findex del_ edges/3
@syindex del_ edges/3
@cnindex del_ edges/3
2001-04-09 20:54:03 +01:00
Unify @var{ NewGraph} with a new graph obtained by removing the list of
edges @var{ Edges} from the graph @var{ Graph} . Notice that no vertices
are deleted. In the next example:
2014-04-21 11:14:18 +01:00
@pl_ example
2001-04-16 17:41:04 +01:00
?- del_ edges([1-[3,5],2-[4],3-[],4-[5],5-[],
6-[],7-[],8-[]],
[1-6,2-3,3-2,5-7,3-2,4-5,1-3],NL).
2001-04-09 20:54:03 +01:00
NL = [1-[5],2-[4],3-[],4-[],5-[],6-[],7-[],8-[]]
2014-04-21 11:14:18 +01:00
@end pl_ example
2001-04-09 20:54:03 +01:00
@item transpose(+@var{ Graph} , -@var{ NewGraph} )
@findex transpose/3
@syindex transpose/3
@cnindex transpose/3
Unify @var{ NewGraph} with a new graph obtained from @var{ Graph} by
replacing all edges of the form @var{ V1-V2} by edges of the form
@var{ V2-V1} . The cost is @code{ O(|V|^ 2)} . In the next example:
2014-04-21 11:14:18 +01:00
@pl_ example
2001-04-16 17:41:04 +01:00
?- transpose([1-[3,5],2-[4],3-[],
4-[5],5-[],6-[],7-[],8-[]], NL).
2001-04-09 20:54:03 +01:00
NL = [1-[],2-[],3-[1],4-[2],5-[1,4],6-[],7-[],8-[]]
2014-04-21 11:14:18 +01:00
@end pl_ example
2001-04-09 20:54:03 +01:00
Notice that an undirected graph is its own transpose.
@item neighbors(+@var{ Vertex} , +@var{ Graph} , -@var{ Vertices} )
@findex neighbors/3
@syindex neighbors/3
@cnindex neighbors/3
Unify @var{ Vertices} with the list of neighbors of vertex @var{ Vertex}
in @var{ Graph} . If the vertex is not in the graph, fail. In the next
example:
2014-04-21 11:14:18 +01:00
@pl_ example
2001-04-16 17:41:04 +01:00
?- neighbors(4,[1-[3,5],2-[4],3-[],
4-[1,2,7,5],5-[],6-[],7-[],8-[]],
NL).
2001-04-09 20:54:03 +01:00
NL = [1,2,7,5]
2014-04-21 11:14:18 +01:00
@end pl_ example
2001-04-09 20:54:03 +01:00
@item neighbours(+@var{ Vertex} , +@var{ Graph} , -@var{ Vertices} )
@findex neighbours/3
@syindex neighbours/3
@cnindex neighbours/3
Unify @var{ Vertices} with the list of neighbours of vertex @var{ Vertex}
in @var{ Graph} . In the next example:
2014-04-21 11:14:18 +01:00
@pl_ example
2001-04-16 17:41:04 +01:00
?- neighbours(4,[1-[3,5],2-[4],3-[],
4-[1,2,7,5],5-[],6-[],7-[],8-[]], NL).
2001-04-09 20:54:03 +01:00
NL = [1,2,7,5]
2014-04-21 11:14:18 +01:00
@end pl_ example
2001-04-09 20:54:03 +01:00
@item complement(+@var{ Graph} , -@var{ NewGraph} )
@findex complement/2
@syindex complement/2
@cnindex complement/2
2002-10-11 04:39:11 +01:00
Unify @var{ NewGraph} with the graph complementary to @var{ Graph} .
2001-04-09 20:54:03 +01:00
In the next example:
2014-04-21 11:14:18 +01:00
@pl_ example
2001-04-16 17:41:04 +01:00
?- complement([1-[3,5],2-[4],3-[],
4-[1,2,7,5],5-[],6-[],7-[],8-[]], NL).
2001-04-09 20:54:03 +01:00
2001-04-16 17:41:04 +01:00
NL = [1-[2,4,6,7,8],2-[1,3,5,6,7,8],3-[1,2,4,5,6,7,8],
4-[3,5,6,8],5-[1,2,3,4,6,7,8],6-[1,2,3,4,5,7,8],
7-[1,2,3,4,5,6,8],8-[1,2,3,4,5,6,7]]
2014-04-21 11:14:18 +01:00
@end pl_ example
2001-04-09 20:54:03 +01:00
@item compose(+@var{ LeftGraph} , +@var{ RightGraph} , -@var{ NewGraph} )
@findex compose/3
@syindex compose/3
@cnindex compose/3
Compose the graphs @var{ LeftGraph} and @var{ RightGraph} to form @var{ NewGraph} .
In the next example:
2014-04-21 11:14:18 +01:00
@pl_ example
2001-04-16 17:41:04 +01:00
?- compose([1-[2],2-[3]],[2-[4],3-[1,2,4]],L).
2001-04-09 20:54:03 +01:00
L = [1-[4],2-[1,2,4],3-[]]
2014-04-21 11:14:18 +01:00
@end pl_ example
2001-04-09 20:54:03 +01:00
2002-10-11 04:39:11 +01:00
@item top_ sort(+@var{ Graph} , -@var{ Sort} )
2001-04-09 20:54:03 +01:00
@findex top_ sort/2
@syindex top_ sort/2
@cnindex top_ sort/2
Generate the set of nodes @var{ Sort} as a topological sorting of graph
@var{ Graph} , if one is possible.
In the next example we show how topological sorting works for a linear graph:
2014-04-21 11:14:18 +01:00
@pl_ example
2001-04-16 17:41:04 +01:00
?- top_ sort([_ 138-[_ 219],_ 219-[_ 139], _ 139-[]],L).
2001-04-09 20:54:03 +01:00
L = [_ 138,_ 219,_ 139]
2014-04-21 11:14:18 +01:00
@end pl_ example
2001-04-09 20:54:03 +01:00
2005-08-17 14:35:52 +01:00
@item top_ sort(+@var{ Graph} , -@var{ Sort0} , -@var{ Sort} )
@findex top_ sort/3
@syindex top_ sort/3
@cnindex top_ sort/3
Generate the difference list @var{ Sort} -@var{ Sort0} as a topological
sorting of graph @var{ Graph} , if one is possible.
2001-04-09 20:54:03 +01:00
@item transitive_ closure(+@var{ Graph} , +@var{ Closure} )
@findex transitive_ closure/2
@syindex transitive_ closure/2
@cnindex transitive_ closure/2
Generate the graph @var{ Closure} as the transitive closure of graph
@var{ Graph} .
In the next example:
2014-04-21 11:14:18 +01:00
@pl_ example
2001-04-16 17:41:04 +01:00
?- transitive_ closure([1-[2,3],2-[4,5],4-[6]],L).
2001-04-09 20:54:03 +01:00
L = [1-[2,3,4,5,6],2-[4,5,6],4-[6]]
2014-04-21 11:14:18 +01:00
@end pl_ example
2001-04-09 20:54:03 +01:00
2002-01-27 21:35:39 +00:00
@item reachable(+@var{ Node} , +@var{ Graph} , -@var{ Vertices} )
@findex reachable/3
@syindex reachable/3
@cnindex reachable/3
Unify @var{ Vertices} with the set of all vertices in graph
2002-02-04 16:12:54 +00:00
@var{ Graph} that are reachable from @var{ Node} . In the next example:
2014-04-21 11:14:18 +01:00
@pl_ example
2002-01-27 21:35:39 +00:00
?- reachable(1,[1-[3,5],2-[4],3-[],4-[5],5-[]],V).
V = [1,3,5]
2014-04-21 11:14:18 +01:00
@end pl_ example
2002-01-27 21:35:39 +00:00
2001-04-09 20:54:03 +01:00
@end table
2006-04-10 20:24:52 +01:00
@node DGraphs, UnDGraphs, UGraphs, Library
@section Directed Graphs
@cindex Efficient Directed Graphs
The following graph manipulation routines use the red-black tree library
to try to avoid linear-time scans of the graph for all graph
operations. Graphs are represented as a red-black tree, where the key is
the vertex, and the associated value is a list of vertices reachable
from that vertex through an edge (ie, a list of edges).
@table @code
@item dgraph_ new(+@var{ Graph} )
@findex dgraph_ new/1
@snindex dgraph_ new/1
@cnindex dgraph_ new/1
Create a new directed graph. This operation must be performed before
trying to use the graph.
@item dgraph_ vertices(+@var{ Graph} , -@var{ Vertices} )
@findex dgraph_ vertices/2
@snindex dgraph_ vertices/2
@cnindex dgraph_ vertices/2
Unify @var{ Vertices} with all vertices appearing in graph
@var{ Graph} .
2006-04-20 16:28:08 +01:00
@item dgraph_ edge(+@var{ N1} , +@var{ N2} , +@var{ Graph} )
@findex dgraph_ edge/2
@snindex dgraph_ edge/2
@cnindex dgraph_ edge/2
Edge @var{ N1} -@var{ N2} is an edge in directed graph @var{ Graph} .
2006-04-10 20:24:52 +01:00
@item dgraph_ edges(+@var{ Graph} , -@var{ Edges} )
@findex dgraph_ edges/2
@snindex dgraph_ edges/2
@cnindex dgraph_ edges/2
Unify @var{ Edges} with all edges appearing in graph
@var{ Graph} .
2008-06-26 14:09:15 +01:00
@item dgraph_ add_ vertex(+@var{ Graph} , +@var{ Vertex} , -@var{ NewGraph} )
@findex dgraph_ add_ vertex/3
@snindex dgraph_ add_ vertex/3
@cnindex dgraph_ add_ vertex/3
Unify @var{ NewGraph} with a new graph obtained by adding
vertex @var{ Vertex} to the graph @var{ Graph} .
2006-04-10 20:24:52 +01:00
@item dgraph_ add_ vertices(+@var{ Graph} , +@var{ Vertices} , -@var{ NewGraph} )
@findex dgraph_ add_ vertices/3
@snindex dgraph_ add_ vertices/3
@cnindex dgraph_ add_ vertices/3
Unify @var{ NewGraph} with a new graph obtained by adding the list of
vertices @var{ Vertices} to the graph @var{ Graph} .
2008-06-26 14:09:15 +01:00
@item dgraph_ del_ vertex(+@var{ Graph} , +@var{ Vertex} , -@var{ NewGraph} )
@findex dgraph_ del_ vertex/3
@syindex dgraph_ del_ vertex/3
@cnindex dgraph_ del_ vertex/3
Unify @var{ NewGraph} with a new graph obtained by deleting vertex
@var{ Vertex} and all the edges that start from or go to @var{ Vertex} from
the graph @var{ Graph} .
@item dgraph_ del_ vertices(+@var{ Graph} , +@var{ Vertices} , -@var{ NewGraph} )
2006-04-10 20:24:52 +01:00
@findex dgraph_ del_ vertices/3
@syindex dgraph_ del_ vertices/3
@cnindex dgraph_ del_ vertices/3
Unify @var{ NewGraph} with a new graph obtained by deleting the list of
vertices @var{ Vertices} and all the edges that start from or go to a
vertex in @var{ Vertices} from the graph @var{ Graph} .
2008-06-26 14:09:15 +01:00
@item dgraph_ add_ edge(+@var{ Graph} , +@var{ N1} , +@var{ N2} , -@var{ NewGraph} )
@findex dgraph_ add_ edge/4
@snindex dgraph_ add_ edge/4
@cnindex dgraph_ add_ edge/4
Unify @var{ NewGraph} with a new graph obtained by adding the edge
@var{ N1} -@var{ N2} to the graph @var{ Graph} .
2006-04-10 20:24:52 +01:00
@item dgraph_ add_ edges(+@var{ Graph} , +@var{ Edges} , -@var{ NewGraph} )
@findex dgraph_ add_ edges/3
@snindex dgraph_ add_ edges/3
@cnindex dgraph_ add_ edges/3
Unify @var{ NewGraph} with a new graph obtained by adding the list of
edges @var{ Edges} to the graph @var{ Graph} .
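As an illustrative sketch, a small graph might be built and queried as
follows:
@example
?- dgraph_ new(G0),
   dgraph_ add_ edges(G0, [a-b,b-c,a-c], G),
   dgraph_ top_ sort(G, L).
L = [a,b,c]
@end example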
2008-06-26 14:09:15 +01:00
@item dgraph_ del_ edge(+@var{ Graph} , +@var{ N1} , +@var{ N2} , -@var{ NewGraph} )
@findex dgraph_ del_ edge/4
@snindex dgraph_ del_ edge/4
@cnindex dgraph_ del_ edge/4
Succeeds if @var{ NewGraph} unifies with a new graph obtained by
removing the edge @var{ N1} -@var{ N2} from the graph @var{ Graph} . Notice
that no vertices are deleted.
2006-04-10 20:24:52 +01:00
@item dgraph_ del_ edges(+@var{ Graph} , +@var{ Edges} , -@var{ NewGraph} )
@findex dgraph_ del_ edges/3
@snindex dgraph_ del_ edges/3
@cnindex dgraph_ del_ edges/3
Unify @var{ NewGraph} with a new graph obtained by removing the list of
edges @var{ Edges} from the graph @var{ Graph} . Notice that no vertices
are deleted.
2008-06-26 14:09:15 +01:00
@item dgraph_ to_ ugraph(+@var{ Graph} , -@var{ UGraph} )
@findex dgraph_ to_ ugraph/2
@snindex dgraph_ to_ ugraph/2
@cnindex dgraph_ to_ ugraph/2
Unify @var{ UGraph} with the representation used by the @var{ ugraphs}
unweighted graphs library, that is, a list of the form
@var{ V-Neighbors} , where @var{ V} is a node and @var{ Neighbors} are the node's
children.
@item ugraph_ to_ dgraph( +@var{ UGraph} , -@var{ Graph} )
@findex ugraph_ to_ dgraph/2
@snindex ugraph_ to_ dgraph/2
@cnindex ugraph_ to_ dgraph/2
Unify @var{ Graph} with the directed graph obtained from @var{ UGraph} ,
represented in the form used in the @var{ ugraphs} unweighted graphs
library.
2006-04-10 20:24:52 +01:00
@item dgraph_ neighbors(+@var{ Vertex} , +@var{ Graph} , -@var{ Vertices} )
@findex dgraph_ neighbors/3
@snindex dgraph_ neighbors/3
@cnindex dgraph_ neighbors/3
Unify @var{ Vertices} with the list of neighbors of vertex @var{ Vertex}
in @var{ Graph} . If the vertex is not in the graph, fail.
@item dgraph_ neighbours(+@var{ Vertex} , +@var{ Graph} , -@var{ Vertices} )
@findex dgraph_ neighbours/3
@snindex dgraph_ neighbours/3
@cnindex dgraph_ neighbours/3
Unify @var{ Vertices} with the list of neighbours of vertex @var{ Vertex}
in @var{ Graph} .
@item dgraph_ complement(+@var{ Graph} , -@var{ NewGraph} )
@findex dgraph_ complement/2
@snindex dgraph_ complement/2
@cnindex dgraph_ complement/2
Unify @var{ NewGraph} with the graph complementary to @var{ Graph} .
@item dgraph_ transpose(+@var{ Graph} , -@var{ Transpose} )
@findex dgraph_ transpose/2
@snindex dgraph_ transpose/2
@cnindex dgraph_ transpose/2
Unify @var{ NewGraph} with a new graph obtained from @var{ Graph} by
replacing all edges of the form @var{ V1-V2} by edges of the form
@var{ V2-V1} .
2008-06-26 14:09:15 +01:00
@item dgraph_ compose(+@var{ Graph1} , +@var{ Graph2} , -@var{ ComposedGraph} )
2006-04-10 20:24:52 +01:00
@findex dgraph_ compose/3
@snindex dgraph_ compose/3
@cnindex dgraph_ compose/3
Unify @var{ ComposedGraph} with a new graph obtained by composing
@var{ Graph1} and @var{ Graph2} , ie, @var{ ComposedGraph} has an edge
@var{ V1-V2} iff there is a @var{ V} such that @var{ V1-V} in @var{ Graph1}
and @var{ V-V2} in @var{ Graph2} .
@item dgraph_ transitive_ closure(+@var{ Graph} , -@var{ Closure} )
@findex dgraph_ transitive_ closure/2
@snindex dgraph_ transitive_ closure/2
@cnindex dgraph_ transitive_ closure/2
Unify @var{ Closure} with the transitive closure of graph @var{ Graph} .
@item dgraph_ symmetric_ closure(+@var{ Graph} , -@var{ Closure} )
@findex dgraph_ symmetric_ closure/2
@snindex dgraph_ symmetric_ closure/2
@cnindex dgraph_ symmetric_ closure/2
Unify @var{ Closure} with the symmetric closure of graph @var{ Graph} ,
that is, if @var{ Closure} contains an edge @var{ U-V} it must also
contain the edge @var{ V-U} .
@item dgraph_ top_ sort(+@var{ Graph} , -@var{ Vertices} )
@findex dgraph_ top_ sort/2
@snindex dgraph_ top_ sort/2
@cnindex dgraph_ top_ sort/2
Unify @var{ Vertices} with the topological sort of graph @var{ Graph} .
2008-06-26 14:09:15 +01:00
@item dgraph_ top_ sort(+@var{ Graph} , -@var{ Vertices} , ?@var{ Vertices0} )
@findex dgraph_ top_ sort/3
@snindex dgraph_ top_ sort/3
@cnindex dgraph_ top_ sort/3
Unify the difference list @var{ Vertices} -@var{ Vertices0} with the
topological sort of graph @var{ Graph} .
@item dgraph_ min_ path(+@var{ V1} , +@var{ V2} , +@var{ Graph} , -@var{ Path} , ?@var{ Cost} )
@findex dgraph_ min_ path/5
@snindex dgraph_ min_ path/5
@cnindex dgraph_ min_ path/5
Unify the list @var{ Path} with the minimal cost path between nodes
@var{ V1} and @var{ V2} in graph @var{ Graph} . Path @var{ Path} has cost
@var{ Cost} .
@item dgraph_ max_ path(+@var{ V1} , +@var{ V2} , +@var{ Graph} , -@var{ Path} , ?@var{ Cost} )
@findex dgraph_ max_ path/5
@snindex dgraph_ max_ path/5
@cnindex dgraph_ max_ path/5
Unify the list @var{ Path} with the maximal cost path between nodes
@var{ V1} and @var{ V2} in graph @var{ Graph} . Path @var{ Path} has cost
@var{ Cost} .
@item dgraph_ min_ paths(+@var{ V1} , +@var{ Graph} , -@var{ Paths} )
@findex dgraph_ min_ paths/3
@snindex dgraph_ min_ paths/3
@cnindex dgraph_ min_ paths/3
Unify the list @var{ Paths} with the minimal cost paths from node
@var{ V1} to the nodes in graph @var{ Graph} .
@item dgraph_ isomorphic(+@var{ Vs} , +@var{ NewVs} , +@var{ G0} , -@var{ GF} )
@findex dgraph_ isomorphic/4
@snindex dgraph_ isomorphic/4
@cnindex dgraph_ isomorphic/4
Unify the list @var{ GF} with the graph isomorphic to @var{ G0} where
vertices in @var{ Vs} map to vertices in @var{ NewVs} .
@item dgraph_ path(+@var{ Vertex} , +@var{ Graph} , ?@var{ Path} )
@findex dgraph_ path/3
@snindex dgraph_ path/3
@cnindex dgraph_ path/3
The path @var{ Path} is a path starting at vertex @var{ Vertex} in graph
@var{ Graph} .
2013-03-24 14:12:55 +00:00
@item dgraph_ path(+@var{ Vertex} , +@var{ Vertex1} , +@var{ Graph} , ?@var{ Path} )
2014-04-21 11:14:18 +01:00
@findex dgraph_ path/4
@snindex dgraph_ path/4
@cnindex dgraph_ path/4
2013-03-24 14:12:55 +00:00
The path @var{ Path} is a path starting at vertex @var{ Vertex} in graph
@var{ Graph} and ending at vertex @var{ Vertex1} .
2008-06-26 14:09:15 +01:00
@item dgraph_ reachable(+@var{ Vertex} , +@var{ Graph} , ?@var{ Edges} )
2014-04-21 11:14:18 +01:00
@findex dgraph_ reachable/3
@snindex dgraph_ reachable/3
@cnindex dgraph_ reachable/3
2008-06-26 14:09:15 +01:00
@var{ Edges} are the edges reachable from vertex @var{ Vertex} in graph
@var{ Graph} .
2006-04-20 16:28:08 +01:00
2012-03-22 21:41:48 +00:00
@item dgraph_ leaves(+@var{ Graph} , ?@var{ Vertices} )
@findex dgraph_ leaves/2
@snindex dgraph_ leaves/2
@cnindex dgraph_ leaves/2
The vertices @var{ Vertices} have no outgoing edge in graph
@var{ Graph} .
2006-04-10 20:24:52 +01:00
@end table
2012-07-16 16:19:15 +01:00
@node UnDGraphs, DBUsage, DGraphs, Library
2006-04-10 20:24:52 +01:00
@section Undirected Graphs
2009-04-25 16:59:23 +01:00
@cindex undirected graphs
2006-04-10 20:24:52 +01:00
The following graph manipulation routines use the red-black tree graph
library to implement undirected graphs. Mostly, this is done by having
two directed edges per undirected edge.
@table @code
@item undgraph_ new(+@var{ Graph} )
@findex undgraph_ new/1
@snindex undgraph_ new/1
@cnindex undgraph_ new/1
Create a new undirected graph. This operation must be performed before
trying to use the graph.
@item undgraph_ vertices(+@var{ Graph} , -@var{ Vertices} )
@findex undgraph_ vertices/2
@snindex undgraph_ vertices/2
@cnindex undgraph_ vertices/2
Unify @var{ Vertices} with all vertices appearing in graph
@var{ Graph} .
2006-04-20 16:28:08 +01:00
@item undgraph_ edge(+@var{ N1} , +@var{ N2} , +@var{ Graph} )
@findex undgraph_ edge/2
@snindex undgraph_ edge/2
@cnindex undgraph_ edge/2
Edge @var{ N1} -@var{ N2} is an edge in undirected graph @var{ Graph} .
2006-04-10 20:24:52 +01:00
@item undgraph_ edges(+@var{ Graph} , -@var{ Edges} )
@findex undgraph_ edges/2
@snindex undgraph_ edges/2
@cnindex undgraph_ edges/2
Unify @var{ Edges} with all edges appearing in graph
@var{ Graph} .
@item undgraph_ add_ vertices(+@var{ Graph} , +@var{ Vertices} , -@var{ NewGraph} )
@findex undgraph_ add_ vertices/3
@snindex undgraph_ add_ vertices/3
@cnindex undgraph_ add_ vertices/3
Unify @var{ NewGraph} with a new graph obtained by adding the list of
vertices @var{ Vertices} to the graph @var{ Graph} .
2007-12-05 12:17:25 +00:00
@item undgraph_ del_ vertices(+@var{ Graph} , +@var{ Vertices} , -@var{ NewGraph} )
2006-04-10 20:24:52 +01:00
@findex undgraph_ del_ vertices/3
@syindex undgraph_ del_ vertices/3
@cnindex undgraph_ del_ vertices/3
Unify @var{ NewGraph} with a new graph obtained by deleting the list of
vertices @var{ Vertices} and all the edges that start from or go to a
vertex in @var{ Vertices} from the graph @var{ Graph} .
@item undgraph_ add_ edges(+@var{ Graph} , +@var{ Edges} , -@var{ NewGraph} )
@findex undgraph_ add_ edges/3
@snindex undgraph_ add_ edges/3
@cnindex undgraph_ add_ edges/3
Unify @var{ NewGraph} with a new graph obtained by adding the list of
edges @var{ Edges} to the graph @var{ Graph} .
@item undgraph_ del_ edges(+@var{ Graph} , +@var{ Edges} , -@var{ NewGraph} )
@findex undgraph_ del_ edges/3
@snindex undgraph_ del_ edges/3
@cnindex undgraph_ del_ edges/3
Unify @var{ NewGraph} with a new graph obtained by removing the list of
edges @var{ Edges} from the graph @var{ Graph} . Notice that no vertices
are deleted.
@item undgraph_ neighbors(+@var{ Vertex} , +@var{ Graph} , -@var{ Vertices} )
@findex undgraph_ neighbors/3
@snindex undgraph_ neighbors/3
@cnindex undgraph_ neighbors/3
Unify @var{ Vertices} with the list of neighbors of vertex @var{ Vertex}
in @var{ Graph} . If the vertex is not in the graph, fail.
@item undgraph_ neighbours(+@var{ Vertex} , +@var{ Graph} , -@var{ Vertices} )
@findex undgraph_ neighbours/3
@snindex undgraph_ neighbours/3
@cnindex undgraph_ neighbours/3
Unify @var{ Vertices} with the list of neighbours of vertex @var{ Vertex}
in @var{ Graph} .
@item undgraph_ complement(+@var{ Graph} , -@var{ NewGraph} )
@findex undgraph_ complement/2
@snindex undgraph_ complement/2
@cnindex undgraph_ complement/2
Unify @var{ NewGraph} with the graph complementary to @var{ Graph} .
2006-04-20 16:28:08 +01:00
@item dgraph_ to_ undgraph( +@var{ DGraph} , -@var{ UndGraph} )
@findex dgraph_ to_ undgraph/2
@snindex dgraph_ to_ undgraph/2
@cnindex dgraph_ to_ undgraph/2
2009-04-25 16:59:23 +01:00
Unify @var{ UndGraph} with the undirected graph obtained from the
directed graph @var{ DGraph} .
2006-04-10 20:24:52 +01:00
@end table
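As an illustrative sketch:
@example
?- undgraph_ new(G0),
   undgraph_ add_ edges(G0, [a-b,b-c], G),
   undgraph_ neighbours(b, G, Ns).
Ns = [a,c]
@end example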
2012-07-16 16:19:15 +01:00
@node DBUsage, Lambda, UnDGraphs, Library
@section Memory Usage in Prolog Data-Base
@cindex DBUsage
This library provides a set of utilities for studying memory usage in YAP.
The following routines are available once included with the
@code{ use_ module(library(dbusage))} command.
@table @code
@item db_ usage
@findex db_ usage/0
@snindex db_ usage/0
@cnindex db_ usage/0
Give a general overview of data-base usage in the system.
@item db_ static
@findex db_ static/0
@snindex db_ static/0
@cnindex db_ static/0
List memory usage for every static predicate.
@item db_ static(+@var{ Threshold} )
2014-04-21 11:14:18 +01:00
@findex db_ static/1
@snindex db_ static/1
@cnindex db_ static/1
2012-07-16 16:19:15 +01:00
List memory usage for every static predicate that uses more than
@var{ Threshold} bytes.
@item db_ dynamic
@findex db_ dynamic/0
@snindex db_ dynamic/0
@cnindex db_ dynamic/0
List memory usage for every dynamic predicate.
@item db_ dynamic(+@var{ Threshold} )
2014-04-21 11:14:18 +01:00
@findex db_ dynamic/1
@snindex db_ dynamic/1
@cnindex db_ dynamic/1
2012-07-16 16:19:15 +01:00
List memory usage for every dynamic predicate that uses more than
@var{ Threshold} bytes.
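For example, to list only the dynamic predicates that use more than
50 KB one might call:
@example
?- db_ dynamic(50000).
@end example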
@end table
@node Lambda, LAM, DBUsage, Library
2010-08-04 23:26:50 +01:00
@section Lambda Expressions
@cindex Lambda Expressions
This library, designed and implemented by Ulrich Neumerkel, provides
lambda expressions to simplify higher order programming based on @code{ call/N} .
Lambda expressions are represented by ordinary Prolog terms. There are
two kinds of lambda expressions:
2014-04-21 11:14:18 +01:00
@pl_ example
2010-08-04 23:26:50 +01:00
Free+\X 1^ X2^ ..^ XN^ Goal
\X 1^ X2^ ..^ XN^ Goal
2014-04-21 11:14:18 +01:00
@end pl_ example
2010-08-04 23:26:50 +01:00
The second is a shorthand for @code{ t+\X 1^ X2^ ..^ XN^ Goal} , where @code{ Xi} are the parameters.
@var{ Goal} is a goal or continuation (Syntax note: @var{ Operators} within @var{ Goal}
require parentheses due to the low precedence of the @code{ ^ } operator).
Free contains variables that are valid outside the scope of the lambda
expression. They are thus free variables within.
All other variables of @var{ Goal} are considered local variables. They must
not appear outside the lambda expression. This restriction is
currently not checked. Violations may lead to unexpected bindings.
In the following example the parentheses around @code{ X>3} are necessary.
2014-04-21 11:14:18 +01:00
@pl_ example
2010-08-04 23:26:50 +01:00
?- use_ module(library(lambda)).
?- use_ module(library(apply)).
?- maplist(\X ^ (X>3),[4,5,9]).
true.
2014-04-21 11:14:18 +01:00
@end pl_ example
2010-08-04 23:26:50 +01:00
In the following @var{ X} is a variable that is shared by both instances
of the lambda expression. The second query illustrates the cooperation
of continuations and lambdas. The lambda expression is in this case a
continuation expecting a further argument.
2014-04-21 11:14:18 +01:00
@pl_ example
2010-08-04 23:26:50 +01:00
?- Xs = [A,B], maplist(X+\Y ^ dif(X,Y), Xs).
Xs = [A, B],
dif(X, A),
dif(X, B).
?- Xs = [A,B], maplist(X+\dif (X), Xs).
Xs = [A, B],
dif(X, A),
dif(X, B).
2014-04-21 11:14:18 +01:00
@end pl_ example
2010-08-04 23:26:50 +01:00
The following queries are all equivalent. To see this, use
the fact @code{ f(x,y)} .
2014-04-21 11:14:18 +01:00
@pl_ example
2010-08-04 23:26:50 +01:00
?- call(f,A1,A2).
?- call(\X ^ f(X),A1,A2).
2014-04-21 11:14:18 +01:00
?- call(\X ^ Y^ f(X,Y), A1,A2).
2010-08-04 23:26:50 +01:00
?- call(\X ^ (X+\Y ^ f(X,Y)), A1,A2).
?- call(call(f, A1),A2).
?- call(f(A1),A2).
?- f(A1,A2).
A1 = x,
A2 = y.
2014-04-21 11:14:18 +01:00
@end pl_ example
2010-08-04 23:26:50 +01:00
Further discussions
2014-04-21 11:14:18 +01:00
at Ulrich Neumerkel's page in @url{ http://www.complang.tuwien.ac.at/ulrich/Prolog-inedit/ISO-Hiord} .
2010-08-04 23:26:50 +01:00
2013-09-30 15:45:14 +01:00
@node LAM, BDDs, Lambda, Library
2006-06-02 05:23:09 +01:00
@section LAM
@cindex lam
This library provides a set of utilities for interfacing with LAM MPI.
The following routines are available once included with the
@code{ use_ module(library(lam_ mpi))} command. YAP should be
invoked using the LAM @code{ mpiexec} or @code{ mpirun} commands (see the
LAM manual for more details).
@table @code
@item mpi_ init
@findex mpi_ init/0
@snindex mpi_ init/0
@cnindex mpi_ init/0
Sets up the MPI environment. This predicate should be called before any other MPI predicate.
@item mpi_ finalize
@findex mpi_ finalize/0
@snindex mpi_ finalize/0
@cnindex mpi_ finalize/0
Terminates the MPI execution environment. Every process must call this predicate before exiting.
@item mpi_ comm_ size(-@var{ Size} )
@findex mpi_ comm_ size/1
@snindex mpi_ comm_ size/1
@cnindex mpi_ comm_ size/1
Unifies @var{ Size} with the number of processes in the MPI environment.
@item mpi_ comm_ rank(-@var{ Rank} )
@findex mpi_ comm_ rank/1
@snindex mpi_ comm_ rank/1
@cnindex mpi_ comm_ rank/1
Unifies @var{ Rank} with the rank of the current process in the MPI environment.
@item mpi_ version(-@var{ Major} ,-@var{ Minor} )
@findex mpi_ version/2
@snindex mpi_ version/2
@cnindex mpi_ version/2
Unifies @var{ Major} and @var{ Minor} with, respectively, the major and minor version of MPI.
@item mpi_ send(+@var{ Data} ,+@var{ Dest} ,+@var{ Tag} )
@findex mpi_ send/3
@snindex mpi_ send/3
@cnindex mpi_ send/3
Blocking communication predicate. The message in @var{ Data} , with tag
@var{ Tag} , is sent immediately to the processor with rank @var{ Dest} .
The predicate succeeds after the message is sent.
@item mpi_ isend(+@var{ Data} ,+@var{ Dest} ,+@var{ Tag} ,-@var{ Handle} )
@findex mpi_ isend/4
@snindex mpi_ isend/4
@cnindex mpi_ isend/4
Non-blocking communication predicate. The message in @var{ Data} , with
tag @var{ Tag} , is sent whenever possible to the processor with rank
@var{ Dest} . A @var{ Handle} to the message is returned and can be used to
check the status of the message, using the @code{ mpi_ wait} or
@code{ mpi_ test} predicates. Until @code{ mpi_ wait} is called, the
memory allocated for the buffer containing the message is not
released.
@item mpi_ recv(?@var{ Source} ,?@var{ Tag} ,-@var{ Data} )
@findex mpi_ recv/3
@snindex mpi_ recv/3
@cnindex mpi_ recv/3
Blocking communication predicate. The predicate blocks until a message
is received from processor with rank @var{ Source} and tag @var{ Tag} .
The message is placed in @var{ Data} .
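As an illustrative sketch (the predicate name is hypothetical and not
part of the library), a minimal two-process exchange might be written
as follows, with process 0 sending a term to process 1:
@example
mpi_ demo :-
    mpi_ init,
    mpi_ comm_ rank(Rank),
    (   Rank =:= 0
    ->  mpi_ send(hello, 1, 10)
    ;   mpi_ recv(0, 10, Msg),
        format('received ~w~n', [Msg])
    ),
    mpi_ finalize.
@end example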
@item mpi_ irecv(?@var{ Source} ,?@var{ Tag} ,-@var{ Handle} )
@findex mpi_ irecv/3
@snindex mpi_ irecv/3
@cnindex mpi_ irecv/3
Non-blocking communication predicate. The predicate returns an
@var{ Handle} for a message that will be received from processor with
rank @var{ Source} and tag @var{ Tag} . Note that the predicate succeeds
immediately, even if no message has been received. The predicate
@code{ mpi_ wait_ recv} should be used to obtain the data associated to
the handle.
@item mpi_ wait_ recv(?@var{ Handle} ,-@var{ Status} ,-@var{ Data} )
@findex mpi_ wait_ recv/3
@snindex mpi_ wait_ recv/3
@cnindex mpi_ wait_ recv/3
Completes a non-blocking receive operation. The predicate blocks until
a message associated with handle @var{ Handle} is buffered. The
predicate succeeds unifying @var{ Status} with the status of the
message and @var{ Data} with the message itself.
@item mpi_ test_ recv(?@var{ Handle} ,-@var{ Status} ,-@var{ Data} )
@findex mpi_ test_ recv/3
@snindex mpi_ test_ recv/3
@cnindex mpi_ test_ recv/3
Provides information regarding a handle. If the message associated
with handle @var{ Handle} is buffered, then the predicate succeeds
unifying @var{ Status} with the status of the message and @var{ Data}
with the message itself. Otherwise, the predicate fails.
@item mpi_ wait(?@var{ Handle} ,-@var{ Status} )
@findex mpi_ wait/2
@snindex mpi_ wait/2
@cnindex mpi_ wait/2
Completes a non-blocking operation. If the operation was a
@code{ mpi_ send} , the predicate blocks until the message is buffered
or sent by the runtime system. At this point the send buffer is
released. If the operation was a @code{ mpi_ recv} , it waits until the
message is copied to the receive buffer. @var{ Status} is unified with
the status of the message.
@item mpi_ test(?@var{ Handle} ,-@var{ Status} )
@findex mpi_ test/2
@snindex mpi_ test/2
@cnindex mpi_ test/2
Provides information regarding the handle @var{ Handle} , i.e., whether a
communication operation has been completed. If the operation
associated with @var{ Handle} has been completed the predicate succeeds
with the completion status in @var{ Status} ; otherwise it fails.
@item mpi_ barrier
@findex mpi_ barrier/0
@snindex mpi_ barrier/0
@cnindex mpi_ barrier/0
Collective communication predicate. Performs a barrier
synchronization among all processes. Note that a collective
communication means that all processes call the same predicate. To be
able to use a regular @code{ mpi_ recv} to receive the messages, one
should use @code{ mpi_ bcast2} .
@item mpi_ bcast2(+@var{ Root} , ?@var{ Data} )
@findex mpi_ bcast/2
@snindex mpi_ bcast/2
@cnindex mpi_ bcast/2
Broadcasts the message @var{ Data} from the process with rank @var{ Root}
to all other processes.
@item mpi_ bcast3(+@var{ Root} , +@var{ Data} , +@var{ Tag} )
@findex mpi_ bcast/3
@snindex mpi_ bcast/3
@cnindex mpi_ bcast/3
Broadcasts the message @var{ Data} with tag @var{ Tag} from the process with rank @var{ Root}
to all other processes.
@item mpi_ ibcast(+@var{ Root} , +@var{ Data} , +@var{ Tag} )
@findex mpi_ ibcast/3
@snindex mpi_ ibcast/3
@cnindex mpi_ ibcast/3
Non-blocking operation. Broadcasts the message @var{ Data} with tag @var{ Tag}
from the process with rank @var{ Root} to all other processes.
@item mpi_ default_ buffer_ size(-@var{ OldBufferSize} , ?@var{ NewBufferSize} )
@findex mpi_ default_ buffer_ size/1
@snindex mpi_ default_ buffer_ size/1
@cnindex mpi_ default_ buffer_ size/1
Unifies @var{ OldBufferSize} with the current size of the
MPI communication buffer and sets the buffer size to
@var{ NewBufferSize} . The buffer is used for asynchronous waiting and
for broadcast receivers. Notice that the buffer is local to each MPI
process.
@item mpi_ msg_ size(@var{ Msg} , -@var{ MsgSize} )
@findex mpi_ msg_ size/2
@snindex mpi_ msg_ size/2
@cnindex mpi_ msg_ size/2
Unify @var{ MsgSize} with the number of bytes YAP would need to send the
message @var{ Msg} .
@item mpi_ gc
@findex mpi_ gc/0
@snindex mpi_ gc/0
@cnindex mpi_ gc/0
Attempts to perform garbage collection on all the open handles
associated with sends and non-blocking broadcasts. For each handle it
tests whether the message has been delivered; if so, the handle and the buffer
are released.
@end table
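As a minimal sketch of how these predicates fit together (this assumes YAP was
started under LAM with something like @code{ mpiexec -np 2 yap} so that at
least two processes exist; the predicate name @code{ ping_pong/0} is purely
illustrative):
@example
:- use_module(library(lam_mpi)).

ping_pong :-
    mpi_init,
    mpi_comm_rank(Rank),
    (   Rank =:= 0
    ->  mpi_send(msg(hello), 1, 42)    % send to rank 1 with tag 42
    ;   Rank =:= 1
    ->  mpi_recv(0, 42, Data),         % receive from rank 0 with tag 42
        writeln(Data)
    ;   true
    ),
    mpi_barrier,
    mpi_finalize.
@end example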
@node BDDs, Block Diagram, LAM, Library
@section Binary Decision Diagrams and Friends
@cindex BDDs
This library provides an interface to the BDD package CUDD. It requires
CUDD compiled as a dynamic library. On Linux this is available out of the
box in Fedora, but can easily be ported to other Linux
distributions. CUDD is also available in the OSX ports package and in
cygwin. To use it, call @code{ :- use_ module(library(bdd))} .
The following predicates construct a BDD:
@table @code
@item bdd_ new(?@var{ Exp} , -@var{ BddHandle} )
@findex bdd_ new/2
create a new BDD from the logical expression @var{ Exp} . The expression
may include:
@table @code
@item Logical Variables:
a leaf-node can be a logical variable.
@item Constants 0 and 1
a leaf-node can also be one of these two constants.
@item or(@var{ X} , @var{ Y} ), @var{ X} \/ @var{ Y} , @var{ X} + @var{ Y}
disjunction
@item and(@var{ X} , @var{ Y} ), @var{ X} /\ @var{ Y} , @var{ X} * @var{ Y}
conjunction
@item nand(@var{ X} , @var{ Y} )
negated conjunction
@item nor(@var{ X} , @var{ Y} )
negated disjunction
@item xor(@var{ X} , @var{ Y} )
exclusive or
@item not(@var{ X} ), -@var{ X}
negation
@end table
@item bdd_ from_ list(?@var{ List} , -@var{ BddHandle} )
@findex bdd_ from_ list/2
Convert a @var{ List} of logical expressions of the form above into a BDD
accessible through @var{ BddHandle} .
@item mtbdd_ new(?@var{ Exp} , -@var{ BddHandle} )
@findex mtbdd_ new/2
create a new algebraic decision diagram (ADD) from the logical
expression @var{ Exp} . The expression may include:
@table @code
@item Logical Variables:
a leaf-node can be a logical variable, or @emph{ parameter} .
@item Number
a leaf-node can also be any number
@item @var{ X} * @var{ Y}
product
@item @var{ X} + @var{ Y}
sum
@item @var{ X} - @var{ Y}
subtraction
@item or(@var{ X} , @var{ Y} ), @var{ X} \/ @var{ Y}
logical or
@end table
@item bdd_ tree(+@var{ BDDHandle} , @var{ Term} )
@findex bdd_ tree/2
Convert the BDD or ADD represented by @var{ BDDHandle} to a Prolog term
of the form @code{ bdd(@var{ Dir} , @var{ Nodes} , @var{ Vars} )} or @code{ mtbdd(@var{ Nodes} , @var{ Vars} )} , respectively. The arguments are:
@itemize
@item
@var{ Dir} direction of the BDD, usually 1
@item
@var{ Nodes} list of nodes in the BDD or ADD.
In a BDD, nodes may be @t{ pp} (both terminals are positive) or @t{ pn}
(the right-hand side is negative), and have four arguments: a logical
variable that will be bound to the value of the node; the logical
variable corresponding to the node; the left-hand side, which is a logical
variable, a 0, or a 1; and the right-hand side, which again is a logical
variable, a 0, or a 1.
@item
@var{ Vars} are the free variables in the original BDD, or the parameters of the BDD/ADD.
@end itemize
As an example, the BDD for the expression @code{ X+(Y+X)*(-Z)} becomes:
@example
bdd(1,[pn(N2,X,1,N1),pp(N1,Y,N0,1),pn(N0,Z,1,1)],vs(X,Y,Z))
@end example
@item bdd_ eval(+@var{ BDDHandle} , @var{ Val} )
@findex bdd_ eval/2
Unify @var{ Val} with the value of the logical expression compiled in
@var{ BDDHandle} given an assignment to its variables.
@example
bdd_new(X+(Y+X)*(-Z), BDD),
[X,Y,Z] = [0,0,0],
bdd_eval(BDD, V),
writeln(V).
@end example
would write 0 in the standard output stream.
The Prolog code equivalent to @t{ bdd_ eval/2} is:
@example
Tree = bdd(1, T, _Vs),
reverse(T, RT),
foldl(eval_bdd, RT, _, V).

eval_bdd(pp(P,X,L,R), _, P) :-
    P is ( X /\ L ) \/ ( (1-X) /\ R ).
eval_bdd(pn(P,X,L,R), _, P) :-
    P is ( X /\ L ) \/ ( (1-X) /\ (1-R) ).
@end example
First, the nodes are reversed to implement bottom-up evaluation. Then,
we use the @code{ foldl} list manipulation predicate to walk every node,
computing the disjunction of the two cases and binding the output
variable. The top node gives the full expression value. Notice that
@code{ (1-@var{ X} )} implements negation.
@item bdd_ size(+@var{ BDDHandle} , -@var{ Size} )
@findex bdd_ size/2
Unify @var{ Size} with the number of nodes in @var{ BDDHandle} .
@item bdd_ print(+@var{ BDDHandle} , +@var{ File} )
@findex bdd_ print/2
Output bdd @var{ BDDHandle} as a dot file to @var{ File} .
@item bdd_ to_ probability_ sum_ product(+@var{ BDDHandle} , -@var{ Prob} )
@findex bdd_ to_ probability_ sum_ product/2
Each node in a BDD is given a probability @var{ Pi} . The total
probability of a corresponding sum-product network is @var{ Prob} .
@item bdd_ to_ probability_ sum_ product(+@var{ BDDHandle} , -@var{ Probs} , -@var{ Prob} )
@findex bdd_ to_ probability_ sum_ product/3
Each node in a BDD is given a probability @var{ Pi} . The total
probability of a corresponding sum-product network is @var{ Prob} , and
the probabilities of the inner nodes are @var{ Probs} .
In Prolog, this predicate would correspond to computing the value of a
BDD. The input variables will be bound to probabilities, eg
@code{ [@var{ X} ,@var{ Y} ,@var{ Z} ] = [0.3,0.7,0.1]} , and the previous
@code{ eval_ bdd} would operate over real numbers:
@example
Tree = bdd(1, T, _Vs),
reverse(T, RT),
foldl(eval_prob, RT, _, V).

eval_prob(pp(P,X,L,R), _, P) :-
    P is X * L + (1-X) * R.
eval_prob(pn(P,X,L,R), _, P) :-
    P is X * L + (1-X) * (1-R).
@end example
@item bdd_ close(@var{ BDDHandle} )
@findex bdd_ close/1
close the BDD and release any resources it holds.
@end table
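As a minimal end-to-end sketch (assuming @code{ library(bdd)} and CUDD are
installed), the following predicate builds the BDD used above, reports its
size, evaluates it for one assignment, and releases it:
@example
:- use_module(library(bdd)).

bdd_demo(Size, Val) :-
    bdd_new(X + (Y+X)*(-Z), BDD),
    bdd_size(BDD, Size),
    [X, Y, Z] = [0, 1, 1],
    bdd_eval(BDD, Val),
    bdd_close(BDD).
@end example
For the assignment @code{ [0,1,1]} the expression is false, so @var{ Val} should unify with 0.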
@node Block Diagram, , BDDs, Library
@section Block Diagram
@cindex Block Diagram
This library provides a way of visualizing a Prolog program's use of
modules as a block diagram. To use it, call
@code{ :- use_ module(library(block_ diagram))} .
@table @code
@item make_ diagram(+inputfilename, +outputfilename)
@findex make_ diagram/2
@snindex make_ diagram/2
@cnindex make_ diagram/2
This will crawl the files reachable through the use_ module and ensure_ loaded directives within the inputfilename.
The result will be a file in dot format.
You can make a PDF at the shell by running @code{ dot -Tpdf filename > output.pdf} .
@item make_ diagram(+inputfilename, +outputfilename, +predicate, +depth, +extension)
@findex make_ diagram/5
@snindex make_ diagram/5
@cnindex make_ diagram/5
The same as @code{ make_ diagram/2} , but you can control how many of the imported/exported predicates are shown with @var{ predicate} , and how deep the crawler is allowed to go with @var{ depth} . The @var{ extension} is used if the use_ module directives in the files do not include a file extension.
@end table
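For instance, a minimal sketch (the file names here are hypothetical):
@example
?- use_module(library(block_diagram)).
?- make_diagram('top.yap', 'top.dot').
@end example
followed by @code{ dot -Tpdf top.dot > top.pdf} at the shell.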
@node SWI-Prolog, SWI-Prolog Global Variables, Library, Top
@cindex SWI-Prolog
@chapter SWI-Prolog Emulation
@menu
Subnodes of SWI-Prolog
* Invoking Predicates on all Members of a List :: maplist and friends
* Forall :: forall built-in
@end menu
@include swi.tex
@node Extensions,Debugging,SWI-Prolog Global Variables,Top
@chapter Extensions to Prolog
@menu
* Rational Trees:: Working with Rational Trees
* Co-routining:: Changing the Execution of Goals
* Attributed Variables:: Using attributed Variables
* CLPR:: The CLP(R) System
* Logtalk:: The Logtalk Object-Oriented system
* MYDDAS:: The MYDDAS Database Interface package
* Threads:: Thread Library
* Parallelism:: Running in Or-Parallel
* Tabling:: Storing Intermediate Solutions of programs
* Low Level Profiling:: Profiling Abstract Machine Instructions
* Low Level Tracing:: Tracing at Abstract Machine Level
@end menu
YAP includes a number of extensions over the original Prolog
language. Next, we discuss support for the most important ones.
@node Rational Trees, Co-routining, , Extensions
@section Rational Trees
Prolog unification is not a complete implementation of unification. For
efficiency reasons, Prolog systems do not perform the occurs check while
unifying terms. As an example, @code{ X = a(X)} will not fail but instead
will create an infinite term of the form @code{ a(a(a(a(a(...)))))} , or
@emph{ rational tree} .
Rational trees are now supported by default in YAP. In previous
versions, this was not the default and these terms could easily lead
to infinite computation. For example, @code{ X = a(X), X = X} would
enter an infinite loop.
The @code{ RATIONAL_ TREES} flag improves support for these
terms. Internal primitives are now aware that these terms can exist, and
will not enter infinite loops. Hence, the previous unification will
succeed. Another example, @code{ X = a(X), ground(X)} will succeed
instead of looping. Other affected built-ins include the term comparison
primitives, @code{ numbervars/3} , @code{ copy_ term/2} , and the internal
data base routines. The support does not extend to Input/Output routines
or to @code{ assert/1} : YAP does not allow directly reading
rational trees, and you need to use @code{ write_ depth/2} to avoid
entering an infinite cycle when trying to write an infinite term.
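For instance, the following goal succeeds instead of looping (a minimal
sketch; it is wrapped in a predicate so that the top level does not try to
print the cyclic term):
@example
cyclic_ok :- X = a(X), ground(X).

?- cyclic_ok.
yes
@end example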
@node Co-routining, Attributed Variables, Rational Trees, Extensions
@section Co-routining
Prolog uses a simple left-to-right flow of control. It is sometimes
convenient to change this control so that goals will only be executed
when conditions are fulfilled. This may result in a more "data-driven"
execution, or may be necessary to correctly implement extensions such as
negation by default.
The @code{ COROUTINING} flag enables this option. Note that the support for
coroutining will in general slow down execution.
The following declaration is supported:
@table @code
@item block/1
The argument to @code{ block/1} is a condition on a goal or a conjunction
of conditions, with each element separated by commas. Each condition is
of the form @code{ predname(@var{ C1} ,...,@var{ CN} )} , where @var{ N} is the
arity of the goal, and each @var{ CI} is of the form @code{ -} , if the
goal must suspend until that argument is bound, or
@code{ ?} , otherwise (see the example after this table).
@item wait/1
The argument to @code{ wait/1} is a predicate descriptor or a conjunction
of these predicates. These predicates will suspend until their first
argument is bound.
@end table
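As an illustration, consider the classic @code{ merge/3} example (a sketch,
assuming the @code{ COROUTINING} support described above): the declaration
makes a call to @code{ merge/3} suspend while both its first and third
arguments are unbound variables:
@example
:- block merge(-,?,-).

merge([], Y, Y).
merge(X, [], X).
merge([H|X], [E|Y], [H|Z]) :- H @@< E,  merge(X, [E|Y], Z).
merge([H|X], [E|Y], [E|Z]) :- H @@>= E, merge([H|X], Y, Z).
@end example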
The following primitives are supported:
@table @code
@item dif(@var{ X} ,@var{ Y} )
@findex dif/2
@syindex dif/2
@cnindex dif/2
Succeed if the two arguments do not unify. A call to @code{ dif/2} will
suspend while it cannot yet be decided whether the two terms unify, and will fail if they
always unify.
@item freeze(?@var{ X} ,:@var{ G} )
@findex freeze/2
@syindex freeze/2
@cnindex freeze/2
Delay execution of goal @var{ G} until the variable @var{ X} is bound.
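For instance (a minimal sketch):
@example
?- freeze(X, Y is X*2), X = 4.
X = 4,
Y = 8
@end example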
@item frozen(@var{ X} ,@var{ G} )
@findex frozen/2
@syindex frozen/2
@cnindex frozen/2
Unify @var{ G} with a conjunction of goals suspended on variable @var{ X} ,
or @code{ true} if no goal has suspended.
@item when(+@var{ C} ,:@var{ G} )
@findex when/2
@syindex when/2
@cnindex when/2
Delay execution of goal @var{ G} until the conditions @var{ C} are
satisfied. The conditions are of the following form:
@table @code
@item @var{ C1} ,@var{ C2}
Delay until both conditions @var{ C1} and @var{ C2} are satisfied.
@item @var{ C1} ;@var{ C2}
Delay until either condition @var{ C1} or condition @var{ C2} is satisfied.
@item ?=(@var{ V1} ,@var{ V2} )
Delay until it can be decided whether the terms @var{ V1} and @var{ V2} unify.
@item nonvar(@var{ V} )
Delay until variable @var{ V} is bound.
@item ground(@var{ V} )
Delay until variable @var{ V} is ground.
@end table
Note that @code{ when/2} will fail if the conditions fail.
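For example, the arithmetic goal below only runs once both variables are
instantiated (a small sketch):
@example
?- when(ground(X+Y), Z is X+Y), X = 1, Y = 2.
X = 1,
Y = 2,
Z = 3
@end example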
@item call_ residue(:@var{ G} ,@var{ L} )
@findex call_ residue/2
@syindex call_ residue/2
@cnindex call_ residue/2
Call goal @var{ G} . If subgoals of @var{ G} are still blocked, return
a list containing these goals and the variables they are blocked on. The
goals are then considered as unblocked. The next example shows a case
where @code{ dif/2} suspends twice, once outside @code{ call_ residue/2} ,
and the other inside:
@example
?- dif(X,Y),
   call_residue((dif(X,Y),(X = f(Z) ; Y = f(Z))), L).

X = f(Z),
L = [[Y]-dif(f(Z),Y)],
dif(f(Z),Y) ? ;

Y = f(Z),
L = [[X]-dif(X,f(Z))],
dif(X,f(Z)) ? ;

no
@end example
The system only reports one invocation of @code{ dif/2} as having
suspended.
@item call_ residue_ vars(:@var{ G} ,@var{ L} )
@findex call_ residue_ vars/2
@syindex call_ residue_ vars/2
@cnindex call_ residue_ vars/2
Call goal @var{ G} and unify @var{ L} with a list of all constrained variables created @emph{ during} execution of @var{ G} :
@example
?- dif(X,Z), call_residue_vars(dif(X,Y),L).
L = [Y],
dif(X,Z),
dif(X,Y) ? ;
no
@end example
@end table
@node Attributed Variables, CLPR, Co-routining, Extensions
@section Attributed Variables
@cindex attributed variables
@menu
* New Style Attribute Declarations:: New Style code
* Old Style Attribute Declarations:: Old Style code (deprecated)
@end menu

YAP supports attributed variables, originally developed at OFAI by
Christian Holzbaur. Attributes are a means of declaring that an
arbitrary term is a property for a variable. These properties can be
updated during forward execution. Moreover, the unification algorithm is
aware of attributed variables and will call user defined handlers when
trying to unify these variables.
Attributed variables provide an elegant abstraction over which one can
extend Prolog systems. Their main application so far has been in
implementing constraint handlers, such as Holzbaur's CLPQR, Frühwirth
and Holzbaur's CHR, and CLP(BN).
Different Prolog systems implement attributed variables in different
ways. Traditionally, YAP has used the interface designed by SICStus
Prolog. This interface is still
available in the @t{ atts} library, but from YAP-6.0.3 we recommend using
the hProlog, SWI style interface. The main reason to do so is that
most packages included in YAP that use attributed variables, such as CHR, CLP(FD), and CLP(QR),
rely on the SWI-Prolog interface.
@node New Style Attribute Declarations, Old Style Attribute Declarations, , Attributed Variables
@section hProlog and SWI-Prolog style Attribute Declarations
The following documentation is taken from the SWI-Prolog manual.
Binding an attributed variable schedules a goal to be executed at the
first possible opportunity. In the current implementation the hooks are
executed immediately after a successful unification of the clause-head
or successful completion of a foreign language (built-in) predicate. Each
attribute is associated with a module and the hook @code{ attr_ unify_ hook/2} is
executed in this module. The example below realises a very simple and
incomplete finite domain reasoner.
@example
:- module(domain,
          [ domain/2                  % Var, ?Domain
          ]).
:- use_module(library(ordsets)).

domain(X, Dom) :-
        var(Dom), !,
        get_attr(X, domain, Dom).
domain(X, List) :-
        list_to_ord_set(List, Domain),
        put_attr(Y, domain, Domain),
        X = Y.

%       An attributed variable with attribute value Domain has been
%       assigned the value Y

attr_unify_hook(Domain, Y) :-
        (   get_attr(Y, domain, Dom2)
        ->  ord_intersection(Domain, Dom2, NewDomain),
            (   NewDomain == []
            ->  fail
            ;   NewDomain = [Value]
            ->  Y = Value
            ;   put_attr(Y, domain, NewDomain)
            )
        ;   var(Y)
        ->  put_attr(Y, domain, Domain)
        ;   ord_memberchk(Y, Domain)
        ).

%       Translate attributes from this module to residual goals

attribute_goals(X) -->
        @{ get_attr(X, domain, List) @},
        [domain(X, List)].
@end example
Before explaining the code we give some example queries:
@texinfo
@multitable @columnfractions .70 .30
@item @code{ ?- domain(X, [a,b]), X = c}
@tab @code{ fail}
@item @code{ domain(X, [a,b]), domain(X, [a,c]).}
@tab @code{ X=a}
@item @code{ domain(X, [a,b,c]), domain(X, [a,c]).}
@tab @code{ domain(X, [a,c]).}
@end multitable
@end texinfo
The predicate @code{ domain/2} fetches (first clause) or assigns
(second clause) the variable a @emph{ domain} , a set of values it can
be unified with. The second clause first associates the domain
with a fresh variable and then unifies @var{ X} with this variable, to deal
with the possibility that @var{ X} already has a domain. The
predicate @code{ attr_ unify_ hook/2} is a hook called after a variable with
a domain is assigned a value. In the simple case where the variable
is bound to a concrete value we simply check whether this value is in
the domain. Otherwise we take the intersection of the domains and either
fail if the intersection is empty (first example), simply assign the
value if there is only one value in the intersection (second example) or
assign the intersection as the new domain of the variable (third
example). The nonterminal @code{ attribute_ goals/3} is used to translate
remaining attributes to user-readable goals that, when executed, reinstate
these attributes.
@table @code
@item put_ attr(+@var{ Var} ,+@var{ Module} ,+@var{ Value} )
@findex put_ attr/3
@snindex put_ attr/3
@cnindex put_ attr/3
If @var{ Var} is a variable or attributed variable, set the value for the
attribute named @var{ Module} to @var{ Value} . If an attribute with this
name is already associated with @var{ Var} , the old value is replaced.
Backtracking will restore the old value (i.e., an attribute is a mutable
term. See also @code{ setarg/3} ). This predicate raises a representation error if
@var{ Var} is not a variable and a type error if @var{ Module} is not an atom.
@item get_ attr(+@var{ Var} ,+@var{ Module} ,-@var{ Value} )
@findex get_ attr/3
@snindex get_ attr/3
@cnindex get_ attr/3
Request the current @var{ value} for the attribute named @var{ Module} . If
@var{ Var} is not an attributed variable or the named attribute is not
associated to @var{ Var} this predicate fails silently. If @var{ Module}
is not an atom, a type error is raised.
@item del_ attr(+@var{ Var} ,+@var{ Module} )
@findex del_ attr/2
@snindex del_ attr/2
@cnindex del_ attr/2
Delete the named attribute. If @var{ Var} loses its last attribute it
is transformed back into a traditional Prolog variable. If @var{ Module}
is not an atom, a type error is raised. In all other cases this
predicate succeeds regardless whether or not the named attribute is
present.
@item attr_ unify_ hook(+@var{ AttValue} ,+@var{ VarValue} )
@findex attr_ unify_ hook/2
@snindex attr_ unify_ hook/2
@cnindex attr_ unify_ hook/2
Hook that must be defined in the module an attributed variable refers
to. It is called @emph{ after} the attributed variable has been
unified with a non-var term, possibly another attributed variable.
@var{ AttValue} is the attribute that was associated to the variable
in this module and @var{ VarValue} is the new value of the variable.
Normally this predicate fails to veto binding the variable to
@var{ VarValue} , forcing backtracking to undo the binding. If
@var{ VarValue} is another attributed variable the hook often combines
the two attributes and associates the combined attribute with
@var{ VarValue} using @code{ put_ attr/3} .
@item attr_ portray_ hook(+@var{ AttValue} ,+@var{ Var} )
@findex attr_ portray_ hook/2
@snindex attr_ portray_ hook/2
@cnindex attr_ portray_ hook/2
Called by @code{ write_ term/2} and friends for each attribute if the option
@code{ attributes(portray)} is in effect. If the hook succeeds the
attribute is considered printed. Otherwise @code{ Module = ...} is
printed to indicate the existence of a variable.
@item attribute_ goals(+@var{ Var} ,-@var{ Gs} ,+@var{ GsRest} )
@findex attribute_ goals/2
@snindex attribute_ goals/2
@cnindex attribute_ goals/2
This nonterminal, if it is defined in a module, is used by @var{ copy_ term/3}
to project attributes of that module to residual goals. It is also
used by the toplevel to obtain residual goals after executing a query.
@end table
Normal user code should deal with @code{ put_ attr/3} , @code{ get_ attr/3} and @code{ del_ attr/2} .
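For instance, a quick top-level interaction (a sketch; @code{ color} here is
just a hypothetical attribute name, and the way the attributed variable
itself is reported by the top level may vary):
@example
?- put_attr(X, color, red), get_attr(X, color, C).
C = red
@end example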
The routines in this section fetch or set the entire attribute list of a
variable. Use of these predicates is anticipated to be restricted to
printing and other special purpose operations.
@table @code
@item get_ attrs(+@var{ Var} ,-@var{ Attributes} )
@findex get_ attrs/2
@snindex get_ attrs/2
@cnindex get_ attrs/2
Get all attributes of @var{ Var} . @var{ Attributes} is a term of the form
@code{ att(@var{ Module} , @var{ Value} , @var{ MoreAttributes} )} , where @var{ MoreAttributes} is
@code{ []} for the last attribute.
@item put_ attrs(+@var{ Var} ,+@var{ Attributes} )
@findex put_ attrs/2
@snindex put_ attrs/2
@cnindex put_ attrs/2
Set all attributes of @var{ Var} . See @code{ get_ attrs/2} for a description of
@var{ Attributes} .
@item del_ attrs(+@var{ Var} )
@findex del_ attrs/1
@snindex del_ attrs/1
@cnindex del_ attrs/1
If @var{ Var} is an attributed variable, delete @emph{ all} its
attributes. In all other cases, this predicate succeeds without
side-effects.
@item term_ attvars(+@var{ Term} ,-@var{ AttVars} )
@findex term_ attvars/2
@snindex term_ attvars/2
@cnindex term_ attvars/2
@var{ AttVars} is a list of all attributed variables in @var{ Term} and
its attributes. I.e., @code{ term_ attvars/2} works recursively through
attributes. This predicate is Cycle-safe.
@item copy_ term(?@var{ TI} ,-@var{ TF} ,-@var{ Goals} )
@findex copy_ term/3
@syindex copy_ term/3
@cnindex copy_ term/3
Term @var{ TF} is a variant of the original term @var{ TI} , such that for
each variable @var{ V} in the term @var{ TI} there is a new variable @var{ V'}
in term @var{ TF} without any attributes attached. Attributed
variables are thus converted to standard variables. @var{ Goals} is
unified with a list that represents the attributes. The goal
@code{ maplist(call,@var{ Goals} )} can be called to recreate the
attributes.
Before the actual copying, @code{ copy_ term/3} calls
@code{ attribute_ goals/1} in the module where the attribute is
defined.
@item copy_ term_ nat(?@var{ TI} ,-@var{ TF} )
@findex copy_ term_ nat/2
@syindex copy_ term_ nat/2
@cnindex copy_ term_ nat/2
As @code{ copy_ term/2} . Attributes however, are @emph{ not} copied but replaced
by fresh variables.
@end table
@node Old Style Attribute Declarations, , New Style Attribute Declarations, Attributed Variables
@section SICStus Prolog style Attribute Declarations
@menu
* Attribute Declarations:: Declaring New Attributes
* Attribute Manipulation:: Setting and Reading Attributes
* Attributed Unification:: Tuning the Unification Algorithm
* Displaying Attributes:: Displaying Attributes in User-Readable Form
* Projecting Attributes:: Obtaining the Attributes of Interest
* Attribute Examples:: Two Simple Examples of how to use Attributes.
@end menu
Old style attribute declarations are activated through loading the library @t{ atts} . The command
@example
| ?- use_ module(library(atts)).
@end example
enables this form of use of attributed variables. The package provides the
following functionality:
@itemize @bullet
@item Each attribute must be declared first. Attributes are described by a functor
and are declared per module. Each Prolog module declares its own sets of
attributes. Different modules may use the same functor for their own,
distinct attributes.
@item The built-in @code{ put_ atts/2} adds or deletes attributes to a
variable. The variable may be unbound or may be an attributed
variable. In the latter case, YAP discards previous values for the
attributes.
@item The built-in @code{ get_ atts/2} can be used to check the values of
an attribute associated with a variable.
@item The unification algorithm calls the user-defined predicate
@t{ verify_ attributes/3} before trying to bind an attributed
variable. Unification will resume after this call.
@item The user-defined predicate
@t{ attribute_ goal/2} converts from an attribute to a goal.
@item The user-defined predicate
@t{ project_ attributes/2} is used to project a set of constraints or
goals onto a set of variables. One application of @t{ project_ attributes/2} is in
the top-level, where it is used to output the set of
floundered constraints at the end of a query.
@end itemize
@node Attribute Declarations, Attribute Manipulation, , Old Style Attribute Declarations
@subsection Attribute Declarations
Attributes are compound terms associated with a variable. Each attribute
has a @emph{ name} which is @emph{ private} to the module in which the
attribute was defined. Variables may have at most one attribute with a
given name. Attribute names are defined with the following declaration:
@cindex attribute declaration
@cindex declaration, attribute
@findex attribute/1 (declaration)
@example
:- attribute AttributeSpec, ..., AttributeSpec.
@end example
@noindent
where each @var{ AttributeSpec} has the form (@var{ Name} /@var{ Arity} ).
One single such declaration is allowed per module @var{ Module} .
Although the YAP module system is predicate based, attributes are local
to modules. This is implemented by rewriting all calls to the
built-ins that manipulate attributes so that attribute names are
preprocessed depending on the module. The @code{ user:goal_ expansion/3}
mechanism is used for this purpose.
@node Attribute Manipulation, Attributed Unification, Attribute Declarations, Old Style Attribute Declarations
@subsection Attribute Manipulation
The attribute manipulation predicates always work as follows:
@enumerate
@item The first argument is the unbound variable associated with
attributes,
@item The second argument is a list of attributes. Each attribute will
be a Prolog term or a constant, prefixed with the @t{ +} and @t{ -} unary
operators. The prefix @t{ +} may be dropped for convenience.
@end enumerate
The following three procedures are available to the user. Notice that
these built-ins are rewritten by the system into internal built-ins, and
that the rewriting process @emph{ depends} on the module on which the
built-ins have been invoked.
@table @code
@item @var{ Module} :get_ atts(@var{ -Var} ,@var{ ?ListOfAttributes} )
@findex get_ atts/2
@syindex get_ atts/2
@cnindex get_ atts/2
Unify the list @var{ ?ListOfAttributes} with the attributes for the unbound
variable @var{ Var} . Each member of the list must be a bound term of the
form @code{ +(@var{ Attribute} )} or @code{ -(@var{ Attribute} )} (the @t{ +}
prefix may be dropped). The meaning of @t{ +} and @t{ -} is:
@item +(@var{ Attribute} )
Unifies @var{ Attribute} with a corresponding attribute associated with
@var{ Var} , fails otherwise.
@item -(@var{ Attribute} )
Succeeds if a corresponding attribute is not associated with
@var{ Var} . The arguments of @var{ Attribute} are ignored.
@item @var{ Module} :put_ atts(@var{ -Var} ,@var{ ?ListOfAttributes} )
@findex put_ atts/2
@syindex put_ atts/2
@cnindex put_ atts/2
Associate with or remove attributes from a variable @var{ Var} . The
attributes are given in @var{ ?ListOfAttributes} , and the action depends
on how they are prefixed:
@item +(@var{ Attribute} )
Associate @var{ Var} with @var{ Attribute} . A previous value for the
attribute is simply replaced (as with @code{ set_ mutable/2} ).
@item -(@var{ Attribute} )
Remove the attribute with the same name. If no such attribute existed,
simply succeed.
@end table
@node Attributed Unification, Displaying Attributes, Attribute Manipulation, Old Style Attribute Declarations
@subsection Attributed Unification
The user-defined predicate @code{ verify_ attributes/3} is called when
attempting to unify an attributed variable which might have attributes
in some @var{ Module} .
@table @code
@item @var{ Module} :verify_ attributes(@var{ -Var} , @var{ +Value} , @var{ -Goals} )
@findex verify_ attributes/3
@syindex verify_ attributes/3
@cnindex verify_ attributes/3
The predicate is called when trying to unify the attributed variable
@var{ Var} with the Prolog term @var{ Value} . Note that @var{ Value} may be
itself an attributed variable, or may contain attributed variables. The
goal @t{ verify_ attributes/3} is actually called before @var{ Var} is
unified with @var{ Value} .
It is up to the user to define which actions may be performed by
@t{ verify_ attributes/3} but the procedure is expected to return in
@var{ Goals} a list of goals to be called @emph{ after} @var{ Var} is
unified with @var{ Value} . If @t{ verify_ attributes/3} fails, the
unification will fail.
Notice that @t{ verify_ attributes/3} may be called even if @var{ Var}
has no attributes in module @t{ Module} . In this case the routine should
simply succeed with @var{ Goals} unified with the empty list.
@item attvar(@var{ -Var} )
@findex attvar/1
@snindex attvar/1
@cnindex attvar/1
Succeed if @var{ Var} is an attributed variable.
@end table
@node Displaying Attributes, Projecting Attributes,Attributed Unification, Old Style Attribute Declarations
@subsection Displaying Attributes
Attributes are usually presented as goals. The following routines are
used by built-in predicates such as @code{ call_ residue/2} and by the
Prolog top-level to display attributes:
@table @code
@item @var{ Module} :attribute_ goal(@var{ -Var} , @var{ -Goal} )
@findex attribute_ goal/2
@syindex attribute_ goal/2
@cnindex attribute_ goal/2
User-defined procedure, called to convert the attributes in @var{ Var} to
a @var{ Goal} . Should fail when no interpretation is available.
@end table
@node Projecting Attributes, Attribute Examples, Displaying Attributes, Old Style Attribute Declarations
@subsection Projecting Attributes
Constraint solvers must be able to project a set of constraints to a set
of variables. This is useful when displaying the solution to a goal, but
may also be used to manipulate computations. The user-defined
@code{ project_ attributes/2} is responsible for implementing this
projection.
@table @code
@item @var{ Module} :project_ attributes(@var{ +QueryVars} , @var{ +AttrVars} )
@findex project_ attributes/2
@syindex project_ attributes/2
@cnindex project_ attributes/2
Given a list of variables @var{ QueryVars} and list of attributed
variables @var{ AttrVars} , project all attributes in @var{ AttrVars} to
@var{ QueryVars} . Although projection is constraint system dependent,
typically this will involve expressing all constraints in terms of
@var{ QueryVars} and considering all remaining variables as existentially
quantified.
@end table
Projection interacts with @code{ attribute_ goal/2} at the Prolog top
level. When the query succeeds, the system first calls
@code{ project_ attributes/2} . The system then calls
@code{ attribute_ goal/2} to get a user-level representation of the
constraints. Typically, @code{ attribute_ goal/2} will convert from the
original constraints into a set of new constraints on the projection,
and these constraints are the ones that will have an
@code{ attribute_ goal/2} handler.
@node Attribute Examples, ,Projecting Attributes, Old Style Attribute Declarations
@subsection Attribute Examples
The following two examples are taken from the SICStus Prolog manual. The
first sketches the implementation of a simple finite domain ``solver''. Note
that an industrial strength solver would have to provide a wider range
of functionality and that it quite likely would utilize a more efficient
representation for the domains proper. The module exports a single
predicate @code{ domain(@var{ -Var} ,@var{ ?Domain} )} which associates
@var{ Domain} (a list of terms) with @var{ Var} . A variable can be
queried for its domain by leaving @var{ Domain} unbound.
We do not present here a definition for @code{ project_ attributes/2} .
Projecting finite domain constraints happens to be difficult.
@example
:- module(domain, [domain/2]).
:- use_module(library(atts)).
:- use_module(library(ordsets), [
        ord_intersection/3,
        ord_intersect/2,
        list_to_ord_set/2
   ]).
:- attribute dom/1.

verify_attributes(Var, Other, Goals) :-
        get_atts(Var, dom(Da)), !,          % are we involved?
        (   var(Other) ->                   % must be attributed then
            (   get_atts(Other, dom(Db)) -> % has a domain?
                ord_intersection(Da, Db, Dc),
                Dc = [El|Els],              % at least one element
                (   Els = [] ->             % exactly one element
                    Goals = [Other=El]      % implied binding
                ;   Goals = [],
                    put_atts(Other, dom(Dc)) % rescue intersection
                )
            ;   Goals = [],
                put_atts(Other, dom(Da))    % rescue the domain
            )
        ;   Goals = [],
            ord_intersect([Other], Da)      % value in domain?
        ).
verify_attributes(_, _, []).                % unification triggered
                                            % because of attributes
                                            % in other modules

attribute_goal(Var, domain(Var,Dom)) :-     % interpretation as goal
        get_atts(Var, dom(Dom)).

domain(X, Dom) :-
        var(Dom), !,
        get_atts(X, dom(Dom)).
domain(X, List) :-
        list_to_ord_set(List, Set),
        Set = [El|Els],                     % at least one element
        (   Els = [] ->                     % exactly one element
            X = El                          % implied binding
        ;   put_atts(Fresh, dom(Set)),
            X = Fresh                       % may call
                                            % verify_attributes/3
        ).
@end example
Note that the ``implied binding'' @code{ Other=El} was deferred until after
the completion of @code{ verify_ attributes/3} . Otherwise, there might be a
danger of recursively invoking @code{ verify_ attributes/3} , which might bind
@code{ Var} , which is not allowed inside the scope of @code{ verify_ attributes/3} .
Deferring unifications into the third argument of @code{ verify_ attributes/3}
effectively serializes the calls to @code{ verify_ attributes/3} .
Assuming that the code resides in the file @file{ domain.yap} , we
can use it via:
@example
| ?- use_module(domain).
@end example
Let's test it:
@example
| ?- domain(X,[5,6,7,1]), domain(Y,[3,4,5,6]), domain(Z,[1,6,7,8]).
domain(X,[1,5,6,7]),
domain(Y,[3,4,5,6]),
domain(Z,[1,6,7,8]) ?
yes
| ?- domain(X,[5,6,7,1]), domain(Y,[3,4,5,6]), domain(Z,[1,6,7,8]),
X=Y.
Y = X,
domain(X,[5,6]),
domain(Z,[1,6,7,8]) ?
yes
| ?- domain(X,[5,6,7,1]), domain(Y,[3,4,5,6]), domain(Z,[1,6,7,8]),
X=Y, Y=Z.
X = 6,
Y = 6,
Z = 6
@end example
To demonstrate the use of the @var{ Goals} argument of
@code{ verify_ attributes/3} , we give an implementation of
@code{ freeze/2} . We have to name it @code{ myfreeze/2} in order to
avoid a name clash with the built-in predicate of the same name.
@example
:- module(myfreeze, [myfreeze/2]).
:- use_module(library(atts)).
:- attribute frozen/1.

verify_attributes(Var, Other, Goals) :-
        get_atts(Var, frozen(Fa)), !,        % are we involved?
        (   var(Other) ->                    % must be attributed then
            (   get_atts(Other, frozen(Fb))  % has a pending goal?
            ->  put_atts(Other, frozen((Fa,Fb))) % rescue conjunction
            ;   put_atts(Other, frozen(Fa))  % rescue the pending goal
            ),
            Goals = []
        ;   Goals = [Fa]
        ).
verify_attributes(_, _, []).

attribute_goal(Var, Goal) :-                 % interpretation as goal
        get_atts(Var, frozen(Goal)).

myfreeze(X, Goal) :-
        put_atts(Fresh, frozen(Goal)),
        Fresh = X.
@end example
Assuming that this code lives in file @file{ myfreeze.yap} ,
we would use it via:
@example
| ?- use_module(myfreeze).
| ?- myfreeze(X,print(bound(x,X))), X=2.
bound(x,2) % side effect
X = 2 % bindings
@end example
The two solvers even work together:
@example
| ?- myfreeze(X,print(bound(x,X))), domain(X,[1,2,3]),
domain(Y,[2,10]), X=Y.
bound(x,2) % side effect
X = 2, % bindings
Y = 2
@end example
The two example solvers interact via bindings to shared attributed
variables only. More complicated interactions are likely to be found
in more sophisticated solvers. The corresponding
@code{ verify_ attributes/3} predicates would typically refer to the
attributes from other known solvers/modules via the module prefix in
@code{ @var{ Module} :get_ atts/2} .
@node CLPR, CHR, Attributed Variables, Extensions
@cindex CLPQ
@cindex CLPR
@menu
* CLPR Solver Predicates::
* CLPR Syntax::
* CLPR Unification::
* CLPR Non-linear Constraints::
@end menu

@include clpr.tex
@node CHR, Logtalk, CLPR, Top
@menu
* CHR Introduction::
* CHR Syntax and Semantics::
* CHR in YAP Programs::
* CHR Debugging::
* CHR Examples::
* CHR Compatibility::
* CHR Guidelines::
@end menu

@include chr.tex
@node Logtalk, MYDDAS, CHR, Extensions
@section Logtalk
@cindex Logtalk
The Logtalk object-oriented extension is available after running its
standalone installer by using the @code{ yaplgt} command in POSIX
systems or by using the @code{ Logtalk - YAP} shortcut in the Logtalk
program group in the Start Menu on Windows systems. For more information
please see the URL @url{ http://logtalk.org/} .
@node MYDDAS, Real, Logtalk, Extensions
@section MYDDAS
@cindex MYDDAS
The MYDDAS database project was developed within an FCT project aiming at
the development of a highly efficient deductive database system, based
on the coupling of the MySQL relational database system with the Yap
Prolog system. MYDDAS was later expanded to support the ODBC interface.
@menu
Subnodes of MYDDAS
* Requirements and Installation Guide::
* MYDDAS Architecture::
* Loading MYDDAS::
* Connecting to and disconnecting from a Database Server::
* Accessing a Relation::
* View Level Interface ::
* Accessing Tables in Data Sources Using SQL::
* Insertion of Rows::
* Types of Attributes::
* Number of Fields::
* Describing a Relation::
* Enumerating Relations::
* The MYDDAS MySQL Top Level::
* Other MYDDAS Properties::
@end menu
@node Requirements and Installation Guide, MYDDAS Architecture, , MYDDAS
@section Requirements and Installation Guide
Next, we describe how to use YAP with the MYDDAS System. The
use of this system depends entirely on the MySQL development libraries
or the ODBC development libraries. At least one of these development
libraries must be installed on the computer system, otherwise MYDDAS
will not compile. The MySQL development libraries from MySQL 3.23 and
above are known to work. We recommend the use of MySQL over ODBC,
but it is possible to have both options installed
at the same time, without any problem; the MYDDAS system automatically
handles the two options. Currently, MYDDAS is known to compile without
problems on Linux. The usage of this system on Windows has not been
tested yet. MYDDAS must be enabled at configure time. This can be done
with the following options:
@table @code
@item --enable-myddas
This option will detect which development libraries are installed on the computer system (MySQL, ODBC, or both) and will compile the YAP system with support for the libraries it detects;
@item --enable-myddas-stats
This option is only available in MySQL. It includes code to get
statistics from the MYDDAS system;
@item --enable-top-level
This option is only available in MySQL. It enables interacting with the MySQL server in
two different ways: as if we were in the MySQL client shell, or as if
we were using Datalog.
@end table
@node MYDDAS Architecture, Loading MYDDAS, Requirements and Installation Guide, MYDDAS
@section MYDDAS Architecture
The system includes four main blocks that are put together through the
MYDDAS interface: the Yap Prolog compiler, the MySQL database system, an
ODBC layer and a Prolog to SQL compiler. Current effort is put on the
MySQL interface rather than on the ODBC interface. If you want to use
the full power of the MYDDAS interface we recommend you to use a MySQL
database. Other databases, such as Oracle, PostGres or Microsoft SQL
Server, can be interfaced through the ODBC layer, but with limited
performance and features support.
The main structure of the MYDDAS interface is simple. Prolog queries
involving database goals are translated to SQL using the Prolog to SQL
compiler; then the SQL expression is sent to the database system, which
returns the set of tuples satisfying the query; and finally those tuples
are made available to the Prolog engine as terms. For recursive queries
involving database goals, the YapTab tabling engine provides the
necessary support for an efficient evaluation of such queries.
An important aspect of the MYDDAS interface is that for the programmer
the use of predicates which are defined in database relations is
completely transparent. An example of this transparent support is the
Prolog cut operator, which has exactly the same behaviour from
predicates defined in the Prolog program source code, or from predicates
defined in database as relations.
@node Loading MYDDAS, Connecting to and disconnecting from a Database Server, MYDDAS Architecture, MYDDAS
@section Loading MYDDAS
Begin by starting YAP and loading the library
@code{ use_ module(library(myddas))} . This library already includes the
Prolog to SQL Compiler described in [2] and [1]. In MYDDAS this compiler
has been extended to support further constructs which allow a more
efficient SQL translation.
@node Connecting to and disconnecting from a Database Server, Accessing a Relation, Loading MYDDAS, MYDDAS
@section Connecting to and disconnecting from a Database Server
@table @code
@item db_ open(+,+,+,+,+).
@findex db_ open/5
@snindex db_ open/5
@cnindex db_ open/5
@item db_ open(+,+,+,+).
@findex db_ open/4
@snindex db_ open/4
@cnindex db_ open/4
@item db_ close(+).
@findex db_ close/1
@snindex db_ close/1
@cnindex db_ close/1
@end table
Assuming the MySQL server is running and we have an account, we can
login to MySQL by invoking @code{ db_ open/5} as one of the following:
@example
?- db_ open(mysql,Connection,Host/Database,User,Password).
?- db_ open(mysql,Connection,Host/Database/Port,User,Password).
?- db_ open(mysql,Connection,Host/Database/UnixSocket,User,Password).
?- db_ open(mysql,Connection,Host/Database/Port/UnixSocket,User,Password).
@end example
If the login is successful, there will be a response of @code{ yes} . For
instance:
@example
?- db_ open(mysql,con1,localhost/guest_ db,guest,'').
@end example
uses the MySQL native interface, selected by the first argument, to open
a connection identified by the @code{ con1} atom, to an instance of a
MySQL server running on host @code{ localhost} , using database @code{ guest_ db}
and user @code{ guest} with an empty password. To disconnect from the @code{ con1}
connection we use:
@example
?- db_ close(con1).
@end example
Alternatively, we can use @code{ db_ open/4} and @code{ db_ close/0} , without an argument
to identify the connection. In this case the default connection is used,
with the atom @code{ myddas} . Thus using
@example
?- db_ open(mysql,localhost/guest_ db,guest,'').
?- db_ close.
@end example
or
@example
?- db_ open(mysql,myddas,localhost/guest_ db,guest,'').
?- db_ close(myddas).
@end example
is exactly the same.
MYDDAS also supports ODBC. To connect to a database using an ODBC driver
you must have configured on your system a ODBC DSN. If so, the @code{ db_ open/4}
and @code{ db_ open/5} have the following mode:
@example
?- db_ open(odbc,Connection,ODBC_ DSN,User,Password).
?- db_ open(odbc,ODBC_ DSN,User,Password).
@end example
For instance, if you call @code{ db_ open(odbc,odbc_ dsn,guest,'')} , it will connect
to a database, through ODBC, using the definitions on the @code{ odbc_ dsn} DSN
configured on the system. The user will be the user @code{ guest} with no
password.
@node Accessing a Relation, View Level Interface , Connecting to and disconnecting from a Database Server, MYDDAS
@section Accessing a Relation
@table @code
@item db_ import(+Conn,+RelationName,+PredName).
@findex db_ import/3
@snindex db_ import/3
@cnindex db_ import/3
@item db_ import(+RelationName,+PredName).
@findex db_ import/2
@snindex db_ import/2
@cnindex db_ import/2
@end table
Assuming you have access permission for the relation you wish to import,
you can use @code{ db_ import/3} or @code{ db_ import/2} as:
@example
?- db_ import(Conn,RelationName,PredName).
?- db_ import(RelationName,PredName).
@end example
where @var{ RelationName} is the name of the
relation we wish to access, and @var{ PredName} is the name of the predicate we
wish to use to access the relation from YAP. @var{ Conn} is the connection
identifier, which again can be dropped so that the default @code{ myddas} connection
is used. For instance, if we want to access the relation phonebook,
using the predicate @code{ phonebook/3} we write:
@example
?- db_ import(con1,phonebook,phonebook).
yes
?- phonebook(Letter,Name,Number).
Letter = 'D',
Name = 'John Doe',
Number = 123456789 ?
yes
@end example
Backtracking can then be used to retrieve the next row
of the relation phonebook. Records with particular field values may be
selected in the same way as in Prolog. (In particular, no mode
specification for database predicates is required). For instance:
@example
?- phonebook(Letter,'John Doe',Number).
Letter = 'D',
Number = 123456789 ?
yes
@end example
generates the query:
@example
SELECT A.Letter , 'John Doe' , A.Number
FROM 'phonebook' A
WHERE A.Name = 'John Doe';
@end example
@node View Level Interface, Accessing Tables in Data Sources Using SQL, Accessing a Relation, MYDDAS
@section View Level Interface
@table @code
@item db_ view(+,+,+).
@findex db_ view/3
@snindex db_ view/3
@cnindex db_ view/3
@item db_ view(+,+).
@findex db_ view/2
@snindex db_ view/2
@cnindex db_ view/2
@end table
If we import a database relation, such as an edge relation representing the edges of a directed graph, through
@example
?- db_ import('Edge',edge).
yes
@end example
and we then write a query to retrieve all the direct cycles in the
graph, such as
@example
?- edge(A,B), edge(B,A).
A = 10,
B = 20 ?
@end example
this is clearly inefficient [3], because of relation-level
access. Relation-level access means that a separate SQL query will be
generated for every goal in the body of the clause. For the second
@code{ edge/2} goal, a SQL query is generated using the variable bindings that
result from the first @code{ edge/2} goal execution. If the second
@code{ edge/2} goal
fails, or if alternative solutions are demanded, backtracking accesses the
next tuple for the first @code{ edge/2} goal and another SQL query will be
generated for the second @code{ edge/2} goal. The generation of this large
number of queries and the communication overhead with the database
system for each of them, makes the relation-level approach inefficient.
To solve this problem the view level interface can be used for the
definition of rules whose bodies includes only imported database
predicates. One can use the view level interface through the predicates
@code{ db_ view/3} and @code{ db_ view/2} :
@example
?- db_ view(Conn,PredName(Arg_ 1,...,Arg_ n),DbGoal).
?- db_ view(PredName(Arg_ 1,...,Arg_ n),DbGoal).
@end example
All arguments are standard Prolog terms. @var{ Arg1} through @var{ Argn}
define the attributes to be retrieved from the database, while
@var{ DbGoal} defines the selection restrictions and join
conditions. @var{ Conn} is the connection identifier, which again can be
dropped. Calling predicate @code{ PredName/n} will retrieve database
tuples using a single SQL query generated for the @var{ DbGoal} . We next show
an example of a view definition for the direct cycles discussed
above. Assuming the declaration:
@example
?- db_ import('Edge',edge).
yes
@end example
we write:
@example
?- db_ view(direct_ cycle(A,B),(edge(A,B), edge(B,A))).
yes
?- direct_ cycle(A,B).
A = 10,
B = 20 ?
@end example
This call generates the SQL statement:
@example
SELECT A.attr1 , A.attr2
FROM Edge A , Edge B
WHERE B.attr1 = A.attr2 AND B.attr2 = A.attr1;
@end example
Backtracking, as in relational level interface, can be used to retrieve the next row of the view.
The view interface also supports aggregate function predicates such as
@code{ sum} , @code{ avg} , @code{ count} , @code{ min} and @code{ max} . For
instance:
@example
?- db_ view(count(X),(X is count(B, B^ edge(10,B)))).
@end example
generates the query :
@example
SELECT COUNT(A.attr2)
FROM Edge A WHERE A.attr1 = 10;
@end example
To know how to use @code{ db_ view/3} , please refer to Draxler's Prolog to
SQL Compiler Manual.
@node Accessing Tables in Data Sources Using SQL, Insertion of Rows, View Level Interface , MYDDAS
@section Accessing Tables in Data Sources Using SQL
@table @code
@item db_ sql(+,+,?).
@findex db_ sql/3
@snindex db_ sql/3
@cnindex db_ sql/3
@item db_ sql(+,?).
@findex db_ sql/2
@snindex db_ sql/2
@cnindex db_ sql/2
@end table
It is also possible to explicitly send a SQL query to the database server using
@example
?- db_ sql(Conn,SQL,List).
?- db_ sql(SQL,List).
@end example
where @var{ SQL} is an arbitrary SQL expression, and @var{ List} is a list
holding the first tuple of the result set returned by the server. The result
set can also be navigated through backtracking.
Example:
@example
?- db_ sql('SELECT * FROM phonebook',LA).
LA = ['D','John Doe',123456789] ?
@end example
@node Insertion of Rows, Types of Attributes, Accessing Tables in Data Sources Using SQL, MYDDAS
@section Insertion of Rows
@table @code
@item db_ assert(+,+).
@findex db_ assert/2
@snindex db_ assert/2
@cnindex db_ assert/2
@item db_ assert(+).
@findex db_ assert/1
@snindex db_ assert/1
@cnindex db_ assert/1
@end table
Assuming you have imported the related base table using
@code{ db_ import/2} or @code{ db_ import/3} , you can insert any given fact into that table
by using the @code{ db_ assert/2} predicate.
@example
?- db_ assert(Conn,Fact).
?- db_ assert(Fact).
@end example
The second argument must be declared with all of its arguments bound to
constants. For example assuming @code{ helloWorld} is imported through
@code{ db_ import/2} :
@example
?- db_ import('Hello World',helloWorld).
yes
?- db_ assert(helloWorld('A' ,'Ana',31)).
yes
@end example
This would generate the following query:
@example
INSERT INTO helloWorld
VALUES ('A','Ana',3)
@end example
which would insert the row @code{ A,Ana,31} into the @code{ helloWorld}
relation. If we want to insert @code{ NULL} values into the
relation, we call @code{ db_ assert/2} with an uninstantiated variable in
the corresponding argument of the imported predicate. For example, the
following query in YAP:
@example
?- db_ assert(helloWorld('A',NULL,31)).
yes
@end example
would insert the row @code{ A,null value,31} into the relation
@code{ Hello World} , assuming that the second attribute allows null values.
@table @code
@item db_ insert(+,+,+).
@findex db_ insert/3
@snindex db_ insert/3
@cnindex db_ insert/3
@item db_ insert(+,+).
@findex db_ insert/2
@snindex db_ insert/2
@cnindex db_ insert/2
@end table
This predicate creates a new database predicate that will insert
any given tuple into the database.
@example
?- db_ insert(Conn,RelationName,PredName).
?- db_ insert(RelationName,PredName).
@end example
This creates a new predicate with name @var{ PredName} that will
insert tuples into the relation @var{ RelationName} ; @var{ Conn} is the connection
identifier. For example, if we wanted to insert the new tuple
@code{ ('A',null,31)} into the relation @code{ Hello World} , we do:
@example
?- db_ insert('Hello World',helloWorldInsert).
yes
?- helloWorldInsert('A',NULL,31).
yes
@end example
@node Types of Attributes, Number of Fields, Insertion of Rows, MYDDAS
@section Types of Attributes
@table @code
@item db_ get_ attributes_ types(+,+,?).
@findex db_ get_ attributes_ types/3
@snindex db_ get_ attributes_ types/3
@cnindex db_ get_ attributes_ types/3
@item db_ get_ attributes_ types(+,?).
@findex db_ get_ attributes_ types/2
@snindex db_ get_ attributes_ types/2
@cnindex db_ get_ attributes_ types/2
@end table
The prototype for this predicate is the following:
@example
?- db_ get_ attributes_ types(Conn,RelationName,ListOfFields).
?- db_ get_ attributes_ types(RelationName,ListOfFields).
@end example
You can use the
predicate @code{ db_ get_ attributes_ types/2} or @code{ db_ get_ attributes_ types/3} to
find out the names and types of the fields of a given
relation. For example:
@example
?- db_ get_ attributes_ types(myddas,'Hello World',LA).
LA = ['Number',integer,'Name',string,'Letter',string] ?
yes
@end example
where @t{ Hello World} is the name of the relation and @t{ myddas} is the
connection identifier.
@node Number of Fields, Describing a Relation, Types of Attributes, MYDDAS
@section Number of Fields
@table @code
@item db_ number_ of_ fields(+,?).
@findex db_ number_ of_ fields/2
@snindex db_ number_ of_ fields/2
@cnindex db_ number_ of_ fields/2
@item db_ number_ of_ fields(+,+,?).
@findex db_ number_ of_ fields/3
@snindex db_ number_ of_ fields/3
@cnindex db_ number_ of_ fields/3
@end table
The prototype for this
predicate is the following:
@example
?- db_ number_ of_ fields(Conn,RelationName,Arity).
?- db_ number_ of_ fields(RelationName,Arity).
@end example
You can use the predicate @code{ db_ number_ of_ fields/2} or
@code{ db_ number_ of_ fields/3} to find out the arity of a given
relation. Example:
@example
?- db_ number_ of_ fields(myddas,'Hello World',Arity).
Arity = 3 ?
yes
@end example
where @code{ Hello World} is the name of the
relation and @code{ myddas} is the connection identifier.
@node Describing a Relation, Enumerating Relations, Number of Fields, MYDDAS
@section Describing a Relation
@table @code
@item db_ datalog_ describe(+,+).
@findex db_ datalog_ describe/2
@snindex db_ datalog_ describe/2
@cnindex db_ datalog_ describe/2
@item db_ datalog_ describe(+).
@findex db_ datalog_ describe/1
@snindex db_ datalog_ describe/1
@cnindex db_ datalog_ describe/1
@end table
The @code{ db_ datalog_ describe/2} predicate does not return any
value. It simply prints to the screen the result of the MySQL @code{ DESCRIBE}
command, the same way as @code{ DESCRIBE} would at the MySQL prompt.
@example
?- db_ datalog_ describe(myddas,'Hello World').
+----------+----------+------+-----+---------+-------+
| Field | Type | Null | Key | Default | Extra |
+----------+----------+------+-----+---------+-------+
| Number   | int(11)  | YES  |     | NULL    |       |
| Name     | char(10) | YES  |     | NULL    |       |
| Letter   | char(1)  | YES  |     | NULL    |       |
+----------+----------+------+-----+---------+-------+
yes
@end example
@table @code
@item db_ describe(+,+).
@findex db_ describe/2
@snindex db_ describe/2
@cnindex db_ describe/2
@item db_ describe(+).
@findex db_ describe/1
@snindex db_ describe/1
@cnindex db_ describe/1
@end table
The @code{ db_ describe/3} predicate performs the same action as the
@code{ db_ datalog_ describe/2} predicate, but with one major
difference: the results are returned by backtracking. For example, the
last query:
@example
?- db_ describe(myddas,'Hello World',Term).
Term = tableInfo('Number',int(11),'YES','',null(0),'') ? ;
Term = tableInfo('Name',char(10),'YES','',null(1),'') ? ;
Term = tableInfo('Letter',char(1),'YES','',null(2),'') ? ;
no
@end example
@node Enumerating Relations, The MYDDAS MySQL Top Level, Describing a Relation, MYDDAS
@section Enumerating Relations
@table @code
@item db_ datalog_ show_ tables(+).
@item db_ datalog_ show_ tables
@end table
If we need to know what relations exist in a given MySQL schema, we can use
the @code{ db_ datalog_ show_ tables/1} predicate. Like @t{ db_ datalog_ describe/2} ,
it does not return any value, but instead prints to the screen the result of the
@code{ SHOW TABLES} command, the same way as it would appear at the MySQL prompt.
@example
?- db_ datalog_ show_ tables(myddas).
+-----------------+
| Tables_ in_ guest |
+-----------------+
| Hello World |
+-----------------+
yes
@end example
@table @code
@item db_ show_ tables(+, ?).
@findex db_ show_ tables/2
@snindex db_ show_ tables/2
@cnindex db_ show_ tables/2
@item db_ show_ tables(?)
@findex db_ show_ tables/1
@snindex db_ show_ tables/1
@cnindex db_ show_ tables/1
@end table
The @code{ db_ show_ tables/2} predicate performs the same action as the
@code{ db_ datalog_ show_ tables/1} predicate, but with one major difference: the
results are returned by backtracking. For example, given the last query:
@example
?- db_ show_ tables(myddas,Table).
Table = table('Hello World') ? ;
no
@end example
@node The MYDDAS MySQL Top Level, Other MYDDAS Properties, Enumerating Relations, MYDDAS
@section The MYDDAS MySQL Top Level
@table @code
@item db_ top_ level(+,+,+,+,+).
@findex db_ top_ level/5
@snindex db_ top_ level/5
@cnindex db_ top_ level/5
@item db_ top_ level(+,+,+,+).
@findex db_ top_ level/4
@snindex db_ top_ level/4
@cnindex db_ top_ level/4
@end table
Through MYDDAS it is also possible to access the MySQL database server in
the same way as the @code{ mysql} client. In this mode, it is possible to query the
SQL server by just using standard SQL. This mode behaves exactly like
the standard @code{ mysql} client. We can use this
mode by invoking @code{ db_ top_ level/5} in one of the following ways:
@example
?- db_ top_ level(mysql,Connection,Host/Database,User,Password).
?- db_ top_ level(mysql,Connection,Host/Database/Port,User,Password).
?- db_ top_ level(mysql,Connection,Host/Database/UnixSocket,User,Password).
?- db_ top_ level(mysql,Connection,Host/Database/Port/UnixSocket,User,Password).
@end example
Usage is similar to that of the @code{ db_ open/5} predicate
discussed above. If the login is successful, the prompt of
the @code{ mysql} client is presented automatically. For example:
@example
?- db_ top_ level(mysql,con1,localhost/guest_ db,guest,'').
@end example
opens a
connection, identified by the @code{ con1} atom, to an instance of a MySQL server
running on host @code{ localhost} , using database @code{ guest_ db} and user @code{ guest} with
an empty password. After this it is possible to use MYDDAS as the @code{ mysql}
client.
@example
?- db_ top_ level(mysql,con1,localhost/guest_ db,guest,'').
Reading table information for completion of table and column names
You can turn off this feature to get a quicker startup with -A
Welcome to the MySQL monitor.
Commands end with ; or \g .
Your MySQL connection id is 4468 to server version: 4.0.20
Type 'help;' or '\h ' for help.
Type '\c ' to clear the buffer.
mysql> exit
Bye
yes
?-
@end example
@node Other MYDDAS Properties, , The MYDDAS MySQL Top Level , MYDDAS
@section Other MYDDAS Properties
@table @code
@item db_ verbose(+).
@item db_ top_ level(+,+,+,+).
@end table
When we pose a query to YAP using a predicate asserted by
@code{ db_ import/3} or by @code{ db_ view/3} , this will generate an SQL
query. If we want to see that query, we must do the following at a given
point in our YAP session.
@example
?- db_ verbose(1).
yes
?-
@end example
If we want to
disable this feature, we must call the @code{ db_ verbose/1} predicate with the value 0.
@table @code
@item db_ module(?).
@findex db_ module/1
@snindex db_ module/1
@cnindex db_ module/1
@end table
When we create a new database predicate, by using @code{ db_ import/3} ,
@code{ db_ view/3} or @code{ db_ insert/3} , that predicate will be asserted
by default on the @code{ user} module. If we want to change this value, we can
use the @code{ db_ module/1} predicate to do so.
@example
?- db_ module(lists).
yes
?-
@end example
After executing this predicate, all of the predicates asserted by the
predicates enumerated earlier will be created in the @code{ lists} module.
If we want to restore the default, we can manually set the
value back to @code{ user} . Example:
@example
?- db_ module(user).
yes
?-
@end example
We can also see in what module the predicates are being asserted by doing:
@example
?- db_ module(X).
X=user
yes
?-
@end example
@table @code
@item db_ my_ result_ set(?).
@findex db_ my_ result_ set/1
@snindex db_ my_ result_ set/1
@cnindex db_ my_ result_ set/1
@end table
The MySQL C API permits two modes for transferring the data generated by
a query to the client, in our case YAP. The first mode, and the default
mode used by MYDDAS-MySQL, is to store the result. This mode copies all the
information generated to the client side.
@example
?- db_ my_ result_ set(X).
X=store_ result
yes
@end example
The other mode that we can use is @code{ use_ result} . This one uses the result
set created directly on the server. If we want to use this mode, we
simply do:
@example
?- db_ my_ result_ set(use_ result).
yes
@end example
After this command, all
of the database predicates will use @code{ use_ result} by default. We can change
this back by calling @code{ db_ my_ result_ set(store_ result)} again.
@table @code
@item db_ my_ sql_ mode(+Conn,?SQL_ Mode).
@findex db_ my_ sql_ mode/2
@snindex db_ my_ sql_ mode/2
@cnindex db_ my_ sql_ mode/2
@item db_ my_ sql_ mode(?SQL_ Mode).
@findex db_ my_ sql_ mode/1
@snindex db_ my_ sql_ mode/1
@cnindex db_ my_ sql_ mode/1
@end table
The MySQL server allows the user to change the SQL mode. This can be
very useful for debugging purposes. For example, if we want the MySQL server
not to ignore warnings on @code{ INSERT} statements but to
report an error instead, we could use the following SQL mode.
@example
?- db_ my_ sql_ mode(traditional).
yes
@end example
You can see the available SQL Modes at the MySQL homepage at
@url{ http://www.mysql.org} .
@node Real, Threads, MYDDAS, Extensions
@chapter Real:: Talking to the R language
@ifplaintext
@copydoc real
@end ifplaintext
@node Threads, Parallelism, Real, Extensions
@chapter Threads
YAP implements a SWI-Prolog compatible multithreading
library. Like in SWI-Prolog, Prolog threads have their own stacks and
only share the Prolog @emph{ heap} : predicates, records, flags and other
global non-backtrackable data. The package is based on the POSIX thread
standard (Butenhof:1997:PPT) used on most popular systems except
for MS-Windows.
@comment On Windows it uses the
@comment \url [pthread-win32] { http://sources.redhat.com/pthreads-win32/} emulation
@comment of POSIX threads mixed with the Windows native API for smoother and
@comment faster operation.
@menu
Subnodes of Threads
* Creating and Destroying Prolog Threads::
* Monitoring Threads::
* Thread Communication::
* Thread Synchronisation::
Subnodes of Thread Communication
* Message Queues::
* Signalling Threads::
* Threads and Dynamic Predicates::
@end menu
@node Creating and Destroying Prolog Threads, Monitoring Threads, ,Threads
@section Creating and Destroying Prolog Threads
@table @code
@item thread_ create(:@var{ Goal} , -@var{ Id} , +@var{ Options} )
@findex thread_ create/3
@snindex thread_ create/3
@cnindex thread_ create/3
Create a new Prolog thread (and underlying C-thread) and start it
by executing @var{ Goal} . If the thread is created successfully, the
thread-identifier of the created thread is unified to @var{ Id} .
@var{ Options} is a list of options. Currently defined options are:
@table @code
@item stack
Set the limit in K-Bytes to which the Prolog stacks of
this thread may grow. If omitted, the limit of the calling thread is
used. See also the commandline @code{ -S} option.
@item trail
Set the limit in K-Bytes to which the trail stack of this thread may
grow. If omitted, the limit of the calling thread is used. See also the
commandline option @code{ -T} .
@item alias
Associate an alias-name with the thread. This name may be used to
refer to the thread and remains valid until the thread is joined
(see @code{ thread_ join/2} ).
@item at_ exit
Define an exit hook for the thread. This hook is called when the thread
terminates, no matter its exit status.
@item detached
If @code{ false} (default), the thread can be waited for using
@code{ thread_ join/2} . @code{ thread_ join/2} must be called on this thread
to reclaim all resources associated with the thread. If @code{ true} ,
the system will reclaim all associated resources automatically after the
thread finishes. Please note that thread identifiers are freed for reuse
after a detached thread finishes or a normal thread has been joined.
See also @code{ thread_ join/2} and @code{ thread_ detach/1} .
@end table
The @var{ Goal} argument is @emph{ copied} to the new Prolog engine.
This implies that further instantiation of this term in either thread has
no consequences for the other thread: Prolog threads do not share
data from their stacks.
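For instance, a minimal sketch (assuming a user-defined predicate
@code{ job/0} ) that creates a named worker thread and then waits for its
result could be:
@example
?- thread_ create(job, _ Id, [alias(worker), detached(false)]),
   thread_ join(worker, Status).
Status = true ?
@end example
Since @code{ job} runs in its own Prolog engine, any bindings it produces
are not visible to the calling thread; only the exit @var{ Status} is.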
@item thread_ create(:@var{ Goal} , -@var{ Id} )
@findex thread_ create/2
@snindex thread_ create/2
@cnindex thread_ create/2
Create a new Prolog thread using default options. See @code{ thread_ create/3} .
@item thread_ create(:@var{ Goal} )
@findex thread_ create/1
@snindex thread_ create/1
@cnindex thread_ create/1
Create a new Prolog detached thread using default options. See @code{ thread_ create/3} .
@item thread_ self(-@var{ Id} )
@findex thread_ self/1
@snindex thread_ self/1
@cnindex thread_ self/1
Get the Prolog thread identifier of the running thread. If the thread
has an alias, the alias-name is returned.
@item thread_ join(+@var{ Id} , -@var{ Status} )
@findex thread_ join/2
@snindex thread_ join/2
@cnindex thread_ join/2
Wait for the termination of thread with given @var{ Id} . Then unify the
result-status of the thread with @var{ Status} . After this call,
@var{ Id} becomes invalid and all resources associated with the thread
are reclaimed. Note that threads with the attribute @code{ detached}
@code{ true} cannot be joined. See also @code{ current_ thread/2} .
A thread that has been completed without @code{ thread_ join/2} being
called on it is partly reclaimed: the Prolog stacks are released and the
C-thread is destroyed. A small data-structure representing the
exit-status of the thread is retained until @code{ thread_ join/2} is called on
the thread. Defined values for @var{ Status} are:
@table @code
@item true
The goal has been proven successfully.
@item false
The goal has failed.
@item exception(@var{ Term} )
The thread is terminated on an
exception. See @code{ print_ message/2} to turn system exceptions into
readable messages.
@item exited(@var{ Term} )
The thread is terminated on @code{ thread_ exit/1} using the argument @var{ Term} .
@end table
@item thread_ detach(+@var{ Id} )
@findex thread_ detach/1
@snindex thread_ detach/1
@cnindex thread_ detach/1
Switch the thread into detached state (see the @code{ detached} option of
@code{ thread_ create/3} ) at runtime. @var{ Id} is the identifier of the thread
placed in detached state.
One of the possible applications is to simplify debugging. Threads that
are created as @code{ detached} leave no traces if they crash. For
not-detached threads the status can be inspected using
@code{ current_ thread/2} . Threads nobody is waiting for may be created
normally and detach themselves just before completion. This way they
leave no traces on normal completion and their reason for failure can be
inspected.
@item thread_ yield
@findex thread_ yield/0
@snindex thread_ yield/0
@cnindex thread_ yield/0
Voluntarily relinquish the processor.
@item thread_ exit(+@var{ Term} )
@findex thread_ exit/1
@snindex thread_ exit/1
@cnindex thread_ exit/1
Terminates the thread immediately, leaving @code{ exited(@var{ Term} )} as
result-state for @code{ thread_ join/2} . If the thread has the attribute
@code{ detached} @code{ true} it terminates, but its exit status cannot be
retrieved using @code{ thread_ join/2} making the value of @var{ Term}
irrelevant. The Prolog stacks and C-thread are reclaimed.
@item thread_ at_ exit(:@var{ Goal} )
@findex thread_ at_ exit/1
@snindex thread_ at_ exit/1
@cnindex thread_ at_ exit/1
Run @var{ Goal} just before releasing the thread resources. This is to
be compared to @code{ at_ halt/1} , but only for the current
thread. These hooks are run regardless of why the execution of the
thread has been completed. As these hooks are run, the return-code is
already available through @code{ thread_ property/2} using the result of
@code{ thread_ self/1} as thread-identifier. If you want to guarantee the
execution of an exit hook no matter how the thread terminates (the thread
can be aborted before reaching the @code{ thread_ at_ exit/1} call), consider
using instead the @code{ at_ exit/1} option of @code{ thread_ create/3} .
@item thread_ setconcurrency(+@var{ Old} , -@var{ New} )
@findex thread_ setconcurrency/2
@snindex thread_ setconcurrency/2
@cnindex thread_ setconcurrency/2
Determine the concurrency of the process, which is defined as the
maximum number of concurrently active threads. `Active' here means
they are using CPU time. This option is provided if the
thread-implementation provides
@code{ pthread_ setconcurrency()} . Solaris is a typical example of this
family. On other systems this predicate unifies @var{ Old} to 0 (zero)
and succeeds silently.
@item thread_ sleep(+@var{ Time} )
@findex thread_ sleep/1
@snindex thread_ sleep/1
@cnindex thread_ sleep/1
Make the current thread sleep for @var{ Time} seconds. @var{ Time} may be an
integer or a floating point number. When time is zero or a negative value
the call succeeds and returns immediately. This call should not be used if
alarms are also being used.
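A minimal sketch, pausing the calling thread for half a second:
@example
?- thread_ sleep(0.5).
yes
@end example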
@end table
@node Monitoring Threads, Thread Communication,Creating and Destroying Prolog Threads,Threads
@section Monitoring Threads
Normal multi-threaded applications should not need the predicates
from this section because almost any usage of these predicates is
unsafe. For example checking the existence of a thread before signalling
it is of no use as it may vanish between the two calls. Catching
exceptions using @code{ catch/3} is the only safe way to deal with
thread-existence errors.
These predicates are provided for diagnosis and monitoring tasks.
@table @code
@item thread_ property(?@var{ Id} , ?@var{ Property} )
@findex thread_ property/2
@snindex thread_ property/2
@cnindex thread_ property/2
Enumerates the properties of the specified thread.
Calling @code{ thread_ property/2} does not influence any thread. See also
@code{ thread_ join/2} . For threads that have an alias-name, this name can
be used in @var{ Id} instead of the numerical thread identifier.
@var{ Property} is one of:
@table @code
@item status(@var{ Status} )
The thread status of a thread (see below).
@item alias(@var{ Alias} )
The thread alias, if it exists.
@item at_ exit(@var{ AtExit} )
The thread exit hook, if defined (not available if the thread is already terminated).
@item detached(@var{ Boolean} )
The detached state of the thread.
@item stack(@var{ Size} )
The thread stack data-area size.
@item trail(@var{ Size} )
The thread trail data-area size.
@item system(@var{ Size} )
The thread system data-area size.
@end table
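For instance, a sketch that queries the status of the calling thread,
which should unify with @code{ running} :
@example
?- thread_ self(Id), thread_ property(Id, status(Status)).
@end example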
@item current_ thread(+@var{ Id} , -@var{ Status} )
@findex current_ thread/2
@snindex current_ thread/2
@cnindex current_ thread/2
Enumerates identifiers and status of all currently known threads.
Calling @code{ current_ thread/2} does not influence any thread. See also
@code{ thread_ join/2} . For threads that have an alias-name, this name is
returned in @var{ Id} instead of the numerical thread identifier.
@var{ Status} is one of:
@table @code
@item running
The thread is running. This is the initial status of a thread. Please
note that threads waiting for something are considered running too.
@item false
The @var{ Goal} of the thread has been completed and failed.
@item true
The @var{ Goal} of the thread has been completed and succeeded.
@item exited(@var{ Term} )
The @var{ Goal} of the thread has been terminated using @code{ thread_ exit/1}
with @var{ Term} as argument. If the underlying native thread has
exited (using pthread_ exit()) @var{ Term} is unbound.
@item exception(@var{ Term} )
The @var{ Goal} of the thread has been terminated due to an uncaught
exception (see @code{ throw/1} and @code{ catch/3} ).
@end table
@item thread_ statistics(+@var{ Id} , +@var{ Key} , -@var{ Value} )
@findex thread_ statistics/3
@snindex thread_ statistics/3
@cnindex thread_ statistics/3
Obtains statistical information on thread @var{ Id} as @code{ statistics/2}
does in single-threaded applications. This call returns all keys
of @code{ statistics/2} , although only statistics about the
stacks and CPU time yield different values for each thread.
@item mutex_ statistics
@findex mutex_ statistics/0
@snindex mutex_ statistics/0
@cnindex mutex_ statistics/0
Print usage statistics on internal mutexes and mutexes associated
with dynamic predicates. For each mutex two numbers are printed:
the number of times the mutex was acquired and the number of
collisions: the number of times the calling thread had to
wait for the mutex. The collision-count is not available on
Windows as this would break portability to Windows-95/98/ME or
significantly harm performance. Generally the collision count is
close to zero on single-CPU hardware.
@item threads
@findex threads/0
@snindex threads/0
@cnindex threads/0
Prints a table of current threads and their status.
@end table
@node Thread Communication, Thread Synchronisation, Monitoring Threads, Threads
@section Thread Communication
@menu
Subnodes of Thread Communication
* Message Queues::
* Signalling Threads::
* Threads and Dynamic Predicates::
@end menu
@node Message Queues, Signalling Threads, ,Thread Communication
@subsection Message Queues
Prolog threads can exchange data using dynamic predicates, database
records, and other globally shared data. These provide no suitable means
to wait for data or a condition as they can only be checked in an
expensive polling loop. @emph{ Message queues} provide a means for
threads to wait for data or conditions without using the CPU.
Each thread has a message-queue attached to it that is identified
by the thread. Additional queues are created using
@code{ message_ queue_ create/2} .
@table @code
@item thread_ send_ message(+@var{ Term} )
@findex thread_ send_ message/1
@snindex thread_ send_ message/1
@cnindex thread_ send_ message/1
Places @var{ Term} in the message-queue of the thread running the goal.
Any term can be placed in a message queue, but note that the term is
copied to the receiving thread and variable-bindings are thus lost.
This call returns immediately.
@item thread_ send_ message(+@var{ QueueOrThreadId} , +@var{ Term} )
@findex thread_ send_ message/2
@snindex thread_ send_ message/2
@cnindex thread_ send_ message/2
Place @var{ Term} in the given queue or default queue of the indicated
thread (which can even be its own message queue; see
@code{ thread_ self/1} ). Any term can be placed in a message queue, but note that
the term is copied to the receiving thread and variable-bindings are
thus lost. This call returns immediately.
If more than one thread is waiting for messages on the given queue and
at least one of these is waiting with a partially instantiated
@var{ Term} , the waiting threads are @emph{ all} sent a wakeup signal,
starting a rush for the available messages in the queue. This behaviour
can seriously harm performance with many threads waiting on the same
queue as all-but-the-winner perform a useless scan of the queue. If
there is only one waiting thread or all waiting threads wait with an
unbound variable an arbitrary thread is restarted to scan the queue.
@comment \footnote { See the documentation for the POSIX thread functions
@comment pthread_ cond_ signal() v.s.\ pthread_ cond_ broadcastt()
@comment for background information.}
@item thread_ get_ message(?@var{ Term} )
@findex thread_ get_ message/1
@snindex thread_ get_ message/1
@cnindex thread_ get_ message/1
Examines the thread message-queue and if necessary blocks execution
until a term that unifies to @var{ Term} arrives in the queue. After
a term from the queue has been unified to @var{ Term} , the
term is deleted from the queue and this predicate returns.
Please note that not-unifying messages remain in the queue. After
the following has been executed, thread 1 has the term @code{ b(gnu)}
in its queue and continues execution with @var{ A} bound to @code{ gnat} .
@example
<thread 1>
thread_ get_ message(a(A)),
<thread 2>
thread_ send_ message(b(gnu)),
thread_ send_ message(a(gnat)),
@end example
See also @code{ thread_ peek_ message/1} .
@item message_ queue_ create(?@var{ Queue} )
@findex message_ queue_ create/1
@snindex message_ queue_ create/1
@cnindex message_ queue_ create/1
If @var{ Queue} is an atom, create a named queue. To avoid ambiguity
on @code{ thread_ send_ message/2} , the name of a queue may not be in use
as a thread-name. If @var{ Queue} is unbound an anonymous queue is
created and @var{ Queue} is unified to its identifier.
@item message_ queue_ destroy(+@var{ Queue} )
@findex message_ queue_ destroy/1
@snindex message_ queue_ destroy/1
@cnindex message_ queue_ destroy/1
Destroy a message queue created with @code{ message_ queue_ create/1} . It is
@emph{ not} allowed to destroy the queue of a thread. Neither is it
allowed to destroy a queue other threads are waiting for or, for
anonymous message queues, may try to wait for later.
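A sketch of creating an anonymous queue, passing one message through it
and destroying it afterwards:
@example
?- message_ queue_ create(Q),
   thread_ send_ message(Q, hello),
   thread_ get_ message(Q, M),
   message_ queue_ destroy(Q).
M = hello ?
@end example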
@item thread_ get_ message(+@var{ Queue} , ?@var{ Term} )
@findex thread_ get_ message/2
@snindex thread_ get_ message/2
@cnindex thread_ get_ message/2
As @code{ thread_ get_ message/1} , but operating on a given queue. It is allowed to
peek into another thread's message queue, an operation that can be used
to check whether a thread has swallowed a message sent to it.
@item thread_ peek_ message(?@var{ Term} )
@findex thread_ peek_ message/1
@snindex thread_ peek_ message/1
@cnindex thread_ peek_ message/1
Examines the thread message-queue and compares the queued terms
with @var{ Term} until one unifies or the end of the queue has been
reached. In the first case the call succeeds (possibly instantiating
@var{ Term} ). If no term from the queue unifies, this call fails.
@item thread_ peek_ message(+@var{ Queue} , ?@var{ Term} )
@findex thread_ peek_ message/2
@snindex thread_ peek_ message/2
@cnindex thread_ peek_ message/2
As @code{ thread_ peek_ message/1} , operating on a given queue. It is allowed to
peek into another thread's message queue, an operation that can be used
to check whether a thread has swallowed a message sent to it.
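As a sketch, checking without blocking whether a message of the form
@code{ done(_ )} is already waiting in a queue named @code{ jobs} (a
hypothetical queue created with @code{ message_ queue_ create/1} ):
@example
?- ( thread_ peek_ message(jobs, done(Id)) ->
     format('job ~w finished~n', [Id])
   ; true
   ).
@end example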
@end table
Explicit message queues are designed with the @emph{ worker-pool} model
in mind, where multiple threads wait on a single queue and pick up the
first goal to execute. Below is a simple implementation where the
workers execute arbitrary Prolog goals. Note that this example provides
no means to tell when all work is done. This must be realised using
additional synchronisation.
@example
% create_workers(+Id, +N)
%
% Create a pool with given Id and number of workers.
create_ workers(Id, N) :-
message_ queue_ create(Id),
forall(between(1, N, _ ),
thread_ create(do_ work(Id), _ , [])).
do_ work(Id) :-
repeat,
thread_ get_ message(Id, Goal),
( catch(Goal, E, print_ message(error, E))
-> true
; print_ message(error, goal_ failed(Goal, worker(Id)))
),
fail.
% work(+Id, +Goal)
%
% Post work to be done by the pool
work(Id, Goal) :-
thread_ send_ message(Id, Goal).
@end example
@node Signalling Threads, Threads and Dynamic Predicates,Message Queues, Thread Communication
@subsection Signalling Threads
These predicates provide a mechanism to make another thread execute some
goal as an @emph{ interrupt} . Signalling threads is safe as these
interrupts are only checked at safe points in the virtual machine.
Nevertheless, signalling in multi-threaded environments should be
handled with care as the receiving thread may hold a @emph{ mutex}
(see @code{ with_ mutex/2} ). Signalling probably only makes sense to start
debugging threads and to cancel no-longer-needed threads with @code{ throw/1} ,
where the receiving thread should be carefully designed to handle
exceptions at any point.
@table @code
@item thread_ signal(+@var{ ThreadId} , :@var{ Goal} )
@findex thread_ signal/2
@snindex thread_ signal/2
@cnindex thread_ signal/2
Make thread @var{ ThreadId} execute @var{ Goal} at the first
opportunity. In the current implementation, this implies at the first
pass through the @emph{ Call-port} . The predicate @code{ thread_ signal/2}
itself places @var{ Goal} into the signalled-thread's signal queue
and returns immediately.
Signals (interrupts) do not cooperate well with the world of
multi-threading, mainly because the status of mutexes cannot be
guaranteed easily. At the call-port, the Prolog virtual machine
holds no locks and therefore the asynchronous execution is safe.
@var{ Goal} can be any valid Prolog goal, including @code{ throw/1} to make
the receiving thread generate an exception and @code{ trace/0} to start
tracing the receiving thread.
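For example, a sketch that asks a thread created with the (hypothetical)
alias @code{ worker} to give up its current work by raising an exception:
@example
?- thread_ signal(worker, throw(stop_ work)).
yes
@end example
The @code{ worker} thread should be prepared to @code{ catch/3} the
@code{ stop_ work} exception at any point of its execution.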
@comment In the Windows version, the receiving thread immediately executes
@comment the signal if it reaches a Windows GetMessage() call, which generally
@comment happens of the thread is waiting for (user-)input.
@end table
@node Threads and Dynamic Predicates, , Signalling Threads, Thread Communication
@subsection Threads and Dynamic Predicates
Besides queues threads can share and exchange data using dynamic
predicates. The multi-threaded version knows about two types of
dynamic predicates. By default, a predicate declared @emph{ dynamic}
(see @code{ dynamic/1} ) is shared by all threads. Each thread may
assert, retract and run the dynamic predicate. Synchronisation inside
Prolog guarantees the consistency of the predicate. Updates are
@emph{ logical} : visible clauses are not affected by assert/retract
after a query started on the predicate. In many cases, primitives from
thread synchronisation should be used to ensure that application invariants on
the predicate are maintained.
Besides shared predicates, dynamic predicates can be declared with the
@code{ thread_ local/1} directive. Such predicates share their
attributes, but the clause-list is different in each thread.
@table @code
@item thread_ local(@var{ +Functor/Arity} )
@findex thread_ local/1 (directive)
@snindex thread_ local/1 (directive)
@cnindex thread_ local/1 (directive)
This directive is related to the @code{ dynamic/1} directive. It tells the system that the
predicate may be modified using @code{ assert/1} , @code{ retract/1} ,
etc., during execution of the program. Unlike normal shared dynamic
data, however, each thread has its own clause-list for the predicate.
As a thread starts, this clause list is empty. If there are still
clauses as the thread terminates these are automatically reclaimed by
the system. The @code{ thread_ local} property implies
the property @code{ dynamic} .
Thread-local dynamic predicates are intended for maintaining
thread-specific state or intermediate results of a computation.
It is not recommended to put clauses for a thread-local predicate into
a file as in the example below as the clause is only visible from the
thread that loaded the source-file. All other threads start with an
empty clause-list.
@example
:- thread_ local
foo/1.
foo(gnat).
@end example
@end table
@node Thread Synchronisation, , Thread Communication, Threads
@section Thread Synchronisation
All internal Prolog operations are thread-safe. This implies two Prolog
threads can operate on the same dynamic predicate without corrupting the
consistency of the predicate. This section deals with user-level
@emph{ mutexes} (called @emph{ monitors} in ADA or
@emph{ critical-sections} by Microsoft). A mutex is a
@emph{ MUT} ual @emph{ EX} clusive device, which implies at most one thread
can @emph{ hold} a mutex.
Mutexes are used to realise related updates to the Prolog database.
With `related', we refer to the situation where a `transaction' implies
two or more changes to the Prolog database. For example, we have a
predicate @code{ address/2} , representing the address of a person and we want
to change the address by retracting the old and asserting the new
address. Between these two operations the database is invalid: this
person has either no address or two addresses, depending on the
assert/retract order.
Here is how to realise a correct update:
@example
:- initialization
mutex_ create(addressbook).
change_ address(Id, Address) :-
mutex_ lock(addressbook),
retractall(address(Id, _ )),
asserta(address(Id, Address)),
mutex_ unlock(addressbook).
@end example
@table @code
@item mutex_ create(?@var{ MutexId} )
@findex mutex_ create/1
@snindex mutex_ create/1
@cnindex mutex_ create/1
Create a mutex. If @var{ MutexId} is an atom, a @emph{ named} mutex is
created. If it is a variable, an anonymous mutex reference is returned.
There is no limit to the number of mutexes that can be created.
@item mutex_ destroy(+@var{ MutexId} )
@findex mutex_ destroy/1
@snindex mutex_ destroy/1
@cnindex mutex_ destroy/1
Destroy a mutex. After this call, @var{ MutexId} becomes invalid and
2004-03-08 15:38:36 +00:00
further references yield an @code{ existence_ error} exception.
2004-03-05 17:27:53 +00:00
@findex with_ mutex/2
@snindex with_ mutex/2
@cnindex with_ mutex/2
Execute @var{ Goal} while holding @var{ MutexId} . If @var{ Goal} leaves
choicepoints, these are destroyed (as in @code{ once/1} ). The mutex is unlocked
regardless of whether @var{ Goal} succeeds, fails or raises an exception.
An exception thrown by @var{ Goal} is re-thrown after the mutex has been
successfully unlocked. See also @code{ mutex_ create/1} .
Although described in the thread-section, this predicate is also
available in the single-threaded version, where it behaves simply as
@code{ once/1} .
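The address-book update shown above can also be written with
@code{ with_ mutex/2} , which unlocks the mutex even if the update raises an
exception; a sketch:
@example
change_ address(Id, Address) :-
    with_ mutex(addressbook,
                ( retractall(address(Id, _ )),
                  asserta(address(Id, Address)) )).
@end example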
@item mutex_ lock(+@var{ MutexId} )
@findex mutex_ lock/1
@snindex mutex_ lock/1
@cnindex mutex_ lock/1
Lock the mutex. Prolog mutexes are @emph{ recursive} mutexes: they
can be locked multiple times by the same thread. Only after unlocking
it as many times as it has been locked does the mutex become available for
locking by other threads. If another thread has locked the mutex, the
calling thread is suspended until the mutex is unlocked.
If @var{ MutexId} is an atom, and there is no current mutex with that
name, the mutex is created automatically using @code{ mutex_ create/1} . This
implies named mutexes need not be declared explicitly.
Please note that locking and unlocking mutexes should be paired
carefully. Especially make sure to unlock mutexes even if the protected
code fails or raises an exception. For most common cases use
@code{ with_ mutex/2} , which provides a safer way for handling Prolog-level
mutexes.
@item mutex_ trylock(+@var{ MutexId} )
@findex mutex_ trylock/1
@snindex mutex_ trylock/1
@cnindex mutex_ trylock/1
As @code{ mutex_ lock/1} , but if the mutex is held by another thread, this
predicate fails immediately.
@item mutex_ unlock(+@var{ MutexId} )
@findex mutex_ unlock/1
@snindex mutex_ unlock/1
@cnindex mutex_ unlock/1
2004-03-05 17:27:53 +00:00
Unlock the mutex. This can only be called if the mutex is held by the
calling thread. If this is not the case, a @code{ permission_ error}
exception is raised.
@item mutex_ unlock_ all
@findex mutex_ unlock_ all/0
@snindex mutex_ unlock_ all/0
@cnindex mutex_ unlock_ all/0
Unlock all mutexes held by the current thread. This call is especially
useful to handle thread-termination using @code{ abort/0} or exceptions. See
also @code{ thread_ signal/2} .
@item current_ mutex(?@var{ MutexId} , ?@var{ ThreadId} , ?@var{ Count} )
@findex current_ mutex/3
@snindex current_ mutex/3
@cnindex current_ mutex/3
Enumerates all existing mutexes. If the mutex is held by some thread,
@var{ ThreadId} is unified with the identifier of the holding thread and
@var{ Count} with the recursive count of the mutex. Otherwise,
@var{ ThreadId} is @code{ []} and @var{ Count} is 0.
@end table
@node Parallelism, Tabling, Threads, Extensions
@section Parallelism
@cindex parallelism
@cindex or-parallelism
There has been a sizeable amount of work on an or-parallel
implementation for YAP, called @strong{ YAPOr} . Most of this work has
been performed by Ricardo Rocha. In this system parallelism is exploited
implicitly by running several alternatives in or-parallel. This option
can be enabled from the @code{ configure} script or by checking the
system's @code{ Makefile} .
@strong{ YAPOr} is still a very experimental system, going through rapid
development. The following restrictions are of note:
@itemize @bullet
@item @strong{ YAPOr} currently only supports the Linux/X86 and SPARC/Solaris
platforms. Porting to other Unix-like platforms should be straightforward.
@item @strong{ YAPOr} does not support parallel updates to the
data-base.
@item @strong{ YAPOr} does not support opening or closing of streams during
parallel execution.
@item Garbage collection and stack shifting are not supported in
@strong{ YAPOr} .
@item Built-ins that cause side-effects can only be executed when
left-most in the search-tree. There are no primitives to provide
asynchronous or cavalier execution of these built-ins, as in Aurora or
Muse.
@item YAP does not support voluntary suspension of work.
@end itemize
We expect that some of these restrictions will be removed in future
releases.
@node Tabling, Low Level Tracing, Parallelism , Extensions
@section Tabling
@cindex tabling
@strong{ YAPTab} is the tabling engine that extends YAP's execution
model to support tabled evaluation for definite programs. YAPTab was
implemented by Ricardo Rocha and its implementation is largely based
on the ground-breaking design of the XSB Prolog system, which
implements the SLG-WAM. Tables are implemented using tries and YAPTab
supports the dynamic intermixing of batched scheduling and local
scheduling at the subgoal level. Currently, the following restrictions
are of note:
@itemize @bullet
@item YAPTab does not handle tabled predicates with loops through negation (undefined behaviour).
@item YAPTab does not handle tabled predicates with cuts (undefined behaviour).
@item YAPTab does not support coroutining (configure error).
@item YAPTab does not support tabling dynamic predicates (permission error).
@end itemize
To experiment with YAPTab use @code{ --enable-tabling} in the configure
script or add @code{ -DTABLING} to @code{ YAP_ EXTRAS} in the system's
@code{ Makefile} . We next describe the set of built-in predicates
designed to interact with YAPTab and control tabled execution:
@table @code
@item table +@var{ P}
@findex table/1
@snindex table/1
@cnindex table/1
Declares predicate @var{ P} (or a list of predicates
@var{ P1} ,...,@var{ Pn} or [@var{ P1} ,...,@var{ Pn} ]) as a tabled
predicate. @var{ P} must be written in the form
@var{ name/arity} . Examples:
@example
:- table son/3.
:- table father/2.
:- table mother/2.
@end example
@noindent or
@example
:- table son/3, father/2, mother/2.
@end example
@noindent or
@example
:- table [son/3, father/2, mother/2].
@end example
@item is_ tabled(+@var{ P} )
@findex is_ tabled/1
@snindex is_ tabled/1
@cnindex is_ tabled/1
Succeeds if the predicate @var{ P} (or a list of predicates
@var{ P1} ,...,@var{ Pn} or [@var{ P1} ,...,@var{ Pn} ]), of the form
@var{ name/arity} , is a tabled predicate.
@item tabling_ mode(+@var{ P} ,?@var{ Mode} )
@findex tabling_ mode/2
@snindex tabling_ mode/2
@cnindex tabling_ mode/2
Sets or reads the default tabling mode for a tabled predicate @var{ P}
(or a list of predicates @var{ P1} ,...,@var{ Pn} or
[@var{ P1} ,...,@var{ Pn} ]). The list of @var{ Mode} options includes:
@table @code
@item batched
Defines that, by default, batched scheduling is the scheduling
strategy to be used to evaluate calls to predicate @var{ P} .
@item local
Defines that, by default, local scheduling is the scheduling
strategy to be used to evaluate calls to predicate @var{ P} .
@item exec_ answers
Defines that, by default, when a call to predicate @var{ P} is
already evaluated (completed), answers are obtained by executing
compiled WAM-like code directly from the trie data
structure. This reduces the loading time when backtracking, but
the order in which answers are obtained is undefined.
@item load_ answers
Defines that, by default, when a call to predicate @var{ P} is
already evaluated (completed), answers are obtained (as a
consumer) by loading them from the trie data structure. This
guarantees that answers are obtained in the same order as they
were found. This is somewhat less efficient, but creates fewer choice-points.
@end table
The default tabling mode for a new tabled predicate is @code{ batched}
and @code{ exec_ answers} . To set the tabling mode for all predicates at
once you can use the @code{ yap_ flag/2} predicate as described next.
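As a minimal sketch, assuming a tabled predicate @code{ path/2} , one could
select local scheduling for that predicate only:
@example
:- table path/2.
:- tabling_ mode(path/2, local).
@end example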
@item yap_ flag(tabling_ mode,?@var{ Mode} )
@findex tabling_ mode (yap_ flag/2 option)
Sets or reads the tabling mode for all tabled predicates. The list of
@var{ Mode} options includes:
@table @code
@item default
Defines that (i) all calls to tabled predicates are evaluated
using the predicate default mode, and that (ii) answers for all
completed calls are obtained by using the predicate default mode.
@item batched
Defines that all calls to tabled predicates are evaluated using
batched scheduling. This option ignores the default tabling mode
of each predicate.
@item local
Defines that all calls to tabled predicates are evaluated using
local scheduling. This option ignores the default tabling mode
of each predicate.
@item exec_ answers
Defines that answers for all completed calls are obtained by
executing compiled WAM-like code directly from the trie data
structure. This option ignores the default tabling mode
of each predicate.
@item load_ answers
Defines that answers for all completed calls are obtained by
loading them from the trie data structure. This option ignores
the default tabling mode of each predicate.
2006-04-05 01:16:55 +01:00
@end table
2006-04-21 19:39:38 +01:00
@item abolish_ table(+@var{ P} )
@findex abolish_ table/1
@snindex abolish_ table/1
@cnindex abolish_ table/1
Removes all the entries from the table space for predicate @var{ P} (or
a list of predicates @var{ P1} ,...,@var{ Pn} or
[@var{ P1} ,...,@var{ Pn} ]). The predicate remains as a tabled predicate.
@item abolish_ all_ tables/0
@findex abolish_ all_ tables/0
@snindex abolish_ all_ tables/0
@cnindex abolish_ all_ tables/0
Removes all the entries from the table space for all tabled
predicates. The predicates remain as tabled predicates.
@item show_ table(+@var{ P} )
@findex show_ table/1
@snindex show_ table/1
@cnindex show_ table/1
Prints table contents (subgoals and answers) for predicate @var{ P}
(or a list of predicates @var{ P1} ,...,@var{ Pn} or
[@var{ P1} ,...,@var{ Pn} ]).
@item table_ statistics(+@var{ P} )
@findex table_ statistics/1
@snindex table_ statistics/1
@cnindex table_ statistics/1
Prints table statistics (subgoals and answers) for predicate @var{ P}
(or a list of predicates @var{ P1} ,...,@var{ Pn} or
[@var{ P1} ,...,@var{ Pn} ]).
@item tabling_ statistics/0
@findex tabling_ statistics/0
@snindex tabling_ statistics/0
@cnindex tabling_ statistics/0
Prints statistics on space used by all tables.
@end table
@node Low Level Tracing, Low Level Profiling, Tabling, Extensions
@section Tracing at Low Level
It is possible to follow the flow at abstract machine level if
YAP is compiled with the flag @code{ LOW_ LEVEL_ TRACER} . Note
that this option is of most interest to implementers, as it quickly generates
a huge amount of information.
Low level tracing can be toggled from an interrupt handler by using the
option @code{ T} . There are also two built-ins that activate and
deactivate low level tracing:
@table @code
@item start_ low_ level_ trace
@findex start_ low_ level_ trace/0
@snindex start_ low_ level_ trace/0
@cnindex start_ low_ level_ trace/0
Begin display of messages at procedure entry and retry.
@item stop_ low_ level_ trace
@findex stop_ low_ level_ trace/0
@snindex stop_ low_ level_ trace/0
@cnindex stop_ low_ level_ trace/0
Stop display of messages at procedure entry and retry.
@end table
Note that this compile-time option will slow down execution.
@node Low Level Profiling, , Low Level Tracing, Extensions
2014-04-21 11:14:18 +01:00
@section Profiling the Abstract Machine
Implementors may be interested in detecting which abstract machine
instructions are executed by a program. The @code{ ANALYST} flag can give
WAM level information. Note that this option slows down execution very
substantially, and is only of interest to developers of the system
internals, or to system debuggers.
@table @code
@item reset_ op_ counters
@findex reset_ op_ counters/0
@snindex reset_ op_ counters/0
@cnindex reset_ op_ counters/0
Reinitialize all counters.
@item show_ op_ counters(+@var{ A} )
@findex show_ op_ counters/1
@snindex show_ op_ counters/1
@cnindex show_ op_ counters/1
Display the current value for the counters, using label @var{ A} . The
label must be an atom.
@item show_ ops_ by_ group(+@var{ A} )
@findex show_ ops_ by_ group/1
@snindex show_ ops_ by_ group/1
@cnindex show_ ops_ by_ group/1
Display the current value for the counters, organized by groups, using
label @var{ A} . The label must be an atom.
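As a sketch, assuming a YAP built with the @code{ ANALYST} flag and a
user-defined predicate @code{ quicksort/2} , a profiling run could look like:
@example
?- reset_ op_ counters,
   quicksort([3,1,2], _ ),
   show_ op_ counters(qsort_ run).
@end example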
@end table
@node Debugging,Efficiency,Extensions,Top
@section Debugging
@menu
* Deb Preds:: Debugging Predicates
* Deb Interaction:: Interacting with the debugger
@end menu
@node Deb Preds, Deb Interaction, , Debugging
@section Debugging Predicates
The following predicates are available to control the debugging of
programs:
@table @code
@item debug
@findex debug/0
@saindex debug/0
@cyindex debug/0
Switches the debugger on.
@item debugging
@findex debugging/0
@syindex debugging/0
@cyindex debugging/0
Outputs status information about the debugger which includes the leash
mode and the existing spy-points, when the debugger is on.
@item nodebug
@findex nodebug/0
@syindex nodebug/0
@cyindex nodebug/0
Switches the debugger off.
@item spy +@var{ P}
@findex spy/1
@syindex spy/1
@cyindex spy/1
Sets spy-points on all the predicates represented by
@var{ P} . @var{ P} can either be a single specification or a list of
specifications. Each one must be of the form @var{ Name/Arity}
or @var{ Name} . In the latter case all predicates with the name
@var{ Name} will be spied. As in C-Prolog, system predicates and
predicates written in C cannot be spied.
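For example (assuming a user-defined predicate @code{ quicksort/2} ):
@example
?- spy quicksort/2.
@end example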
@item nospy +@var{ P}
@findex nospy/1
@syindex nospy/1
@cyindex nospy/1
Removes spy-points from all predicates specified by @var{ P} .
The possible forms for @var{ P} are the same as in @code{ spy P} .
@item nospyall
@findex nospyall/0
@syindex nospyall/0
@cnindex nospyall/0
Removes all existing spy-points.
@item leash(+@var{ M} )
@findex leash/1
@syindex leash/1
@cyindex leash/1
Sets leashing mode to @var{ M} .
The mode can be specified as:
@table @code
@item full
prompt on Call, Exit, Redo and Fail
@item tight
prompt on Call, Redo and Fail
@item half
prompt on Call and Redo
@item loose
prompt on Call
@item off
never prompt
@item none
never prompt, same as @code{ off}
@end table
@noindent
The initial leashing mode is @code{ full} .
@noindent
The user may also directly specify the debugger ports
where they want to be prompted. If the argument for leash
is a number @var{ N} , each of the lower four bits of the number is used to
control prompting at one of the ports of the box model. The debugger will
prompt according to the following conditions:
@itemize @bullet
@item
if @code{ N/\ 1 =\= 0} prompt on fail
@item
if @code{ N/\ 2 =\= 0} prompt on redo
@item
if @code{ N/\ 4 =\= 0} prompt on exit
@item
if @code{ N/\ 8 =\= 0} prompt on call
@end itemize
@noindent
Therefore, @code{ leash(15)} is equivalent to @code{ leash(full)} and
@code{ leash(0)} is equivalent to @code{ leash(off)} .
@noindent
Another way of using @code{ leash} is to give it a list with the names of
the ports where the debugger should stop. For example,
@code{ leash([call,exit,redo,fail])} is the same as @code{ leash(full)} or
@code{ leash(15)} and @code{ leash([fail])} might be used instead of
@code{ leash(1)} .
@item spy_ write(+@var{ Stream} ,Term)
@findex spy_ write/2
@snindex spy_ write/2
@cnindex spy_ write/2
If defined by the user, this predicate will be used to print goals by
the debugger instead of @code{ write/2} .
@item trace
@findex trace/0
@syindex trace/0
@cyindex trace/0
Switches on the debugger and starts tracing.
@item notrace
@findex notrace/0
@syindex notrace/0
@cyindex notrace/0
Ends tracing and exits the debugger. This is the same as
@code{ nodebug/0} .
@end table
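@noindent
Putting the above predicates together, a typical (purely illustrative) way to
prepare a debugging session is to set a spy-point on the predicate of interest
and restrict prompting to the ports one cares about; @code{ quicksort/2} is
just a stand-in for a user predicate, not part of YAP:

@example
?- spy quicksort/2, leash([call,fail]).
@end example

@noindent
Afterwards, @code{ nospyall} and @code{ nodebug} return the system to normal
execution.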
@node Deb Interaction, , Deb Preds, Debugging
@section Interacting with the debugger
Debugging with YAP is similar to debugging with C-Prolog. Both systems
include a procedural debugger, based on Byrd's four port model. In this
model, execution is seen at the procedure level: each activation of a
procedure is seen as a box with control flowing into and out of that
box.
In the four port model control is caught at four key points: before
entering the procedure, after exiting the procedure (meaning successful
evaluation of all queries activated by the procedure), after backtracking but
before trying a new alternative to the procedure, and after failing the
procedure. Each one of these points is named a port:
@smallexample
@group
*--------------------------------------*
Call | | Exit
---------> + descendant(X,Y) :- offspring(X,Y). + --------->
| |
| descendant(X,Z) :- |
<--------- + offspring(X,Y), descendant(Y,Z). + <---------
Fail | | Redo
*--------------------------------------*
@end group
@end smallexample
@table @code
@item Call
The call port is activated before the initial invocation of the
procedure. Afterwards, execution will try to match the goal with the
head of existing clauses for the procedure.
@item Exit
This port is activated if the procedure succeeds.
Control will now leave the procedure and return to its ancestor.
@item Redo
If the goal, or goals, activated after the call port
fail, then backtracking will eventually return control to this procedure
through the redo port.
@item Fail
If all clauses for this predicate fail, then the
invocation fails, and control will try to redo the ancestor of this
invocation.
@end table
To start debugging, the user either calls @code{ trace} or sets spy-points on the
relevant procedures, thus entering debug mode, and then starts execution of the
program. When the first spy-point is found, YAP's debugger takes
control and shows a message of the form:
@example
* (1) call: quicksort([1,2,3],_ 38) ?
@end example
The debugger message will be shown while creeping, or at spy-points,
and it includes four or five fields:
@itemize @bullet
@item
The first three characters are used to point out special states of the
debugger. If the port is exit and the first character is '?', the
current call is non-deterministic, that is, it still has alternatives to
be tried. If the second character is a @code{ *} , execution is at a
spy-point. If the third character is a @code{ >} , execution has returned
either from a skip, a fail or a redo command.
@item
The second field is the activation number, and uniquely identifies the
activation. The number will start from 1 and will be incremented for
each activation found by the debugger.
@item
In the third field, the debugger shows the active port.
@item
The fourth field is the goal. The goal is written by
@code{ write_ term/3} on the standard error stream, using the options
given by @code{ debugger_ print_ options} .
@end itemize
If the active port is leashed, the debugger will prompt the user with a
@code{ ?} , and wait for a command. A debugger command is just a
character, followed by a return. By default, only the call and redo
entries are leashed, but the @code{ leash/1} predicate can be used in
order to make the debugger stop where needed.
There are several commands available, but the user only needs to
remember the help command, which is @code{ h} . This command shows all the
available options, which are:
@table @code
@item c - creep
this command makes YAP continue execution and stop at the next
leashed port.
@item return - creep
the same as c
@item l - leap
YAP will execute until it meets a port for a spied predicate; this mode
keeps all computation history for debugging purposes, so it is more
expensive than standard execution. Use @t{ k} or @t{ z} for fast execution.
@item k - quasi-leap
similar to leap but faster since the computation history is
not kept; useful when leap becomes too slow.
@item z - zip
same as @t{ k}
@item s - skip
YAP will continue execution without showing any messages until
returning to the current activation. Spy-points will be ignored in this
mode. Note that this command keeps all debugging history; use @t{ t} for
fast execution. This command is meaningless, and therefore illegal, in the fail
and exit ports.
@item t - fast-skip
similar to skip but faster since computation history is not
kept; useful if skip becomes slow.
@item f [@var{ GoalId} ] - fail
If given no argument, forces YAP to fail the goal, skipping the fail
port and backtracking to the parent.
If @t{ f} receives a goal number as
the argument, the command fails all the way to the goal. If goal @var{ GoalId} has completed execution, YAP fails until meeting the first active ancestor.
@item r [@var{ GoalId} ] - retry
This command forces YAP to jump back to the call port. Note that any
side effects of the goal cannot be undone. This command is not available
at the call port. If @t{ r} receives a goal number as the argument, the
command retries goal @var{ GoalId} instead. If goal @var{ GoalId} has
completed execution, YAP fails until meeting the first active ancestor.
@item a - abort
execution will be aborted, and the interpreter will return to the
top-level. YAP deactivates debug mode, but spy-points are not removed.
@item n - nodebug
stop debugging and continue execution. The command will not clear active
spy-points.
@item e - exit
leave YAP.
@item h - help
show the debugger commands.
@item ! Query
execute a query. YAP will not show the result of the query.
@item b - break
break active execution and launch a break level. This is the same as @code{ !break} .
@item + - spy this goal
start spying the active goal. The same as @code{ ! spy G} where @var{ G}
is the active goal.
@item - - nospy this goal
stop spying the active goal. The same as @code{ ! nospy G} where @var{ G} is
the active goal.
@item p - print
shows the active goal using print/1
@item d - display
shows the active goal using display/1
@item <Depth - debugger write depth
sets the maximum write depth, both for composite terms and lists, that
will be used by the debugger. For more information, see
@code{ write_ depth/2} (@pxref{ Input/Output Control} ).
@item < - full term
resets the debugger's maximum write depth to the default of ten. For
more information, see @code{ write_ depth/2} (@pxref{ Input/Output Control} ).
@item A - alternatives
show the list of backtrack points in the current execution.
@item g [@var{ N} ]
show the list of ancestors in the current debugging environment. If it
receives @var{ N} , show the first @var{ N} ancestors.
@end table
The debugging information will be lost when fast-skip (@t{ t} ) or
quasi-leap (@t{ k} ) is used.
@node Efficiency, C-Interface, Debugging, Top
@chapter Efficiency Considerations
We next discuss several issues involved in making Prolog programs run
fast in YAP. We assume two different programming styles:
@itemize @bullet
@item Execution of @emph{ deterministic} programs often
boils down to a recursive loop of the form:
@example
loop(Env) :-
do_ something(Env,NewEnv),
loop(NewEnv).
@end example
@end itemize
@c @section Deterministic Programs
@c @section Non-Deterministic Programs
@c @section Data-Base Operations
@section Indexing
The indexation mechanism restricts the set of clauses to be tried in a
procedure by using information about the instantiated
arguments of the goal. These arguments are then used as a key,
selecting a restricted set of clauses from all the clauses forming the
procedure.
As an example, consider the two clauses for concatenate:
@example
concatenate([],L,L).
concatenate([H|T],A,[H|NT]) :- concatenate(T,A,NT).
@end example
If the first argument for the goal is a list, then only the second clause
is of interest. If the first argument is the nil atom, the system needs to
look only for the first clause. The indexation generates instructions that
test the value of the first argument, and then proceed to a selected clause,
or group of clauses.
Note that if the first argument was a free variable, then both clauses
should be tried. In general, indexation will not be useful if the first
argument is a free variable.
When activating a predicate, a Prolog system needs to store state
information. This information, stored in a structure known as choice point
or fail point, is necessary when backtracking to other clauses for the
predicate. The operations of creating and using a choice point are very
expensive, both in the terms of space used and time spent.
Creating a choice point is not necessary if there is only one clause for
the predicate, as there are no other clauses to backtrack to. With indexation, this
situation is extended: in the example, if the first argument was the atom
nil, then only one clause would really be of interest, and it is pointless to
create a choice point. This feature is even more useful if the first argument
is a list: without indexation, execution would try the first clause, creating
a choice point. The clause would fail, the choice point would then be used to
restore the previous state of the computation and the second clause would
be tried. The code generated by the indexation mechanism would behave
much more efficiently: it would test the first argument and see whether it
is a list, and then proceed directly to the second clause.
An important side effect concerns the use of "cut". In the above
example, some programmers would use a "cut" in the first clause just to
inform the system that the predicate is not backtrackable and to force the
removal of the choice point just created. As a result, less space is needed but
with a great loss in expressive power: the "cut" would prevent some uses of
the procedure, like generating lists through backtracking. Of course, with
indexation the "cut" becomes useless: the choice point is not even created.
Indexation is also very important for predicates with a large number
of clauses that are used like tables:
@example
logician(aristoteles,greek).
logician(frege,german).
logician(russel,english).
logician(godel,german).
logician(whitehead,english).
@end example
An interpreter like C-Prolog, trying to answer the query:
@example
?- logician(godel,X).
@end example
@noindent
would blindly follow the standard Prolog strategy, trying first the
first clause, then the second, the third and finally finding the
relevant clause. Also, as there are some more clauses after the
important one, a choice point has to be created, even if we know the
next clauses will certainly fail. A "cut" would be needed to prevent
some possible uses for the procedure, like generating all logicians. In
this situation, the indexing mechanism generates instructions that
implement a search table. In this table, the value of the first argument
would be used as a key for fast search of possibly matching clauses. For
the query of the last example, the result of the search would be just
the fourth clause, and again there would be no need for a choice point.
If the first argument is a complex term, indexation will select clauses
just by testing its main functor. However, there is an important
exception: if the first argument of a clause is a list, the algorithm
also uses the list's head if not a variable. For instance, with the
following clauses,
@example
rules([],B,B).
rules([n(N)|T],I,O) :- rules_ for_ noun(N,I,N1), rules(T,N1,O).
rules([v(V)|T],I,O) :- rules_ for_ verb(V,I,N), rules(T,N,O).
rules([q(Q)|T],I,O) :- rules_ for_ qualifier(Q,I,N), rules(T,N,O).
@end example
@noindent
if the first argument of the goal is a list, its head will be tested, and only
the clauses matching it will be tried during execution.
Some advice on how to take good advantage of this mechanism:
@itemize @bullet
@item
Try to make the first argument an input argument.
@item
Try to keep together all clauses whose first argument is not a
variable; this will decrease the number of tests, since clauses whose
first argument is a variable are always tried.
@item
Try to avoid predicates having a lot of clauses with the same key.
For instance, the procedure:
@end itemize
@example
type(n(mary),person).
type(n(john), person).
type(n(chair),object).
type(v(eat),active).
type(v(rest),passive).
@end example
@noindent
becomes more efficient with:
@example
type(n(N),T) :- type_ of_ noun(N,T).
type(v(V),T) :- type_ of_ verb(V,T).
type_ of_ noun(mary,person).
type_ of_ noun(john,person).
type_ of_ noun(chair,object).
type_ of_ verb(eat,active).
type_ of_ verb(rest,passive).
@end example
@node C-Interface,YAPLibrary,Efficiency,Top
@chapter C Language interface to YAP
YAP provides the user with three facilities for writing
predicates in a language other than Prolog. Under Unix systems,
most language implementations were linkable to @code{ C} , and the first interface exported the YAP machinery to the C language. YAP also implements most of the SWI-Prolog foreign language interface.
This gives portability with a number of SWI-Prolog packages. Last, a new C++ based interface is
being designed to work with the swig (@url{ www.swig.org} ) interface compiler.
@ifplaintext
<ul>
<li> The original YAP C-interface exports the YAP engine.
</li>
<li>The @ref swi-c-interface emulates Jan Wielemaker's SWI foreign language interface.
</li>
<li>The @ref yap-cplus-interface is designed to interface with Object-Oriented systems.
</li>
</ul>
@end ifplaintext
Before describing in full detail how to interface to C code, we will examine
a brief example.
Assume the user requires a predicate @code{ my_ process_ id(Id)} which succeeds
when @var{ Id} unifies with the number of the process under which YAP is running.
In this case we will create a @code{ my_ process.c} file containing the
C-code described below.
@c_ example
@cartouche
#include "YAP/YapInterface.h"
static int my_ process_ id(void)
@{
YAP_ Term pid = YAP_ MkIntTerm(getpid());
YAP_ Term out = YAP_ ARG1;
return(YAP_ Unify(out,pid));
@}
void init_ my_ predicates()
@{
YAP_ UserCPredicate("my_ process_ id",my_ process_ id,1);
@}
@end cartouche
@end c_ example
The commands to compile the above file depend on the operating
system. Under Linux (i386 and Alpha) you should use:
@example
gcc -c -shared -fPIC my_ process.c
ld -shared -o my_ process.so my_ process.o
@end example
@noindent
Under WIN32 in a MINGW/CYGWIN environment, using the standard
installation path you should use:
@example
gcc -mno-cygwin -I "c:/Yap/include" -c my_ process.c
gcc -mno-cygwin "c:/Yap/bin/yap.dll" --shared -o my_ process.dll my_ process.o
@end example
@noindent
Under WIN32 in a pure CYGWIN environment, using the standard
installation path, you should use:
@example
gcc -I/usr/local -c my_ process.c
gcc -shared -o my_ process.dll my_ process.o /usr/local/bin/yap.dll
@end example
@noindent
Under Solaris2 it is sufficient to use:
@example
gcc -fPIC -c my_ process.c
@end example
@noindent
Under SunOS it is sufficient to use:
@example
gcc -c my_ process.c
@end example
@noindent
Under Digital Unix you need to create a @code{ so} file. Use:
@example
gcc tst.c -c -fpic
ld my_ process.o -o my_ process.so -shared -expect_ unresolved '*'
@end example
@noindent
and use @code{ my_ process.so} instead of @code{ my_ process.o} in the
remainder of the example.
@noindent
The file can then be loaded, under YAP, by executing the following Prolog goal:
@example
load_ foreign_ files(['my_ process'],[],init_ my_ predicates).
@end example
Note that since YAP 4.3.3 you should not give the suffix for object
files. YAP will deduce the correct suffix from the operating system it
is running under.
After loading that file, the following Prolog goal
@example
my_ process_ id(N)
@end example
@noindent
would unify N with the number of the process under which YAP is running.
Having presented a full example, we will now examine in more detail the
contents of the C source code file presented above.
The include statement is used to make available to the C source code the
macros for the handling of Prolog terms and also some YAP public
definitions.
The function @code{ my_ process_ id} is the implementation, in C, of the
desired predicate. Note that it returns an integer denoting the success
or failure of the goal, and also that it has no arguments even though the
predicate being defined has one.
In fact the arguments of a Prolog predicate written in C are accessed
through macros, defined in the include file, with names @var{ YAP_ ARG1} ,
@var{ YAP_ ARG2} , ..., @var{ YAP_ ARG16} or with @var{ YAP_ A} (@var{ N} )
where @var{ N} is the argument number (starting with 1). In the present
case the function uses local variables of type @code{ YAP_ Term} , the
type used for holding YAP terms, where the integer returned by the
standard Unix function @code{ getpid()} is stored as an integer term (the
conversion is done by @code{ YAP_ MkIntTerm(Int)} ). Then it calls the
pre-defined routine @code{ YAP_ Unify(YAP_ Term, YAP_ Term)} which in turn returns an
integer denoting success or failure of the unification.
@findex YAP_ UserCPredicate
The role of the procedure @code{ init_ my_ predicates} is to make known to
YAP, by calling @code{ YAP_ UserCPredicate} , the predicates being
defined in the file. This is in fact why, in the example above,
@code{ init_ my_ predicates} was passed as the third argument to
@code{ load_ foreign_ files/3} .
The rest of this chapter describes exhaustively how to interface C to YAP.
@menu
* Manipulating Terms:: Primitives available to the C programmer
* Unifying Terms:: How to Unify Two Prolog Terms
* Manipulating Strings:: From character arrays to Lists of codes and back
* Memory Allocation:: Stealing Memory From YAP
* Controlling Streams:: Control How YAP sees Streams
* Utility Functions:: From character arrays to Lists of codes and back
* Calling YAP From C:: From C to YAP to C to YAP
* Module Manipulation in C:: Create and Test Modules from within C
* Miscellaneous C-Functions:: Other Helpful Interface Functions
* Writing C:: Writing Predicates in C
* Loading Objects:: Loading Object Files
* Save& Rest:: Saving and Restoring
* YAP4 Notes:: Changes in Foreign Predicates Interface
2001-04-09 20:54:03 +01:00
@end menu
@node Manipulating Terms, Unifying Terms, , C-Interface
@section Terms
This section provides information about the primitives available to the C
programmer for manipulating Prolog terms.
Several C typedefs are included in the header file @code{ yap/YAPInterface.h} to
describe, in a portable way, the C representation of Prolog terms.
The user should write programs using these macros to ensure portability of
code across different versions of YAP.
The most important typedef is @var{ YAP_ Term} , which is used to denote the
type of a Prolog term.
Terms, from the point of view of the C programmer, can be classified as
follows
@table @i
@item uninstantiated variables
@item instantiated variables
@item integers
@item floating-point numbers
@item database references
@item atoms
@item pairs (lists)
@item compound terms
@end table
2002-09-09 18:40:12 +01:00
The primitive
2014-04-10 11:59:30 +01:00
@table @code
@item YAP_ Bool YAP_ IsVarTerm(YAP_ Term @var{ t} )
2014-04-11 02:27:10 +01:00
@findex YAP_ IsVarTerm (C-Interface function)
2001-04-09 20:54:03 +01:00
@noindent
returns true iff its argument is an uninstantiated variable. Conversely the
primitive
2014-04-11 02:27:10 +01:00
@item YAP_ Bool YAP_ IsNonVarTerm(YAP_ Term @var{ t} )
@findex YAP_ IsNonVarTerm (C-Interface function)
2001-04-09 20:54:03 +01:00
returns true iff its argument is not a variable.
2014-04-10 11:59:30 +01:00
@end table
@noindent
2001-04-09 20:54:03 +01:00
The user can create a new uninstantiated variable using the primitive
2014-04-10 11:59:30 +01:00
@table @code
@item YAP_ Term YAP_ MkVarTerm()
@end table
2001-04-09 20:54:03 +01:00
2014-04-10 11:59:30 +01:00
The following primitives can be used to discriminate among the different types
of non-variable terms:
@table @code
@item YAP_ Bool YAP_ IsIntTerm(YAP_ Term @var{ t} )
2002-09-09 18:40:12 +01:00
@findex YAP_ IsIntTerm (C-Interface function)
2014-04-11 02:27:10 +01:00
@item YAP_ Bool YAP_ IsFloatTerm(YAP_ Term @var{ t} )
2002-09-09 18:40:12 +01:00
@findex YAP_ IsFloatTerm (C-Interface function)
2014-04-11 02:27:10 +01:00
@item YAP_ Bool YAP_ IsDbRefTerm(YAP_ Term @var{ t} )
2002-09-09 18:40:12 +01:00
@findex YAP_ IsDBRefTerm (C-Interface function)
2014-04-11 02:27:10 +01:00
@item YAP_ Bool YAP_ IsAtomTerm(YAP_ Term @var{ t} )
2002-09-09 18:40:12 +01:00
@findex YAP_ IsAtomTerm (C-Interface function)
2014-04-11 02:27:10 +01:00
@item YAP_ Bool YAP_ IsPairTerm(YAP_ Term @var{ t} )
2002-09-09 18:40:12 +01:00
@findex YAP_ IsPairTerm (C-Interface function)
2014-04-11 02:27:10 +01:00
@item YAP_ Bool YAP_ IsApplTerm(YAP_ Term @var{ t} )
2002-09-09 18:40:12 +01:00
@findex YAP_ IsApplTerm (C-Interface function)
2014-04-11 02:27:10 +01:00
@item YAP_ Bool YAP_ IsCompoundTerm(YAP_ Term @var{ t} )
2011-11-02 22:55:56 +00:00
@findex YAP_ IsCompoundTerm (C-Interface function)
2014-04-11 02:27:10 +01:00
@end table
2001-04-09 20:54:03 +01:00
2011-10-27 11:35:07 +01:00
The next primitive gives the type of a Prolog term:
2014-04-10 11:59:30 +01:00
@table @code
@item YAP_ tag_ t YAP_ TagOfTerm(YAP_ Term @var{ t} )
@end table
2011-10-27 11:35:07 +01:00
The set of possible values is an enumerated type, with the following values:
@table @i
@item @code{ YAP_ TAG_ ATT} : an attributed variable
@item @code{ YAP_ TAG_ UNBOUND} : an unbound variable
@item @code{ YAP_ TAG_ REF} : a reference to a term
@item @code{ YAP_ TAG_ PAIR} : a list
@item @code{ YAP_ TAG_ ATOM} : an atom
@item @code{ YAP_ TAG_ INT} : a small integer
@item @code{ YAP_ TAG_ LONG_ INT} : a word sized integer
@item @code{ YAP_ TAG_ BIG_ INT} : a very large integer
@item @code{ YAP_ TAG_ RATIONAL} : a rational number
@item @code{ YAP_ TAG_ FLOAT} : a floating point number
@item @code{ YAP_ TAG_ OPAQUE} : an opaque term
@item @code{ YAP_ TAG_ APPL} : a compound term
@end table
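@noindent
As a brief, hedged illustration of how these tags may be used (this sketch is
not from the YAP sources), a C routine can classify a term with a single
@code{ switch} on @code{ YAP_ TagOfTerm} :

@example
static const char *term_kind(YAP_Term t)
@{
  switch (YAP_TagOfTerm(t)) @{
  case YAP_TAG_UNBOUND: return "unbound variable";
  case YAP_TAG_ATOM:    return "atom";
  case YAP_TAG_INT:     return "small integer";
  case YAP_TAG_FLOAT:   return "floating point number";
  case YAP_TAG_PAIR:    return "list";
  case YAP_TAG_APPL:    return "compound term";
  default:              return "something else";
  @}
@}
@end example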
2002-09-09 18:40:12 +01:00
Next, we mention the primitives that allow one to destruct and construct
terms. All the above primitives ensure that their result is
@i{ dereferenced} , i.e. that it is not a pointer to another term.
2001-04-09 20:54:03 +01:00
The following primitives are provided for creating an integer term from an
integer and to access the value of an integer term.
2014-04-10 11:59:30 +01:00
@table @code
@item YAP_ Term YAP_ MkIntTerm(YAP_ Int @var{ i} )
2014-04-11 02:27:10 +01:00
@findex YAP_ MkIntTerm (C-Interface function)
@item YAP_ Int YAP_ IntOfTerm(YAP_ Term @var{ t} )
@findex YAP_ IntOfTerm (C-Interface function)
2014-04-10 11:59:30 +01:00
@end table
2001-04-09 20:54:03 +01:00
@noindent
2002-09-09 18:40:12 +01:00
where @code{ YAP_ Int} is a typedef for the C integer type appropriate for
the machine or compiler in question (normally a long integer). The size
of the allowed integers is implementation dependent but is always
greater or equal to 24 bits: usually 32 bits on 32 bit machines, and 64
on 64 bit machines.
2001-04-09 20:54:03 +01:00
The two following primitives play a similar role for floating-point terms
2014-04-10 11:59:30 +01:00
@table @code
@item YAP_ Term YAP_ MkFloatTerm(YAP_ flt @var{ double} )
2014-04-11 02:27:10 +01:00
@findex YAP_ MkFloatTerm (C-Interface function)
@item YAP_ flt YAP_ FloatOfTerm(YAP_ Term @var{ t} )
@findex YAP_ FloatOfTerm (C-Interface function)
2014-04-10 11:59:30 +01:00
@end table
2001-04-09 20:54:03 +01:00
@noindent
2002-09-09 18:40:12 +01:00
where @code{ YAP_ flt} is a typedef for the appropriate C floating-point type,
nowadays a @code{ double}
2001-04-09 20:54:03 +01:00
2004-05-14 18:11:32 +01:00
The following primitives are provided for verifying whether a term is
a big int, for creating a term from a big integer, and for accessing the value
of a big int from a term.
2014-04-10 11:59:30 +01:00
@table @code
@item YAP_ Bool YAP_ IsBigNumTerm(YAP_ Term @var{ t} )
2014-04-11 02:27:10 +01:00
@findex YAP_ IsBigNumTerm (C-Interface function)
2014-04-10 11:59:30 +01:00
@item YAP_ Term YAP_ MkBigNumTerm(void *@var{ b} )
2014-04-11 02:27:10 +01:00
@findex YAP_ MkBigNumTerm (C-Interface function)
2014-04-10 11:59:30 +01:00
@item void *YAP_ BigNumOfTerm(YAP_ Term @var{ t} , void *@var{ b} )
2014-04-11 02:27:10 +01:00
@findex YAP_ BigNumOfTerm (C-Interface function)
2014-04-10 11:59:30 +01:00
@end table
2004-05-14 18:11:32 +01:00
@noindent
2006-01-02 03:35:45 +00:00
YAP must support bignum for the configuration you are using (check the
2007-02-18 00:26:36 +00:00
YAP configuration and setup). For now, YAP only supports the GNU GMP
2006-01-02 03:35:45 +00:00
library, and @code{ void *} will be a cast for @code{ mpz_ t} . Notice
that @code{ YAP_ BigNumOfTerm} requires the number to be already
initialised. As an example, we show how to print a bignum:
@example
static int
p_ print_ bignum(void)
2006-01-10 11:35:27 +00:00
@{
2006-01-02 03:35:45 +00:00
mpz_ t mz;
if (!YAP_ IsBigNumTerm(YAP_ ARG1))
return FALSE;
mpz_ init(mz);
YAP_ BigNumOfTerm(YAP_ ARG1, mz);
gmp_ printf("Shows up as %Zd\n", mz);
mpz_ clear(mz);
2006-01-02 23:19:10 +00:00
return TRUE;
2006-01-10 11:35:27 +00:00
@}
2006-01-02 03:35:45 +00:00
@end example
2004-05-14 18:11:32 +01:00
2002-09-09 18:40:12 +01:00
Currently, no primitives are supplied to users for manipulating data base
2001-04-09 20:54:03 +01:00
references.
2009-04-25 16:59:23 +01:00
A special typedef @code{ YAP_ Atom} is provided to describe Prolog
2002-09-09 18:40:12 +01:00
@i{ atoms} (symbolic constants). The two following primitives can be used
to manipulate atom terms
2014-04-10 11:59:30 +01:00
@table @code
@item YAP_ Term YAP_ MkAtomTerm(YAP_ Atom at)
2014-04-11 02:27:10 +01:00
@findex YAP_ MkAtomTerm (C-Interface function)
2014-04-10 11:59:30 +01:00
@item YAP_ Atom YAP_ AtomOfTerm(YAP_ Term @var{ t} )
2014-04-11 02:27:10 +01:00
@findex YAP_ AtomOfTerm (C-Interface function)
2014-04-10 11:59:30 +01:00
@end table
2001-04-09 20:54:03 +01:00
@noindent
2002-09-09 18:40:12 +01:00
The following primitives are available for associating atoms with their
2001-04-09 20:54:03 +01:00
names
2014-04-10 11:59:30 +01:00
@table @code
@item YAP_ Atom YAP_ LookupAtom(char * @var{ s} )
2014-04-11 02:27:10 +01:00
@findex YAP_ LookupAtom (C-Interface function)
2014-04-10 11:59:30 +01:00
@item YAP_ Atom YAP_ FullLookupAtom(char * @var{ s} )
2014-04-11 02:27:10 +01:00
@findex YAP_ FullLookupAtom (C-Interface function)
2014-04-10 11:59:30 +01:00
@item char *YAP_ AtomName(YAP_ Atom @var{ t} )
2014-04-11 02:27:10 +01:00
@findex YAP_ AtomName (C-Interface function)
2014-04-10 11:59:30 +01:00
@end table
2002-09-09 18:40:12 +01:00
The function @code{ YAP_ LookupAtom} looks up an atom in the standard hash
table. The function @code{ YAP_ FullLookupAtom} will also search if the
atom had been "hidden": this is useful for system maintenance from C
code. The function @code{ YAP_ AtomName} returns a pointer to the string
for the atom.
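@noindent
For example, the Prolog atom @code{ hello} can be interned, turned into a
term, and have its name read back (a small sketch using only the primitives
above):

@example
YAP_Atom a  = YAP_LookupAtom("hello");           /* intern the name        */
YAP_Term ta = YAP_MkAtomTerm(a);                  /* build an atom term     */
char *s     = YAP_AtomName(YAP_AtomOfTerm(ta));   /* s now points to "hello" */
@end example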
2007-09-04 11:34:55 +01:00
@noindent
The following primitives handle constructing atoms from strings with
wide characters, and vice-versa:
2014-04-10 11:59:30 +01:00
@table @code
@item YAP_ Atom YAP_ LookupWideAtom(wchar_ t * @var{ s} )
2014-04-11 02:27:10 +01:00
@findex YAP_ LookupWideAtom (C-Interface function)
2014-04-10 11:59:30 +01:00
@item wchar_ t *YAP_ WideAtomName(YAP_ Atom @var{ t} )
2014-04-11 02:27:10 +01:00
@findex YAP_ WideAtomName (C-Interface function)
2014-04-10 11:59:30 +01:00
@end table
2007-09-04 11:34:55 +01:00
@noindent
The following primitive tells whether an atom needs wide characters in its
representation:
2014-04-10 11:59:30 +01:00
@table @code
@item int YAP_ IsWideAtom(YAP_ Atom @var{ t} )
2014-04-11 02:27:10 +01:00
@findex YAP_ IsWideAtom (C-Interface function)
2014-04-10 11:59:30 +01:00
@end table
2007-09-04 11:34:55 +01:00
@noindent
The following primitive can be used to obtain the size of an atom in a
representation-independent way:
2014-04-10 11:59:30 +01:00
@table @code
@item int YAP_ AtomNameLength(YAP_ Atom @var{ t} )
2014-04-11 02:27:10 +01:00
@findex YAP_ AtomNameLength (C-Interface function)
2014-04-10 11:59:30 +01:00
@end table
2007-09-04 11:34:55 +01:00
2007-12-05 12:17:25 +00:00
The next routines give users some control over the atom
garbage collector. They allow the user to guarantee that an atom is not
to be garbage collected (this is important if the atom is held
externally to the Prolog engine), to allow it to be collected again, and to call a
hook on atom garbage collection:
2014-04-10 11:59:30 +01:00
@table @code
@item int YAP_ AtomGetHold(YAP_ Atom @var{ at} )
2014-04-11 02:27:10 +01:00
@findex YAP_ AtomGetHold (C-Interface function)
2014-04-10 11:59:30 +01:00
@item int YAP_ AtomReleaseHold(YAP_ Atom @var{ at} )
2014-04-11 02:27:10 +01:00
@findex YAP_ AtomReleaseHold (C-Interface function)
2014-04-10 11:59:30 +01:00
@item int YAP_ AGCRegisterHook(YAP_ AGC_ hook @var{ f} )
2014-04-11 02:27:10 +01:00
@findex YAP_ AGCHook (C-Interface function)
2014-04-10 11:59:30 +01:00
@end table
2007-12-05 12:17:25 +00:00
2009-04-25 16:59:23 +01:00
A @i{ pair} is a Prolog term which consists of a tuple of two Prolog
2002-09-09 18:40:12 +01:00
terms designated as the @i{ head} and the @i{ tail} of the term. Pairs are
most often used to build @emph{ lists} . The following primitives can be
used to manipulate pairs:
2014-04-10 11:59:30 +01:00
@table @code
@item YAP_ Term YAP_ MkPairTerm(YAP_ Term @var{ Head} , YAP_ Term @var{ Tail} )
2014-04-11 02:27:10 +01:00
@findex YAP_ MkPairTerm (C-Interface function)
2014-04-10 11:59:30 +01:00
@item YAP_ Term YAP_ MkNewPairTerm(void)
2014-04-11 02:27:10 +01:00
@findex YAP_ MkNewPairTerm (C-Interface function)
2014-04-10 11:59:30 +01:00
@item YAP_ Term YAP_ HeadOfTerm(YAP_ Term @var{ t} )
2014-04-11 02:27:10 +01:00
@findex YAP_ HeadOfTerm (C-Interface function)
2014-04-10 11:59:30 +01:00
@item YAP_ Term YAP_ TailOfTerm(YAP_ Term @var{ t} )
2014-04-11 02:27:10 +01:00
@findex YAP_ TailOfTerm (C-Interface function)
2014-04-10 11:59:30 +01:00
@item YAP_ Term YAP_ MkListFromTerms(YAP_ Term *@var{ pt} , YAP_ Int *@var{ sz} )
2014-04-11 02:27:10 +01:00
@findex YAP_ MkListFromTerms (C-Interface function)
2014-04-10 11:59:30 +01:00
@end table
2002-09-09 18:40:12 +01:00
One can construct a new pair from two terms, or one can just build a
pair whose head and tail are new unbound variables. Finally, one can
fetch the head or the tail.
2011-11-18 16:26:11 +00:00
The last function supports the common operation of constructing a list from an
array of terms of size @var{ sz} in a simple sweep.
Notice that the list constructors can call the garbage collector if
there is not enough space in the global stack.
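@noindent
For instance, the Prolog list @code{ [1,2,3]} can be built back to front,
terminating it with the atom @code{ []} (a hedged sketch, not code taken from
the YAP sources):

@example
YAP_Term list = YAP_MkAtomTerm(YAP_LookupAtom("[]"));  /* empty list */
int i;

for (i = 3; i >= 1; i--)               /* prepend 3, then 2, then 1 */
  list = YAP_MkPairTerm(YAP_MkIntTerm(i), list);
@end example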
2001-04-09 20:54:03 +01:00
A @i{ compound} term consists of a @i{ functor} and a sequence of terms with
length equal to the @i{ arity} of the functor. A functor, described in C by
the typedef @code{ Functor} , consists of an atom and of an integer.
The following primitives were designed to manipulate compound terms and
functors
2014-04-10 11:59:30 +01:00
@table @code
@item YAP_ Term YAP_ MkApplTerm(YAP_ Functor @var{ f} , unsigned long int @var{ n} , YAP_ Term[] @var{ args} )
2014-04-11 02:27:10 +01:00
@findex YAP_ MkApplTerm (C-Interface function)
2014-04-10 11:59:30 +01:00
@item YAP_ Term YAP_ MkNewApplTerm(YAP_ Functor @var{ f} , int @var{ n} )
2014-04-11 02:27:10 +01:00
@findex YAP_ MkNewApplTerm (C-Interface function)
2014-04-10 11:59:30 +01:00
@item YAP_ Term YAP_ ArgOfTerm(int argno,YAP_ Term @var{ ts} )
2014-04-11 02:27:10 +01:00
@findex YAP_ ArgOfTerm (C-Interface function)
2014-04-10 11:59:30 +01:00
@item YAP_ Term *YAP_ ArgsOfTerm(YAP_ Term @var{ ts} )
2014-04-11 02:27:10 +01:00
@findex YAP_ ArgsOfTerm (C-Interface function)
2014-04-10 11:59:30 +01:00
@item YAP_ Functor YAP_ FunctorOfTerm(YAP_ Term @var{ ts} )
2014-04-11 02:27:10 +01:00
@findex YAP_ FunctorOfTerm (C-Interface function)
2014-04-10 11:59:30 +01:00
@end table
2002-09-09 18:40:12 +01:00
@noindent
The @code{ YAP_ MkApplTerm} function constructs a new term, with functor
@var{ f} (of arity @var{ n} ), and using an array @var{ args} of @var{ n}
terms with @var{ n} equal to the arity of the
functor. @code{ YAP_ MkNewApplTerm} builds up a compound term whose
arguments are unbound variables. @code{ YAP_ ArgOfTerm} gives an argument
to a compound term. @code{ argno} should be greater or equal to 1 and
2007-06-04 13:28:02 +01:00
less or equal to the arity of the functor. @code{ YAP_ ArgsOfTerm}
2009-04-25 16:59:23 +01:00
returns a pointer to an array of arguments.
2002-09-09 18:40:12 +01:00
2011-11-18 16:26:11 +00:00
Notice that the compound term constructors can call the garbage
collector if there is not enough space in the global stack.
2002-09-09 18:40:12 +01:00
YAP allows one to manipulate the functors of compound term. The function
@code{ YAP_ FunctorOfTerm} allows one to obtain a variable of type
@code{ YAP_ Functor} with the functor to a term. The following functions
then allow one to construct functors, and to obtain their name and arity.
@findex YAP_ MkFunctor (C-Interface function)
@findex YAP_ NameOfFunctor (C-Interface function)
@findex YAP_ ArityOfFunctor (C-Interface function)
2014-04-10 11:59:30 +01:00
@table @code
@item YAP_ Functor YAP_ MkFunctor(YAP_ Atom @var{ a} ,unsigned long int @var{ arity} )
@item YAP_ Atom YAP_ NameOfFunctor(YAP_ Functor @var{ f} )
@item YAP_ Int YAP_ ArityOfFunctor(YAP_ Functor @var{ f} )
@end table
2001-04-09 20:54:03 +01:00
@noindent
2002-10-11 04:39:11 +01:00
Note that the functor is essentially a pair formed by an atom, and
2002-09-09 18:40:12 +01:00
arity.
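@noindent
As an illustration, the goal term @code{ between(1,10,X)} could be built as
follows (a sketch assuming only the primitives above):

@example
YAP_Term args[3], goal;
YAP_Functor f = YAP_MkFunctor(YAP_LookupAtom("between"), 3);

args[0] = YAP_MkIntTerm(1);
args[1] = YAP_MkIntTerm(10);
args[2] = YAP_MkVarTerm();             /* a fresh, unbound variable */
goal    = YAP_MkApplTerm(f, 3, args);
@end example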
2001-04-09 20:54:03 +01:00
2012-03-30 09:49:36 +01:00
Constructing terms in the stack may lead to overflow. The routine
2014-04-10 11:59:30 +01:00
@table @code
@item int YAP_ RequiresExtraStack(size_ t @var{ min} )
@end table
2012-03-30 09:49:36 +01:00
verifies whether you have at least @var{ min} cells free in the stack,
and it returns true if it has to ensure enough memory by calling the
garbage collector and or stack shifter. The routine returns false if no
memory is needed, and a negative number if it cannot provide enough
memory.
You can set @var{ min} to zero if you do not know how much room you need
but you do know you do not need a big chunk at a single go. Usually the routine
is called together with a long-jump to restart the
code. Slots can also be used if only a small amount of state needs to be preserved.
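@noindent
A minimal sketch of this pattern, inside the body of a C predicate (the amount
requested and the failure policy are assumptions for illustration only):

@example
if (YAP_RequiresExtraStack(0) < 0)
  return FALSE;          /* no memory could be made available: fail   */
/* if the call returned true, the collector or stack shifter ran, so  */
/* any terms kept in plain C variables should be rebuilt (or kept in  */
/* slots) before being used below                                     */
@end example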
2001-05-24 16:26:41 +01:00
@node Unifying Terms, Manipulating Strings, Manipulating Terms, C-Interface
@section Unification
2002-09-09 18:40:12 +01:00
@findex YAP_ Unify (C-Interface function)
2009-04-25 16:59:23 +01:00
YAP provides a single routine to attempt the unification of two Prolog
2002-09-09 18:40:12 +01:00
terms. The routine may succeed or fail:
2014-04-10 11:59:30 +01:00
@table @code
@item Int YAP_ Unify(YAP_ Term @var{ a} , YAP_ Term @var{ b} )
@end table
2001-04-09 20:54:03 +01:00
@noindent
2002-09-09 18:40:12 +01:00
The routine attempts to unify the terms @var{ a} and
@var{ b} returning @code{ TRUE} if the unification succeeds and @code{ FALSE}
otherwise.
2001-04-09 20:54:03 +01:00
2001-05-24 16:26:41 +01:00
@node Manipulating Strings, Memory Allocation, Unifying Terms, C-Interface
@section Strings
2002-09-09 18:40:12 +01:00
@findex YAP_ StringToBuffer (C-Interface function)
2001-04-09 20:54:03 +01:00
The YAP C-interface now includes a utility routine to copy a string
represented as a list of a character codes to a previously allocated buffer
2014-04-10 11:59:30 +01:00
@table @code
@item int YAP_ StringToBuffer(YAP_ Term @var{ String} , char *@var{ buf} , unsigned int @var{ bufsize} )
@end table
2001-04-09 20:54:03 +01:00
@noindent
2002-09-09 18:40:12 +01:00
The routine copies the list of character codes @var{ String} to a
previously allocated buffer @var{ buf} . The string including a
terminating null character must fit in @var{ bufsize} characters,
otherwise the routine will simply fail. The @var{ StringToBuffer} routine
fails and generates an exception if @var{ String} is not a valid string.
@findex YAP_ BufferToString (C-Interface function)
2008-07-24 17:02:04 +01:00
@findex YAP_ NBufferToString (C-Interface function)
@findex YAP_ WideBufferToString (C-Interface function)
@findex YAP_ NWideBufferToString (C-Interface function)
2002-09-09 18:40:12 +01:00
@findex YAP_ BufferToAtomList (C-Interface function)
2008-07-24 17:02:04 +01:00
@findex YAP_ NBufferToAtomList (C-Interface function)
@findex YAP_ WideBufferToAtomList (C-Interface function)
@findex YAP_ NWideBufferToAtomList (C-Interface function)
@findex YAP_ BufferToDiffList (C-Interface function)
@findex YAP_ NBufferToDiffList (C-Interface function)
@findex YAP_ WideBufferToDiffList (C-Interface function)
@findex YAP_ NWideBufferToDiffList (C-Interface function)
2001-05-24 16:26:41 +01:00
The C-interface also includes utility routines to do the reverse, that
2008-07-24 17:02:04 +01:00
is, to copy a from a buffer to a list of character codes, to a
difference list, or to a list of
character atoms. The routines work either on strings of characters or
strings of wide characters:
2014-04-10 11:59:30 +01:00
@table @code
@item YAP_ Term YAP_ BufferToString(char *@var{ buf} )
@item YAP_ Term YAP_ NBufferToString(char *@var{ buf} , size_ t @var{ len} )
@item YAP_ Term YAP_ WideBufferToString(wchar_ t *@var{ buf} )
@item YAP_ Term YAP_ NWideBufferToString(wchar_ t *@var{ buf} , size_ t @var{ len} )
@item YAP_ Term YAP_ BufferToAtomList(char *@var{ buf} )
@item YAP_ Term YAP_ NBufferToAtomList(char *@var{ buf} , size_ t @var{ len} )
@item YAP_ Term YAP_ WideBufferToAtomList(wchar_ t *@var{ buf} )
@item YAP_ Term YAP_ NWideBufferToAtomList(wchar_ t *@var{ buf} , size_ t @var{ len} )
@end table
2001-05-24 16:26:41 +01:00
@noindent
2008-07-24 17:02:04 +01:00
Users are advised to use the @var{ N} version of the routines. Otherwise,
the user-provided string must include a terminating null character.
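@noindent
For example, a C predicate could hand a C string back to Prolog as a list of
character codes and unify it with its first argument; the predicate name is
hypothetical and the sketch is only illustrative:

@example
#include <string.h>

static int p_greeting(void)
@{
  char msg[] = "hello from C";

  /* build a list of character codes from the buffer, then unify it */
  return YAP_Unify(YAP_ARG1, YAP_NBufferToString(msg, strlen(msg)));
@}
@end example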
2001-05-24 16:26:41 +01:00
2004-05-14 17:33:47 +01:00
@findex YAP_ ReadBuffer (C-Interface function)
The C-interface function calls the parser on a sequence of characters
stored at @var{ buf} and returns the resulting term.
2014-04-10 11:59:30 +01:00
@table @code
@item YAP_ Term YAP_ ReadBuffer(char *@var{ buf} ,YAP_ Term *@var{ error} )
@end table
2004-05-14 17:33:47 +01:00
@noindent
The user-provided string must include a terminating null
character. Syntax errors will cause returning @code{ FALSE} and binding
@var{ error} to a Prolog term.
2011-12-30 16:04:16 +00:00
2012-03-14 11:08:28 +00:00
@findex YAP_ IntsToList (C-Interface function)
2011-12-30 16:04:16 +00:00
@findex YAP_ FloatsToList (C-Interface function)
These C-interface functions are useful when converting chunks of data to Prolog:
2014-04-10 11:59:30 +01:00
@table @code
@item YAP_ Term YAP_ FloatsToList(double *@var{ buf} ,size_ t @var{ sz} )
@item YAP_ Term YAP_ IntsToList(YAP_ Int *@var{ buf} ,size_ t @var{ sz} )
@end table
2011-12-30 16:04:16 +00:00
@noindent
2012-03-14 11:08:28 +00:00
Notice that they are unsafe, and may call the garbage collector. They
return 0 on error.
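@noindent
For instance, a C array of integers can be converted into a Prolog list and
unified with an argument in one step (an illustrative fragment):

@example
YAP_Int values[4] = @{1, 2, 3, 4@};
YAP_Term l = YAP_IntsToList(values, 4);   /* builds [1,2,3,4] */

if (l == 0)
  return FALSE;                           /* conversion failed */
return YAP_Unify(YAP_ARG1, l);
@end example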
@findex YAP_ ListToInts (C-Interface function)
@findex YAP_ ToListFloats (C-Interface function)
These C-interface functions are useful when converting Prolog lists to arrays:
2014-04-10 11:59:30 +01:00
@table @code
@item YAP_ Int YAP_ ListToInts(YAP_ Term t, YAP_ Int *@var{ buf} ,size_ t @var{ sz} )
@item YAP_ Int YAP_ ListToFloats(YAP_ Term t, double *@var{ buf} ,size_ t @var{ sz} )
@end table
2012-03-14 11:08:28 +00:00
@noindent
They return the number of integers scanned, up to a maximum of @t{ sz} ,
and @t{ -1} on error.
2004-05-14 17:33:47 +01:00
2001-05-24 16:26:41 +01:00
@node Memory Allocation, Controlling Streams, Manipulating Strings, C-Interface
@section Memory Allocation
2007-02-18 00:26:36 +00:00
@findex YAP_ AllocSpaceFromYAP (C-Interface function)
2001-04-09 20:54:03 +01:00
The next routine can be used to ask space from the Prolog data-base:
2014-04-10 11:59:30 +01:00
@table @code
@item void *YAP_ AllocSpaceFromYAP(int @var{ size} )
@end table
2001-04-09 20:54:03 +01:00
@noindent
The routine returns a pointer to a buffer allocated from the code area,
2002-09-09 18:40:12 +01:00
or @code{ NULL} if sufficient space was not available.
2001-04-09 20:54:03 +01:00
2007-02-18 00:26:36 +00:00
@findex YAP_ FreeSpaceFromYAP (C-Interface function)
The space allocated with @code{ YAP_ AllocSpaceFromYAP} can be released
back to YAP by using:
2014-04-10 11:59:30 +01:00
@table @code
@item void YAP_ FreeSpaceFromYAP(void *@var{ buf} )
@end table
2001-04-09 20:54:03 +01:00
@noindent
The routine releases a buffer allocated from the code area. The system
may crash if @code{ buf} is not a valid pointer to a buffer in the code
area.
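@noindent
A trivial sketch of the allocate/use/release pattern:

@example
char *buf = (char *)YAP_AllocSpaceFromYAP(1024);

if (buf == NULL)
  return FALSE;            /* no space available in the code area */
/* ... use the 1024-byte buffer ... */
YAP_FreeSpaceFromYAP(buf);
@end example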
2010-08-02 19:48:17 +01:00
@node Controlling Streams, Utility Functions, Memory Allocation, C-Interface
2007-02-18 00:26:36 +00:00
@section Controlling YAP Streams from @code{ C}
2001-05-24 16:26:41 +01:00
2002-09-09 18:40:12 +01:00
@findex YAP_ StreamToFileNo (C-Interface function)
2001-05-24 16:26:41 +01:00
The C-Interface also provides the C-application with a measure of
2007-02-18 00:26:36 +00:00
control over the YAP Input/Output system. The first routine allows one
2001-05-24 16:26:41 +01:00
to find a file number given a current stream:
2014-04-10 11:59:30 +01:00
@table @code
@item int YAP_ StreamToFileNo(YAP_ Term @var{ stream} )
@end table
2001-05-24 16:26:41 +01:00
@noindent
This function gives the file descriptor for a currently available
stream. Note that null streams and in memory streams do not have
corresponding open streams, so the routine will return a
negative number. Moreover, YAP will not be aware of any direct operations on
2001-05-24 16:26:41 +01:00
this stream, so information on, say, current stream position, may become
stale.
2002-09-09 18:40:12 +01:00
@findex YAP_ CloseAllOpenStreams (C-Interface function)
2001-05-24 16:26:41 +01:00
A second routine that is sometimes useful is:
2014-04-10 11:59:30 +01:00
@table @code
@item void YAP_ CloseAllOpenStreams(void)
@end table
2001-05-24 16:26:41 +01:00
@noindent
2007-02-18 00:26:36 +00:00
This routine closes the YAP Input/Output system except for the first
2001-05-24 16:26:41 +01:00
three streams, that are always associated with the three standard Unix
streams. It is most useful if you are doing @code{ fork()} .
2008-07-11 18:02:10 +01:00
@findex YAP_ FlushAllStreams (C-Interface function)
Last, one may sometimes need to flush all streams:
2014-04-10 11:59:30 +01:00
@table @code
@item void YAP_ FlushAllStreams(void)
@end table
2008-07-11 18:02:10 +01:00
@noindent
It is also useful before you do a @code{ fork()} , or otherwise you may
have trouble with unflushed output.
2002-09-09 18:40:12 +01:00
@findex YAP_ OpenStream (C-Interface function)
2001-05-24 16:26:41 +01:00
The next routine allows a currently open file to become a stream. The
routine receives as arguments a file descriptor, the true file name as a
2002-10-11 04:39:11 +01:00
string, an atom with the user name, and a set of flags:
2014-04-10 11:59:30 +01:00
@table @code
@item void YAP_ OpenStream(void *@var{ FD} , char *@var{ name} , YAP_ Term @var{ t} , int @var{ flags} )
@end table
2001-05-24 16:26:41 +01:00
@noindent
The available flags are @code{ YAP_ INPUT_ STREAM} ,
@code{ YAP_ OUTPUT_ STREAM} , @code{ YAP_ APPEND_ STREAM} ,
@code{ YAP_ PIPE_ STREAM} , @code{ YAP_ TTY_ STREAM} , @code{ YAP_ POPEN_ STREAM} ,
@code{ YAP_ BINARY_ STREAM} , and @code{ YAP_ SEEKABLE_ STREAM} . By default, the
2002-09-09 18:40:12 +01:00
stream is supposed to be at position 0. The argument @var{ name} gives
the name by which YAP should know the new stream.
2001-05-24 16:26:41 +01:00
2010-08-02 19:48:17 +01:00
@node Utility Functions, Calling YAP From C, Controlling Streams, C-Interface
@section Utility Functions in @code{ C}
The C-Interface provides the C-application with a number of useful
utility functions.
@findex YAP_ Record (C-Interface function)
The first provides a way to insert a term into the data-base
2014-04-10 11:59:30 +01:00
@table @code
@item void *YAP_ Record(YAP_ Term @var{ t} )
@end table
2010-08-02 19:48:17 +01:00
@noindent
This function returns a pointer to a copy of the term in the database
(or to @t{ NULL} if the operation fails).
@findex YAP_ Recorded (C-Interface function)
The next functions provides a way to recover the term from the data-base:
2014-04-10 11:59:30 +01:00
@table @code
@item YAP_ Term YAP_ Recorded(void *@var{ handle} )
@end table
2010-08-02 19:48:17 +01:00
@noindent
Notice that the semantics are the same as for @code{ recorded/3} : this
function creates a new copy of the term in the stack, with fresh
variables. The function returns @t{ 0L} if it cannot create a new term.
@findex YAP_ Erase (C-Interface function)
Last, the next function allows one to recover space:
2014-04-10 11:59:30 +01:00
@table @code
@item int YAP_ Erase(void *@var{ handle} )
@end table
2010-08-02 19:48:17 +01:00
@noindent
Notice that any accesses using @var{ handle} after this operation may
lead to a crash.
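@noindent
The three routines are typically used together. A hedged sketch of storing a
term in the data-base and later recovering a fresh copy:

@example
void *handle = YAP_Record(YAP_ARG1);      /* copy ARG1 into the data-base */
if (handle == NULL)
  return FALSE;

/* ... possibly much later ... */
YAP_Term copy = YAP_Recorded(handle);     /* fresh copy on the stack */
if (copy == 0L)
  return FALSE;
YAP_Erase(handle);                        /* handle is now invalid */
return YAP_Unify(YAP_ARG2, copy);
@end example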
The following functions are often required to compare terms.
@findex YAP_ ExactlyEqual (C-Interface function)
2011-10-27 11:35:07 +01:00
Succeed if two terms are actually the same term, as in
2010-08-02 19:48:17 +01:00
@code{ ==/2} :
2014-04-10 11:59:30 +01:00
@table @code
@item int YAP_ ExactlyEqual(YAP_ Term t1, YAP_ Term t2)
@end table
2010-08-02 19:48:17 +01:00
@noindent
2011-10-27 11:35:07 +01:00
The next function succeeds if two terms are variant terms, and returns
2010-08-02 19:48:17 +01:00
0 otherwise, as
@code{ =@=/2} :
2014-04-10 11:59:30 +01:00
@table @code
@item int YAP_ Variant(YAP_ Term t1, YAP_ Term t2)
@end table
2010-08-02 19:48:17 +01:00
@noindent
2011-11-02 22:55:56 +00:00
The next functions deal with numbering variables in terms:
2014-04-10 11:59:30 +01:00
@table @code
@item int YAP_ NumberVars(YAP_ Term t, YAP_ Int first_ number)
@item YAP_ Term YAP_ UnNumberVars(YAP_ Term t)
@item int YAP_ IsNumberedVariable(YAP_ Term t)
@end table
2011-11-02 22:55:56 +00:00
@noindent
The next one returns the length of a well-formed list @var{ t} , or
@code{ -1} otherwise:
2014-04-10 11:59:30 +01:00
@table @code
@item Int YAP_ ListLength(YAP_ Term t)
@end table
2011-11-02 22:55:56 +00:00
@noindent
2011-10-27 11:35:07 +01:00
Last, this function succeeds if two terms are unifiable:
2014-04-10 11:59:30 +01:00
@table @code
@item int YAP_ Unifiable(YAP_ Term t1, YAP_ Term t2)
@end table
2011-10-27 11:35:07 +01:00
@noindent
2010-08-02 19:48:17 +01:00
The next function computes a hash function for a term, as in
@code{ term_ hash/4} .
2014-04-10 11:59:30 +01:00
@table @code
@item YAP_ Int YAP_ TermHash(YAP_ Term t, YAP_ Int range, YAP_ Int depth, int ignore_ variables));
@end table
2010-08-02 19:48:17 +01:00
@noindent
The first three arguments follow @code{ term_ hash/4} . The last argument
indicates what to do if we find a variable: if @code{ 0} fail, otherwise
ignore the variable.
@node Calling YAP From C, Module Manipulation in C, Utility Functions, C-Interface
2001-05-24 16:26:41 +01:00
@section From @code{ C} back to Prolog
2007-12-05 12:17:25 +00:00
@findex YAP_ RunGoal (C-Interface function)
There are several ways to call Prolog code from C-code. By default, the
@code{ YAP_ RunGoal()} should be used for this task. It assumes the engine
has been initialised before:
2014-04-10 11:59:30 +01:00
@table @code
@item YAP_ Int YAP_ RunGoal(YAP_ Term Goal)
@end table
2007-12-05 12:17:25 +00:00
Execute query @var{ Goal} . The function returns 0 if the query fails, and
a non-zero @var{ YAP_ Term} otherwise.
2008-06-04 14:58:42 +01:00
Quite often, one wants to run a query once. In this case you should use
@code{ YAP_ RunGoalOnce} :
2014-04-10 11:59:30 +01:00
@table @code
@item YAP_ Int YAP_ RunGoalOnce(YAP_ Term Goal)
@end table
2008-06-04 14:58:42 +01:00
The @code{ YAP_ RunGoalOnce()} function makes sure to recover stack space at
the end of execution.
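@noindent
As an illustration, the following sketch builds the goal @code{ write(hello)}
and runs it once; the choice of goal is only an example:

@example
YAP_Term arg = YAP_MkAtomTerm(YAP_LookupAtom("hello"));
YAP_Functor f = YAP_MkFunctor(YAP_LookupAtom("write"), 1);
YAP_Term goal = YAP_MkApplTerm(f, 1, &arg);

if (YAP_RunGoalOnce(goal) == 0)
  return FALSE;                     /* the query failed */
@end example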
2007-12-05 12:17:25 +00:00
2008-06-04 14:58:42 +01:00
Prolog terms are pointers: a problem users often find is that the term
@var{ Goal} may actually @emph{ be moved around} during the execution of
@code{ YAP_ RunGoal()} , due to garbage collection or stack shifting. If
this is possible, @var{ Goal} will become invalid after executing
@code{ YAP_ RunGoal()} . In this case, it is a good idea to save @var{ Goal}
2007-12-05 12:17:25 +00:00
@emph{ slots} , as shown next:
@example
long sl = YAP_ InitSlot(scoreTerm);
out = YAP_ RunGoal(t);
t = YAP_ GetFromSlot(sl);
YAP_ RecoverSlots(1);
if (out == 0) return FALSE;
@end example
2014-05-25 20:52:45 +01:00
@ifplaintext
@copydoc real
@end ifplaintext
@texinfo
2008-06-04 14:58:42 +01:00
Slots are safe houses in the stack, the garbage collector and the stack
shifter know about them and make sure they have correct values. In this
case, we use a slot to preserve @var{ t} during the execution of
@code{ YAP_ RunGoal} . When the execution of @var{ t} is over we read the
(possibly changed) value of @var{ t} back from the slot @var{ sl} and tell
YAP that the slot @var{ sl} is not needed and can be given back to the
system. The slot functions are as follows:
2007-12-05 12:17:25 +00:00
@table @code
2010-08-04 17:36:20 +01:00
@item YAP_ Int YAP_ NewSlots(int @var{ NumberOfSlots} )
2007-12-05 12:17:25 +00:00
@findex YAP_ NewSlots (C-Interface function)
Allocate @var{ NumberOfSlots} from the stack and return a handle to the
last one. The other handles can be obtained by decrementing the handle.
2007-12-05 12:17:25 +00:00
2010-08-04 17:36:20 +01:00
@item YAP_ Int YAP_ CurrentSlot(void)
2007-12-05 12:17:25 +00:00
@findex YAP_ CurrentSlot (C-Interface function)
Return a handle to the system's default slot.
2010-08-04 20:29:24 +01:00
@item YAP_ Int YAP_ InitSlot(YAP_ Term @var{ t} )
2007-12-05 12:17:25 +00:00
@findex YAP_ InitSlot (C-Interface function)
Create a new slot, initialise it with @var{ t} , and return a handle to
this slot, that also becomes the current slot.
2010-08-04 17:36:20 +01:00
@item YAP_ Term *YAP_ AddressFromSlot(YAP_ Int @var{ slot} )
2007-12-05 12:17:25 +00:00
@findex YAP_ AddressFromSlot (C-Interface function)
Return the address of slot @var{ slot} : please use with care.
2010-08-04 17:36:20 +01:00
@item void YAP_ PutInSlot(YAP_ Int @var{ slot} , YAP_ Term @var{ t} )
2007-12-05 12:17:25 +00:00
@findex YAP_ PutInSlot (C-Interface function)
Set the contents of slot @var{ slot} to @var{ t} .
2008-06-17 14:37:51 +01:00
@item int YAP_ RecoverSlots(int @var{ HowMany} )
2007-12-05 12:17:25 +00:00
@findex YAP_ RecoverSlots (C-Interface function)
Recover the space for @var{ HowMany} slots: these will include the
2008-06-17 14:37:51 +01:00
current default slot. Fails if no such slots exist.
2010-08-04 17:36:20 +01:00
@item YAP_ Int YAP_ ArgsToSlots(int @var{ HowMany} )
@findex YAP_ ArgsToSlots (C-Interface function)
Store the current first @var{ HowMany} arguments in new slots.
@item void YAP_ SlotsToArgs(int @var{ HowMany} , YAP_ Int @var{ slot} )
@findex YAP_ SlotsToArgs (C-Interface function)
Set the first @var{ HowMany} arguments to the @var{ HowMany} slots
starting at @var{ slot} .
2007-12-05 12:17:25 +00:00
@end table
2014-05-25 20:52:45 +01:00
@end texinfo
2007-12-05 12:17:25 +00:00
The following functions complement @var{ YAP_ RunGoal} :
@table @code
2011-12-05 18:51:57 +00:00
@item @code{ int} YAP_ RestartGoal(@code{ void} )
2007-12-05 12:17:25 +00:00
@findex YAP_ RestartGoal (C-Interface function)
Look for the next solution to the current query by forcing YAP to
backtrack to the latest goal. Notice that slots allocated since the last
@code{ YAP_ RunGoal} will become invalid.
2014-05-25 20:52:45 +01:00
@item @code{ int} YAP_ Reset(@code{ void} )
2007-12-05 12:17:25 +00:00
@findex YAP_ Reset (C-Interface function)
Reset execution environment (similar to the @code{ abort/0}
built-in). This is useful when you want to start a new query before
asking all solutions to the previous query.
@item @code{ int} YAP_ ShutdownGoal(@code{ int backtrack} )
@findex YAP_ ShutdownGoal (C-Interface function)
Clean up the current goal. If
@code{ backtrack} is true, stack space will be recovered and bindings
will be undone. In both cases, any slots allocated since the goal was
created will become invalid.
@item @code{ YAP_ Bool} YAP_ GoalHasException(@code{ YAP_ Term *tp} )
2014-04-21 11:14:18 +01:00
@findex YAP_ GoalHasException (C-Interface function)
2007-12-05 12:17:25 +00:00
Check if the last goal generated an exception, and if so copy it to the
space pointed to by @var{ tp}
@item @code{ void} YAP_ ClearExceptions(@code{ void} )
@findex YAP_ ClearExceptions (C-Interface function)
Reset any exceptions left over by the system.
@end table
The @var{ YAP_ RunGoal} interface is designed to be very robust, but may
not be the most efficient when repeated calls to the same goal are made
and when there is no interest in processing exceptions. The
@var{ YAP_ EnterGoal} interface has lower overhead:
@table @code
@item @code{ YAP_ PredEntryPtr} YAP_ FunctorToPred(@code{ YAP_ Functor} @var{ f} )
@findex YAP_ FunctorToPred (C-Interface function)
Return the predicate whose main functor is @var{ f} .

@item @code{ YAP_ PredEntryPtr} YAP_ AtomToPred(@code{ YAP_ Atom} @var{ at} )
@findex YAP_ AtomToPred (C-Interface function)
Return the arity 0 predicate whose name is @var{ at} .

@item @code{ YAP_ PredEntryPtr} YAP_ FunctorToPredInModule(@code{ YAP_ Functor} @var{ f} , @code{ YAP_ Module} @var{ m} )
@findex YAP_ FunctorToPredInModule (C-Interface function)
Return the predicate in module @var{ m} whose main functor is @var{ f} .

@item @code{ YAP_ PredEntryPtr} YAP_ AtomToPredInModule(@code{ YAP_ Atom} @var{ at} , @code{ YAP_ Module} @var{ m} )
@findex YAP_ AtomToPredInModule (C-Interface function)
Return the arity 0 predicate in module @var{ m} whose name is @var{ at} .

@item @code{ YAP_ Bool} YAP_ EnterGoal(@code{ YAP_ PredEntryPtr} @var{ pe} ,
@code{ YAP_ Term *} @var{ array} , @code{ YAP_ dogoalinfo *} @var{ infop} )
@findex YAP_ EnterGoal (C-Interface function)
Execute a query for predicate @var{ pe} . The query is given as an
array of terms @var{ array} . @var{ infop} is the address of a goal
handle that can be used to backtrack and to recover space. Succeeds if
a solution was found.
Notice that you cannot create new slots while a YAP_ EnterGoal goal is open.

@item @code{ YAP_ Bool} YAP_ RetryGoal(@code{ YAP_ dogoalinfo *} @var{ infop} )
@findex YAP_ RetryGoal (C-Interface function)
Backtrack to a query created by @code{ YAP_ EnterGoal} . The query is
given by the handle @var{ infop} . Returns whether a new solution could
be found.

@item @code{ YAP_ Bool} YAP_ LeaveGoal(@code{ YAP_ Bool} @var{ backtrack} ,
@code{ YAP_ dogoalinfo *} @var{ infop} )
@findex YAP_ LeaveGoal (C-Interface function)
Exit a query created by @code{ YAP_ EnterGoal} . If
@code{ backtrack} is @code{ TRUE} , variable bindings are undone and Heap
space is recovered. Otherwise, only stack space is recovered, i.e.,
@code{ LeaveGoal} executes a cut.
@end table
Next follows an example of how to use @code{ YAP_ EnterGoal} :
@example
void
runall(YAP_Term g)
@{
  YAP_dogoalinfo goalInfo;
  YAP_Term goalArgs[16];          /* enough room for this example */
  YAP_Functor goalFunctor = YAP_FunctorOfTerm(g);
  YAP_PredEntryPtr goalPred = YAP_FunctorToPred(goalFunctor);
  unsigned int i, arity = YAP_ArityOfFunctor(goalFunctor);
  YAP_Bool result;

  /* copy the goal arguments into the array expected by YAP_EnterGoal */
  for (i = 1; i <= arity; i++)
    goalArgs[i-1] = YAP_ArgOfTerm(i, g);

  result = YAP_EnterGoal(goalPred, goalArgs, &goalInfo);
  while (result)
    result = YAP_RetryGoal(&goalInfo);
  YAP_LeaveGoal(TRUE, &goalInfo);
@}
@end example
@findex YAP_ CallProlog (C-Interface function)
YAP allows calling a @strong{ new} Prolog interpreter from @code{ C} . One
way is to first construct a goal @code{ G} , and then it is sufficient to
perform:

@table @code
@item YAP_ Bool YAP_ CallProlog(YAP_ Term @var{ G} )
@end table

@noindent
the result will be @code{ FALSE} , if the goal failed, or @code{ TRUE} , if
the goal succeeded. In this case, the variables in @var{ G} will store
the values they have been unified with. Execution only proceeds until
finding the first solution to the goal, but you can call
@code{ findall/3} or friends if you need all the solutions.

Notice that during execution, garbage collection or stack shifting may
have moved the terms, so term references obtained before the call should
be protected, for instance by using slots.
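For instance, assuming a user predicate @code{go/1} has already been
loaded, a goal term can be built with the term construction functions and
passed to @code{YAP_CallProlog} (a sketch, not part of the YAP sources):

@example
#include <stdio.h>
#include "YAPInterface.h"

/* Build the goal go(X) and run it in a new Prolog interpreter. */
static void call_go(void)
@{
  YAP_Term args[1];
  YAP_Functor f = YAP_MkFunctor(YAP_LookupAtom("go"), 1);

  args[0] = YAP_MkVarTerm();
  if (YAP_CallProlog(YAP_MkApplTerm(f, 1, args)))
    printf("go/1 succeeded\n");
  else
    printf("go/1 failed\n");
@}
@end example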
@node Module Manipulation in C, Miscellaneous C-Functions, Calling YAP From C, C-Interface
@section Module Manipulation in C
YAP allows one to create a new module from C-code. To create the new
module it is sufficient to call:

@table @code
@item YAP_ Module YAP_ CreateModule(YAP_ Atom @var{ ModuleName} )
@end table

@noindent
Notice that the new module does not have any predicates associated and
that it is not the current module. To find the current module, you can call:

@table @code
@item YAP_ Module YAP_ CurrentModule()
@end table

Given a module, you may want to obtain the corresponding name. This is
possible by using:

@table @code
@item YAP_ Term YAP_ ModuleName(YAP_ Module mod)
@end table

@noindent
Notice that this function returns a term, and not an atom. You can use
@code{ YAP_ AtomOfTerm} to extract the corresponding Prolog atom.
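A small sketch (not part of the YAP sources) combining these calls; the
module name @code{scratch} is arbitrary:

@example
#include <stdio.h>
#include "YAPInterface.h"

static void make_scratch_module(void)
@{
  /* create a fresh, empty module named scratch */
  YAP_Module m = YAP_CreateModule(YAP_LookupAtom("scratch"));
  /* YAP_ModuleName returns a term; extract the atom and its name */
  YAP_Term name = YAP_ModuleName(m);

  printf("created module %s\n", YAP_AtomName(YAP_AtomOfTerm(name)));
@}
@end example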
@node Miscellaneous C-Functions, Writing C, Module Manipulation in C, C-Interface
@section Miscellaneous C Functions
@table @code
@item @code{ void} YAP_ Throw(@code{ YAP_ Term exception} )
@item @code{ void} YAP_ AsyncThrow(@code{ YAP_ Term exception} )
@findex YAP_ Throw (C-Interface function)
@findex YAP_ AsyncThrow (C-Interface function)
Throw an exception with term @var{ exception} , just as if you had called
@code{ throw/1} . The function @t{ YAP_ AsyncThrow} is supposed to be used
from interrupt handlers.
@c See also @code{ at_ halt/1} .

@item @code{ int} YAP_ SetYAPFlag(@code{ yap_ flag_ t flag, int value} )
@findex YAP_ SetYAPFlag (C-Interface function)
This function allows setting some YAP flags from @code{ C} . Currently,
only two boolean flags are accepted: @code{ YAPC_ ENABLE_ GC} and
@code{ YAPC_ ENABLE_ AGC} . The first enables/disables the standard garbage
collector, the second does the same for the atom garbage collector.

@item @code{ YAP_ Term} YAP_ AllocExternalDataInStack(@code{ size_ t bytes} )
@item @code{ void *} YAP_ ExternalDataInStackFromTerm(@code{ YAP_ Term t} )
@item @code{ YAP_ Bool} YAP_ IsExternalDataInStackTerm(@code{ YAP_ Term t} )
@findex YAP_ AllocExternalDataInStack (C-Interface function)
These routines allow one to store external data in the Prolog
execution stack. The first routine reserves space for @var{ bytes} bytes
and returns an opaque handle. The second routine receives the handle
and returns a pointer to the data. The last routine checks whether a term
is such an opaque handle.
Data will be automatically reclaimed during
backtracking. Also, this storage is opaque to the Prolog garbage collector,
so it should not be used to store Prolog terms. On the other hand, it
may be useful to store arrays in a compact way, or pointers to external objects.

@item @code{ int} YAP_ HaltRegisterHook(@code{ YAP_ halt_ hook f, void *closure} )
@findex YAP_ HaltRegisterHook (C-Interface function)
Register the function @var{ f} to be called if YAP is halted. The
function is called with two arguments: the exit code of the process
(@code{ 0} if this cannot be determined on your operating system) and
the closure argument @var{ closure} .
@c See also @code{ at_ halt/1} .

@item @code{ int} YAP_ Argv(@code{ char ***argvp} )
@findex YAP_ Argv (C-Interface function)
Return the number of arguments given to YAP and set @var{ argvp} to point
to the array of such arguments.
@end table
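The sketch below (not part of the YAP sources) exercises some of these
functions; the halt hook prototype is assumed from the description above,
and the names @code{report_exit} and @code{setup_hooks} are only
illustrative:

@example
#include <stdio.h>
#include "YAPInterface.h"

/* Halt hook, assumed to receive the exit code and the closure pointer. */
static void report_exit(int exit_code, void *closure)
@{
  fprintf(stderr, "YAP halting with code %d\n", exit_code);
@}

static void setup_hooks(void)
@{
  char **argv;
  int i, argc = YAP_Argv(&argv);

  for (i = 0; i < argc; i++)
    printf("arg[%d] = %s\n", i, argv[i]);
  YAP_HaltRegisterHook(report_exit, NULL);
  YAP_SetYAPFlag(YAPC_ENABLE_GC, 1);   /* keep the garbage collector on */
@}
@end example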
@node Writing C, Loading Objects, Miscellaneous C-Functions, C-Interface
@section Writing predicates in C
We will distinguish two kinds of predicates:
@table @i
@item @i{ deterministic} predicates which either fail or succeed but are not
backtrackable, like the one in the introduction;
@item @i{ backtrackable}
predicates which can succeed more than once.
@end table

The first kind of predicates should be implemented as a C function with
no arguments which should return zero if the predicate fails and a
non-zero value otherwise. The predicate should be declared to
YAP, in the initialization routine, with a call to

@table @code
@item void YAP_ UserCPredicate(char *@var{ name} , YAP_ Bool *@var{ fn} (), unsigned long int @var{ arity} );
@findex YAP_ UserCPredicate (C-Interface function)
@noindent
where @var{ name} is a string with the name of the predicate, @var{ fn} is
the C function implementing the predicate, and @var{ arity} is the
predicate's arity.
@end table
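As an illustration (this code is not part of the YAP distribution), a
deterministic predicate @code{my_pid/1} that unifies its argument with the
process identifier could be written and registered as follows; the names
@code{p_my_pid} and @code{init_my_pid} are only illustrative:

@example
#include <unistd.h>
#include "YAPInterface.h"

/* my_pid(Pid): unify the first argument with the process identifier. */
static YAP_Bool p_my_pid(void)
@{
  return YAP_Unify(YAP_ARG1, YAP_MkIntTerm(getpid()));
@}

/* Initialization routine: register the predicate with YAP. */
void init_my_pid(void)
@{
  YAP_UserCPredicate("my_pid", p_my_pid, 1);
@}
@end example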
For the second kind of predicates we need three C functions. The first one
is called when the predicate is first activated; the second one
is called on backtracking to provide (possibly) other solutions; the
last one is called on pruning. Note
also that we normally need to preserve some information to find out
the next solution.

In fact the role of the first two functions can be better understood from the
following Prolog definition
@example
p :- start.
p :- repeat,
continue.
@end example
@noindent
where @code{ start} and @code{ continue} correspond to the two C functions
described above.
The interface works as follows:
@table @code
@item void YAP_ UserBackCutCPredicate(char *@var{ name} , int *@var{ init} (), int *@var{ cont} (), int *@var{ cut} (), unsigned long int @var{ arity} , unsigned int @var{ sizeof} )
@findex YAP_ UserBackCutCPredicate (C-Interface function)
@noindent
describes a new predicate where @var{ name} is the name of the predicate,
@var{ init} , @var{ cont} , and @var{ cut} are the C functions that implement
the predicate, @var{ arity} is the predicate's arity, and @var{ sizeof} is
the number of cells to be preserved across backtracking.
@item void YAP_ UserBackCPredicate(char *@var{ name} , int *@var{ init} (), int *@var{ cont} (), unsigned long int @var{ arity} , unsigned int @var{ sizeof} )
@findex YAP_ UserBackCPredicate (C-Interface function)
@noindent
describes a new predicate where @var{ name} is the name of the predicate,
@var{ init} , and @var{ cont} are the C functions that implement the
predicate, @var{ arity} is the predicate's arity, and @var{ sizeof} is the
number of cells to be preserved across backtracking.
@item void YAP_ PRESERVE_ DATA(@var{ ptr} , @var{ type} );
@findex YAP_ PRESERVE_ DATA (C-Interface function)
@item void YAP_ PRESERVED_ DATA(@var{ ptr} , @var{ type} );
@findex YAP_ PRESERVED_ DATA (C-Interface function)
@item void YAP_ PRESERVED_ DATA_ CUT(@var{ ptr} , @var{ type} );
@findex YAP_ PRESERVED_ DATA_ CUT (C-Interface function)
@item void YAP_ cut_ succeed( void );
@findex YAP_ cut_ succeed (C-Interface function)
@item void YAP_ cut_ fail( void );
@findex YAP_ cut_ fail (C-Interface function)
@end table
As an example we will consider implementing in C a predicate @code{ n100(N)}
which, when called with an instantiated argument, should succeed if that
argument is a non-negative integer not greater than 100, and, when called
with an uninstantiated argument, should provide, by backtracking, all the
non-negative integers up to 100.

To do that we first declare a structure, which can only consist
of Prolog terms, containing the information to be preserved on backtracking,
and a pointer variable to a structure of that type.
@example
#include "YAPInterface.h"

static int start_n100(void);
static int continue_n100(void);

typedef struct @{
    YAP_Term next_solution;
@} n100_data_type;

n100_data_type *n100_data;
@end example
We now write the @code{ C} function to handle the first call:

@example
static int start_n100(void)
@{
    YAP_Term t = YAP_ARG1;

    YAP_PRESERVE_DATA(n100_data, n100_data_type);
    if (YAP_IsVarTerm(t)) @{
      n100_data->next_solution = YAP_MkIntTerm(0);
      return continue_n100();
    @}
    if (!YAP_IsIntTerm(t) || YAP_IntOfTerm(t) < 0 || YAP_IntOfTerm(t) > 100) @{
      YAP_cut_fail();
    @} else @{
      YAP_cut_succeed();
    @}
@}
@end example
The routine starts by getting the dereferenced value of the argument.
The call to @code{ YAP_ PRESERVE_ DATA} is used to initialize the memory
which will hold the information to be preserved across
backtracking. The first argument is the variable we shall use, and the
second its type. Note that we can only use @code{ YAP_ PRESERVE_ DATA}
once, so often we will want the variable to be a structure. This data
is visible to the garbage collector, so it should consist of Prolog
terms, as in the example. It is also correct to store pointers to
objects external to the YAP stacks, as the garbage collector will ignore
such references.

If the argument of the predicate is a variable, the routine initializes the
structure to be preserved across backtracking with the information
required to provide the next solution, and exits by calling
@code{ continue_ n100} to provide that solution.

If the argument was not a variable, the routine then checks if it was an
integer, and if so, if its value is non-negative and not greater than
100. In that case it exits, denoting success, with @code{ YAP_ cut_ succeed} ,
or otherwise exits with @code{ YAP_ cut_ fail} denoting failure.

The reason for using the functions @code{ YAP_ cut_ succeed} and
@code{ YAP_ cut_ fail} instead of just returning a non-zero value in the
first case, and zero in the second case, is that otherwise, if
backtracking occurred later, the routine @code{ continue_ n100} would be
called to provide additional solutions.
The code required for the second function is
@example
static int continue_n100(void)
@{
    int n;
    YAP_Term t;
    YAP_Term sol = YAP_ARG1;

    YAP_PRESERVED_DATA(n100_data, n100_data_type);
    n = YAP_IntOfTerm(n100_data->next_solution);
    if (n == 100) @{
      t = YAP_MkIntTerm(n);
      YAP_Unify(sol, t);
      YAP_cut_succeed();
    @}
    else @{
      YAP_Unify(sol, n100_data->next_solution);
      n100_data->next_solution = YAP_MkIntTerm(n+1);
      return(TRUE);
    @}
@}
@end example
Note that again the macro @code{ YAP_ PRESERVED_ DATA} is used at the
beginning of the function to access the data preserved from the previous
solution. Then it checks if the last solution was found and in that
case exits with @code{ YAP_ cut_ succeed} in order to cut any further
backtracking. If this is not the last solution then we save the value
for the next solution in the data structure and exit normally with 1
denoting success. Note also that in either case we use the
function @code{ YAP_ Unify} to bind the argument of the call to the value
saved in @code{ n100_ data->next_ solution} .

Note also that the only correct way to signal failure in a backtrackable
predicate is to use the @code{ YAP_ cut_ fail} macro.
Backtrackable predicates should be declared to YAP, in a way
similar to what happened with deterministic ones, but using instead a
call to
@example
void YAP_UserBackCutCPredicate(char *name, int *init(), int *cont(), int *cut(), unsigned long int arity, unsigned int sizeof);
@end example
@noindent
For the @code{ n100/1} example, we would have something like

@example
void
init_n100(void)
@{
  YAP_UserBackCutCPredicate("n100", start_n100, continue_n100, cut_n100, 1, 1);
@}
@end example
The argument before last is the predicate's arity. Notice again the
last argument to the call: it gives the extra space, in cells, that we
want to use for @code{ PRESERVED_ DATA} . A cell is the same size as a
pointer. The garbage collector has access
to this space, hence users should use it either to store terms or to
store pointers to objects outside the stacks.

The code for @code{ cut_ n100} could be:
@example
static int cut_n100(void)
@{
  YAP_PRESERVED_DATA_CUT(n100_data, n100_data_type);
  fprintf(stderr, "n100 cut with counter %ld\n",
          (long)YAP_IntOfTerm(n100_data->next_solution));
  return TRUE;
@}
@end example
Notice that we have to use @code{ YAP_ PRESERVED_ DATA_ CUT} : this is
because the Prolog engine is in a different state during cut.
If no work is required at cut, we can use:
@example
void
init_n100(void)
@{
  YAP_UserBackCutCPredicate("n100", start_n100, continue_n100, NULL, 1, 1);
@}
@end example
In this case no code is executed at cut time.
@node Loading Objects, Save& Rest, Writing C, C-Interface
@section Loading Object Files

The primitive predicate
@table @code
@item load_ foreign_ files(@var{ ObjectFiles} ,@var{ Libs} ,@var{ InitRoutine} )
@end table
@noindent
should be used, from inside YAP, to load object files produced by the C
compiler. The argument @var{ ObjectFiles} should be a list of atoms
specifying the object files to load, @var{ Libs} is a list (possibly
empty) of libraries to be passed to the unix loader (@code{ ld} ) and
@var{ InitRoutine} is the name of the C routine (to be called after the files
are loaded) to perform the necessary declarations to YAP of the
predicates defined in the files.

YAP will search for @var{ ObjectFiles} in the current directory first. If
it cannot find them it will search for the files using the environment
variable @code{ YAPLIBDIR} ,
@findex YAPLIBDIR
if defined, or in the default library directory.
YAP also supports the SWI-Prolog interface to loading foreign code:
@table @code
@item open_ shared_ object(+@var{ File} , -@var{ Handle} )
@findex open_ shared_ object/2
@snindex open_ shared_ object/2
@cnindex open_ shared_ object/2
@var{ File} is the name of a shared object file (called dynamic load
library in MS-Windows). This file is attached to the current process
and @var{ Handle} is unified with a handle to the library. Equivalent to
@code{ open_ shared_ object(File, [], Handle)} . See also
@code{ load_ foreign_ library/1} and @code{ load_ foreign_ library/2} .
On errors, an exception @code{ shared_ object} (@var{ Action} ,
@var{ Message} ) is raised. @var{ Message} is the return value from
dlerror().
@item open_ shared_ object(+@var{ File} , -@var{ Handle} , +@var{ Options} )
@findex open_ shared_ object/3
@snindex open_ shared_ object/3
@cnindex open_ shared_ object/3
As @code{ open_ shared_ object/2} , but allows for additional flags to
be passed. @var{ Options} is a list of atoms. @code{ now} implies the
symbols are
resolved immediately rather than lazily (default). @code{ global} implies
symbols of the loaded object are visible while loading other shared
objects (by default they are local). Note that these flags may not
be supported by your operating system. Check the documentation of
@code{ dlopen()} or equivalent on your operating system. Unsupported
flags are silently ignored.
@item close_ shared_ object(+@var{ Handle} )
@findex close_ shared_ object/1
@snindex close_ shared_ object/1
@cnindex close_ shared_ object/1
Detach the shared object identified by @var{ Handle} .
@item call_ shared_ object_ function(+@var{ Handle} , +@var{ Function} )
@findex call_ shared_ object_ function/2
@snindex call_ shared_ object_ function/2
@cnindex call_ shared_ object_ function/2
Call the named function in the loaded shared library. The function
is called without arguments and the return-value is
ignored. In SWI-Prolog, normally this function installs foreign
language predicates using calls to @code{ PL_ register_ foreign()} .
@end table
@node Save& Rest, YAP4 Notes, Loading Objects, C-Interface
@section Saving and Restoring
@comment The primitive predicates @code{ save} and @code{ restore} will save and restore
@comment object code loaded with @code{ load_ foreign_ files/3} . However, the values of
@comment any non-static data created by the C files loaded will not be saved nor
@comment restored.
YAP4 currently does not support @code{ save} and @code{ restore} for object code
loaded with @code{ load_ foreign_ files/3} . We plan to support save and restore
in future releases of YAP.
@node YAP4 Notes, , Save& Rest, C-Interface
@section Changes to the C-Interface in YAP4

YAP4 includes several changes over the previous @code{ load_ foreign_ files/3}
interface. These changes were required to support the new binary code
formats, such as ELF used in Solaris2 and Linux.
@itemize @bullet
@item All names of YAP objects now start with @code{ YAP_ } . This is
designed to avoid clashes with other code. Use @code{ YAPInterface.h} to
take advantage of the new interface. @code{ c_ interface.h} is still
available if you cannot port the code to the new interface.
@item Access to elements in the new interface always goes through
@emph{ functions} . This includes access to the argument registers,
@code{ YAP_ ARG1} to @code{ YAP_ ARG16} . This change breaks code such as
@code{ unify(& ARG1,& t)} , which should now be written as:
@example
@{
   YAP_Unify(YAP_ARG1, t);
@}
@end example
@item @code{ cut_ fail()} and @code{ cut_ succeed()} are now functions.
@item The use of @code{ Deref} is deprecated. All functions that return
Prolog terms, including the ones that access arguments, already
dereference their arguments.
@item Space allocated with PRESERVE_ DATA is ignored by garbage
collection and stack shifting. As a result, any pointers to a Prolog
stack object, including some terms, may be corrupted after garbage
collection or stack shifting. Prolog terms should instead be stored as
arguments to the backtrackable procedure.
@end itemize
@node YAPLibrary, Compatibility, C-Interface, Top
@chapter Using YAP as a Library

YAP can be used as a library to be called from other
programs. To do so, you must first create the YAP library:
@example
make library
make install_library
@end example
This will install a file @code{ libyap.a} in @var{ LIBDIR} and the Prolog
headers in @var{ INCLUDEDIR} . The library contains all the functionality
available in YAP, except the foreign function loader and
@code{ YAP} 's startup routines.
To actually use this library you must follow a five step process:

@enumerate
@item
You must initialize the YAP environment. A single function,
@code{ YAP_ FastInit} , asks for a contiguous chunk in your memory space, fills
it in with the data-base, and sets up YAP's stacks and
execution registers. You can use a saved state from a standard system,
created by calling @code{ save_ program/1} .
@item You then have to prepare a query to give to
YAP. A query is a Prolog term, and you just have to use the same
functions that are available in the C-interface.
@item You can then use @code{ YAP_ RunGoal(query)} to actually evaluate your
query. The argument is the query term @code{ query} , and the result is 1
if the query succeeded, and 0 if it failed.
@item You can use the term destructor functions to check how
arguments were instantiated.
@item If you want extra solutions, you can use
@code{ YAP_ RestartGoal()} to obtain the next solution.
@end enumerate
The next program shows how to use this system. We assume the saved
program contains two facts for the procedure @t{ b} :
@example
@cartouche
#include <stdio.h>
#include <stdlib.h>

#include "YAP/YAPInterface.h"

int
main(int argc, char *argv[]) @{
  if (YAP_FastInit("saved_state") == YAP_BOOT_ERROR)
    exit(1);
  if (YAP_RunGoal(YAP_MkAtomTerm(YAP_LookupAtom("do")))) @{
    printf("Success\n");
    while (YAP_RestartGoal())
      printf("Success\n");
  @}
  printf("NO\n");
  return 0;
@}
@end cartouche
@end example

The program first initializes YAP, calls the query for the
first time and succeeds, and then backtracks twice. The first time
backtracking succeeds, the second it fails and exits.
To compile this program it should be sufficient to do:
@example
cc -o exem -I../YAP4.3.0 test.c -lYAP -lreadline -lm
@end example
You may need to adjust the libraries and library paths depending on the
Operating System and your installation of YAP.

Note that YAP4.3.0 provides the first version of the interface. The
interface may change and improve in the future.

The following C-functions are available from YAP:
@itemize @bullet
@item YAP_ CompileClause(@code{ YAP_ Term} @var{ Clause} )
@findex YAP_ CompileClause/1
Compile the Prolog term @var{ Clause} and assert it as the last clause
for the corresponding procedure.

@item @code{ int} YAP_ ContinueGoal(@code{ void} )
@findex YAP_ ContinueGoal/0
Continue execution from the point where it stopped.

@item @code{ void} YAP_ Error(@code{ int} @var{ ID} ,@code{ YAP_ Term} @var{ Cause} ,@code{ char *} @var{ error_ description} )
@findex YAP_ Error/1
Generate a YAP system error with description given by the string
@var{ error_ description} . @var{ ID} is the error ID, if known, or
@code{ 0} . @var{ Cause} is the term that caused the crash.

@item @code{ void} YAP_ Exit(@code{ int} @var{ exit_ code} )
@findex YAP_ Exit/1
Exit YAP immediately. The argument @var{ exit_ code} gives the error code
and is supposed to be 0 after successful execution in Unix and Unix-like
systems.

@item @code{ YAP_ Term} YAP_ GetValue(@code{ YAP_ Atom} @var{ at} )
@findex YAP_ GetValue/1
Return the term @var{ value} associated with the atom @var{ at} . If no
such term exists the function will return the empty list.

@item YAP_ FastInit(@code{ char *} @var{ SavedState} )
@findex YAP_ FastInit/1
Initialize a copy of YAP from @var{ SavedState} . The copy is
monolithic and currently must be loaded at the same address where it was
saved. @code{ YAP_ FastInit} is a simpler version of @code{ YAP_ Init} .
@item YAP_ Init(@var{ InitInfo} )
@findex YAP_ Init/1
Initialize YAP. The arguments are in a @code{ C}
structure of type @code{ YAP_ init_ args} .
The fields of @var{ InitInfo} are @code{ char *} @var{ SavedState} ,
@code{ int} @var{ HeapSize} , @code{ int} @var{ StackSize} , @code{ int}
@var{ TrailSize} , @code{ int} @var{ NumberofWorkers} , @code{ int}
@var{ SchedulerLoop} , @code{ int} @var{ DelayedReleaseLoad} , @code{ int}
@var{ argc} , @code{ char **} @var{ argv} , @code{ int} @var{ ErrorNo} , and
@code{ char *} @var{ ErrorCause} . The function returns an integer, which
indicates the current status. If the result is @code{ YAP_ BOOT_ ERROR}
booting failed.

If @var{ SavedState} is not NULL, try to open and restore the file
@var{ SavedState} . Initially YAP will search in the current directory. If
the saved state does not exist in the current directory YAP will use
either the default library directory or the directory given by the
environment variable @code{ YAPLIBDIR} . Note that currently
the saved state must be loaded at the same address where it was saved.

If @var{ HeapSize} is different from 0 use @var{ HeapSize} as the minimum
size of the Heap (or code space). If @var{ StackSize} is different from 0
use @var{ StackSize} as the minimum size for the Stacks. If
@var{ TrailSize} is different from 0 use @var{ TrailSize} as the minimum
size for the Trail.

The @var{ NumberofWorkers} , @var{ SchedulerLoop} , and
@var{ DelayedReleaseLoad} fields are only of interest to the or-parallel system.
The argument count @var{ argc} and argument vector @var{ argv}
are to be passed to user programs as the arguments used to
call YAP.

If booting failed you may consult @code{ ErrorNo} and @code{ ErrorCause}
for the cause of the error, or call
@code{ YAP_ Error(ErrorNo,0L,ErrorCause)} to do default processing.
@item @code{ void} YAP_ PutValue(@code{ YAP_ Atom} @var{ at} , @code{ YAP_ Term} @var{ value} )
@findex YAP_ PutValue/2
Associate the term @var{ value} with the atom @var{ at} . The term
@var{ value} must be a constant. This functionality is used by YAP as a
simple way for controlling and communicating with the Prolog run-time.

@item @code{ YAP_ Term} YAP_ Read(@code{ IOSTREAM *Stream} )
@findex YAP_ Read
Parse a @var{ Term} from the stream @var{ Stream} .

@item @code{ YAP_ Term} YAP_ CopyTerm(@code{ YAP_ Term} @var{ t} )
@findex YAP_ CopyTerm
Copy a Term @var{ t} and all associated constraints. May call the garbage
collector and returns @code{ 0L} on error (such as no space being
available).

@item @code{ void} YAP_ Write(@code{ YAP_ Term} @var{ t} , @code{ IOSTREAM} @var{ stream} , @code{ int} @var{ flags} )
@findex YAP_ Write/3
Write a Term @var{ t} using the stream @var{ stream} to output
characters. The term is written according to a mask of the following
flags in the @var{ flags} argument: @code{ YAP_ WRITE_ QUOTED} ,
@code{ YAP_ WRITE_ HANDLE_ VARS} , @code{ YAP_ WRITE_ USE_ PORTRAY} , and @code{ YAP_ WRITE_ IGNORE_ OPS} .
@item @code{ int} YAP_ WriteBuffer(@code{ YAP_ Term} @var{ t} , @code{ char *} @var{ buff} , @code{ size_ t} @var{ size} , @code{ int} @var{ flags} )
@findex YAP_ WriteBuffer
Write a YAP_ Term @var{ t} to buffer @var{ buff} with size
@var{ size} . The term is written
according to a mask of the following flags in the @var{ flags}
argument: @code{ YAP_ WRITE_ QUOTED} , @code{ YAP_ WRITE_ HANDLE_ VARS} ,
@code{ YAP_ WRITE_ USE_ PORTRAY} , and @code{ YAP_ WRITE_ IGNORE_ OPS} . The
function will fail if it does not have enough space in the buffer
(a short usage sketch follows this list).

@item @code{ char *} YAP_ WriteDynamicBuffer(@code{ YAP_ Term} @var{ t} , @code{ char *} @var{ buff} , @code{ size_ t} @var{ size} , @code{ size_ t} @var{ *lengthp} , @code{ size_ t} @var{ *encodingp} , @code{ int} @var{ flags} )
@findex YAP_ WriteDynamicBuffer/6
Write a YAP_ Term @var{ t} to buffer @var{ buff} with size
@var{ size} . The code will allocate an extra buffer if @var{ buff} is
@code{ NULL} or if @var{ buff} does not have enough room. The
variable @var{ lengthp} is assigned the size of the resulting buffer,
and @var{ encodingp} will receive the type of encoding (currently only
@code{ PL_ ENC_ ISO_ LATIN_ 1} and @code{ PL_ ENC_ WCHAR} are supported).

@item @code{ void} YAP_ InitConsult(@code{ int} @var{ mode} , @code{ char *} @var{ filename} )
@findex YAP_ InitConsult/2
Enter consult mode on file @var{ filename} . This mode maintains a few
data-structures internally, for instance to know whether a predicate
has been defined before or not. It is still possible to execute goals
in consult mode.

If @var{ mode} is @code{ TRUE} the file will be reconsulted, otherwise
just consulted. In practice, this function is most useful for
bootstrapping Prolog, as otherwise one may call the Prolog predicate
@code{ compile/1} or @code{ consult/1} to do compilation.

Note that it is up to the user to open the file @var{ filename} . The
@code{ YAP_ InitConsult} function only uses the file name for internal
bookkeeping.

@item @code{ void} YAP_ EndConsult(@code{ void} )
@findex YAP_ EndConsult/0
Finish consult mode.
@end itemize
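As mentioned above, here is a minimal sketch of @code{YAP_WriteBuffer}
(assuming a non-zero return value means the term fitted; the buffer size
is arbitrary):

@example
#include <stdio.h>
#include "YAPInterface.h"

/* Print a quoted rendering of term t if it fits in a small buffer. */
static void print_term(YAP_Term t)
@{
  char buf[256];

  if (YAP_WriteBuffer(t, buf, sizeof(buf), YAP_WRITE_QUOTED))
    printf("term: %s\n", buf);
@}
@end example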
Some observations:
@itemize @bullet
@item The system will core dump if you try to load the saved state at a
different address from where it was made. This may be a problem if
your program uses @code{ mmap} . This problem will be addressed in future
versions of YAP.
@item Currently, the YAP library will pollute the name
space of your program.
@item The initial library includes the complete YAP system. In
the future we plan to split this library into several smaller libraries
(e.g. if you do not want to perform Input/Output).
@item You can generate your own saved states. Look at the
@code{ boot.yap} and @code{ init.yap} files.
@end itemize
@node Compatibility, Operators, YAPLibrary, Top
@chapter Compatibility with Other Prolog systems
YAP has been designed to be as compatible as possible with
other Prolog systems, and initially with C-Prolog. More recent work on
YAP has included features initially proposed for the Quintus
and SICStus Prolog systems.
Since @code{ YAP4.1.6} we have striven to make
YAP compatible with the ISO-Prolog standard.
@menu
* C-Prolog:: Compatibility with the C-Prolog interpreter
* SICStus Prolog:: Compatibility with the SICStus Prolog system
* ISO Prolog:: Compatibility with the ISO Prolog standard
@end menu
@node C-Prolog, SICStus Prolog, , Compatibility
@section Compatibility with the C-Prolog interpreter
@menu
C-Prolog Compatibility
* Major Differences with C-Prolog:: Major Differences between YAP and C-Prolog
* Fully C-Prolog Compatible:: YAP predicates fully compatible with C-Prolog
* Not Strictly C-Prolog Compatible:: YAP predicates not strictly as C-Prolog
* Not in C-Prolog:: YAP predicates not available in C-Prolog
* Not in YAP:: C-Prolog predicates not available in YAP
@end menu
@node Major Differences with C-Prolog, Fully C-Prolog Compatible, , C-Prolog
@subsection Major Differences between YAP and C-Prolog.
YAP includes several extensions over the original C-Prolog system. Even
so, most C-Prolog programs should run under YAP without changes.
The most important difference between YAP and C-Prolog is that, since
YAP is a compiler, some changes should be made if predicates such as
@code{ assert} , @code{ clause} and @code{ retract} are used. First,
predicates which will change during execution should be declared as
@code{ dynamic} by using commands like:
@example
:- dynamic f/n.
@end example
@noindent where @code{ f} is the predicate name and @code{ n} is the arity of the
predicate. Note that several such predicates can be declared in a
single command:
@example
:- dynamic f/2, ..., g/1.
@end example
Primitive predicates such as @code{ retract} apply only to dynamic
predicates. Finally note that not all the C-Prolog primitive predicates
are implemented in YAP. They can easily be detected using the
@code{ unknown} system predicate provided by YAP.
Last, by default YAP enables character escapes in strings. You can
disable the special interpretation for the escape character by using:
@example
:- yap_flag(character_escapes,off).
@end example
@noindent
or by using:
@example
:- yap_flag(language,cprolog).
@end example
@node Fully C-Prolog Compatible, Not Strictly C-Prolog Compatible, Major Differences with C-Prolog, C-Prolog
@subsection YAP predicates fully compatible with C-Prolog
These are the Prolog built-ins that are fully compatible in both
C-Prolog and YAP:
@printindex cy

@node Not Strictly C-Prolog Compatible, Not in C-Prolog, Fully C-Prolog Compatible, C-Prolog
@subsection YAP predicates not strictly compatible with C-Prolog
These are YAP built-ins that are also available in C-Prolog, but
that are not fully compatible:
@printindex ca

@node Not in C-Prolog, Not in YAP, Not Strictly C-Prolog Compatible, C-Prolog
@subsection YAP predicates not available in C-Prolog
These are YAP built-ins not available in C-Prolog.
@printindex cn

@node Not in YAP, , Not in C-Prolog, C-Prolog
@subsection C-Prolog predicates not available in YAP
These are C-Prolog built-ins not available in YAP:
@table @code
@item 'LC'
The following Prolog text uses lower case letters.
@item 'NOLC'
The following Prolog text uses upper case letters only.
@end table
@node SICStus Prolog, ISO Prolog, C-Prolog, Compatibility
@section Compatibility with the Quintus and SICStus Prolog systems
The Quintus Prolog system was the first Prolog compiler to use Warren's
Abstract Machine. This system was very influential in the Prolog
community. Quintus Prolog implemented compilation into an abstract
machine code, which was then emulated. Quintus Prolog also included
several new built-ins, an extensive library, and in later releases a
garbage collector. The SICStus Prolog system, developed at SICS (Swedish
Institute of Computer Science), is an emulator based Prolog system
largely compatible with Quintus Prolog. SICStus Prolog has evolved
through several versions. The current version includes several
extensions, such as an object implementation, co-routining, and
constraints.
Recent work in YAP has been influenced by work in Quintus and
SICStus Prolog. Wherever possible, we have tried to make YAP
compatible with recent versions of these systems, and specifically of
SICStus Prolog. You should use
@example
:- yap_flag(language, sicstus).
@end example
@noindent
for maximum compatibility with SICStus Prolog.
@menu
SICStus Compatibility
* Major Differences with SICStus:: Major Differences between YAP and SICStus Prolog
* Fully SICStus Compatible:: YAP predicates fully compatible with SICStus Prolog
* Not Strictly SICStus Compatible:: YAP predicates not strictly as SICStus Prolog
* Not in SICStus Prolog:: YAP predicates not available in SICStus Prolog
@end menu
@node Major Differences with SICStus, Fully SICStus Compatible, , SICStus Prolog
@subsection Major Differences between YAP and SICStus Prolog.
Both YAP and SICStus Prolog obey the Edinburgh Syntax and are based on
the WAM. Even so, there are quite a few important differences:
@itemize @bullet
@item Unlike SICStus Prolog, YAP does not have a
notion of interpreted code. All code in YAP is compiled.
@item YAP does not support an intermediate byte-code
representation, so the @code{ fcompile/1} and @code{ load/1} built-ins are
not available in YAP.
@item YAP implements escape sequences as in the ISO standard. SICStus
Prolog implements Unix-like escape sequences.
@item YAP implements @code{ initialization/1} as per the ISO
standard. Use @code{ prolog_ initialization/1} for the SICStus Prolog
compatible built-in.
@item Prolog flags are different in SICStus Prolog and in YAP.
@item The SICStus Prolog @code{ on_ exception/3} and
@code{ raise_ exception} built-ins correspond to the ISO built-ins
@code{ catch/3} and @code{ throw/1} .
@item The following SICStus Prolog v3 built-ins are not (currently)
implemented in YAP (note that this is only a partial list):
@code{ file_ search_ path/2} ,
@code{ stream_ interrupt/3} , @code{ reinitialize/0} , @code{ help/0} ,
@code{ help/1} , @code{ trimcore/0} , @code{ load_ files/1} ,
@code{ load_ files/2} , and @code{ require/1} .
The previous list is incomplete. We also cannot guarantee full
compatibility for other built-ins (although we will try to address any
such incompatibilities). Last, SICStus Prolog is an evolving system, so
one can expect new incompatibilities to be introduced in future
releases of SICStus Prolog.
@item YAP allows asserting and abolishing static code during
execution through the @code{ assert_ static/1} and @code{ abolish/1}
built-ins. This is not allowed in Quintus Prolog or SICStus Prolog.
@item The socket predicates, although designed to be compatible with
SICStus Prolog, are built-ins, not library predicates, in YAP.
@item This list is incomplete.
@end itemize
The following differences only exist if the @code{ language} flag is set
to @code{ yap} (the default):
@itemize @bullet
@item The @code{ consult/1} predicate in YAP follows C-Prolog
semantics. That is, it adds clauses to the data base, even for
preexisting procedures. This is different from @code{ consult/1} in
SICStus Prolog or SWI-Prolog.

@cindex logical update semantics
@item
By default, the data-base in YAP follows "logical update semantics", as
Quintus Prolog or SICStus Prolog do. Previous versions followed
"immediate update semantics". The difference is depicted in the next
example:
@example
:- dynamic a/1.
?- assert(a(1)).
?- retract(a(X)), X1 is X + 1, assertz(a(X1)).
@end example
With immediate semantics, new clauses or entries to the data base are
visible in backtracking. In this example, the first call to
@code{ retract/1} will succeed. The call to @code{ assertz/1} will then
succeed. On backtracking, the system will retry
@code{ retract/1} . Because the newly asserted clause is visible to
@code{ retract/1} , it can be retracted from the data base, and
@code{ retract(a(X))} will succeed again. The process will continue
generating integers for ever. Immediate semantics were used in C-Prolog.

With logical update semantics, any additions or deletions of clauses
for a goal
@emph{ will not affect previous activations of the goal} . In the example,
the call to @code{ retract/1} will not see the
update performed by the @code{ assertz/1} , and the query will have a
single solution.

Calling @code{ yap_ flag(update_ semantics,logical)} will switch
YAP to use logical update semantics.
@item @code{ dynamic/1} is a built-in, not a directive, in YAP.
@item By default, YAP fails on undefined predicates. To follow the default
SICStus Prolog behaviour use:
@example
:- yap_flag(unknown,error).
@end example
@item By default, directives in YAP can be called from the top level.
@end itemize
@node Fully SICStus Compatible, Not Strictly SICStus Compatible, Major Differences with SICStus, SICStus Prolog
@subsection YAP predicates fully compatible with SICStus Prolog
These are the Prolog built-ins that are fully compatible in both SICStus
Prolog and YAP:
@printindex sy

@node Not Strictly SICStus Compatible, Not in SICStus Prolog, Fully SICStus Compatible, SICStus Prolog
@subsection YAP predicates not strictly compatible with SICStus Prolog
These are YAP built-ins that are also available in SICStus Prolog, but
that are not fully compatible:
@printindex sa

@node Not in SICStus Prolog, , Not Strictly SICStus Compatible, SICStus Prolog
@subsection YAP predicates not available in SICStus Prolog
These are YAP built-ins not available in SICStus Prolog.
@printindex sn
@node ISO Prolog, , SICStus Prolog, Compatibility
@section Compatibility with the ISO Prolog standard
The Prolog standard was developed by ISO/IEC JTC1/SC22/WG17, the
international standardization working group for the programming language
Prolog. The book "Prolog: The Standard" by Deransart, Ed-Dbali and
Cervoni gives a complete description of this standard. Development in
YAP from YAP4.1.6 onwards has striven to make YAP
compatible with ISO Prolog. As such:
@itemize @bullet
@item YAP now supports all of the built-ins required by the
ISO-standard, and,
@item Error-handling is as required by the standard.
@end itemize
YAP by default is not fully ISO standard compliant. You can set the
@code{ language} flag to @code{ iso} to obtain very good
compatibility. Setting this flag changes the following:
@itemize @bullet
@item By default, YAP uses "immediate update semantics" for its
database, and not "logical update semantics", as per the standard,
(@pxref{ SICStus Prolog} ). This affects @code{ assert/1} ,
@code{ retract/1} , and friends.
Calling @code{ set_ prolog_ flag(update_ semantics,logical)} will switch
YAP to use logical update semantics.
@item By default, YAP implements the
@code{ atom_ chars/2} (@pxref{ Testing Terms} ), and
@code{ number_ chars/2} , (@pxref{ Testing Terms} ),
built-ins as per the original Quintus Prolog definition, and
2001-04-09 20:54:03 +01:00
not as per the ISO definition.
Calling @code{ set_ prolog_ flag(to_ chars_ mode,iso)} will switch
YAP to use the ISO definition for
@code{ atom_ chars/2} and @code{ number_ chars/2} .
@item By default, YAP allows executable goals in directives. In ISO mode
most directives can only be called from top level (the exceptions are
@code{ set_ prolog_ flag/2} and @code{ op/3} ).
@item Error checking for meta-calls under ISO Prolog mode is stricter
than by default.
@item The @code{ strict_ iso} flag automatically enables the ISO Prolog
standard. This feature should disable all features not present in the
standard.
@end itemize
The following incompatibilities between YAP and the ISO standard are
known to still exist:
@itemize @bullet
@item Currently, YAP does not handle overflow errors in integer
operations, and handles floating-point errors only in some
architectures. Otherwise, YAP follows IEEE arithmetic.
@end itemize
Please inform the authors on other incompatibilities that may still
exist.
@node Operators, Predicate Index, Compatibility, Top
@chapter Summary of YAP Predefined Operators
The Prolog syntax caters for operators of three main kinds:
@itemize @bullet
@item
prefix;
@item
infix;
@item
postfix.
@end itemize
Each operator has precedence in the range 1 to 1200, and this
precedence is used to disambiguate expressions where the structure of the
term denoted is not made explicit using brackets. The operator of higher
precedence is the main functor.
If there are two operators with the highest precedence, the ambiguity
is resolved by analyzing the types of the operators. The possible infix types are:
@var{ xfx} , @var{ xfy} , and @var{ yfx} .

With an operator of type @var{ xfx} both sub-expressions must have lower
precedence than the operator itself, unless they are bracketed (which
assigns to them zero precedence). With an operator of type @var{ xfy} only the
left-hand sub-expression must have lower precedence. The opposite happens
for the @var{ yfx} type.

A prefix operator can be of type @var{ fx} or @var{ fy} .
A postfix operator can be of type @var{ xf} or @var{ yf} .
The meaning of the notation is analogous to the above.
@example
a + b * c
@end example
@noindent
means
@example
a + (b * c)
@end example
@noindent
as + and * have the following types and precedences:
@example
:-op(500,yfx,'+').
:-op(400,yfx,'*').
@end example
Now defining
@example
:-op(700,xfy,'++').
:-op(700,xfx,'=:=').
a ++ b =:= c
@end example
@noindent means
@example
a ++ (b =:= c)
@end example
The following is the list of the declarations of the predefined operators:
@example
:-op(1200,fx,['?-', ':-']).
:-op(1200,xfx,[':-','-->']).
:-op(1150,fx,[block,dynamic,mode,public,multifile,meta_predicate,
              sequential,table,initialization]).
:-op(1100,xfy,[';','|']).
:-op(1050,xfy,->).
:-op(1000,xfy,',').
:-op(999,xfy,'.').
:-op(900,fy,['\+', not]).
:-op(900,fx,[nospy, spy]).
:-op(700,xfx,[@@>=,@@=<,@@<,@@>,<,=,>,=:=,=\=,\==,>=,=<,==,\=,=..,is]).
:-op(500,yfx,['\/','/\','+','-']).
:-op(500,fx,['+','-']).
:-op(400,yfx,['<<','>>','//','*','/']).
:-op(300,xfx,mod).
:-op(200,xfy,['^','**']).
:-op(50,xfx,same).
@end example
@node Predicate Index, Concept Index, Operators, Top
@unnumbered Predicate Index
@printindex fn
@node Concept Index, , Predicate Index, Top
@unnumbered Concept Index
@printindex cp
@contents
@bye