• work in progress
  • 2020
  • https://arxiv.org/abs/2008.07912
  • A review of inductive logic programming (ILP), which is a form of machine learning dating back to the 1990s.

5 ILP systems

  • Which systems support both noise handling and predicate invention? ILASP, \(\partial\)ILP, Apperception.

5.4 Predicate Invention

Predicate invention (PI) is attractive because it is one of the most natural forms of automated discovery.

  • example
    • grandparent, great-grandparent; invented predicate: parent(A,B)
    • game playing, connect four
    • droplasts (drop the last element of each sublist): higher-order PI via map(A,B,droplasts1), where droplasts1 is invented
    • ddroplasts (additionally drop the last sublist) reuses the invented predicate: ddroplasts(A,B):-map(A,C,ddroplasts1),ddroplasts1(C,B).
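The droplasts example can be sketched in plain Python to show what the invented predicate buys (a hypothetical illustration, not ILP output; the function names just mirror the clause names):

```python
def droplasts1(xs):
    # Invented predicate: drop the last element of a single list.
    return xs[:-1]

def droplasts(xss):
    # Target concept: apply droplasts1 to every sublist (the map clause).
    return [droplasts1(xs) for xs in xss]

def ddroplasts(xss):
    # Reuse: droplasts1 is applied both inside the map and to the outer
    # list, mirroring ddroplasts(A,B):-map(A,C,ddroplasts1),ddroplasts1(C,B).
    return droplasts1([droplasts1(xs) for xs in xss])
```

The point of the example: one invented helper (droplasts1) is defined once and reused in two different roles.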

5.4.2 Inverse Resolution

  • early approaches performed PI by inverting resolution steps (e.g. the intra-construction operator)

5.4.3 Placeholders

  • pre-scripted: the user declares the invented predicate in advance, e.g. modeh(1,inv(person,person)).
  • Inductive Learning of Answer Set Programs (ILASP): The European Conference on Logics in Artificial Intelligence (or Journées Européennes sur la Logique en Intelligence Artificielle - JELIA) 2014
  • alternatively, the system generates all possible invented predicates, which is computationally expensive
  • Learning explanatory rules from noisy data (\(\partial\)ILP): JAIR 2018
  • Making sense of sensory input (Apperception): AI 2021
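The cost of the generate-all strategy is easy to see with a quick count (a toy illustration with an assumed predicate and variable vocabulary, not any system's actual procedure):

```python
from itertools import combinations

# Assumed toy vocabulary for an invented predicate inv(A,B).
body_preds = ["mother", "father", "friend"]
variables = ["A", "B", "C"]

# All binary body literals over distinct variables: 3 predicates x 6 pairs = 18.
literals = [f"{p}({x},{y})" for p in body_preds
            for x in variables for y in variables if x != y]

# Candidate bodies with at most two literals: 18 + C(18,2) = 171 definitions,
# before even considering recursion, more variables, or longer bodies.
n_candidates = sum(1 for k in (1, 2) for _ in combinations(literals, k))
print(n_candidates)  # 171
```

Even this tiny vocabulary yields 171 candidate definitions; the space grows combinatorially with body length and arity.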

5.4.4 Metarules

  • meta-interpretive learning (MIL) uses metarules (higher-order clauses) to drive PI
  • chain metarule P(A,B):-Q(A,C),R(C,B); e.g. instantiated as f(A,B):-tail(A,C),tail(C,B).
  • invented predicates introduced via metarules can later be removed by unfolding
  • caveat: these approaches assume noise-free examples

5.4.5 Lifelong learning

  • single-task vs. lifelong (multi-task) learning
  • continually learn dependent tasks, ordered from easy to difficult
  • learned programs and invented predicates are saved and reused in later tasks
  • Bias reformulation for one-shot function induction: ECAI 2014
  • extends lifelong learning to handle thousands of tasks
  • Forgetting to learn logic programs (Forgetgol): AAAI 2020
  • again, no noise handling
  • self-supervised: Playgol "plays" by randomly sampling its own tasks, then reuses the solutions on the user-supplied tasks
  • Playgol: Learning programs through play: IJCAI 2019
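The reuse idea can be sketched as a toy enumerative learner whose library grows across tasks (an assumed illustration; the real systems use meta-interpretive search, not this brute force):

```python
from itertools import product

def run(fns, x):
    # Apply a sequence of functions left to right.
    for f in fns:
        x = f(x)
    return x

def search(library, examples, max_len=3):
    # Return the shortest composition of library functions consistent
    # with all (input, output) examples, or None.
    for n in range(1, max_len + 1):
        for names in product(library, repeat=n):
            fns = [library[m] for m in names]
            if all(run(fns, x) == y for x, y in examples):
                return list(names)
    return None

library = {"tail": lambda xs: xs[1:], "reverse": lambda xs: xs[::-1]}

# Task 1: drop the last element. Needs three primitive calls.
droplast = search(library, [([1, 2, 3], [1, 2])])
print(droplast)  # ['reverse', 'tail', 'reverse']

# Lifelong step: keep the solution as a new primitive for later tasks.
library["droplast"] = lambda xs: run(
    [library["reverse"], library["tail"], library["reverse"]], xs)

# Task 2: drop the last two elements. Found at length 2 instead of 4.
print(search(library, [([1, 2, 3, 4], [1, 2])], max_len=2))
# ['droplast', 'droplast']
```

Task 2 is unsolvable at max_len=2 with the original primitives; saving task 1's solution makes the harder task reachable, which is the easy-to-difficult ordering in miniature.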

5.4.6 (Unsupervised) Compression

  • previous approaches: an invented predicate is judged by whether it helps solve the given task
  • a predicate may not help immediately yet still be useful later
  • alternative criterion: compression
  • auto-encoding: learn an encoder that maps the data to a smaller representation and a decoder that reconstructs it
  • program refactoring: remove redundancies from a learned program to compress it
  • theory refinement: revision (improve correctness), compression (select a subset while minimally affecting the theory), restructuring (optimize its execution or readability)
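The compression criterion can be made concrete by counting literals before and after inventing a predicate (the classic parent example; the size measure, head plus body literals per clause, is an assumption for illustration):

```python
def size(program):
    # Program size = number of literals (one head plus the body) per clause.
    return sum(1 + len(body) for _, body in program)

# Without invention: four clauses to cover every mother/father combination.
before = [
    ("grandparent(A,B)", ["father(A,C)", "father(C,B)"]),
    ("grandparent(A,B)", ["father(A,C)", "mother(C,B)"]),
    ("grandparent(A,B)", ["mother(A,C)", "father(C,B)"]),
    ("grandparent(A,B)", ["mother(A,C)", "mother(C,B)"]),
]

# With an invented parent/2 predicate the theory is compressed.
after = [
    ("parent(A,B)", ["father(A,B)"]),
    ("parent(A,B)", ["mother(A,B)"]),
    ("grandparent(A,B)", ["parent(A,C)", "parent(C,B)"]),
]

print(size(before), size(after))  # 12 7
```

parent/2 is worth inventing purely on compression grounds, whether or not any single task demanded it.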

5.4.7 Connection to Representation Learning

  • PI and representation learning coincide: both improve performance by changing the representation of a problem
  • most representation learning assumes tabular data; representing structured, even relational, data in such tabular form is the hard part
  • a few approaches start directly from the relational data:
    • Clustering-based relational unsupervised representation learning with an explicit distributed representation: IJCAI 2017
    • Learning relational representations with auto-encoding logic programs: IJCAI 2019
    • Lifted relational neural networks: Efficient learning of latent relational structures: JAIR 2018
  • others first map the relational data into a propositional, tabular form
  • \(\partial\)ILP and neural theorem provers: restricted to short Datalog programs
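The propositional-tabular route can be illustrated with a toy propositionalisation (assumed facts and features, not any specific system's pipeline):

```python
# Relational facts as predicate tuples (a made-up family example).
facts = {("parent", "ann", "bob"), ("parent", "bob", "carl"), ("female", "ann")}
people = ["ann", "bob", "carl"]

def row(x):
    # One boolean-feature row per entity, so a tabular learner applies.
    return {
        "is_parent": any(("parent", x, y) in facts for y in people),
        "has_parent": any(("parent", y, x) in facts for y in people),
        "is_female": ("female", x) in facts,
    }

table = {x: row(x) for x in people}
print(table["ann"])  # {'is_parent': True, 'has_parent': False, 'is_female': True}
```

The flattening is lossy: fixed features cannot express arbitrary relational structure (e.g. recursion over parent chains), which is why the relational-first approaches above exist.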