
03 - Semantic Networks

Semantic networks are a knowledge representation scheme.

This lesson will cover the following topics:

  • Knowledge representations
  • Semantic networks
  • Problem-solving with semantic networks
  • Represent & Reason

Representation

In each knowledge representation, there is a language, and that language has a vocabulary. In addition, the representation contains some content (or knowledge).

Example: Newton's 2nd Law of Motion

$$ F = ma $$
Force is equal to mass times acceleration
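
To make the language/content distinction concrete, here is a minimal Python sketch (the names and structure are illustrative assumptions, not from the lesson): the vocabulary is the language of the representation, and the equation is the content expressed in that language.

```python
# A minimal sketch: a representation pairs a language (vocabulary) with content.
# The dictionary and function names here are illustrative, not canonical.

# Language: the vocabulary of symbols the representation provides.
VOCABULARY = {"F": "force (N)", "m": "mass (kg)", "a": "acceleration (m/s^2)"}

def force(mass: float, acceleration: float) -> float:
    """Content: Newton's second law, F = m * a."""
    return mass * acceleration

print(force(2.0, 9.8))  # a 2 kg mass under ~9.8 m/s^2: 19.6 N
```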

Introduction to Semantic Networks

How can we represent a Raven's Progressive Matrices problem using a semantic network? Consider two states, state A and state B.

  1. Label all objects (x is the circle, y is the diamond, z is the black dot) and reference them as nodes.
  2. Represent the relationships between the nodes within each state (frame), both A and B.
  3. Represent the transformations of the nodes between states A and B.
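
The three steps might look like the following Python sketch. The node names (x, y, z) follow the example above; the specific relationships and transformations are illustrative assumptions.

```python
# Step 1: label all objects in each state and treat them as nodes.
nodes_A = {"x": "circle", "y": "diamond", "z": "black dot"}
nodes_B = {"x": "circle", "y": "diamond"}            # suppose z is absent in B

# Step 2: represent the relationships between nodes within each state (frame).
relations_A = [("y", "inside", "x"), ("z", "above", "x")]
relations_B = [("y", "inside", "x")]

# Step 3: represent the transformation of each node from state A to state B.
transformations = {"x": "unchanged", "y": "unchanged", "z": "deleted"}
```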

Structure of Semantic Networks

    1. Lexically: nodes
    2. Structurally: directional links
    3. Semantically: application-specific labels
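
A small data structure makes the three levels easy to see. This is a hypothetical sketch, not a prescribed implementation: nodes form the lexicon, directed links form the structure, and the labels on the links carry the application-specific semantics.

```python
class SemanticNetwork:
    """A toy semantic network: nodes plus labeled, directional links."""

    def __init__(self):
        self.nodes = set()     # lexical level: the nodes
        self.links = []        # structural level: (source, label, target) triples

    def add_link(self, source: str, label: str, target: str) -> None:
        # Semantic level: the label gives the link its application-specific meaning.
        self.nodes.update((source, target))
        self.links.append((source, label, target))

net = SemanticNetwork()
net.add_link("y", "inside", "x")
net.add_link("z", "above", "x")
```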

Characteristics of Good Representations

  • Make relationships explicit
  • Expose natural constraints
  • Bring objects and relations together
  • Exclude extraneous details
  • Transparent, concise, complete, fast, computable

Guards and Prisoners Problem

Description

  • Three guards and three prisoners must cross a river.
  • The boat may carry only one or two people at a time.
  • Prisoners may never outnumber guards on either bank (though prisoners may be alone on either bank).

Modeling using a Semantic Network

Lexicon: Consider each node to be a unique state, represented by:
- the number of prisoners and guards on the left side
- the number of prisoners and guards on the right side
- the side the boat is on.
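
A minimal Python encoding of this lexicon, assuming the counts on the right side can be derived from those on the left (the class and field names are illustrative):

```python
from typing import NamedTuple

class State(NamedTuple):
    """One node of the network: a unique configuration of the puzzle."""
    guards_left: int      # guards on the left bank (right bank has 3 - guards_left)
    prisoners_left: int   # prisoners on the left bank
    boat_left: bool       # True if the boat is on the left bank

START = State(3, 3, True)    # everyone on the left bank
GOAL = State(0, 0, False)    # everyone on the right bank
```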

Structure: A directed link connects two states whenever a single boat trip transforms one into the other.

Semantics: Each link is labeled with the move it represents, i.e., who crosses (guards and/or prisoners) and in which direction.
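
The structure and semantics can be generated rather than enumerated by hand. The sketch below assumes the State type from the previous snippet; each yielded pair is one labeled, directed link out of a state.

```python
# Possible boat loads: (guards, prisoners), with one or two people aboard.
MOVES = [(2, 0), (0, 2), (1, 1), (1, 0), (0, 1)]

def is_safe(guards: int, prisoners: int) -> bool:
    """A bank is safe if it has no guards or at least as many guards as prisoners."""
    return guards == 0 or guards >= prisoners

def successors(state: State):
    """Yield (label, next_state) for every legal single boat trip from state."""
    direction = -1 if state.boat_left else 1   # people leave the bank the boat is on
    for g, p in MOVES:
        gl = state.guards_left + direction * g
        pl = state.prisoners_left + direction * p
        if 0 <= gl <= 3 and 0 <= pl <= 3 and is_safe(gl, pl) and is_safe(3 - gl, 3 - pl):
            label = f"{g} guards, {p} prisoners to the {'right' if state.boat_left else 'left'}"
            yield label, State(gl, pl, not state.boat_left)
```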

Inference about State Transitions

Which transitions (i.e., moves) between states are both legal AND productive?
The table below traces the sequence of states along one legal path from the initial state to the goal, given the possible transformations:

\(g_i\) and \(p_i\) give the (left, right) counts of guards and prisoners in state \(i\):

| \(i\) | Move into state \(i\) | \(g_i\) (L, R) | \(p_i\) (L, R) | Boat |
|---|---|---|---|---|
| 0 | initial state | 3, 0 | 3, 0 | left |
| 1 | 2 prisoners to right | 3, 0 | 1, 2 | right |
| 2 | 1 prisoner to left | 3, 0 | 2, 1 | left |
| 3 | 2 prisoners to right | 3, 0 | 0, 3 | right |
| 4 | 1 prisoner to left | 3, 0 | 1, 2 | left |
| 5 | 2 guards to right | 1, 2 | 1, 2 | right |
| 6 | 1 guard, 1 prisoner to left | 2, 1 | 2, 1 | left |
| 7 | 2 guards to right | 0, 3 | 2, 1 | right |
| 8 | 1 prisoner to left | 0, 3 | 3, 0 | left |
| 9 | 2 prisoners to right | 0, 3 | 1, 2 | right |
| 10 | 1 prisoner to left | 0, 3 | 2, 1 | left |
| 11 | 2 prisoners to right | 0, 3 | 0, 3 | right |
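
This path need not be worked out by hand: a breadth-first search over the network finds it. The sketch below assumes the State, START, GOAL, and successors definitions from the earlier snippets, and prints the 11-move sequence shown in the table above.

```python
from collections import deque

def solve():
    """Breadth-first search from START to GOAL; returns the list of move labels."""
    frontier = deque([(START, [])])
    visited = {START}
    while frontier:
        state, path = frontier.popleft()
        if state == GOAL:
            return path
        for label, next_state in successors(state):
            if next_state not in visited:
                visited.add(next_state)
                frontier.append((next_state, path + [label]))

for move in solve():
    print(move)
```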

Choosing Matches by Weight

One can weight transformations to favor specific types of transformations over others. For example:

Point Weights

  • 5: Unchanged
  • 4: Reflected
  • 3: Rotated
  • 2: Scaled
  • 1: Deleted
  • 0: Shape Changed
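
Under these weights, a candidate match can be scored by summing the weight of the transformation applied to each object, and the highest-scoring candidate is preferred. A brief, hypothetical scoring sketch:

```python
WEIGHTS = {
    "unchanged": 5,
    "reflected": 4,
    "rotated": 3,
    "scaled": 2,
    "deleted": 1,
    "shape changed": 0,
}

def score(transformations: list[str]) -> int:
    """Total weight of the per-object transformations implied by a candidate match."""
    return sum(WEIGHTS[t] for t in transformations)

# Example: keeping one object unchanged and deleting another scores 5 + 1 = 6.
print(score(["unchanged", "deleted"]))
```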