INTERVAL

The INTERVAL type measures the difference between two points in time. It is represented internally as a number of microseconds and printed out as a combination of fields.

All the fields are either positive or negative.
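The single-number internal representation described above can be sketched in Python with `datetime.timedelta`, which likewise normalizes an interval down to microsecond resolution. This is an illustration only, not the database engine's actual storage code:

```python
from datetime import timedelta

# An interval of 1 day, 2 hours, 3 minutes. timedelta normalizes
# its arguments to days/seconds/microseconds internally.
delta = timedelta(days=1, hours=2, minutes=3)

# Collapse the whole interval into one microsecond count,
# mirroring the internal representation described above.
total_us = round(delta.total_seconds() * 1_000_000)
print(total_us)  # 93780000000 microseconds
```

A negative interval simply yields a negative microsecond count, which is how a difference running "backwards" in time is carried.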

Syntax

INTERVAL [ (p) ]

Semantics

p

(precision) specifies the number of fractional digits retained in the seconds field, in the range 0 to 6. The default is the precision of the input literal.
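The effect of p on a stored value can be sketched as reducing the microsecond part to p fractional digits. The helper below is hypothetical, not part of any SQL engine, and it assumes rounding; whether an engine rounds or truncates when discarding digits is implementation-specific:

```python
def reduce_fractional_seconds(microseconds: int, p: int) -> int:
    """Keep only p fractional digits (0 <= p <= 6) in the seconds
    field of an interval given as a microsecond count.
    Hypothetical illustration; assumes round-to-nearest."""
    if not 0 <= p <= 6:
        raise ValueError("precision must be in the range 0 to 6")
    factor = 10 ** (6 - p)  # microseconds per smallest retained unit
    return round(microseconds / factor) * factor

# 12.345678 seconds stored with precision 2 becomes 12.35 seconds:
print(reduce_fractional_seconds(12_345_678, 2))  # 12350000
```

With p = 6 (full precision) the value passes through unchanged, and with p = 0 the fractional seconds are dropped entirely.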

Notes