# Examples

This page is intended to help users get familiar with OptiML through a series of simple examples. All of the source code for the examples is available and runnable from the OptiML GitHub repository. Comprehensive documentation of OptiML syntax and features can be found in the language specification. There are also several more thorough example applications located in the `apps` directory in the repository.

### Operations

1. Basics
2. Constructing new vectors and matrices
3. Using functional operators
4. Iterating with for and while
5. I/O
6. Iterative computation with untilconverged
7. Summations
8. Working with mutable objects
9. Using explicit types
10. Organizing code with methods and traits
11. User-defined data structures

## Basics

This example shows simple manipulations of vectors and matrices, which are the core data structures in OptiML.

```scala
object Example1Interpreter extends OptiMLApplicationInterpreter with Example1
trait Example1 extends OptiMLApplication {
  def main() = {
    // 10000x1 DenseVector
    val v1 = DenseVector.rand(10000)
    // 1000x1000 DenseMatrix
    val m = DenseMatrix.rand(1000,1000)

    // perform some simple infix operations
    val v2 = (v1+10)*2-5

    // take the pointwise natural log of v2 and sum the results
    val logv2Sum = log(v2).sum

    // slice elements 1000-1999 of v2 and multiply by m
    val v3 = v2(1000::2000)*m // 1x1000 DenseVector result

    // print the first 10 elements to the screen
    v3(0::10).pprint
  }
}
```
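If the range syntax is unfamiliar, `v2(1000::2000)` behaves like a half-open slice: it selects elements 1000 through 1999 (1000 elements, matching the `1x1000` result noted above). A plain-Scala sketch of the same idea (illustrative only, not OptiML):

```scala
// Plain-Scala analogue of slicing with a half-open range.
val v2 = Vector.tabulate(10000)(i => i.toDouble)
val v3 = v2.slice(1000, 2000) // elements 1000..1999, i.e. 1000 elements
println(v3.take(10))
```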

## Constructing new vectors and matrices

This example shows various ways of constructing vectors and matrices in OptiML.

```scala
object Example2Interpreter extends OptiMLApplicationInterpreter with Example2
trait Example2 extends OptiMLApplication {
  def main() = {
    /* various ways of constructing a DenseVector */
    val v0 = DenseVector[Int](100) // length-100 all zeros, ints
    val v1 = DenseVector[Int](100,false) // 1x100 all zeros, ints
    val v3 = DenseVector.rand(100) // 100x1 random doubles
    val v4 = DenseVector.zeros(100) // 100x1 all zeros, doubles
    val v5 = DenseVector(1,2,3,4,5) // [1,2,3,4,5]
    val v6 = DenseVector(1.,2.,3.,4.,5.) // [1.0,2.0,3.0,4.0,5.0]
    val v7 = (0::100) { e => random[Int] } // 100x1 random ints

    /* various ways of constructing a DenseMatrix */
    val m0 = DenseMatrix[Int](100,50) // 100x50 zeros, ints
    val m1 = DenseMatrix.rand(100,50) // 100x50 random doubles
    val m2 = DenseMatrix.zeros(100,50) // 100x50 zeros, doubles
    val m3 = DenseMatrix((1,2,3),(4,5,6)) // [1,2,3]
                                          // [4,5,6]
    val m4 = (0::2, *) { i => DenseVector(2,3,4) } // [2,3,4]
                                                   // [2,3,4]
    val m5 = (0::2, 0::2) { (i,j) => i*j } // [0,0]
                                           // [0,1]

    // print first row
    m5(0).pprint
  }
}
```

## Using functional operators

Functional operators are a key part of what makes OptiML concise and expressive. This example demonstrates some of the most common operators and how they are typically chained together. These operators are usually parallel, so they speed up when run with more threads, and they can be fused together, so they use less memory than typical bulk collection operators.

```scala
object Example3Interpreter extends OptiMLApplicationInterpreter with Example3
trait Example3 extends OptiMLApplication {
  def main() = {
    val v = DenseVector.rand(1000)

    // filter selects all the elements matching a predicate
    // map constructs a new vector by applying a function to each element
    val v2 = (v*1000).filter(e => e < 500).map(e => e*e*random[Double])
    println(v2.length)

    // reduce produces a scalar by successively applying a function to pairs
    val logmin = v2.reduce((a,b) => if (log(a) < log(b)) a else b)
    println(logmin)

    // partition splits the vector into two based on a predicate
    val (v2small, v2large) = unpack(v2.partition(e => e < 1000))
    println("v2small size: " + v2small.length)
    println("v2large size: " + v2large.length)
  }
}
```
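For readers coming from ordinary Scala collections, the chain above corresponds to the following plain-Scala code (illustrative only). The key difference is that OptiML can fuse the chained operators into a single pass, while plain Scala materializes an intermediate collection at each step:

```scala
// Plain-Scala analogue of the scale-filter-map chain (not OptiML).
val rng = new scala.util.Random(42)
val v = Vector.fill(1000)(rng.nextDouble())

val v2 = v.map(_ * 1000).filter(_ < 500).map(e => e * e * rng.nextDouble())
println(v2.length) // roughly half the elements survive the filter
```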

## Iterating with 'for' and 'while'

Although functional operators are the preferred way of operating on vectors and matrices in OptiML, the language also supports both parallel and sequential iteration, using 'for' and 'while' respectively. However, using either of these constructs requires some care. The body of a 'for' loop must not contain conflicting (non-disjoint) writes, or the result will be incorrect due to races; in the future, we plan to detect these cases automatically and report them as stage-time errors. 'while' loops should be used only as a last resort: their sequential nature prevents OptiML from parallelizing the loop and also inhibits optimizations such as fusion.

```scala
object Example4Interpreter extends OptiMLApplicationInterpreter with Example4
trait Example4 extends OptiMLApplication {
  def main() = {
    // a DenseVector[DenseVector[Double]]
    // 100 vectors, each containing 1000 random doubles
    val v = DenseVector.zeros(100).map(e => DenseVector.rand(1000))

    // iterate using for (parallel)
    for (vec <- v) {
      // prints can happen in any order!
      if (vec(0) > .9) println("found > .9")
    }

    // iterate using while (sequential)
    var i = 0
    while (i < v.length) {
      val vi = v(i)
      // prints always in order
      println("first element of vector " + i + ": " + vi(0))
      i += 1
    }
  }
}
```

## I/O

OptiML supports MATLAB-like read functions to load vectors and matrices from a file. As this example shows, you can also supply a function that tells OptiML how to parse each line of the input file in order to populate the vector or matrix. To run the example, make sure the input files described in the comments exist in the current working directory (the directory from which you run the `delite` command to start the program).

```scala
// myvector.dat
// 0.1
// 0.5
// 1.2
// 1.3
// 0.6
//
// myvector2.dat
// 1;blue;-1
// 16;green;3
// 3;red;55
//
// mymatrix.dat
// 3 12 5
// 17 32 1
// -6 1 0
object Example5Interpreter extends OptiMLApplicationInterpreter with Example5
trait Example5 extends OptiMLApplication {
  def main() = {
    // simple i/o
    val v = readVector("myvector.dat")
    v.pprint

    val m = readMatrix("mymatrix.dat")
    m.pprint

    // i/o with custom parser
    // second argument is a function from a DenseVector[String] to a Double
    // third argument is the delimiter used to split the line
    val v2 = readVector[Double]("myvector2.dat", line => line(0).toDouble, ";")
    v2.pprint
  }
}
```
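The custom-parser variant is easy to picture in plain Scala (illustrative only, not OptiML's implementation): each line is split on the delimiter into fields, and the supplied function turns the fields into a single value:

```scala
// Plain-Scala sketch of the custom-parser idea for myvector2.dat's format.
val lines = List("1;blue;-1", "16;green;3", "3;red;55")
val parsed = lines.map { line =>
  val fields = line.split(";") // like the DenseVector[String] passed to the parse function
  fields(0).toDouble           // corresponds to line => line(0).toDouble
}
println(parsed) // List(1.0, 16.0, 3.0)
```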

## Iterative computation with untilconverged

Many common machine learning problems have an iterative structure. OptiML makes these problems easier to express using a built-in `untilconverged` control structure, which iterates until either the delta between solutions falls below a supplied threshold or a maximum number of iterations has been reached. The following example shows one simple way of using `untilconverged` on a scalar value (but it can also be used with vectors, matrices, and graphs).

```scala
object Example6Interpreter extends OptiMLApplicationInterpreter with Example6
trait Example6 extends OptiMLApplication {
  def main() = {
    // newton descent

    // arbitrary initial values
    val c0 = 0.0
    val c1 = 1.2
    val c2 = 9.7
    val linit = 5.5
    val lambda =
      untilconverged(linit, .001) { lambda =>
        val l2 = lambda*lambda
        val b = (l2 + c2)*lambda
        val a = b + c1
        lambda - (a * lambda + c0) / (2.0*lambda*l2 + b + a)
      }
    println("lambda: " + lambda)
  }
}
```
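The semantics of `untilconverged` can be sketched in plain Scala as a loop that applies the update function until the change between successive values drops below the threshold. The names and the maximum-iteration default below are illustrative, not OptiML's actual implementation:

```scala
// Plain-Scala sketch of untilconverged's semantics for a scalar value.
def untilConvergedScalar(init: Double, tol: Double, maxIter: Int = 1000)
                        (f: Double => Double): Double = {
  var x = init
  var delta = Double.MaxValue
  var iter = 0
  while (delta > tol && iter < maxIter) {
    val next = f(x)
    delta = math.abs(next - x) // change between successive solutions
    x = next
    iter += 1
  }
  x
}

// e.g. Newton's method for the root of x*x - 2
val root = untilConvergedScalar(1.0, 1e-9) { x => x - (x*x - 2.0) / (2.0*x) }
println(root) // converges to sqrt(2) ≈ 1.4142135623730951
```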

## Summations

Since summations are so ubiquitous in machine learning, OptiML allows users to sum pure computations over a particular range. This example shows all four variants: `sum`, `sumRows`, `sumIf`, and `sumRowsIf`. The latter two allow you to specify a predicate; the computed value is only added to the sum if the predicate is true. `sumRows` is a specialized version of `sum` in which the result of the function is a matrix row; it adds the values from the underlying matrix directly into the accumulator without creating a copy of each row (in other words, it sums multiple `DenseVectorView`s into an output `DenseVector`).

```scala
object Example7Interpreter extends OptiMLApplicationInterpreter with Example7
trait Example7 extends OptiMLApplication {
  def main() = {
    val simpleSeries = sum(0, 100) { i => i } // 0+1+2+...+99
    println("simpleSeries: " + simpleSeries)

    val m = DenseMatrix.rand(10,100)
    // sum first 10 rows of m
    val rowSum = sumRows(0,10) { i => m(i) }
    println("rowSum:")
    rowSum.pprint

    // 0+2+4+...+98
    val conditionalSeries = sumIf(0,100)(i => i % 2 == 0) { i => i }
    println("conditionalSeries: " + conditionalSeries)

    // conditional sum over rows of a matrix
    val conditionalRowSum = sumRowsIf(0,10)(i => m(i).min > .01) { i => m(i) }
    println("conditionalRowSum:")
    conditionalRowSum.pprint
  }
}
```
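As a point of reference, `sumIf` behaves like the following plain-Scala sketch (illustrative only): the predicate filters the index range before the summed function is applied:

```scala
// Plain-Scala sketch of sumIf's semantics (not OptiML).
def sumIfScala(start: Int, end: Int)(cond: Int => Boolean)(f: Int => Int): Int =
  (start until end).filter(cond).map(f).sum

val conditionalSeries = sumIfScala(0, 100)(i => i % 2 == 0)(i => i)
println(conditionalSeries) // 0+2+4+...+98 = 2450
```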

## Working with mutable objects

OptiML was designed to be easy to parallelize and optimize, so it has a natural tendency towards immutable, functional operators. We did not want to forbid mutation entirely, however: there are times when it is more convenient, easier to read, or faster than the immutable version thanks to an application-level optimization. To retain as much optimization potential as possible, OptiML forbids mutating objects unless they are explicitly marked `mutable`. This example demonstrates these restrictions and shows how to mutate a vector.

```scala
object Example10Interpreter extends OptiMLApplicationInterpreter with Example10
trait Example10 extends OptiMLApplication {
  def main() = {
    val vMut = DenseVector[Double](1000, true) // mutable vector initialized to all zeros
    val vImm = DenseVector.rand(1000) // immutable vector initialized to random values

    val vImm2 = vMut+5 // mutability is not inherited! the new vector is immutable
    val vMut2 = vImm2.mutable // but we can always ask for a mutable copy if we need one

    var i = 0
    while (i < vMut.length) {
      if (i % 10 == 0) {
        vMut(i) = 1 // ok
        // vImm(i) = 1 // would cause a stage-time error!
      }
      i += 1
    }

    println("vMut(10): " + vMut(10))

    // nested mutable objects are not allowed!
    val vNestedMut = DenseVector[DenseVector[Double]](10, true)
    vNestedMut(0) = vImm2 // ok
    // vNestedMut(0) = vMut // would cause a stage-time error!
  }
}
```

## Using explicit types

OptiML is statically typed by virtue of being embedded in Scala. Type annotations are usually unnecessary in OptiML thanks to type inference. The most important thing to know about explicit types is that OptiML has two classes of them. Types wrapped in the type constructor `Rep[T]` are stage-time types: operations on them generate code that is executed later, possibly on multiple platforms. Types not wrapped in `Rep[T]` are ordinary Scala types and execute while the program is being staged. This is a form of partial evaluation that can be used to perform simple computations ahead of time (i.e. during compilation). The following example shows how this distinction appears in real OptiML code.

```scala
object Example11Interpreter extends OptiMLApplicationInterpreter with Example11
trait Example11 extends OptiMLApplication {
  def main() = {
    // an explicitly-typed result
    // Rep[T] is a type constructor representing staged values
    val v: Rep[DenseVector[Double]] = DenseVector.rand(1000)

    // values that are not wrapped in Rep[] are evaluated at compile time, e.g.
    val scalar: Double = 5*12/3.3+1 // this is executed immediately when we are staging
                                    // no code will be generated for these statements

    // you can still use unstaged values in staged computations
    val v2: Rep[DenseVector[Double]] = v*scalar

    // all values returned by the DSL are staged
    val scalar2: Rep[Double] = v2.sum

    println(scalar2)
  }
}
```
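To build intuition for the staged/unstaged distinction, here is a deliberately tiny toy sketch in plain Scala (this is not how OptiML is implemented): an unstaged value is computed immediately, while a "staged" value only builds an expression node that is evaluated later:

```scala
// Toy expression tree standing in for Rep[Double] (illustrative only).
sealed trait Exp
case class Const(v: Double) extends Exp
case class Mul(a: Exp, b: Exp) extends Exp

def eval(e: Exp): Double = e match {
  case Const(v)  => v
  case Mul(a, b) => eval(a) * eval(b)
}

val scalar: Double = 5 * 12 / 3.3 + 1            // computed right now ("staging time")
val staged: Exp = Mul(Const(2.0), Const(scalar)) // just builds a node; nothing runs yet
println(eval(staged))                            // evaluated later, on demand
```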

## Organizing code with methods and traits

All OptiML code must be executed within the dynamic scope of the `main` method in order to be staged. However, you can still organize your code into separate methods and traits, as shown in this example (and different traits can be placed in different source files). A trait is a kind of Scala class that supports mix-in inheritance via the `with` keyword. Note that global variables outside the `main` method are not allowed, since they fall outside this dynamic scope. Methods must therefore either take all of their inputs as explicit parameters or be declared as nested methods within `main`.

```scala
object Example12Interpreter extends OptiMLApplicationInterpreter with Example12
trait Example12 extends OptiMLApplication with Example12work {
  def main() = {
    // code can be organized into different methods and traits
    // these methods get inlined during staging
    val v = DenseVector.rand(1000)
    doWork(v) // defined in Example12work
    doWork2(v)
    val a = doWork3(v)
    println(a)
  }
}
trait Example12work extends OptiMLApplication {
  // method signatures require explicit parameter types; return types are optional
  def doWork(v: Rep[DenseVector[Double]]) = {
    println("v length is: " + v.length)
  }

  // methods can use generic types, too
  // but you have to include this ":Manifest" boilerplate
  // Manifest is a Scala object that stores type information for T
  def doWork2[T:Manifest](v: Rep[DenseVector[T]]) = {
    println("v(0) is: " + v(0))
  }

  // an example method returning a value with an explicit return type
  def doWork3(v: Rep[DenseVector[Double]]): Rep[Double] = {
    if (v.length > 10) {
      v(10)
    }
    else {
      0.0
    }
  }
}
```

## User-defined data structures

Users can define their own struct-like data structures (data only, no methods) using the `new Record` syntax. These structs can contain fields that are OptiML types (or other user-defined structs), but they cannot be recursive (a struct field cannot refer to the struct being defined). These restrictions help enable high performance and the ability to run on multiple devices. The following example shows how to define a simple custom struct in OptiML and then use it within vectors.

```scala
object Example13Interpreter extends OptiMLApplicationInterpreter with Example13
trait Example13 extends OptiMLApplication {
  def main() = {
    // type alias is for convenience
    type MyStruct = Record{val data: Int; val name: String}

    // method to construct a new instance of MyStruct, also for convenience
    def newMyStruct(_data: Rep[Int], _name: Rep[String]) =
      // a user-defined struct instance is declared as a new Record
      new Record {
        val data = _data
        val name = _name
      }

    // we can use our struct with normal OptiML data types
    val v1 = (0::100) { i => newMyStruct(i, "struct " + i) }

    // we can even use math operators by defining how arithmetic works with MyStruct
    implicit def myStructArith: Arith[MyStruct] = new Arith[MyStruct] {
      def +=(a: Rep[MyStruct], b: Rep[MyStruct])(implicit ctx: SourceContext) =
        newMyStruct(a.data+b.data, a.name)
      def +(a: Rep[MyStruct], b: Rep[MyStruct])(implicit ctx: SourceContext) =
        newMyStruct(a.data+b.data, a.name+" pl "+b.name)
      def -(a: Rep[MyStruct], b: Rep[MyStruct])(implicit ctx: SourceContext) =
        newMyStruct(a.data-b.data, a.name+" mi "+b.name)
      def *(a: Rep[MyStruct], b: Rep[MyStruct])(implicit ctx: SourceContext) =
        newMyStruct(a.data*b.data, a.name+" ti "+b.name)
      def /(a: Rep[MyStruct], b: Rep[MyStruct])(implicit ctx: SourceContext) =
        newMyStruct(a.data/b.data, a.name+" di "+b.name)
      def abs(a: Rep[MyStruct])(implicit ctx: SourceContext) =
        newMyStruct(arith_abs(a.data), "abs "+a.name)
      def exp(a: Rep[MyStruct])(implicit ctx: SourceContext) =
        newMyStruct(arith_exp(a.data).AsInstanceOf[Int], "exp "+a.name)
      def log(a: Rep[MyStruct])(implicit ctx: SourceContext) =
        newMyStruct(arith_log(a.data).AsInstanceOf[Int], "log "+a.name)
      def empty(implicit ctx: SourceContext) =
        newMyStruct(0, "empty")
      def zero(a: Rep[MyStruct])(implicit ctx: SourceContext) =
        newMyStruct(0, "zero")
    }

    val v2 = (0::100) { i => newMyStruct(i*i, "struct2 " + i) }

    val result = v1+v2
    println("result(10) with name " + result(10).name + " has data " + result(10).data)
  }
}
```