User-Defined Features

This chapter describes how to use custom data types and database functions with Slick’s Scala API.

Note

In the code examples below we assume the following imports:

import java.sql.Date

import scala.concurrent.{Await, Future}
import scala.concurrent.duration.Duration
import scala.reflect.ClassTag

import slick.jdbc.H2Profile.api.*
import slick.lifted.ProvenShape

If you’re new to Slick, please start with the Getting Started page.

Scalar Database Functions

Note

This section is based on the ready-to-run CallNativeDBFunction app in the MultiDB sample.

If your database system supports a scalar function that is not available as a method in Slick, you can define it as a SimpleFunction. There are predefined methods for creating unary, binary, and ternary functions with fixed parameter and return types.

// H2 has a day_of_week() function which extracts the day of week from a timestamp
val dayOfWeek = SimpleFunction.unary[Date, Int]("day_of_week")

// Use the lifted function in a query to group by day of week
val q1 = for {
  (dow, q) <- salesPerDay.map(s => (dayOfWeek(s.day), s.count)).groupBy(_._1)
} yield (dow, q.map(_._2).sum)
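A query built with the lifted function runs like any other. A minimal sketch, assuming a configured Database instance (the config name "h2mem1" is hypothetical):

```scala
// Hypothetical database handle; connection configuration not shown
val db = Database.forConfig("h2mem1")

// db.run materializes the query; the sum aggregate yields Option[Int]
val weekdaySales: Future[Seq[(Int, Option[Int])]] = db.run(q1.result)
Await.result(weekdaySales, Duration.Inf)
```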

If you need more flexibility regarding the types (e.g. for varargs, polymorphic functions, or to support Option and non-Option types in a single function), you can use SimpleFunction.apply to get an untyped instance and write your own wrapper function with the proper type-checking:

def dayOfWeek2(c: Rep[Date]) =
  SimpleFunction[Int]("day_of_week").apply(Seq(c))
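For instance, the untyped instance can back a single wrapper that covers both Option and non-Option columns. A sketch, where the result type parameter is chosen at the call site (TypedType comes from slick.ast):

```scala
import slick.ast.TypedType

// One wrapper for both column kinds:
// pass R = Int for a Rep[Date] argument, R = Option[Int] for Rep[Option[Date]]
def dayOfWeekFlexible[R : TypedType](c: Rep[_]): Rep[R] =
  SimpleFunction[R]("day_of_week").apply(Seq(c))
```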

SimpleBinaryOperator and SimpleLiteral work in a similar way. For even more flexibility (e.g. function-like expressions with unusual syntax), you can use SimpleExpression.

val current_date = SimpleLiteral[java.sql.Date]("CURRENT_DATE")
salesPerDay.map(_ => current_date)
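A binary operator can be lifted the same way. A sketch, assuming the target database supports a bitwise & operator on integers (H2 historically used a BITAND function instead, so the operator name is an assumption here):

```scala
// Lift the infix operator "&"; the resulting function takes two
// untyped column expressions and produces a Rep[Int]
val bitAnd = SimpleBinaryOperator[Int]("&")

// Use it like any other lifted function in a query projection
val qBits = salesPerDay.map(s => bitAnd(s.count, LiteralColumn(7)))
```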

Other Database Functions And Stored Procedures

For database functions that return complete tables, and for stored procedures, please use Plain SQL Queries. Stored procedures that return multiple result sets are currently not supported.

Using Custom Scalar Types in Queries

If you need a custom column type you can implement ColumnType. The most common scenario is mapping an application-specific type to a type already supported by the database. This can be done much more simply with MappedColumnType.base, which takes care of all the boilerplate. It comes with the usual import from the profile.

// Custom data type for booleans that maps to NUMBER in database
object Bool extends Enumeration {
  type Bool = Value
  val True, False = Value

  // A ColumnType that maps it to NUMBER values 1 and 0
  val columnMapper: BaseColumnType[Bool] = MappedColumnType.base[Bool, Int](
    { case True => 1; case False => 0 }, // map Bool to NUMBER
    { i => if (i == 1) True else False } // map NUMBER to Bool
  )
}

// Make columnMapper available in table definitions and where you do queries
implicit val boolColumnType: BaseColumnType[Bool.Bool] = Bool.columnMapper

// You can now use Bool.{True, False} like any built-in column type (in tables, queries, etc.)
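With the implicit in scope, the mapped type can appear directly in table definitions and query conditions. A sketch with a hypothetical table (the name "flags" and its columns are not part of the example above):

```scala
// Hypothetical table using the mapped type: the "active" column is
// declared with the application type but stored as NUMBER 0/1
class Flags(tag: Tag) extends Table[(Int, Bool.Bool)](tag, "flags") {
  def id = column[Int]("id", O.PrimaryKey)
  def active = column[Bool.Bool]("active")
  def * = (id, active)
}
val flags = TableQuery[Flags]

// The mapped type also works in conditions; Bool.True is lifted
// to a column expression via the implicit column type
val activeFlags = flags.filter(_.active === Bool.True)
```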

You can also subclass MappedJdbcType for a bit more flexibility.
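A sketch of the subclassing variant, as an alternative to the boolColumnType implicit above (only one of the two implicits should be in scope; exact class locations may vary by Slick version). Overriding sqlTypeName is the kind of extra flexibility MappedColumnType.base does not offer:

```scala
// Same mapping as above, but with a custom SQL type name for DDL.
// MappedJdbcType is defined on the profile; BaseTypedType and
// FieldSymbol live in slick.ast.
implicit val boolJdbcType: BaseColumnType[Bool.Bool] =
  new MappedJdbcType[Bool.Bool, Int] with slick.ast.BaseTypedType[Bool.Bool] {
    def map(b: Bool.Bool): Int = if (b == Bool.True) 1 else 0
    def comap(i: Int): Bool.Bool = if (i == 1) Bool.True else Bool.False
    override def sqlTypeName(sym: Option[slick.ast.FieldSymbol]): String = "NUMBER(1)"
  }
```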

Using Custom Record Types in Queries

Record types are data structures containing a statically known number of components with individually declared types. Out of the box, Slick supports Scala tuples (up to arity 22) and Slick’s own HList implementation. Record types can be nested and mixed arbitrarily.

In order to use custom record types (case classes, custom HLists, tuple-like types, etc.) in queries you need to tell Slick how to map them between queries and results. You can do that using a Shape extending MappedScalaProductShape.

Polymorphic Types (e.g. Custom Tuple Types or HLists)

The distinguishing feature of a polymorphic record type is that it abstracts over its element types, so you can use the same record type for both lifted and plain element types. You can add support for custom polymorphic record types using an appropriate implicit Shape.

Here is an example for a type Pair:

// A custom record class
case class Pair[A, B](a: A, b: B)

// A Shape implementation for Pair
final class PairShape[Level <: ShapeLevel, M <: Pair[_, _], U <: Pair[_, _] : ClassTag, P <: Pair[_, _]](
    val shapes: Seq[Shape[_ <: ShapeLevel, _, _, _]]
) extends MappedScalaProductShape[Level, Pair[_, _], M, U, P] {
  def buildValue(elems: IndexedSeq[Any]): Any = Pair(elems(0), elems(1))
  def copy(shapes: Seq[Shape[_ <: ShapeLevel, _, _, _]]): Shape[Level, _, _, _] =
    new PairShape(shapes)
}

implicit def pairShape[Level <: ShapeLevel, M1, M2, U1, U2, P1, P2](
    implicit s1: Shape[_ <: Level, M1, U1, P1], s2: Shape[_ <: Level, M2, U2, P2]
): PairShape[Level, Pair[M1, M2], Pair[U1, U2], Pair[P1, P2]] =
  new PairShape[Level, Pair[M1, M2], Pair[U1, U2], Pair[P1, P2]](Seq(s1, s2))

The implicit method pairShape in this example provides a Shape for a Pair of two element types whenever Shapes for the individual element types are available.

With these definitions in place, we can use the Pair record type in every location in Slick where a tuple or HList would be acceptable:

// Use it in a table definition
class A(tag: Tag) extends Table[Pair[Int, String]](tag, "shape_a") {
  def id = column[Int]("id", O.PrimaryKey)
  def s = column[String]("s")
  def * = Pair(id, s)
}
val as = TableQuery[A]

// Insert data with the custom shape
val insertAction = DBIO.seq(
  as += Pair(1, "a"),
  as += Pair(2, "c"),
  as += Pair(3, "b")
)

// Use it for returning data from a query
val q2 = as
  .map { case a => Pair(a.id, (a.s ++ a.s)) }
  .filter { case Pair(id, _) => id =!= 1 }
  .sortBy { case Pair(_, ss) => ss }
  .map { case Pair(id, ss) => Pair(id, Pair(42, ss)) }
// returns: Vector(Pair(3,Pair(42,"bb")), Pair(2,Pair(42,"cc")))

Monomorphic Case Classes

Custom case classes are frequently used as monomorphic record types (i.e. record types where the element types are fixed). In order to use them in Slick, you need to define the case class for a record of plain values (as usual) plus an additional case class for a matching record of lifted values.

In order to provide a Shape for a custom case class, you can use CaseClassShape:

// two custom case class variants
case class LiftedB(a: Rep[Int], b: Rep[String])
object LiftedB {
  def tupled = (LiftedB.apply _).tupled
}
case class B(a: Int, b: String)

// custom case class mapping
implicit object BShape extends CaseClassShape[Product, (Rep[Int], Rep[String]), LiftedB, (Int, String), B](
  (LiftedB.apply _).tupled, (B.apply _).tupled)

class BRow(tag: Tag) extends Table[B](tag, "shape_b") {
  def id = column[Int]("id", O.PrimaryKey)
  def s = column[String]("s")
  def * = LiftedB(id, s)
}
val bs = TableQuery[BRow]

val insertActions = DBIO.seq(
  bs += B(1, "a"),
  bs.map(b => (b.id, b.s)) += ((2, "c")),
  bs += B(3, "b")
)

val q3 = bs
  .map { case b => LiftedB(b.id, (b.s ++ b.s)) }
  .filter { case LiftedB(id, _) => id =!= 1 }
  .sortBy { case LiftedB(_, ss) => ss }

// returns: Vector(B(3,"bb"), B(2,"cc"))

Note that this mechanism can be used as an alternative to client-side mappings with the <> operator. It requires a bit more boilerplate but allows you to use the same field names in both plain and lifted records.
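For comparison, a sketch of the same table mapped with <> (no lifted case class or Shape is needed, but rows are tuples on the lifted side, so the named-field pattern matching used in q3 is not available; the Scala 2 eta-expansion syntax matches the rest of this page, while in Scala 3 the unapply side needs to be wrapped to return an Option):

```scala
// Equivalent mapping with <>: B.tupled builds a B from a row tuple,
// B.unapply deconstructs a B into an Option of a tuple for writes
class BRow2(tag: Tag) extends Table[B](tag, "shape_b") {
  def id = column[Int]("id", O.PrimaryKey)
  def s = column[String]("s")
  def * = (id, s) <> ((B.apply _).tupled, B.unapply)
}
```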

Combining Mapped Types

In the following example, we combine a mapped case class and the mapped Pair type in another mapped case class.

// Combining multiple mapped types
case class LiftedC(p: Pair[Rep[Int], Rep[String]], b: LiftedB)
object LiftedC {
  def tupled = (LiftedC.apply _).tupled
}
case class C(p: Pair[Int, String], b: B)
object C {
  def tupled = (C.apply _).tupled
}

implicit object CShape extends CaseClassShape[Product, (Pair[Rep[Int], Rep[String]], LiftedB), LiftedC, (Pair[Int, String], B), C](
  (LiftedC.apply _).tupled, (C.apply _).tupled)

class CRow(tag: Tag) extends Table[C](tag, "shape_c") {
  def id = column[Int]("id")
  def s = column[String]("s")
  def projection = LiftedC(
    Pair(column("p1"), column("p2")), // (cols defined inline, type inferred)
    LiftedB(id, s)
  )
  def * = projection
}
val cs = TableQuery[CRow]

val insertActions2 = DBIO.seq(
  cs += C(Pair(7, "x"), B(1, "a")),
  cs += C(Pair(8, "y"), B(2, "c")),
  cs += C(Pair(9, "z"), B(3, "b"))
)

val q4 = cs
  .map { case c => LiftedC(c.projection.p, LiftedB(c.id, (c.s ++ c.s))) }
  .filter { case LiftedC(_, LiftedB(id, _)) => id =!= 1 }
  .sortBy { case LiftedC(Pair(_, p2), LiftedB(_, ss)) => ss ++ p2 }

// returns: Vector(C(Pair(9,"z"),B(3,"bb")), C(Pair(8,"y"),B(2,"cc")))
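The tupled constructors passed to CaseClassShape are ordinary functions. On the plain-value side, the nested conversion can be checked without a database; a self-contained sketch restating the plain record classes from above:

```scala
case class Pair[A, B](a: A, b: B)
case class B(a: Int, b: String)
case class C(p: Pair[Int, String], b: B)

// The conversion CaseClassShape applies when building plain C values
// from a tuple of their components
val fromTuple: ((Pair[Int, String], B)) => C = (C.apply _).tupled

val c = fromTuple((Pair(7, "x"), B(1, "a")))
assert(c == C(Pair(7, "x"), B(1, "a")))
```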