Packages

package transformer


Type Members

  1. case class ScalaClassSnowparkDfTransformer(name: String = "scalaSparkTransform", description: Option[String] = None, className: String, options: Map[String, String] = Map(), runtimeOptions: Map[String, String] = Map()) extends OptionsGenericDfTransformer with Product with Serializable

    Configuration of a custom Snowpark-DataFrame transformation between one input and one output (1:1) as a Java/Scala class. Define a transform function which receives a DataObjectId, a DataFrame and a map of options and has to return a DataFrame. The Java/Scala class has to implement the interface CustomSnowparkDfTransformer.

    name

    Name of the transformer

    description

    Optional description of the transformer

    className

    Class name implementing trait CustomSnowparkDfTransformer

    options

    Options to pass to the transformation

    runtimeOptions

    Optional tuples of [key, Spark SQL expression] to be added as additional options when executing the transformation. The Spark SQL expressions are evaluated against an instance of DefaultExpressionData.

    Annotations
    @Scaladoc()
  2. case class ScalaClassSnowparkDfsTransformer(name: String = "snowparkScalaTransform", description: Option[String] = None, className: String, options: Map[String, String] = Map(), runtimeOptions: Map[String, String] = Map()) extends OptionsGenericDfsTransformer with Product with Serializable

    Configuration of a custom Snowpark-DataFrame transformation between many inputs and many outputs (n:m) as a Java/Scala class. Define a transform function which receives a map of input DataObjectIds with DataFrames and a map of options and has to return a map of output DataObjectIds with DataFrames; see also trait CustomSnowparkDfsTransformer.

    name

    Name of the transformer

    description

    Optional description of the transformer

    className

    Class name implementing trait CustomSnowparkDfsTransformer

    options

    Options to pass to the transformation

    runtimeOptions

    Optional tuples of [key, Spark SQL expression] to be added as additional options when executing the transformation. The Spark SQL expressions are evaluated against an instance of DefaultExpressionData.

    Annotations
    @Scaladoc()
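The 1:1 contract described for ScalaClassSnowparkDfTransformer can be sketched as follows. To keep the sketch self-contained, DataFrame and CustomSnowparkDfTransformer below are simplified stand-ins for the real Snowpark/SDLB types (the real trait's signature may differ, e.g. it likely also receives a Snowpark Session); AddSourceColumnTransformer and the sourceSystem option key are hypothetical names.

```scala
// Simplified stand-in for the Snowpark DataFrame, just enough to show the shape.
case class DataFrame(rows: Seq[Map[String, Any]]) {
  def withColumn(name: String, value: Any): DataFrame =
    DataFrame(rows.map(_ + (name -> value)))
}

// Stand-in for the CustomSnowparkDfTransformer interface: a transform function
// that receives an options map, a DataFrame and a DataObjectId, and returns a DataFrame.
trait CustomSnowparkDfTransformer {
  def transform(options: Map[String, String], df: DataFrame, dataObjectId: String): DataFrame
}

// Hypothetical implementation: tags every row with a source system read from `options`
// (which would be populated from the transformer's options/runtimeOptions configuration).
class AddSourceColumnTransformer extends CustomSnowparkDfTransformer {
  def transform(options: Map[String, String], df: DataFrame, dataObjectId: String): DataFrame =
    df.withColumn("source_system", options.getOrElse("sourceSystem", "unknown"))
}
```

In a real pipeline, the fully qualified name of such a class would be passed as the className parameter of ScalaClassSnowparkDfTransformer.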

Value Members

  1. object ScalaClassSnowparkDfTransformer extends FromConfigFactory[GenericDfTransformer] with Serializable
  2. object ScalaClassSnowparkDfsTransformer extends FromConfigFactory[GenericDfsTransformer] with Serializable
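The n:m contract of ScalaClassSnowparkDfsTransformer can be sketched in the same way: the transform function receives a map of input DataObjectIds to DataFrames and returns a map of output DataObjectIds to DataFrames. Again, DataFrame and the trait are simplified stand-ins for the real Snowpark/SDLB types, and UnionTransformer with the output id "combined" is purely illustrative.

```scala
// Simplified stand-in for the Snowpark DataFrame, with a minimal union operation.
case class DataFrame(rows: Seq[Map[String, Any]]) {
  def unionAll(other: DataFrame): DataFrame = DataFrame(rows ++ other.rows)
}

// Stand-in for the CustomSnowparkDfsTransformer interface: maps input
// DataObjectIds with DataFrames to output DataObjectIds with DataFrames.
trait CustomSnowparkDfsTransformer {
  def transform(options: Map[String, String], dfs: Map[String, DataFrame]): Map[String, DataFrame]
}

// Hypothetical n:1 transformer: unions all input DataFrames into one output.
class UnionTransformer extends CustomSnowparkDfsTransformer {
  def transform(options: Map[String, String], dfs: Map[String, DataFrame]): Map[String, DataFrame] =
    Map("combined" -> dfs.values.reduce(_ unionAll _))
}
```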
