package sources

A set of APIs for adding data sources to Spark SQL.

Linear Supertypes
AnyRef, Any

Type Members

  1. case class AlwaysFalse() extends Filter with Product with Serializable

    A filter that always evaluates to false.

    Annotations
    @Evolving()
    Since

    3.0.0

  2. case class AlwaysTrue() extends Filter with Product with Serializable

    A filter that always evaluates to true.

    Annotations
    @Evolving()
    Since

    3.0.0

  3. case class And(left: Filter, right: Filter) extends Filter with Product with Serializable

    A filter that evaluates to true iff both left and right evaluate to true.

    Annotations
    @Stable()
    Since

    1.3.0

  4. abstract class BaseRelation extends AnyRef

    Represents a collection of tuples with a known schema. Classes that extend BaseRelation must be able to produce the schema of their data in the form of a StructType. Concrete implementations should inherit from one of the descendant Scan classes, which define various abstract methods for execution.

    BaseRelations must also define an equality function that only returns true when the two instances will return the same data. This equality function is used when determining when it is safe to substitute cached results for a given relation.

    Annotations
    @Stable()
    Since

    1.3.0
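
    Example: a minimal sketch of a concrete relation. The RangeRelation class and its constructor are hypothetical names chosen for illustration; it mixes in TableScan (see below) to provide execution.

      import org.apache.spark.rdd.RDD
      import org.apache.spark.sql.{Row, SQLContext}
      import org.apache.spark.sql.sources.{BaseRelation, TableScan}
      import org.apache.spark.sql.types.{LongType, StructField, StructType}

      // Hypothetical relation producing the numbers 0 until n as a one-column table.
      class RangeRelation(override val sqlContext: SQLContext, val n: Long)
          extends BaseRelation with TableScan {

        // The relation must expose its schema as a StructType.
        override def schema: StructType =
          StructType(StructField("id", LongType, nullable = false) :: Nil)

        // TableScan: produce all tuples as Rows.
        override def buildScan(): RDD[Row] =
          sqlContext.sparkContext.range(0, n).map(Row(_))

        // Equality must hold exactly when two instances return the same data,
        // so Spark can safely substitute cached results.
        override def equals(other: Any): Boolean = other match {
          case that: RangeRelation => that.n == n
          case _ => false
        }
        override def hashCode(): Int = n.hashCode
      }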

  5. trait CatalystScan extends AnyRef

    ::Experimental:: An interface for experimenting with a more direct connection to the query planner. Compared to PrunedFilteredScan, this operator receives the raw expressions from the org.apache.spark.sql.catalyst.plans.logical.LogicalPlan. Unlike the other APIs, this interface is NOT designed to be binary compatible across releases and thus should only be used for experimentation.

    Annotations
    @Unstable()
    Since

    1.3.0

  6. case class CollatedEqualNullSafe(attribute: String, value: Any, dataType: DataType) extends CollatedFilter with Product with Serializable

    Collation-aware equivalent of EqualNullSafe.

    Annotations
    @Evolving()
  7. case class CollatedEqualTo(attribute: String, value: Any, dataType: DataType) extends CollatedFilter with Product with Serializable

    Collation-aware equivalent of EqualTo.

    Annotations
    @Evolving()
  8. abstract class CollatedFilter extends Filter

    Base class for collation-aware string filters.

    Annotations
    @Evolving()
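
    Example: a hedged sketch of constructing collation-aware filters. How the collated string type of the column is obtained depends on the Spark version, so a plain StringType is used below purely as a stand-in.

      import org.apache.spark.sql.sources.{CollatedEqualTo, CollatedStringStartsWith, Filter}
      import org.apache.spark.sql.types.{DataType, StringType}

      // Placeholder: in practice this would be the collated string type of the column.
      val nameType: DataType = StringType

      val byName: Filter = CollatedEqualTo("name", "Alice", nameType)
      val byPrefix: Filter = CollatedStringStartsWith("name", "Al", nameType)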
  9. case class CollatedGreaterThan(attribute: String, value: Any, dataType: DataType) extends CollatedFilter with Product with Serializable

    Collation-aware equivalent of GreaterThan.

    Annotations
    @Evolving()
  10. case class CollatedGreaterThanOrEqual(attribute: String, value: Any, dataType: DataType) extends CollatedFilter with Product with Serializable

    Collation-aware equivalent of GreaterThanOrEqual.

    Annotations
    @Evolving()
  11. case class CollatedIn(attribute: String, values: Array[Any], dataType: DataType) extends CollatedFilter with Product with Serializable

    Collation-aware equivalent of In.

    Annotations
    @Evolving()
  12. case class CollatedLessThan(attribute: String, value: Any, dataType: DataType) extends CollatedFilter with Product with Serializable

    Collation-aware equivalent of LessThan.

    Annotations
    @Evolving()
  13. case class CollatedLessThanOrEqual(attribute: String, value: Any, dataType: DataType) extends CollatedFilter with Product with Serializable

    Collation-aware equivalent of LessThanOrEqual.

    Annotations
    @Evolving()
  14. case class CollatedStringContains(attribute: String, value: String, dataType: DataType) extends CollatedFilter with Product with Serializable

    Collation-aware equivalent of StringContains.

    Annotations
    @Evolving()
  15. case class CollatedStringEndsWith(attribute: String, value: String, dataType: DataType) extends CollatedFilter with Product with Serializable

    Collation-aware equivalent of StringEndsWith.

    Annotations
    @Evolving()
  16. case class CollatedStringStartsWith(attribute: String, value: String, dataType: DataType) extends CollatedFilter with Product with Serializable

    Collation-aware equivalent of StringStartsWith.

    Annotations
    @Evolving()
  17. trait CreatableRelationProvider extends AnyRef

    Annotations
    @Stable()
    Since

    1.3.0

  18. trait DataSourceRegister extends AnyRef

    Data sources should implement this trait so that they can register an alias to their data source. This allows users to specify the alias as the format type instead of the fully qualified class name.

    A new instance of this class will be instantiated each time a DDL call is made.

    Annotations
    @Stable()
    Since

    1.5.0
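
    Example: a minimal sketch; the JsonLinesSource class and the "jsonlines" alias are hypothetical.

      import org.apache.spark.sql.sources.DataSourceRegister

      // Users can then write .format("jsonlines") instead of the fully qualified
      // class name, provided the class is also listed in
      // META-INF/services/org.apache.spark.sql.sources.DataSourceRegister.
      class JsonLinesSource extends DataSourceRegister {
        override def shortName(): String = "jsonlines"
      }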

  19. case class EqualNullSafe(attribute: String, value: Any) extends Filter with Product with Serializable

    Performs equality comparison, similar to EqualTo. However, this differs from EqualTo in that it returns true (rather than NULL) if both inputs are NULL, and false (rather than NULL) if one of the inputs is NULL and the other is not.

    attribute

    name of the column to be evaluated; dots are used as separators for nested columns. If any part of the name contains dots, it is quoted to avoid confusion.

    Annotations
    @Stable()
    Since

    1.5.0
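
    Example: constructing the filters by hand, only to illustrate the null handling (the column name is arbitrary).

      import org.apache.spark.sql.sources.{EqualNullSafe, EqualTo}

      // Matches rows whose middleName is NULL; a plain EqualTo("middleName", null)
      // follows SQL comparison semantics and never evaluates to true.
      val isMissing = EqualNullSafe("middleName", null)

      // Matches rows whose middleName equals "Ann"; rows with a NULL middleName
      // do not match.
      val isAnn = EqualTo("middleName", "Ann")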

  20. case class EqualTo(attribute: String, value: Any) extends Filter with Product with Serializable

    A filter that evaluates to true iff the column evaluates to a value equal to value.

    attribute

    name of the column to be evaluated; dots are used as separators for nested columns. If any part of the name contains dots, it is quoted to avoid confusion.

    Annotations
    @Stable()
    Since

    1.3.0

  21. sealed abstract class Filter extends AnyRef

    A filter predicate for data sources. Mapping between Spark SQL types and filter value types follows the convention for the return type of org.apache.spark.sql.Row#get(int).

    Annotations
    @Stable()
    Since

    1.3.0
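
    Example: a brief sketch of composing filter predicates (the column names are arbitrary).

      import org.apache.spark.sql.sources._

      // (age > 21 AND city = 'SF') OR name starts with 'A'
      val pushed: Filter = Or(
        And(GreaterThan("age", 21), EqualTo("city", "SF")),
        StringStartsWith("name", "A"))

      // references lists the column names the filter touches: age, city, name.
      val columns: Array[String] = pushed.references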

  22. case class GreaterThan(attribute: String, value: Any) extends Filter with Product with Serializable

    A filter that evaluates to true iff the attribute evaluates to a value greater than value.

    attribute

    name of the column to be evaluated; dots are used as separators for nested columns. If any part of the name contains dots, it is quoted to avoid confusion.

    Annotations
    @Stable()
    Since

    1.3.0

  23. case class GreaterThanOrEqual(attribute: String, value: Any) extends Filter with Product with Serializable

    A filter that evaluates to true iff the attribute evaluates to a value greater than or equal to value.

    attribute

    name of the column to be evaluated; dots are used as separators for nested columns. If any part of the name contains dots, it is quoted to avoid confusion.

    Annotations
    @Stable()
    Since

    1.3.0

  24. case class In(attribute: String, values: Array[Any]) extends Filter with Product with Serializable

    A filter that evaluates to true iff the attribute evaluates to one of the values in the array.

    attribute

    name of the column to be evaluated; dots are used as separators for nested columns. If any part of the name contains dots, it is quoted to avoid confusion.

    Annotations
    @Stable()
    Since

    1.3.0

  25. trait InsertableRelation extends AnyRef

    A BaseRelation that can be used to insert data into it through the insert method. If overwrite in insert method is true, the old data in the relation should be overwritten with the new data. If overwrite in insert method is false, the new data should be appended.

    InsertableRelation makes the following three assumptions:

    1. The data (Rows in the DataFrame) provided to the insert method exactly matches the ordinal of fields in the schema of the BaseRelation.
    2. The schema of this relation will not be changed. Even if the insert method updates the schema (e.g. a relation of JSON or Parquet data may have a schema update after an insert operation), the new schema will not be used.
    3. Fields of the data provided in the insert method are nullable. If a data source needs to check the actual nullability of a field, it needs to do it in the insert method.

    Annotations
    @Stable()
    Since

    1.3.0
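
    Example: a minimal sketch, with println standing in for a real write path.

      import org.apache.spark.sql.DataFrame
      import org.apache.spark.sql.sources.InsertableRelation

      class LoggingRelation extends InsertableRelation {
        override def insert(data: DataFrame, overwrite: Boolean): Unit = {
          // Rows arrive in the ordinal order of this relation's schema and must be
          // treated as nullable; check actual nullability here if the sink needs it.
          val mode = if (overwrite) "overwrite" else "append"
          println(s"$mode: ${data.count()} rows") // placeholder for the real write
        }
      }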

  26. case class IsNotNull(attribute: String) extends Filter with Product with Serializable

    A filter that evaluates to true iff the attribute evaluates to a non-null value.

    attribute

    name of the column to be evaluated; dots are used as separators for nested columns. If any part of the name contains dots, it is quoted to avoid confusion.

    Annotations
    @Stable()
    Since

    1.3.0

  27. case class IsNull(attribute: String) extends Filter with Product with Serializable

    A filter that evaluates to true iff the attribute evaluates to null.

    attribute

    name of the column to be evaluated; dots are used as separators for nested columns. If any part of the name contains dots, it is quoted to avoid confusion.

    Annotations
    @Stable()
    Since

    1.3.0

  28. case class LessThan(attribute: String, value: Any) extends Filter with Product with Serializable

    A filter that evaluates to true iff the attribute evaluates to a value less than value.

    attribute

    name of the column to be evaluated; dots are used as separators for nested columns. If any part of the name contains dots, it is quoted to avoid confusion.

    Annotations
    @Stable()
    Since

    1.3.0

  29. case class LessThanOrEqual(attribute: String, value: Any) extends Filter with Product with Serializable

    A filter that evaluates to true iff the attribute evaluates to a value less than or equal to value.

    attribute

    name of the column to be evaluated; dots are used as separators for nested columns. If any part of the name contains dots, it is quoted to avoid confusion.

    Annotations
    @Stable()
    Since

    1.3.0

  30. case class Not(child: Filter) extends Filter with Product with Serializable

    A filter that evaluates to true iff child evaluates to false.

    Annotations
    @Stable()
    Since

    1.3.0

  31. case class Or(left: Filter, right: Filter) extends Filter with Product with Serializable

    A filter that evaluates to true iff at least one of left or right evaluates to true.

    Annotations
    @Stable()
    Since

    1.3.0

  32. trait PrunedFilteredScan extends AnyRef

    A BaseRelation that can eliminate unneeded columns and filter using selected predicates before producing an RDD containing all matching tuples as Row objects.

    The actual filter should be the conjunction of all filters, i.e. they should be ANDed together.

    The pushed down filters are currently purely an optimization as they will all be evaluated again. This means it is safe to use them with methods that produce false positives such as filtering partitions based on a bloom filter.

    Annotations
    @Stable()
    Since

    1.3.0
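
    Example: a sketch of the pattern over a hypothetical in-memory relation of (name, age) pairs. Only two filter cases are handled, which is safe because Spark re-evaluates every filter after the scan.

      import org.apache.spark.rdd.RDD
      import org.apache.spark.sql.{Row, SQLContext}
      import org.apache.spark.sql.sources._
      import org.apache.spark.sql.types.{IntegerType, StringType, StructField, StructType}

      class PeopleRelation(override val sqlContext: SQLContext, people: Seq[(String, Int)])
          extends BaseRelation with PrunedFilteredScan {

        override def schema: StructType = StructType(Seq(
          StructField("name", StringType), StructField("age", IntegerType)))

        override def buildScan(requiredColumns: Array[String], filters: Array[Filter]): RDD[Row] = {
          // The pushed filters are conjunctive; applying them here is only an optimization.
          val kept = people.filter { case (name, age) =>
            filters.forall {
              case GreaterThan("age", v: Int) => age > v
              case EqualTo("name", v: String) => name == v
              case _ => true // leave unhandled filters for Spark to evaluate
            }
          }
          // Emit only the requested columns, in the requested order.
          sqlContext.sparkContext.parallelize(kept.map { case (name, age) =>
            Row.fromSeq(requiredColumns.toSeq.map { case "name" => name; case "age" => age })
          })
        }
      }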

  33. trait PrunedScan extends AnyRef

    A BaseRelation that can eliminate unneeded columns before producing an RDD containing all of its tuples as Row objects.

    Annotations
    @Stable()
    Since

    1.3.0

  34. trait RelationProvider extends AnyRef

    Implemented by objects that produce relations for a specific kind of data source. When Spark SQL is given a DDL operation with a USING clause specified (to specify the implemented RelationProvider), this interface is used to pass in the parameters specified by a user.

    Users may specify the fully qualified class name of a given data source. When that class is not found, Spark SQL will append the class name DefaultSource to the path, allowing for less verbose invocation. For example, 'org.apache.spark.sql.json' would resolve to the data source 'org.apache.spark.sql.json.DefaultSource'.

    A new instance of this class will be instantiated each time a DDL call is made.

    Annotations
    @Stable()
    Since

    1.3.0
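
    Example: a sketch of a provider, reusing the hypothetical RangeRelation sketched under BaseRelation above; the package name and the "n" option are illustrative.

      import org.apache.spark.sql.SQLContext
      import org.apache.spark.sql.sources.{BaseRelation, RelationProvider}

      // If this class lives in package com.example.range as DefaultSource,
      // then USING com.example.range resolves to it.
      class DefaultSource extends RelationProvider {
        override def createRelation(
            sqlContext: SQLContext,
            parameters: Map[String, String]): BaseRelation = {
          val n = parameters.getOrElse("n", "10").toLong
          new RangeRelation(sqlContext, n) // hypothetical relation from the BaseRelation sketch
        }
      }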

  35. trait SchemaRelationProvider extends AnyRef

    Implemented by objects that produce relations for a specific kind of data source with a given schema. When Spark SQL is given a DDL operation with a USING clause specified (to specify the implemented SchemaRelationProvider) and a user-defined schema, this interface is used to pass in the parameters specified by a user.

    Users may specify the fully qualified class name of a given data source. When that class is not found, Spark SQL will append the class name DefaultSource to the path, allowing for less verbose invocation. For example, 'org.apache.spark.sql.json' would resolve to the data source 'org.apache.spark.sql.json.DefaultSource'.

    A new instance of this class will be instantiated each time a DDL call is made.

    The difference between a RelationProvider and a SchemaRelationProvider is that users need to provide a schema when using a SchemaRelationProvider. A relation provider can inherit both RelationProvider and SchemaRelationProvider if it can support both schema inference and user-specified schemas.

    Annotations
    @Stable()
    Since

    1.3.0
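
    Example: a sketch of a provider supporting both schema inference and a user-specified schema. The inferSchema helper is a placeholder, and the returned relation is a bare stub (no Scan trait mixed in) only to keep the sketch self-contained.

      import org.apache.spark.sql.SQLContext
      import org.apache.spark.sql.sources.{BaseRelation, RelationProvider, SchemaRelationProvider}
      import org.apache.spark.sql.types.StructType

      class DefaultSource extends RelationProvider with SchemaRelationProvider {

        // Called when the user gives no schema: fall back to inference.
        override def createRelation(
            sqlContext: SQLContext,
            parameters: Map[String, String]): BaseRelation =
          createRelation(sqlContext, parameters, inferSchema(parameters))

        // Called when the user supplies a schema.
        override def createRelation(
            sqlContext: SQLContext,
            parameters: Map[String, String],
            schema: StructType): BaseRelation = {
          val ctx = sqlContext
          val userSchema = schema
          new BaseRelation {
            override def sqlContext: SQLContext = ctx
            override def schema: StructType = userSchema
          }
        }

        // Placeholder inference; a real source would examine the data.
        private def inferSchema(parameters: Map[String, String]): StructType =
          new StructType().add("value", "string")
      }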

  36. trait StreamSinkProvider extends AnyRef

    ::Experimental:: Implemented by objects that can produce a streaming Sink for a specific format or system.

    Annotations
    @Unstable()
    Since

    2.0.0

  37. trait StreamSourceProvider extends AnyRef

    ::Experimental:: Implemented by objects that can produce a streaming Source for a specific format or system.

    Annotations
    @Unstable()
    Since

    2.0.0

  38. case class StringContains(attribute: String, value: String) extends Filter with Product with Serializable

    A filter that evaluates to true iff the attribute evaluates to a string that contains the string value.

    attribute

    name of the column to be evaluated; dots are used as separators for nested columns. If any part of the name contains dots, it is quoted to avoid confusion.

    Annotations
    @Stable()
    Since

    1.3.1

  39. case class StringEndsWith(attribute: String, value: String) extends Filter with Product with Serializable

    A filter that evaluates to true iff the attribute evaluates to a string that ends with value.

    attribute

    name of the column to be evaluated; dots are used as separators for nested columns. If any part of the name contains dots, it is quoted to avoid confusion.

    Annotations
    @Stable()
    Since

    1.3.1

  40. case class StringStartsWith(attribute: String, value: String) extends Filter with Product with Serializable

    A filter that evaluates to true iff the attribute evaluates to a string that starts with value.

    attribute

    name of the column to be evaluated; dots are used as separators for nested columns. If any part of the name contains dots, it is quoted to avoid confusion.

    Annotations
    @Stable()
    Since

    1.3.1

  41. trait SupportsStreamSourceMetadataColumns extends StreamSourceProvider

    Implemented by StreamSourceProvider objects that can generate file metadata columns. This trait extends the basic StreamSourceProvider by allowing the addition of metadata columns to the schema of the stream data source.

  42. trait TableScan extends AnyRef

    A BaseRelation that can produce all of its tuples as an RDD of Row objects.

    Annotations
    @Stable()
    Since

    1.3.0

Value Members

  1. object AlwaysFalse extends AlwaysFalse
    Annotations
    @Evolving()
  2. object AlwaysTrue extends AlwaysTrue
    Annotations
    @Evolving()
