
object BigQuery extends Google

Java API to interface with BigQuery.

Annotations
@ApiMayChange()
Source
BigQuery.scala
Linear Supertypes
Google, AnyRef, Any

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##: Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  5. def cancelJob(jobId: String, location: Optional[String], settings: GoogleSettings, system: ClassicActorSystemProvider): CompletionStage[JobCancelResponse]

    Requests that a job be cancelled.

    jobId

    job ID of the job to cancel

    location

    the geographic location of the job. Required except for US and EU

    settings

    the akka.stream.alpakka.google.GoogleSettings

    system

    the actor system

    returns

    a java.util.concurrent.CompletionStage containing the akka.stream.alpakka.googlecloud.bigquery.model.JobCancelResponse

    See also

    BigQuery reference
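
    A minimal sketch of cancelling a job (Java; ActorSystem from akka.actor, GoogleSettings from akka.stream.alpakka.google, model classes from akka.stream.alpakka.googlecloud.bigquery.model; the job ID is hypothetical and GoogleSettings.create(system) is assumed to load the connector's default configuration):

        ActorSystem system = ActorSystem.create();
        GoogleSettings settings = GoogleSettings.create(system); // assumed factory reading the default config

        // Location left empty because the job runs in the US or EU multi-region
        CompletionStage<JobCancelResponse> cancelled =
            BigQuery.cancelJob("my-job-id", Optional.empty(), settings, system);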

  6. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.CloneNotSupportedException]) @native()
  7. def createDataset(dataset: Dataset, settings: GoogleSettings, system: ClassicActorSystemProvider): CompletionStage[Dataset]

    Creates a new empty dataset.

  8. def createDataset(datasetId: String, settings: GoogleSettings, system: ClassicActorSystemProvider): CompletionStage[Dataset]

    Creates a new empty dataset.

    datasetId

    dataset ID of the new dataset

    settings

    the akka.stream.alpakka.google.GoogleSettings

    system

    the actor system

    returns

    a java.util.concurrent.CompletionStage containing the akka.stream.alpakka.googlecloud.bigquery.model.Dataset

    See also

    BigQuery reference
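
    A minimal sketch (Java; the dataset ID is hypothetical, settings and system as in the cancelJob example above):

        CompletionStage<Dataset> created =
            BigQuery.createDataset("my_new_dataset", settings, system);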

  9. def createLoadJob[Job](job: Job, marshaller: Marshaller[Job, RequestEntity], unmarshaller: Unmarshaller[HttpEntity, Job]): Sink[ByteString, CompletionStage[Job]]

    Starts a new asynchronous upload job.

    Job

    the data model for a job

    job

    the job to start

    marshaller

    akka.http.javadsl.marshalling.Marshaller for Job

    unmarshaller

    akka.http.javadsl.unmarshalling.Unmarshaller for Job

    returns

    an akka.stream.javadsl.Sink that uploads bytes and materializes a java.util.concurrent.CompletionStage containing the Job when completed

    Note

    WARNING: Pending the resolution of BigQuery issue 176002651, this method may not work as expected. As a workaround, you can use the config setting akka.http.parsing.conflicting-content-type-header-processing-mode = first with Akka HTTP v10.2.4 or later.

    See also

    BigQuery reference

    BigQuery reference

  10. def createTable(table: Table, settings: GoogleSettings, system: ClassicActorSystemProvider): CompletionStage[Table]

    Creates a new, empty table in the dataset.

  11. def createTable(datasetId: String, tableId: String, schema: TableSchema, settings: GoogleSettings, system: ClassicActorSystemProvider): CompletionStage[Table]

    Creates a new, empty table in the dataset.

    datasetId

    dataset ID of the new table

    tableId

    table ID of the new table

    schema

    akka.stream.alpakka.googlecloud.bigquery.model.TableSchema of the new table

    settings

    the akka.stream.alpakka.google.GoogleSettings

    system

    the actor system

    returns

    a java.util.concurrent.CompletionStage containing the akka.stream.alpakka.googlecloud.bigquery.model.Table

    See also

    BigQuery reference
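
    A sketch of creating a table from an already-built schema (Java; the IDs are hypothetical and the TableSchema is assumed to be constructed elsewhere with the model's factory methods):

        CompletionStage<Table> createMyTable(TableSchema schema, GoogleSettings settings, ActorSystem system) {
          // dataset and table IDs are placeholders
          return BigQuery.createTable("my_dataset", "my_new_table", schema, settings, system);
        }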

  12. def deleteDataset(datasetId: String, deleteContents: Boolean, settings: GoogleSettings, system: ClassicActorSystemProvider): CompletionStage[Done]

    Deletes the dataset specified by the datasetId value.

    datasetId

    dataset ID of dataset being deleted

    deleteContents

    if true, delete all the tables in the dataset; if false and the dataset contains tables, the request will fail

    settings

    the akka.stream.alpakka.google.GoogleSettings

    system

    the actor system

    returns

    a java.util.concurrent.CompletionStage containing akka.Done
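
    For example, deleting a dataset together with any tables it still contains (Java; the dataset ID is hypothetical):

        // deleteContents = true also removes all tables remaining in the dataset
        CompletionStage<Done> done =
            BigQuery.deleteDataset("my_old_dataset", true, settings, system);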

  13. def deleteTable(datasetId: String, tableId: String, settings: GoogleSettings, system: ClassicActorSystemProvider): CompletionStage[Done]

    Deletes the specified table from the dataset. If the table contains data, all the data will be deleted.

    datasetId

    dataset ID of the table to delete

    tableId

    table ID of the table to delete

    settings

    the akka.stream.alpakka.google.GoogleSettings

    system

    the actor system

    returns

    a java.util.concurrent.CompletionStage containing akka.Done

    See also

    BigQuery reference

  14. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  15. def equals(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef → Any
  16. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.Throwable])
  17. final def getClass(): Class[_ <: AnyRef]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  18. def getDataset(datasetId: String, settings: GoogleSettings, system: ClassicActorSystemProvider): CompletionStage[Dataset]

    Returns the specified dataset.

    datasetId

    dataset ID of the requested dataset

    settings

    the akka.stream.alpakka.google.GoogleSettings

    system

    the actor system

    returns

    a java.util.concurrent.CompletionStage containing the akka.stream.alpakka.googlecloud.bigquery.model.Dataset

    See also

    BigQuery reference

  19. def getJob(jobId: String, location: Optional[String], settings: GoogleSettings, system: ClassicActorSystemProvider): CompletionStage[Job]

    Returns information about a specific job.

    jobId

    job ID of the requested job

    location

    the geographic location of the job. Required except for US and EU

    settings

    the akka.stream.alpakka.google.GoogleSettings

    system

    the actor system

    returns

    a java.util.concurrent.CompletionStage containing the Job

    See also

    BigQuery reference
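
    A sketch of polling a previously started job, for example one created by insertAllAsync (Java; the job ID is hypothetical and the location is left empty for the US/EU multi-region):

        CompletionStage<Job> job =
            BigQuery.getJob("my-load-job-id", Optional.empty(), settings, system);
        job.thenAccept(j -> System.out.println(j)); // the returned Job carries the job's status and statistics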

  20. def getQueryResults[Out](jobId: String, startIndex: OptionalLong, maxResults: OptionalInt, timeout: Optional[Duration], location: Optional[String], unmarshaller: Unmarshaller[HttpEntity, QueryResponse[Out]]): Source[Out, CompletionStage[QueryResponse[Out]]]

    The results of a query job.

    Out

    the data model of the query results

    jobId

    job ID of the query job

    startIndex

    zero-based index of the starting row

    maxResults

    maximum number of results to read

    timeout

    specifies the maximum amount of time that the client is willing to wait for the query to complete

    location

    the geographic location of the job. Required except for US and EU

    unmarshaller

    akka.http.javadsl.unmarshalling.Unmarshaller for akka.stream.alpakka.googlecloud.bigquery.model.QueryResponse

    returns

    an akka.stream.javadsl.Source that emits an Out for each row of the results and materializes a java.util.concurrent.CompletionStage containing the akka.stream.alpakka.googlecloud.bigquery.model.QueryResponse

    See also

    BigQuery reference
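
    A sketch of streaming all result rows of a completed query job into a list (Java; Row and rowUnmarshaller are assumptions, the unmarshaller typically coming from the connector's JSON support):

        <Row> CompletionStage<List<Row>> fetchAllRows(
            String jobId,
            Unmarshaller<HttpEntity, QueryResponse<Row>> rowUnmarshaller,
            ActorSystem system) {
          return BigQuery.getQueryResults(
                  jobId,
                  OptionalLong.empty(), // start at the first row
                  OptionalInt.empty(),  // no page-size cap
                  Optional.empty(),     // no timeout override
                  Optional.empty(),     // US/EU multi-region
                  rowUnmarshaller)
              .runWith(Sink.seq(), system);
        }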

  21. def getTable(datasetId: String, tableId: String, settings: GoogleSettings, system: ClassicActorSystemProvider): CompletionStage[Table]

    Gets the specified table resource. This method does not return the data in the table; it only returns the table resource, which describes the structure of this table.

    datasetId

    dataset ID of the requested table

    tableId

    table ID of the requested table

    settings

    the akka.stream.alpakka.google.GoogleSettings

    system

    the actor system

    returns

    a java.util.concurrent.CompletionStage containing the akka.stream.alpakka.googlecloud.bigquery.model.Table

    See also

    BigQuery reference

  22. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  23. def insertAll[In](datasetId: String, tableId: String, retryFailedRequests: Boolean, marshaller: Marshaller[TableDataInsertAllRequest[In], RequestEntity]): Flow[TableDataInsertAllRequest[In], TableDataInsertAllResponse, NotUsed]

    Streams data into BigQuery one record at a time without needing to run a load job.

    In

    the data model for each record

    datasetId

    dataset ID of the table to insert into

    tableId

    table ID of the table to insert into

    retryFailedRequests

    whether to retry failed requests

    marshaller

    akka.http.javadsl.marshalling.Marshaller for akka.stream.alpakka.googlecloud.bigquery.model.TableDataInsertAllRequest

    returns

    an akka.stream.javadsl.Flow that sends each akka.stream.alpakka.googlecloud.bigquery.model.TableDataInsertAllRequest and emits an akka.stream.alpakka.googlecloud.bigquery.model.TableDataInsertAllResponse for each request

    See also

    BigQuery reference
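
    A sketch wiring this flow into a stream of prepared insert requests (Java; Row, requestMarshaller and the requests source are assumptions):

        Flow<TableDataInsertAllRequest<Row>, TableDataInsertAllResponse, NotUsed> insertFlow =
            BigQuery.insertAll("my_dataset", "my_table", true, requestMarshaller); // true = retry failed requests

        requests // Source<TableDataInsertAllRequest<Row>, ?>
            .via(insertFlow)
            .runWith(Sink.foreach(response -> System.out.println(response)), system);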

  24. def insertAll[In](datasetId: String, tableId: String, retryPolicy: InsertAllRetryPolicy, templateSuffix: Optional[String], marshaller: Marshaller[TableDataInsertAllRequest[In], RequestEntity]): Sink[List[In], NotUsed]

    Streams data into BigQuery one record at a time without needing to run a load job.

    In

    the data model for each record

    datasetId

    dataset ID of the table to insert into

    tableId

    table ID of the table to insert into

    retryPolicy

    InsertAllRetryPolicy determining whether to retry and deduplicate

    templateSuffix

    if specified, treats the destination table as a base template, and inserts the rows into an instance table named "{destination}{templateSuffix}"

    marshaller

    akka.http.javadsl.marshalling.Marshaller for akka.stream.alpakka.googlecloud.bigquery.model.TableDataInsertAllRequest

    returns

    an akka.stream.javadsl.Sink that inserts each batch of In into the table

    See also

    BigQuery reference
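
    A sketch of batching rows and feeding them to the sink (Java; Row, requestMarshaller, retryPolicy and the rows source are assumptions):

        Sink<List<Row>, NotUsed> insertSink =
            BigQuery.insertAll(
                "my_dataset",
                "my_table",
                retryPolicy,        // an InsertAllRetryPolicy value
                Optional.empty(),   // no template suffix
                requestMarshaller);

        rows.grouped(500) // Source<Row, ?> batched into List<Row> groups
            .runWith(insertSink, system);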

  25. def insertAllAsync[In](datasetId: String, tableId: String, labels: Optional[Map[String, String]], marshaller: Marshaller[In, RequestEntity]): Flow[In, Job, NotUsed]

    Loads data into BigQuery via a series of asynchronous load jobs created at the rate akka.stream.alpakka.googlecloud.bigquery.BigQuerySettings.loadJobPerTableQuota.

    In

    the data model for each record

    datasetId

    dataset ID of the table to insert into

    tableId

    table ID of the table to insert into

    marshaller

    akka.http.javadsl.marshalling.Marshaller for In

    returns

    an akka.stream.javadsl.Flow that uploads each In and emits a Job for every upload job created

    Note

    WARNING: Pending the resolution of BigQuery issue 176002651, this method may not work as expected. As a workaround, you can use the config setting akka.http.parsing.conflicting-content-type-header-processing-mode = first with Akka HTTP v10.2.4 or later.

  26. def insertAllAsync[In](datasetId: String, tableId: String, marshaller: Marshaller[In, RequestEntity]): Flow[In, Job, NotUsed]

    Loads data into BigQuery via a series of asynchronous load jobs created at the rate akka.stream.alpakka.googlecloud.bigquery.BigQuerySettings.loadJobPerTableQuota.

    In

    the data model for each record

    datasetId

    dataset ID of the table to insert into

    tableId

    table ID of the table to insert into

    marshaller

    akka.http.javadsl.marshalling.Marshaller for In

    returns

    an akka.stream.javadsl.Flow that uploads each In and emits a Job for every upload job created

    Note

    WARNING: Pending the resolution of BigQuery issue 176002651, this method may not work as expected. As a workaround, you can use the config setting akka.http.parsing.conflicting-content-type-header-processing-mode = first with Akka HTTP v10.2.4 or later.
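
    A sketch of loading a stream of rows via asynchronous load jobs (Java; Row, rowMarshaller and the rows source are assumptions; note the conflicting-content-type-header workaround above):

        Flow<Row, Job, NotUsed> loadJobs =
            BigQuery.insertAllAsync("my_dataset", "my_table", rowMarshaller);

        rows // Source<Row, ?>
            .via(loadJobs)
            .runWith(Sink.foreach(job -> System.out.println(job)), system); // one Job per created load job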

  27. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  28. def listDatasets(maxResults: OptionalInt, all: Optional[Boolean], filter: Map[String, String]): Source[Dataset, NotUsed]

    Lists all datasets in the specified project to which the user has been granted the READER dataset role.

    maxResults

    the maximum number of results to return in a single response page

    all

    whether to list all datasets, including hidden ones

    filter

    a key, value java.util.Map for filtering the results of the request by label

    returns

    an akka.stream.javadsl.Source that emits each akka.stream.alpakka.googlecloud.bigquery.model.Dataset

    See also

    BigQuery reference
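
    For example, listing all datasets (including hidden ones) that carry a given label (Java; the label key and value are hypothetical):

        Source<Dataset, NotUsed> datasets =
            BigQuery.listDatasets(
                OptionalInt.empty(),                      // no page-size cap
                Optional.of(true),                        // include hidden datasets
                Collections.singletonMap("env", "prod")); // filter by label

        CompletionStage<List<Dataset>> all = datasets.runWith(Sink.seq(), system);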

  29. def listTableData[Out](datasetId: String, tableId: String, startIndex: OptionalLong, maxResults: OptionalInt, selectedFields: List[String], unmarshaller: Unmarshaller[HttpEntity, TableDataListResponse[Out]]): Source[Out, CompletionStage[TableDataListResponse[Out]]]

    Lists the content of a table in rows.

    Out

    the data model of each row

    datasetId

    dataset ID of the table to list

    tableId

    table ID of the table to list

    startIndex

    start row index of the table

    maxResults

    row limit of the table

    selectedFields

    subset of fields to return; supports selecting nested sub-fields. Example: selectedFields = List.of("a", "e.d.f")

    unmarshaller

    akka.http.javadsl.unmarshalling.Unmarshaller for akka.stream.alpakka.googlecloud.bigquery.model.TableDataListResponse

    returns

    an akka.stream.javadsl.Source that emits an Out for each row in the table

    See also

    BigQuery reference
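
    A sketch of reading selected (sub)fields from a table (Java; Row and rowUnmarshaller are assumptions):

        Source<Row, CompletionStage<TableDataListResponse<Row>>> tableData =
            BigQuery.listTableData(
                "my_dataset",
                "my_table",
                OptionalLong.empty(),   // start at the first row
                OptionalInt.empty(),    // no row limit
                List.of("a", "e.d.f"),  // only these fields / sub-fields
                rowUnmarshaller);

        tableData.runWith(Sink.foreach(row -> System.out.println(row)), system);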

  30. def listTables(datasetId: String, maxResults: OptionalInt): Source[Table, CompletionStage[TableListResponse]]

    Lists all tables in the specified dataset.

    datasetId

    dataset ID of the tables to list

    maxResults

    the maximum number of results to return in a single response page

    returns

    an akka.stream.javadsl.Source that emits each akka.stream.alpakka.googlecloud.bigquery.model.Table in the dataset and materializes a java.util.concurrent.CompletionStage containing the akka.stream.alpakka.googlecloud.bigquery.model.TableListResponse

    See also

    BigQuery reference
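
    A minimal sketch (Java; the dataset ID is hypothetical):

        Source<Table, CompletionStage<TableListResponse>> tables =
            BigQuery.listTables("my_dataset", OptionalInt.empty());

        CompletionStage<List<Table>> allTables = tables.runWith(Sink.seq(), system);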

  31. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  32. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  33. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  34. final def paginatedRequest[Out <: Paginated](request: HttpRequest, unmarshaller: Unmarshaller[HttpResponse, Out]): Source[Out, NotUsed]

    Makes a series of requests to page through a resource. Authentication is handled automatically. Requests are retried if the unmarshaller throws an akka.stream.alpakka.google.util.Retry.

    Out

    the data model for each page of the resource

    request

    the initial akka.http.javadsl.model.HttpRequest to make; must be a GET request

    returns

    an akka.stream.javadsl.Source that emits an Out for each page of the resource

    Definition Classes
    Google
  35. def query[Out](query: QueryRequest, unmarshaller: Unmarshaller[HttpEntity, QueryResponse[Out]]): Source[Out, Pair[CompletionStage[JobReference], CompletionStage[QueryResponse[Out]]]]

    Runs a BigQuery SQL query.

  36. def query[Out](query: String, dryRun: Boolean, useLegacySql: Boolean, unmarshaller: Unmarshaller[HttpEntity, QueryResponse[Out]]): Source[Out, CompletionStage[QueryResponse[Out]]]

    Runs a BigQuery SQL query.

    Out

    the data model of the query results

    query

    a query string, following the BigQuery query syntax, of the query to execute

    dryRun

    if set to true, BigQuery doesn't run the job and instead returns statistics about the job, such as how many bytes would be processed

    useLegacySql

    specifies whether to use BigQuery's legacy SQL dialect for this query

    unmarshaller

    akka.http.javadsl.unmarshalling.Unmarshaller for akka.stream.alpakka.googlecloud.bigquery.model.QueryResponse

    returns

    an akka.stream.javadsl.Source that emits an Out for each row of the results and materializes a java.util.concurrent.CompletionStage containing the akka.stream.alpakka.googlecloud.bigquery.model.QueryResponse

    See also

    BigQuery reference
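
    A sketch of running a standard SQL query and collecting the rows (Java; Row, rowUnmarshaller and the SQL text are assumptions):

        Source<Row, CompletionStage<QueryResponse<Row>>> results =
            BigQuery.query(
                "SELECT name, value FROM my_dataset.my_table", // hypothetical standard SQL
                false,  // dryRun
                false,  // useLegacySql
                rowUnmarshaller);

        CompletionStage<List<Row>> rows = results.runWith(Sink.seq(), system);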

  37. final def resumableUpload[Out](request: HttpRequest, unmarshaller: Unmarshaller[HttpResponse, Out]): Sink[ByteString, CompletionStage[Out]]

    Makes a series of requests to upload a stream of bytes to a media endpoint. Authentication is handled automatically. If the unmarshaller throws an akka.stream.alpakka.google.util.Retry, the upload will attempt to recover and continue.

    Out

    the data model for the resource

    request

    the akka.http.javadsl.model.HttpRequest to initiate the upload; must be a POST request with the query parameter uploadType=resumable and optionally an akka.stream.alpakka.google.javadsl.XUploadContentType header

    returns

    an akka.stream.javadsl.Sink that materializes a java.util.concurrent.CompletionStage containing the unmarshalled resource

    Definition Classes
    Google
  38. final def singleRequest[T](request: HttpRequest, unmarshaller: Unmarshaller[HttpResponse, T], settings: GoogleSettings, system: ClassicActorSystemProvider): CompletionStage[T]

    Makes a request and returns the unmarshalled response. Authentication is handled automatically. Retries the request if the unmarshaller throws an akka.stream.alpakka.google.util.Retry.

    T

    the data model for the resource

    request

    the akka.http.javadsl.model.HttpRequest to make

    returns

    a java.util.concurrent.CompletionStage containing the unmarshalled response

    Definition Classes
    Google
  39. final def synchronized[T0](arg0: => T0): T0
    Definition Classes
    AnyRef
  40. def toString(): String
    Definition Classes
    AnyRef → Any
  41. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException])
  42. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException])
  43. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException]) @native()
