diff --git a/.github/CODEOWNERS b/.github/CODEOWNERS
index aef889d3d..6137bef2a 100644
--- a/.github/CODEOWNERS
+++ b/.github/CODEOWNERS
@@ -5,4 +5,4 @@
# https://help.github.com/en/github/creating-cloning-and-archiving-repositories/about-code-owners#codeowners-syntax
# The java-samples-reviewers team is the default owner for samples changes
-samples/**/*.java @stephaniewang526 @googleapis/java-samples-reviewers
+samples/**/*.java @googleapis/java-samples-reviewers
diff --git a/.kokoro/build.sh b/.kokoro/build.sh
index b1abad023..c4b20b9f9 100755
--- a/.kokoro/build.sh
+++ b/.kokoro/build.sh
@@ -51,9 +51,7 @@ test)
RETURN_CODE=$?
;;
lint)
- mvn \
- -Penable-samples \
- com.coveo:fmt-maven-plugin:check
+ mvn com.coveo:fmt-maven-plugin:check
RETURN_CODE=$?
;;
javadoc)
diff --git a/CHANGELOG.md b/CHANGELOG.md
index 510606eea..be9f692b6 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -1,5 +1,25 @@
# Changelog
+### [1.116.2](https://www.github.com/googleapis/java-bigquery/compare/v1.116.1...v1.116.2) (2020-06-09)
+
+
+### Documentation
+
+* **samples:** add load CSV from GCS sample ([#426](https://www.github.com/googleapis/java-bigquery/issues/426)) ([3810366](https://www.github.com/googleapis/java-bigquery/commit/3810366451097a7f14db11504103865540ac242a))
+* **samples:** add load CSV from GCS to overwrite table sample ([#428](https://www.github.com/googleapis/java-bigquery/issues/428)) ([21a3606](https://www.github.com/googleapis/java-bigquery/commit/21a3606f5fb65287f808b12a6fef65817c8a8ba6))
+* **samples:** add update table using dml query sample ([#424](https://www.github.com/googleapis/java-bigquery/issues/424)) ([3902ba1](https://www.github.com/googleapis/java-bigquery/commit/3902ba1cf0d8a88d3e6f30b6606067503487c77d)), closes [#413](https://www.github.com/googleapis/java-bigquery/issues/413)
+* **samples:** added copy table and accompanying test ([#414](https://www.github.com/googleapis/java-bigquery/issues/414)) ([de0d97f](https://www.github.com/googleapis/java-bigquery/commit/de0d97f2f940c9cf507d19c5595e1a0e819ef19c))
+* **samples:** added extract to json and accompanying test ([#416](https://www.github.com/googleapis/java-bigquery/issues/416)) ([16a956d](https://www.github.com/googleapis/java-bigquery/commit/16a956db0aa545df84f7885ffb4425460cf55a16))
+* **samples:** adding browse table sample and test ([#422](https://www.github.com/googleapis/java-bigquery/issues/422)) ([dff4e5f](https://www.github.com/googleapis/java-bigquery/commit/dff4e5f86764b1c779c2ef131182483e2ffa1c1b))
+* **samples:** adding destination query sample and test ([#418](https://www.github.com/googleapis/java-bigquery/issues/418)) ([0f50961](https://www.github.com/googleapis/java-bigquery/commit/0f50961aaf1092f3ecc4e02fa9cebb50f6d45e90))
+* **samples:** adding simple query example for completeness ([#417](https://www.github.com/googleapis/java-bigquery/issues/417)) ([59426df](https://www.github.com/googleapis/java-bigquery/commit/59426df912f743b7927deb562366b625aba6f087))
+* **samples:** rename extract table json to extract table csv ([#415](https://www.github.com/googleapis/java-bigquery/issues/415)) ([c1f21e6](https://www.github.com/googleapis/java-bigquery/commit/c1f21e6c16df40bb3c71610f9d5b4fb4855b28fb))
+
+
+### Dependencies
+
+* update dependency com.google.apis:google-api-services-bigquery to v2-rev20200523-1.30.9 ([#409](https://www.github.com/googleapis/java-bigquery/issues/409)) ([d30c823](https://www.github.com/googleapis/java-bigquery/commit/d30c823c5a604b195f17d8ac33894107cdee967e))
+
### [1.116.1](https://www.github.com/googleapis/java-bigquery/compare/v1.116.0...v1.116.1) (2020-06-01)
diff --git a/README.md b/README.md
index a74ce77b3..595ea6dfa 100644
--- a/README.md
+++ b/README.md
@@ -40,7 +40,7 @@ If you are using Maven without BOM, add this to your dependencies:
com.google.cloud
google-cloud-bigquery
- 1.116.0
+ 1.116.1
```
@@ -49,11 +49,11 @@ If you are using Maven without BOM, add this to your dependencies:
If you are using Gradle, add this to your dependencies
```Groovy
-compile 'com.google.cloud:google-cloud-bigquery:1.116.1'
+compile 'com.google.cloud:google-cloud-bigquery:1.116.2'
```
If you are using SBT, add this to your dependencies
```Scala
-libraryDependencies += "com.google.cloud" % "google-cloud-bigquery" % "1.116.1"
+libraryDependencies += "com.google.cloud" % "google-cloud-bigquery" % "1.116.2"
```
[//]: # ({x-version-update-end})
@@ -204,13 +204,16 @@ has instructions for running the samples.
| Add Empty Column | [source code](https://github.com/googleapis/java-bigquery/blob/master/samples/snippets/src/main/java/com/example/bigquery/AddEmptyColumn.java) | [![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/java-bigquery&page=editor&open_in_editor=samples/snippets/src/main/java/com/example/bigquery/AddEmptyColumn.java) |
| Auth Drive Scope | [source code](https://github.com/googleapis/java-bigquery/blob/master/samples/snippets/src/main/java/com/example/bigquery/AuthDriveScope.java) | [![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/java-bigquery&page=editor&open_in_editor=samples/snippets/src/main/java/com/example/bigquery/AuthDriveScope.java) |
| Auth Snippets | [source code](https://github.com/googleapis/java-bigquery/blob/master/samples/snippets/src/main/java/com/example/bigquery/AuthSnippets.java) | [![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/java-bigquery&page=editor&open_in_editor=samples/snippets/src/main/java/com/example/bigquery/AuthSnippets.java) |
+| Browse Table | [source code](https://github.com/googleapis/java-bigquery/blob/master/samples/snippets/src/main/java/com/example/bigquery/BrowseTable.java) | [![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/java-bigquery&page=editor&open_in_editor=samples/snippets/src/main/java/com/example/bigquery/BrowseTable.java) |
| Copy Multiple Tables | [source code](https://github.com/googleapis/java-bigquery/blob/master/samples/snippets/src/main/java/com/example/bigquery/CopyMultipleTables.java) | [![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/java-bigquery&page=editor&open_in_editor=samples/snippets/src/main/java/com/example/bigquery/CopyMultipleTables.java) |
+| Copy Table | [source code](https://github.com/googleapis/java-bigquery/blob/master/samples/snippets/src/main/java/com/example/bigquery/CopyTable.java) | [![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/java-bigquery&page=editor&open_in_editor=samples/snippets/src/main/java/com/example/bigquery/CopyTable.java) |
| Create Clustered Table | [source code](https://github.com/googleapis/java-bigquery/blob/master/samples/snippets/src/main/java/com/example/bigquery/CreateClusteredTable.java) | [![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/java-bigquery&page=editor&open_in_editor=samples/snippets/src/main/java/com/example/bigquery/CreateClusteredTable.java) |
| Create Dataset | [source code](https://github.com/googleapis/java-bigquery/blob/master/samples/snippets/src/main/java/com/example/bigquery/CreateDataset.java) | [![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/java-bigquery&page=editor&open_in_editor=samples/snippets/src/main/java/com/example/bigquery/CreateDataset.java) |
| Create Partitioned Table | [source code](https://github.com/googleapis/java-bigquery/blob/master/samples/snippets/src/main/java/com/example/bigquery/CreatePartitionedTable.java) | [![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/java-bigquery&page=editor&open_in_editor=samples/snippets/src/main/java/com/example/bigquery/CreatePartitionedTable.java) |
| Create Table | [source code](https://github.com/googleapis/java-bigquery/blob/master/samples/snippets/src/main/java/com/example/bigquery/CreateTable.java) | [![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/java-bigquery&page=editor&open_in_editor=samples/snippets/src/main/java/com/example/bigquery/CreateTable.java) |
| Delete Dataset | [source code](https://github.com/googleapis/java-bigquery/blob/master/samples/snippets/src/main/java/com/example/bigquery/DeleteDataset.java) | [![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/java-bigquery&page=editor&open_in_editor=samples/snippets/src/main/java/com/example/bigquery/DeleteDataset.java) |
| Delete Table | [source code](https://github.com/googleapis/java-bigquery/blob/master/samples/snippets/src/main/java/com/example/bigquery/DeleteTable.java) | [![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/java-bigquery&page=editor&open_in_editor=samples/snippets/src/main/java/com/example/bigquery/DeleteTable.java) |
+| Extract Table To Csv | [source code](https://github.com/googleapis/java-bigquery/blob/master/samples/snippets/src/main/java/com/example/bigquery/ExtractTableToCsv.java) | [![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/java-bigquery&page=editor&open_in_editor=samples/snippets/src/main/java/com/example/bigquery/ExtractTableToCsv.java) |
| Extract Table To Json | [source code](https://github.com/googleapis/java-bigquery/blob/master/samples/snippets/src/main/java/com/example/bigquery/ExtractTableToJson.java) | [![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/java-bigquery&page=editor&open_in_editor=samples/snippets/src/main/java/com/example/bigquery/ExtractTableToJson.java) |
| Get Dataset Info | [source code](https://github.com/googleapis/java-bigquery/blob/master/samples/snippets/src/main/java/com/example/bigquery/GetDatasetInfo.java) | [![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/java-bigquery&page=editor&open_in_editor=samples/snippets/src/main/java/com/example/bigquery/GetDatasetInfo.java) |
| Get Model | [source code](https://github.com/googleapis/java-bigquery/blob/master/samples/snippets/src/main/java/com/example/bigquery/GetModel.java) | [![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/java-bigquery&page=editor&open_in_editor=samples/snippets/src/main/java/com/example/bigquery/GetModel.java) |
@@ -218,6 +221,8 @@ has instructions for running the samples.
| List Datasets | [source code](https://github.com/googleapis/java-bigquery/blob/master/samples/snippets/src/main/java/com/example/bigquery/ListDatasets.java) | [![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/java-bigquery&page=editor&open_in_editor=samples/snippets/src/main/java/com/example/bigquery/ListDatasets.java) |
| List Models | [source code](https://github.com/googleapis/java-bigquery/blob/master/samples/snippets/src/main/java/com/example/bigquery/ListModels.java) | [![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/java-bigquery&page=editor&open_in_editor=samples/snippets/src/main/java/com/example/bigquery/ListModels.java) |
| List Tables | [source code](https://github.com/googleapis/java-bigquery/blob/master/samples/snippets/src/main/java/com/example/bigquery/ListTables.java) | [![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/java-bigquery&page=editor&open_in_editor=samples/snippets/src/main/java/com/example/bigquery/ListTables.java) |
+| Load Csv From Gcs | [source code](https://github.com/googleapis/java-bigquery/blob/master/samples/snippets/src/main/java/com/example/bigquery/LoadCsvFromGcs.java) | [![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/java-bigquery&page=editor&open_in_editor=samples/snippets/src/main/java/com/example/bigquery/LoadCsvFromGcs.java) |
+| Load Csv From Gcs Truncate | [source code](https://github.com/googleapis/java-bigquery/blob/master/samples/snippets/src/main/java/com/example/bigquery/LoadCsvFromGcsTruncate.java) | [![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/java-bigquery&page=editor&open_in_editor=samples/snippets/src/main/java/com/example/bigquery/LoadCsvFromGcsTruncate.java) |
| Load Local File | [source code](https://github.com/googleapis/java-bigquery/blob/master/samples/snippets/src/main/java/com/example/bigquery/LoadLocalFile.java) | [![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/java-bigquery&page=editor&open_in_editor=samples/snippets/src/main/java/com/example/bigquery/LoadLocalFile.java) |
| Load Parquet | [source code](https://github.com/googleapis/java-bigquery/blob/master/samples/snippets/src/main/java/com/example/bigquery/LoadParquet.java) | [![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/java-bigquery&page=editor&open_in_editor=samples/snippets/src/main/java/com/example/bigquery/LoadParquet.java) |
| Load Parquet Replace Table | [source code](https://github.com/googleapis/java-bigquery/blob/master/samples/snippets/src/main/java/com/example/bigquery/LoadParquetReplaceTable.java) | [![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/java-bigquery&page=editor&open_in_editor=samples/snippets/src/main/java/com/example/bigquery/LoadParquetReplaceTable.java) |
@@ -232,11 +237,14 @@ has instructions for running the samples.
| Relax Column Mode | [source code](https://github.com/googleapis/java-bigquery/blob/master/samples/snippets/src/main/java/com/example/bigquery/RelaxColumnMode.java) | [![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/java-bigquery&page=editor&open_in_editor=samples/snippets/src/main/java/com/example/bigquery/RelaxColumnMode.java) |
| Relax Table Query | [source code](https://github.com/googleapis/java-bigquery/blob/master/samples/snippets/src/main/java/com/example/bigquery/RelaxTableQuery.java) | [![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/java-bigquery&page=editor&open_in_editor=samples/snippets/src/main/java/com/example/bigquery/RelaxTableQuery.java) |
| Run Legacy Query | [source code](https://github.com/googleapis/java-bigquery/blob/master/samples/snippets/src/main/java/com/example/bigquery/RunLegacyQuery.java) | [![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/java-bigquery&page=editor&open_in_editor=samples/snippets/src/main/java/com/example/bigquery/RunLegacyQuery.java) |
+| Save Query To Table | [source code](https://github.com/googleapis/java-bigquery/blob/master/samples/snippets/src/main/java/com/example/bigquery/SaveQueryToTable.java) | [![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/java-bigquery&page=editor&open_in_editor=samples/snippets/src/main/java/com/example/bigquery/SaveQueryToTable.java) |
| Simple App | [source code](https://github.com/googleapis/java-bigquery/blob/master/samples/snippets/src/main/java/com/example/bigquery/SimpleApp.java) | [![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/java-bigquery&page=editor&open_in_editor=samples/snippets/src/main/java/com/example/bigquery/SimpleApp.java) |
+| Simple Query | [source code](https://github.com/googleapis/java-bigquery/blob/master/samples/snippets/src/main/java/com/example/bigquery/SimpleQuery.java) | [![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/java-bigquery&page=editor&open_in_editor=samples/snippets/src/main/java/com/example/bigquery/SimpleQuery.java) |
| Table Insert Rows | [source code](https://github.com/googleapis/java-bigquery/blob/master/samples/snippets/src/main/java/com/example/bigquery/TableInsertRows.java) | [![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/java-bigquery&page=editor&open_in_editor=samples/snippets/src/main/java/com/example/bigquery/TableInsertRows.java) |
| Update Dataset Access | [source code](https://github.com/googleapis/java-bigquery/blob/master/samples/snippets/src/main/java/com/example/bigquery/UpdateDatasetAccess.java) | [![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/java-bigquery&page=editor&open_in_editor=samples/snippets/src/main/java/com/example/bigquery/UpdateDatasetAccess.java) |
| Update Dataset Description | [source code](https://github.com/googleapis/java-bigquery/blob/master/samples/snippets/src/main/java/com/example/bigquery/UpdateDatasetDescription.java) | [![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/java-bigquery&page=editor&open_in_editor=samples/snippets/src/main/java/com/example/bigquery/UpdateDatasetDescription.java) |
| Update Dataset Expiration | [source code](https://github.com/googleapis/java-bigquery/blob/master/samples/snippets/src/main/java/com/example/bigquery/UpdateDatasetExpiration.java) | [![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/java-bigquery&page=editor&open_in_editor=samples/snippets/src/main/java/com/example/bigquery/UpdateDatasetExpiration.java) |
+| Update Table DML | [source code](https://github.com/googleapis/java-bigquery/blob/master/samples/snippets/src/main/java/com/example/bigquery/UpdateTableDML.java) | [![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/java-bigquery&page=editor&open_in_editor=samples/snippets/src/main/java/com/example/bigquery/UpdateTableDML.java) |
| Update Table Description | [source code](https://github.com/googleapis/java-bigquery/blob/master/samples/snippets/src/main/java/com/example/bigquery/UpdateTableDescription.java) | [![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/java-bigquery&page=editor&open_in_editor=samples/snippets/src/main/java/com/example/bigquery/UpdateTableDescription.java) |
| Update Table Expiration | [source code](https://github.com/googleapis/java-bigquery/blob/master/samples/snippets/src/main/java/com/example/bigquery/UpdateTableExpiration.java) | [![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/java-bigquery&page=editor&open_in_editor=samples/snippets/src/main/java/com/example/bigquery/UpdateTableExpiration.java) |
diff --git a/google-cloud-bigquery/pom.xml b/google-cloud-bigquery/pom.xml
index b1f467a26..6a8352473 100644
--- a/google-cloud-bigquery/pom.xml
+++ b/google-cloud-bigquery/pom.xml
@@ -3,7 +3,7 @@
4.0.0
com.google.cloud
google-cloud-bigquery
- 1.116.1
+ 1.116.2
jar
BigQuery
https://github.com/googleapis/java-bigquery
@@ -11,7 +11,7 @@
com.google.cloud
google-cloud-bigquery-parent
- 1.116.1
+ 1.116.2
google-cloud-bigquery
diff --git a/pom.xml b/pom.xml
index 8fdc5aff1..76269f9ca 100644
--- a/pom.xml
+++ b/pom.xml
@@ -4,7 +4,7 @@
com.google.cloud
google-cloud-bigquery-parent
pom
- 1.116.1
+ 1.116.2
BigQuery Parent
https://github.com/googleapis/java-bigquery
@@ -63,7 +63,7 @@
UTF-8
github
google-cloud-bigquery-parent
- v2-rev20200429-1.30.9
+ v2-rev20200523-1.30.9
@@ -86,7 +86,7 @@
com.google.cloud
google-cloud-bigquery
- 1.116.1
+ 1.116.2
@@ -143,7 +143,7 @@
org.apache.maven.plugins
maven-project-info-reports-plugin
- 3.0.0
+ 3.1.0
diff --git a/renovate.json b/renovate.json
index a65d0adb2..034f3c099 100644
--- a/renovate.json
+++ b/renovate.json
@@ -76,5 +76,6 @@
"groupName": "jackson dependencies"
}
],
- "semanticCommits": true
+ "semanticCommits": true,
+ "masterIssue": true
}
\ No newline at end of file
diff --git a/samples/install-without-bom/pom.xml b/samples/install-without-bom/pom.xml
index 0e2c300b8..5233a01d9 100644
--- a/samples/install-without-bom/pom.xml
+++ b/samples/install-without-bom/pom.xml
@@ -45,7 +45,7 @@
com.google.cloud
google-cloud-bigquery
- 1.116.0
+ 1.116.1
diff --git a/samples/install-without-bom/src/test/resources/userSessionsData.json b/samples/install-without-bom/src/test/resources/userSessionsData.json
new file mode 100644
index 000000000..042ac3737
--- /dev/null
+++ b/samples/install-without-bom/src/test/resources/userSessionsData.json
@@ -0,0 +1,10 @@
+{"id":"2ad525d6-c832-4c3d-b7fe-59d104885519","user_id":"38","login_time":"1.47766087E9","logout_time":"1.477661109E9","ip_address":"192.0.2.12"}
+{"id":"53d65e20-6ea9-4650-98d9-a2111fbd1122","user_id":"88","login_time":"1.47707544E9","logout_time":"1.477075519E9","ip_address":"192.0.2.88"}
+{"id":"5e6c3021-d5e7-4ccd-84b2-adfa9176d13d","user_id":"39","login_time":"1.474022869E9","logout_time":"1.474022961E9","ip_address":"203.0.113.52"}
+{"id":"6196eefa-1498-4567-8ef0-498845b888d9","user_id":"52","login_time":"1.478604612E9","logout_time":"1.478604691E9","ip_address":"203.0.113.169"}
+{"id":"70656dc5-7e0f-49cf-9e00-f06ed93c1f5b","user_id":"46","login_time":"1.474089924E9","logout_time":"1.474090227E9","ip_address":"192.0.2.10"}
+{"id":"aafa5eef-ad49-49a7-9a0f-fbc7fd639bd3","user_id":"40","login_time":"1.478031161E9","logout_time":"1.478031388E9","ip_address":"203.0.113.18"}
+{"id":"d2792fc2-24dd-4260-9456-3fbe6cdfdd90","user_id":"5","login_time":"1.481259081E9","logout_time":"1.481259247E9","ip_address":"192.0.2.140"}
+{"id":"d835dc49-32f9-4790-b4eb-dddee62e0dcc","user_id":"62","login_time":"1.478892977E9","logout_time":"1.478893219E9","ip_address":"203.0.113.83"}
+{"id":"f4a0d3c7-351f-471c-8e11-e093e7a6ce75","user_id":"89","login_time":"1.459031555E9","logout_time":"1.459031831E9","ip_address":"203.0.113.233"}
+{"id":"f6e9f526-5b22-4679-9c3e-56a636e815bb","user_id":"97","login_time":"1.482426034E9","logout_time":"1.482426415E9","ip_address":"203.0.113.167"}
\ No newline at end of file
diff --git a/samples/snapshot/pom.xml b/samples/snapshot/pom.xml
index f84fbf44b..58b94a0c6 100644
--- a/samples/snapshot/pom.xml
+++ b/samples/snapshot/pom.xml
@@ -44,7 +44,7 @@
com.google.cloud
google-cloud-bigquery
- 1.116.1
+ 1.116.2
diff --git a/samples/snapshot/src/test/resources/userSessionsData.json b/samples/snapshot/src/test/resources/userSessionsData.json
new file mode 100644
index 000000000..042ac3737
--- /dev/null
+++ b/samples/snapshot/src/test/resources/userSessionsData.json
@@ -0,0 +1,10 @@
+{"id":"2ad525d6-c832-4c3d-b7fe-59d104885519","user_id":"38","login_time":"1.47766087E9","logout_time":"1.477661109E9","ip_address":"192.0.2.12"}
+{"id":"53d65e20-6ea9-4650-98d9-a2111fbd1122","user_id":"88","login_time":"1.47707544E9","logout_time":"1.477075519E9","ip_address":"192.0.2.88"}
+{"id":"5e6c3021-d5e7-4ccd-84b2-adfa9176d13d","user_id":"39","login_time":"1.474022869E9","logout_time":"1.474022961E9","ip_address":"203.0.113.52"}
+{"id":"6196eefa-1498-4567-8ef0-498845b888d9","user_id":"52","login_time":"1.478604612E9","logout_time":"1.478604691E9","ip_address":"203.0.113.169"}
+{"id":"70656dc5-7e0f-49cf-9e00-f06ed93c1f5b","user_id":"46","login_time":"1.474089924E9","logout_time":"1.474090227E9","ip_address":"192.0.2.10"}
+{"id":"aafa5eef-ad49-49a7-9a0f-fbc7fd639bd3","user_id":"40","login_time":"1.478031161E9","logout_time":"1.478031388E9","ip_address":"203.0.113.18"}
+{"id":"d2792fc2-24dd-4260-9456-3fbe6cdfdd90","user_id":"5","login_time":"1.481259081E9","logout_time":"1.481259247E9","ip_address":"192.0.2.140"}
+{"id":"d835dc49-32f9-4790-b4eb-dddee62e0dcc","user_id":"62","login_time":"1.478892977E9","logout_time":"1.478893219E9","ip_address":"203.0.113.83"}
+{"id":"f4a0d3c7-351f-471c-8e11-e093e7a6ce75","user_id":"89","login_time":"1.459031555E9","logout_time":"1.459031831E9","ip_address":"203.0.113.233"}
+{"id":"f6e9f526-5b22-4679-9c3e-56a636e815bb","user_id":"97","login_time":"1.482426034E9","logout_time":"1.482426415E9","ip_address":"203.0.113.167"}
\ No newline at end of file
diff --git a/samples/snippets/pom.xml b/samples/snippets/pom.xml
index 6b6788b2e..67bb701ec 100644
--- a/samples/snippets/pom.xml
+++ b/samples/snippets/pom.xml
@@ -44,7 +44,7 @@
com.google.cloud
libraries-bom
- 5.5.0
+ 5.6.0
pom
import
diff --git a/samples/snippets/src/main/java/com/example/bigquery/AddColumnLoadAppend.java b/samples/snippets/src/main/java/com/example/bigquery/AddColumnLoadAppend.java
index 932c27d69..a8b22ed8b 100644
--- a/samples/snippets/src/main/java/com/example/bigquery/AddColumnLoadAppend.java
+++ b/samples/snippets/src/main/java/com/example/bigquery/AddColumnLoadAppend.java
@@ -47,14 +47,15 @@ public static void runAddColumnLoadAppend() throws Exception {
// 'REQUIRED' fields cannot be added to an existing schema, so the additional column must be
// 'NULLABLE'.
Schema schema =
- Schema.of(
- Field.newBuilder("name", LegacySQLTypeName.STRING)
+ Schema.of(
+ Field.newBuilder("name", LegacySQLTypeName.STRING)
.setMode(Field.Mode.REQUIRED)
.build());
List<Field> fields = schema.getFields();
// Adding below additional column during the load job
- Field newField = Field.newBuilder("post_abbr", LegacySQLTypeName.STRING)
+ Field newField =
+ Field.newBuilder("post_abbr", LegacySQLTypeName.STRING)
.setMode(Field.Mode.NULLABLE)
.build();
List<Field> newFields = new ArrayList<>(fields);
@@ -63,8 +64,8 @@ public static void runAddColumnLoadAppend() throws Exception {
addColumnLoadAppend(datasetName, tableName, sourceUri, newSchema);
}
- public static void addColumnLoadAppend(String datasetName, String tableName,
- String sourceUri, Schema newSchema) throws Exception {
+ public static void addColumnLoadAppend(
+ String datasetName, String tableName, String sourceUri, Schema newSchema) throws Exception {
try {
// Initialize client that will be used to send requests. This client only needs to be created
// once, and can be reused for multiple requests.
diff --git a/samples/snippets/src/main/java/com/example/bigquery/BrowseTable.java b/samples/snippets/src/main/java/com/example/bigquery/BrowseTable.java
new file mode 100644
index 000000000..518067e77
--- /dev/null
+++ b/samples/snippets/src/main/java/com/example/bigquery/BrowseTable.java
@@ -0,0 +1,64 @@
+/*
+ * Copyright 2020 Google LLC
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package com.example.bigquery;
+
+// [START bigquery_browse_table]
+import com.google.cloud.bigquery.BigQuery;
+import com.google.cloud.bigquery.BigQuery.TableDataListOption;
+import com.google.cloud.bigquery.BigQueryException;
+import com.google.cloud.bigquery.BigQueryOptions;
+import com.google.cloud.bigquery.TableId;
+import com.google.cloud.bigquery.TableResult;
+
+// Sample to directly browse a table with optional paging
+public class BrowseTable {
+
+ public static void runBrowseTable() {
+ // TODO(developer): Replace these variables before running the sample.
+ String table = "MY_TABLE_NAME";
+ String dataset = "MY_DATASET_NAME";
+ browseTable(dataset, table);
+ }
+
+ public static void browseTable(String dataset, String table) {
+ try {
+ // Initialize client that will be used to send requests. This client only needs to be created
+ // once, and can be reused for multiple requests.
+ BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();
+
+ // Identify the table itself
+ TableId tableId = TableId.of(dataset, table);
+
+ // Page over 100 records. If you don't need pagination, remove the pageSize parameter.
+ TableResult result = bigquery.listTableData(tableId, TableDataListOption.pageSize(100));
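+ // iterateAll() below transparently requests further pages as needed, so pageSize only
+ // controls how many rows are fetched per request.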
+
+ // Print the records
+ result
+ .iterateAll()
+ .forEach(
+ row -> {
+ row.forEach(fieldValue -> System.out.print(fieldValue.toString() + ", "));
+ System.out.println();
+ });
+
+ System.out.println("Query ran successfully");
+ } catch (BigQueryException e) {
+ System.out.println("Query failed to run \n" + e.toString());
+ }
+ }
+}
+// [END bigquery_browse_table]
diff --git a/samples/snippets/src/main/java/com/example/bigquery/CopyTable.java b/samples/snippets/src/main/java/com/example/bigquery/CopyTable.java
new file mode 100644
index 000000000..9ac960f79
--- /dev/null
+++ b/samples/snippets/src/main/java/com/example/bigquery/CopyTable.java
@@ -0,0 +1,78 @@
+/*
+ * Copyright 2020 Google LLC
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package com.example.bigquery;
+
+// [START bigquery_copy_table]
+import com.google.cloud.bigquery.BigQuery;
+import com.google.cloud.bigquery.BigQueryException;
+import com.google.cloud.bigquery.BigQueryOptions;
+import com.google.cloud.bigquery.CopyJobConfiguration;
+import com.google.cloud.bigquery.Job;
+import com.google.cloud.bigquery.JobInfo;
+import com.google.cloud.bigquery.TableId;
+
+public class CopyTable {
+
+ public static void runCopyTable() {
+ // TODO(developer): Replace these variables before running the sample.
+ String destinationDatasetName = "MY_DESTINATION_DATASET_NAME";
+ String destinationTableId = "MY_DESTINATION_TABLE_NAME";
+ String sourceDatasetName = "MY_SOURCE_DATASET_NAME";
+ String sourceTableId = "MY_SOURCE_TABLE_NAME";
+
+ copyTable(sourceDatasetName, sourceTableId, destinationDatasetName, destinationTableId);
+ }
+
+ public static void copyTable(
+ String sourceDatasetName,
+ String sourceTableId,
+ String destinationDatasetName,
+ String destinationTableId) {
+ try {
+ // Initialize client that will be used to send requests. This client only needs to be created
+ // once, and can be reused for multiple requests.
+ BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();
+
+ TableId sourceTable = TableId.of(sourceDatasetName, sourceTableId);
+ TableId destinationTable = TableId.of(destinationDatasetName, destinationTableId);
+
+ // For more information on CopyJobConfiguration see:
+ // https://googleapis.dev/java/google-cloud-clients/latest/com/google/cloud/bigquery/JobConfiguration.html
+ CopyJobConfiguration configuration =
+ CopyJobConfiguration.newBuilder(destinationTable, sourceTable).build();
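+ // With the default dispositions, the copy creates the destination table if it does not
+ // exist and fails (WRITE_EMPTY) if it already contains data.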
+
+ // For more information on Job see:
+ // https://googleapis.dev/java/google-cloud-clients/latest/index.html?com/google/cloud/bigquery/package-summary.html
+ Job job = bigquery.create(JobInfo.of(configuration));
+
+ // Blocks until this job completes its execution, either failing or succeeding.
+ Job completedJob = job.waitFor();
+ if (completedJob == null) {
+ System.out.println("Job not executed since it no longer exists.");
+ return;
+ } else if (completedJob.getStatus().getError() != null) {
+ System.out.println(
+ "BigQuery was unable to copy table due to an error: \n" + job.getStatus().getError());
+ return;
+ }
+ System.out.println("Table copied successfully.");
+ } catch (BigQueryException | InterruptedException e) {
+ System.out.println("Table copying job was interrupted. \n" + e.toString());
+ }
+ }
+}
+// [END bigquery_copy_table]
diff --git a/samples/snippets/src/main/java/com/example/bigquery/CreateClusteredTable.java b/samples/snippets/src/main/java/com/example/bigquery/CreateClusteredTable.java
index 27a0e144f..602fbfedd 100644
--- a/samples/snippets/src/main/java/com/example/bigquery/CreateClusteredTable.java
+++ b/samples/snippets/src/main/java/com/example/bigquery/CreateClusteredTable.java
@@ -37,17 +37,15 @@ public static void runCreateClusteredTable() {
String datasetName = "MY_DATASET_NAME";
String tableName = "MY_TABLE_NAME";
Schema schema =
- Schema.of(
- Field.of("name", StandardSQLTypeName.STRING),
- Field.of("post_abbr", StandardSQLTypeName.STRING),
- Field.of("date", StandardSQLTypeName.DATE));
- createClusteredTable(datasetName, tableName,
- schema, ImmutableList.of("name", "post_abbr"));
+ Schema.of(
+ Field.of("name", StandardSQLTypeName.STRING),
+ Field.of("post_abbr", StandardSQLTypeName.STRING),
+ Field.of("date", StandardSQLTypeName.DATE));
+ createClusteredTable(datasetName, tableName, schema, ImmutableList.of("name", "post_abbr"));
}
public static void createClusteredTable(
- String datasetName, String tableName,
- Schema schema, List<String> clusteringFields) {
+ String datasetName, String tableName, Schema schema, List<String> clusteringFields) {
try {
// Initialize client that will be used to send requests. This client only needs to be created
// once, and can be reused for multiple requests.
@@ -58,8 +56,7 @@ public static void createClusteredTable(
TimePartitioning partitioning = TimePartitioning.of(TimePartitioning.Type.DAY);
// Clustering fields must be fields that are defined in the table schema.
// As of now, the table must also be partitioned for clustering to be applied.
- Clustering clustering =
- Clustering.newBuilder().setFields(clusteringFields).build();
+ Clustering clustering = Clustering.newBuilder().setFields(clusteringFields).build();
StandardTableDefinition tableDefinition =
StandardTableDefinition.newBuilder()
diff --git a/samples/snippets/src/main/java/com/example/bigquery/CreatePartitionedTable.java b/samples/snippets/src/main/java/com/example/bigquery/CreatePartitionedTable.java
index 62a51c669..1279d65ed 100644
--- a/samples/snippets/src/main/java/com/example/bigquery/CreatePartitionedTable.java
+++ b/samples/snippets/src/main/java/com/example/bigquery/CreatePartitionedTable.java
@@ -35,10 +35,10 @@ public static void runCreatePartitionedTable() {
String datasetName = "MY_DATASET_NAME";
String tableName = "MY_TABLE_NAME";
Schema schema =
- Schema.of(
- Field.of("stringField", StandardSQLTypeName.STRING),
- Field.of("booleanField", StandardSQLTypeName.BOOL),
- Field.of("dateField", StandardSQLTypeName.DATE));
+ Schema.of(
+ Field.of("stringField", StandardSQLTypeName.STRING),
+ Field.of("booleanField", StandardSQLTypeName.BOOL),
+ Field.of("dateField", StandardSQLTypeName.DATE));
createPartitionedTable(datasetName, tableName, schema);
}
diff --git a/samples/snippets/src/main/java/com/example/bigquery/ExtractTableToCsv.java b/samples/snippets/src/main/java/com/example/bigquery/ExtractTableToCsv.java
new file mode 100644
index 000000000..6c6701f84
--- /dev/null
+++ b/samples/snippets/src/main/java/com/example/bigquery/ExtractTableToCsv.java
@@ -0,0 +1,84 @@
+/*
+ * Copyright 2019 Google LLC
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package com.example.bigquery;
+
+// [START bigquery_extract_table]
+import com.google.cloud.RetryOption;
+import com.google.cloud.bigquery.BigQuery;
+import com.google.cloud.bigquery.BigQueryException;
+import com.google.cloud.bigquery.BigQueryOptions;
+import com.google.cloud.bigquery.Job;
+import com.google.cloud.bigquery.Table;
+import com.google.cloud.bigquery.TableId;
+import org.threeten.bp.Duration;
+
+public class ExtractTableToCsv {
+
+ public static void runExtractTableToCsv() {
+ // TODO(developer): Replace these variables before running the sample.
+ String projectId = "bigquery-public-data";
+ String datasetName = "samples";
+ String tableName = "shakespeare";
+ String bucketName = "my-bucket";
+ String destinationUri = "gs://" + bucketName + "/path/to/file";
+ // For more information on export formats available see:
+ // https://cloud.google.com/bigquery/docs/exporting-data#export_formats_and_compression_types
+ // For more information on Job see:
+ // https://googleapis.dev/java/google-cloud-clients/latest/index.html?com/google/cloud/bigquery/package-summary.html
+
+ String dataFormat = "CSV";
+ extractTableToCsv(projectId, datasetName, tableName, destinationUri, dataFormat);
+ }
+
+ // Exports datasetName:tableName to destinationUri as raw CSV
+ public static void extractTableToCsv(
+ String projectId,
+ String datasetName,
+ String tableName,
+ String destinationUri,
+ String dataFormat) {
+ try {
+ // Initialize client that will be used to send requests. This client only needs to be created
+ // once, and can be reused for multiple requests.
+ BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();
+
+ TableId tableId = TableId.of(projectId, datasetName, tableName);
+ Table table = bigquery.getTable(tableId);
+
+ Job job = table.extract(dataFormat, destinationUri);
+
+ // Blocks until this job completes its execution, either failing or succeeding.
+ Job completedJob =
+ job.waitFor(
+ RetryOption.initialRetryDelay(Duration.ofSeconds(1)),
+ RetryOption.totalTimeout(Duration.ofMinutes(3)));
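+ // These RetryOptions tune how waitFor polls the job: wait roughly one second before the
+ // first poll and stop waiting after a total of three minutes.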
+ if (completedJob == null) {
+ System.out.println("Job not executed since it no longer exists.");
+ return;
+ } else if (completedJob.getStatus().getError() != null) {
+ System.out.println(
+ "BigQuery was unable to extract due to an error: \n" + job.getStatus().getError());
+ return;
+ }
+ System.out.println(
+ "Table export successful. Check in GCS bucket for the " + dataFormat + " file.");
+ } catch (BigQueryException | InterruptedException e) {
+ System.out.println("Table extraction job was interrupted. \n" + e.toString());
+ }
+ }
+}
+// [END bigquery_extract_table]
diff --git a/samples/snippets/src/main/java/com/example/bigquery/ExtractTableToJson.java b/samples/snippets/src/main/java/com/example/bigquery/ExtractTableToJson.java
index 0f05cb20d..6db988703 100644
--- a/samples/snippets/src/main/java/com/example/bigquery/ExtractTableToJson.java
+++ b/samples/snippets/src/main/java/com/example/bigquery/ExtractTableToJson.java
@@ -1,5 +1,5 @@
/*
- * Copyright 2019 Google LLC
+ * Copyright 2020 Google LLC
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -16,11 +16,12 @@
package com.example.bigquery;
-// [START bigquery_extract_table]
+// [START bigquery_extract_table_json]
import com.google.cloud.RetryOption;
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryException;
import com.google.cloud.bigquery.BigQueryOptions;
+import com.google.cloud.bigquery.FormatOptions;
import com.google.cloud.bigquery.Job;
import com.google.cloud.bigquery.Table;
import com.google.cloud.bigquery.TableId;
@@ -40,13 +41,19 @@ public static void runExtractTableToJson() {
// For more information on Job see:
// https://googleapis.dev/java/google-cloud-clients/latest/index.html?com/google/cloud/bigquery/package-summary.html
- String dataFormat = "CSV";
+ // Note that FormatOptions.json().toString() is not "JSON" but "NEWLINE_DELIMITED_JSON"
+ // Using the FormatOptions helper here avoids passing an unexpected format name.
+ String dataFormat = FormatOptions.json().toString();
+
extractTableToJson(projectId, datasetName, tableName, destinationUri, dataFormat);
}
- // Exports datasetName:tableName to destinationUri as raw CSV
+ // Exports datasetName:tableName to destinationUri as a JSON file
public static void extractTableToJson(
- String projectId, String datasetName, String tableName, String destinationUri,
+ String projectId,
+ String datasetName,
+ String tableName,
+ String destinationUri,
String dataFormat) {
try {
// Initialize client that will be used to send requests. This client only needs to be created
@@ -71,10 +78,11 @@ public static void extractTableToJson(
"BigQuery was unable to extract due to an error: \n" + job.getStatus().getError());
return;
}
- System.out.println("Table export successful. Check in GCS bucket for the " + dataFormat + " file.");
+ System.out.println(
+ "Table export successful. Check in GCS bucket for the " + dataFormat + " file.");
} catch (BigQueryException | InterruptedException e) {
System.out.println("Table extraction job was interrupted. \n" + e.toString());
}
}
}
-// [END bigquery_extract_table]
+// [END bigquery_extract_table_json]
diff --git a/samples/snippets/src/main/java/com/example/bigquery/LoadCsvFromGcs.java b/samples/snippets/src/main/java/com/example/bigquery/LoadCsvFromGcs.java
new file mode 100644
index 000000000..95c63164c
--- /dev/null
+++ b/samples/snippets/src/main/java/com/example/bigquery/LoadCsvFromGcs.java
@@ -0,0 +1,68 @@
+/*
+ * Copyright 2020 Google LLC
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package com.example.bigquery;
+
+// [START bigquery_load_table_gcs_csv]
+import com.google.cloud.bigquery.BigQuery;
+import com.google.cloud.bigquery.BigQueryException;
+import com.google.cloud.bigquery.BigQueryOptions;
+import com.google.cloud.bigquery.FormatOptions;
+import com.google.cloud.bigquery.Job;
+import com.google.cloud.bigquery.Table;
+
+// Sample to load CSV data from Cloud Storage into a new BigQuery table
+public class LoadCsvFromGcs {
+
+ public static void runLoadCsvFromGcs() throws Exception {
+ // TODO(developer): Replace these variables before running the sample.
+ String datasetName = "MY_DATASET_NAME";
+ String tableName = "MY_TABLE_NAME";
+ String sourceUri = "gs://cloud-samples-data/bigquery/us-states/us-states.csv";
+ loadCsvFromGcs(datasetName, tableName, sourceUri);
+ }
+
+ public static void loadCsvFromGcs(String datasetName, String tableName, String sourceUri)
+ throws Exception {
+ try {
+ // Initialize client that will be used to send requests. This client only needs to be created
+ // once, and can be reused for multiple requests.
+ BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();
+
+ Table table = bigquery.getTable(datasetName, tableName);
+ Job loadJob = table.load(FormatOptions.csv(), sourceUri);
+
+ // Load data from the GCS CSV file into the table
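+ // Load jobs default to WRITE_APPEND, so the rows are appended to any existing table data.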
+ // Blocks until this load table job completes its execution, either failing or succeeding.
+ Job completedJob = loadJob.waitFor();
+
+ // Check for errors
+ if (completedJob == null) {
+ throw new Exception("Job not executed since it no longer exists.");
+ } else if (completedJob.getStatus().getError() != null) {
+ // You can also look at completedJob.getStatus().getExecutionErrors() for all
+ // errors, not just the latest one.
+ throw new Exception(
+ "BigQuery was unable to load into the table due to an error: \n"
+ + completedJob.getStatus().getError());
+ }
+ System.out.println("CSV from GCS successfully added during load append job");
+ } catch (BigQueryException | InterruptedException e) {
+ System.out.println("Column not added during load append \n" + e.toString());
+ }
+ }
+}
+// [END bigquery_load_table_gcs_csv]
diff --git a/samples/snippets/src/main/java/com/example/bigquery/LoadCsvFromGcsTruncate.java b/samples/snippets/src/main/java/com/example/bigquery/LoadCsvFromGcsTruncate.java
new file mode 100644
index 000000000..228d55854
--- /dev/null
+++ b/samples/snippets/src/main/java/com/example/bigquery/LoadCsvFromGcsTruncate.java
@@ -0,0 +1,82 @@
+/*
+ * Copyright 2020 Google LLC
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package com.example.bigquery;
+
+// [START bigquery_load_table_gcs_csv_truncate]
+import com.google.cloud.bigquery.BigQuery;
+import com.google.cloud.bigquery.BigQueryException;
+import com.google.cloud.bigquery.BigQueryOptions;
+import com.google.cloud.bigquery.FormatOptions;
+import com.google.cloud.bigquery.Job;
+import com.google.cloud.bigquery.JobInfo;
+import com.google.cloud.bigquery.JobInfo.WriteDisposition;
+import com.google.cloud.bigquery.LoadJobConfiguration;
+import com.google.cloud.bigquery.TableId;
+
+// Sample to overwrite the BigQuery table data by loading a CSV file from GCS
+public class LoadCsvFromGcsTruncate {
+
+ public static void runLoadCsvFromGcsTruncate() throws Exception {
+ // TODO(developer): Replace these variables before running the sample.
+ String datasetName = "MY_DATASET_NAME";
+ String tableName = "MY_TABLE_NAME";
+ String sourceUri = "gs://cloud-samples-data/bigquery/us-states/us-states.csv";
+ loadCsvFromGcsTruncate(datasetName, tableName, sourceUri);
+ }
+
+ public static void loadCsvFromGcsTruncate(String datasetName, String tableName, String sourceUri)
+ throws Exception {
+ try {
+ // Initialize client that will be used to send requests. This client only needs to be created
+ // once, and can be reused for multiple requests.
+ BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();
+
+ TableId tableId = TableId.of(datasetName, tableName);
+
+ LoadJobConfiguration configuration =
+ LoadJobConfiguration.builder(tableId, sourceUri)
+ .setFormatOptions(FormatOptions.csv())
+ // Set the write disposition to overwrite existing table data
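+ // (the other available dispositions are WRITE_APPEND and WRITE_EMPTY)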
+ .setWriteDisposition(WriteDisposition.WRITE_TRUNCATE)
+ .build();
+
+ // For more information on Job see:
+ // https://googleapis.dev/java/google-cloud-clients/latest/index.html?com/google/cloud/bigquery/package-summary.html
+ // Load the table
+ Job loadJob = bigquery.create(JobInfo.of(configuration));
+
+ // Load the CSV data from the GCS file into the table
+ // Blocks until this load table job completes its execution, either failing or succeeding.
+ Job completedJob = loadJob.waitFor();
+
+ // Check for errors
+ if (completedJob == null) {
+ throw new Exception("Job not executed since it no longer exists.");
+ } else if (completedJob.getStatus().getError() != null) {
+ // You can also look at completedJob.getStatus().getExecutionErrors() for all
+ // errors, not just the latest one.
+ throw new Exception(
+ "BigQuery was unable to load into the table due to an error: \n"
+ + completedJob.getStatus().getError());
+ }
+ System.out.println("Table is successfully overwritten by CSV file loaded from GCS");
+ } catch (BigQueryException | InterruptedException e) {
+ System.out.println("Column not added during load append \n" + e.toString());
+ }
+ }
+}
+// [END bigquery_load_table_gcs_csv_truncate]
diff --git a/samples/snippets/src/main/java/com/example/bigquery/LoadLocalFile.java b/samples/snippets/src/main/java/com/example/bigquery/LoadLocalFile.java
index 3e580ec7d..98257420e 100644
--- a/samples/snippets/src/main/java/com/example/bigquery/LoadLocalFile.java
+++ b/samples/snippets/src/main/java/com/example/bigquery/LoadLocalFile.java
@@ -44,8 +44,8 @@ public static void runLoadLocalFile() throws IOException, InterruptedException {
loadLocalFile(datasetName, tableName, csvPath, FormatOptions.csv());
}
- public static void loadLocalFile(String datasetName, String tableName, Path csvPath,
- FormatOptions formatOptions)
+ public static void loadLocalFile(
+ String datasetName, String tableName, Path csvPath, FormatOptions formatOptions)
throws IOException, InterruptedException {
try {
// Initialize client that will be used to send requests. This client only needs to be created
@@ -54,9 +54,7 @@ public static void loadLocalFile(String datasetName, String tableName, Path csvP
TableId tableId = TableId.of(datasetName, tableName);
WriteChannelConfiguration writeChannelConfiguration =
- WriteChannelConfiguration.newBuilder(tableId)
- .setFormatOptions(formatOptions)
- .build();
+ WriteChannelConfiguration.newBuilder(tableId).setFormatOptions(formatOptions).build();
// The location and JobName must be specified; other fields can be auto-detected.
String jobName = "jobId_" + UUID.randomUUID().toString();
diff --git a/samples/snippets/src/main/java/com/example/bigquery/LoadParquetReplaceTable.java b/samples/snippets/src/main/java/com/example/bigquery/LoadParquetReplaceTable.java
index c8ee67c67..eb09015fa 100644
--- a/samples/snippets/src/main/java/com/example/bigquery/LoadParquetReplaceTable.java
+++ b/samples/snippets/src/main/java/com/example/bigquery/LoadParquetReplaceTable.java
@@ -39,8 +39,8 @@ public static void runLoadParquetReplaceTable() {
loadParquetReplaceTable(datasetName, tableName, sourceUri);
}
- public static void loadParquetReplaceTable(String datasetName, String tableName,
- String sourceUri) {
+ public static void loadParquetReplaceTable(
+ String datasetName, String tableName, String sourceUri) {
try {
// Initialize client that will be used to send requests. This client only needs to be created
// once, and can be reused for multiple requests.
diff --git a/samples/snippets/src/main/java/com/example/bigquery/LoadTableClustered.java b/samples/snippets/src/main/java/com/example/bigquery/LoadTableClustered.java
index 20f4104f9..a3e024518 100644
--- a/samples/snippets/src/main/java/com/example/bigquery/LoadTableClustered.java
+++ b/samples/snippets/src/main/java/com/example/bigquery/LoadTableClustered.java
@@ -41,16 +41,20 @@ public static void runLoadTableClustered() throws Exception {
String tableName = "MY_TABLE_NAME";
String sourceUri = "/path/to/file.csv";
Schema schema =
- Schema.of(
- Field.of("name", StandardSQLTypeName.STRING),
- Field.of("post_abbr", StandardSQLTypeName.STRING),
- Field.of("date", StandardSQLTypeName.DATE));
- loadTableClustered(datasetName, tableName, sourceUri,
- schema, ImmutableList.of("name", "post_abbr"));
+ Schema.of(
+ Field.of("name", StandardSQLTypeName.STRING),
+ Field.of("post_abbr", StandardSQLTypeName.STRING),
+ Field.of("date", StandardSQLTypeName.DATE));
+ loadTableClustered(
+ datasetName, tableName, sourceUri, schema, ImmutableList.of("name", "post_abbr"));
}
- public static void loadTableClustered(String datasetName, String tableName, String sourceUri,
- Schema schema, List<String> clusteringFields)
+ public static void loadTableClustered(
+ String datasetName,
+ String tableName,
+ String sourceUri,
+ Schema schema,
+ List<String> clusteringFields)
throws Exception {
try {
// Initialize client that will be used to send requests. This client only needs to be created
@@ -62,8 +66,7 @@ public static void loadTableClustered(String datasetName, String tableName, Stri
TimePartitioning partitioning = TimePartitioning.of(TimePartitioning.Type.DAY);
// Clustering fields must be fields that are defined in the table schema.
// As of now, the table must also be partitioned for clustering to be applied.
- Clustering clustering =
- Clustering.newBuilder().setFields(clusteringFields).build();
+ Clustering clustering = Clustering.newBuilder().setFields(clusteringFields).build();
LoadJobConfiguration loadJobConfig =
LoadJobConfiguration.builder(tableId, sourceUri)
diff --git a/samples/snippets/src/main/java/com/example/bigquery/SaveQueryToTable.java b/samples/snippets/src/main/java/com/example/bigquery/SaveQueryToTable.java
new file mode 100644
index 000000000..5e346b937
--- /dev/null
+++ b/samples/snippets/src/main/java/com/example/bigquery/SaveQueryToTable.java
@@ -0,0 +1,62 @@
+/*
+ * Copyright 2020 Google LLC
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package com.example.bigquery;
+
+// [START bigquery_query_destination_table]
+import com.google.cloud.bigquery.BigQuery;
+import com.google.cloud.bigquery.BigQueryException;
+import com.google.cloud.bigquery.BigQueryOptions;
+import com.google.cloud.bigquery.QueryJobConfiguration;
+import com.google.cloud.bigquery.TableId;
+
+public class SaveQueryToTable {
+
+ public static void runSaveQueryToTable() {
+ // TODO(developer): Replace these variables before running the sample.
+ String query = "SELECT corpus FROM `bigquery-public-data.samples.shakespeare` GROUP BY corpus;";
+ String destinationTable = "MY_TABLE";
+ String destinationDataset = "MY_DATASET";
+
+ saveQueryToTable(destinationDataset, destinationTable, query);
+ }
+
+ public static void saveQueryToTable(
+ String destinationDataset, String destinationTableId, String query) {
+ try {
+ // Initialize client that will be used to send requests. This client only needs to be created
+ // once, and can be reused for multiple requests.
+ BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();
+
+ // Identify the destination table
+ TableId destinationTable = TableId.of(destinationDataset, destinationTableId);
+
+ // Build the query job
+ QueryJobConfiguration queryConfig =
+ QueryJobConfiguration.newBuilder(query).setDestinationTable(destinationTable).build();
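+ // With no write disposition set, the job defaults to WRITE_EMPTY and fails if the
+ // destination table already contains data; use setWriteDisposition(...) to append or overwrite.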
+
+ // Execute the query.
+ bigquery.query(queryConfig);
+
+ // The results are now saved in the destination table.
+
+ System.out.println("Saved query ran successfully");
+ } catch (BigQueryException | InterruptedException e) {
+ System.out.println("Saved query did not run \n" + e.toString());
+ }
+ }
+}
+// [END bigquery_query_destination_table]
diff --git a/samples/snippets/src/main/java/com/example/bigquery/SimpleQuery.java b/samples/snippets/src/main/java/com/example/bigquery/SimpleQuery.java
new file mode 100644
index 000000000..587a7456d
--- /dev/null
+++ b/samples/snippets/src/main/java/com/example/bigquery/SimpleQuery.java
@@ -0,0 +1,55 @@
+/*
+ * Copyright 2020 Google LLC
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package com.example.bigquery;
+
+// [START bigquery_query]
+import com.google.cloud.bigquery.BigQuery;
+import com.google.cloud.bigquery.BigQueryException;
+import com.google.cloud.bigquery.BigQueryOptions;
+import com.google.cloud.bigquery.QueryJobConfiguration;
+import com.google.cloud.bigquery.TableResult;
+
+public class SimpleQuery {
+
+ public static void runSimpleQuery() {
+ // TODO(developer): Replace this query before running the sample.
+ String query = "SELECT corpus FROM `bigquery-public-data.samples.shakespeare` GROUP BY corpus;";
+ simpleQuery(query);
+ }
+
+ public static void simpleQuery(String query) {
+ try {
+ // Initialize client that will be used to send requests. This client only needs to be created
+ // once, and can be reused for multiple requests.
+ BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();
+
+ // Create the query job.
+ QueryJobConfiguration queryConfig = QueryJobConfiguration.newBuilder(query).build();
+
+ // Execute the query.
+ TableResult result = bigquery.query(queryConfig);
+
+ // Print the results.
+ result.iterateAll().forEach(rows -> rows.forEach(row -> System.out.println(row.getValue())));
+
+ System.out.println("Query ran successfully");
+ } catch (BigQueryException | InterruptedException e) {
+ System.out.println("Query did not run \n" + e.toString());
+ }
+ }
+}
+// [END bigquery_query]
diff --git a/samples/snippets/src/main/java/com/example/bigquery/TableInsertRows.java b/samples/snippets/src/main/java/com/example/bigquery/TableInsertRows.java
index b601c859b..04fa0a2c8 100644
--- a/samples/snippets/src/main/java/com/example/bigquery/TableInsertRows.java
+++ b/samples/snippets/src/main/java/com/example/bigquery/TableInsertRows.java
@@ -42,8 +42,8 @@ public static void runTableInsertRows() {
tableInsertRows(datasetName, tableName, rowContent);
}
- public static void tableInsertRows(String datasetName, String tableName,
- Map rowContent) {
+ public static void tableInsertRows(
+ String datasetName, String tableName, Map rowContent) {
try {
// Initialize client that will be used to send requests. This client only needs to be created
// once, and can be reused for multiple requests.
diff --git a/samples/snippets/src/main/java/com/example/bigquery/UpdateTableDML.java b/samples/snippets/src/main/java/com/example/bigquery/UpdateTableDML.java
new file mode 100644
index 000000000..b5d5de8bc
--- /dev/null
+++ b/samples/snippets/src/main/java/com/example/bigquery/UpdateTableDML.java
@@ -0,0 +1,116 @@
+/*
+ * Copyright 2020 Google LLC
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package com.example.bigquery;
+
+// [START bigquery_update_with_dml]
+import com.google.cloud.bigquery.BigQuery;
+import com.google.cloud.bigquery.BigQueryException;
+import com.google.cloud.bigquery.BigQueryOptions;
+import com.google.cloud.bigquery.FormatOptions;
+import com.google.cloud.bigquery.Job;
+import com.google.cloud.bigquery.JobId;
+import com.google.cloud.bigquery.QueryJobConfiguration;
+import com.google.cloud.bigquery.TableDataWriteChannel;
+import com.google.cloud.bigquery.TableId;
+import com.google.cloud.bigquery.TableResult;
+import com.google.cloud.bigquery.WriteChannelConfiguration;
+import java.io.IOException;
+import java.io.OutputStream;
+import java.nio.channels.Channels;
+import java.nio.file.FileSystems;
+import java.nio.file.Files;
+import java.nio.file.Path;
+import java.util.UUID;
+
+// Sample to update data in a BigQuery table using a DML query
+public class UpdateTableDML {
+
+ public static void runUpdateTableDML() throws IOException, InterruptedException {
+ // TODO(developer): Replace these variables before running the sample.
+ String datasetName = "MY_DATASET_NAME";
+ String tableName = "MY_TABLE_NAME";
+ updateTableDML(datasetName, tableName);
+ }
+
+ public static void updateTableDML(String datasetName, String tableName)
+ throws IOException, InterruptedException {
+ try {
+ // Initialize client that will be used to send requests. This client only needs to be created
+ // once, and can be reused for multiple requests.
+ BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();
+
+ // Load JSON file into UserSessions table
+ TableId tableId = TableId.of(datasetName, tableName);
+
+ WriteChannelConfiguration writeChannelConfiguration =
+ WriteChannelConfiguration.newBuilder(tableId)
+ .setFormatOptions(FormatOptions.json())
+ .build();
+
+      // Import a local JSON file into the table.
+ Path jsonPath =
+ FileSystems.getDefault().getPath("src/test/resources", "userSessionsData.json");
+
+      // The location and job name must be specified; other fields can be auto-detected.
+ String jobName = "jobId_" + UUID.randomUUID().toString();
+ JobId jobId = JobId.newBuilder().setLocation("us").setJob(jobName).build();
+
+ try (TableDataWriteChannel writer = bigquery.writer(jobId, writeChannelConfiguration);
+ OutputStream stream = Channels.newOutputStream(writer)) {
+ Files.copy(jsonPath, stream);
+ }
+
+ // Get the Job created by the TableDataWriteChannel and wait for it to complete.
+ Job job = bigquery.getJob(jobId);
+ Job completedJob = job.waitFor();
+ if (completedJob == null) {
+ System.out.println("Job not executed since it no longer exists.");
+ return;
+ } else if (completedJob.getStatus().getError() != null) {
+ System.out.println(
+ "BigQuery was unable to load local file to the table due to an error: \n"
+ + job.getStatus().getError());
+ return;
+ }
+
+ System.out.println(
+          completedJob.getStatistics().toString() + " userSessionsData json uploaded successfully");
+
+      // Write a DML query to modify the UserSessions table:
+      // create a DML query job that masks the last octet in every row's ip_address column.
+ String dmlQuery =
+ String.format(
+ "UPDATE `%s.%s` \n"
+ + "SET ip_address = REGEXP_REPLACE(ip_address, r\"(\\.[0-9]+)$\", \".0\")\n"
+ + "WHERE TRUE",
+ datasetName, tableName);
+
+ QueryJobConfiguration dmlQueryConfig = QueryJobConfiguration.newBuilder(dmlQuery).build();
+
+ // Execute the query.
+ TableResult result = bigquery.query(dmlQueryConfig);
+
+ // Print the results.
+ result.iterateAll().forEach(rows -> rows.forEach(row -> System.out.println(row.getValue())));
+
+ System.out.println("Table updated successfully using DML");
+ } catch (BigQueryException e) {
+ System.out.println("Table update failed \n" + e.toString());
+ }
+ }
+}
+// [END bigquery_update_with_dml]
diff --git a/samples/snippets/src/main/java/com/example/bigquery/UpdateTableDescription.java b/samples/snippets/src/main/java/com/example/bigquery/UpdateTableDescription.java
index c52df00c6..55c6af53d 100644
--- a/samples/snippets/src/main/java/com/example/bigquery/UpdateTableDescription.java
+++ b/samples/snippets/src/main/java/com/example/bigquery/UpdateTableDescription.java
@@ -32,8 +32,8 @@ public static void runUpdateTableDescription() {
updateTableDescription(datasetName, tableName, newDescription);
}
- public static void updateTableDescription(String datasetName, String tableName,
- String newDescription) {
+ public static void updateTableDescription(
+ String datasetName, String tableName, String newDescription) {
try {
// Initialize client that will be used to send requests. This client only needs to be created
// once, and can be reused for multiple requests.
diff --git a/samples/snippets/src/main/java/com/example/bigquery/UpdateTableExpiration.java b/samples/snippets/src/main/java/com/example/bigquery/UpdateTableExpiration.java
index cbc9a1940..b3e369bff 100644
--- a/samples/snippets/src/main/java/com/example/bigquery/UpdateTableExpiration.java
+++ b/samples/snippets/src/main/java/com/example/bigquery/UpdateTableExpiration.java
@@ -34,8 +34,8 @@ public static void runUpdateTableExpiration() {
updateTableExpiration(datasetName, tableName, newExpiration);
}
- public static void updateTableExpiration(String datasetName, String tableName,
- Long newExpiration) {
+ public static void updateTableExpiration(
+ String datasetName, String tableName, Long newExpiration) {
try {
// Initialize client that will be used to send requests. This client only needs to be created
// once, and can be reused for multiple requests.
diff --git a/samples/snippets/src/test/java/com/example/bigquery/AddColumnLoadAppendIT.java b/samples/snippets/src/test/java/com/example/bigquery/AddColumnLoadAppendIT.java
index afbe13c94..b4b3751f7 100644
--- a/samples/snippets/src/test/java/com/example/bigquery/AddColumnLoadAppendIT.java
+++ b/samples/snippets/src/test/java/com/example/bigquery/AddColumnLoadAppendIT.java
@@ -75,7 +75,8 @@ public void testAddColumnLoadAppend() throws Exception {
List fields = originalSchema.getFields();
// Adding below additional column during the load job
- Field newField = Field.newBuilder("post_abbr", LegacySQLTypeName.STRING)
+ Field newField =
+ Field.newBuilder("post_abbr", LegacySQLTypeName.STRING)
.setMode(Field.Mode.NULLABLE)
.build();
List newFields = new ArrayList<>(fields);
diff --git a/samples/snippets/src/test/java/com/example/bigquery/BrowseTableIT.java b/samples/snippets/src/test/java/com/example/bigquery/BrowseTableIT.java
new file mode 100644
index 000000000..f7bc16c67
--- /dev/null
+++ b/samples/snippets/src/test/java/com/example/bigquery/BrowseTableIT.java
@@ -0,0 +1,80 @@
+/*
+ * Copyright 2020 Google LLC
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package com.example.bigquery;
+
+import static com.google.common.truth.Truth.assertThat;
+import static junit.framework.TestCase.assertNotNull;
+
+import com.google.cloud.bigquery.Field;
+import com.google.cloud.bigquery.Schema;
+import com.google.cloud.bigquery.StandardSQLTypeName;
+import java.io.ByteArrayOutputStream;
+import java.io.PrintStream;
+import java.util.UUID;
+import org.junit.After;
+import org.junit.Before;
+import org.junit.BeforeClass;
+import org.junit.Test;
+
+public class BrowseTableIT {
+ private ByteArrayOutputStream bout;
+ private PrintStream out;
+
+ private static final String BIGQUERY_DATASET_NAME = System.getenv("BIGQUERY_DATASET_NAME");
+
+ private static void requireEnvVar(String varName) {
+ assertNotNull(
+ "Environment variable " + varName + " is required to perform these tests.",
+ System.getenv(varName));
+ }
+
+ @BeforeClass
+ public static void checkRequirements() {
+ requireEnvVar("BIGQUERY_DATASET_NAME");
+ }
+
+ @Before
+ public void setUp() {
+ bout = new ByteArrayOutputStream();
+ out = new PrintStream(bout);
+ System.setOut(out);
+ }
+
+ @After
+ public void tearDown() {
+ System.setOut(null);
+ }
+
+ @Test
+ public void testBrowseTable() {
+ String tableName = "MY_TABLE_NAME_" + UUID.randomUUID().toString().replace("-", "_");
+
+ Schema schema =
+ Schema.of(
+ Field.of("stringField", StandardSQLTypeName.STRING),
+ Field.of("booleanField", StandardSQLTypeName.BOOL));
+
+ CreateTable.createTable(BIGQUERY_DATASET_NAME, tableName, schema);
+
+ BrowseTable.browseTable(BIGQUERY_DATASET_NAME, tableName);
+
+ assertThat(bout.toString()).contains("Query ran successfully");
+
+ // Clean up
+ DeleteTable.deleteTable(BIGQUERY_DATASET_NAME, tableName);
+ }
+}
diff --git a/samples/snippets/src/test/java/com/example/bigquery/CopyTableIT.java b/samples/snippets/src/test/java/com/example/bigquery/CopyTableIT.java
new file mode 100644
index 000000000..18a2529ed
--- /dev/null
+++ b/samples/snippets/src/test/java/com/example/bigquery/CopyTableIT.java
@@ -0,0 +1,91 @@
+/*
+ * Copyright 2020 Google LLC
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package com.example.bigquery;
+
+import static com.google.common.truth.Truth.assertThat;
+import static junit.framework.TestCase.assertNotNull;
+
+import com.google.cloud.bigquery.Field;
+import com.google.cloud.bigquery.Schema;
+import com.google.cloud.bigquery.StandardSQLTypeName;
+import java.io.ByteArrayOutputStream;
+import java.io.PrintStream;
+import java.util.UUID;
+import org.junit.After;
+import org.junit.Before;
+import org.junit.BeforeClass;
+import org.junit.Test;
+
+public class CopyTableIT {
+ private ByteArrayOutputStream bout;
+ private PrintStream out;
+
+ private static final String BIGQUERY_DATASET_NAME = System.getenv("BIGQUERY_DATASET_NAME");
+
+ private static void requireEnvVar(String varName) {
+ assertNotNull(
+ "Environment variable " + varName + " is required to perform these tests.",
+ System.getenv(varName));
+ }
+
+ @BeforeClass
+ public static void checkRequirements() {
+ requireEnvVar("BIGQUERY_DATASET_NAME");
+ }
+
+ @Before
+ public void setUp() throws Exception {
+ bout = new ByteArrayOutputStream();
+ out = new PrintStream(bout);
+ System.setOut(out);
+ }
+
+ @After
+ public void tearDown() {
+ System.setOut(null);
+ }
+
+ @Test
+ public void testCopyTable() {
+    // Create a new destination and source table for each test, since an existing table cannot be
+    // overwritten.
+ String generatedDestTableName =
+ "gcloud_test_table_temp_" + UUID.randomUUID().toString().replace('-', '_');
+ String generatedSourceTableName =
+ "gcloud_test_table_temp_" + UUID.randomUUID().toString().replace('-', '_');
+
+    // Add an arbitrary table schema so the copy job has columns to work with.
+ Schema schema =
+ Schema.of(
+ Field.of("stringField", StandardSQLTypeName.STRING),
+ Field.of("booleanField", StandardSQLTypeName.BOOL));
+
+ CreateTable.createTable(BIGQUERY_DATASET_NAME, generatedDestTableName, schema);
+ CreateTable.createTable(BIGQUERY_DATASET_NAME, generatedSourceTableName, schema);
+
+ CopyTable.copyTable(
+ BIGQUERY_DATASET_NAME,
+ generatedSourceTableName,
+ BIGQUERY_DATASET_NAME,
+ generatedDestTableName);
+ assertThat(bout.toString()).contains("Table copied successfully.");
+
+ // Clean up
+ DeleteTable.deleteTable(BIGQUERY_DATASET_NAME, generatedDestTableName);
+ DeleteTable.deleteTable(BIGQUERY_DATASET_NAME, generatedSourceTableName);
+ }
+}
diff --git a/samples/snippets/src/test/java/com/example/bigquery/CreateClusteredTableIT.java b/samples/snippets/src/test/java/com/example/bigquery/CreateClusteredTableIT.java
index 37ddd226b..26e45c83d 100644
--- a/samples/snippets/src/test/java/com/example/bigquery/CreateClusteredTableIT.java
+++ b/samples/snippets/src/test/java/com/example/bigquery/CreateClusteredTableIT.java
@@ -63,13 +63,13 @@ public void tearDown() {
public void createClusteredTable() {
String tableName = "MY_CLUSTERED_TABLE";
Schema schema =
- Schema.of(
- Field.of("name", StandardSQLTypeName.STRING),
- Field.of("post_abbr", StandardSQLTypeName.STRING),
- Field.of("date", StandardSQLTypeName.DATE));
+ Schema.of(
+ Field.of("name", StandardSQLTypeName.STRING),
+ Field.of("post_abbr", StandardSQLTypeName.STRING),
+ Field.of("date", StandardSQLTypeName.DATE));
- CreateClusteredTable.createClusteredTable(BIGQUERY_DATASET_NAME, tableName,
- schema, ImmutableList.of("name", "post_abbr"));
+ CreateClusteredTable.createClusteredTable(
+ BIGQUERY_DATASET_NAME, tableName, schema, ImmutableList.of("name", "post_abbr"));
assertThat(bout.toString()).contains("Clustered table created successfully");
diff --git a/samples/snippets/src/test/java/com/example/bigquery/CreatePartitionedTableIT.java b/samples/snippets/src/test/java/com/example/bigquery/CreatePartitionedTableIT.java
index 32000c5ce..35ab85b38 100644
--- a/samples/snippets/src/test/java/com/example/bigquery/CreatePartitionedTableIT.java
+++ b/samples/snippets/src/test/java/com/example/bigquery/CreatePartitionedTableIT.java
@@ -62,10 +62,10 @@ public void tearDown() {
public void testCreatePartitionedTable() {
String tableName = "MY_PARTITIONED_TABLE";
Schema schema =
- Schema.of(
- Field.of("stringField", StandardSQLTypeName.STRING),
- Field.of("booleanField", StandardSQLTypeName.BOOL),
- Field.of("dateField", StandardSQLTypeName.DATE));
+ Schema.of(
+ Field.of("stringField", StandardSQLTypeName.STRING),
+ Field.of("booleanField", StandardSQLTypeName.BOOL),
+ Field.of("dateField", StandardSQLTypeName.DATE));
CreatePartitionedTable.createPartitionedTable(BIGQUERY_DATASET_NAME, tableName, schema);
diff --git a/samples/snippets/src/test/java/com/example/bigquery/ExtractTableToCsvIT.java b/samples/snippets/src/test/java/com/example/bigquery/ExtractTableToCsvIT.java
new file mode 100644
index 000000000..457bb5fb7
--- /dev/null
+++ b/samples/snippets/src/test/java/com/example/bigquery/ExtractTableToCsvIT.java
@@ -0,0 +1,72 @@
+/*
+ * Copyright 2020 Google LLC
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package com.example.bigquery;
+
+import static com.google.common.truth.Truth.assertThat;
+import static junit.framework.TestCase.assertNotNull;
+
+import java.io.ByteArrayOutputStream;
+import java.io.PrintStream;
+import org.junit.After;
+import org.junit.Before;
+import org.junit.BeforeClass;
+import org.junit.Test;
+
+public class ExtractTableToCsvIT {
+ private ByteArrayOutputStream bout;
+ private PrintStream out;
+
+ private static final String GCS_BUCKET = System.getenv("GCS_BUCKET");
+
+ private static void requireEnvVar(String varName) {
+ assertNotNull(
+ "Environment variable " + varName + " is required to perform these tests.",
+ System.getenv(varName));
+ }
+
+ @BeforeClass
+ public static void checkRequirements() {
+ requireEnvVar("GCS_BUCKET");
+ }
+
+ @Before
+ public void setUp() throws Exception {
+ bout = new ByteArrayOutputStream();
+ out = new PrintStream(bout);
+ System.setOut(out);
+ }
+
+ @After
+ public void tearDown() {
+ System.setOut(null);
+ }
+
+ @Test
+  public void testExtractTableToCsv() {
+ String projectId = "bigquery-public-data";
+ String datasetName = "samples";
+ String tableName = "shakespeare";
+ String destinationUri = "gs://" + GCS_BUCKET + "/extractTest.csv";
+ String dataFormat = "CSV";
+
+ // Extract table content to GCS in CSV format
+ ExtractTableToCsv.extractTableToCsv(
+ projectId, datasetName, tableName, destinationUri, dataFormat);
+ assertThat(bout.toString())
+ .contains("Table export successful. Check in GCS bucket for the " + dataFormat + " file.");
+ }
+}
diff --git a/samples/snippets/src/test/java/com/example/bigquery/ExtractTableToJsonIT.java b/samples/snippets/src/test/java/com/example/bigquery/ExtractTableToJsonIT.java
index fd28bc1cb..8d5764b18 100644
--- a/samples/snippets/src/test/java/com/example/bigquery/ExtractTableToJsonIT.java
+++ b/samples/snippets/src/test/java/com/example/bigquery/ExtractTableToJsonIT.java
@@ -1,5 +1,5 @@
/*
- * Copyright 2019 Google LLC
+ * Copyright 2020 Google LLC
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -19,6 +19,7 @@
import static com.google.common.truth.Truth.assertThat;
import static junit.framework.TestCase.assertNotNull;
+import com.google.cloud.bigquery.FormatOptions;
import java.io.ByteArrayOutputStream;
import java.io.PrintStream;
import org.junit.After;
@@ -60,12 +61,13 @@ public void testExtractTableToJson() {
String projectId = "bigquery-public-data";
String datasetName = "samples";
String tableName = "shakespeare";
- String destinationUri = "gs://" + GCS_BUCKET + "/extractTest.csv";
- String dataFormat = "CSV";
+ String destinationUri = "gs://" + GCS_BUCKET + "/extractTest.json";
+    // FormatOptions.json().toString() returns "NEWLINE_DELIMITED_JSON", not "JSON"
+ String dataFormat = FormatOptions.json().toString();
- // Extract table content to GCS in CSV format
- ExtractTableToJson.extractTableToJson(projectId, datasetName, tableName, destinationUri,
- dataFormat);
+ // Extract table content to GCS in JSON format
+ ExtractTableToJson.extractTableToJson(
+ projectId, datasetName, tableName, destinationUri, dataFormat);
assertThat(bout.toString())
.contains("Table export successful. Check in GCS bucket for the " + dataFormat + " file.");
}
diff --git a/samples/snippets/src/test/java/com/example/bigquery/LoadCsvFromGcsIT.java b/samples/snippets/src/test/java/com/example/bigquery/LoadCsvFromGcsIT.java
new file mode 100644
index 000000000..f47f92cc5
--- /dev/null
+++ b/samples/snippets/src/test/java/com/example/bigquery/LoadCsvFromGcsIT.java
@@ -0,0 +1,88 @@
+/*
+ * Copyright 2020 Google LLC
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package com.example.bigquery;
+
+import static com.google.common.truth.Truth.assertThat;
+import static junit.framework.TestCase.assertNotNull;
+
+import com.google.cloud.bigquery.Field;
+import com.google.cloud.bigquery.LegacySQLTypeName;
+import com.google.cloud.bigquery.Schema;
+import java.io.ByteArrayOutputStream;
+import java.io.PrintStream;
+import java.util.UUID;
+import org.junit.After;
+import org.junit.Before;
+import org.junit.BeforeClass;
+import org.junit.Test;
+
+public class LoadCsvFromGcsIT {
+
+ private String tableName;
+ private ByteArrayOutputStream bout;
+ private PrintStream out;
+
+ private static final String BIGQUERY_DATASET_NAME = requireEnvVar("BIGQUERY_DATASET_NAME");
+
+ private static String requireEnvVar(String varName) {
+ String value = System.getenv(varName);
+ assertNotNull(
+ "Environment variable " + varName + " is required to perform these tests.",
+ System.getenv(varName));
+ return value;
+ }
+
+ @BeforeClass
+ public static void checkRequirements() {
+ requireEnvVar("BIGQUERY_DATASET_NAME");
+ }
+
+ @Before
+ public void setUp() {
+ bout = new ByteArrayOutputStream();
+ out = new PrintStream(bout);
+ System.setOut(out);
+
+ // Create a test table
+ tableName = "loadCsvFromGcs_TEST_" + UUID.randomUUID().toString().replace('-', '_');
+
+ Schema schema =
+ Schema.of(
+ Field.of("name", LegacySQLTypeName.STRING),
+ Field.of("post_abbr", LegacySQLTypeName.STRING));
+
+ CreateTable.createTable(BIGQUERY_DATASET_NAME, tableName, schema);
+
+ bout = new ByteArrayOutputStream();
+ out = new PrintStream(bout);
+ System.setOut(out);
+ }
+
+ @After
+ public void tearDown() {
+ // Clean up
+ DeleteTable.deleteTable(BIGQUERY_DATASET_NAME, tableName);
+ System.setOut(null);
+ }
+
+ @Test
+ public void loadCsvFromGcs() throws Exception {
+ String sourceUri = "gs://cloud-samples-data/bigquery/us-states/us-states.csv";
+ LoadCsvFromGcs.loadCsvFromGcs(BIGQUERY_DATASET_NAME, tableName, sourceUri);
+ assertThat(bout.toString()).contains("CSV from GCS successfully added during load append job");
+ }
+}
diff --git a/samples/snippets/src/test/java/com/example/bigquery/LoadCsvFromGcsTruncateTest.java b/samples/snippets/src/test/java/com/example/bigquery/LoadCsvFromGcsTruncateTest.java
new file mode 100644
index 000000000..8a2e35377
--- /dev/null
+++ b/samples/snippets/src/test/java/com/example/bigquery/LoadCsvFromGcsTruncateTest.java
@@ -0,0 +1,89 @@
+/*
+ * Copyright 2020 Google LLC
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package com.example.bigquery;
+
+import static com.google.common.truth.Truth.assertThat;
+import static junit.framework.TestCase.assertNotNull;
+
+import com.google.cloud.bigquery.Field;
+import com.google.cloud.bigquery.LegacySQLTypeName;
+import com.google.cloud.bigquery.Schema;
+import java.io.ByteArrayOutputStream;
+import java.io.PrintStream;
+import java.util.UUID;
+import org.junit.After;
+import org.junit.Before;
+import org.junit.BeforeClass;
+import org.junit.Test;
+
+public class LoadCsvFromGcsTruncateTest {
+
+ private String tableName;
+ private ByteArrayOutputStream bout;
+ private PrintStream out;
+
+ private static final String BIGQUERY_DATASET_NAME = requireEnvVar("BIGQUERY_DATASET_NAME");
+
+ private static String requireEnvVar(String varName) {
+ String value = System.getenv(varName);
+ assertNotNull(
+ "Environment variable " + varName + " is required to perform these tests.",
+ System.getenv(varName));
+ return value;
+ }
+
+ @BeforeClass
+ public static void checkRequirements() {
+ requireEnvVar("BIGQUERY_DATASET_NAME");
+ }
+
+ @Before
+ public void setUp() {
+ bout = new ByteArrayOutputStream();
+ out = new PrintStream(bout);
+ System.setOut(out);
+
+ // Create a test table
+ tableName = "loadCsvFromGcsTruncate_TEST_" + UUID.randomUUID().toString().replace('-', '_');
+
+ Schema schema =
+ Schema.of(
+ Field.of("name", LegacySQLTypeName.STRING),
+ Field.of("post_abbr", LegacySQLTypeName.STRING));
+
+ CreateTable.createTable(BIGQUERY_DATASET_NAME, tableName, schema);
+
+ bout = new ByteArrayOutputStream();
+ out = new PrintStream(bout);
+ System.setOut(out);
+ }
+
+ @After
+ public void tearDown() {
+ // Clean up
+ DeleteTable.deleteTable(BIGQUERY_DATASET_NAME, tableName);
+ System.setOut(null);
+ }
+
+ @Test
+ public void loadCsvFromGcsTruncate() throws Exception {
+ String sourceUri = "gs://cloud-samples-data/bigquery/us-states/us-states.csv";
+ LoadCsvFromGcsTruncate.loadCsvFromGcsTruncate(BIGQUERY_DATASET_NAME, tableName, sourceUri);
+ assertThat(bout.toString())
+ .contains("Table is successfully overwritten by CSV file loaded from GCS");
+ }
+}
diff --git a/samples/snippets/src/test/java/com/example/bigquery/LoadTableClusteredIT.java b/samples/snippets/src/test/java/com/example/bigquery/LoadTableClusteredIT.java
index 98ef57afd..3c05c7285 100644
--- a/samples/snippets/src/test/java/com/example/bigquery/LoadTableClusteredIT.java
+++ b/samples/snippets/src/test/java/com/example/bigquery/LoadTableClusteredIT.java
@@ -66,13 +66,13 @@ public void loadTableClustered() throws Exception {
String tableName = "LOAD_CLUSTERED_TABLE_TEST";
Schema schema =
- Schema.of(
- Field.of("name", StandardSQLTypeName.STRING),
- Field.of("post_abbr", StandardSQLTypeName.STRING),
- Field.of("date", StandardSQLTypeName.DATE));
+ Schema.of(
+ Field.of("name", StandardSQLTypeName.STRING),
+ Field.of("post_abbr", StandardSQLTypeName.STRING),
+ Field.of("date", StandardSQLTypeName.DATE));
- LoadTableClustered.loadTableClustered(BIGQUERY_DATASET_NAME, tableName, sourceUri,
- schema, ImmutableList.of("name", "post_abbr"));
+ LoadTableClustered.loadTableClustered(
+ BIGQUERY_DATASET_NAME, tableName, sourceUri, schema, ImmutableList.of("name", "post_abbr"));
assertThat(bout.toString())
.contains("Data successfully loaded into clustered table during load job");
diff --git a/samples/snippets/src/test/java/com/example/bigquery/SaveQueryToTableIT.java b/samples/snippets/src/test/java/com/example/bigquery/SaveQueryToTableIT.java
new file mode 100644
index 000000000..3552c2c90
--- /dev/null
+++ b/samples/snippets/src/test/java/com/example/bigquery/SaveQueryToTableIT.java
@@ -0,0 +1,71 @@
+/*
+ * Copyright 2020 Google LLC
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package com.example.bigquery;
+
+import static com.google.common.truth.Truth.assertThat;
+import static junit.framework.TestCase.assertNotNull;
+
+import java.io.ByteArrayOutputStream;
+import java.io.PrintStream;
+import java.util.UUID;
+import org.junit.After;
+import org.junit.Before;
+import org.junit.BeforeClass;
+import org.junit.Test;
+
+public class SaveQueryToTableIT {
+ private ByteArrayOutputStream bout;
+ private PrintStream out;
+
+ private static final String BIGQUERY_DATASET_NAME = System.getenv("BIGQUERY_DATASET_NAME");
+
+ private static void requireEnvVar(String varName) {
+ assertNotNull(
+ "Environment variable " + varName + " is required to perform these tests.",
+ System.getenv(varName));
+ }
+
+ @BeforeClass
+ public static void checkRequirements() {
+ requireEnvVar("BIGQUERY_DATASET_NAME");
+ }
+
+ @Before
+ public void setUp() {
+ bout = new ByteArrayOutputStream();
+ out = new PrintStream(bout);
+ System.setOut(out);
+ }
+
+ @After
+ public void tearDown() {
+ System.setOut(null);
+ }
+
+ @Test
+ public void testSaveQueryToTable() {
+ String tableName = "MY_TABLE_NAME_" + UUID.randomUUID().toString().replace("-", "_");
+ String query = "SELECT corpus FROM `bigquery-public-data.samples.shakespeare` GROUP BY corpus;";
+
+ SaveQueryToTable.saveQueryToTable(BIGQUERY_DATASET_NAME, tableName, query);
+
+ assertThat(bout.toString()).contains("Saved query ran successfully");
+
+ // Clean up
+ DeleteTable.deleteTable(BIGQUERY_DATASET_NAME, tableName);
+ }
+}
diff --git a/samples/snippets/src/test/java/com/example/bigquery/SimpleQueryIT.java b/samples/snippets/src/test/java/com/example/bigquery/SimpleQueryIT.java
new file mode 100644
index 000000000..8ee67edde
--- /dev/null
+++ b/samples/snippets/src/test/java/com/example/bigquery/SimpleQueryIT.java
@@ -0,0 +1,50 @@
+/*
+ * Copyright 2020 Google LLC
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package com.example.bigquery;
+
+import static com.google.common.truth.Truth.assertThat;
+
+import java.io.ByteArrayOutputStream;
+import java.io.PrintStream;
+import org.junit.After;
+import org.junit.Before;
+import org.junit.Test;
+
+public class SimpleQueryIT {
+ private ByteArrayOutputStream bout;
+ private PrintStream out;
+
+ @Before
+ public void setUp() {
+ bout = new ByteArrayOutputStream();
+ out = new PrintStream(bout);
+ System.setOut(out);
+ }
+
+ @After
+ public void tearDown() {
+ System.setOut(null);
+ }
+
+ @Test
+ public void testSimpleQuery() {
+ String query = "SELECT corpus FROM `bigquery-public-data.samples.shakespeare` GROUP BY corpus;";
+
+ SimpleQuery.simpleQuery(query);
+ assertThat(bout.toString()).contains("Query ran successfully");
+ }
+}
diff --git a/samples/snippets/src/test/java/com/example/bigquery/UpdateTableDMLIT.java b/samples/snippets/src/test/java/com/example/bigquery/UpdateTableDMLIT.java
new file mode 100644
index 000000000..006ba13ce
--- /dev/null
+++ b/samples/snippets/src/test/java/com/example/bigquery/UpdateTableDMLIT.java
@@ -0,0 +1,90 @@
+/*
+ * Copyright 2020 Google LLC
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package com.example.bigquery;
+
+import static com.google.common.truth.Truth.assertThat;
+import static junit.framework.TestCase.assertNotNull;
+
+import com.google.cloud.bigquery.Field;
+import com.google.cloud.bigquery.LegacySQLTypeName;
+import com.google.cloud.bigquery.Schema;
+import java.io.ByteArrayOutputStream;
+import java.io.IOException;
+import java.io.PrintStream;
+import java.util.UUID;
+import org.junit.After;
+import org.junit.Before;
+import org.junit.BeforeClass;
+import org.junit.Test;
+
+public class UpdateTableDMLIT {
+
+ private String tableName;
+ private ByteArrayOutputStream bout;
+ private PrintStream out;
+
+ private static final String BIGQUERY_DATASET_NAME = requireEnvVar("BIGQUERY_DATASET_NAME");
+
+ private static String requireEnvVar(String varName) {
+ String value = System.getenv(varName);
+ assertNotNull(
+ "Environment variable " + varName + " is required to perform these tests.",
+ System.getenv(varName));
+ return value;
+ }
+
+ @BeforeClass
+ public static void checkRequirements() {
+ requireEnvVar("BIGQUERY_DATASET_NAME");
+ }
+
+ @Before
+ public void setUp() {
+ bout = new ByteArrayOutputStream();
+ out = new PrintStream(bout);
+ System.setOut(out);
+
+ // Create a test table
+ tableName = "UserSessions_TEST_" + UUID.randomUUID().toString().replace('-', '_');
+ Schema schema =
+ Schema.of(
+ Field.of("id", LegacySQLTypeName.STRING),
+ Field.of("user_id", LegacySQLTypeName.STRING),
+ Field.of("login_time", LegacySQLTypeName.STRING),
+ Field.of("logout_time", LegacySQLTypeName.STRING),
+ Field.of("ip_address", LegacySQLTypeName.STRING));
+
+ CreateTable.createTable(BIGQUERY_DATASET_NAME, tableName, schema);
+
+ bout = new ByteArrayOutputStream();
+ out = new PrintStream(bout);
+ System.setOut(out);
+ }
+
+ @After
+ public void tearDown() {
+ // Clean up
+ DeleteTable.deleteTable(BIGQUERY_DATASET_NAME, tableName);
+ System.setOut(null);
+ }
+
+ @Test
+ public void testUpdateTableDML() throws IOException, InterruptedException {
+ UpdateTableDML.updateTableDML(BIGQUERY_DATASET_NAME, tableName);
+ assertThat(bout.toString()).contains("Table updated successfully using DML");
+ }
+}
diff --git a/samples/snippets/src/test/java/com/example/bigquery/UpdateTableDescriptionIT.java b/samples/snippets/src/test/java/com/example/bigquery/UpdateTableDescriptionIT.java
index ad71373a3..f30c73e1c 100644
--- a/samples/snippets/src/test/java/com/example/bigquery/UpdateTableDescriptionIT.java
+++ b/samples/snippets/src/test/java/com/example/bigquery/UpdateTableDescriptionIT.java
@@ -33,7 +33,6 @@ public class UpdateTableDescriptionIT {
private static final String BIGQUERY_DATASET_NAME = System.getenv("BIGQUERY_DATASET_NAME");
-
private static void requireEnvVar(String varName) {
assertNotNull(
"Environment variable " + varName + " is required to perform these tests.",
@@ -72,6 +71,5 @@ public void updateTableDescription() {
// Clean up
DeleteTable.deleteTable(BIGQUERY_DATASET_NAME, tableName);
-
}
}
diff --git a/samples/snippets/src/test/java/com/example/bigquery/UpdateTableExpirationIT.java b/samples/snippets/src/test/java/com/example/bigquery/UpdateTableExpirationIT.java
index 73ca93c06..abd1eb3e0 100644
--- a/samples/snippets/src/test/java/com/example/bigquery/UpdateTableExpirationIT.java
+++ b/samples/snippets/src/test/java/com/example/bigquery/UpdateTableExpirationIT.java
@@ -65,9 +65,9 @@ public void updateTableExpiration() {
String suffix = UUID.randomUUID().toString().replace('-', '_');
String tableName = "update_expiration_table_" + suffix;
Schema schema =
- Schema.of(
- Field.of("stringField", StandardSQLTypeName.STRING),
- Field.of("booleanField", StandardSQLTypeName.BOOL));
+ Schema.of(
+ Field.of("stringField", StandardSQLTypeName.STRING),
+ Field.of("booleanField", StandardSQLTypeName.BOOL));
CreateTable.createTable(BIGQUERY_DATASET_NAME, tableName, schema);
Long newExpiration = TimeUnit.MILLISECONDS.convert(1, TimeUnit.DAYS);
UpdateTableExpiration.updateTableExpiration(BIGQUERY_DATASET_NAME, tableName, newExpiration);
diff --git a/samples/snippets/src/test/resources/userSessionsData.json b/samples/snippets/src/test/resources/userSessionsData.json
new file mode 100644
index 000000000..042ac3737
--- /dev/null
+++ b/samples/snippets/src/test/resources/userSessionsData.json
@@ -0,0 +1,10 @@
+{"id":"2ad525d6-c832-4c3d-b7fe-59d104885519","user_id":"38","login_time":"1.47766087E9","logout_time":"1.477661109E9","ip_address":"192.0.2.12"}
+{"id":"53d65e20-6ea9-4650-98d9-a2111fbd1122","user_id":"88","login_time":"1.47707544E9","logout_time":"1.477075519E9","ip_address":"192.0.2.88"}
+{"id":"5e6c3021-d5e7-4ccd-84b2-adfa9176d13d","user_id":"39","login_time":"1.474022869E9","logout_time":"1.474022961E9","ip_address":"203.0.113.52"}
+{"id":"6196eefa-1498-4567-8ef0-498845b888d9","user_id":"52","login_time":"1.478604612E9","logout_time":"1.478604691E9","ip_address":"203.0.113.169"}
+{"id":"70656dc5-7e0f-49cf-9e00-f06ed93c1f5b","user_id":"46","login_time":"1.474089924E9","logout_time":"1.474090227E9","ip_address":"192.0.2.10"}
+{"id":"aafa5eef-ad49-49a7-9a0f-fbc7fd639bd3","user_id":"40","login_time":"1.478031161E9","logout_time":"1.478031388E9","ip_address":"203.0.113.18"}
+{"id":"d2792fc2-24dd-4260-9456-3fbe6cdfdd90","user_id":"5","login_time":"1.481259081E9","logout_time":"1.481259247E9","ip_address":"192.0.2.140"}
+{"id":"d835dc49-32f9-4790-b4eb-dddee62e0dcc","user_id":"62","login_time":"1.478892977E9","logout_time":"1.478893219E9","ip_address":"203.0.113.83"}
+{"id":"f4a0d3c7-351f-471c-8e11-e093e7a6ce75","user_id":"89","login_time":"1.459031555E9","logout_time":"1.459031831E9","ip_address":"203.0.113.233"}
+{"id":"f6e9f526-5b22-4679-9c3e-56a636e815bb","user_id":"97","login_time":"1.482426034E9","logout_time":"1.482426415E9","ip_address":"203.0.113.167"}
\ No newline at end of file
diff --git a/synth.metadata b/synth.metadata
index bdc51594c..773d2cf67 100644
--- a/synth.metadata
+++ b/synth.metadata
@@ -4,14 +4,14 @@
"git": {
"name": ".",
"remote": "https://github.com/googleapis/java-bigquery.git",
- "sha": "db4158186b99d0bed68fd70bef5918b1820e5dd1"
+ "sha": "21a3606f5fb65287f808b12a6fef65817c8a8ba6"
}
},
{
"git": {
"name": "synthtool",
"remote": "https://github.com/googleapis/synthtool.git",
- "sha": "388f7aafee3d7a067c23db6c13b7e83fb361c64a"
+ "sha": "987270824bd26f6a8c716d5e2022057b8ae7b26e"
}
}
]
diff --git a/versions.txt b/versions.txt
index 110d6881d..2ca5a2c2f 100644
--- a/versions.txt
+++ b/versions.txt
@@ -1,4 +1,4 @@
# Format:
# module:released-version:current-version
-google-cloud-bigquery:1.116.1:1.116.1
\ No newline at end of file
+google-cloud-bigquery:1.116.2:1.116.2
\ No newline at end of file