I would like to have a reusable insert function for BigQuery that leverages the PEP 249 DB-API implementation provided by the BigQuery library (google.cloud.bigquery.dbapi), but I don't know how to pass GEOGRAPHY values as bindings.
Here's a simplified example of what I am trying to achieve:
from typing import Any, Sequence

from google.cloud.bigquery import dbapi as bigquery

def insert(table_id: str, fields: Sequence[str], bindings: Sequence[Any]):
    bq_conn = bigquery.Connection()
    cursor = bq_conn.cursor()
    # The BigQuery DB-API uses the "pyformat" paramstyle, so the %s
    # placeholders are filled from the bindings sequence.
    cursor.execute(
        f"""
        INSERT INTO `{table_id}`
        ({','.join(fields)})
        VALUES
        ({','.join(['%s'] * len(fields))})
        """,
        bindings,
    )
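To make the intent concrete, a call would look something like this (the table name, columns, and values are placeholders):

insert(
    "project.dataset.places",      # hypothetical table
    ["name", "geo"],               # "geo" is a GEOGRAPHY column
    ("Somewhere", "POINT(1 3)"),   # how should the geography value be passed?
)

For ordinary scalar columns the positional %s bindings are filled in fine; the GEOGRAPHY column is the part I can't figure out.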
So the question is: How can I pass GEOGRAPHY types as bindings?
Fiddling around has shown me that a verbatim string cannot be converted/inserted as a GEOGRAPHY value, as demonstrated below:
CREATE TABLE `project.dataset.tmp` (geo GEOGRAPHY);
INSERT INTO `project.dataset.tmp`
(geo)
VALUES
("POINT(1,3)"); # fails with `Value has type STRING which cannot be inserted into column geo, which has type GEOGRAPHY at [4:2]`
So if verbatim strings do not work in SQL, can bound ones work? If not, is there a Python object type I should pass instead, such as a google.cloud.bigquery.ScalarQueryParameter, a shapely.geometry.* geometry, or a geojson.* object?
Given the generalized nature of the function, I'd like to avoid embedding custom SQL handling for different value types (e.g. wrapping geography literals in ST_GEOGFROMTEXT(...)). I'd really like to simply pass an object and have BigQuery handle the typing by itself.
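To clarify what I mean by type-specific handling, here is a sketch of the kind of workaround I'd rather not bake into the generic function: dropping down to the plain google.cloud.bigquery client and declaring the parameter type by hand. (This assumes GEOGRAPHY is accepted as a scalar query parameter type and takes WKT text; I have not verified that, and the table/column names are placeholders.)

from google.cloud import bigquery as bq

client = bq.Client()
job_config = bq.QueryJobConfig(
    query_parameters=[
        # Assumption: "GEOGRAPHY" is a valid scalar parameter type and the
        # value is passed as WKT text.
        bq.ScalarQueryParameter("geo", "GEOGRAPHY", "POINT(1 3)"),
    ]
)
client.query(
    "INSERT INTO `project.dataset.tmp` (geo) VALUES (@geo)",
    job_config=job_config,
).result()

Even if that works, it means the generic insert function has to know about every special type, which defeats its purpose.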
