You want to learn more about Google BigQuery. What do you do?
"BigQuery is Google's serverless, highly scalable, enterprise data warehouse..." taken from this site.
If you want to learn more about the components of BigQuery, see this link. This solution will allow you to create data to query.
1. Log in to the GCP web UI.
2. Go here: https://console.cloud.google.com/bigquery?p=bigquery-public-data&page=project
3. Have the cloud-based data warehouse ingest data.* The following example walks you through creating a JSON file and importing it.
a. Via a web browser, go here: https://www.mockaroo.com/
b. Add or remove fields as you like, renaming them as needed. (For this example we removed "IP Address", changed "Gender" to "Phone number", and added a field for "Street address".)
c. Click "Download Data".
d. Go to the GCP web UI where you have BigQuery open. Go here: https://console.cloud.google.com/bigquery
e. Click on a project you already have, or on the left where it says "My First Project" click the down arrow and go to "Create new dataset".
f. Enter a value for "Dataset ID" and click "OK".
g. This value should now appear under "My First Project". Hover over it and a "+" sign should appear. Click the "+" sign.
h. In the "Create Table" window that appears, make sure that the Location is set to "File upload", then click "Choose file".
i. Make sure the "File format" is set to "JSON".
j. In the "Destination Table" section, enter text into the "Table name" field.
k. For the "Schema" check the button for "Automatically detect".
l. Click "Create table".
4. Click "Compose Query".
5. Compose a query like the one below, but replace "contint" with the Dataset ID you entered in step 3.f and replace "h2" with the value you entered in step 3.j for "Table name":
SELECT * FROM contint.h2
6. Enter the query into the field and click "RUN QUERY".
* If you want separate directions for other uploading methods, see this posting.
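If you would rather generate the mock data locally instead of downloading it from Mockaroo, steps 3.a-c can be approximated with a short Python script. This is only a sketch: the field names below (first_name, phone_number, street_address, and so on) are illustrative choices, not Mockaroo's exact output. One detail worth knowing either way: BigQuery's JSON file upload expects newline-delimited JSON (one object per line), not a single JSON array, so make sure whatever file you upload is in that shape.

```python
import json
import random

# Small illustrative name pools; Mockaroo draws from much larger ones.
FIRST_NAMES = ["Ada", "Grace", "Alan", "Edsger"]
LAST_NAMES = ["Lovelace", "Hopper", "Turing", "Dijkstra"]

def make_record(i):
    """Build one mock person record (field names are hypothetical)."""
    first = random.choice(FIRST_NAMES)
    last = random.choice(LAST_NAMES)
    return {
        "id": i,
        "first_name": first,
        "last_name": last,
        "email": f"{first.lower()}.{last.lower()}@example.com",
        "phone_number": f"555-{random.randint(0, 9999):04d}",
        "street_address": f"{random.randint(1, 999)} Main St",
    }

def write_ndjson(path, n=1000):
    """Write n records as newline-delimited JSON -- one JSON object per
    line, with no enclosing array -- which is the format BigQuery's
    JSON file upload expects."""
    with open(path, "w") as f:
        for i in range(1, n + 1):
            f.write(json.dumps(make_record(i)) + "\n")

if __name__ == "__main__":
    write_ndjson("mock_people.json", n=1000)
```

The resulting mock_people.json can then be uploaded in step 3.h exactly as if it had come from Mockaroo.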