Here is a step-by-step procedure for loading JSON data into a Snowflake table from an AWS S3 bucket:
- Create a table in Snowflake to hold the JSON data. Use the CREATE TABLE statement and specify the appropriate data type for each column. For semi-structured data, a common approach is a single VARIANT column; alternatively, define typed columns that match the top-level JSON keys.
- Create a stage in Snowflake that points to the S3 bucket where the JSON data is stored, using the CREATE STAGE statement. The stage definition must include the bucket location and the credentials (or a storage integration) that allow Snowflake to read from it; the stage can live in any schema as long as you reference it with its qualified name.
- Grant the Snowflake account read access to the S3 bucket. On the AWS side this is done with an IAM policy, typically allowing the s3:GetObject and s3:ListBucket actions on the bucket.
- Use the COPY INTO statement to load the JSON data from the stage into the Snowflake table. In the statement you specify the target table, the source (a stage or an S3 location with credentials), and a file format of TYPE = 'JSON'.
- Verify that the data has been loaded into the Snowflake table by running a SELECT statement against the table.
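The steps above can also be sketched end to end using a single VARIANT column, a common pattern for semi-structured data in Snowflake; the table, stage, and key names here are placeholders:

```sql
-- Land the raw JSON in a single VARIANT column (names are placeholders)
CREATE TABLE raw_json (v VARIANT);

-- Load every JSON file found in the stage
COPY INTO raw_json
FROM @my_stage
FILE_FORMAT = (TYPE = 'JSON');

-- Extract typed values from the VARIANT with path notation and casts
SELECT v:id::NUMBER AS id, v:name::STRING AS name
FROM raw_json;
```

The advantage of this pattern is that the load never fails on schema mismatches; the JSON structure is interpreted at query time instead.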
An example COPY INTO statement looks like this:
COPY INTO target_table
FROM 's3://bucketname/folder/file.json'
CREDENTIALS = (aws_key_id='aws_access_key' aws_secret_key='aws_secret_key')
FILE_FORMAT = (TYPE = 'JSON' COMPRESSION = 'AUTO')
MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE;
Because a JSON file produces a single VARIANT value per record by default, MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE tells Snowflake to map the top-level JSON keys to the table's columns by name when loading into a multi-column table.
Note again that loading from S3 only works if the permissions are in place: the bucket's IAM policy must allow the Snowflake account to read the objects being loaded.
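On the Snowflake side, an alternative to embedding AWS keys in SQL is a storage integration, which delegates access to an IAM role. A sketch of that approach; the integration name, role ARN, account ID, and bucket path below are placeholders:

```sql
-- A storage integration ties Snowflake to an IAM role
-- (the integration name, ARN, and locations here are placeholders)
CREATE STORAGE INTEGRATION s3_int
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'S3'
  ENABLED = TRUE
  STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/snowflake-read-role'
  STORAGE_ALLOWED_LOCATIONS = ('s3://bucketname/folder/');

-- Stages can then reference the integration instead of raw keys
CREATE STAGE my_stage
  URL = 's3://bucketname/folder/'
  STORAGE_INTEGRATION = s3_int;
```

This keeps secret keys out of SQL statements and query history, which is why it is the approach generally recommended for production use.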
CREATE TABLE target_table (
  id NUMBER,
  name VARCHAR(100),
  email VARCHAR(100),
  age NUMBER,
  address VARCHAR(200),
  phone_number VARCHAR(20)
);
In this example, the table is named “target_table” and has six columns: id, name, email, age, address, and phone_number. The data types for each column are specified as NUMBER for the id and age columns, and VARCHAR for the rest of the columns.
Note that the column names and data types should match the structure of the JSON file you want to load into Snowflake, otherwise the data may not be loaded correctly or may generate errors.
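When the JSON keys do not line up exactly with the column names, the load can also be written with an explicit transformation. A sketch, assuming a stage named my_stage pointing at the JSON files:

```sql
-- Map JSON paths to table columns explicitly during the load;
-- $1 refers to each record of the staged JSON file as a VARIANT
COPY INTO target_table (id, name, email, age, address, phone_number)
FROM (
  SELECT $1:id, $1:name, $1:email, $1:age, $1:address, $1:phone_number
  FROM @my_stage
)
FILE_FORMAT = (TYPE = 'JSON');
```

This form makes renames and simple conversions explicit, at the cost of listing every column by hand.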
Also, note that DESCRIBE works on tables and stages, not on JSON files. To examine a JSON file's structure before creating the table, you can query the staged file directly (each record appears as a VARIANT in $1) or use the INFER_SCHEMA table function, and then create columns accordingly.
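To peek at a staged JSON file before defining any columns, the file can be queried directly. A sketch, assuming a stage named my_stage and a hypothetical file format name:

```sql
-- A named file format is needed to query staged files directly
CREATE FILE FORMAT my_json_format TYPE = 'JSON';

-- Each JSON record appears as a single VARIANT value in $1
SELECT $1
FROM @my_stage/file.json (FILE_FORMAT => 'my_json_format')
LIMIT 10;
```

Looking at a few records this way shows which keys exist and what types they hold, which makes writing the CREATE TABLE statement straightforward.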
SQL to create a stage in Snowflake that points to the S3 bucket
Here is an example of a SQL statement for creating a stage in Snowflake that points to an S3 bucket:
CREATE STAGE my_stage
URL = 's3://freshers-in/folder'
CREDENTIALS = (aws_key_id='aws_access_key' aws_secret_key='aws_secret_key');
In this example, the stage is named “my_stage” and points to the “folder” directory of the S3 bucket named “freshers-in”. The stage embeds the access credentials for the bucket as an AWS access key and secret key; this is convenient for testing, but for production Snowflake recommends storage integrations so that keys are not stored in DDL.
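With the stage in place, the load and the verification step from the procedure above look like this; the table and stage names follow the earlier examples:

```sql
-- Load from the stage rather than a raw S3 URL;
-- credentials come from the stage definition, not the COPY statement
COPY INTO target_table
FROM @my_stage
FILE_FORMAT = (TYPE = 'JSON')
MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE;

-- Verify that rows arrived
SELECT COUNT(*) FROM target_table;
```

Loading through a stage keeps the COPY statement free of credentials and lets the same stage be reused across many loads.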