Posts

Showing posts from 2020

Using a Google Cloud Function to save a JSON payload from a Google Storage event to Google BigQuery

As part of this article we will learn the following:
1. Create the Cloud Storage bucket.
2. Use a sample hierarchical JSON payload to auto-create the BigQuery table.
3. Create a Google Cloud Function that listens for the object-creation event on the storage bucket, fetches the JSON payload from the file, and loads it into BigQuery.

We will be using Python for this activity.

1. Create the Cloud Storage bucket

This is a simple task; you can create the bucket either from the console or with the gsutil command. For this exercise we will use the console, and we will ignore parameters such as region and bucket type, since we will not be using this bucket for large storage. Go to Cloud Storage -> Create Bucket, give the bucket a unique name (you can use your Google project ID plus some suffix), and click Create Bucket.

2. Use a sample hierarchical JSON payload to auto-create the BigQuery table

One of the quickest ways to create a BigQuery table for a JSON payload that contains arrays is to first create a sample JSON payload and use it to auto-create the BigQuery table.
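Below is a minimal sketch of the Cloud Function from step 3, assuming the file is newline-delimited JSON and a hypothetical target table my_project.my_dataset.my_table; the actual dataset, table, and schema handling in the post may differ.

# Background Cloud Function triggered by a google.storage.object.finalize event.
# Table name and newline-delimited JSON format are assumptions for illustration.
from google.cloud import bigquery

BQ_TABLE = "my_project.my_dataset.my_table"  # hypothetical target table

def gcs_to_bigquery(event, context):
    """Read the newly created object and load its JSON payload into BigQuery."""
    uri = f"gs://{event['bucket']}/{event['name']}"
    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
        autodetect=True,  # let BigQuery infer the hierarchical schema
    )
    load_job = client.load_table_from_uri(uri, BQ_TABLE, job_config=job_config)
    load_job.result()  # wait for the load job to finish

The bucket from step 1 can also be created from the command line with gsutil mb gs://<project-id>-<some-name> if you prefer not to use the console.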

Using SageMaker to train and serve a TensorFlow model

In this exercise we are going to use the Kaggle cats and dogs data. Some of the code is from the training course "TensorFlow in Practice".

Part 1: The first step towards building a machine learning model is to prepare the dataset. In this notebook we will:
- Download the Kaggle cats and dogs dataset
- Extract the zip
- Upload the dataset to an Amazon S3 bucket

# Download the Kaggle dataset
!wget --no-check-certificate \
    https://storage.googleapis.com/mledu-datasets/cats_and_dogs_filtered.zip \
    -O ./cats_and_dogs_filtered.zip

# Extract the zip file to a local directory
import os
import zipfile

local_zip = './cats_and_dogs_filtered.zip'
zip_ref = zipfile.ZipFile(local_zip, 'r')
zip_ref.extractall('./Data')
zip_ref.close()

Use the Amazon SDK to upload the data to an S3 bucket. I created a bucket named "sagemaker-05may2020842".

# Copy the data to AWS from local
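The excerpt cuts off at the S3 upload step; a minimal sketch of that step with boto3 is shown below. The bucket name comes from the post, while the key prefix, local directory, and helper function name are assumptions for illustration.

# Sketch: walk the extracted ./Data directory and upload each file to S3 with boto3.
# Bucket name is from the post; the prefix and helper name are illustrative only.
import os
import boto3

BUCKET = "sagemaker-05may2020842"
LOCAL_DIR = "./Data"
PREFIX = "cats_and_dogs"  # hypothetical key prefix inside the bucket

def upload_directory(local_dir=LOCAL_DIR, bucket=BUCKET, prefix=PREFIX):
    s3 = boto3.client("s3")
    for root, _, files in os.walk(local_dir):
        for fname in files:
            local_path = os.path.join(root, fname)
            rel_path = os.path.relpath(local_path, local_dir)  # keep the folder layout
            s3.upload_file(local_path, bucket, f"{prefix}/{rel_path}")

upload_directory()

Inside a SageMaker notebook the same upload can also be done with sagemaker.Session().upload_data(path='./Data', bucket=BUCKET, key_prefix=PREFIX), which avoids walking the directory by hand.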