-
GCP – Cloud Run In Action
-
GCP – Google Cloud CLI
GCP – Google Cloud CLI
Step-1: Install The Google Cloud CLI
https://docs.cloud.google.com/sdk/docs/install-sdk
Step-2: Initialize And Authorize The gcloud CLI
Step-3: Run Core Commands
Step-4: Check All Available Projects
gcloud projects list
Step-5: Switch Projects In gcloud
gcloud config set project sidekick-people-dev
Step-6: Resolving Errors – Reset Application Default Credentials (important)
gcloud auth application-default revoke
gcloud auth application-default login
Set The Project To The New Project:
gcloud init
How To Change The Region & Zone:
gcloud config set ai/region us-central1
gcloud config set run/region us-central1
gcloud config list
-
GCP – Staging Layer Creation
GCP Staging Layer Creation
What Is A Staging Area In A Data Pipeline:
A staging area is an intermediate storage layer (for example, a GCS bucket or a dedicated BigQuery dataset) where raw data lands before it is validated and transformed into the final tables.
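As a concrete illustration, a staging layer is often just a dedicated bucket plus a BigQuery dataset. A minimal sketch (the bucket and dataset names here are hypothetical, not from these notes):

```shell
# Hypothetical names: replace with your project's conventions
# A GCS bucket for raw files landing from sources
gcloud storage buckets create gs://my-project-staging --location=us-central1

# A BigQuery dataset holding staging tables before transformation
bq mk --dataset --location=us-central1 my_project:staging
```

Downstream jobs then read from the staging bucket/dataset and write cleaned data into the final layer.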
-
GCP – DataFlow Pipeline Using Flex Template.
GCP – DataFlow Pipeline Using Flex Template
Steps For Creating A DataFlow Pipeline:
Install Prerequisites:
1 – Python Setup: https://www.python.org/downloads/
2 – VS Code Setup:
3 – Docker Desktop Setup: https://www.docker.com/products/docker-desktop/
4 – Google Cloud SDK Setup: https://dl.google.com/dl/cloudsdk/channels/rapid/GoogleCloudSDKInstaller.exe
5 – VS Code Project Structure
6 – VS Code Extensions
7 – Authenticate With GCP
#1 Log In To Your GCP Account
gcloud auth login
#2 Set Application Default Credentials (important for local development)
gcloud auth application-default login
#3 Set Your Project
gcloud config set project ccoe-poc-admin
#4 Verify Authentication
gcloud auth list
8 – Set Up A Python Virtual Environment
python -m venv log_monitoring_venv
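Once the virtual environment exists, the local flow typically continues with installing the Beam SDK, building the Flex Template, and launching a job from it. A hedged sketch — the bucket, image path, and `main.py` entry point below are assumptions, not from these notes:

```shell
# Activate the venv and install the Beam SDK with GCP extras
source log_monitoring_venv/bin/activate   # on Windows: log_monitoring_venv\Scripts\activate
pip install "apache-beam[gcp]"

# Build a Flex Template: packages the pipeline into a Docker image
# and writes a template spec JSON to GCS (paths are hypothetical)
gcloud dataflow flex-template build gs://my-bucket/templates/log_monitoring.json \
  --image-gcr-path "us-central1-docker.pkg.dev/ccoe-poc-admin/repo/log-monitoring:latest" \
  --sdk-language PYTHON \
  --flex-template-base-image PYTHON3 \
  --py-path . \
  --env FLEX_TEMPLATE_PYTHON_PY_FILE=main.py

# Launch a Dataflow job from the built template
gcloud dataflow flex-template run "log-monitoring-job" \
  --template-file-gcs-location gs://my-bucket/templates/log_monitoring.json \
  --region us-central1
```

The build step needs Docker running locally (prerequisite 3 above) and a GCS bucket to hold the template spec.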
-
GCP – IAM Service Account Creation
GCP – IAM Service Account Creation Steps For Creating IAM Service Account:
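The creation steps can also be done from the CLI. A minimal sketch — the service-account name `dataflow-runner` and the role choice are assumptions; the project ID `ccoe-poc-admin` comes from the earlier authentication step:

```shell
# Create a service account (hypothetical name)
gcloud iam service-accounts create dataflow-runner \
  --display-name="Dataflow pipeline runner"

# Grant it a role at the project level (adjust to least privilege)
gcloud projects add-iam-policy-binding ccoe-poc-admin \
  --member="serviceAccount:dataflow-runner@ccoe-poc-admin.iam.gserviceaccount.com" \
  --role="roles/dataflow.worker"

# Verify it was created
gcloud iam service-accounts list
```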
-
GCP – Google Cloud Storage
GCP – Google Cloud Storage Steps For Creating Google Cloud Storage:
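A bucket can equally be created from the CLI. A minimal sketch, assuming a hypothetical bucket name and the region used elsewhere in these notes:

```shell
# Bucket names are globally unique: replace with your own
gcloud storage buckets create gs://my-log-monitoring-bucket \
  --project=ccoe-poc-admin \
  --location=us-central1 \
  --uniform-bucket-level-access

# Confirm the bucket exists
gcloud storage ls
```

Uniform bucket-level access is enabled so permissions are managed only through IAM rather than per-object ACLs.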
-
GCP – DataFlow Pipeline Creation
GCP – DataFlow Pipeline Creation Installation Steps:
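After installation, a Beam pipeline written in Python can be submitted straight to Dataflow by passing pipeline options on the command line. A hedged sketch — `main.py` and the bucket paths are hypothetical placeholders for your pipeline:

```shell
# Submit a local Beam script to the Dataflow service
python main.py \
  --runner DataflowRunner \
  --project ccoe-poc-admin \
  --region us-central1 \
  --temp_location gs://my-bucket/temp \
  --staging_location gs://my-bucket/staging
```

`temp_location` and `staging_location` must point at a GCS bucket the job's service account can write to.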
-
GCP – BigQuery Table Creation
GCP – BigQuery Table Creation Steps For Creating A BigQuery Table:
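With the `bq` command-line tool the same can be done in two commands. A minimal sketch — the dataset name, table name, and schema below are assumptions for a log-monitoring use case, not from these notes:

```shell
# Create a dataset, then a table with an inline schema (hypothetical names)
bq mk --dataset --location=us-central1 ccoe-poc-admin:log_monitoring
bq mk --table ccoe-poc-admin:log_monitoring.logs \
  timestamp:TIMESTAMP,severity:STRING,message:STRING

# Inspect the resulting table definition
bq show ccoe-poc-admin:log_monitoring.logs
```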
-
GCP – DataFlow Integration
GCP – DataFlow Integration Steps For Integrating Pub/Sub With DataFlow
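One common way to wire Pub/Sub into Dataflow is Google's provided Pub/Sub-to-BigQuery template, which streams messages from a topic into a table. A hedged sketch — the topic and table names are hypothetical and assume the table's schema matches the message payload:

```shell
# Launch the Google-provided streaming template (names are hypothetical)
gcloud dataflow jobs run pubsub-to-bq \
  --gcs-location gs://dataflow-templates-us-central1/latest/PubSub_to_BigQuery \
  --region us-central1 \
  --parameters \
inputTopic=projects/ccoe-poc-admin/topics/log-topic,\
outputTableSpec=ccoe-poc-admin:log_monitoring.logs
```

Because this is a streaming job, it runs until cancelled; messages published to the topic appear in the BigQuery table within seconds.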
-
GCP – Log Routing From One Project To Another.
GCP – Log Routing
Steps For Routing The Logs:
Step-1: Create A Pub/Sub Topic In The Destination Project
- Give the topic a name.
- Select "Add a default subscription".
  - Q: A subscriber is normally a separate consumer, so why are we making Pub/Sub itself the subscriber?
  - Q: What happens if I uncheck this option?
- Create a transform:
  - Q: Can I add a transform at a later stage?
- Check "Google-managed encryption key".
Step-2: Create A Log Sink In The Source Project
Step-3: Grant The Sink's Identity The Publisher Role On The Destination Topic
Step-4: How To Check Whether The Logs Are Reaching Pub/Sub
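The four steps above can be sketched end to end from the CLI. The project IDs, topic name, and log filter here are hypothetical; the sink's writer identity is printed when the sink is created (or by `gcloud logging sinks describe`):

```shell
# Step-1 (destination project): create the topic and a default subscription
gcloud pubsub topics create log-topic --project=dest-project
gcloud pubsub subscriptions create log-topic-sub \
  --topic=log-topic --project=dest-project

# Step-2 (source project): create a sink routing matching logs to the topic
gcloud logging sinks create to-dest-pubsub \
  pubsub.googleapis.com/projects/dest-project/topics/log-topic \
  --log-filter='severity>=ERROR' --project=source-project

# Step-3: grant the sink's writer identity the Publisher role on the topic
# (replace SINK_WRITER_IDENTITY with the service account the sink reports)
gcloud pubsub topics add-iam-policy-binding log-topic --project=dest-project \
  --member="serviceAccount:SINK_WRITER_IDENTITY" \
  --role="roles/pubsub.publisher"

# Step-4: verify logs are arriving by pulling a few messages
gcloud pubsub subscriptions pull log-topic-sub \
  --project=dest-project --limit=5 --auto-ack
```

If the pull in Step-4 returns nothing, the usual culprits are a filter that matches no logs or a missing Publisher binding from Step-3.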
