
Good news: Associate-Cloud-Engineer (Google Cloud Certified - Associate Cloud Engineer) is now stable and delivering passing results.

Associate-Cloud-Engineer Practice Exam Questions and Answers

Google Cloud Certified - Associate Cloud Engineer

Last Update 21 hours ago
Total Questions : 266

Associate-Cloud-Engineer is now stable, with all of the latest exam questions added 21 hours ago. Just download our full package and start your journey toward the Google Cloud Certified - Associate Cloud Engineer certification. All of these Google Associate-Cloud-Engineer practice exam questions are real and have been verified by our experts in the related industry fields.

Associate-Cloud-Engineer PDF

Associate-Cloud-Engineer PDF (Printable)
$48
$119.99

Associate-Cloud-Engineer Testing Engine

Associate-Cloud-Engineer Testing Engine
$56
$139.99

Associate-Cloud-Engineer PDF + Testing Engine

Associate-Cloud-Engineer PDF + Testing Engine
$70.8
$176.99
Question # 1

You have a workload running on Compute Engine that is critical to your business. You want to ensure that the data on the boot disk of this workload is backed up regularly. You need to be able to restore a backup as quickly as possible in case of disaster. You also want older backups to be cleaned automatically to save on cost. You want to follow Google-recommended practices. What should you do?

Options:

A.  

Create a Cloud Function to create an instance template.

B.  

Create a snapshot schedule for the disk using the desired interval.

C.  

Create a cron job to create a new disk from the disk using gcloud.

D.  

Create a Cloud Task to create an image and export it to Cloud Storage.

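The snapshot-schedule approach in option B can be set up entirely from the CLI. A minimal sketch, assuming hypothetical names (`daily-backup`, `my-boot-disk`) and a `us-central1` location: create a schedule with automatic retention-based cleanup, then attach it to the boot disk.

```shell
# Daily snapshot schedule; snapshots older than 14 days are deleted
# automatically, which covers the cost-cleanup requirement.
gcloud compute resource-policies create snapshot-schedule daily-backup \
    --region=us-central1 \
    --max-retention-days=14 \
    --daily-schedule \
    --start-time=04:00

# Attach the schedule to the workload's boot disk.
gcloud compute disks add-resource-policies my-boot-disk \
    --resource-policies=daily-backup \
    --zone=us-central1-a
```

Restoring is then a matter of creating a new disk from the most recent snapshot, which is faster than rebuilding from exports.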
Question # 2

You have an application that uses Cloud Spanner as a backend database. The application has a very predictable traffic pattern. You want to automatically scale up or down the number of Spanner nodes depending on traffic. What should you do?

Options:

A.  

Create a cron job that runs on a scheduled basis to review stackdriver monitoring metrics, and then resize the Spanner instance accordingly.

B.  

Create a Stackdriver alerting policy to send an alert to oncall SRE emails when Cloud Spanner CPU exceeds the threshold. SREs would scale resources up or down accordingly.

C.  

Create a Stackdriver alerting policy to send an alert to Google Cloud Support email when Cloud Spanner CPU exceeds your threshold. Google support would scale resources up or down accordingly.

D.  

Create a Stackdriver alerting policy to send an alert to webhook when Cloud Spanner CPU is over or under your threshold. Create a Cloud Function that listens to HTTP and resizes Spanner resources accordingly.

Question # 3

You are developing a financial trading application that will be used globally. Data is stored and queried using a relational structure, and clients from all over the world should get the exact identical state of the data. The application will be deployed in multiple regions to provide the lowest latency to end users. You need to select a storage option for the application data while minimizing latency. What should you do?

Options:

A.  

Use Cloud Bigtable for data storage.

B.  

Use Cloud SQL for data storage.

C.  

Use Cloud Spanner for data storage.

D.  

Use Firestore for data storage.

Question # 4

You need to provide a cost estimate for a Kubernetes cluster using the GCP pricing calculator for Kubernetes. Your workload requires high IOPs, and you will also be using disk snapshots. You start by entering the number of nodes, average hours, and average days. What should you do next?

Options:

A.  

Fill in local SSD. Fill in persistent disk storage and snapshot storage.

B.  

Fill in local SSD. Add estimated cost for cluster management.

C.  

Select Add GPUs. Fill in persistent disk storage and snapshot storage.

D.  

Select Add GPUs. Add estimated cost for cluster management.

Question # 5

You are creating a Google Kubernetes Engine (GKE) cluster with a cluster autoscaler feature enabled. You need to make sure that each node of the cluster will run a monitoring pod that sends container metrics to a third-party monitoring solution. What should you do?

Options:

A.  

Deploy the monitoring pod in a StatefulSet object.

B.  

Deploy the monitoring pod in a DaemonSet object.

C.  

Reference the monitoring pod in a Deployment object.

D.  

Reference the monitoring pod in a cluster initializer at the GKE cluster creation time.

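A DaemonSet (option B) schedules exactly one pod per node, including nodes the autoscaler adds later. A minimal sketch, where the manifest file name, labels, and agent image are assumptions:

```shell
# Write a minimal DaemonSet manifest; the image is a hypothetical
# placeholder for the third-party monitoring agent.
cat > monitoring-ds.yaml <<'EOF'
apiVersion: apps/v1
kind: DaemonSet
metadata:
  name: node-monitoring
spec:
  selector:
    matchLabels:
      app: node-monitoring
  template:
    metadata:
      labels:
        app: node-monitoring
    spec:
      containers:
      - name: agent
        image: example.com/monitoring-agent:latest  # hypothetical image
EOF
echo "manifest written"
# Apply it (requires cluster access):
# kubectl apply -f monitoring-ds.yaml
```

A Deployment, by contrast, gives no per-node guarantee, which is why it does not satisfy the requirement.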
Question # 6

Your team maintains the infrastructure for your organization. The current infrastructure requires changes. You need to share your proposed changes with the rest of the team. You want to follow Google’s recommended best practices. What should you do?

Options:

A.  

Use Deployment Manager templates to describe the proposed changes and store them in a Cloud Storage bucket.

B.  

Use Deployment Manager templates to describe the proposed changes and store them in Cloud Source Repositories.

C.  

Apply the change in a development environment, run gcloud compute instances list, and then save the output in a shared Storage bucket.

D.  

Apply the change in a development environment, run gcloud compute instances list, and then save the output in Cloud Source Repositories.

Question # 7

You have been asked to set up Object Lifecycle Management for objects stored in storage buckets. The objects are written once and accessed frequently for 30 days. After 30 days, the objects are not read again unless there is a special need. The object should be kept for three years, and you need to minimize cost. What should you do?

Options:

A.  

Set up a policy that uses Nearline storage for 30 days and then moves to Archive storage for three years.

B.  

Set up a policy that uses Standard storage for 30 days and then moves to Archive storage for three years.

C.  

Set up a policy that uses Nearline storage for 30 days, then moves the Coldline for one year, and then moves to Archive storage for two years.

D.  

Set up a policy that uses Standard storage for 30 days, then moves to Coldline for one year, and then moves to Archive storage for two years.

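The policy in option B maps directly onto a Cloud Storage lifecycle configuration. A sketch, where the bucket name is an assumption: objects are written to Standard, moved to Archive after 30 days, and deleted after three years (1095 days).

```shell
# Lifecycle config: Standard for 30 days, then Archive, deleted at 3 years.
cat > lifecycle.json <<'EOF'
{
  "rule": [
    {"action": {"type": "SetStorageClass", "storageClass": "ARCHIVE"},
     "condition": {"age": 30}},
    {"action": {"type": "Delete"},
     "condition": {"age": 1095}}
  ]
}
EOF
python3 -m json.tool lifecycle.json > /dev/null && echo "valid JSON"
# Apply it (requires bucket access):
# gsutil lifecycle set lifecycle.json gs://example-bucket
```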
Question # 8

Your company has a large quantity of unstructured data in different file formats. You want to perform ETL transformations on the data. You need to make the data accessible on Google Cloud so it can be processed by a Dataflow job. What should you do?

Options:

A.  

Upload the data to BigQuery using the bq command line tool.

B.  

Upload the data to Cloud Storage using the gsutil command line tool.

C.  

Upload the data into Cloud SQL using the import function in the console.

D.  

Upload the data into Cloud Spanner using the import function in the console.

Question # 9

You have production and test workloads that you want to deploy on Compute Engine. Production VMs need to be in a different subnet than the test VMs. All the VMs must be able to reach each other over internal IP without creating additional routes. You need to set up VPC and the 2 subnets. Which configuration meets these requirements?

Options:

A.  

Create a single custom VPC with 2 subnets. Create each subnet in a different region and with a different CIDR range.

B.  

Create a single custom VPC with 2 subnets. Create each subnet in the same region and with the same CIDR range.

C.  

Create 2 custom VPCs, each with a single subnet. Create each subnet in a different region and with a different CIDR range.

D.  

Create 2 custom VPCs, each with a single subnet. Create each subnet in the same region and with the same CIDR range.

Question # 10

You are assigned to maintain a Google Kubernetes Engine (GKE) cluster named dev that was deployed on Google Cloud. You want to manage the GKE configuration using the command line interface (CLI). You have just downloaded and installed the Cloud SDK. You want to ensure that future CLI commands by default address this specific cluster. What should you do?

Options:

A.  

Use the command gcloud config set container/cluster dev.

B.  

Use the command gcloud container clusters update dev.

C.  

Create a file called gke.default in the ~/.gcloud folder that contains the cluster name.

D.  

Create a file called defaults.json in the ~/.gcloud folder that contains the cluster name.

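Option A in practice, with an assumed zone: set the default cluster property so subsequent `gcloud container` commands target `dev`, and fetch its credentials so `kubectl` does too.

```shell
# Make "dev" the default cluster for gcloud container commands.
gcloud config set container/cluster dev

# Point kubectl at the same cluster (zone is an assumption).
gcloud container clusters get-credentials dev --zone=us-central1-a
```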
Question # 11

You have a Dockerfile that you need to deploy on Kubernetes Engine. What should you do?

Options:

A.  

Use kubectl app deploy .

B.  

Use gcloud app deploy .

C.  

Create a docker image from the Dockerfile and upload it to Container Registry. Create a Deployment YAML file to point to that image. Use kubectl to create the deployment with that file.

D.  

Create a docker image from the Dockerfile and upload it to Cloud Storage. Create a Deployment YAML file to point to that image. Use kubectl to create the deployment with that file.

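The workflow in option C can be sketched as follows; the project ID, image tag, and manifest file name are assumptions:

```shell
# Build the image from the Dockerfile and push it to Container Registry.
docker build -t gcr.io/my-project/my-app:v1 .
docker push gcr.io/my-project/my-app:v1

# deployment.yaml references image: gcr.io/my-project/my-app:v1
kubectl create -f deployment.yaml
```

Cloud Storage (option D) cannot serve container images to GKE, which is why the image must go to a registry.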
Question # 12

You need to select and configure compute resources for a set of batch processing jobs. These jobs take around 2 hours to complete and are run nightly. You want to minimize service costs. What should you do?

Options:

A.  

Select Google Kubernetes Engine. Use a single-node cluster with a small instance type.

B.  

Select Google Kubernetes Engine. Use a three-node cluster with micro instance types.

C.  

Select Compute Engine. Use preemptible VM instances of the appropriate standard machine type.

D.  

Select Compute Engine. Use VM instance types that support micro bursting.

Question # 13

You are working with a Cloud SQL MySQL database at your company. You need to retain a month-end copy of the database for three years for audit purposes. What should you do?

Options:

A.  

Save the automatic first-of-the-month backup for three years. Store the backup file in an Archive class Cloud Storage bucket.

B.  

Convert the automatic first-of-the-month backup to an export file. Write the export file to a Coldline class Cloud Storage bucket.

C.  

Set up an export job for the first of the month. Write the export file to an Archive class Cloud Storage bucket.

D.  

Set up an on-demand backup for the first of the month. Write the backup to an Archive class Cloud Storage bucket.

Question # 14

You are building a new version of an application hosted in an App Engine environment. You want to test the new version with 1% of users before you completely switch your application over to the new version. What should you do?

Options:

A.  

Deploy a new version of your application in Google Kubernetes Engine instead of App Engine and then use GCP Console to split traffic.

B.  

Deploy a new version of your application in a Compute Engine instance instead of App Engine and then use GCP Console to split traffic.

C.  

Deploy a new version as a separate app in App Engine. Then configure App Engine using GCP Console to split traffic between the two apps.

D.  

Deploy a new version of your application in App Engine. Then go to App Engine settings in GCP Console and split traffic between the current version and newly deployed versions accordingly.

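A hedged sketch of option D, assuming version IDs `v1` (current) and `v2` (new): deploy the new version without promoting it, then split 1% of traffic to it.

```shell
# Deploy the new version but keep traffic on the current one.
gcloud app deploy --version=v2 --no-promote

# Send 1% of traffic to v2 on the default service.
gcloud app services set-traffic default --splits=v1=0.99,v2=0.01
```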
Question # 15

You have developed an application that consists of multiple microservices, with each microservice packaged in its own Docker container image. You want to deploy the entire application on Google Kubernetes Engine so that each microservice can be scaled individually. What should you do?

Options:

A.  

Create and deploy a Custom Resource Definition per microservice.

B.  

Create and deploy a Docker Compose File.

C.  

Create and deploy a Job per microservice.

D.  

Create and deploy a Deployment per microservice.

Question # 16

Your projects incurred more costs than you expected last month. Your research reveals that a development GKE container emitted a huge number of logs, which resulted in higher costs. You want to disable the logs quickly using the minimum number of steps. What should you do?

Options:

A.  

1. Go to the Logs ingestion window in Stackdriver Logging, and disable the log source for the GKE container resource.

B.  

1. Go to the Logs ingestion window in Stackdriver Logging, and disable the log source for the GKE Cluster Operations resource.

C.  

1. Go to the GKE console, and delete existing clusters.2. Recreate a new cluster.3. Clear the option to enable legacy Stackdriver Logging.

D.  

1. Go to the GKE console, and delete existing clusters.2. Recreate a new cluster.3. Clear the option to enable legacy Stackdriver Monitoring.

Question # 17

You need to configure optimal data storage for files stored in Cloud Storage for minimal cost. The files are used in a mission-critical analytics pipeline that is used continually. The users are in Boston, MA (United States). What should you do?

Options:

A.  

Configure regional storage for the region closest to the users. Configure a Nearline storage class.

B.  

Configure regional storage for the region closest to the users. Configure a Standard storage class.

C.  

Configure dual-regional storage for the dual region closest to the users. Configure a Nearline storage class.

D.  

Configure dual-regional storage for the dual region closest to the users. Configure a Standard storage class.

Question # 18

Your company has a Google Cloud Platform project that uses BigQuery for data warehousing. Your data science team changes frequently and has few members. You need to allow members of this team to perform queries. You want to follow Google-recommended practices. What should you do?

Options:

A.  

1. Create an IAM entry for each data scientist's user account.2. Assign the BigQuery jobUser role to the group.

B.  

1. Create an IAM entry for each data scientist's user account.2. Assign the BigQuery dataViewer user role to the group.

C.  

1. Create a dedicated Google group in Cloud Identity.2. Add each data scientist's user account to the group.3. Assign the BigQuery jobUser role to the group.

D.  

1. Create a dedicated Google group in Cloud Identity.2. Add each data scientist's user account to the group.3. Assign the BigQuery dataViewer user role to the group.

Question # 19

You are deploying an application to App Engine. You want the number of instances to scale based on request rate. You need at least 3 unoccupied instances at all times. Which scaling type should you use?

Options:

A.  

Manual Scaling with 3 instances.

B.  

Basic Scaling with min_instances set to 3.

C.  

Basic Scaling with max_instances set to 3.

D.  

Automatic Scaling with min_idle_instances set to 3.

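Option D corresponds to an `app.yaml` fragment like the following; the runtime is an assumption:

```shell
# Automatic scaling with at least 3 idle instances kept warm for
# traffic spikes.
cat > app.yaml <<'EOF'
runtime: python39
automatic_scaling:
  min_idle_instances: 3
EOF
cat app.yaml
```

`min_idle_instances` exists only under automatic scaling, which is why basic or manual scaling cannot satisfy the "unoccupied instances" requirement.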
Question # 20

You need to verify that a Google Cloud Platform service account was created at a particular time. What should you do?

Options:

A.  

Filter the Activity log to view the Configuration category. Filter the Resource type to Service Account.

B.  

Filter the Activity log to view the Configuration category. Filter the Resource type to Google Project.

C.  

Filter the Activity log to view the Data Access category. Filter the Resource type to Service Account.

D.  

Filter the Activity log to view the Data Access category. Filter the Resource type to Google Project.

Question # 21

You are building a pipeline to process time-series data. Which Google Cloud Platform services should you put in boxes 1,2,3, and 4?

[Pipeline diagram with four numbered boxes; image not shown]

Options:

A.  

Cloud Pub/Sub, Cloud Dataflow, Cloud Datastore, BigQuery

B.  

Firebase Messages, Cloud Pub/Sub, Cloud Spanner, BigQuery

C.  

Cloud Pub/Sub, Cloud Storage, BigQuery, Cloud Bigtable

D.  

Cloud Pub/Sub, Cloud Dataflow, Cloud Bigtable, BigQuery

Question # 22

You have sensitive data stored in three Cloud Storage buckets and have enabled data access logging. You want to verify activities for a particular user for these buckets, using the fewest possible steps. You need to verify the addition of metadata labels and which files have been viewed from those buckets. What should you do?

Options:

A.  

Using the GCP Console, filter the Activity log to view the information.

B.  

Using the GCP Console, filter the Stackdriver log to view the information.

C.  

View the bucket in the Storage section of the GCP Console.

D.  

Create a trace in Stackdriver to view the information.

Question # 23

You need to update a deployment in Deployment Manager without any resource downtime in the deployment. Which command should you use?

Options:

A.  

gcloud deployment-manager deployments create --config

B.  

gcloud deployment-manager deployments update --config

C.  

gcloud deployment-manager resources create --config

D.  

gcloud deployment-manager resources update --config

Question # 24

You are designing an application that uses WebSockets and HTTP sessions that are not distributed across the web servers. You want to ensure the application runs properly on Google Cloud Platform. What should you do?

Options:

A.  

Meet with the cloud enablement team to discuss load balancer options.

B.  

Redesign the application to use a distributed user session service that does not rely on WebSockets and HTTP sessions.

C.  

Review the encryption requirements for WebSocket connections with the security team.

D.  

Convert the WebSocket code to use HTTP streaming.

Question # 25

You want to permanently delete a Pub/Sub topic managed by Config Connector in your Google Cloud project. What should you do?

Options:

A.  

Use kubectl to delete the topic resource.

B.  

Use gcloud CLI to delete the topic.

C.  

Use kubectl to create the label deleted-by-cnrm and to change its value to true for the topic resource.

D.  

Use gcloud CLI to update the topic label managed-by-cnrm to false.

Question # 26

You have designed a solution on Google Cloud Platform (GCP) that uses multiple GCP products. Your company has asked you to estimate the costs of the solution. You need to provide estimates for the monthly total cost. What should you do?

Options:

A.  

For each GCP product in the solution, review the pricing details on the products pricing page. Use the pricing calculator to total the monthly costs for each GCP product.

B.  

For each GCP product in the solution, review the pricing details on the products pricing page. Create a Google Sheet that summarizes the expected monthly costs for each product.

C.  

Provision the solution on GCP. Leave the solution provisioned for 1 week. Navigate to the Billing Report page in the Google Cloud Platform Console. Multiply the 1 week cost to determine the monthly costs.

D.  

Provision the solution on GCP. Leave the solution provisioned for 1 week. Use Stackdriver to determine the provisioned and used resource amounts. Multiply the 1 week cost to determine the monthly costs.

Question # 27

You are migrating a business critical application from your local data center into Google Cloud. As part of your high-availability strategy, you want to ensure that any data used by the application will be immediately available if a zonal failure occurs. What should you do?

Options:

A.  

Store the application data on a zonal persistent disk. Create a snapshot schedule for the disk. If an outage occurs, create a new disk from the most recent snapshot and attach it to a new VM in another zone.

B.  

Store the application data on a zonal persistent disk. If an outage occurs, create an instance in another zone with this disk attached.

C.  

Store the application data on a regional persistent disk. Create a snapshot schedule for the disk. If an outage occurs, create a new disk from the most recent snapshot and attach it to a new VM in another zone.

D.  

Store the application data on a regional persistent disk If an outage occurs, create an instance in another zone with this disk attached.

Question # 28

You need to create a new billing account and then link it with an existing Google Cloud Platform project. What should you do?

Options:

A.  

Verify that you are Project Billing Manager for the GCP project. Update the existing project to link it to the existing billing account.

B.  

Verify that you are Project Billing Manager for the GCP project. Create a new billing account and link the new billing account to the existing project.

C.  

Verify that you are Billing Administrator for the billing account. Create a new project and link the new project to the existing billing account.

D.  

Verify that you are Billing Administrator for the billing account. Update the existing project to link it to the existing billing account.

Question # 29

The DevOps group in your organization needs full control of Compute Engine resources in your development project. However, they should not have permission to create or update any other resources in the project. You want to follow Google's recommendations for setting permissions for the DevOps group. What should you do?

Options:

A.  

Grant the basic role roles/viewer and the predefined role roles/compute.admin to the DevOps group.

B.  

Create an IAM policy and grant all compute.instanceAdmin.* permissions to the policy. Attach the policy to the DevOps group.

C.  

Create a custom role at the folder level and grant all compute.instanceAdmin.* permissions to the role. Grant the custom role to the DevOps group.

D.  

Grant the basic role roles/editor to the DevOps group.

Question # 30

You are building an application that processes data files uploaded from thousands of suppliers. Your primary goals for the application are data security and the expiration of aged data. You need to design the application to:

•Restrict access so that suppliers can access only their own data.

•Give suppliers write access to data only for 30 minutes.

•Delete data that is over 45 days old.

You have a very short development cycle, and you need to make sure that the application requires minimal maintenance. Which two strategies should you use? (Choose two.)

Options:

A.  

Build a lifecycle policy to delete Cloud Storage objects after 45 days.

B.  

Use signed URLs to allow suppliers limited time access to store their objects.

C.  

Set up an SFTP server for your application, and create a separate user for each supplier.

D.  

Build a Cloud function that triggers a timer of 45 days to delete objects that have expired.

E.  

Develop a script that loops through all Cloud Storage buckets and deletes any buckets that are older than 45 days.

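The 30-minute write window in option B maps to a signed URL with a short expiry. A sketch, where the service-account key file, bucket, and object path are assumptions:

```shell
# Issue a supplier a URL that permits uploading one object for 30 minutes.
gsutil signurl -d 30m -m PUT service-account-key.json \
    gs://example-uploads/supplier-123/data.csv
```

The 45-day expiration in option A is then a one-time bucket lifecycle rule, so neither requirement needs custom server code to maintain.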
Question # 31

You want to find out when users were added to Cloud Spanner Identity Access Management (IAM) roles on your Google Cloud Platform (GCP) project. What should you do in the GCP Console?

Options:

A.  

Open the Cloud Spanner console to review configurations.

B.  

Open the IAM & admin console to review IAM policies for Cloud Spanner roles.

C.  

Go to the Stackdriver Monitoring console and review information for Cloud Spanner.

D.  

Go to the Stackdriver Logging console, review admin activity logs, and filter them for Cloud Spanner IAM roles.

Question # 32

Your company is using Google Workspace to manage employee accounts. Anticipated growth will increase the number of personnel from 100 employees to 1,000 employees within 2 years. Most employees will need access to your company's Google Cloud account. The systems and processes will need to support 10x growth without performance degradation, unnecessary complexity, or security issues. What should you do?

Options:

A.  

Migrate the users to Active Directory. Connect the Human Resources system to Active Directory. Turn on Google Cloud Directory Sync (GCDS) for Cloud Identity. Turn on Identity Federation from Cloud Identity to Active Directory.

B.  

Organize the users in Cloud Identity into groups. Enforce multi-factor authentication in Cloud Identity.

C.  

Turn on identity federation between Cloud Identity and Google Workspace. Enforce multi-factor authentication for domain wide delegation.

D.  

Use a third-party identity provider service through federation. Synchronize the users from Google Workspace to the third-party provider in real time.

Question # 33

Your customer has implemented a solution that uses Cloud Spanner and notices some read latency-related performance issues on one table. This table is accessed only by their users using a primary key. The table schema is shown below.

[Image: table schema; not shown]

You want to resolve the issue. What should you do?

[Image: answer options A-D; not shown]

Options:

A.  

Option A

B.  

Option B

C.  

Option C

D.  

Option D

Question # 34

You have created an application that is packaged into a Docker image. You want to deploy the Docker image as a workload on Google Kubernetes Engine. What should you do?

Options:

A.  

Upload the image to Cloud Storage and create a Kubernetes Service referencing the image.

B.  

Upload the image to Cloud Storage and create a Kubernetes Deployment referencing the image.

C.  

Upload the image to Container Registry and create a Kubernetes Service referencing the image.

D.  

Upload the image to Container Registry and create a Kubernetes Deployment referencing the image.

Question # 35

The storage costs for your application logs have far exceeded the project budget. The logs are currently being retained indefinitely in the Cloud Storage bucket myapp-gcp-ace-logs. You have been asked to remove logs older than 90 days from your Cloud Storage bucket. You want to optimize ongoing Cloud Storage spend. What should you do?

Options:

A.  

Write a script that runs gsutil ls -l gs://myapp-gcp-ace-logs/** to find and remove items older than 90 days. Schedule the script with cron.

B.  

Write a lifecycle management rule in JSON and push it to the bucket with gsutil lifecycle set config-json-file.

C.  

Write a lifecycle management rule in XML and push it to the bucket with gsutil lifecycle set config-xml-file.

D.  

Write a script that runs gsutil ls -lr gs://myapp-gcp-ace-logs/** to find and remove items older than 90 days. Repeat this process every morning.

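Option B end to end, using the bucket name from the question: a one-rule JSON lifecycle configuration that deletes objects older than 90 days, applied once with `gsutil lifecycle set`.

```shell
# One rule: delete any object whose age exceeds 90 days.
cat > config-json-file <<'EOF'
{"rule": [{"action": {"type": "Delete"}, "condition": {"age": 90}}]}
EOF
python3 -m json.tool config-json-file > /dev/null && echo "valid JSON"
# Apply it (requires bucket access):
# gsutil lifecycle set config-json-file gs://myapp-gcp-ace-logs
```

Unlike the cron-script options, the rule runs inside Cloud Storage with no compute resources to pay for or maintain.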
Question # 36

You are building an archival solution for your data warehouse and have selected Cloud Storage to archive your data. Your users need to be able to access this archived data once a quarter for some regulatory requirements. You want to select a cost-efficient option. Which storage option should you use?

Options:

A.  

Coldline Storage

B.  

Nearline Storage

C.  

Regional Storage

D.  

Multi-Regional Storage

Question # 37

You are using Deployment Manager to create a Google Kubernetes Engine cluster. Using the same Deployment Manager deployment, you also want to create a DaemonSet in the kube-system namespace of the cluster. You want a solution that uses the fewest possible services. What should you do?

Options:

A.  

Add the cluster’s API as a new Type Provider in Deployment Manager, and use the new type to create the DaemonSet.

B.  

Use the Deployment Manager Runtime Configurator to create a new Config resource that contains the DaemonSet definition.

C.  

With Deployment Manager, create a Compute Engine instance with a startup script that uses kubectl to create the DaemonSet.

D.  

In the cluster’s definition in Deployment Manager, add a metadata that has kube-system as key and the DaemonSet manifest as value.

Question # 38

Your company runs its Linux workloads on Compute Engine instances. Your company will be working with a new operations partner that does not use Google Accounts. You need to grant access to the instances to your operations partner so they can maintain the installed tooling. What should you do?

Options:

A.  

Enable Cloud IAP for the Compute Engine instances, and add the operations partner as a Cloud IAP Tunnel User.

B.  

Tag all the instances with the same network tag. Create a firewall rule in the VPC to grant TCP access on port 22 for traffic from the operations partner to instances with the network tag.

C.  

Set up Cloud VPN between your Google Cloud VPC and the internal network of the operations partner.

D.  

Ask the operations partner to generate SSH key pairs, and add the public keys to the VM instances.

Question # 39

You need to reduce GCP service costs for a division of your company using the fewest possible steps. You need to turn off all configured services in an existing GCP project. What should you do?

Options:

A.  

1. Verify that you are assigned the Project Owners IAM role for this project.

2. Locate the project in the GCP console, click Shut down and then enter the project ID.

B.  

1. Verify that you are assigned the Project Owners IAM role for this project.

2. Switch to the project in the GCP console, locate the resources and delete them.

C.  

1. Verify that you are assigned the Organizational Administrator IAM role for this project.

2. Locate the project in the GCP console, enter the project ID and then click Shut down.

D.  

1. Verify that you are assigned the Organizational Administrators IAM role for this project.

2. Switch to the project in the GCP console, locate the resources and delete them.

Question # 40

An employee was terminated, but their access to Google Cloud Platform (GCP) was not removed until 2 weeks later. You need to find out whether this employee accessed any sensitive customer information after their termination. What should you do?

Options:

A.  

View System Event Logs in Stackdriver. Search for the user’s email as the principal.

B.  

View System Event Logs in Stackdriver. Search for the service account associated with the user.

C.  

View Data Access audit logs in Stackdriver. Search for the user’s email as the principal.

D.  

View the Admin Activity log in Stackdriver. Search for the service account associated with the user.

Question # 41

You created a Kubernetes deployment by running kubectl run nginx --image=nginx --labels=app=prod. Your Kubernetes cluster is also used by a number of other deployments. How can you find the identifier of the pods for this nginx deployment?

Options:

A.  

kubectl get deployments --output=pods

B.  

gcloud get pods --selector="app=prod"

C.  

kubectl get pods -l "app=prod"

D.  

gcloud list gke-deployments -filter={pod }

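Option C uses kubectl's label selector flag, -l (lowercase L), which filters pods by the label applied at deployment time:

```shell
# List only the pods carrying the app=prod label.
kubectl get pods -l "app=prod"

# Equivalent long form:
kubectl get pods --selector="app=prod"
```

The gcloud variants are distractors: gcloud has no `get pods` or `list gke-deployments` commands; pod-level queries go through kubectl.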
Question # 42

You are developing a new application and are looking for a Jenkins installation to build and deploy your source code. You want to automate the installation as quickly and easily as possible. What should you do?

Options:

A.  

Deploy Jenkins through the Google Cloud Marketplace.

B.  

Create a new Compute Engine instance. Run the Jenkins executable.

C.  

Create a new Kubernetes Engine cluster. Create a deployment for the Jenkins image.

D.  

Create an instance template with the Jenkins executable. Create a managed instance group with this template.

Question # 43

You need to create a custom VPC with a single subnet. The subnet’s range must be as large as possible. Which range should you use?

Options:

A.  

0.0.0.0/0

B.  

10.0.0.0/8

C.  

172.16.0.0/12

D.  

192.168.0.0/16

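The size comparison behind this question is simple bit arithmetic: 32 minus the prefix length gives the host bits, and shifting 1 left by that many bits gives the address count. 0.0.0.0/0 is not a valid subnet range to assign, so 10.0.0.0/8 is the largest of the usable options.

```shell
# Addresses available per prefix length.
echo "/8:  $((1 << (32 - 8))) addresses"    # 16777216
echo "/12: $((1 << (32 - 12))) addresses"   # 1048576
echo "/16: $((1 << (32 - 16))) addresses"   # 65536
```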
Question # 44

You are assisting a new Google Cloud user who just installed the Google Cloud SDK on their VM. The server needs access to Cloud Storage. The user wants your help to create a new storage bucket. You need to make this change in multiple environments. What should you do?

Options:

A.  

Use a Deployment Manager script to automate creating storage buckets in an appropriate region

B.  

Use a local SSD to improve performance of the VM for the targeted workload

C.  

Use the gsutil command to create a storage bucket in the same region as the VM

D.  

Use a Persistent Disk SSD in the same zone as the VM to improve performance of the VM

Discussion 0
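The bucket creation itself is a one-liner; a minimal sketch, assuming the VM runs in us-central1 and the bucket name is a placeholder:

```shell
# Create the bucket in the same region as the VM to minimize latency.
gsutil mb -l us-central1 gs://my-example-bucket
```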
Question # 45

You are designing an application that lets users upload and share photos. You expect your application to grow really fast and you are targeting a worldwide audience. You want to delete uploaded photos after 30 days. You want to minimize costs while ensuring your application is highly available. Which GCP storage solution should you choose?

Options:

A.  

Persistent SSD on VM instances.

B.  

Cloud Filestore.

C.  

Multiregional Cloud Storage bucket.

D.  

Cloud Datastore database.

Discussion 0
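The 30-day deletion requirement maps to an Object Lifecycle Management rule on the bucket. A sketch, with a hypothetical bucket name:

```shell
# lifecycle.json: delete objects 30 days after their creation time.
cat > lifecycle.json <<'EOF'
{
  "rule": [
    {"action": {"type": "Delete"}, "condition": {"age": 30}}
  ]
}
EOF

gsutil lifecycle set lifecycle.json gs://my-photos-bucket
```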
Question # 46

Your company publishes large files on an Apache web server that runs on a Compute Engine instance. The Apache web server is not the only application running in the project. You want to receive an email when the egress network costs for the server exceed 100 dollars for the current month as measured by Google Cloud Platform (GCP). What should you do?

Options:

A.  

Set up a budget alert on the project with an amount of 100 dollars, a threshold of 100%, and notification type of “email.”

B.  

Set up a budget alert on the billing account with an amount of 100 dollars, a threshold of 100%, and notification type of “email.”

C.  

Export the billing data to BigQuery. Create a Cloud Function that uses BigQuery to sum the egress network costs of the exported billing data for the Apache web server for the current month and sends an email if it is over 100 dollars. Schedule the Cloud Function using Cloud Scheduler to run hourly.

D.  

Use the Stackdriver Logging Agent to export the Apache web server logs to Stackdriver Logging. Create a Cloud Function that uses BigQuery to parse the HTTP response log data in Stackdriver for the current month and sends an email if the size of all HTTP responses, multiplied by current GCP egress prices, totals over 100 dollars. Schedule the Cloud Function using Cloud Scheduler to run hourly.

Discussion 0
Question # 47

Your finance team wants to view the billing report for your projects. You want to make sure that the finance team does not get additional permissions to the project. What should you do?

Options:

A.  

Add the group for the finance team to the roles/billing.user role.

B.  

Add the group for the finance team to the roles/billing.admin role.

C.  

Add the group for the finance team to the roles/billing.viewer role.

D.  

Add the group for the finance team to the roles/billing.projectManager role.

Discussion 0
Question # 48

You are monitoring an application and receive user feedback that a specific error is spiking. You notice that the error is caused by a Service Account having insufficient permissions. You are able to solve the problem but want to be notified if the problem recurs. What should you do?

Options:

A.  

In the Log Viewer, filter the logs on severity 'Error' and the name of the Service Account.

B.  

Create a sink to BigQuery to export all the logs. Create a Data Studio dashboard on the exported logs.

C.  

Create a custom log-based metric for the specific error to be used in an Alerting Policy.

D.  

Grant Project Owner access to the Service Account.

Discussion 0
Question # 49

You are storing sensitive information in a Cloud Storage bucket. For legal reasons, you need to be able to record all requests that read any of the stored data. You want to make sure you comply with these requirements. What should you do?

Options:

A.  

Enable the Identity Aware Proxy API on the project.

B.  

Scan the bucket using the Data Loss Prevention API.

C.  

Allow only a single Service Account access to read the data.

D.  

Enable Data Access audit logs for the Cloud Storage API.

Discussion 0
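Data Access audit logs are enabled through the auditConfigs section of the project's IAM policy. A sketch of the workflow, with a placeholder project ID (the YAML stanza is shown as a comment since the policy file is edited by hand):

```shell
# Fetch the current IAM policy into a local file.
gcloud projects get-iam-policy my-project > policy.yaml

# Append an audit config for the Cloud Storage API to policy.yaml:
#   auditConfigs:
#   - service: storage.googleapis.com
#     auditLogConfigs:
#     - logType: DATA_READ
#     - logType: DATA_WRITE

# Write the updated policy back to the project.
gcloud projects set-iam-policy my-project policy.yaml
```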
Question # 50

You have an application that runs on Compute Engine VM instances in a custom Virtual Private Cloud (VPC). Your company's security policies only allow the use of internal IP addresses on VM instances and do not let VM instances connect to the internet. You need to ensure that the application can access a file hosted in a Cloud Storage bucket within your project. What should you do?

Options:

A.  

Enable Private Service Access on the Cloud Storage Bucket.

B.  

Add storage.googleapis.com to the list of restricted services in a VPC Service Controls perimeter and add your project to the list of protected projects.

C.  

Enable Private Google Access on the subnet within the custom VPC.

D.  

Deploy a Cloud NAT instance and route the traffic to the dedicated IP address of the Cloud Storage bucket.

Discussion 0
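Enabling Private Google Access is a single subnet update; a sketch with placeholder subnet and region names:

```shell
# Let VMs with only internal IPs on this subnet reach Google APIs,
# including Cloud Storage, without internet access.
gcloud compute networks subnets update my-subnet \
  --region=us-central1 --enable-private-ip-google-access
```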
Question # 51

You have files in a Cloud Storage bucket that you need to share with your suppliers. You want to restrict the time that the files are available to your suppliers to 1 hour. You want to follow Google recommended practices. What should you do?

Options:

A.  

Create a service account with just the permissions to access files in the bucket. Create a JSON key for the service account. Execute the command gsutil signurl -m 1h gs:///*.

B.  

Create a service account with just the permissions to access files in the bucket. Create a JSON key for the service account. Execute the command gsutil signurl -d 1h gs:///**.

C.  

Create a service account with just the permissions to access files in the bucket. Create a JSON key for the service account. Execute the command gsutil signurl -p 60m gs:///.

D.  

Create a JSON key for the Default Compute Engine Service Account. Execute the command gsutil signurl -t 60m gs:///***

Discussion 0
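The recommended flow combines a service account key with gsutil signurl and the -d (duration) flag. A sketch with hypothetical account, project, and object names:

```shell
# Create a key for the service account that can read the bucket.
gcloud iam service-accounts keys create key.json \
  --iam-account=file-sharer@my-project.iam.gserviceaccount.com

# Generate a signed URL that expires after one hour.
gsutil signurl -d 1h key.json gs://my-bucket/report.pdf
```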
Question # 52

You have 32 GB of data in a single file that you need to upload to a Nearline Storage bucket. The WAN connection you are using is rated at 1 Gbps, and you are the only one on the connection. You want to use as much of the rated 1 Gbps as possible to transfer the file rapidly. How should you upload the file?

Options:

A.  

Use the GCP Console to transfer the file instead of gsutil.

B.  

Enable parallel composite uploads using gsutil on the file transfer.

C.  

Decrease the TCP window size on the machine initiating the transfer.

D.  

Change the storage class of the bucket from Nearline to Multi-Regional.

Discussion 0
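Parallel composite uploads are enabled with a gsutil option; above the threshold, the file is split into components uploaded in parallel and composed server-side. A sketch with placeholder file and bucket names:

```shell
# Upload the 32 GB file as parallel components (threshold 150 MiB),
# saturating more of the 1 Gbps link than a single stream would.
gsutil -o GSUtil:parallel_composite_upload_threshold=150M \
  cp big-file.dat gs://my-nearline-bucket
```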
Question # 53

You are performing a monthly security check of your Google Cloud environment and want to know who has access to view data stored in your Google Cloud project. What should you do?

Options:

A.  

Enable Audit Logs for all APIs that are related to data storage.

B.  

Review the IAM permissions for any role that allows for data access.

C.  

Review the Identity-Aware Proxy settings for each resource.

D.  

Create a Data Loss Prevention job.

Discussion 0
Question # 54

You want to select and configure a solution for storing and archiving data on Google Cloud Platform. You need to support compliance objectives for data from one geographic location. This data is archived after 30 days and needs to be accessed annually. What should you do?

Options:

A.  

Select Multi-Regional Storage. Add a bucket lifecycle rule that archives data after 30 days to Coldline Storage.

B.  

Select Multi-Regional Storage. Add a bucket lifecycle rule that archives data after 30 days to Nearline Storage.

C.  

Select Regional Storage. Add a bucket lifecycle rule that archives data after 30 days to Nearline Storage.

D.  

Select Regional Storage. Add a bucket lifecycle rule that archives data after 30 days to Coldline Storage.

Discussion 0
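The archival step maps to a lifecycle rule that changes the storage class at 30 days. A sketch against a hypothetical regional bucket:

```shell
# lifecycle.json: move objects to Coldline 30 days after creation.
cat > lifecycle.json <<'EOF'
{
  "rule": [
    {"action": {"type": "SetStorageClass", "storageClass": "COLDLINE"},
     "condition": {"age": 30}}
  ]
}
EOF

gsutil lifecycle set lifecycle.json gs://my-regional-bucket
```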
Question # 55

An application generates daily reports in a Compute Engine virtual machine (VM). The VM is in the project corp-iot-insights. Your team operates only in the project corp-aggregate-reports and needs a copy of the daily exports in the bucket corp-aggregate-reports-storage. You want to configure access so that the daily reports from the VM are available in the bucket corp-aggregate-reports-storage and use as few steps as possible while following Google-recommended practices. What should you do?

Options:

A.  

Move both projects under the same folder.

B.  

Grant the VM Service Account the role Storage Object Creator on corp-aggregate-reports-storage.

C.  

Create a Shared VPC network between both projects. Grant the VM Service Account the role Storage Object Creator on corp-iot-insights.

D.  

Make corp-aggregate-reports-storage public and create a folder with a pseudo-randomized suffix name. Share the folder with the IoT team.

Discussion 0
Question # 56

You are building an application that will run in your data center. The application will use Google Cloud Platform (GCP) services like AutoML. You created a service account that has appropriate access to AutoML. You need to enable authentication to the APIs from your on-premises environment. What should you do?

Options:

A.  

Use service account credentials in your on-premises application.

B.  

Use gcloud to create a key file for the service account that has appropriate permissions.

C.  

Set up direct interconnect between your data center and Google Cloud Platform to enable authentication for your on-premises applications.

D.  

Go to the IAM & admin console, grant a user account permissions similar to the service account permissions, and use this user account for authentication from your data center.

Discussion 0
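On-premises authentication typically uses a downloaded service account key picked up by the client libraries. A sketch with placeholder account and project names:

```shell
# Create a key file for the service account that has AutoML access.
gcloud iam service-accounts keys create automl-key.json \
  --iam-account=automl-user@my-project.iam.gserviceaccount.com

# On the on-premises machine, point the client libraries at the key.
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/automl-key.json
```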
Question # 57

You recently received a new Google Cloud project with an attached billing account where you will work. You need to create instances, set firewalls, and store data in Cloud Storage. You want to follow Google-recommended practices. What should you do?

Options:

A.  

Use the gcloud CLI services enable cloudresourcemanager.googleapis.com command to enable all resources.

B.  

Use the gcloud services enable compute.googleapis.com command to enable Compute Engine and the gcloud services enable storage-api.googleapis.com command to enable the Cloud Storage APIs.

C.  

Open the Google Cloud console and enable all Google Cloud APIs from the API dashboard.

D.  

Open the Google Cloud console and run gcloud init --project in a Cloud Shell.

Discussion 0
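Enabling only the APIs you need can be sketched as two commands run against the new project:

```shell
# Enable Compute Engine (instances, firewalls) and Cloud Storage.
gcloud services enable compute.googleapis.com
gcloud services enable storage-api.googleapis.com
```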
Question # 58

You create a Deployment with 2 replicas in a Google Kubernetes Engine cluster that has a single preemptible node pool. After a few minutes, you use kubectl to examine the status of your Pods and observe that one of them is still in Pending status.


What is the most likely cause?

Options:

A.  

The pending Pod's resource requests are too large to fit on a single node of the cluster.

B.  

Too many Pods are already running in the cluster, and there are not enough resources left to schedule the pending Pod.

C.  

The node pool is configured with a service account that does not have permission to pull the container image used by the pending Pod.

D.  

The pending Pod was originally scheduled on a node that has been preempted between the creation of the Deployment and your verification of the Pods’ status. It is currently being rescheduled on a new node.

Discussion 0
Question # 59

You deployed a new application inside your Google Kubernetes Engine cluster using the YAML file specified below.


You check the status of the deployed pods and notice that one of them is still in PENDING status:


You want to find out why the pod is stuck in pending status. What should you do?

Options:

A.  

Review details of the myapp-service Service object and check for error messages.

B.  

Review details of the myapp-deployment Deployment object and check for error messages.

C.  

Review details of myapp-deployment-58ddbbb995-lp86m Pod and check for warning messages.

D.  

View logs of the container in myapp-deployment-58ddbbb995-lp86m pod and check for warning messages.

Discussion 0
Question # 60

You are using Container Registry to centrally store your company’s container images in a separate project. In another project, you want to create a Google Kubernetes Engine (GKE) cluster. You want to ensure that Kubernetes can download images from Container Registry. What should you do?

Options:

A.  

In the project where the images are stored, grant the Storage Object Viewer IAM role to the service account used by the Kubernetes nodes.

B.  

When you create the GKE cluster, choose the Allow full access to all Cloud APIs option under ‘Access scopes’.

C.  

Create a service account, and give it access to Cloud Storage. Create a P12 key for this service account and use it as an imagePullSecrets in Kubernetes.

D.  

Configure the ACLs on each image in Cloud Storage to give read-only access to the default Compute Engine service account.

Discussion 0
Question # 61

You need to enable traffic between multiple groups of Compute Engine instances that are currently running in two different GCP projects. Each group of Compute Engine instances is running in its own VPC.

What should you do?

Options:

A.  

Verify that both projects are in a GCP Organization. Create a new VPC and add all instances.

B.  

Verify that both projects are in a GCP Organization. Share the VPC from one project and request that the Compute Engine instances in the other project use this shared VPC.

C.  

Verify that you are the Project Administrator of both projects. Create two new VPCs and add all instances.

D.  

Verify that you are the Project Administrator of both projects. Create a new VPC and add all instances.

Discussion 0
Question # 62

You are the project owner of a GCP project and want to delegate control to colleagues to manage buckets and files in Cloud Storage. You want to follow Google-recommended practices. Which IAM roles should you grant your colleagues?

Options:

A.  

Project Editor

B.  

Storage Admin

C.  

Storage Object Admin

D.  

Storage Object Creator

Discussion 0
Question # 63

Your company has embraced a hybrid cloud strategy where some of the applications are deployed on Google Cloud. A Virtual Private Network (VPN) tunnel connects your Virtual Private Cloud (VPC) in Google Cloud with your company's on-premises network. Multiple applications in Google Cloud need to connect to an on-premises database server, and you want to avoid having to change the IP configuration in all of your applications when the IP of the database changes.

What should you do?

Options:

A.  

Configure Cloud NAT for all subnets of your VPC to be used when egressing from the VM instances.

B.  

Create a private zone on Cloud DNS, and configure the applications with the DNS name.

C.  

Configure the IP of the database as custom metadata for each instance, and query the metadata server.

D.  

Query the Compute Engine internal DNS from the applications to retrieve the IP of the database.

Discussion 0
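A private Cloud DNS zone decouples the applications from the database's IP: when the IP changes, only the record changes. A sketch with hypothetical zone, domain, network, and IP values:

```shell
# Create a private zone visible only to the named VPC.
gcloud dns managed-zones create onprem-zone \
  --dns-name=corp.internal. --visibility=private \
  --networks=my-vpc --description="Private zone for on-prem hosts"

# Point a record at the on-premises database; apps use db.corp.internal.
gcloud dns record-sets create db.corp.internal. --zone=onprem-zone \
  --type=A --ttl=300 --rrdatas=10.10.0.5
```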
Question # 64

You are configuring Cloud DNS. You want to create DNS records to point home.mydomain.com, mydomain.com, and www.mydomain.com to the IP address of your Google Cloud load balancer. What should you do?

Options:

A.  

Create one CNAME record to point mydomain.com to the load balancer, and create two A records to point WWW and HOME to mydomain.com respectively.

B.  

Create one CNAME record to point mydomain.com to the load balancer, and create two AAAA records to point WWW and HOME to mydomain.com respectively.

C.  

Create one A record to point mydomain.com to the load balancer, and create two CNAME records to point WWW and HOME to mydomain.com respectively.

D.  

Create one A record to point mydomain.com to the load balancer, and create two NS records to point WWW and HOME to mydomain.com respectively.

Discussion 0
Question # 65

You created a Kubernetes deployment by running kubectl run nginx --image=nginx --replicas=1. After a few days, you decided you no longer want this deployment. You identified the pod and deleted it by running kubectl delete pod. You noticed the pod got recreated.

  • $ kubectl get pods
  • NAME READY STATUS RESTARTS AGE
  • nginx-84748895c4-nqqmt 1/1 Running 0 9m41s
  • $ kubectl delete pod nginx-84748895c4-nqqmt
  • pod nginx-84748895c4-nqqmt deleted
  • $ kubectl get pods
  • NAME READY STATUS RESTARTS AGE
  • nginx-84748895c4-k6bzl 1/1 Running 0 25s

What should you do to delete the deployment and avoid the pod being recreated?

Options:

A.  

kubectl delete deployment nginx

B.  

kubectl delete --deployment=nginx

C.  

kubectl delete pod nginx-84748895c4-k6bzl --no-restart 2

D.  

kubectl delete nginx

Discussion 0
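Deleting the Deployment removes the ReplicaSet that keeps recreating the pod, so the pod is not rescheduled:

```shell
# Delete the deployment; its ReplicaSet and pod are garbage-collected.
kubectl delete deployment nginx

# Verify no nginx pod comes back.
kubectl get pods
```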
Question # 66

You want to select and configure a cost-effective solution for relational data on Google Cloud Platform. You are working with a small set of operational data in one geographic location. You need to support point-in-time recovery. What should you do?

Options:

A.  

Select Cloud SQL (MySQL). Verify that the enable binary logging option is selected.

B.  

Select Cloud SQL (MySQL). Select the create failover replicas option.

C.  

Select Cloud Spanner. Set up your instance with 2 nodes.

D.  

Select Cloud Spanner. Set up your instance as multi-regional.

Discussion 0
Question # 67

You have a Linux VM that must connect to Cloud SQL. You created a service account with the appropriate access rights. You want to make sure that the VM uses this service account instead of the default Compute Engine service account. What should you do?

Options:

A.  

When creating the VM via the web console, specify the service account under the ‘Identity and API Access’ section.

B.  

Download a JSON Private Key for the service account. On the Project Metadata, add that JSON as the value for the key compute-engine-service-account.

C.  

Download a JSON Private Key for the service account. On the Custom Metadata of the VM, add that JSON as the value for the key compute-engine-service-account.

D.  

Download a JSON Private Key for the service account. After creating the VM, ssh into the VM and save the JSON under ~/.gcloud/compute-engine-service-account.json.

Discussion 0
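Attaching the service account at VM creation time can be sketched as follows (the instance and service account names are placeholders; the same setting corresponds to the 'Identity and API Access' section in the console):

```shell
# Create the VM with the custom service account instead of the
# default Compute Engine service account.
gcloud compute instances create sql-client-vm \
  --service-account=sql-access@my-project.iam.gserviceaccount.com \
  --scopes=cloud-platform
```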
Question # 68

Your company’s infrastructure is on-premises, but all machines are running at maximum capacity. You want to burst to Google Cloud. The workloads on Google Cloud must be able to directly communicate to the workloads on-premises using a private IP range. What should you do?

Options:

A.  

In Google Cloud, configure the VPC as a host for Shared VPC.

B.  

In Google Cloud, configure the VPC for VPC Network Peering.

C.  

Create bastion hosts both in your on-premises environment and on Google Cloud. Configure both as proxy servers using their public IP addresses.

D.  

Set up Cloud VPN between the infrastructure on-premises and Google Cloud.

Discussion 0
Question # 69

You are building a product on top of Google Kubernetes Engine (GKE). You have a single GKE cluster. For each of your customers, a Pod is running in that cluster, and your customers can run arbitrary code inside their Pod. You want to maximize the isolation between your customers’ Pods. What should you do?

Options:

A.  

Use Binary Authorization and whitelist only the container images used by your customers’ Pods.

B.  

Use the Container Analysis API to detect vulnerabilities in the containers used by your customers’ Pods.

C.  

Create a GKE node pool with a sandbox type configured to gvisor. Add the parameter runtimeClassName: gvisor to the specification of your customers’ Pods.

D.  

Use the cos_containerd image for your GKE nodes. Add a nodeSelector with the value cloud.google.com/gke-os-distribution: cos_containerd to the specification of your customers’ Pods.

Discussion 0
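A sketch of the gVisor setup: create a sandboxed node pool, then opt each customer Pod into the gVisor runtime class (cluster and pool names are placeholders; the Pod spec fragment is shown as a comment):

```shell
# Create a node pool whose pods run inside the gVisor sandbox.
gcloud container node-pools create sandbox-pool \
  --cluster=my-cluster --sandbox type=gvisor

# Then, in each customer Pod specification, select the sandbox runtime:
#   spec:
#     runtimeClassName: gvisor
```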
Question # 70

You are the team lead of a group of 10 developers. You provided each developer with an individual Google Cloud Project that they can use as their personal sandbox to experiment with different Google Cloud solutions. You want to be notified if any of the developers are spending above $500 per month on their sandbox environment. What should you do?

Options:

A.  

Create a single budget for all projects and configure budget alerts on this budget.

B.  

Create a separate billing account per sandbox project and enable BigQuery billing exports. Create a Data Studio dashboard to plot the spending per billing account.

C.  

Create a budget per project and configure budget alerts on all of these budgets.

D.  

Create a single billing account for all sandbox projects and enable BigQuery billing exports. Create a Data Studio dashboard to plot the spending per project.

Discussion 0