

Exams4sure Dumps

Professional-Cloud-Security-Engineer Practice Questions

Google Cloud Certified - Professional Cloud Security Engineer

Last Update 22 hours ago
Total Questions : 318

Dive into our fully updated and stable Professional-Cloud-Security-Engineer practice test platform, featuring all the latest Google Cloud Certified exam questions added this week. Our preparation tool is more than just a Google study aid; it's a strategic advantage.

Our free Google Cloud Certified practice questions are crafted to reflect the domains and difficulty of the actual exam. The detailed rationales explain the 'why' behind each answer, reinforcing key concepts for Professional-Cloud-Security-Engineer. Use this test to pinpoint the areas where you need to focus your study.

Professional-Cloud-Security-Engineer PDF

Professional-Cloud-Security-Engineer PDF (Printable)
$43.75
$124.99

Professional-Cloud-Security-Engineer Testing Engine

Professional-Cloud-Security-Engineer Testing Engine
$50.75
$144.99

Professional-Cloud-Security-Engineer PDF + Testing Engine

Professional-Cloud-Security-Engineer PDF + Testing Engine
$63.70
$181.99
Question # 21

Your organization is worried about recent news headlines regarding application vulnerabilities in production applications that have led to security breaches. You want to automatically scan your deployment pipeline for vulnerabilities and ensure only scanned and verified containers can run in the environment. What should you do?

Options:

A.  

Enable Binary Authorization and create attestations of scans.

B.  

Use gcloud artifacts docker images describe LOCATION-docker.pkg.dev/PROJECT_ID/REPOSITORY/IMAGE_ID@sha256:HASH --show-package-vulnerability in your CI/CD pipeline, and trigger a pipeline failure for critical vulnerabilities.

C.  

Use Kubernetes role-based access control (RBAC) as the source of truth for cluster access by granting "container.clusters.get" to limited users. Restrict deployment access by allowing these users to generate a kubeconfig file that contains the access configuration for the GKE cluster.

D.  

Enforce the use of Cloud Code for development so users receive real-time security feedback on vulnerable libraries and dependencies before they check in their code.
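The pipeline gate described in option B can be sketched as a CI step. The JSON below is an assumed stand-in for real `gcloud artifacts docker images describe IMAGE --show-package-vulnerability --format=json` output, and the grep check is a minimal placeholder for a proper JSON parser such as jq.

```shell
# Assumed report shape standing in for the output of
# `gcloud artifacts docker images describe IMAGE \
#    --show-package-vulnerability --format=json`.
SCAN_REPORT='{"package_vulnerability_summary":{"vulnerabilities":{"CRITICAL":[{"noteName":"CVE-2024-0001"}]}}}'

# Block the deploy when any CRITICAL finding is present; a real pipeline
# would parse the JSON with jq rather than grep.
if printf '%s' "$SCAN_REPORT" | grep -q '"CRITICAL":\[{'; then
  GATE_RESULT="blocked"
else
  GATE_RESULT="allowed"
fi
echo "deploy ${GATE_RESULT}"
```

In a real pipeline, a non-zero exit from this step would fail the build before the image reaches the cluster.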

Discussion 0
Question # 22

Last week, a company deployed a new App Engine application that writes logs to BigQuery. No other workloads are running in the project. You need to validate that all data written to BigQuery was done using the App Engine Default Service Account.

What should you do?

Options:

A.  

1. Use Stackdriver Logging and filter on BigQuery Insert Jobs. 2. Click on the email address in line with the App Engine Default Service Account in the authentication field. 3. Click Hide Matching Entries. 4. Make sure the resulting list is empty.

B.  

1. Use Stackdriver Logging and filter on BigQuery Insert Jobs. 2. Click on the email address in line with the App Engine Default Service Account in the authentication field. 3. Click Show Matching Entries. 4. Make sure the resulting list is empty.

C.  

1. In BigQuery, select the related dataset. 2. Make sure the App Engine Default Service Account is the only account that can write to the dataset.

D.  

1. Go to the IAM section on the project. 2. Validate that the App Engine Default Service Account is the only account that has a role that can write to BigQuery.
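The log-filter approach in options A and B can be sketched as a Cloud Logging filter string. PROJECT_ID is a placeholder; the field names used are the documented BigQuery audit-log fields.

```shell
# Placeholder for the App Engine default service account email.
SA="PROJECT_ID@appspot.gserviceaccount.com"

# Filter for BigQuery insert jobs NOT authenticated as that account;
# an empty result means all writes came from the expected identity.
FILTER='resource.type="bigquery_resource"
protoPayload.methodName="jobservice.insert"
protoPayload.authenticationInfo.principalEmail!="'"$SA"'"'
echo "$FILTER"
# gcloud logging read "$FILTER" --limit=10
```

This mirrors the "hide matching entries, expect an empty list" logic of option A in a scriptable form.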

Discussion 0
Question # 23

Your company is developing a new application for your organization. The application consists of two Cloud Run services, service A and service B. Service A provides a web-based user front-end. Service B provides back-end services that are called by service A. You need to set up identity and access management for the application. Your solution should follow the principle of least privilege. What should you do?

Options:

A.  

Create a new service account with the permissions to run service A and service B. Require authentication for service B. Permit only the new service account to call the back-end.

B.  

Create two separate service accounts. Grant one service account the permissions to execute service A, and grant the other service account the permissions to execute service B. Require authentication for service B. Permit only the service account for service A to call the back-end.

C.  

Use the Compute Engine default service account to run service A and service B. Require authentication for service B. Permit only the default service account to call the back-end.

D.  

Create three separate service accounts. Grant one service account the permissions to execute service A. Grant the second service account the permissions to run service B. Grant the third service account the permissions to communicate between services A and B. Require authentication for service B. Call the back-end by authenticating with a service account key for the third service account.
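The service-to-service call path can be sketched as follows: the front-end fetches an ID token for the back-end's URL from the metadata server (a real endpoint available inside Cloud Run) and presents it as a bearer token. SERVICE_B_URL is a placeholder; the script skips the call when not running on Google Cloud.

```shell
# Placeholder back-end URL; replace with service B's actual Cloud Run URL.
SERVICE_B_URL="https://service-b-PLACEHOLDER.a.run.app"
# Real metadata-server endpoint for minting an ID token inside Cloud Run.
MD="http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/identity"

if TOKEN=$(curl -sf -m 2 -H "Metadata-Flavor: Google" \
    "${MD}?audience=${SERVICE_B_URL}" 2>/dev/null); then
  # Call the authenticated back-end with the identity of this service.
  curl -s -H "Authorization: Bearer ${TOKEN}" "${SERVICE_B_URL}"
else
  echo "metadata server unavailable (not running on Google Cloud)"
fi
```

Because the token is minted from the service's own runtime identity, no service account key file is ever downloaded or distributed.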

Discussion 0
Question # 24

You are setting up a new Cloud Storage bucket in your environment that is encrypted with a customer-managed encryption key (CMEK). The CMEK is stored in Cloud Key Management Service (KMS) in project "prj-a", and the Cloud Storage bucket will use project "prj-b". The key is backed by a Cloud Hardware Security Module (HSM) and resides in the region europe-west3. Your storage bucket will be located in the region europe-west1. When you create the bucket, you cannot access the key, and you need to troubleshoot why.

What has caused the access issue?

Options:

A.  

A firewall rule prevents the key from being accessible.

B.  

Cloud HSM does not support Cloud Storage.

C.  

The CMEK is in a different project than the Cloud Storage bucket

D.  

The CMEK is in a different region than the Cloud Storage bucket.
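A working setup can be sketched by placing the key ring in the same location as the bucket. Project, ring, and key names are placeholders; the `--default-encryption-key` flag on `gcloud storage buckets create` is real.

```shell
# Key and bucket must share a location for CMEK to work.
LOCATION="europe-west1"
KMS_KEY="projects/prj-a/locations/${LOCATION}/keyRings/storage-ring/cryptoKeys/bucket-key"

# Echoed rather than executed; the bucket lives in prj-b, the key in prj-a.
echo "gcloud storage buckets create gs://my-bucket \
  --project=prj-b --location=${LOCATION} \
  --default-encryption-key=${KMS_KEY}"
```

Cross-project CMEK is supported (the key may live in prj-a), but the key ring's location must match the bucket's location.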

Discussion 0
Question # 25

You are a consultant for an organization that is considering migrating their data from its private cloud to Google Cloud. The organization’s compliance team is not familiar with Google Cloud and needs guidance on how compliance requirements will be met on Google Cloud. One specific compliance requirement is for customer data at rest to reside within specific geographic boundaries. Which option should you recommend for the organization to meet their data residency requirements on Google Cloud?

Options:

A.  

Organization Policy Service constraints

B.  

Shielded VM instances

C.  

Access control lists

D.  

Geolocation access controls

E.  

Google Cloud Armor
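The Organization Policy approach can be sketched as a policy file pinning resource locations. The constraint `constraints/gcloud.resourceLocations` and the `in:eu-locations` value group are real; ORG_ID is a placeholder.

```shell
# Write a resource-locations policy restricting resources to EU regions.
POLICY_FILE=$(mktemp)
cat > "$POLICY_FILE" <<'EOF'
constraint: constraints/gcloud.resourceLocations
listPolicy:
  allowedValues:
  - in:eu-locations
EOF
cat "$POLICY_FILE"
# gcloud resource-manager org-policies set-policy "$POLICY_FILE" \
#   --organization=ORG_ID
```

Applied at the organization node, this prevents any project in the hierarchy from creating location-bound resources outside the allowed regions.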

Discussion 0
Question # 26

Your customer has an on-premises public key infrastructure (PKI) with a certificate authority (CA). You need to issue certificates for many HTTP load balancer front-ends. Because the on-premises PKI involves many manual processes, it should be minimally affected, and the solution needs to scale.

What should you do?

Options:

A.  

Use Certificate Manager to issue Google-managed public certificates, and configure it at the HTTP load balancers in your infrastructure as code (IaC).

B.  

Use Certificate Manager to import certificates issued from the on-premises PKI for the front-ends. Use the gcloud tool for importing.

C.  

Use a subordinate CA in the Google Certificate Authority Service from the on-premises PKI system to issue certificates for the load balancers.

D.  

Use web applications with PKCS#12 certificates issued from a subordinate CA based on OpenSSL on-premises, and use the gcloud tool for importing. Use an external TCP/UDP network load balancer instead of an external HTTP load balancer.
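The subordinate-CA flow in option C can be sketched with echoed gcloud commands. All names are placeholders, and the exact flags should be verified against the current `gcloud privateca` documentation before use.

```shell
# Create a subordinate CA in CA Service and emit a CSR for the
# on-premises root to sign (one-time manual step).
CREATE_CMD="gcloud privateca subordinates create lb-sub-ca \
  --pool=lb-ca-pool --location=europe-west1 \
  --create-csr --csr-output-file=sub-ca.csr"
echo "$CREATE_CMD"

# After the on-premises CA returns the signed chain, activate the
# subordinate so it can issue load balancer certificates on its own.
ACTIVATE_CMD="gcloud privateca subordinates activate lb-sub-ca \
  --pool=lb-ca-pool --location=europe-west1 \
  --pem-chain=signed-chain.pem"
echo "$ACTIVATE_CMD"
```

The on-premises PKI is touched exactly once (to sign the CSR); all subsequent issuance scales inside Google Cloud.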

Discussion 0
Question # 27

You manage a mission-critical workload for your organization, which is in a highly regulated industry. The workload uses Compute Engine VMs to analyze and process the sensitive data after it is uploaded to Cloud Storage from the endpoint computers. Your compliance team has detected that this workload does not meet the data protection requirements for sensitive data. You need to meet these requirements:

• Manage the data encryption key (DEK) outside the Google Cloud boundary.

• Maintain full control of encryption keys through a third-party provider.

• Encrypt the sensitive data before uploading it to Cloud Storage.

• Decrypt the sensitive data during processing in the Compute Engine VMs.

• Encrypt the sensitive data in memory while in use in the Compute Engine VMs.

What should you do?

Choose 2 answers

Options:

A.  

Create a VPC Service Controls service perimeter across your existing Compute Engine VMs and Cloud Storage buckets

B.  

Migrate the Compute Engine VMs to Confidential VMs to access the sensitive data.

C.  

Configure Cloud External Key Manager to encrypt the sensitive data before it is uploaded to Cloud Storage and decrypt the sensitive data after it is downloaded into your VMs

D.  

Create Confidential VMs to access the sensitive data.

E.  

Configure Customer Managed Encryption Keys to encrypt the sensitive data before it is uploaded to Cloud Storage, and decrypt the sensitive data after it is downloaded into your VMs.
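The in-memory-encryption requirement maps to Confidential VMs, which can be sketched with an echoed create command. Names are placeholders; `--confidential-compute` and `--maintenance-policy=TERMINATE` are real gcloud flags, and Confidential VMs require an N2D machine type.

```shell
# Confidential VM: memory is encrypted in use via AMD SEV.
VM_CMD="gcloud compute instances create sensitive-processor \
  --zone=europe-west1-b --machine-type=n2d-standard-4 \
  --confidential-compute --maintenance-policy=TERMINATE"
echo "$VM_CMD"
```

Together with Cloud External Key Manager (which keeps the key material outside the Google Cloud boundary), this covers both the key-control and memory-encryption requirements.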

Discussion 0
Question # 28

A customer deployed an application on Compute Engine that takes advantage of the elastic nature of cloud computing.

How can you work with Infrastructure Operations Engineers to best ensure that Windows Compute Engine VMs are up to date with all the latest OS patches?

Options:

A.  

Build new base images when patches are available, and use a CI/CD pipeline to rebuild VMs, deploying incrementally.

B.  

Federate a Domain Controller into Compute Engine, and roll out weekly patches via Group Policy Object.

C.  

Use Deployment Manager to provision updated VMs into new serving Instance Groups (IGs).

D.  

Reboot all VMs during the weekly maintenance window and allow the StartUp Script to download the latest patches from the internet.
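The image-rebuild rollout in option A can be sketched with an echoed command. Names are placeholders; `rolling-action start-update` is a real gcloud command for incrementally replacing VMs in a managed instance group with a new template.

```shell
# After a patched base image is baked into a new instance template,
# roll it out to the managed instance group incrementally.
ROLLOUT_CMD="gcloud compute instance-groups managed rolling-action start-update web-mig \
  --version=template=web-template-patched --zone=us-central1-a"
echo "$ROLLOUT_CMD"
```

Rebuilding from patched images keeps VMs immutable and avoids drift, unlike in-place patching of long-lived instances.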

Discussion 0
Question # 29

You are implementing data protection by design and in accordance with GDPR requirements. As part of design reviews, you are told that you need to manage the encryption key for a solution that includes workloads for Compute Engine, Google Kubernetes Engine, Cloud Storage, BigQuery, and Pub/Sub. Which option should you choose for this implementation?

Options:

A.  

Cloud External Key Manager

B.  

Customer-managed encryption keys

C.  

Customer-supplied encryption keys

D.  

Google default encryption

Discussion 0
Question # 30

Your organization's application is being integrated with a partner application that requires read access to customer data to process customer orders. The customer data is stored in one of your Cloud Storage buckets. You have evaluated different options and determined that this activity requires the use of service account keys. You must advise the partner on how to minimize the risk of a compromised service account key causing a loss of data. What should you advise the partner to do?

Options:

A.  

Define a VPC Service Controls perimeter, and restrict the Cloud Storage API. Add an ingress rule to the perimeter to allow access to the Cloud Storage API for the service account from outside of the perimeter.

B.  

Scan the Cloud Storage bucket with Sensitive Data Protection when new data is added, and automatically mask all customer data.

C.  

Ensure that all data for the application that is accessed through the relevant service accounts is encrypted at rest by using customer-managed encryption keys (CMEK).

D.  

Implement a secret management service. Configure the service to frequently rotate the service account key. Configure proper access control to the key, and restrict who can create service account keys.
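The rotation step in option D can be sketched with echoed commands. The service account email and key ID are placeholders; `gcloud iam service-accounts keys create` and `keys delete` are real commands.

```shell
# Placeholder service account used by the partner integration.
SA="partner-reader@PROJECT_ID.iam.gserviceaccount.com"

# Mint a new key, distribute it through the secret management service,
# then retire the old key so a leaked copy quickly becomes useless.
echo "gcloud iam service-accounts keys create new-key.json --iam-account=${SA}"
echo "gcloud iam service-accounts keys delete OLD_KEY_ID --iam-account=${SA}"
```

Frequent rotation bounds the window in which a compromised key is usable, which is the risk-reduction the question asks about.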

Discussion 0