
Good news: Professional-Cloud-Security-Engineer (Google Cloud Certified - Professional Cloud Security Engineer) is now stable, with verified pass results.

Professional-Cloud-Security-Engineer Practice Exam Questions and Answers

Google Cloud Certified - Professional Cloud Security Engineer

Last Update 10 hours ago
Total Questions : 233

Professional-Cloud-Security-Engineer is stable now, with all of the latest exam questions added 10 hours ago. Just download our full package and start your journey toward the Google Cloud Certified - Professional Cloud Security Engineer certification. All of these Google Professional-Cloud-Security-Engineer practice exam questions are real and verified by our experts in the related industry fields.

Professional-Cloud-Security-Engineer PDF

Professional-Cloud-Security-Engineer PDF (Printable)
$48
$119.99

Professional-Cloud-Security-Engineer Testing Engine

Professional-Cloud-Security-Engineer Testing Engine
$56
$139.99

Professional-Cloud-Security-Engineer PDF + Testing Engine

Professional-Cloud-Security-Engineer PDF + Testing Engine
$70.80
$176.99
Question # 1

You manage one of your organization's Google Cloud projects (Project A). A VPC Service Controls (VPC SC) perimeter is blocking API access requests to this project, including Pub/Sub. A resource running under a service account in another project (Project B) needs to collect messages from a Pub/Sub topic in your project. Project B is not included in a VPC SC perimeter. You need to provide access from Project B to the Pub/Sub topic in Project A using the principle of least privilege.

What should you do?

Options:

A.  

Configure an ingress policy for the perimeter in Project A and allow access for the service account in Project B to collect messages.

B.  

Create an access level that allows a developer in Project B to subscribe to the Pub/Sub topic that is located in Project A.

C.  

Create a perimeter bridge between Project A and Project B to allow the required communication between both projects.

D.  

Remove the Pub/Sub API from the list of restricted services in the perimeter configuration for Project A.

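For context on option A, a VPC SC ingress rule can name exactly which identity may cross the perimeter and for which API. The sketch below is illustrative of the rule shape, not an official schema dump; the service-account email and project numbers are hypothetical.

```python
# Illustrative ingress rule for the Project A perimeter: admit only Project B's
# service account, and only for the Pub/Sub API, matching a least-privilege
# approach. All names and numbers here are hypothetical.
ingress_policy = {
    "ingressFrom": {
        "identities": ["serviceAccount:collector@project-b.iam.gserviceaccount.com"],
        "sources": [{"resource": "projects/111111111111"}],  # Project B (hypothetical)
    },
    "ingressTo": {
        "operations": [
            {
                "serviceName": "pubsub.googleapis.com",
                "methodSelectors": [{"method": "*"}],  # could be narrowed further
            }
        ],
        "resources": ["projects/222222222222"],  # Project A (hypothetical)
    },
}

# One identity, one API, one target project -- nothing else crosses the perimeter.
print(ingress_policy["ingressTo"]["operations"][0]["serviceName"])
```

Note how the rule grants no broad access: identities, service, and target resources are all enumerated.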
Question # 2

An engineering team is launching a web application that will be public on the internet. The web application is hosted in multiple GCP regions and will be directed to the respective backend based on the URL request.

Your team wants to avoid exposing the application directly on the internet and wants to deny traffic from a specific list of malicious IP addresses.

Which solution should your team implement to meet these requirements?

Options:

A.  

Cloud Armor

B.  

Network Load Balancing

C.  

SSL Proxy Load Balancing

D.  

NAT Gateway

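The core of the Cloud Armor option is a deny rule keyed on source IP ranges, evaluated at the global external HTTP(S) load balancer before traffic reaches any backend. The toy matcher below reproduces that check locally with the standard library; the ranges are documentation examples, not a live policy export.

```python
import ipaddress

# Toy stand-in for a Cloud Armor deny rule: block requests whose source IP
# falls inside any denied range. Ranges below are illustrative (RFC 5737).
deny_ranges = [ipaddress.ip_network(r) for r in ("198.51.100.0/24", "203.0.113.7/32")]

def is_blocked(client_ip: str) -> bool:
    """Return True when the client falls in a denied range."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in deny_ranges)

print(is_blocked("203.0.113.7"))   # inside a denied /32
print(is_blocked("192.0.2.10"))    # not listed
```

In the real service this evaluation happens at Google's edge, which is what keeps the backends from being exposed directly.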
Question # 3

Which Identity-Aware Proxy role should you grant to an Identity and Access Management (IAM) user to access HTTPS resources?

Options:

A.  

Security Reviewer

B.  

IAP-Secured Tunnel User

C.  

IAP-Secured Web App User

D.  

Service Broker Operator

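For reference, the IAP-Secured Web App User role corresponds to the IAM role ID roles/iap.httpsResourceAccessor, which is what grants access to IAP-protected HTTPS resources. A minimal policy-binding sketch (the member address is hypothetical):

```python
# Minimal IAM binding sketch for an IAP-protected HTTPS resource.
# roles/iap.httpsResourceAccessor is the role ID behind "IAP-Secured Web App
# User"; the user email is hypothetical.
binding = {
    "role": "roles/iap.httpsResourceAccessor",
    "members": ["user:dev@example.com"],
}
print(binding["role"])
```

The Tunnel User role, by contrast, covers IAP TCP forwarding (e.g. SSH/RDP), not web resources.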
Question # 4

A customer wants to move their sensitive workloads to a Compute Engine-based cluster using Managed Instance Groups (MIGs). The jobs are bursty and must be completed quickly. They have a requirement to be able to manage and rotate the encryption keys.

Which boot disk encryption solution should you use on the cluster to meet this customer’s requirements?

Options:

A.  

Customer-supplied encryption keys (CSEK)

B.  

Customer-managed encryption keys (CMEK) using Cloud Key Management Service (KMS)

C.  

Encryption by default

D.  

Pre-encrypting files before transferring to Google Cloud Platform (GCP) for analysis

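To make the CMEK option concrete: with customer-managed keys, the boot disk configuration references a Cloud KMS key that the customer controls and can rotate. The fragment below sketches that disk configuration; the project, key ring, and key names are hypothetical.

```python
# Sketch of a CMEK-encrypted boot disk for a MIG instance template: the disk
# references a customer-managed Cloud KMS key, which the customer can rotate
# on their own schedule. Resource names are hypothetical.
boot_disk = {
    "initializeParams": {
        "sourceImage": "projects/debian-cloud/global/images/family/debian-12"
    },
    "diskEncryptionKey": {
        "kmsKeyName": (
            "projects/example-project/locations/us-central1/"
            "keyRings/cluster-ring/cryptoKeys/boot-disk-key"
        )
    },
}
print(boot_disk["diskEncryptionKey"]["kmsKeyName"])
```

CSEK would instead require supplying raw key material with every API call, which is operationally heavier for bursty MIG workloads.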
Question # 5

Your Security team believes that a former employee of your company gained unauthorized access to Google Cloud resources some time in the past 2 months by using a service account key. You need to confirm the unauthorized access and determine the user activity. What should you do?

Options:

A.  

Use Security Health Analytics to determine user activity.

B.  

Use the Cloud Monitoring console to filter audit logs by user.

C.  

Use the Cloud Data Loss Prevention API to query logs in Cloud Storage.

D.  

Use the Logs Explorer to search for user activity.

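The Logs Explorer approach comes down to an audit-log filter on the suspect identity over the window in question. The snippet below assembles such a filter string locally; the service-account email is hypothetical.

```python
from datetime import datetime, timedelta, timezone

# Assemble a Logs Explorer filter: Cloud Audit Log entries attributed to a
# suspect service account over the past ~2 months. The email is hypothetical.
sa_email = "former-employee-sa@example-project.iam.gserviceaccount.com"
since = (datetime.now(timezone.utc) - timedelta(days=60)).strftime("%Y-%m-%dT%H:%M:%SZ")

log_filter = (
    'logName:"cloudaudit.googleapis.com"'
    f' AND protoPayload.authenticationInfo.principalEmail="{sa_email}"'
    f' AND timestamp>="{since}"'
)
print(log_filter)
```

protoPayload.authenticationInfo.principalEmail is the audit-log field that records which identity made each call, which is exactly what confirming unauthorized access requires.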
Question # 6

You are responsible for protecting highly sensitive data in BigQuery. Your operations teams need access to this data, but given privacy regulations, you want to ensure that they cannot read the sensitive fields such as email addresses and first names. These specific sensitive fields should only be available on a need-to-know basis to the HR team. What should you do?

Options:

A.  

Perform data masking with the DLP API and store that data in BigQuery for later use.

B.  

Perform data redaction with the DLP API and store that data in BigQuery for later use.

C.  

Perform data inspection with the DLP API and store that data in BigQuery for later use.

D.  

Perform tokenization for Pseudonymization with the DLP API and store that data in BigQuery for later use.

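As a local stand-in for the masking idea: replace the characters of sensitive fields with a fixed masking character before the rows reach the operations team, leaving the rest of the record usable. A real deployment would call the Cloud DLP de-identify API with a character-mask transformation instead of this toy.

```python
# Toy character masking: sensitive fields are overwritten with '*' before
# rows are shared. A real pipeline would use the Cloud DLP API's masking
# transformation rather than this local helper.
def mask(value: str, keep_last: int = 0) -> str:
    cut = len(value) - keep_last
    return "*" * cut + value[cut:]

row = {"first_name": "Ada", "email": "ada@example.com", "team": "ops"}
masked = {k: (mask(v) if k in ("first_name", "email") else v) for k, v in row.items()}
print(masked)
```

The HR team would query the unmasked source table, while operations sees only the masked copy.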
Question # 7

You are routing all your internet facing traffic from Google Cloud through your on-premises internet connection. You want to accomplish this goal securely and with the highest bandwidth possible.

What should you do?

Options:

A.  

Create an HA VPN connection to Google Cloud. Replace the default 0.0.0.0/0 route.

B.  

Create a routing VM in Compute Engine. Configure the default route with the VM as the next hop.

C.  

Configure Cloud Interconnect with HA VPN. Replace the default 0.0.0.0/0 route to an on-premises destination.

D.  

Configure Cloud Interconnect and route traffic through an on-premises firewall.

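Whichever connectivity option is chosen, the routing change itself is the same shape: the VPC's default internet route is replaced so 0.0.0.0/0 points toward on-premises. The fragment below sketches that route as a plain record; the description and priority are illustrative, not an export of a real route.

```python
# Sketch of the default-route replacement: 0.0.0.0/0 learned from on-premises
# (e.g. via Cloud Router/BGP over Interconnect), so all internet-bound egress
# traverses the on-prem connection. Values are illustrative.
custom_route = {
    "destRange": "0.0.0.0/0",
    "priority": 100,
    "description": "Default route toward on-premises over Cloud Interconnect",
}
print(custom_route["destRange"])
```

Cloud Interconnect is the option built for the "highest bandwidth possible" requirement; VPN tunnels cap out at much lower per-tunnel throughput.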
Question # 8

An employer wants to track how bonus compensations have changed over time to identify employee outliers and correct earning disparities. This task must be performed without exposing the sensitive compensation data for any individual and must be reversible to identify the outlier.

Which Cloud Data Loss Prevention API technique should you use to accomplish this?

Options:

A.  

Generalization

B.  

Redaction

C.  

CryptoHashConfig

D.  

CryptoReplaceFfxFpeConfig

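The distinguishing property here is reversibility: format-preserving encryption (CryptoReplaceFfxFpeConfig) can be decrypted with the key to re-identify an outlier, while hashing and redaction cannot. The fragment below sketches the shape of such a de-identify configuration; the KMS key name and wrapped-key placeholder are hypothetical, and the fields should be read as illustrative rather than an official schema dump.

```python
# Illustrative DLP de-identify config using format-preserving encryption
# (FFX FPE), which is reversible with the key. Key names are hypothetical.
deidentify_config = {
    "infoTypeTransformations": {
        "transformations": [
            {
                "primitiveTransformation": {
                    "cryptoReplaceFfxFpeConfig": {
                        "cryptoKey": {
                            "kmsWrapped": {
                                "wrappedKey": "BASE64_WRAPPED_KEY",  # placeholder
                                "cryptoKeyName": (
                                    "projects/example/locations/global/"
                                    "keyRings/dlp/cryptoKeys/fpe"
                                ),
                            }
                        },
                        "commonAlphabet": "NUMERIC",
                    }
                }
            }
        ]
    }
}
transform = deidentify_config["infoTypeTransformations"]["transformations"][0]
print("cryptoReplaceFfxFpeConfig" in transform["primitiveTransformation"])
```

CryptoHashConfig, by contrast, is one-way: it would protect the data but could never identify the outlier afterwards.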
Question # 9

You have the following resource hierarchy. There is an organization policy at each node in the hierarchy as shown. Which load balancer types are denied in VPC A?

[Figure: resource hierarchy diagram showing the organization policy set at each node]

Options:

A.  

All load balancer types are denied in accordance with the global node’s policy.

B.  

INTERNAL_TCP_UDP, INTERNAL_HTTP_HTTPS is denied in accordance with the folder’s policy.

C.  

EXTERNAL_TCP_PROXY, EXTERNAL_SSL_PROXY are denied in accordance with the project’s policy.

D.  

EXTERNAL_TCP_PROXY, EXTERNAL_SSL_PROXY, INTERNAL_TCP_UDP, and INTERNAL_HTTP_HTTPS are denied in accordance with the folder and project’s policies.

Question # 10

Your organization recently deployed a new application on Google Kubernetes Engine. You need to deploy a solution to protect the application. The solution has the following requirements:

Scans must run at least once per week

Must be able to detect cross-site scripting vulnerabilities

Must be able to authenticate using Google accounts

Which solution should you use?

Options:

A.  

Google Cloud Armor

B.  

Web Security Scanner

C.  

Security Health Analytics

D.  

Container Threat Detection

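Mapping the requirements to Web Security Scanner: it crawls the application, its standard checks include cross-site scripting detection, it can authenticate with a Google account, and a scan config carries a recurring schedule. The sketch below shows the shape of such a configuration; the URL and account are hypothetical, and the field names should be treated as illustrative of the v1 API shape.

```python
# Illustrative Web Security Scanner scan configuration: weekly schedule and
# Google-account authentication. The URL, account, and password placeholder
# are hypothetical.
scan_config = {
    "displayName": "weekly-gke-app-scan",
    "startingUrls": ["https://app.example.com/"],
    "schedule": {"intervalDurationDays": 7},  # at least once per week
    "authentication": {
        "googleAccount": {"username": "scanner@example.com", "password": "..."}
    },
}
print(scan_config["schedule"]["intervalDurationDays"])
```

Cloud Armor filters traffic but does not scan for XSS, and Container Threat Detection watches runtime behavior rather than web vulnerabilities.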
Question # 11

You are backing up application logs to a shared Cloud Storage bucket that is accessible to both the administrator and analysts. Analysts should not have access to logs that contain any personally identifiable information (PII). Log files containing PII should be stored in another bucket that is only accessible to the administrator. What should you do?

Options:

A.  

Upload the logs to both the shared bucket and the bucket with PII that is only accessible to the administrator. Use the Cloud Data Loss Prevention API to create a job trigger. Configure the trigger to delete any files that contain PII from the shared bucket.

B.  

On the shared bucket, configure Object Lifecycle Management to delete objects that contain PII.

C.  

On the shared bucket, configure a Cloud Storage trigger that is only triggered when PII is uploaded. Use Cloud Functions to capture the trigger and delete the files that contain PII.

D.  

Use Pub/Sub and Cloud Functions to trigger a Cloud Data Loss Prevention scan every time a file is uploaded to the administrator's bucket. If the scan does not detect PII, have the function move the objects into the shared Cloud Storage bucket.

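The scan-then-move flow can be reproduced locally to see why its ordering matters: every file lands in the admin-only bucket first, and only files with no PII hits are promoted to the shared bucket, so analysts never see PII even transiently. The toy below substitutes an email regex for a real DLP inspect job.

```python
import re

# Toy stand-in for a DLP-gated routing step: scan each uploaded log and only
# promote PII-free files to the shared bucket. A real pipeline would run a
# Cloud DLP inspect job instead of this email regex.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def route(contents: str) -> str:
    """Return the destination bucket for a log file's contents."""
    return "admin-only-bucket" if EMAIL.search(contents) else "shared-bucket"

print(route("request ok user=ada@example.com"))  # contains PII
print(route("disk usage at 71%"))                # clean
```

The delete-after-upload alternatives leave a window in which analysts could read PII from the shared bucket before cleanup runs.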
Question # 12

You run applications on Cloud Run. You already enabled container analysis for vulnerability scanning. However, you are concerned about the lack of control on the applications that are deployed. You must ensure that only trusted container images are deployed on Cloud Run.

What should you do?

Choose 2 answers

Options:

A.  

Enable Binary Authorization on the existing Kubernetes cluster.

B.  

Set the organization policy constraint constraints/run.allowedBinaryAuthorizationPolicies to the list of allowed Binary Authorization policy names.

C.  

Set the organization policy constraint constraints/compute.trustedImageProjects to the list of projects that contain the trusted container images.

D.  

Enable Binary Authorization on the existing Cloud Run service.

E.  

Use Cloud Run breakglass to deploy an image that meets the Binary Authorization policy by default.

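To see how options B and D fit together: the organization policy pins which Binary Authorization policies Cloud Run services may reference, and each service is then deployed with Binary Authorization enabled. The fragment below sketches that org policy; the allowed value "default" is illustrative.

```python
# Sketch of the Cloud Run Binary Authorization org policy: only the named
# Binary Authorization policies may be used by services in scope. The allowed
# value here is illustrative.
org_policy = {
    "constraint": "constraints/run.allowedBinaryAuthorizationPolicies",
    "listPolicy": {"allowedValues": ["default"]},
}
print(org_policy["constraint"])
```

Note that Cloud Run is not a Kubernetes cluster, and compute.trustedImageProjects governs Compute Engine images, which is why those options miss the requirement.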
Question # 13

You need to follow Google-recommended practices to leverage envelope encryption and encrypt data at the application layer.

What should you do?

Options:

A.  

Generate a data encryption key (DEK) locally to encrypt the data, and generate a new key encryption key (KEK) in Cloud KMS to encrypt the DEK. Store both the encrypted data and the encrypted DEK.

B.  

Generate a data encryption key (DEK) locally to encrypt the data, and generate a new key encryption key (KEK) in Cloud KMS to encrypt the DEK. Store both the encrypted data and the KEK.

C.  

Generate a new data encryption key (DEK) in Cloud KMS to encrypt the data, and generate a key encryption key (KEK) locally to encrypt the key. Store both the encrypted data and the encrypted DEK.

D.  

Generate a new data encryption key (DEK) in Cloud KMS to encrypt the data, and generate a key encryption key (KEK) locally to encrypt the key. Store both the encrypted data and the KEK.

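The envelope-encryption flow is easiest to see end to end in code. The toy below uses XOR as a stand-in cipher purely for illustration (a real system encrypts locally with AES and has Cloud KMS wrap the DEK); the point is what gets stored: the ciphertext and the wrapped DEK, never the KEK.

```python
import secrets

def xor(data: bytes, key: bytes) -> bytes:
    """Toy stand-in cipher -- never use XOR like this in production."""
    return bytes(a ^ b for a, b in zip(data, key))

plaintext = b"sensitive payload"
dek = secrets.token_bytes(len(plaintext))  # data encryption key, generated locally
kek = secrets.token_bytes(len(dek))        # stands in for the KMS-held KEK

ciphertext = xor(plaintext, dek)           # encrypt data with the DEK
wrapped_dek = xor(dek, kek)                # "KMS" wraps the DEK with the KEK
stored = (ciphertext, wrapped_dek)         # persist ciphertext + encrypted DEK only

# Decryption: unwrap the DEK with the KEK, then decrypt the data.
recovered = xor(stored[0], xor(stored[1], kek))
print(recovered)
```

Storing the KEK alongside the data (as in the other options) defeats the purpose: the KEK must stay in Cloud KMS.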
Question # 14

You need to implement an encryption-at-rest strategy that protects sensitive data and reduces key management complexity for non-sensitive data. Your solution has the following requirements:

  • Schedule key rotation for sensitive data.
  • Control which region the encryption keys for sensitive data are stored in.
  • Minimize the latency to access encryption keys for both sensitive and non-sensitive data.

What should you do?

Options:

A.  

Encrypt non-sensitive data and sensitive data with Cloud External Key Manager.

B.  

Encrypt non-sensitive data and sensitive data with Cloud Key Management Service.

C.  

Encrypt non-sensitive data with Google default encryption, and encrypt sensitive data with Cloud External Key Manager.

D.  

Encrypt non-sensitive data with Google default encryption, and encrypt sensitive data with Cloud Key Management Service.

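For the sensitive-data half of this split, a Cloud KMS key is created in a chosen region with scheduled rotation, while non-sensitive data simply stays on Google default encryption with no keys to manage. The sketch below shows the shape of such a key; the resource names and timestamp are hypothetical.

```python
# Sketch of a regional Cloud KMS key with scheduled rotation for the
# sensitive data. 7776000s is 90 days. Names and times are hypothetical.
crypto_key = {
    "name": (
        "projects/example/locations/us-east1/"
        "keyRings/sensitive/cryptoKeys/data-key"
    ),
    "purpose": "ENCRYPT_DECRYPT",
    "rotationPeriod": "7776000s",
    "nextRotationTime": "2025-01-01T00:00:00Z",
}
print(crypto_key["name"])
```

External Key Manager would add latency and operational overhead by keeping key material outside Google Cloud, which conflicts with the latency requirement.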
Question # 15

A customer has 300 engineers. The company wants to grant different levels of access and efficiently manage IAM permissions between users in the development and production environment projects.

Which two steps should the company take to meet these requirements? (Choose two.)

Options:

A.  

Create a project with multiple VPC networks for each environment.

B.  

Create a folder for each development and production environment.

C.  

Create a Google Group for the Engineering team, and assign permissions at the folder level.

D.  

Create an Organizational Policy constraint for each folder environment.

E.  

Create projects for each environment, and grant IAM rights to each engineering user.

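The folder-plus-group pattern can be summarized as a small data structure: one folder per environment, with IAM granted at the folder level to a Google Group rather than to 300 individual engineers. Roles and group addresses below are hypothetical.

```python
# Sketch of per-environment folders with group-based IAM at the folder level,
# so membership changes are handled in the group, not in project policies.
# Roles and addresses are hypothetical.
hierarchy = {
    "folders": {
        "development": [
            {"role": "roles/editor", "members": ["group:eng@example.com"]}
        ],
        "production": [
            {"role": "roles/viewer", "members": ["group:eng@example.com"]}
        ],
    }
}
print(sorted(hierarchy["folders"]))
```

Granting rights to each engineering user individually, by contrast, creates 300 bindings to maintain per environment.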
Get Professional-Cloud-Security-Engineer dumps and pass your exam in 24 hours!
