
Professional-Cloud-Security-Engineer (Google Cloud Certified - Professional Cloud Security Engineer) practice material is now stable and updated | Test Your Knowledge for Free

Exams4sure Dumps

Professional-Cloud-Security-Engineer Practice Questions

Google Cloud Certified - Professional Cloud Security Engineer

Last Update: 22 hours ago
Total Questions: 318

Dive into our fully updated and stable Professional-Cloud-Security-Engineer practice test platform, featuring all the latest Google Cloud Certified exam questions added this week. Our preparation tool is more than just a Google study aid; it's a strategic advantage.

Our free Google Cloud Certified practice questions are crafted to reflect the domains and difficulty of the actual exam. The detailed rationales explain the 'why' behind each answer, reinforcing key Professional-Cloud-Security-Engineer concepts. Use this test to pinpoint the areas where your study should focus.

Professional-Cloud-Security-Engineer PDF

Professional-Cloud-Security-Engineer PDF (Printable)
$43.75
$124.99

Professional-Cloud-Security-Engineer Testing Engine

Professional-Cloud-Security-Engineer Testing Engine
$50.75
$144.99

Professional-Cloud-Security-Engineer PDF + Testing Engine

Professional-Cloud-Security-Engineer PDF + Testing Engine
$63.70
$181.99
Question # 61

Your organization has 3 TB of information in BigQuery and Cloud SQL. You need to develop a cost-effective, scalable, and secure strategy to anonymize the personally identifiable information (PII) that exists today. What should you do?

Options:

A.  

Scan your BigQuery and Cloud SQL data using the Cloud DLP data profiling feature. Use the data profiling results to create a de-identification strategy with either Cloud Sensitive Data Protection's de-identification templates or custom configurations.

B.  

Create a new BigQuery dataset and Cloud SQL instance. Copy a small subset of the data to these new locations. Use Cloud Data Loss Prevention API to scan this subset for PII. Based on the results, create a custom anonymization script and apply the script to the entire 3 TB dataset in the original locations.

C.  

Export all 3 TB of data from BigQuery and Cloud SQL to Cloud Storage. Use Cloud Sensitive Data Protection to anonymize the exported data. Re-import the anonymized data back into BigQuery and Cloud SQL.

D.  

Inspect a representative sample of the data in BigQuery and Cloud SQL to identify PII. Based on this analysis, develop a custom script to anonymize the identified PII.

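The de-identification strategy in option A can be pictured with a minimal, library-free sketch: Sensitive Data Protection's character-mask transformation replaces each character of a detected finding with a mask character. The regex below is only an illustrative stand-in for the service's server-side infoType detectors (e.g. EMAIL_ADDRESS), not the Cloud DLP API itself.

```python
import re

# Illustrative stand-in for Sensitive Data Protection's character-mask
# de-identification; the real service detects infoTypes server-side.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def mask_emails(text: str, mask_char: str = "#") -> str:
    """Replace every character of each detected email with mask_char."""
    return EMAIL_RE.sub(lambda m: mask_char * len(m.group()), text)

print(mask_emails("Contact alice@example.com for access."))
# -> Contact ################# for access.
```

The key point of answer A is that data profiling finds where PII lives first, so transformations like this are applied only where needed instead of exporting and rewriting all 3 TB.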
Question # 62

You are deploying regulated workloads on Google Cloud. The regulation has data residency and data access requirements. It also requires that support is provided from the same geographical location as where the data resides.

What should you do?

Options:

A.  

Enable Access Transparency Logging.

B.  

Deploy resources only to regions permitted by data residency requirements

C.  

Use Data Access logging and Access Transparency logging to confirm that no users are accessing data from another region.

D.  

Deploy Assured Workloads.

Question # 63

Your organization has an operational image classification model running on a managed AI service on Google Cloud. You are in a configuration review with stakeholders and must describe the security responsibilities for the image classification model. What should you do?

Options:

A.  

Explain the development of custom network firewalls around the image classification service for deep intrusion detection and prevention. Describe vulnerability scanning tools for known vulnerabilities.

B.  

Explain Google's shared responsibility model. Focus the configuration review on Identity and Access Management (IAM) permissions, secure data upload/download procedures, and monitoring logs for any potential malicious activity.

C.  

Explain that using platform-as-a-service (PaaS) transfers security concerns to Google. Describe the need for strict API usage limits to protect against unexpected usage and billing spikes.

D.  

Explain the security aspects of the code that transforms user-uploaded images using Google's service. Define Cloud IAM for fine-grained access control within the development team.

Question # 64

You are creating a new infrastructure CI/CD pipeline to deploy hundreds of ephemeral projects in your Google Cloud organization to enable your users to interact with Google Cloud. You want to restrict the use of the default networks in your organization while following Google-recommended best practices. What should you do?

Options:

A.  

Enable the constraints/compute.skipDefaultNetworkCreation organization policy constraint at the organization level.

B.  

Create a cron job to trigger a daily Cloud Function to automatically delete all default networks for each project.

C.  

Grant your users the IAM Owner role at the organization level. Create a VPC Service Controls perimeter around the project that restricts the compute.googleapis.com API.

D.  

Only allow your users to use your CI/CD pipeline with a predefined set of infrastructure templates they can deploy to skip the creation of the default networks.

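The boolean constraint in option A is set declaratively at the organization node. A sketch of the Org Policy v2 spec (the organization ID below is a placeholder) that `gcloud org-policies set-policy policy.yaml` would apply:

```yaml
# Placeholder org ID; apply with: gcloud org-policies set-policy policy.yaml
name: organizations/123456789012/policies/compute.skipDefaultNetworkCreation
spec:
  rules:
  - enforce: true
```

Because the constraint is enforced at the organization level, every new ephemeral project the pipeline creates inherits it, so no default VPC is ever created and no cleanup job is needed.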
Question # 65

You are the security admin of your company. Your development team creates multiple GCP projects under the "implementation" folder for several dev, staging, and production workloads. You want to prevent data exfiltration by malicious insiders or compromised code by setting up a security perimeter. However, you do not want to restrict communication between the projects.

What should you do?

Options:

A.  

Use a Shared VPC to enable communication between all projects, and use firewall rules to prevent data exfiltration.

B.  

Create access levels in Access Context Manager to prevent data exfiltration, and use a shared VPC for communication between projects.

C.  

Use an infrastructure-as-code software tool to set up a single service perimeter and to deploy a Cloud Function that monitors the "implementation" folder via Stackdriver and Cloud Pub/Sub. When the function notices that a new project is added to the folder, it executes Terraform to add the new project to the associated perimeter.

D.  

Use an infrastructure-as-code software tool to set up three different service perimeters for dev, staging, and prod and to deploy a Cloud Function that monitors the "implementation" folder via Stackdriver and Cloud Pub/Sub. When the function notices that a new project is added to the folder, it executes Terraform to add the new project to the respective perimeter.

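The core logic of the Cloud Function described in answer C can be sketched as a pure function (names here are hypothetical; the real function would call the Access Context Manager API or run Terraform). VPC Service Controls perimeters list member projects in the form "projects/<project-number>":

```python
# Illustrative sketch of the Cloud Function's perimeter-update logic.
def add_project_to_perimeter(perimeter_resources: list[str],
                             project_number: int) -> list[str]:
    """Return the perimeter's resource list with the new project included."""
    resource = f"projects/{project_number}"
    if resource in perimeter_resources:
        return perimeter_resources  # already inside the perimeter
    return perimeter_resources + [resource]
```

A single perimeter (rather than three, as in option D) keeps dev, staging, and production projects inside one boundary, which blocks exfiltration while leaving project-to-project communication unrestricted.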
Question # 66

You need to centralize your team’s logs for production projects. You want your team to be able to search and analyze the logs using Logs Explorer. What should you do?

Options:

A.  

Enable Cloud Monitoring workspace, and add the production projects to be monitored.

B.  

Use Logs Explorer at the organization level and filter for production project logs.

C.  

Create an aggregate org sink at the parent folder of the production projects, and set the destination to a Cloud Storage bucket.

D.  

Create an aggregate org sink at the parent folder of the production projects, and set the destination to a logs bucket.

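Answer D maps to a single CLI call; the project, bucket, and folder IDs below are placeholders. The destination must be a Logging bucket (not a Cloud Storage bucket, as in option C) so the routed logs stay queryable in Logs Explorer:

```shell
# Placeholders: central-proj project, prod-logs log bucket, folder 123456789.
gcloud logging sinks create prod-central-sink \
  logging.googleapis.com/projects/central-proj/locations/global/buckets/prod-logs \
  --folder=123456789 --include-children
```

The --include-children flag is what makes this an aggregated sink: it routes logs from every production project under the folder, not just logs generated at the folder itself.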
Question # 67

The security operations team needs access to the security-related logs for all projects in their organization. They have the following requirements:

Follow the least privilege model by having only view access to logs.

Have access to Admin Activity logs.

Have access to Data Access logs.

Have access to Access Transparency logs.

Which Identity and Access Management (IAM) role should the security operations team be granted?

Options:

A.  

roles/logging.privateLogViewer

B.  

roles/logging.admin

C.  

roles/viewer

D.  

roles/logging.viewer

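A quick way to see why option D falls short and option A fits is to summarize which log types each view-only role can read (a simplification of the Cloud Logging access-control docs; roles/logging.admin is excluded because it grants write and configuration access, not just viewing):

```python
# Simplified summary: log types each view-only Logging role can read.
VIEW_ONLY_LOG_ACCESS = {
    "roles/logging.viewer": {"admin_activity"},
    "roles/logging.privateLogViewer": {
        "admin_activity", "data_access", "access_transparency",
    },
}

def least_privilege_role(required: set[str]) -> str:
    """Smallest view-only role whose log access covers the requirement."""
    for role in ("roles/logging.viewer", "roles/logging.privateLogViewer"):
        if required <= VIEW_ONLY_LOG_ACCESS[role]:
            return role
    raise ValueError("no view-only logging role covers the requirement")

print(least_privilege_role(
    {"admin_activity", "data_access", "access_transparency"}))
# -> roles/logging.privateLogViewer
```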
Question # 68

An engineering team is launching a web application that will be public on the internet. The web application is hosted in multiple GCP regions and will be directed to the respective backend based on the URL request.

Your team wants to avoid exposing the application directly on the internet and wants to deny traffic from a specific list of malicious IP addresses.

Which solution should your team implement to meet these requirements?

Options:

A.  

Cloud Armor

B.  

Network Load Balancing

C.  

SSL Proxy Load Balancing

D.  

NAT Gateway

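How a Cloud Armor deny rule matched on source IP ranges behaves can be modeled with the standard library; this is an illustration only (the real evaluation happens in Cloud Armor security policies at Google's edge, and the CIDR ranges below are documentation example addresses):

```python
import ipaddress

# Example denylist, analogous to a Cloud Armor deny rule's srcIpRanges.
DENYLIST = [ipaddress.ip_network(c)
            for c in ("198.51.100.0/24", "203.0.113.7/32")]

def is_allowed(client_ip: str) -> bool:
    """True unless the client IP falls inside a denylisted range."""
    ip = ipaddress.ip_address(client_ip)
    return not any(ip in net for net in DENYLIST)
```

Cloud Armor attaches these policies to the global external HTTP(S) load balancer, which also provides the URL-based routing to regional backends that the question describes.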
Question # 69

You are a security administrator at your company. Per Google-recommended best practices, you implemented the domain restricted sharing organization policy to allow only required domains to access your projects. An engineering team is now reporting that users at an external partner outside your organization domain cannot be granted access to the resources in a project. How should you make an exception for your partner's domain while following the stated best practices?

Options:

A.  

Turn off the domain restricted sharing organization policy. Set the policy value to "Allow All."

B.  

Turn off the domain restricted sharing organization policy. Provide the external partners with the required permissions using Google's Identity and Access Management (IAM) service.

C.  

Turn off the domain restricted sharing organization policy. Add each partner's Google Workspace customer ID to a Google group, add the Google group as an exception under the organization policy, and then turn the policy back on.

D.  

Turn off the domain restricted sharing organization policy. Set the policy value to "Custom." Add each external partner's Cloud Identity or Google Workspace customer ID as an exception under the organization policy, and then turn the policy back on.

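The exception in option D is expressed as an allowed-values list on the iam.allowedPolicyMemberDomains constraint. A sketch of the Org Policy spec (organization and customer IDs below are placeholders) that `gcloud org-policies set-policy policy.yaml` would apply:

```yaml
# Placeholder IDs; apply with: gcloud org-policies set-policy policy.yaml
name: organizations/123456789012/policies/iam.allowedPolicyMemberDomains
spec:
  rules:
  - values:
      allowedValues:
      - C01234567   # your own Google Workspace customer ID
      - C07654321   # the partner's customer ID (the exception)
```

Note that the values are Cloud Identity / Workspace customer IDs, not domain names, which is why option D's wording matters.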
Question # 70

You are troubleshooting access denied errors between Compute Engine instances connected to a Shared VPC and BigQuery datasets. The datasets reside in a project protected by a VPC Service Controls perimeter. What should you do?

Options:

A.  

Add the host project containing the Shared VPC to the service perimeter.

B.  

Add the service project where the Compute Engine instances reside to the service perimeter.

C.  

Create a service perimeter between the service project where the Compute Engine instances reside and the host project that contains the Shared VPC.

D.  

Create a perimeter bridge between the service project where the Compute Engine instances reside and the perimeter that contains the protected BigQuery datasets.

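Conceptually, a perimeter bridge (answer D) is just a list of projects from different perimeters that are allowed to exchange data. A toy model (project names here are hypothetical):

```python
# Toy model of a VPC Service Controls perimeter bridge: cross-perimeter
# traffic is allowed only when both projects appear in a common bridge.
def bridge_allows(bridge_projects: set[str], src: str, dst: str) -> bool:
    """True if the bridge lists both the source and destination project."""
    return {src, dst} <= bridge_projects

bridge = {"projects/compute-svc", "projects/bq-data"}
```

This is why D works where A and B do not: the bridge connects the Compute Engine service project to the existing perimeter around the BigQuery datasets without restructuring either perimeter.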
Get Professional-Cloud-Security-Engineer dumps and pass your exam in 24 hours!
