Question # 1 Your application needs to process credit card transactions. You want the smallest scope of Payment Card Industry (PCI) compliance without compromising the ability to analyze transactional data and trends relating to which payment methods are used. How should you design your architecture?
A. Create a tokenizer service and store only tokenized data.
B. Create separate projects that only process credit card data.
C. Create separate subnetworks and isolate the components that process credit card data.
D. Streamline the audit discovery phase by labeling all of the virtual machines (VMs) that process PCI data.
E. Enable Logging export to Google BigQuery and use ACLs and views to scope the data shared with the auditor.
Answer: E. Enable Logging export to Google BigQuery and use ACLs and views to scope the data shared with the auditor.
Answer Description The model described in Google's PCI-DSS solution guide is to forward payment-processing logs from a Squid proxy to Stackdriver Logging and then export them from Stackdriver Logging into BigQuery, where ACLs and views limit what is shared with the auditor. Reference: https://cloud.google.com/solutions/pci-dss
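For readers who want to see what that export step looks like in practice, here is a minimal sketch using the google-cloud-logging Python library. The project name my-project, the sink name, the filter, and the BigQuery dataset pci_audit_logs are all placeholder assumptions, not part of the question.

# Sketch only: create a Logging sink that exports payment-processing logs to
# BigQuery, where dataset ACLs and authorized views scope what an auditor sees.
# "my-project" and "pci_audit_logs" are illustrative placeholders.
from google.cloud import logging

client = logging.Client(project="my-project")

sink = client.sink(
    "pci-payment-logs",
    filter_='resource.type="gce_instance" AND logName:"squid"',  # proxy logs only
    destination="bigquery.googleapis.com/projects/my-project/datasets/pci_audit_logs",
)

if not sink.exists():
    sink.create()
    # Remember to grant the sink's writer identity write access on the dataset
    # so exported entries can land in BigQuery.
print("Sink ready:", sink.name)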
Question # 2 You are developing an application using different microservices that should remain internal to the cluster. You want to be able to configure each microservice with a specific number of replicas. You also want to be able to address a specific microservice from any other microservice in a uniform way, regardless of the number of replicas the microservice scales to. You need to implement this solution on Google Kubernetes Engine. What should you do?
A. Deploy each microservice as a Deployment. Expose the Deployment in the cluster using a Service, and use the Service DNS name to address it from other microservices within the cluster.
B. Deploy each microservice as a Deployment. Expose the Deployment in the cluster using an Ingress, and use the Ingress IP address to address the Deployment from other microservices within the cluster.
C. Deploy each microservice as a Pod. Expose the Pod in the cluster using a Service, and use the Service DNS name to address the microservice from other microservices within the cluster.
D. Deploy each microservice as a Pod. Expose the Pod in the cluster using an Ingress, and use the Ingress IP address to address the Pod from other microservices within the cluster.
Answer: A. Deploy each microservice as a Deployment. Expose the Deployment in the cluster using a Service, and use the Service DNS name to address it from other microservices within the cluster.
Answer Description A Deployment lets you declare how many replicas each microservice runs, and exposing it with a ClusterIP Service gives it a stable DNS name that other microservices in the cluster can use no matter how many replicas are behind it. An Ingress is for routing external HTTP(S) traffic into the cluster, which these internal-only microservices do not need. References: https://kubernetes.io/docs/concepts/services-networking/service/ and https://kubernetes.io/docs/concepts/services-networking/ingress/
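As an illustration of answer A (not part of the exam material), the sketch below uses the official kubernetes Python client to create a Deployment with a fixed replica count and a ClusterIP Service in front of it. The microservice name orders, the image path, and the ports are placeholder assumptions.

# Sketch only: one microservice as a Deployment (replica count) plus a
# ClusterIP Service (stable in-cluster DNS name). Names and image are placeholders.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() inside the cluster
apps = client.AppsV1Api()
core = client.CoreV1Api()

labels = {"app": "orders"}

deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="orders"),
    spec=client.V1DeploymentSpec(
        replicas=3,  # scale this microservice independently of the others
        selector=client.V1LabelSelector(match_labels=labels),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels=labels),
            spec=client.V1PodSpec(containers=[
                client.V1Container(
                    name="orders",
                    image="gcr.io/my-project/orders:v1",
                    ports=[client.V1ContainerPort(container_port=8080)],
                )
            ]),
        ),
    ),
)
apps.create_namespaced_deployment(namespace="default", body=deployment)

service = client.V1Service(
    metadata=client.V1ObjectMeta(name="orders"),
    spec=client.V1ServiceSpec(
        type="ClusterIP",  # internal to the cluster only
        selector=labels,
        ports=[client.V1ServicePort(port=80, target_port=8080)],
    ),
)
core.create_namespaced_service(namespace="default", body=service)

# Any other microservice can now call http://orders.default.svc.cluster.local,
# regardless of how many replicas are running behind the Service.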
Question # 3 For this question, refer to the Dress4Win case study. Dress4Win would like to become familiar with deploying applications to the cloud by successfully deploying some applications quickly, as is. They have asked for your recommendation. What should you advise?
A. Identify self-contained applications with external dependencies as a first move to the cloud.
B. Identify enterprise applications with internal dependencies and recommend these as a first move to the cloud.
C. Suggest moving their in-house databases to the cloud and continue serving requests to on-premise applications.
D. Recommend moving their message queuing servers to the cloud and continue handling requests to on-premise applications.
Answer: C. Suggest moving their in-house databases to the cloud and continue serving requests to on-premise applications.
Question # 4 For this question, refer to the Mountkirk Games case study. You need to analyze and define the technical architecture for the database workloads for your company, Mountkirk Games. Considering the business and technical requirements, what should you do?
A. Use Cloud SQL for time series data, and use Cloud Bigtable for historical data queries.
B. Use Cloud SQL to replace MySQL, and use Cloud Spanner for historical data queries.
C. Use Cloud Bigtable to replace MySQL, and use BigQuery for historical data queries.
D. Use Cloud Bigtable for time series data, use Cloud Spanner for transactional data, and use BigQuery for historical data queries.
Answer: D. Use Cloud Bigtable for time series data, use Cloud Spanner for transactional data, and use BigQuery for historical data queries.
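To make answer D a little more concrete (an illustration only, nothing here comes from the case study), the following sketch writes one time-series telemetry point to Cloud Bigtable with the google-cloud-bigtable library; the project, instance, table, and column-family names are invented.

# Sketch only: append one time-series event to Cloud Bigtable.
# "my-project", "mountkirk-ts", "player_events", and "stats" are placeholders.
import time
from google.cloud import bigtable

client = bigtable.Client(project="my-project", admin=False)
instance = client.instance("mountkirk-ts")
table = instance.table("player_events")

# Row-key design: entity id plus a reversed timestamp so the most recent
# events for a player sort first in a scan.
player_id = "player#42"
reverse_ts = 2**63 - int(time.time() * 1000)
row_key = f"{player_id}#{reverse_ts}".encode()

row = table.direct_row(row_key)
row.set_cell("stats", "score", str(1337).encode())
row.set_cell("stats", "level", str(7).encode())
row.commit()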
Question # 5 You are implementing a single Cloud SQL MySQL second-generation database that contains business-critical transaction data. You want to ensure that the minimum amount of data is lost in case of catastrophic failure. Which two features should you implement? (Choose two.)
A. Sharding
B. Read replicas
C. Binary logging
D. Automated backups
E. Semisynchronous replication
Answer: C. Binary logging and D. Automated backups
Answer Description Explanation: Backups help you restore lost data to your Cloud SQL instance. Additionally, if an instance is having a problem, you can restore it to a previous state by using the backup to overwrite it. Enable automated backups for any instance that contains necessary data. Backups protect your data from loss or damage. Enabling automated backups, along with binary logging, is also required for some operations, such as clone and replica creation. Reference: https://cloud.google.com/sql/docs/mysql/backup-recovery/backups
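If you want to see where those two settings live, here is a hedged sketch that patches an existing instance through the Cloud SQL Admin API using google-api-python-client; the project name, instance name, and backup window are placeholders.

# Sketch only: enable automated backups and binary logging on an existing
# Cloud SQL for MySQL instance. Runs with Application Default Credentials.
from googleapiclient import discovery

service = discovery.build("sqladmin", "v1beta4")

body = {
    "settings": {
        "backupConfiguration": {
            "enabled": True,           # automated daily backups
            "binaryLogEnabled": True,  # binary logging for point-in-time recovery
            "startTime": "03:00",      # UTC backup window (illustrative)
        }
    }
}

request = service.instances().patch(
    project="my-project", instance="transactions-db", body=body
)
response = request.execute()
print(response["status"])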
Question # 6 Your organization has a 3-tier web application deployed in the same network on Google Cloud Platform. Each tier (web, API, and database) scales independently of the others. Network traffic should flow from the web tier to the API tier and then on to the database tier. Traffic should not flow between the web and the database tier. How should you configure the network?
A. Add each tier to a different subnetwork.
B. Set up software based firewalls on individual VMs.
C. Add tags to each tier and set up routes to allow the desired traffic flow.
D. Add tags to each tier and set up firewall rules to allow the desired traffic flow.
Answer: D. Add tags to each tier and set up firewall rules to allow the desired traffic flow.
Answer Description Network tags let firewall rules target specific groups of instances: allow ingress to the API tier only from instances tagged for the web tier, allow ingress to the database tier only from instances tagged for the API tier, and leave no rule that permits web-to-database traffic. Separate subnetworks alone do not block traffic inside the same network, and routes control paths rather than permissions. The analogous pattern with AWS security groups: https://aws.amazon.com/blogs/aws/building-three-tier-architectures-with-security-groups/
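The sketch below shows what answer D could look like with the google-cloud-compute Python library, assuming the instances in each tier carry the network tags web, api, and db; the tag names, ports, network, and project are placeholders.

# Sketch only: tag-based firewall rules so traffic can flow web -> api -> db,
# but never web -> db. Names, ports, and project are illustrative.
from google.cloud import compute_v1

def allow(name: str, src_tag: str, dst_tag: str, port: str) -> compute_v1.Firewall:
    rule = compute_v1.Firewall()
    rule.name = name
    rule.network = "global/networks/default"
    rule.direction = "INGRESS"
    rule.source_tags = [src_tag]   # which instances may send traffic
    rule.target_tags = [dst_tag]   # which instances may receive it
    rule.allowed = [compute_v1.Allowed(I_p_protocol="tcp", ports=[port])]
    return rule

client = compute_v1.FirewallsClient()
project = "my-project"

# Only these two paths are opened; web -> db has no matching rule and is
# blocked by the implied deny-ingress rule.
client.insert(project=project, firewall_resource=allow("web-to-api", "web", "api", "8080"))
client.insert(project=project, firewall_resource=allow("api-to-db", "api", "db", "3306"))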
Question # 7 For this question, refer to the Dress4Win case study. Dress4Win is expected to grow to 10 times its size in 1 year with a corresponding growth in data and traffic that mirrors the existing patterns of usage. The CIO has set the target of migrating production infrastructure to the cloud within the next 6 months. How will you configure the solution to scale for this growth without making major application changes and still maximize the ROI?
A. Migrate the web application layer to App Engine, MySQL to Cloud Datastore, and NAS to Cloud Storage. Deploy RabbitMQ, and deploy Hadoop servers using Deployment Manager.
B. Migrate RabbitMQ to Cloud Pub/Sub, Hadoop to BigQuery, and NAS to Compute Engine with Persistent Disk storage. Deploy Tomcat, and deploy Nginx using Deployment Manager.
C. Implement managed instance groups for Tomcat and Nginx. Migrate MySQL to Cloud SQL, RabbitMQ to Cloud Pub/Sub, Hadoop to Cloud Dataproc, and NAS to Compute Engine with Persistent Disk storage.
D. Implement managed instance groups for Tomcat and Nginx. Migrate MySQL to Cloud SQL, RabbitMQ to Cloud Pub/Sub, Hadoop to Cloud Dataproc, and NAS to Cloud Storage.
Answer: C. Implement managed instance groups for Tomcat and Nginx. Migrate MySQL to Cloud SQL, RabbitMQ to Cloud Pub/Sub, Hadoop to Cloud Dataproc, and NAS to Compute Engine with Persistent Disk storage.
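As a small illustration of one piece of that migration (not taken from the case study), publishing to Cloud Pub/Sub is what replaces the RabbitMQ publish call in the messaging layer; the project and topic names below are placeholders.

# Sketch only: the Pub/Sub publish that would stand in for a RabbitMQ
# basic_publish after migration. "my-project" and the topic are illustrative.
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "image-processing-jobs")

future = publisher.publish(topic_path, b'{"image_id": "12345"}', origin="web-tier")
print("published message id:", future.result())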
Question # 8 For this question, refer to the Dress4Win case study. Dress4Win has end-to-end tests covering 100% of their endpoints. They want to ensure that the move to the cloud does not introduce any new bugs. Which additional testing methods should the developers employ to prevent an outage?
A. They should enable Google Stackdriver Debugger on the application code to show errors in the code.
B. They should add additional unit tests and production scale load tests on their cloud staging environment.
C. They should run the end-to-end tests in the cloud staging environment to determine if the code is working as intended.
D. They should add canary tests so developers can measure how much of an impact the new release causes to latency.
Answer: B. They should add additional unit tests and production scale load tests on their cloud staging environment.
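A toy version of such a load test is sketched below with plain Python and the requests library; in practice a dedicated tool such as Locust or JMeter is the better choice, and the staging URL, request count, and success threshold here are all invented.

# Sketch only: a tiny concurrency probe against a staging endpoint.
# URL, counts, and threshold are illustrative placeholders.
import concurrent.futures
import requests

STAGING_URL = "https://staging.example.com/api/health"  # placeholder endpoint
REQUESTS = 500
WORKERS = 50

def hit(_):
    try:
        return requests.get(STAGING_URL, timeout=5).status_code
    except requests.RequestException:
        return None

with concurrent.futures.ThreadPoolExecutor(max_workers=WORKERS) as pool:
    codes = list(pool.map(hit, range(REQUESTS)))

ok = sum(1 for c in codes if c == 200)
print(f"{ok}/{REQUESTS} requests returned 200")
assert ok / REQUESTS > 0.99, "error rate too high for a release candidate"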