Get Started with Databricks for Business Leaders - Quiz 1
https://customer-academy.databricks.com/learn/courses/1660/get-started-with-databricks-for-business-leaders/lessons/11304/get-started-with-databricks-for-business-leaders-test-your-knowledge
Score: 91.5/100
Question 1 of 20
Which three things should be included as focus areas in a successful data and AI strategy?
A. The future impact of data products.
** B. The people that make up data teams.
** C. The processes for handling and using data.
** D. The data platform that will be used.
E. The customers served by the business.
Question 2 of 20
Which three open source technologies were originally created by the Databricks founders or originated from Databricks?
** A. MLflow
B. PostgreSQL
** C. Apache Spark
D. Python
** E. Delta Lake
F. Cassandra
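For context, here is a minimal sketch of the three Databricks-originated projects working together, assuming local installs of pyspark, delta-spark, and mlflow; the table path is hypothetical.

```python
from pyspark.sql import SparkSession
import mlflow

# Apache Spark session with the documented open-source Delta Lake extensions
spark = (
    SparkSession.builder.appName("lakehouse-demo")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Apache Spark: distributed DataFrame processing
df = spark.range(100).withColumnRenamed("id", "value")

# Delta Lake: ACID table storage on top of plain data-lake files
df.write.format("delta").mode("overwrite").save("/tmp/demo_delta_table")

# MLflow: experiment tracking for ML workloads
with mlflow.start_run():
    mlflow.log_param("rows", df.count())
```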
Question 3 of 20
What differentiates the data lakehouse architecture from that of data warehouses or data lakes?
A. The data lakehouse architecture adds an additional layer of complexity on top of data warehouses to interface with data lakes, making it easier to work between multiple systems.
B. The data lakehouse architecture maps multiple data products together to run successfully on a data lake, preventing data from becoming unmanageable when different types are stored together, and operates similarly to a data warehouse.
** C. The data lakehouse architecture is an open architecture combining the benefits of both data warehouses and data lakes, supporting all types of data together and multiple workloads, like BI, ML, and data engineering.
Question 4 of 20
On which cloud platform or platforms can you use Databricks?
** A. Microsoft Azure
** B. Amazon Web Services
** C. Google Cloud Platform
D. Oracle Cloud Infrastructure
E. DigitalOcean
F. Alibaba Cloud
G. Red Hat
Question 5 of 20
Which three data personas does Databricks support with personalized UI spaces in its single platform?
** A. Data scientists
** B. Data analysts
C. Data collectors
** D. Data engineers
E. Data managers
F. Data marketers
Question 6 of 20
What is the purpose of the Databricks Partner Ecosystem?
** A. The Databricks Partner Ecosystem enables you to connect with different ISVs, C&SI firms, and cloud-specific technologies to further enhance your data ecosystem.
B. The Databricks Partner Ecosystem is designed to provide a list of partner-designed and released products that can be used alongside the Databricks Lakehouse Platform.
C. The Databricks Partner Ecosystem is used exclusively to connect potential Databricks customers with consulting firms that specialize in data ecosystem migrations.
Question 7 of 20
How does a Solutions Architect (SA) support a Databricks Customer Account?
** A. An SA assists an AE in demonstrating proof-of-value examples and helping the customer realize strategic value with their Databricks Lakehouse Platform implementation.
B. An SA is the first point of contact for a Databricks customer and has specialized industry and region knowledge, so they can best support you in your journey from start to finish.
C. An SA works with a Databricks Partner to ensure the Databricks Customer Account is properly handed over to be managed for migration, new builds, and implementation.
Question 8 of 20
Why would you need to engage the Databricks Professional Services team on your account?
** A. If your data environment has specific requirements and custom specifications, the Databricks Professional Services team can assist with building customizations, migrations, and general project management.
B. If you run into technical issues with your Databricks Lakehouse Platform implementation, the Databricks Professional Services team can troubleshoot your problem and respond to support requests.
C. If your migration has stalled with a Databricks Partner, the Databricks Professional Services team can step in to interface with the Partner to get your migration back on track.
Question 9 of 20
Who should you speak with to learn about Databricks Professional Services offerings?
** A. Databricks Account Executive
B. Databricks Technical Support
C. Databricks Certified Partner
D. Databricks Development and Training
Question 10 of 20
In addition to Databricks Academy, where can individuals locate self-service, on-demand, cloud-specific Databricks information and technical guidance?
** A. Databricks documentation
B. Databricks Professional Services
C. help.databricks.com
D. Databricks Partner Ecosystem
Question 11 of 20
How does Databricks guarantee its platform is secure and meets regulatory requirements?
** A. The Databricks Lakehouse Platform goes through rigorous testing, provides multiple built-in security capabilities, and is subjected to top industry compliance certifications.
B. The Databricks Lakehouse Platform works with trusted regulatory agencies to implement best practices for security and governance in the implementation of the platform structure and code.
C. The Databricks Lakehouse Platform only utilizes proprietary, closed-source software to ensure no bad actors are capable of introducing problematic code into the core platform design.
Question 12 of 20
Why is the separation of the Databricks Account from the Databricks Workspace a security and access benefit?
** A. With the account and workspace separate, only administrators can access the account details through the account portal, and users are limited to the workspaces they are assigned.
B. Workspaces are not allowed direct access to data storage, ensuring that proper access is only granted from the account before the data can be transferred in and processed.
C. Access control to workspaces and data can be granted on a fine-grained basis to ensure security measures are properly enforced, and users can view their available workspaces in the account console.
Question 13 of 20
What are three benefits of using Unity Catalog for data security, access, and governance?
A. Unity Catalog is an open source product for data management.
B. Unity Catalog brings together your data lake and data lakehouse.
C. Unity Catalog adds a new foldering system for data storage.
** D. Unity Catalog includes Delta Sharing for easy and secure data sharing.
** E. Unity Catalog has data lineage capabilities for your data objects.
** F. Unity Catalog acts as a single source of truth for your data and access.
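As a rough illustration of points D through F: Unity Catalog objects live in a three-level namespace, and access is granted in plain SQL. The catalog, schema, and group names below are hypothetical, and the sketch assumes a notebook attached to a Unity Catalog-enabled workspace (where `spark` is predefined).

```python
# Three-level namespace: catalog.schema.table ("main" and "sales" are
# hypothetical names).
spark.sql("CREATE SCHEMA IF NOT EXISTS main.sales")
spark.sql("""
    CREATE TABLE IF NOT EXISTS main.sales.orders (order_id INT, amount DOUBLE)
""")

# Fine-grained, SQL-based access control; lineage for reads and writes
# against governed tables is captured by the platform.
spark.sql("GRANT SELECT ON TABLE main.sales.orders TO `analysts`")
```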
Question 14 of 20
In what two ways does Databricks help businesses lower the total cost of ownership of their data ecosystem?
A. By streamlining data processing tasks with a prescribed workflow.
B. By adding a data management tool on top of your existing system.
** C. By speeding up time-to-market of innovative data products.
D. By leveraging open source data processing tools in a single system.
** E. By eliminating cost and complexity in the data ecosystem.
Question 15 of 20
Databricks uses the third-party TPC-DS benchmark to test its system against competitors and other data infrastructures. What benefit does this type of testing have for customers?
A. Customers can request competitors use the comparison benchmarks.
B. The testing is done on a standardized set of data available to the public.
** C. The comparison results are fair and set against industry standards.
D. Databricks has direct involvement in the development of the benchmarks.
Question 16 of 20
What are three benefits of switching from “pay-as-you-go” pricing with Databricks to purchasing a “committed-use” contract?
A. Databricks environments on additional cloud service providers.
** B. Access to paid support offerings, such as the Professional Services team.
C. Databricks environments with enhanced security and compliance add-ons.
** D. Access to additional, customized training opportunities.
** E. The assignment of a DSA to further support your account.
F. Invitations to private previews of Databricks roadmap features.
Question 17 of 20
How is the Databricks Unit, or DBU, defined?
** A. A DBU is a normalized unit of processing power used by Databricks for measurement and pricing purposes.
B. A DBU is an element of the Databricks Workflows feature indicating each step, or unit, of the job workflow.
C. A DBU is a number associated with how many compute units it takes to complete jobs for given workloads.
D. A DBU is a component of the Databricks Lakehouse Platform used to complete data processing tasks.
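To make the definition concrete, here is a back-of-envelope cost calculation. All numbers are hypothetical; actual DBU emission rates and per-DBU prices vary by cloud, workload type, and pricing tier.

```python
# Hypothetical figures for illustration only.
dbu_per_hour = 4.0      # assume this cluster/workload emits 4 DBUs per hour
price_per_dbu = 0.55    # assumed list price in USD per DBU
hours_run = 10

# Cost = DBUs consumed x price per DBU
estimated_cost = dbu_per_hour * hours_run * price_per_dbu
print(f"Estimated compute cost: ${estimated_cost:.2f}")  # $22.00
```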
Question 18 of 20
What are Databricks Runtimes?
A. They are different, persona-based environments built into the platform where data practitioners can access their current workloads, notebooks, and data.
** B. They are a set of core components that run on Databricks compute clusters, preloaded with workload-specific features, such as popular ML libraries or Databricks Photon.
C. They are preset data sets available to confirm your Databricks Lakehouse Platform is running appropriately, specialized for specific workload tasks.
D. They are prescribed workflows for data processing that meet data and AI computing best practices, and are built into the Databricks Lakehouse Platform for all data practitioners.
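For reference, the runtime is pinned per cluster through the spark_version field of the Clusters API. Below is a sketch of a create-cluster payload; the cluster name and node type are hypothetical, and the exact version string depends on what your workspace offers.

```python
# Sketch of a Databricks Clusters API create payload. The spark_version
# string selects the Databricks Runtime; "-ml" variants come preloaded
# with popular ML libraries.
cluster_spec = {
    "cluster_name": "demo-cluster",              # hypothetical name
    "spark_version": "13.3.x-cpu-ml-scala2.12",  # example ML runtime string
    "node_type_id": "i3.xlarge",                 # cloud-specific node type
    "num_workers": 2,
}
```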
Question 19 of 20
How does Databricks Photon impact platform optimization and performance?
A. Databricks Photon is created by joining traditional compute clusters with specialized compute clusters to streamline data processing tasks.
B. Databricks Photon is the new core of the Databricks Lakehouse Platform that allows multiple users to work on the same data set without conflict.
C. Databricks Photon is a workflow process pioneered by Databricks to accelerate BI workloads with published best practices.
** D. Databricks Photon is a next-generation query engine that accelerates SQL workloads and cuts costs by speeding up data processing tasks.
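Enabling Photon is likewise a per-cluster setting: in the Clusters API payload it is a single field. Names below are hypothetical, and Photon availability depends on node type and runtime version.

```python
# Same payload shape as above, with the engine switched to Photon.
photon_cluster_spec = {
    "cluster_name": "photon-demo",        # hypothetical name
    "spark_version": "13.3.x-scala2.12",  # example runtime string
    "node_type_id": "i3.xlarge",          # cloud-specific node type
    "num_workers": 2,
    "runtime_engine": "PHOTON",           # switch from STANDARD to Photon
}
```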
Question 20 of 20
How does serverless compute differ from classic compute?
** A. With serverless compute, Databricks owns the responsibility of managing the compute resources of your cloud environment, so your account administrator isn't burdened with managing compute resources, and they are available to your data team members when needed.
B. With serverless compute, you no longer need to designate where compute clusters are located and can create compute resources ad hoc in any environment on any cloud, as needed for immediate use or instant jobs.
C. With serverless compute, your compute clusters are hosted separately from your data in a private cloud account that is accessible only by you and is administered by a third party, guaranteed by industry regulations for security and compliance.