Your Gateway to Professional Excellence in Business Intelligence - Microsoft Certified: Data Analyst Associate Certification
The contemporary business landscape demands professionals who can extract meaningful insights from vast repositories of information and translate complex datasets into actionable intelligence. Organizations across every sector seek individuals capable of transforming raw numerical information into strategic recommendations that drive decision-making processes. The Microsoft Certified: Data Analyst Associate Certification represents a globally recognized credential that validates your proficiency in leveraging Power BI and associated technologies to deliver impactful business solutions.
This professional designation demonstrates your capability to design, construct, and deploy scalable analytics solutions that meet organizational requirements. By obtaining this certification, you signal to employers worldwide that you possess the technical acumen and practical expertise necessary to thrive in data-centric roles. The credential encompasses a comprehensive range of competencies, from data preparation and modeling to visualization and distribution, ensuring certificants can handle end-to-end analytics workflows.
The demand for qualified professionals in this domain continues to surge as enterprises increasingly recognize the competitive advantage derived from data-driven strategies. Companies invest substantial resources in business intelligence infrastructure, yet the true value materializes only when skilled practitioners can harness these tools effectively. The Microsoft Certified: Data Analyst Associate Certification bridges the gap between technological capability and business application, positioning holders as invaluable assets within their organizations.
Pursuing this credential represents more than acquiring technical skills; it signifies a commitment to professional development and staying current with evolving industry standards. The certification process requires candidates to demonstrate proficiency across multiple dimensions of analytics work, ensuring that successful certificants possess well-rounded expertise. Whether you're transitioning into analytics from another field or advancing your existing career, this credential provides a structured pathway to validate and showcase your capabilities.
Exploring the Scope and Value of This Professional Credential
The Microsoft Certified: Data Analyst Associate Certification encompasses a broad spectrum of competencies essential for modern analytics practitioners. This credential focuses specifically on Power BI, a leading business intelligence platform that has become the standard tool for organizations seeking to democratize access to insights across their workforce. The certification validates your ability to prepare data sources, model relationships, create compelling visualizations, and deploy solutions that stakeholders can use to inform their decisions.
What distinguishes this certification from other analytics credentials is its practical orientation. Rather than focusing exclusively on theoretical concepts, the examination assesses your ability to perform real-world tasks that analysts encounter daily. You'll demonstrate proficiency in connecting to diverse data sources, applying transformations to ensure accuracy and consistency, designing data models that support efficient querying, and crafting reports that communicate findings effectively to non-technical audiences.
The curriculum covers essential aspects of the analytics lifecycle, beginning with data acquisition from multiple sources including databases, files, online services, and streaming platforms. You'll learn to evaluate data quality, implement cleansing procedures, and structure information for optimal performance. The modeling component addresses relationship design, calculation logic using DAX expressions, and optimization techniques that ensure responsive user experiences even with substantial data volumes.
Visualization represents another critical dimension, where you'll master the art of selecting appropriate chart types, designing intuitive layouts, and incorporating interactive elements that empower users to explore information independently. The certification also addresses deployment considerations, including workspace management, security implementation, and collaboration features that enable teams to work together effectively.
Beyond technical skills, the Microsoft Certified: Data Analyst Associate Certification cultivates analytical thinking and problem-solving abilities. Successful candidates develop the capacity to understand business requirements, translate them into technical specifications, and deliver solutions that address stakeholder needs. This business-oriented mindset differentiates competent technicians from strategic partners who contribute meaningfully to organizational success.
Career Advantages and Professional Opportunities
Earning the Microsoft Certified: Data Analyst Associate Certification opens numerous doors in today's employment marketplace. Organizations across industries actively seek certified professionals who can help them leverage their data investments effectively. The credential serves as a powerful differentiator when competing for positions, signaling to hiring managers that you possess verified skills rather than self-proclaimed expertise.
Salary implications represent one tangible benefit of certification. Industry surveys consistently demonstrate that certified professionals command higher compensation compared to their non-certified counterparts. The wage premium reflects the value employers place on validated competencies and the reduced risk associated with hiring individuals who have demonstrated their abilities through rigorous examination. While compensation varies based on geography, experience, and organization size, the certification typically contributes to enhanced earning potential throughout your career.
The credential also provides mobility within the professional landscape. As businesses increasingly prioritize analytics capabilities, opportunities exist not just in technology sectors but across healthcare, finance, manufacturing, retail, government, and virtually every other domain. The skills validated by the Microsoft Certified: Data Analyst Associate Certification transfer seamlessly across industries, allowing you to pivot between sectors while maintaining career momentum.
Professional credibility represents another significant advantage. When you present findings or recommendations to stakeholders, your certification lends authority to your statements. Colleagues and leadership recognize that your approaches align with industry best practices, increasing confidence in your deliverables. This credibility proves especially valuable when advocating for resource allocation, methodology changes, or strategic initiatives based on analytical insights.
The certification also facilitates networking and community engagement. Microsoft maintains extensive networks of certified professionals who share knowledge, collaborate on challenges, and support each other's growth. Participating in these communities exposes you to diverse perspectives, emerging techniques, and professional connections that can accelerate your development and create opportunities you might not encounter otherwise.
Career progression often accelerates for certified professionals. The credential demonstrates initiative and commitment to excellence, qualities that leadership values when identifying candidates for advancement. Whether pursuing senior analyst roles, moving into management positions, or transitioning to specialized domains like predictive analytics or machine learning, the foundation provided by the Microsoft Certified: Data Analyst Associate Certification supports your trajectory.
Detailed Examination Blueprint and Content Domains
The examination for the Microsoft Certified: Data Analyst Associate Certification, designated as PL-300, assesses competencies across several major domains, each weighted to reflect its relative importance in practical analytics work. Understanding the blueprint helps candidates allocate study efforts appropriately and ensures comprehensive preparation across all tested areas.
The first major domain addresses data preparation, typically representing a substantial portion of the examination. This section evaluates your ability to connect to various data sources, including relational databases, flat files, online services, and real-time streams. You'll need to demonstrate proficiency in using Power Query to transform raw information into analysis-ready datasets. Tasks include filtering rows, pivoting and unpivoting columns, merging queries, appending data, and handling errors gracefully.
Data profiling capabilities are also assessed within this domain. You should understand how to examine data distributions, identify outliers, detect inconsistencies, and assess completeness. The examination tests your knowledge of cleansing techniques such as removing duplicates, standardizing formats, replacing values, and filling missing information appropriately. Performance considerations factor prominently, requiring you to implement query folding where possible and understand when transformations occur in the source system versus locally.
The modeling domain constitutes another critical component, evaluating your ability to design efficient data structures that support analytical requirements. This includes creating relationships between tables, understanding cardinality and cross-filter direction, and implementing star schema designs that optimize query performance. You'll need to demonstrate proficiency with calculated columns and measures using Data Analysis Expressions, including understanding context transition, filter context, and row context.
Advanced modeling concepts assessed include hierarchies, which enable drill-down functionality in reports, and row-level security implementations that restrict data access based on user identity. The examination tests your understanding of when to use calculated columns versus measures, how to optimize model size through appropriate data types and summarization techniques, and strategies for managing slowly changing dimensions.
Visualization represents the domain where you demonstrate your ability to translate analytical findings into compelling visual communications. The examination assesses your knowledge of selecting appropriate visualization types for different analytical scenarios, understanding when to use bar charts versus line graphs, scatter plots versus tables, and more specialized visualizations like treemaps or funnel charts. You'll need to show proficiency in configuring visual properties, applying conditional formatting to highlight significant values, and implementing drill-through capabilities that allow users to explore details.
Dashboard design principles factor into this domain, including effective use of white space, consistent formatting, logical layout patterns, and appropriate use of color to convey meaning without overwhelming viewers. The examination tests your understanding of interactive features such as slicers, filters, bookmarks, and buttons that enhance user experience. Mobile optimization and responsive design considerations are also addressed, ensuring your solutions function effectively across devices.
The deployment and management domain evaluates your ability to publish solutions and manage the environments where they reside. This includes understanding workspace concepts, configuring data refresh schedules, managing gateway connections for on-premises sources, and implementing appropriate security measures. You'll need to demonstrate knowledge of sharing mechanisms, including apps, direct workspace access, and embedded content in other applications.
Collaboration features represent another component, requiring understanding of how multiple analysts can work together on shared content, version control considerations, and approval workflows for content promotion. The examination addresses monitoring capabilities, including usage metrics that reveal how stakeholders interact with your reports and performance metrics that identify optimization opportunities.
Comprehensive Preparation Strategies and Learning Resources
Successful preparation for the Microsoft Certified: Data Analyst Associate Certification requires a strategic approach combining theoretical study, hands-on practice, and exposure to realistic scenarios. The multifaceted nature of the examination demands that candidates develop both conceptual understanding and practical competence across the various domains.
Official learning paths provided by Microsoft represent an excellent starting point. These structured curricula align directly with examination objectives, ensuring comprehensive coverage of required topics. The modules typically include written explanations, video demonstrations, interactive exercises, and knowledge checks that reinforce learning. Working through these materials systematically provides a solid foundation and identifies areas requiring additional focus.
Hands-on practice proves essential for retention and skill development. While reading about Power BI capabilities builds awareness, actually performing tasks embeds knowledge more deeply and reveals nuances that textual descriptions might not convey. Microsoft offers a free version of Power BI Desktop that provides access to most features needed for examination preparation. Dedicating regular time to working with the software, experimenting with different approaches, and building complete solutions from data acquisition through visualization creates muscle memory and confidence.
Practice datasets representing various industries and scenarios help broaden your experience. Rather than repeatedly working with the same familiar data, seek diverse sources that challenge you to apply techniques in different contexts. Many public repositories offer sample datasets suitable for learning purposes. Working through structured exercises where you must complete specific analytical tasks mimics examination conditions and highlights gaps in your knowledge.
Sample questions and practice examinations provide valuable assessment opportunities. These resources help familiarize you with question formats, time constraints, and the depth of knowledge expected. When reviewing results, focus not just on questions answered incorrectly but also on understanding why the correct answers are appropriate. This reflection deepens comprehension and helps you develop the reasoning skills necessary for tackling unfamiliar scenarios.
Community resources offer additional perspectives and support. Online forums, user groups, and social media communities dedicated to Power BI and the Microsoft Certified: Data Analyst Associate Certification connect you with peers pursuing similar goals. Participating in discussions, asking questions when confused, and sharing your own insights reinforces learning while building your professional network. Many experienced practitioners generously share tips, resources, and encouragement that can prove invaluable during preparation.
Structured training programs, whether instructor-led courses or comprehensive online bootcamps, provide intensive preparation experiences. These offerings typically condense material into concentrated timeframes with expert guidance, immediate feedback on exercises, and opportunities to interact with instructors and classmates. While optional, such programs can accelerate readiness, particularly for individuals who benefit from external structure and accountability.
Creating a study schedule that allocates time consistently over weeks or months generally yields better results than sporadic intensive cramming sessions. Spaced repetition, where you revisit topics multiple times with intervals between reviews, enhances long-term retention. Consider organizing your preparation by domain, mastering each area before moving to the next, or alternating between topics to maintain engagement and facilitate connections between related concepts.
Documentation and note-taking during preparation serve multiple purposes. Summarizing concepts in your own words reinforces understanding, while notes provide quick reference materials for final review before the examination. Many candidates find creating visual summaries, flashcards, or mind maps helpful for organizing information and identifying relationships between topics.
Practical Application of Data Preparation Techniques
Data preparation represents the foundation of any analytics project, often consuming the majority of time in real-world scenarios. The Microsoft Certified: Data Analyst Associate Certification heavily emphasizes these competencies because improperly prepared data inevitably leads to flawed insights, regardless of sophisticated modeling or visualization techniques applied subsequently.
Connecting to data sources requires understanding the various connectivity options Power BI provides. For relational databases, you'll work with connectors specific to systems like SQL Server, Oracle, MySQL, and PostgreSQL, each with unique configuration requirements. Understanding authentication methods, whether Windows credentials, database logins, or OAuth tokens, ensures successful connections while maintaining security protocols.
File-based sources including Excel workbooks, CSV files, and JSON documents present different considerations. You'll need to specify file locations, which might be local paths, network shares, or cloud storage locations. Understanding how Power BI interprets file structures, including header rows, data types, and delimiters, prevents common import problems. The examination tests your ability to troubleshoot connection issues, such as changed file paths or authentication failures.
Online services and APIs represent increasingly common data sources. Power BI provides connectors for numerous cloud platforms, including Azure services, Google Analytics, Salesforce, and many others. Working with these sources requires understanding API authentication, rate limiting considerations, and how to navigate hierarchical data structures that these services often return.
Transformation capabilities within Power Query constitute a substantial portion of preparation. Understanding the transformation menu and formula language enables you to reshape data into optimal structures for analysis. Common transformations include filtering rows based on conditions, which might involve simple value comparisons or complex logical expressions combining multiple criteria.
Column operations such as splitting text based on delimiters, extracting substrings, or changing data types require careful attention to potential edge cases. For example, when splitting a full name column into separate first and last name columns, you must consider scenarios with middle names, suffixes, or hyphenated surnames. The examination tests your ability to anticipate such complications and implement robust transformations.
Aggregation operations allow summarizing detailed data at higher levels of granularity. You might group transaction records by customer and date, calculating total amounts, average values, or counting occurrences. Understanding which aggregation functions apply to different scenarios and how grouping affects result sets proves essential for preparing data appropriately.
Merge operations combine data from multiple queries based on common columns, similar to SQL joins. Power Query supports several join kinds, including left outer, right outer, full outer, inner, left anti, and right anti, each producing different results when matching or non-matching rows exist. The examination assesses your understanding of when each join kind applies and how to configure matching columns with appropriate case sensitivity and data type considerations.
Append operations stack rows from multiple queries into a single result set, useful when working with data partitioned across multiple files or tables. Understanding how Power BI aligns columns during append operations and handles situations where column names or data types differ between sources ensures successful consolidation.
Error handling represents a critical but sometimes overlooked aspect of data preparation. Real-world data frequently contains anomalies that can halt refresh operations if not addressed proactively. Techniques include removing error rows, replacing errors with default values, or implementing conditional logic that handles problematic cases gracefully. The examination tests your ability to implement appropriate error handling based on business requirements and data quality objectives.
Performance optimization during data preparation significantly impacts solution usability. Query folding, where transformations are pushed back to source systems for execution rather than processed locally, dramatically improves refresh performance. Understanding which transformations support folding and how to structure queries to maximize this capability represents an important competency. The examination may assess your ability to identify transformation steps that prevent folding and recommend alternatives.
Data profiling tools within Power Query help assess quality and distributions. Column statistics reveal completeness percentages, distinct value counts, and data ranges that inform cleansing decisions. Column distribution visualizations highlight skewed data or unexpected patterns. Column quality indicators quickly identify rows with errors or empty values. Leveraging these tools during preparation ensures you understand the data thoroughly before proceeding to modeling.
Advanced Data Modeling Techniques and Best Practices
Data modeling transforms prepared datasets into structured frameworks that support efficient querying and analysis. The Microsoft Certified: Data Analyst Associate Certification extensively tests modeling capabilities because well-designed models dramatically impact solution performance, usability, and maintainability.
Star schema design represents the gold standard for analytical models. This architecture organizes data into fact tables containing measurable quantities and dimension tables containing descriptive attributes. Fact tables typically store transaction-level details such as sales amounts, quantities, or durations, while dimension tables provide context such as product details, customer information, or date attributes. The examination assesses your ability to identify appropriate fact and dimension tables within a business scenario and design relationships accordingly.
Relationship creation involves connecting tables based on common columns, typically linking dimension key columns to corresponding foreign keys in fact tables. Understanding cardinality proves essential, as relationships might be one-to-many, many-to-one, or many-to-many, each with different implications for how filters propagate through the model. The examination tests your ability to determine appropriate cardinality based on data characteristics and analytical requirements.
Cross-filter direction determines how selections in one table affect another. Single direction filtering means selections in the dimension table filter the fact table but not vice versa, appropriate for most scenarios. Bidirectional filtering allows filters to propagate in both directions, useful in specific situations but potentially creating ambiguous filter contexts that complicate calculations. The examination assesses your understanding of when each direction applies and the implications of choosing inappropriately.
Active versus inactive relationships provide flexibility when multiple relationships exist between the same tables. For example, a sales fact table might relate to a date dimension through order date, ship date, and delivery date columns. Only one relationship can be active by default, while others remain inactive but available for calculations using the USERELATIONSHIP function. The examination tests your ability to design models with appropriate relationship patterns and leverage inactive relationships effectively.
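As a rough illustration, assuming a hypothetical Sales table with an active relationship on order date and an inactive relationship from Sales[ShipDate] to a marked 'Date' table, a measure can activate the inactive relationship only for its own evaluation:

    Sales by Ship Date =
    CALCULATE (
        [Total Sales],  -- base measure, assumed to exist elsewhere in the model
        USERELATIONSHIP ( Sales[ShipDate], 'Date'[Date] )
    )

Because USERELATIONSHIP is a CALCULATE modifier, the rest of the report continues to filter through the active order-date relationship.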
Calculated columns extend tables with additional attributes derived from existing columns or related data. These columns are computed during data refresh and stored in the model, making them available for filtering, grouping, and visualization. Common uses include concatenating text fields, extracting date components, or categorizing values based on conditions. The examination assesses your understanding of when calculated columns are appropriate versus alternatives like calculated measures.
Measures represent dynamic calculations evaluated based on filter context established by report interactions. Unlike calculated columns, measures do not increase model size and recalculate as users interact with visualizations. DAX expressions define measure logic, ranging from simple aggregations to complex statistical calculations or time intelligence functions. The examination extensively tests DAX knowledge, requiring you to write expressions that produce correct results across various scenarios.
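To make the distinction concrete, here is a minimal sketch using hypothetical Sales columns: the calculated column is computed and stored per row at refresh time, while the measure is evaluated on demand in whatever filter context the visual provides.

    -- Calculated column on the Sales table, computed during refresh and stored in the model
    Line Amount = Sales[Quantity] * Sales[Unit Price]

    -- Measure, recalculated at query time within the current filter context
    Total Sales = SUM ( Sales[Line Amount] )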
Understanding filter context proves fundamental to DAX proficiency. Filter context consists of the filters applied by slicers, visual filters, row/column selections in matrices, and relationships between tables. Measures evaluate within this context, automatically adjusting their calculations based on the current filter state. The examination tests your ability to predict measure results given specific filter contexts and write expressions that modify context appropriately.
Row context exists when evaluating expressions row-by-row within a table, such as in calculated columns or iterator functions. Understanding the distinction between filter context and row context, and how functions like CALCULATE transition between them, represents a critical competency. The examination assesses your ability to write expressions that correctly handle these contexts and understand the performance implications of different approaches.
Time intelligence calculations form a significant category of analytical measures. Year-to-date totals, month-over-month growth, rolling averages, and same-period-last-year comparisons all require time intelligence functions that understand date relationships. The examination tests your ability to implement these calculations using functions like TOTALYTD, DATEADD, SAMEPERIODLASTYEAR, and others, which rely on a properly configured date table with a continuous date range that is marked as the model's date table.
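A hedged sketch of these patterns, assuming a [Total Sales] measure and a marked 'Date' table related to the fact table, might look like the following:

    Sales YTD = TOTALYTD ( [Total Sales], 'Date'[Date] )

    Sales PY = CALCULATE ( [Total Sales], SAMEPERIODLASTYEAR ( 'Date'[Date] ) )

    Sales YoY % = DIVIDE ( [Total Sales] - [Sales PY], [Sales PY] )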
Optimization techniques ensure models remain responsive even with substantial data volumes. Appropriate data types minimize memory consumption; using integers rather than text for codes, or dates rather than datetime when time precision is unnecessary, reduces model size. Removing unnecessary columns from the model eliminates wasted space. Reducing cardinality where possible, perhaps by grouping detailed categories into broader classifications, improves compression. The examination assesses your understanding of these optimization principles and ability to apply them judiciously without sacrificing analytical capability.
Hierarchies create drill-down paths that enhance user experience. A geography hierarchy might include country, state, and city levels, allowing users to start with aggregated national views and progressively drill into regional or local details. A product hierarchy might organize items by category, subcategory, and individual SKU. The examination tests your ability to create appropriate hierarchies based on natural organizational structures within the data.
Row-level security restricts data access based on user identity, essential when different stakeholders should see only relevant subsets. RLS involves creating roles with filter expressions that limit visible rows, then assigning users to appropriate roles. A regional manager might see only their territory's data, while executives access all regions. The examination assesses your ability to design RLS implementations that meet security requirements while maintaining performance.
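For illustration only, RLS role filters are DAX expressions defined per table; assuming hypothetical Region and SalesRep tables, a static filter and a dynamic filter could resemble the following:

    -- Static role filter on the Region table: members of this role see only the West region
    'Region'[Region Name] = "West"

    -- Dynamic role filter on the SalesRep table: each user sees rows matching their sign-in address
    'SalesRep'[Email] = USERPRINCIPALNAME ()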
Creating Compelling Visualizations and Reports
Visualization transforms analytical findings into accessible communications that drive understanding and action. The Microsoft Certified: Data Analyst Associate Certification extensively evaluates your ability to create effective visual representations and design reports that serve stakeholder needs.
Chart type selection begins with understanding the analytical question being addressed. Comparisons between categories favor bar charts or column charts, which clearly display relative magnitudes. Trends over time naturally align with line charts that reveal patterns, seasonality, or inflection points. Distributions use histograms or box plots to show how values spread across ranges. Correlations between variables employ scatter plots that reveal relationships and outliers. Part-to-whole relationships leverage pie charts or treemaps that illustrate proportional contributions. The examination tests your ability to select appropriate visualizations based on data characteristics and analytical objectives.
Specialized visuals extend beyond the standard chart types, whether built into Power BI or added as custom visuals developed by Microsoft and the community. Gauges display single values against targets or ranges, useful for KPI dashboards. Funnels show progression through stages, common in sales pipeline analysis. Decomposition trees enable exploratory analysis by allowing users to break down measures by different dimensions interactively. The examination assesses familiarity with these specialized formats and understanding of appropriate use cases.
Visual properties and formatting significantly impact effectiveness. Appropriate axis scaling ensures accurate perception of magnitudes; truncated axes can mislead viewers about relative differences. Titles and labels provide necessary context without cluttering the display. Color choices should consider accessibility for colorblind viewers, typically avoiding red-green combinations and incorporating patterns or labels as additional differentiators. The examination evaluates your ability to configure these properties to enhance rather than obscure insights.
Conditional formatting directs attention to significant values through color scales, data bars, or icons. A financial report might highlight negative values in red and positive in black, immediately drawing attention to concerning metrics. Performance scorecards might display icons indicating whether metrics are above, below, or near targets. The examination tests your understanding of conditional formatting capabilities and appropriate application strategies.
Interactive features empower users to explore data according to their interests. Slicers provide intuitive filtering controls, allowing stakeholders to focus on specific time periods, categories, or other dimensions relevant to their questions. Cross-filtering means that selecting a value in one visual automatically filters the other visuals on the page, revealing related details, while drill-through lets users jump to a dedicated detail page focused on the selected item. Drill-down capabilities let users navigate hierarchies, starting with high-level summaries and progressively examining finer details. The examination assesses your ability to implement these interactive elements effectively.
Report layout and page design principles ensure professional, navigable solutions. Consistent formatting across pages establishes visual coherence; using the same fonts, colors, and alignment patterns creates familiarity. Logical organization groups related visuals together and establishes clear reading patterns, typically left-to-right and top-to-bottom in Western contexts. White space prevents overcrowding and allows individual elements to breathe. The examination evaluates your understanding of design principles that enhance usability.
Mobile-optimized layouts adapt reports for smartphone and tablet viewing. Creating mobile layouts involves selecting priority visuals for limited screen space and reconfiguring arrangements for portrait orientation. Touch-friendly controls with adequate size and spacing ensure usability on touchscreen devices. The examination tests your awareness of mobile considerations and ability to create appropriate layouts.
Bookmarks capture specific report states, including filter selections, visual visibility, and page navigation. These enable guided storytelling experiences where users can progress through curated views addressing sequential questions. Buttons combined with bookmarks create interactive navigation schemas, allowing users to jump between different analytical perspectives. The examination assesses your ability to leverage these features for enhanced user experiences.
Tooltips extend visualizations with additional context displayed on hover. Default tooltips show basic information about hovered elements, while custom tooltips can display entirely separate visualizations providing supporting details. A sales trend line might include tooltips showing product mix or customer segment breakdowns for hovered months. The examination tests understanding of tooltip capabilities and appropriate implementation.
Parameters enable runtime customization of reports without editing underlying queries or measures. What-if parameters allow users to adjust assumptions and immediately see impacts on calculations, useful for scenario planning. Field parameters let users switch between different measures or dimensions displayed in visualizations, creating flexible reports that serve multiple purposes. The examination evaluates your ability to implement appropriate parameter-based solutions.
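The measure-side pattern behind a what-if parameter is approximately what Power BI generates automatically; in this hypothetical sketch, a Discount % parameter feeds a scenario measure:

    -- Parameter table and selection measure (roughly the pattern Power BI generates)
    Discount % = GENERATESERIES ( 0, 0.5, 0.05 )
    Discount % Value = SELECTEDVALUE ( 'Discount %'[Discount %], 0 )

    -- Scenario measure that reacts to the selected discount
    Discounted Sales = [Total Sales] * ( 1 - [Discount % Value] )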
Effective Data Distribution and Collaboration Strategies
Publishing analytical solutions and enabling stakeholder access represents the culmination of development efforts. The Microsoft Certified: Data Analyst Associate Certification assesses your ability to deploy solutions appropriately and manage the environments where they operate.
Workspace architecture provides organizational structure within the Power BI service. Workspaces function as containers for related content, typically aligned with teams, projects, or business functions. Understanding workspace roles including admin, member, contributor, and viewer clarifies permission implications. Admins manage workspace settings and membership, members can publish content, contributors can create and edit content, and viewers can only consume published reports. The examination tests your ability to recommend appropriate workspace structures and role assignments based on organizational scenarios.
Publishing from Power BI Desktop to the service makes reports available for stakeholder consumption. The publication process uploads the data model and visualizations to the selected workspace, after which you can configure sharing, schedule refreshes, and enable collaboration. Understanding implications of publishing decisions, such as whether to overwrite existing reports or create new ones, proves important for maintaining content organization.
Data refresh schedules ensure reports display current information. For imported data, configuring scheduled refreshes determines how frequently the service reconnects to sources and updates the model. Understanding capacity limits on refresh frequency and the distinction between personal and organizational gateways for accessing on-premises data sources represents important knowledge. The examination assesses your ability to configure appropriate refresh schedules based on data volatility and stakeholder requirements.
Gateway configuration enables connectivity to on-premises data sources from the cloud-based Power BI service. Personal gateways serve individual users, while organizational gateways support multiple users and reports with centralized management. Understanding gateway installation requirements, configuration steps, and troubleshooting common connectivity issues proves essential for production deployments. The examination tests your knowledge of gateway concepts and appropriate usage scenarios.
Security implementation ensures only authorized individuals access sensitive information. Beyond workspace roles that control content management permissions, you must consider row-level security that restricts data visibility within reports. Understanding how RLS roles defined in the model are enforced after publication, including testing procedures and user assignment processes, represents critical knowledge. The examination evaluates your ability to design and implement comprehensive security strategies.
Sharing mechanisms determine how stakeholders access content. Direct workspace access provides individuals with workspace roles, appropriate when they need to interact with multiple reports within the workspace. Individual report sharing creates shareable links with configurable permissions, suitable for distributing specific reports to defined audiences. Power BI apps package collections of reports and dashboards into polished experiences for broader distribution, ideal for serving organizational divisions or external partners. The examination tests your understanding of when each sharing approach applies.
Dashboards aggregate key visualizations from multiple reports into single-screen views optimized for monitoring. Unlike reports that support extensive interactivity and filtering, dashboards display pinned tiles that update with data refresh but do not support slicers or cross-filtering, providing at-a-glance status of important metrics. Creating effective dashboards involves selecting the most critical visuals, arranging them logically, and configuring refresh schedules to ensure current information. The examination assesses your ability to design appropriate dashboards that serve monitoring requirements.
Collaboration features enable teams to work together on analytical solutions. Comments on reports facilitate discussions about findings, questions, or recommendations directly within the consumption experience. Subscribe and alert capabilities notify stakeholders when data meets specified conditions or on defined schedules, ensuring important information reaches audiences proactively. The examination evaluates your understanding of collaborative features and appropriate implementation.
Embedded analytics extends Power BI content into other applications, whether internal business systems or customer-facing products. Power BI Embedded enables developers to integrate reports and visuals into custom applications, providing analytical capabilities within familiar interfaces. Understanding embed tokens, API usage, and capacity requirements for embedded scenarios represents advanced knowledge assessed in the examination.
Monitoring and optimization ensure solutions remain performant and serve stakeholder needs effectively. Usage metrics reveal how frequently reports are consumed, which users access them, and which features they interact with most. This information guides decisions about where to invest optimization efforts or whether underutilized content should be retired. Performance metrics identify slow-loading visuals or inefficient queries requiring attention. The examination tests your ability to interpret monitoring data and implement appropriate improvements.
Deployment pipelines support professional development workflows with separate environments for development, testing, and production. Content creators build solutions in development workspaces, promote them to test environments for validation, and finally release to production workspaces for stakeholder access. This staged approach prevents disruptions from works-in-progress and ensures quality standards before releasing changes. The examination assesses your understanding of deployment pipeline concepts and their role in enterprise scenarios.
Essential DAX Formulas and Calculation Patterns
Data Analysis Expressions form the calculation language for Power BI measures and calculated columns. The Microsoft Certified: Data Analyst Associate Certification requires substantial DAX proficiency, as this capability underpins the ability to create sophisticated analytical solutions.
Basic aggregation functions provide foundational calculation capabilities. SUM totals numeric columns across filtered rows, while AVERAGE computes means. COUNT tallies non-blank numeric values in a column, COUNTA counts non-blank values of any type, and COUNTROWS counts the rows of a table. DISTINCTCOUNT identifies unique values, useful for metrics like customer counts where the same customer appears in many transactions. MIN and MAX identify extremes within datasets. The examination tests your ability to select appropriate aggregation functions and implement them correctly.
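A few minimal measures, assuming a hypothetical Sales fact table, illustrate these aggregations:

    Order Count = COUNTROWS ( Sales )

    Customer Count = DISTINCTCOUNT ( Sales[CustomerKey] )

    Average Unit Price = AVERAGE ( Sales[Unit Price] )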
The CALCULATE function represents perhaps the most powerful and commonly used DAX function. CALCULATE evaluates expressions within modified filter contexts, enabling dynamic calculations that adapt to specific conditions. The function accepts a measure expression followed by filter arguments that override or supplement existing context. Understanding how CALCULATE manipulates filter context, including filter interactions between multiple arguments, proves essential for advanced calculations. The examination extensively evaluates your CALCULATE proficiency across diverse scenarios.
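As a sketch, assuming hypothetical Region and Product dimensions related to Sales, CALCULATE can layer additional filters on top of whatever the report already applies:

    West Accessories Sales =
    CALCULATE (
        [Total Sales],
        'Region'[Region Name] = "West",      -- replaces any existing filter on Region Name
        'Product'[Category] = "Accessories"  -- multiple filter arguments combine with AND
    )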
Filter functions define conditions for CALCULATE and other context-sensitive operations. FILTER evaluates expressions row-by-row within tables, returning subsets meeting specified conditions. ALL removes filters from specified columns or tables, useful when calculating percentages of totals or grand aggregates. ALLEXCEPT removes filters from all columns except those specified, enabling calculations like market share within categories. VALUES returns distinct values currently visible in filter context. The examination tests your ability to combine these functions appropriately.
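Assuming a visual sliced by attributes of a hypothetical Product dimension, the percent-of-total patterns typically combine DIVIDE, CALCULATE, and these filter removers:

    % of All Products =
    DIVIDE ( [Total Sales], CALCULATE ( [Total Sales], ALL ( 'Product' ) ) )

    -- Share within the current category: remove Product filters except Category
    % of Category =
    DIVIDE (
        [Total Sales],
        CALCULATE ( [Total Sales], ALLEXCEPT ( 'Product', 'Product'[Category] ) )
    )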
Time intelligence functions enable date-based analytical patterns. TOTALYTD calculates year-to-date totals for selected dates, automatically handling fiscal year variations when configured appropriately. DATEADD shifts date contexts by specified intervals, enabling comparisons to prior periods. SAMEPERIODLASTYEAR simplifies year-over-year calculations by shifting dates back twelve months. PARALLELPERIOD provides flexible date shifting across various granularities. The examination assesses your ability to implement common time intelligence patterns using appropriate functions.
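For example, a month-over-month comparison built on DATEADD, again assuming a marked 'Date' table, might be sketched as:

    Sales Previous Month =
    CALCULATE ( [Total Sales], DATEADD ( 'Date'[Date], -1, MONTH ) )

    MoM Growth % =
    DIVIDE ( [Total Sales] - [Sales Previous Month], [Sales Previous Month] )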
Relationship functions navigate model relationships within calculations. RELATED retrieves values from the one side of relationships, useful in calculated columns for incorporating dimension attributes into fact tables. RELATEDTABLE returns entire related tables, enabling aggregations across relationships within measures. USERELATIONSHIP activates inactive relationships for specific calculations, supporting scenarios with multiple date relationships. The examination tests your understanding of relationship traversal and appropriate function usage.
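Two hypothetical calculated columns show the direction each function travels across a relationship:

    -- On the Sales table (many side): pull an attribute from the related Product row
    Product Category = RELATED ( 'Product'[Category] )

    -- On the Customer table (one side): aggregate the related Sales rows
    Order Count = COUNTROWS ( RELATEDTABLE ( Sales ) )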
Iterator functions perform row-by-row operations within tables. SUMX evaluates expressions for each row and sums results, enabling weighted calculations or aggregations of computed values. AVERAGEX computes row-level expressions and averages results, useful for metrics like average transaction value. MINX and MAXX identify minimum and maximum computed values. COUNTAX tallies non-blank evaluated expressions. Understanding when iterator functions are necessary versus simple aggregations represents important knowledge assessed in the examination.
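A sketch assuming Quantity and Unit Price columns on the Sales table shows why an iterator is needed when the value must be computed row by row before aggregating:

    Revenue = SUMX ( Sales, Sales[Quantity] * Sales[Unit Price] )

    Average Line Value = AVERAGEX ( Sales, Sales[Quantity] * Sales[Unit Price] )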
Logical functions enable conditional calculations. IF evaluates conditions and returns different results based on truth values, supporting segmentation or categorization logic. SWITCH tests expressions against multiple values, returning corresponding results and improving readability compared to nested IF statements. AND, OR, and NOT combine multiple conditions into compound logical tests. The examination evaluates your ability to construct appropriate logical expressions for various scenarios.
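The SWITCH ( TRUE () ) idiom is a common, readable alternative to nested IF statements; a hypothetical banding measure might look like this:

    Sales Band =
    SWITCH (
        TRUE (),
        [Total Sales] >= 1000000, "High",
        [Total Sales] >= 250000, "Medium",
        [Total Sales] > 0, "Low",
        "No sales"  -- fallback when no condition matches
    )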
Text functions manipulate string values. CONCATENATE or the & operator combines text from multiple sources, useful for creating composite descriptions or labels. LEFT, RIGHT, and MID extract substrings from specified positions. UPPER and LOWER convert case, while TRIM removes excess whitespace. FORMAT converts numbers or dates to text with specified patterns, enabling custom display formats. The examination tests your ability to apply text functions appropriately within analytical contexts.
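Assuming hypothetical name columns on a Customer table, a cleaned display column and a formatted label measure could be sketched as:

    -- Calculated column on the Customer table
    Full Name = TRIM ( Customer[First Name] ) & " " & UPPER ( Customer[Last Name] )

    -- Measure producing a formatted caption for cards or tooltips
    Sales Label = "Total: " & FORMAT ( [Total Sales], "$#,0" )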
Date and time functions extract components or perform date arithmetic. YEAR, MONTH, and DAY return respective components from date values. TODAY and NOW provide current dates and timestamps, though their use requires consideration in measures due to caching. DATEDIFF calculates intervals between dates in specified units. The examination assesses your understanding of date function capabilities and appropriate usage.
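Assuming Order Date and Ship Date columns on the Sales table, typical date arithmetic looks like the following sketch:

    -- Calculated column: fulfilment time in days
    Days to Ship = DATEDIFF ( Sales[Order Date], Sales[Ship Date], DAY )

    -- Measure: days elapsed since the most recent order in the current filter context
    Days Since Last Order = DATEDIFF ( MAX ( Sales[Order Date] ), TODAY (), DAY )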
Information functions detect conditions or data types. ISBLANK tests for empty values, essential for error handling and conditional logic. ISERROR identifies calculation errors, enabling graceful failure handling. ISNUMBER and ISTEXT verify data types. The examination evaluates your ability to incorporate appropriate checks within robust calculations.
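A common defensive pattern wraps a ratio in a blank check so that periods without prior-year data return blank rather than an error or a misleading value; this sketch reuses the hypothetical [Sales PY] measure from earlier:

    Safe YoY % =
    IF (
        ISBLANK ( [Sales PY] ),
        BLANK (),
        DIVIDE ( [Total Sales] - [Sales PY], [Sales PY] )
    )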
Statistical functions provide analytical capabilities. MEDIAN identifies middle values within distributions. PERCENTILE.INC and PERCENTILE.EXC calculate specified percentile values, useful for understanding distribution characteristics. STDEV.S and STDEV.P, along with their iterator counterparts STDEVX.S and STDEVX.P, compute standard deviations, enabling variability analysis. The examination tests your familiarity with statistical functions and their applications.
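Assuming the hypothetical Line Amount column introduced earlier, distribution measures can be sketched as:

    Median Line Amount = MEDIAN ( Sales[Line Amount] )

    90th Percentile Amount = PERCENTILE.INC ( Sales[Line Amount], 0.9 )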
Variable declaration using VAR improves calculation readability and performance. Variables store intermediate results for reuse within expressions, eliminating redundant calculations and clarifying complex logic. RETURN follows variable declarations and specifies the expression to evaluate. The examination assesses your ability to structure calculations with variables appropriately.
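Variables make multi-step logic easier to read and avoid recomputing the same expression; this sketch assumes a hypothetical [Sales Target] measure:

    Sales vs Target % =
    VAR CurrentSales = [Total Sales]
    VAR TargetSales = [Sales Target]
    VAR Variance = CurrentSales - TargetSales
    RETURN
        DIVIDE ( Variance, TargetSales )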
Understanding evaluation order and context transition proves essential for correct DAX usage. Nested function calls evaluate from innermost to outermost, while CALCULATE modifies context before evaluating its expression argument. Row context does not automatically filter related tables, requiring CALCULATE to transition row context into filter context when aggregating related data. The examination tests your understanding of these fundamental concepts across various calculation scenarios.
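A pair of hypothetical calculated columns on the Customer table makes context transition visible: without CALCULATE the row context does not filter Sales, so every row shows the grand total, while wrapping the aggregation in CALCULATE converts the current customer row into a filter.

    -- Every customer row returns the grand total: row context alone does not filter Sales
    All Sales = SUM ( Sales[Line Amount] )

    -- CALCULATE transitions row context to filter context, returning each customer's own total
    Customer Sales = CALCULATE ( SUM ( Sales[Line Amount] ) )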
Implementing Robust Data Governance and Security
In the modern digital enterprise, the ability to manage and safeguard data is no longer a secondary consideration—it is a foundational requirement for success. As organizations increasingly depend on analytics for decision-making, maintaining the integrity, confidentiality, and availability of data becomes paramount. Implementing robust data governance and security frameworks ensures that sensitive information remains protected, accessible only to authorized individuals, and properly classified for compliance with regulatory mandates. Governance and security are inseparable elements of any sustainable analytics strategy, providing the balance between accessibility for innovation and restriction for protection.
The Microsoft Certified: Data Analyst Associate Certification emphasizes this duality, focusing on both the technical and strategic aspects of secure analytics environments. It evaluates professionals’ understanding of data classification, sensitivity labels, access controls, and granular protection mechanisms like row-level and object-level security. Mastery of these domains empowers analysts to deliver insights confidently while maintaining compliance and protecting confidential assets. Beyond passing the examination, understanding data governance and security principles ensures that analysts uphold the ethical and operational standards necessary in a data-driven world.
The Foundation of Governance and Security in Analytics
Governance and security form the backbone of trustworthy analytics. Governance defines the rules, responsibilities, and standards for how data is collected, stored, shared, and retired. Security enforces these rules through technical measures that restrict access, prevent misuse, and monitor compliance. Together, they form a synergistic relationship that ensures data integrity, accountability, and traceability across the analytical ecosystem.
Effective governance begins with clear policies defining data ownership, classification levels, and stewardship roles. These policies ensure that every dataset has a defined purpose, source, and lifecycle management process. Governance also encompasses compliance with legal frameworks such as GDPR, HIPAA, and other industry-specific mandates that dictate how sensitive information must be handled. Security, meanwhile, operationalizes governance through encryption, authentication, and access control mechanisms that enforce compliance automatically.
In analytics platforms such as those built on Microsoft Power BI, governance and security implementation must address multiple layers—from data ingestion and transformation to visualization and sharing. The goal is to enable seamless collaboration while preventing unauthorized exposure of sensitive data. By combining technical controls with policy-driven governance, organizations achieve a balance between innovation and protection, ensuring that analytics remains both powerful and compliant.
Data Classification and Sensitivity Management
Data classification serves as the cornerstone of any governance framework, establishing the sensitivity levels that dictate how data should be handled and protected. Classification assigns each dataset a sensitivity category—public, internal, confidential, or highly restricted—based on the potential risk associated with exposure. Public data requires minimal protection, while confidential and regulated information demands advanced safeguards such as encryption, restricted access, and auditing.
Proper classification begins with understanding the organization’s data landscape. This involves cataloging all data sources, identifying their content, and assigning classification tags accordingly. Automated discovery tools assist in this process by scanning datasets for patterns such as personal identifiers, financial information, or healthcare records. Once classified, data is governed throughout its lifecycle—from ingestion to deletion—under the appropriate protection standards.
A well-defined classification framework also enhances collaboration by preventing accidental data leaks. For example, if an analyst attempts to share a report containing confidential data with unauthorized users, automated classification rules can block the action or trigger additional authorization steps. This preventive approach reduces human error and enforces compliance without disrupting productivity.
From a governance perspective, classification is not static. As data evolves or changes ownership, its classification level may need to be adjusted. Regular reviews and automated reclassification workflows ensure that data remains properly protected as it moves through the analytics pipeline. Mastery of classification concepts is vital for maintaining control over data visibility, compliance adherence, and operational integrity.
Sensitivity Labels and Operational Enforcement
Sensitivity labels extend the principles of classification into actionable protections. They represent metadata-based security markers that enforce restrictions dynamically based on policy configurations. These labels can control access, enforce encryption, restrict sharing, and require authentication, ensuring that sensitive data remains protected regardless of where it travels within or outside the organization.
Sensitivity labeling operates across multiple layers of the analytics environment. For instance, a sensitivity label applied to a dataset in Power BI can automatically encrypt associated reports and dashboards, restrict download permissions, or prevent content sharing with external users. Labels can also mandate multi-factor authentication or conditional access policies before users can interact with specific data assets.
Implementing sensitivity labels requires coordination between governance policies and security configurations. Governance defines the classification tiers and associated rules, while sensitivity labels translate these policies into executable controls. This alignment ensures that policies are consistently enforced across environments without requiring manual intervention.
Automation enhances the scalability of labeling. Machine learning-based classification engines can detect sensitive data types and apply appropriate labels automatically. This proactive approach minimizes oversight and strengthens data protection by ensuring consistent enforcement across large volumes of information. The knowledge of how to assign, manage, and monitor sensitivity labels is crucial for professionals aiming to demonstrate mastery in data governance within analytics platforms.
Implementing Row-Level Security (RLS)
Row-Level Security (RLS) provides fine-grained access control by restricting which rows of data users can view based on their role or identity. Rather than duplicating datasets for different audiences, RLS dynamically filters data so that each user only accesses relevant information. This principle is especially valuable in analytics environments that serve multiple departments, regions, or user groups with differing permission needs.
Implementing effective RLS begins with designing clear security roles that align with organizational hierarchies and governance structures. Each role defines the filter conditions determining which rows are visible to specific users. For instance, a regional sales manager might only see records corresponding to their assigned territory, while an executive could access aggregated global data.
RLS configuration involves defining roles within the dataset and associating them with filters using expressions or DAX formulas in tools like Power BI. Once implemented, these filters apply automatically across reports and dashboards consuming the dataset. However, implementation alone is not enough—thorough testing is essential. Before deployment, RLS roles must be validated under multiple access scenarios to confirm that no unintended data exposure occurs.
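For instance, a dynamic pattern keeps a hypothetical UserTerritory mapping table in the model and filters it by the signed-in user, letting the relationship to the fact table do the rest; the expression below is a sketch of such a role filter, not a prescribed implementation:

    -- Role filter defined on the UserTerritory mapping table
    'UserTerritory'[User Email] = USERPRINCIPALNAME ()

For the filter to reach the fact table, the relationship from the mapping table must propagate security, which may require enabling the option to apply the security filter in both directions.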
Additionally, analysts must understand how RLS interacts with sharing mechanisms and embedded analytics. In some cases, embedding reports in applications may bypass default security if not properly configured. Understanding inheritance, propagation, and exceptions ensures that data remains protected under all access pathways. The mastery of RLS demonstrates both technical acumen and an understanding of real-world governance implications, as this mechanism directly connects to organizational accountability and regulatory compliance.
Object-Level Security and Granular Access Control
While RLS governs visibility at the data record level, Object-Level Security (OLS) restricts access to entire tables or columns within a model, and item-level permissions control who can open specific datasets, reports, or dashboards. Together these controls determine which users can view or interact with specific objects within a workspace or project environment. This layer of security is essential for maintaining segregation of duties, preventing unauthorized access to confidential analytics components.
Object-level security relies on role-based access principles where workspace membership, permission inheritance, and explicit access assignments determine visibility. For example, an analyst might have permission to view a report but not the underlying dataset, ensuring that sensitive raw data remains concealed. Understanding how these permission hierarchies interact is critical for preventing unintentional privilege escalation.
Effective OLS design requires balancing security with usability. Overly restrictive permissions can hinder collaboration, while lax access controls increase risk exposure. To achieve balance, administrators must categorize assets based on sensitivity and assign access rights aligned with user responsibilities. Governance policies should define these standards, and automation tools can enforce them systematically.
In complex organizations, OLS integrates with identity management systems to synchronize user roles across platforms. This synchronization ensures that when an employee’s role changes or they leave the organization, access permissions update automatically. Regular access reviews and audits further strengthen the framework, ensuring that object-level restrictions remain accurate and relevant.
Integrating Governance, Security, and Compliance
Integrating governance, security, and compliance forms the cornerstone of resilient data management across modern enterprises. These three disciplines operate as interdependent mechanisms that collectively ensure data remains accurate, protected, and ethically managed. Governance provides the strategic foundation, defining how data should be handled. Security enforces those policies through technical and procedural safeguards, while compliance ensures alignment with legal and regulatory mandates. When harmonized effectively, they create a unified structure that builds trust, strengthens accountability, and sustains operational efficiency.
In today’s interconnected landscape, organizations process massive volumes of data across on-premises systems, hybrid clouds, and external partnerships. Without integration between governance, security, and compliance, this complexity can lead to fragmentation, inconsistent policies, and vulnerability to breaches. A well-orchestrated framework ensures that every process, from data creation to disposal, adheres to strict quality, protection, and legal requirements. This synergy not only safeguards assets but also enhances business credibility by demonstrating transparency and accountability to regulators, partners, and customers.
A truly integrated governance and compliance ecosystem transforms reactive data protection into proactive risk management. By embedding compliance within security and governance workflows, organizations can automate enforcement, reduce operational overhead, and minimize the risk of nonconformity. Integration elevates governance from theoretical policy to measurable performance, ensuring every decision, control, and audit trail reflects the organization’s commitment to ethical and lawful data management.
Establishing Unified Governance Structures
The integration process begins by defining a unified governance structure that encompasses policies, ownership, and accountability mechanisms. Governance defines how data is classified, accessed, stored, and utilized across the enterprise. However, for governance to be effective, it must extend beyond documentation into enforcement. This requires collaboration among leadership, IT, legal, and compliance teams to ensure every policy has operational backing.
A unified governance model assigns clear ownership for data domains, aligning responsibilities with functional expertise. Data stewards oversee quality and consistency, while compliance officers ensure adherence to regulatory frameworks. Security teams handle encryption, monitoring, and access control mechanisms that enforce governance policies in practice. When these roles operate cohesively, the organization achieves alignment between strategy and execution.
A governance council or committee often serves as the central decision-making body, bridging departmental boundaries. This body establishes policies that align with industry standards and regulations while balancing accessibility with confidentiality. It reviews audit findings, oversees policy updates, and ensures the governance framework evolves alongside changing business requirements and compliance landscapes.
Unified governance also requires standardization. Data definitions, classifications, and metadata must remain consistent across departments to avoid discrepancies that complicate compliance and auditing. Centralized metadata catalogs enhance visibility, allowing stewards and compliance teams to trace data lineage, understand dependencies, and verify accuracy across interconnected systems.
The Role of Security in Reinforcing Governance
Security translates governance policies into operational controls. Where governance defines what should be protected and why, security dictates how. This translation involves deploying encryption, authentication, authorization, and monitoring mechanisms that ensure compliance requirements are met at every stage of the data lifecycle.
Access control forms the foundation of governance enforcement. Role-based access mechanisms ensure that users only access information necessary for their responsibilities. This principle of least privilege minimizes risk exposure while maintaining efficiency. Multifactor authentication, conditional access policies, and behavioral analytics further strengthen this layer by detecting unauthorized or suspicious activities.
Encryption protects sensitive data in transit and at rest, safeguarding it from interception or tampering. Security teams implement key management policies aligned with governance mandates, ensuring cryptographic practices meet industry standards. These measures help prevent data compromise and support compliance with privacy regulations such as GDPR and regional data protection acts.
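As a small illustration of encryption at rest, the snippet below uses the cryptography package's Fernet recipe (symmetric, authenticated encryption) to protect a record before storage. It is a sketch only: in production the key would be issued and rotated by a managed key vault under the key management policies above, never generated inline.

```python
from cryptography.fernet import Fernet

# For illustration only: a real key comes from a key management service.
key = Fernet.generate_key()
cipher = Fernet(key)

record = b'{"customer_id": 1042, "email": "jane@example.com"}'
ciphertext = cipher.encrypt(record)    # authenticated encryption of the record
restored = cipher.decrypt(ciphertext)  # raises InvalidToken if tampered with

assert restored == record
```

Because Fernet tokens are authenticated, any tampering with the stored ciphertext is detected on decryption, which supports the integrity goals described here.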
Monitoring and alerting systems serve as the continuous feedback loop within integrated governance structures. Security information and event management (SIEM) systems collect logs from across the environment, detecting anomalies that may indicate policy violations or potential breaches. Automated incident response workflows ensure rapid containment and documentation of any incidents, preserving compliance integrity and demonstrating due diligence during audits.
Through automation and analytics, security operations shift from reactive defense to proactive enforcement. This synergy between governance frameworks and security controls transforms policy compliance into an intrinsic part of daily operations.
Compliance as the Validation Mechanism
Compliance acts as the verification mechanism that validates governance and security efforts. It ensures that every control and process aligns with external regulations and internal ethical standards. While governance establishes principles and security enforces them, compliance verifies effectiveness through monitoring, documentation, and independent auditing.
Compliance requirements vary across industries but share the common goal of protecting stakeholders. Financial institutions must comply with frameworks like PCI DSS and SOX, while healthcare organizations adhere to HIPAA and related patient data regulations. Data privacy mandates, such as GDPR and CCPA, extend across industries, demanding transparency in data collection and processing.
Integrating compliance within governance ensures consistency between internal policies and external laws. Compliance officers collaborate with governance architects to interpret regulations and translate them into enforceable rules. For example, a regulation requiring limited data retention translates into a governance policy defining automatic deletion schedules and a security control that enforces them.
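The retention example translates naturally into an automated job. The sketch below assumes a hypothetical customer_records table with a created_at timestamp and simply deletes rows older than the policy window; a real implementation would also write the deletions to the audit trail discussed below.

```python
import sqlite3
from datetime import datetime, timedelta, timezone

RETENTION_DAYS = 365  # hypothetical policy: retain personal data for one year


def enforce_retention(conn: sqlite3.Connection) -> int:
    """Delete records older than the retention window and return the count."""
    cutoff = datetime.now(timezone.utc) - timedelta(days=RETENTION_DAYS)
    cursor = conn.execute(
        "DELETE FROM customer_records WHERE created_at < ?",
        (cutoff.isoformat(),),
    )
    conn.commit()
    return cursor.rowcount


# Example usage against an in-memory database with a hypothetical schema.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer_records (id INTEGER, created_at TEXT)")
old_timestamp = (datetime.now(timezone.utc) - timedelta(days=400)).isoformat()
conn.execute("INSERT INTO customer_records VALUES (1, ?)", (old_timestamp,))
print(enforce_retention(conn), "record(s) removed under the retention policy")
```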
Auditing forms a crucial pillar of compliance. Regular audits verify whether security controls and governance policies are functioning effectively. They examine data lineage, access logs, and incident reports to ensure adherence. Automated compliance tools further streamline this process by continuously tracking metrics and generating reports. This ongoing validation reduces the need for reactive adjustments and keeps organizations continuously aligned with evolving regulatory expectations.
Automation and Continuous Monitoring
Integration between governance, security, and compliance reaches full maturity when supported by automation. Automated systems enable continuous enforcement of policies and real-time detection of deviations. This minimizes the reliance on manual oversight, reduces human error, and strengthens regulatory adherence.
Automation tools apply governance policies dynamically, tagging data according to classification levels, applying encryption where necessary, and managing access permissions based on user roles. Machine learning models enhance this capability by analyzing behavioral patterns to predict and prevent potential compliance breaches.
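One way to picture this dynamic policy application is a mapping from classification tags to the controls each tag requires. The mapping and asset names below are hypothetical, and the function only reports which safeguards apply; a production system would invoke the encryption, masking, and access services directly.

```python
# Hypothetical mapping from classification level to required controls.
CLASSIFICATION_POLICY = {
    "public":       {"encrypt_at_rest": False, "mask_fields": False, "allowed_roles": {"everyone"}},
    "internal":     {"encrypt_at_rest": True,  "mask_fields": False, "allowed_roles": {"employee"}},
    "confidential": {"encrypt_at_rest": True,  "mask_fields": True,  "allowed_roles": {"data-steward", "compliance"}},
}


def controls_for(asset_name: str, classification: str) -> dict:
    """Return the safeguards an asset should receive for its classification tag."""
    if classification not in CLASSIFICATION_POLICY:
        raise ValueError(f"Unknown classification for {asset_name}: {classification}")
    return CLASSIFICATION_POLICY[classification]


print(controls_for("hr_salaries.csv", "confidential"))
```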
Continuous monitoring systems serve as the sentinel of integrated governance. They observe data flows, detect anomalies, and generate alerts for violations of security or compliance policies. For example, if an unauthorized user accesses a restricted dataset, monitoring tools immediately trigger alerts and block further activity. Integration with incident response platforms ensures prompt investigation, remediation, and documentation of events.
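A stripped-down version of that monitoring loop might look like the following sketch. The access-event format, the restricted-dataset list, and the alert routing are all assumptions; a production deployment would forward the alert to a SIEM and an incident response workflow rather than print it.

```python
from datetime import datetime, timezone

# Hypothetical configuration: datasets that only specific roles may read.
RESTRICTED_DATASETS = {"payroll", "patient_records"}
AUTHORIZED_ROLES = {"payroll": {"hr-admin"}, "patient_records": {"clinical-staff"}}


def check_access_event(event: dict) -> None:
    """Flag unauthorized reads of restricted datasets and signal a block."""
    dataset = event["dataset"]
    if dataset in RESTRICTED_DATASETS and event["role"] not in AUTHORIZED_ROLES[dataset]:
        alert = {
            "time": datetime.now(timezone.utc).isoformat(),
            "severity": "high",
            "message": f"Unauthorized access to {dataset} by {event['user']}",
            "action": "block-session",
        }
        print("ALERT:", alert)  # stand-in for routing to a SIEM / response platform


# Example event as it might arrive from an access-log stream.
check_access_event({"user": "intern@example.com", "role": "intern", "dataset": "payroll"})
```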
Automated compliance dashboards offer real-time visibility into policy adherence. They display metrics such as encryption coverage, data access frequency, retention policy adherence, and incident resolution time. This visibility empowers leadership to make informed decisions, demonstrating accountability to auditors and regulators.
By embedding automation into the governance framework, organizations transform policy enforcement from a periodic task into a continuous, adaptive process. This creates a self-regulating ecosystem capable of responding to threats and regulatory changes autonomously.
Cross-Functional Collaboration and Cultural Integration
Governance integration succeeds only when it becomes part of the organizational culture. It cannot exist solely within IT or compliance departments; it must involve every stakeholder interacting with data. Cross-functional collaboration ensures that governance is understood, respected, and practiced consistently across all levels.
Governance teams define overarching strategies and frameworks. Security engineers operationalize controls and manage technology infrastructure. Compliance officers interpret legal obligations, ensuring alignment with policies. Data stewards oversee accuracy, quality, and lifecycle management, while business leaders champion adherence across departments.
Cultural integration requires training, communication, and leadership commitment. Employees must understand the value of compliance not as an obligation but as a principle of trust. Awareness programs emphasize responsible data handling, the importance of adhering to classification levels, and recognizing potential compliance risks.
Leadership commitment is essential for sustaining this culture. Executives who prioritize governance and allocate resources to compliance initiatives set a precedent for accountability. This top-down endorsement reinforces the message that governance and security are strategic imperatives, not operational burdens.
Cross-functional collaboration also enhances problem-solving. When governance, security, and compliance professionals collaborate, they identify overlapping goals, reduce redundancy, and streamline enforcement mechanisms. This collaborative approach transforms governance into an organizational competency rather than a departmental responsibility.
Final Tips
Integrating governance, security, and compliance creates a resilient framework for risk management. Risk assessment identifies potential vulnerabilities, evaluates their impact, and determines mitigation strategies. Governance frameworks establish policies for risk response, security controls enforce those responses, and compliance verifies their effectiveness.
Risk management within integrated frameworks relies on continuous assessment and adaptive policy evolution. As new technologies, data types, and regulations emerge, existing policies must evolve. For instance, the adoption of AI or cloud-native architectures introduces new risks requiring revised governance standards and updated compliance mappings.
Adaptive policy evolution ensures that governance remains flexible yet enforceable. Regular reviews of control effectiveness, combined with lessons learned from audits and incidents, guide policy refinement. Version control mechanisms maintain documentation of changes for traceability, supporting transparency during compliance reviews.
Integrated risk management also leverages analytics for predictive insight. Data-driven risk modeling identifies patterns indicating potential compliance breaches or security gaps. Predictive analysis enables preemptive mitigation, reducing the likelihood of incidents and regulatory violations. This proactive risk posture not only strengthens governance but also enhances organizational resilience.
Trust represents the ultimate outcome of integrating governance, security, and compliance. Stakeholders—including customers, partners, regulators, and employees—must believe that an organization handles data ethically, securely, and transparently. Governance ensures that policies reflect ethical values, security provides tangible assurance through protection mechanisms, and compliance demonstrates accountability through documentation and verification.
Sustaining this trust requires consistency and transparency. Regular reporting, open communication about compliance achievements, and swift response to incidents reinforce organizational credibility. Transparency in data practices assures customers that their information is treated with respect and diligence.
Sustainability also depends on innovation. As organizations evolve, so too must their governance frameworks. Integrating emerging technologies like blockchain for immutable audit trails or AI for intelligent compliance monitoring enhances both efficiency and trustworthiness. These advancements transform governance from static oversight into dynamic stewardship capable of scaling with growth.
Ultimately, integrating governance, security, and compliance builds an ecosystem where data is both powerful and protected. It establishes a culture of responsibility, a system of precision, and a framework of resilience. By weaving these principles into every process, organizations not only meet regulatory expectations but also position themselves as trusted leaders in the responsible and ethical management of data.
Implementing governance and security is not a one-time project—it is a continuous process of improvement and adaptation. As data volumes grow and technologies evolve, new vulnerabilities and governance challenges emerge. Continuous improvement ensures that policies remain relevant, enforcement mechanisms remain effective, and security protocols adapt to new risks.
Organizations should establish governance maturity models that track progress across metrics such as policy compliance, incident reduction, and user awareness. Regular training initiatives foster a culture of accountability where every user understands their role in data protection. Feedback from users and auditors informs iterative policy refinement, creating a feedback loop that sustains improvement.
Beyond compliance, governance also intersects with data ethics—the responsible and transparent use of data for legitimate purposes. Ethical governance promotes fairness, privacy, and accountability in analytics, ensuring that data is not only protected but also used responsibly. Embedding ethical principles within governance frameworks strengthens public trust and safeguards organizational reputation.
Continuous improvement also leverages technology advancements. Artificial intelligence and automation streamline monitoring, anomaly detection, and remediation. Predictive analytics identify emerging risks, enabling proactive policy adjustments. Over time, this adaptive governance model creates resilience, ensuring that security and integrity remain steadfast amid constant change.