Part 2: Framework for Assessing Rigidity, Friction, and Toil in Cybersecurity Solutions

The RFT Assessment Framework provides a lightweight and structured approach to evaluating cybersecurity solutions for Rigidity, Friction, and Toil. This framework is adaptable across diverse domains, including network security, identity management, cloud infrastructure, and application security. It emphasizes simplicity and repeatability while yielding actionable insights.

Core Components of the Security RFT Framework

  1. Preparation: Define the scope and gather baseline information about the cybersecurity solution(s) under evaluation.
  2. Assessment Dimensions:
    • Rigidity: Measures the solution’s adaptability and flexibility.
    • Friction: Assesses barriers to efficient implementation and usage.
    • Toil: Evaluates repetitive manual effort associated with operating and maintaining the solution.
  3. Scoring and Analysis: Assign scores to assessment criteria and derive insights.
  4. Action Planning: Develop recommendations to address identified pain points.

Step-by-Step Instruction Guide

Step 1: Preparation

  • Define Scope: Identify the specific cybersecurity solution(s) to be assessed. For example:
    • A threat detection platform.
    • An access control system.
    • Cloud security posture management (CSPM) tools.
  • Gather Stakeholders: Engage team members familiar with the solution, including security engineers, administrators, and users.
  • Baseline Data: Collect documentation, metrics, and user feedback related to the solution’s performance and usability.

Step 2: Assessing the Dimensions

Dimension 1: Rigidity

Objective: Evaluate the solution’s flexibility and adaptability to changing requirements.

  1. Criteria:
    • Integration Effort: How difficult is it to integrate the solution with other systems?
    • Modularity: Can components be replaced or upgraded independently?
    • Scalability: How well does the solution scale with organizational growth?
    • Vendor Lock-in: Are there dependencies that prevent switching to alternative solutions?
  2. Scoring:
    • 1 = Highly rigid (difficult to adapt or replace).
    • 5 = Highly flexible (easily adaptable and modular).
  3. Methodology:
    • Interview stakeholders to understand past integration efforts.
    • Review the solution’s architecture for modularity and dependencies.
    • Map scalability limitations to organizational growth scenarios.

Dimension 2: Friction

Objective: Assess barriers that slow down implementation or usage.

  1. Criteria:
    • Implementation Time: How long does it take to deploy or configure the solution?
    • Ease of Use: How intuitive are the solution’s interfaces and workflows?
    • Process Bottlenecks: Are there approval or decision-making delays caused by the solution?
    • Training Requirements: How much training is needed to use the solution effectively?
  2. Scoring:
    • 1 = High friction (complex, time-consuming, and difficult to use).
    • 5 = Low friction (quick, seamless, and user-friendly).
  3. Methodology:
    • Collect time-to-implement data for recent deployments.
    • Use user surveys to assess interface usability and training needs.
    • Document workflows and identify where bottlenecks occur.

Dimension 3: Toil

Objective: Measure the manual effort required to operate and maintain the solution.

  1. Criteria:
    • Automation Capability: How effectively does the solution automate repetitive tasks?
    • Error Frequency: How often do errors arise due to manual processes?
    • Operational Overhead: How much time is spent on day-to-day maintenance?
    • Incident Handling: Is incident response labor-intensive?
  2. Scoring:
    • 1 = High toil (manual effort dominates operations).
    • 5 = Low toil (highly automated with minimal manual intervention).
  3. Methodology:
    • Analyze task logs to quantify manual effort.
    • Review automation features and evaluate their adoption.
    • Interview team members responsible for operations to understand pain points.

Step 3: Scoring and Analysis

  1. Compile Scores:
    • Use a scoring matrix to record scores for each criterion across the three dimensions.

      Dimension | Criterion             | Score (1-5)
      Rigidity  | Integration Effort    |
                | Modularity            |
      Friction  | Implementation Time   |
                | Ease of Use           |
      Toil      | Automation Capability |
                | Operational Overhead  |
  2. Calculate Averages:
    • Compute the average score for each dimension to identify the primary pain points.
  3. Prioritize Findings:
    • Focus on dimensions with the lowest scores.
    • Highlight critical issues based on organizational priorities (e.g., quick wins for high-friction areas).
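The compile-and-average steps above can be sketched in a few lines of Python. This is an illustrative sketch only: the dimension and criterion names mirror the framework, but the scores are placeholder values, not real assessment data.

```python
# Scoring matrix: dimension -> {criterion: score on the 1-5 scale}.
# The scores below are illustrative placeholders, not real assessment data.
matrix = {
    "Rigidity": {"Integration Effort": 2, "Modularity": 3,
                 "Scalability": 4, "Vendor Lock-in": 2},
    "Friction": {"Implementation Time": 3, "Ease of Use": 4,
                 "Process Bottlenecks": 2, "Training Requirements": 3},
    "Toil": {"Automation Capability": 3, "Error Frequency": 2,
             "Operational Overhead": 2, "Incident Handling": 3},
}

# Calculate Averages: mean score per dimension (lower = more pain).
averages = {dim: sum(scores.values()) / len(scores)
            for dim, scores in matrix.items()}

# Prioritize Findings: dimensions sorted from the lowest average upward.
priorities = sorted(averages, key=averages.get)
for dim in priorities:
    print(f"{dim}: {averages[dim]:.2f}")
```

Even at this small scale, keeping the matrix in code (or a spreadsheet) makes reassessments in Step 4 directly comparable across evaluation cycles.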

Step 4: Action Planning

  1. Recommendations:
    • For Rigidity: Propose architectural changes, modular redesigns, or vendor-neutral alternatives.
    • For Friction: Suggest process automation, user training, or interface redesigns.
    • For Toil: Invest in automation tools, reduce manual dependencies, or explore outsourcing options.
  2. Develop a Roadmap:
    • Assign ownership for each improvement initiative.
    • Set realistic timelines and define success metrics.
  3. Follow-Up:
    • Reassess the solution periodically to measure progress.
    • Incorporate feedback loops to refine the framework for future evaluations.

Practical Example: Assessing an Identity and Access Management (IAM) System

  1. Rigidity Assessment:
    • Integration Effort: Rated 2 (high effort to integrate with modern SaaS platforms).
    • Modularity: Rated 3 (partial modularity in role definitions).
    • Average Rigidity Score: 2.5.
  2. Friction Assessment:
    • Implementation Time: Rated 2 (manual approval processes).
    • Ease of Use: Rated 4 (intuitive admin portal).
    • Average Friction Score: 3.0.
  3. Toil Assessment:
    • Automation Capability: Rated 3 (partial support for role provisioning automation).
    • Operational Overhead: Rated 2 (frequent manual access reviews).
    • Average Toil Score: 2.5.
  4. Insights:
    • Rigidity and toil are the key pain points due to outdated processes and limited automation.
    • Friction is moderate, but streamlining workflows could enhance the user experience.
  5. Actions:
    • Invest in an API-based integration layer to reduce rigidity.
    • Automate access review processes to cut toil.
    • Address workflow inefficiencies to improve friction scores.
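Running the same averaging calculation over the example confirms the dimension scores above. Only the two rated criteria per dimension are included, matching the worked example:

```python
# IAM example scores from the assessment above (two criteria per dimension).
iam_scores = {
    "Rigidity": {"Integration Effort": 2, "Modularity": 3},
    "Friction": {"Implementation Time": 2, "Ease of Use": 4},
    "Toil": {"Automation Capability": 3, "Operational Overhead": 2},
}

# Average per dimension, as in Step 3.
averages = {dim: sum(s.values()) / len(s) for dim, s in iam_scores.items()}
print(averages)  # Rigidity 2.5, Friction 3.0, Toil 2.5
```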

This framework empowers organizations to systematically evaluate cybersecurity solutions, pinpoint challenges, and implement targeted improvements, all while keeping the process lightweight and practical. By iterating on this approach, teams can foster a culture of continuous improvement in security operations.

Author: Aaron Rinehart
Aaron has spent his career solving complex, challenging engineering problems and transforming cybersecurity practices across a diverse set of industries: healthcare, insurance, government, aerospace, technology, higher education, and the military. He has been expanding the possibilities of chaos engineering in its application to other safety-critical portions of the IT domain, most notably in cybersecurity.
