Sensitivity and Scenario Analysis

What are Sensitivity & Scenario Analyses?

Sensitivity & scenario analyses are versatile — they are used to assess and predict the outcomes of any project, situation, or process that can be modeled quantitatively.

The inputs, or independent variables, are assumptions that are used to calculate the target outputs, or dependent variables, of a model (Figure 1). These target outputs are key performance metrics (e.g., costs, profits, efficiencies, yields, etc.), or quantitative metrics indicative of long-term success.

Figure 1. General Model

Figure 2. Sensitivity vs. Scenario Analysis

Sensitivity analysis is the study of how the inputs of a model or process can affect the target outputs. In a sensitivity analysis, one or more of the parameters are changed to different values, while the rest stay at their baseline values (Figure 2).

Altering the values of these parameters may have linear or exponential effects on the target outputs (Figure 3). The goal of a sensitivity analysis is to identify the model or process parameters with the biggest impacts on the target outputs. A sensitivity analysis can also identify synergistic or antagonistic interactions between parameters when multiple parameters are changed simultaneously.

Figure 3. Linear & Exponential Effects of Inputs

Figure 4 shows a simple financial model that can be used for sensitivity analysis. The target
outputs of this model are profit and/or revenue, both of which could help someone assess or predict the financial performance of a company.

Figure 4. Financial Model

Figure 5. Sensitivity Analysis: Financial Model

The input variables of this model are product costs, product pricing, volume of product sold, payroll costs, and workspace costs. A sensitivity analysis of the financial model (Figure 4) can help companies or investors predict how a change in a product cost, for example, can affect the company’s revenue and/or profit (Figure 5).
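The one-at-a-time procedure behind a sensitivity analysis can be sketched in a few lines of code. The toy profit model, parameter names, and baseline figures below are illustrative placeholders, not values taken from Figures 4–5:

```python
# One-at-a-time sensitivity analysis on a toy financial model.
# All figures are illustrative placeholders, not real company data.

def profit(unit_cost, unit_price, volume, payroll, workspace):
    """Profit = revenue - (product costs + payroll + workspace)."""
    revenue = unit_price * volume
    expenses = unit_cost * volume + payroll + workspace
    return revenue - expenses

baseline = dict(unit_cost=4.0, unit_price=10.0, volume=10_000,
                payroll=25_000, workspace=10_000)
base_profit = profit(**baseline)

# Vary each input +/-10% while holding the others at their baselines.
for name, value in baseline.items():
    for shift in (-0.10, 0.10):
        inputs = {**baseline, name: value * (1 + shift)}
        delta = profit(**inputs) - base_profit
        print(f"{name} {shift:+.0%}: profit change {delta:+,.0f}")
```

Ranking the inputs by the size of the profit change they cause identifies the parameters the model is most sensitive to.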

Scenario analysis, on the other hand, is the study of how a particular set of inputs — a scenario — affects the target outputs. In a scenario analysis, all of the parameters are changed at the same time (Figure 2). The goal of scenario analysis is to predict the target outputs of the model or process under different sets of assumptions.

The following are three commonly assessed scenarios:

  1. Base-case scenario. This refers to the current and/or likeliest scenario. This scenario uses the baseline values for each parameter.

  2. Worst-case scenario. This refers to the worst possible scenario. This scenario uses the worst potential values for each parameter.

  3. Best-case scenario. This refers to the best possible scenario. This scenario would use the best potential values for each parameter.
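A minimal sketch of evaluating these three scenarios against a toy profit model follows; every parameter value is an illustrative assumption, with all inputs changed together for each scenario:

```python
# Base-, worst-, and best-case scenario analysis on a toy profit
# model. All parameter values are illustrative assumptions.

def profit(unit_cost, unit_price, volume, payroll, workspace):
    return unit_price * volume - (unit_cost * volume + payroll + workspace)

scenarios = {
    "base":  dict(unit_cost=4.0, unit_price=10.0, volume=10_000,
                  payroll=25_000, workspace=10_000),
    "worst": dict(unit_cost=5.0, unit_price=8.0, volume=7_000,
                  payroll=30_000, workspace=12_000),
    "best":  dict(unit_cost=3.0, unit_price=12.0, volume=14_000,
                  payroll=25_000, workspace=9_000),
}

# Unlike a sensitivity analysis, every input shifts at once per scenario.
for name, inputs in scenarios.items():
    print(f"{name}-case profit: {profit(**inputs):,.0f}")
```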

Helikon’s Bioprocess Sensitivity & Scenario Analyses

At Helikon, we use our industry-specific knowledge to conduct sensitivity and scenario analyses of specialized bioprocesses, including those used by cell-based meat and precision fermentation companies. We can work with company R&D data to create company-specific bioprocess models that can demonstrate both current and future bioprocess capabilities. A bioprocess sensitivity or scenario analysis is used to assess a company’s technological capabilities and limitations. Helikon’s sensitivity analyses focus on industry-specific performance metrics, including, but not limited to, protein or cell mass yields, product (e.g., recombinant proteins, cell-based products, culture media) production costs, bioreactor/seed train costs, and greenhouse gas (GHG) emissions.

Cell-based Meat Sensitivity & Scenario Analyses

Figure 6. Cell-Based Meat Bioprocess Model

At Helikon, we can model the bioprocesses of cell-based meat companies using company-specific R&D data (Figure 6). We are able to perform sensitivity and scenario analyses on these models to answer questions about the company’s current or potential bioprocess. A sensitivity analysis allows us to identify the bioprocess parameters that have the largest effects on the cost and/or time efficiency of cell-based meat production. With scenario analysis, we can use the biological limitations of bioprocess parameters to estimate a company’s best- and worst-case scenarios.

If provided with the proper data, we can determine the following:

  • Cost to produce 1 kg of cell-based meat product at bench/pilot/commercial scale.

  • Time to produce 1 kg of cell-based meat at bench/pilot/commercial scale.

  • Volume and cost of culture media needed to produce 1 kg of cell-based meat.

  • Energy to produce 1 kg of cell-based meat product and resulting greenhouse gas (GHG) emissions.

  • Volume of product produced per liter of media.

  • Biologically feasible cost-reduction scenarios.

  • Relative importance of different bioprocess parameters.

These key metrics will allow us to assess what it would take for a company to produce cell-based meat products that are competitive with conventional meat products in terms of cost, time, and energy efficiency. To estimate these metrics, we need to consider all aspects of a company’s bioprocess, from their cell line development and bioreactor parameters to their food processing strategies. Table 1 lists some of these bioprocess considerations.

Table 1. Bioprocess Considerations

Below we explain the importance of cell growth kinetics — one of the many bioprocess considerations — in cell-based meat production and show how it can be used to conduct a sensitivity analysis.

Cell Growth Kinetics: A Bioprocess Consideration

Cell-based meat bioprocesses are heavily dependent on the biology of the starting cells used. These cells must grow and divide to generate enough cell biomass for a food product. To produce cell-based meat, understanding the growth kinetics of your cells in culture is essential. Cell growth patterns can be observed in cell growth curves. A cell growth curve is a plot that shows the number of cells in culture over time. The growth curve in Figure 7 shows the general growth pattern of cells in culture.

Figure 7. Cell Growth Curve

This growth pattern generally consists of the following four phases:

  1. Lag Phase. Cells are adapting to their culture conditions and no increase in cell number
    is seen. This is the period immediately after the starting cells are added to a culture.

  2. Exponential Growth Phase. Cells begin proliferating and an exponential increase in cell
    number is seen. This phase is also known as the log phase. The length of this phase
    depends on the starting concentration of cells, known as the seeding density; the growth
    rate of the cells; and the concentration of cells at which proliferation becomes
    inhibited, known as the saturation density or the maximum cell density.

  3. Stationary Phase. Cells are no longer proliferating at the same rate and no increase in
    cell number is seen. The cells have reached their maximum cell density. This phase is
    also known as the plateau phase.

  4. Death Phase. Cells start to die as the culture accumulates waste products and nutrients
    are depleted. An exponential decrease in cell number is seen.

A growth curve can be used to calculate several useful parameters about a cell line under given culture conditions. One of these parameters is the cell doubling time, the time it takes for the number of cells in culture to double. If a cell line has a doubling time of 2 days, then two days from a given time point, the cell density (number of cells per milliliter of culture) would have doubled. Cell doubling times can be estimated from the exponential growth phase of a cell growth curve, as shown in Figure 7. Another useful parameter is the saturation density, the density at which cells stop proliferating and the number of cells in culture remains constant. This parameter can be estimated from the stationary phase of a cell growth curve, as shown in Figure 7.
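Assuming exponential growth, the doubling time can be estimated from any two counts taken during the exponential phase, since N₂ = N₁ · 2^((t₂−t₁)/t_d). The counts in the sketch below are hypothetical:

```python
import math

# Estimate doubling time from two cell counts taken during the
# exponential growth phase; the counts below are hypothetical.

def doubling_time(t1, n1, t2, n2):
    """Doubling time (same units as t), assuming exponential growth
    between the two sampling points: n2 = n1 * 2**((t2 - t1) / td)."""
    return (t2 - t1) * math.log(2) / math.log(n2 / n1)

# e.g. 1e5 cells/mL on day 1 growing to 8e5 cells/mL by day 4
# is three doublings in three days, i.e. a 1-day doubling time.
td = doubling_time(1.0, 1e5, 4.0, 8e5)
print(f"doubling time: {td:.2f} days")
```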


The cell doubling time and saturation density are two parameters that must be considered for the production of cell-based meat. The shorter the doubling time, the faster we can produce enough cells for cell-based products. And the higher the saturation density, the more cell-based product we can produce in each bioreactor per run. Cell-based meat companies are trying to optimize their cell lines and culture conditions so that they can produce cell-based meat as quickly and efficiently as possible — they need the shortest doubling times and the highest saturation densities. If we conduct a sensitivity analysis, we can analyze how a change in the cell doubling time or the saturation density affects the energy, time, or cost efficiency of a company’s cell-based meat production.


With a deep assessment of a company’s technical capabilities, we can get a more accurate estimate of how a shorter doubling time can affect the time it takes to produce 1 kg of cell-based meat.
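As a rough illustration of that sensitivity, under idealized exponential growth the time to expand a seed population to a target cell number scales linearly with the doubling time. The cells-per-kilogram figure and seed size below are hypothetical assumptions, and the sketch ignores lag phase, passaging losses, and saturation limits:

```python
import math

# Rough sketch: time to expand a seed population to a target cell
# number under uninterrupted exponential growth. Ignores lag phase,
# passaging losses, and saturation. All values are hypothetical.

CELLS_PER_KG = 1e12  # assumed cell count per kg of product

def expansion_time(seed_cells, target_cells, doubling_time_days):
    """Days needed: number of doublings times the doubling time."""
    doublings = math.log2(target_cells / seed_cells)
    return doublings * doubling_time_days

for td in (0.5, 1.0, 2.0):
    t = expansion_time(seed_cells=1e6, target_cells=CELLS_PER_KG,
                       doubling_time_days=td)
    print(f"doubling time {td} d -> ~{t:.0f} days per kg equivalent")
```

Because the relationship is linear, halving the doubling time halves the expansion time, which is why this parameter tends to dominate a time-efficiency sensitivity analysis.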



At Helikon, we can take the answers to these questions and convert them into performance metrics that are useful even to those who are not familiar with the science behind cell-based meat production. Reach out to learn more about our technical capabilities, including environmental impact assessment, technical due diligence, and more for deep agtech, frontier biotech, and climate biotech startups and companies.

Written by Diana Garibay, Associate for Helikon Consulting
