How to Use Test Data

Test steps often rely on data parameters, such as menu item names, usernames, or numeric values. You can either hard-code these values or parameterize your test cases to use different variable values. Additionally, BlazeMeter's test data integration helps you use shared test data consistently across all tests and virtual services.

Important: Use only anonymized or synthetic data for testing. Don't use production data in your tests, or anything that contains confidential information or personally identifiable information (PII).

How can I parameterize my test data?

If you choose to parameterize your tests, you have the following options:

  • Attach CSV files that contain your test data.
  • Generate synthetic test data with Test Data Generator Functions.
  • Load shared Data Entities from your workspace.

You can also use any combination of these data sources.

When to choose which option?

  • If your test data is a list of fixed values, or has dependencies between columns, the best solution is to collect this data in spreadsheet columns and attach the CSV file to the test.

  • If you need dynamic test data, generate it synthetically. Generated test data looks like real or random data, but you have full control over its form, and you don't have to collect it yourself. You generate data using Test Data Generator Functions.
    • Synthetic test data is advantageous in tests where you need dynamic parameter values, such as relative date stamps ("today" or "last month"), fake but valid credit card numbers, random but plausible names, and so on.
    • BlazeMeter additionally helps you avoid invalid values: The included functions don't generate dates such as February 31, names such as asdf%as'df, nor credit card numbers with invalid checksums.
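The checksum rule mentioned above is the Luhn algorithm. As an illustration of why a generator must respect it (this is a Python sketch, not BlazeMeter code), here is a minimal validity check:

```python
def luhn_valid(number):
    """Return True if the digit string passes the Luhn checksum."""
    digits = [int(d) for d in number]
    # Double every second digit from the right; subtract 9 when the result
    # exceeds 9, then require the total to be divisible by 10.
    for i in range(len(digits) - 2, -1, -2):
        digits[i] *= 2
        if digits[i] > 9:
            digits[i] -= 9
    return sum(digits) % 10 == 0

print(luhn_valid("4111111111111111"))  # → True  (a well-known valid test number)
print(luhn_valid("4111111111111112"))  # → False (last digit breaks the checksum)
```

A random 16-digit string fails this check about nine times out of ten, which is why naively generated card numbers are rejected by most systems under test.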

How to add Data Entities?

A Data Entity is a container for test data. You can still add data from other sources, such as CSV files or synthetic test data, later.

GUI Functional tests currently support one Data Entity only. Performance tests support multiple Data Entities.


To add a Data Entity to a Performance test, follow these steps:

  1. Open a Performance test and click Test Data to open the Test Data pane.
  2. Click either the big blue Plus button or TEST DATA... > Add Data Entity.
    Choose one of the following options:
    • Create New Data Entity. Use this option to manually add Data Parameters. Give the Data Entity a descriptive, unique name.
    • Create New Data Entity From CSV File. Select a CSV file with test data from your file system.
    • Attach CSV Files From Test Configuration. Use this option to create a new Data Entity from an already present CSV file in the test configuration.

    • Load Data Entity From Workspace. Select a shared Data Entity that is stored in your workspace.
    • Import Data Entity From Exported File. Use this option to create a new data entity from a previously exported file.
  3. Click Add.

How to manage Data Entities?

To view all Data Entities, either click Test Data in the top navigation, or, in the Test Data pane, click TEST DATA... > Manage Data Entities. If no team members have saved shared Data Entities to the Workspace yet, the Test Data Management window is empty.

Hover the mouse over a Data Entity to access editor buttons.

  • Click Clone Data Entity to make a copy that you can edit without interfering with tests that are using the original shared Data Entity.
  • Click Rename Data Entity to enter a new name for this shared Data Entity.
  • Click Delete Data Entity to remove a shared Data Entity from all tests. Verify whether the data is unused before deleting the Entity.
  • Click Copy Entity ID to Clipboard to be able to refer to the Entity in an API call.

Each Data Entity has three tabs that let you drill deeper:

  • The Data Parameters tab lets you edit data parameters.
  • The Usage tab lets you see which tests and which Mock Services are using this shared Data Entity.
  • The Data Preview tab lets you preview test data values.

How to share or back up test data?

For each data entity in the Test Data pane, use the Ellipsis menu next to its name to manage it:

  • Use the Ellipsis menu to Rename or Remove Data Entities from the test. Removed Data Entities that you haven't saved to the Workspace are deleted permanently.
  • To export any Data Entity as a spreadsheet so that you can import it, for example, into MS Excel, use the Ellipsis menu and select ... > Download Data As CSV.
  • To share a Data Entity with your team, use the Ellipsis menu and select ... > Save to Workspace. To use a shared Data Entity in your test, select ... > Load From Workspace.
  • To make local backups, use the Ellipsis menu and select ... > Export to File. To restore a Data Entity from such an exported file, select ... > Import From File.

For more information, see How to Share Test Data.

How to manage CSV files?

Expand the Data Entity and hover the mouse over a CSV file in the Data Entity to access the following buttons:

  • Download CSV
  • Edit CSV
  • Delete CSV


How to insert Data Parameters into tests?

If you are using the GUI Functional Test debugger, click Stop to close the debugger before defining new test data.

  1. Open a Test and go to the Configuration tab.
  2. Click Test Data.
    A pane with data entities opens on the right side.
  3. Click the Plus sign and create a parameter.
    To initialize the parameter, enter either a static value or a data generator function.
  4. Click Copy Parameter Name to Clipboard.
    The clipboard now contains a string such as ${address}.
  5. Return to the test configuration and edit the appropriate test step. For a GUI Functional test, edit a test step in the Scriptless Editor. For a Performance test, edit an HTTP Request field in JMeter.
  6. Replace a hard-coded value with the pasted parameter.
    Example: In a GUI Functional Test, paste the parameter as a value into a test action.
  7. Click Run Test in the left column to execute the test.

BlazeMeter now draws test data from your loaded Data Entities and replaces Data Parameters in the test with your chosen dynamic values.
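Conceptually, the substitution in the steps above works like template replacement: each ${name} placeholder in a test step is swapped for a value from the current row of test data. A minimal Python sketch (the step text and parameter name are made up for illustration):

```python
import re

def substitute(step_text, row):
    """Replace each ${name} placeholder with the value from one row of test data."""
    return re.sub(r"\$\{(\w+)\}", lambda m: str(row[m.group(1)]), step_text)

# Hypothetical row of test data for one iteration:
row = {"address": "221B Baker Street"}
print(substitute("Type ${address} into the street field", row))
# → Type 221B Baker Street into the street field
```

Each iteration of the test repeats this substitution with the next row of data.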

How to manage Data Parameters?

Parameter names have to be unique across all Data Entities in your test. If two parameters have the same name, one overrides the other in an arbitrary order. Therefore, especially after uploading CSV files, review the loaded parameter names in the Test Data Management pane and rename them if needed. You can edit Data Parameters for a specific test, or edit shared data for everyone who uses it.
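As an analogy (not BlazeMeter's implementation), the name clash behaves like merging dictionaries into one namespace, where a duplicate key silently wins:

```python
# Two hypothetical Data Entities that both define a "city" parameter.
entity_a = {"username": "test_user", "city": "London"}
entity_b = {"city": "Paris", "zip": "75001"}

# All parameters share one namespace; on a name clash one value silently
# wins (here, the later entity), which is why unique names matter.
merged = {**entity_a, **entity_b}
print(merged["city"])  # → Paris; entity_a's "London" is lost
```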

Edit Data Parameters in the current test

Open the Test Data pane and hover the mouse over a parameter definition to show the editor buttons.


The four buttons have the following functionalities, from left to right:

  • Copy Parameter Name to Clipboard so you can paste it easily into the test definition.
  • Preview one instance of generated data. Click again to toggle back to the function view.
  • Edit this parameter definition to change the parameter name, the function, or its arguments.
  • Delete this parameter. If the deleted parameter is in use in a scenario step, the step will break.

To copy or delete Data Parameters in bulk within the same Data Entity, use the checkboxes:

  1. Select the individual Data Parameters that you want to edit, or click Select All.
    Additional bulk action buttons appear.

  2. Perform one of the following bulk actions on the selected parameters:

    • To copy, click Clone Parameters.

    • To delete, click Delete Parameters.


Edit Data Parameters of all Data Entities

  1. Click Test Data in the top navigation.

    The Test Data Management window shows the list of shared Data Entities in this workspace.

  2. (Optional) Sort the shared Data Entities by name, by who last edited the entity, or by last updated date.
  3. Expand a Data Entity in the list.

    Three tabs appear: Data Parameters, Usage, and Data Preview.

  4. Go to the Data Parameters tab and select a Data Parameter from the list.

    Details appear in the right pane.

  5. Edit the following properties:
    • Parameter Name

      Editing the name here updates it in all tests that use this Data Parameter.

    • Parameter Value

      Enter either a static (hard-coded) string or number, or select a data generator function.

      To define a function:

      1. Enter a few letters of a search term to filter the list of functions.

        Examples: rand, day, date, credit, ssn, csv, json, loremipsum, seedlist, regexp, and so on.

      2. Click a function name to preview its help page, its arguments, and examples.
      3. Hover the mouse over a function name and click Insert Function.

        The empty function is inserted with the caret between the parentheses.

        Example: randInt()

      4. Fill in the required arguments as shown in the examples.

        Example: randInt(18,99)

        Tip: If you start typing into the value field, BlazeMeter suggests a list of known functions to insert. To view the list of functions for a parameter value, press Ctrl+Space. For more information about available functions, see also Test Data Generator Functions.

BlazeMeter auto-saves your changes when you leave a field.


Which shared data is used in which tests or Mock Services?

  1. Click Test Data in the top navigation.

    The Test Data Management window shows the list of shared Data Entities in this workspace.

  2. Expand a Data Entity in the list.

    Three tabs appear: Data Parameters, Usage, and Data Preview.

  3. Go to the Usage tab.
  4. Select either Tests or Mock Services.

    BlazeMeter lists the Tests or Mock Services (respectively) that use this Data Entity.

    The list includes the following details:

    • The names of the tests or Mock Services, respectively.
    • Test type (for tests only)
    • Either project name (for tests only) or Service name (for Service Virtualization only)
    • Last Run date and time
  5. Click the name of the test or Mock Service to open it.

How to preview test data?

Before a test run, you can preview your test data in the Test Data pane by clicking Iterations (for GUI Functional Tests) or Data Settings (for Performance Tests and Mock Services), respectively.

After a test run, go to the History tab, open the test Report, and go to the Summary tab. Under Test Data, click Show Test Data to review which data was used. Here you can also click the Rerun button to run this test again with the same test data, which can be useful for debugging.

You can also preview the data in all shared Data Entities.

  1. Click Test Data in the top navigation.

    The Test Data Management window shows the list of shared Data Entities in this workspace.

  2. Expand a Data Entity in the list.

    Three tabs appear: Data Parameters, Usage, and Data Preview.

  3. Go to the Data Preview tab.

  4. Preview the data.

  5. (Optional) Define the data size.

  6. (Optional) Download the data as a CSV file.

How to control the number of iterations (data size)?

When test data is generated by functions and not hard-coded, it is available in any quantity. When you load test data from CSV files, you choose whether to use all rows or a subset. The number of rows of test data corresponds to the number of iterations of a GUI Functional test, so while debugging, you may prefer to run only one iteration.

To control how many rows of test data are used in a test, configure the test iteration settings.
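The row-to-iteration relationship described above can be sketched as follows (all file contents and names are made up; the cap stands in for the test iteration settings):

```python
import csv
import io

# A made-up CSV attached to a test; each data row drives one iteration.
csv_text = """username,pin
alice,1234
bob,5678
carol,9012
"""

def run_iterations(csv_text, max_iterations=None):
    """Yield one simulated test step per CSV row, optionally capped for debugging."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    if max_iterations is not None:
        rows = rows[:max_iterations]   # e.g. a single iteration while debugging
    for row in rows:
        yield f"login as {row['username']}"

print(list(run_iterations(csv_text)))                    # three iterations
print(list(run_iterations(csv_text, max_iterations=1)))  # one iteration
```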


Can I control the random distributions?

Yes. You can either keep the selection completely random or influence the distribution of the dynamically returned values: define alternative values or functions and choose a probability for each. The probabilities must add up to 100 percent.

  1. After defining a Data Parameter, click Distribution.

    The Synthetic Data Generation By Distribution window opens.

  2. Choose a Distribution Mode:

    • Random — Selects values pseudo-randomly. Each value has the same probability, but an equal distribution is not guaranteed. This is the default.

    • Probability % — Lets you define a probability for each value; the actual distribution is not guaranteed, though. For example, a 50-50 distribution may end up being 50.1% and 49.9%.

    • Guaranteed % — Each value is selected exactly according to the given percentage distribution. For example, a 25-25-25-25 distribution will actually select each of the four values in turn.

  3. Click the Plus button to add alternative values or functions.

  4. (For Probability % and Guaranteed % modes only) Define Distribution Settings by entering the desired percentage for each value. The total must add up to 100%.

  5. Click Save.
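The difference between the Probability % and Guaranteed % modes can be illustrated in Python (an analogy, not BlazeMeter code; the values are made up):

```python
import random

values = ["visa", "mastercard", "amex", "discover"]  # made-up alternatives
weights = [25, 25, 25, 25]                           # must add up to 100

# Probability %: a weighted random choice; over 100 draws, the actual
# counts only approximate the weights.
probabilistic = random.choices(values, weights=weights, k=100)

# Guaranteed %: with a 25-25-25-25 split, each value is selected
# exactly 25 times, here in turn (round-robin).
guaranteed = [values[i % len(values)] for i in range(100)]

print(guaranteed.count("visa"))     # → 25, exactly
print(probabilistic.count("visa"))  # close to 25, but not guaranteed
```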

Troubleshooting: How to resolve data definition errors?

No data can be generated as long as there are errors in the data definition. Errors can be caused by misspelled functions, missing function arguments, missing mandatory values, inappropriate data types (such as text where a number is expected), unpaired quotation marks or parentheses, missing files, and so on.

Errors are clearly indicated by the inline validation in the Test Data pane, so you can address each warning immediately. If you have loaded a Data Model that contains multiple errors, an error bar will indicate the number of issues to address before data can be generated. Click the number in the error bar to review a detailed list of detected issues and a description of the error. Then click each listed data parameter or data entity to quickly navigate to the item to fix it.



Symptom: I see the validation message "Type string is not assignable to type number" for randInt( 1 + ${x} , ${y} + 1000 ).
I've defined two numeric data parameters x and y and used them in an addition inside function arguments. How do I make the + operator interpret the parameters as numbers?

Solution: Inside a function argument, the overloaded + operator resolves the unknown types as string and throws a type validation error. Apply the ToNumber() function to each parameter to make the + operator act as numeric addition instead of string concatenation:
randInt(1+ToNumber(${x}), ToNumber(${y})+1000)
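The same pitfall can be shown in Python terms (the parameter values are hypothetical; int() plays the role of ToNumber()):

```python
import random

x, y = "18", "900"  # data parameter values arrive as strings (hypothetical values)

# In BlazeMeter's expression language, the overloaded + treats the unknown
# types as strings and concatenates (or fails validation); in Python,
# 1 + "18" raises a TypeError. Either way, an explicit conversion,
# analogous to ToNumber(), makes + behave as numeric addition:
low = 1 + int(x)       # 19
high = int(y) + 1000   # 1900
print(random.randint(low, high))  # a random integer between 19 and 1900
```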