
Added multiprocessing for image push/pulls based on input rate #53

Merged
vishnuchalla merged 6 commits into quay:master from vishnuchalla:dev
Apr 25, 2023

Conversation

@vishnuchalla (Collaborator)

Description

Added multiprocessing for image push/pulls using the specified input rate.
If we have n users, n push/pull jobs will be created at the given rate (rps), each uploading and then downloading 100 images.
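As a rough sketch of the behaviour described above (function names and the pacing strategy are hypothetical, not the PR's actual implementation):

```python
import multiprocessing
import time

def push_pull_job(user, rate, num_images=100):
    """Hypothetical per-user job: process num_images at roughly `rate` ops/sec."""
    interval = 1.0 / rate
    completed = 0
    for _ in range(num_images):
        # the real job shells out to `podman push` / `podman pull` here
        completed += 1
        time.sleep(interval)  # crude rate pacing
    return user, completed

def run_jobs(users, rate):
    # n users -> n parallel push/pull jobs, as described above
    with multiprocessing.Pool(processes=len(users)) as pool:
        return pool.starmap(push_pull_job, [(u, rate) for u in users])
```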

Testing

Tested on a self-managed OpenShift cluster.

INFO:__main__:Running Quay Scale & Performance Tests	date=2023-04-23T22:24:48.997381 host=https://example-registry-quay-quay-enterprise.apps.vchalla-quay-test-2.perfscale.devcluster.openshift.com test_uuid=d118304d-2a15-4a5d-89d9-431706047636 organization=test num_users=10 num_repos=11 num_teams=10 target_hit_size=10 concurrency=5 repos_with_tags_sizes=(100,) total_tags=100 pull_push_batch_size=400 number_of_push_pull_jobs_per_user=0
INFO:__main__:Running: Create Users	quantity=10
INFO:__main__:Preparing to execute 10 HTTP Requests.
Requests      [total, rate, throughput]         13, 5.42, 4.17
Duration      [total, attack, wait]             2.4s, 2.4s, 5.926µs
Latencies     [min, mean, 50, 90, 95, 99, max]  5.474µs, 426.361ms, 550.55ms, 561.498ms, 565.534ms, 566.465ms, 566.465ms
Bytes In      [total, mean]                     2460, 189.23
Bytes Out     [total, mean]                     630, 48.46
Success       [ratio]                           76.92%
Status Codes  [code:count]                      0:3  200:10  
Error Set:
no targets to attack
INFO:__main__:Results for test create_users written to file: ./logs/d118304d-2a15-4a5d-89d9-431706047636_create_users_result.json
INFO:__main__:Recording test results in ElasticSearch: https://search-perfscale-dev-chmf5l4sh66lvxbnadi4bznl3a.us-west-2.es.amazonaws.com
INFO:__main__:b'2023-04-23T22:24:53Z - INFO     - MainProcess - run_snafu: logging level is INFO\n2023-04-23T22:24:53Z - INFO     - MainProcess - _load_benchmarks: Successfully imported 3 benchmark modules: coremarkpro, systemd_analyze, uperf\n2023-04-23T22:24:53Z - INFO     - MainProcess - _load_benchmarks: Failed to import 0 benchmark modules: \n2023-04-23T22:24:53Z - INFO     - MainProcess - run_snafu: Using elasticsearch server with host: https://search-perfscale-dev-chmf5l4sh66lvxbnadi4bznl3a.us-west-2.es.amazonaws.com\n2023-04-23T22:24:53Z - INFO     - MainProcess - run_snafu: Using index prefix for ES: ripsaw-vegeta\n2023-04-23T22:24:53Z - INFO     - MainProcess - run_snafu: Connected to the elasticsearch cluster with info as follows:\n2023-04-23T22:24:53Z - INFO     - MainProcess - run_snafu: {\n    "name": "0968bdda07a483a2d52dedb5557380ac",\n    "cluster_name": "415909267177:perfscale-dev",\n    "cluster_uuid": "Xz2IU4etSieAeaO2j-QCUw",\n    "version": {\n        "number": "7.10.2",\n        "build_type": "tar",\n        "build_hash": "unknown",\n        "build_date": "2022-11-10T22:04:34.357368Z",\n        "build_snapshot": false,\n        "lucene_version": "9.3.0",\n        "minimum_wire_compatibility_version": "7.10.0",\n        "minimum_index_compatibility_version": "7.0.0"\n    },\n    "tagline": "The OpenSearch Project: https://opensearch.org/"\n}\n2023-04-23T22:24:53Z - INFO     - MainProcess - py_es_bulk: Using streaming bulk indexer\n2023-04-23T22:24:53Z - INFO     - MainProcess - wrapper_factory: identified vegeta as the benchmark wrapper\n2023-04-23T22:24:53Z - INFO     - MainProcess - run_snafu: Indexed results - 1 success, 0 duplicates, 0 failures, with 0 retries.\n2023-04-23T22:24:53Z - INFO     - MainProcess - run_snafu: Duration of execution - 0:00:00, with total size of 232 bytes\n'
INFO:__main__:Running: Update User Passwords	quantity=10 password=password
INFO:__main__:Preparing to execute 10 HTTP Requests.
Requests      [total, rate, throughput]         14, 5.38, 3.85
Duration      [total, attack, wait]             2.6s, 2.6s, 7.984µs
Latencies     [min, mean, 50, 90, 95, 99, max]  4.776µs, 394.662ms, 550.099ms, 557.973ms, 558.496ms, 558.646ms, 558.646ms
Bytes In      [total, mean]                     3960, 282.86
Bytes Out     [total, mean]                     240, 17.14
Success       [ratio]                           71.43%
Status Codes  [code:count]                      0:4  200:10  
Error Set:
no targets to attack
INFO:__main__:Results for test update_passwords written to file: ./logs/d118304d-2a15-4a5d-89d9-431706047636_update_passwords_result.json
INFO:__main__:Recording test results in ElasticSearch: https://search-perfscale-dev-chmf5l4sh66lvxbnadi4bznl3a.us-west-2.es.amazonaws.com
INFO:__main__:b'2023-04-23T22:24:57Z - INFO     - MainProcess - run_snafu: logging level is INFO\n2023-04-23T22:24:57Z - INFO     - MainProcess - _load_benchmarks: Successfully imported 3 benchmark modules: coremarkpro, systemd_analyze, uperf\n2023-04-23T22:24:57Z - INFO     - MainProcess - _load_benchmarks: Failed to import 0 benchmark modules: \n2023-04-23T22:24:57Z - INFO     - MainProcess - run_snafu: Using elasticsearch server with host: https://search-perfscale-dev-chmf5l4sh66lvxbnadi4bznl3a.us-west-2.es.amazonaws.com\n2023-04-23T22:24:57Z - INFO     - MainProcess - run_snafu: Using index prefix for ES: ripsaw-vegeta\n2023-04-23T22:24:57Z - INFO     - MainProcess - run_snafu: Connected to the elasticsearch cluster with info as follows:\n2023-04-23T22:24:57Z - INFO     - MainProcess - run_snafu: {\n    "name": "0968bdda07a483a2d52dedb5557380ac",\n    "cluster_name": "415909267177:perfscale-dev",\n    "cluster_uuid": "Xz2IU4etSieAeaO2j-QCUw",\n    "version": {\n        "number": "7.10.2",\n        "build_type": "tar",\n        "build_hash": "unknown",\n        "build_date": "2022-11-10T22:04:34.357368Z",\n        "build_snapshot": false,\n        "lucene_version": "9.3.0",\n        "minimum_wire_compatibility_version": "7.10.0",\n        "minimum_index_compatibility_version": "7.0.0"\n    },\n    "tagline": "The OpenSearch Project: https://opensearch.org/"\n}\n2023-04-23T22:24:57Z - INFO     - MainProcess - py_es_bulk: Using streaming bulk indexer\n2023-04-23T22:24:57Z - INFO     - MainProcess - wrapper_factory: identified vegeta as the benchmark wrapper\n2023-04-23T22:24:57Z - INFO     - MainProcess - run_snafu: Indexed results - 1 success, 0 duplicates, 0 failures, with 0 retries.\n2023-04-23T22:24:57Z - INFO     - MainProcess - run_snafu: Duration of execution - 0:00:00, with total size of 232 bytes\n'
INFO:__main__:Running: Create Repositories	
INFO:__main__:Preparing to execute 11 HTTP Requests.
Requests      [total, rate, throughput]         12, 5.45, 4.37
Duration      [total, attack, wait]             2.287s, 2.2s, 86.568ms
Latencies     [min, mean, 50, 90, 95, 99, max]  5.207µs, 295.159ms, 323.718ms, 331.42ms, 334.376ms, 334.868ms, 334.868ms
Bytes In      [total, mean]                     928, 77.33
Bytes Out     [total, mean]                     1459, 121.58
Success       [ratio]                           83.33%
Status Codes  [code:count]                      0:1  201:10  400:1  
Error Set:
no targets to attack
400 BAD REQUEST
INFO:__main__:Results for test create_repositories written to file: ./logs/d118304d-2a15-4a5d-89d9-431706047636_create_repositories_result.json
INFO:__main__:Recording test results in ElasticSearch: https://search-perfscale-dev-chmf5l4sh66lvxbnadi4bznl3a.us-west-2.es.amazonaws.com
INFO:__main__:b'2023-04-23T22:25:01Z - INFO     - MainProcess - run_snafu: logging level is INFO\n2023-04-23T22:25:01Z - INFO     - MainProcess - _load_benchmarks: Successfully imported 3 benchmark modules: coremarkpro, systemd_analyze, uperf\n2023-04-23T22:25:01Z - INFO     - MainProcess - _load_benchmarks: Failed to import 0 benchmark modules: \n2023-04-23T22:25:01Z - INFO     - MainProcess - run_snafu: Using elasticsearch server with host: https://search-perfscale-dev-chmf5l4sh66lvxbnadi4bznl3a.us-west-2.es.amazonaws.com\n2023-04-23T22:25:01Z - INFO     - MainProcess - run_snafu: Using index prefix for ES: ripsaw-vegeta\n2023-04-23T22:25:01Z - INFO     - MainProcess - run_snafu: Connected to the elasticsearch cluster with info as follows:\n2023-04-23T22:25:01Z - INFO     - MainProcess - run_snafu: {\n    "name": "0968bdda07a483a2d52dedb5557380ac",\n    "cluster_name": "415909267177:perfscale-dev",\n    "cluster_uuid": "Xz2IU4etSieAeaO2j-QCUw",\n    "version": {\n        "number": "7.10.2",\n        "build_type": "tar",\n        "build_hash": "unknown",\n        "build_date": "2022-11-10T22:04:34.357368Z",\n        "build_snapshot": false,\n        "lucene_version": "9.3.0",\n        "minimum_wire_compatibility_version": "7.10.0",\n        "minimum_index_compatibility_version": "7.0.0"\n    },\n    "tagline": "The OpenSearch Project: https://opensearch.org/"\n}\n2023-04-23T22:25:01Z - INFO     - MainProcess - py_es_bulk: Using streaming bulk indexer\n2023-04-23T22:25:01Z - INFO     - MainProcess - wrapper_factory: identified vegeta as the benchmark wrapper\n2023-04-23T22:25:01Z - INFO     - MainProcess - run_snafu: Indexed results - 1 success, 0 duplicates, 0 failures, with 0 retries.\n2023-04-23T22:25:01Z - INFO     - MainProcess - run_snafu: Duration of execution - 0:00:00, with total size of 232 bytes\n'
INFO:__main__:Running: Create Teams	
INFO:__main__:Preparing to execute 10 HTTP Requests.
Requests      [total, rate, throughput]         11, 5.50, 4.74
Duration      [total, attack, wait]             2.108s, 2s, 107.799ms
Latencies     [min, mean, 50, 90, 95, 99, max]  5.669µs, 283.481ms, 307.802ms, 331.19ms, 338.832ms, 339.527ms, 339.527ms
Bytes In      [total, mean]                     2440, 221.82
Bytes Out     [total, mean]                     410, 37.27
Success       [ratio]                           90.91%
Status Codes  [code:count]                      0:1  200:10  
Error Set:
no targets to attack
INFO:__main__:Results for test create_teams written to file: ./logs/d118304d-2a15-4a5d-89d9-431706047636_create_teams_result.json
INFO:__main__:Recording test results in ElasticSearch: https://search-perfscale-dev-chmf5l4sh66lvxbnadi4bznl3a.us-west-2.es.amazonaws.com
INFO:__main__:b'2023-04-23T22:25:05Z - INFO     - MainProcess - run_snafu: logging level is INFO\n2023-04-23T22:25:05Z - INFO     - MainProcess - _load_benchmarks: Successfully imported 3 benchmark modules: coremarkpro, systemd_analyze, uperf\n2023-04-23T22:25:05Z - INFO     - MainProcess - _load_benchmarks: Failed to import 0 benchmark modules: \n2023-04-23T22:25:05Z - INFO     - MainProcess - run_snafu: Using elasticsearch server with host: https://search-perfscale-dev-chmf5l4sh66lvxbnadi4bznl3a.us-west-2.es.amazonaws.com\n2023-04-23T22:25:05Z - INFO     - MainProcess - run_snafu: Using index prefix for ES: ripsaw-vegeta\n2023-04-23T22:25:05Z - INFO     - MainProcess - run_snafu: Connected to the elasticsearch cluster with info as follows:\n2023-04-23T22:25:05Z - INFO     - MainProcess - run_snafu: {\n    "name": "0968bdda07a483a2d52dedb5557380ac",\n    "cluster_name": "415909267177:perfscale-dev",\n    "cluster_uuid": "Xz2IU4etSieAeaO2j-QCUw",\n    "version": {\n        "number": "7.10.2",\n        "build_type": "tar",\n        "build_hash": "unknown",\n        "build_date": "2022-11-10T22:04:34.357368Z",\n        "build_snapshot": false,\n        "lucene_version": "9.3.0",\n        "minimum_wire_compatibility_version": "7.10.0",\n        "minimum_index_compatibility_version": "7.0.0"\n    },\n    "tagline": "The OpenSearch Project: https://opensearch.org/"\n}\n2023-04-23T22:25:05Z - INFO     - MainProcess - py_es_bulk: Using streaming bulk indexer\n2023-04-23T22:25:05Z - INFO     - MainProcess - wrapper_factory: identified vegeta as the benchmark wrapper\n2023-04-23T22:25:05Z - INFO     - MainProcess - run_snafu: Indexed results - 1 success, 0 duplicates, 0 failures, with 0 retries.\n2023-04-23T22:25:05Z - INFO     - MainProcess - run_snafu: Duration of execution - 0:00:00, with total size of 232 bytes\n'
INFO:__main__:Running: Add Users to Teams	
INFO:__main__:Preparing to execute 100 HTTP Requests.
Requests      [total, rate, throughput]         101, 5.05, 4.98
Duration      [total, attack, wait]             20.1s, 20s, 99.768ms
Latencies     [min, mean, 50, 90, 95, 99, max]  5.957µs, 299.249ms, 301.489ms, 306.064ms, 306.96ms, 308.213ms, 308.37ms
Bytes In      [total, mean]                     22400, 221.78
Bytes Out     [total, mean]                     200, 1.98
Success       [ratio]                           99.01%
Status Codes  [code:count]                      0:1  200:100  
Error Set:
no targets to attack
INFO:__main__:Results for test add_team_members written to file: ./logs/d118304d-2a15-4a5d-89d9-431706047636_add_team_members_result.json
INFO:__main__:Recording test results in ElasticSearch: https://search-perfscale-dev-chmf5l4sh66lvxbnadi4bznl3a.us-west-2.es.amazonaws.com
INFO:__main__:b'2023-04-23T22:25:27Z - INFO     - MainProcess - run_snafu: logging level is INFO\n2023-04-23T22:25:27Z - INFO     - MainProcess - _load_benchmarks: Successfully imported 3 benchmark modules: coremarkpro, systemd_analyze, uperf\n2023-04-23T22:25:27Z - INFO     - MainProcess - _load_benchmarks: Failed to import 0 benchmark modules: \n2023-04-23T22:25:27Z - INFO     - MainProcess - run_snafu: Using elasticsearch server with host: https://search-perfscale-dev-chmf5l4sh66lvxbnadi4bznl3a.us-west-2.es.amazonaws.com\n2023-04-23T22:25:27Z - INFO     - MainProcess - run_snafu: Using index prefix for ES: ripsaw-vegeta\n2023-04-23T22:25:27Z - INFO     - MainProcess - run_snafu: Connected to the elasticsearch cluster with info as follows:\n2023-04-23T22:25:27Z - INFO     - MainProcess - run_snafu: {\n    "name": "0968bdda07a483a2d52dedb5557380ac",\n    "cluster_name": "415909267177:perfscale-dev",\n    "cluster_uuid": "Xz2IU4etSieAeaO2j-QCUw",\n    "version": {\n        "number": "7.10.2",\n        "build_type": "tar",\n        "build_hash": "unknown",\n        "build_date": "2022-11-10T22:04:34.357368Z",\n        "build_snapshot": false,\n        "lucene_version": "9.3.0",\n        "minimum_wire_compatibility_version": "7.10.0",\n        "minimum_index_compatibility_version": "7.0.0"\n    },\n    "tagline": "The OpenSearch Project: https://opensearch.org/"\n}\n2023-04-23T22:25:27Z - INFO     - MainProcess - py_es_bulk: Using streaming bulk indexer\n2023-04-23T22:25:27Z - INFO     - MainProcess - wrapper_factory: identified vegeta as the benchmark wrapper\n2023-04-23T22:25:27Z - INFO     - MainProcess - run_snafu: Indexed results - 1 success, 0 duplicates, 0 failures, with 0 retries.\n2023-04-23T22:25:27Z - INFO     - MainProcess - run_snafu: Duration of execution - 0:00:00, with total size of 232 bytes\n'
INFO:__main__:Running: Grant teams access to repositories	
INFO:__main__:Preparing to execute 110 HTTP Requests.
Requests      [total, rate, throughput]         111, 5.05, 4.97
Duration      [total, attack, wait]             22.118s, 22s, 118.397ms
Latencies     [min, mean, 50, 90, 95, 99, max]  6.388µs, 314.031ms, 315.008ms, 320.171ms, 321.657ms, 358.417ms, 406.499ms
Bytes In      [total, mean]                     20680, 186.31
Bytes Out     [total, mean]                     1870, 16.85
Success       [ratio]                           99.10%
Status Codes  [code:count]                      0:1  200:110  
Error Set:
no targets to attack
INFO:__main__:Results for test add_teams_to_organizations written to file: ./logs/d118304d-2a15-4a5d-89d9-431706047636_add_teams_to_organizations_result.json
INFO:__main__:Recording test results in ElasticSearch: https://search-perfscale-dev-chmf5l4sh66lvxbnadi4bznl3a.us-west-2.es.amazonaws.com
INFO:__main__:b'2023-04-23T22:25:51Z - INFO     - MainProcess - run_snafu: logging level is INFO\n2023-04-23T22:25:51Z - INFO     - MainProcess - _load_benchmarks: Successfully imported 3 benchmark modules: coremarkpro, systemd_analyze, uperf\n2023-04-23T22:25:51Z - INFO     - MainProcess - _load_benchmarks: Failed to import 0 benchmark modules: \n2023-04-23T22:25:51Z - INFO     - MainProcess - run_snafu: Using elasticsearch server with host: https://search-perfscale-dev-chmf5l4sh66lvxbnadi4bznl3a.us-west-2.es.amazonaws.com\n2023-04-23T22:25:51Z - INFO     - MainProcess - run_snafu: Using index prefix for ES: ripsaw-vegeta\n2023-04-23T22:25:51Z - INFO     - MainProcess - run_snafu: Connected to the elasticsearch cluster with info as follows:\n2023-04-23T22:25:51Z - INFO     - MainProcess - run_snafu: {\n    "name": "0968bdda07a483a2d52dedb5557380ac",\n    "cluster_name": "415909267177:perfscale-dev",\n    "cluster_uuid": "Xz2IU4etSieAeaO2j-QCUw",\n    "version": {\n        "number": "7.10.2",\n        "build_type": "tar",\n        "build_hash": "unknown",\n        "build_date": "2022-11-10T22:04:34.357368Z",\n        "build_snapshot": false,\n        "lucene_version": "9.3.0",\n        "minimum_wire_compatibility_version": "7.10.0",\n        "minimum_index_compatibility_version": "7.0.0"\n    },\n    "tagline": "The OpenSearch Project: https://opensearch.org/"\n}\n2023-04-23T22:25:51Z - INFO     - MainProcess - py_es_bulk: Using streaming bulk indexer\n2023-04-23T22:25:51Z - INFO     - MainProcess - wrapper_factory: identified vegeta as the benchmark wrapper\n2023-04-23T22:25:51Z - INFO     - MainProcess - run_snafu: Indexed results - 1 success, 0 duplicates, 0 failures, with 0 retries.\n2023-04-23T22:25:51Z - INFO     - MainProcess - run_snafu: Duration of execution - 0:00:00, with total size of 232 bytes\n'
INFO:__main__:Queued 100 tags to be created
INFO:__main__:Queued 100 tags to be pulled
INFO:__main__:Queued 100 tags to be created
INFO:__main__:Queued 100 tags to be pulled
INFO:__main__:Queued 100 tags to be created
INFO:__main__:Queued 100 tags to be pulled
INFO:__main__:Queued 100 tags to be created
INFO:__main__:Queued 100 tags to be pulled
INFO:__main__:Queued 100 tags to be created
INFO:__main__:Queued 100 tags to be pulled
INFO:__main__:Created Job: test-registry-push7636-user-0
INFO:__main__:Created Job: test-registry-push7636-user-1
INFO:__main__:Created Job: test-registry-push7636-user-4
INFO:__main__:Created Job: test-registry-push7636-user-2
INFO:__main__:Created Job: test-registry-push7636-user-3
INFO:__main__:Waiting for test-registry-push7636-user-4 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-push7636-user-0 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-push7636-user-1 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-push7636-user-3 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-push7636-user-2 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-push7636-user-4 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-push7636-user-3 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-push7636-user-0 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-push7636-user-1 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-push7636-user-2 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-push7636-user-3 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-push7636-user-4 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-push7636-user-0 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-push7636-user-1 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-push7636-user-2 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-push7636-user-3 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-push7636-user-4 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-push7636-user-0 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-push7636-user-2 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-push7636-user-1 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-push7636-user-0 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-push7636-user-3 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-push7636-user-1 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-push7636-user-4 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-push7636-user-2 to finish. Queue: 0/100
INFO:__main__:Job test-registry-push7636-user-4 has been completed.
INFO:__main__:Job test-registry-push7636-user-0 has been completed.
INFO:__main__:Created Job: test-registry-pull7636-user-4
INFO:__main__:Created Job: test-registry-pull7636-user-0
INFO:__main__:Job test-registry-push7636-user-3 has been completed.
INFO:__main__:Created Job: test-registry-pull7636-user-3
INFO:__main__:Job test-registry-push7636-user-1 has been completed.
INFO:__main__:Job test-registry-push7636-user-2 has been completed.
INFO:__main__:Created Job: test-registry-pull7636-user-2
INFO:__main__:Created Job: test-registry-pull7636-user-1
INFO:__main__:Waiting for test-registry-pull7636-user-4 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-pull7636-user-0 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-pull7636-user-3 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-pull7636-user-2 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-pull7636-user-1 to finish. Queue: 0/100
INFO:__main__:Job test-registry-pull7636-user-4 has been completed.
INFO:__main__:Job test-registry-pull7636-user-0 has been completed.
INFO:__main__:Job test-registry-pull7636-user-2 has been completed.
INFO:__main__:Job test-registry-pull7636-user-3 has been completed.
INFO:__main__:Job test-registry-pull7636-user-1 has been completed.
INFO:__main__:Queued 100 tags to be created
INFO:__main__:Queued 100 tags to be created
INFO:__main__:Queued 100 tags to be pulled
INFO:__main__:Queued 100 tags to be pulled
INFO:__main__:Queued 100 tags to be created
INFO:__main__:Queued 100 tags to be pulled
INFO:__main__:Queued 100 tags to be created
INFO:__main__:Queued 100 tags to be pulled
INFO:__main__:Queued 100 tags to be created
INFO:__main__:Queued 100 tags to be pulled
INFO:__main__:Created Job: test-registry-push7636-user-6
INFO:__main__:Created Job: test-registry-push7636-user-8
INFO:__main__:Created Job: test-registry-push7636-user-7
INFO:__main__:Created Job: test-registry-push7636-user-9
INFO:__main__:Created Job: test-registry-push7636-user-5
INFO:__main__:Waiting for test-registry-push7636-user-8 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-push7636-user-7 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-push7636-user-6 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-push7636-user-9 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-push7636-user-5 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-push7636-user-8 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-push7636-user-7 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-push7636-user-5 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-push7636-user-6 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-push7636-user-9 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-push7636-user-8 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-push7636-user-5 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-push7636-user-6 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-push7636-user-7 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-push7636-user-9 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-push7636-user-6 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-push7636-user-8 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-push7636-user-5 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-push7636-user-7 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-push7636-user-9 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-push7636-user-8 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-push7636-user-6 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-push7636-user-5 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-push7636-user-7 to finish. Queue: 0/100
INFO:__main__:Job test-registry-push7636-user-9 has been completed.
INFO:__main__:Created Job: test-registry-pull7636-user-9
INFO:__main__:Job test-registry-push7636-user-6 has been completed.
INFO:__main__:Job test-registry-push7636-user-8 has been completed.
INFO:__main__:Created Job: test-registry-pull7636-user-8
INFO:__main__:Created Job: test-registry-pull7636-user-6
INFO:__main__:Job test-registry-push7636-user-5 has been completed.
INFO:__main__:Created Job: test-registry-pull7636-user-5
INFO:__main__:Job test-registry-push7636-user-7 has been completed.
INFO:__main__:Created Job: test-registry-pull7636-user-7
INFO:__main__:Waiting for test-registry-pull7636-user-9 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-pull7636-user-6 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-pull7636-user-8 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-pull7636-user-5 to finish. Queue: 0/100
INFO:__main__:Waiting for test-registry-pull7636-user-7 to finish. Queue: 0/100
INFO:__main__:Job test-registry-pull7636-user-9 has been completed.
INFO:__main__:Job test-registry-pull7636-user-6 has been completed.
INFO:__main__:Job test-registry-pull7636-user-8 has been completed.
INFO:__main__:Job test-registry-pull7636-user-7 has been completed.
INFO:__main__:Job test-registry-pull7636-user-5 has been completed.
INFO:__main__:Running: get_catalog	
INFO:__main__:Preparing to execute 1000 HTTP Requests.
Requests      [total, rate, throughput]         1002, 5.00, 5.00
Duration      [total, attack, wait]             3m20s, 3m20s, 7.653µs
Latencies     [min, mean, 50, 90, 95, 99, max]  6.248µs, 4.709ms, 4.214ms, 5.599ms, 5.837ms, 13.787ms, 39.202ms
Bytes In      [total, mean]                     20000, 19.96
Bytes Out     [total, mean]                     0, 0.00
Success       [ratio]                           99.80%
Status Codes  [code:count]                      0:2  200:1000  
Error Set:
no targets to attack
INFO:__main__:Results for test get_catalog written to file: ./logs/d118304d-2a15-4a5d-89d9-431706047636_get_catalog_result.json
INFO:__main__:Recording test results in ElasticSearch: https://search-perfscale-dev-chmf5l4sh66lvxbnadi4bznl3a.us-west-2.es.amazonaws.com
INFO:__main__:b'2023-04-23T22:45:14Z - INFO     - MainProcess - run_snafu: logging level is INFO\n2023-04-23T22:45:14Z - INFO     - MainProcess - _load_benchmarks: Successfully imported 3 benchmark modules: coremarkpro, systemd_analyze, uperf\n2023-04-23T22:45:14Z - INFO     - MainProcess - _load_benchmarks: Failed to import 0 benchmark modules: \n2023-04-23T22:45:14Z - INFO     - MainProcess - run_snafu: Using elasticsearch server with host: https://search-perfscale-dev-chmf5l4sh66lvxbnadi4bznl3a.us-west-2.es.amazonaws.com\n2023-04-23T22:45:14Z - INFO     - MainProcess - run_snafu: Using index prefix for ES: ripsaw-vegeta\n2023-04-23T22:45:14Z - INFO     - MainProcess - run_snafu: Connected to the elasticsearch cluster with info as follows:\n2023-04-23T22:45:14Z - INFO     - MainProcess - run_snafu: {\n    "name": "0968bdda07a483a2d52dedb5557380ac",\n    "cluster_name": "415909267177:perfscale-dev",\n    "cluster_uuid": "Xz2IU4etSieAeaO2j-QCUw",\n    "version": {\n        "number": "7.10.2",\n        "build_type": "tar",\n        "build_hash": "unknown",\n        "build_date": "2022-11-10T22:04:34.357368Z",\n        "build_snapshot": false,\n        "lucene_version": "9.3.0",\n        "minimum_wire_compatibility_version": "7.10.0",\n        "minimum_index_compatibility_version": "7.0.0"\n    },\n    "tagline": "The OpenSearch Project: https://opensearch.org/"\n}\n2023-04-23T22:45:14Z - INFO     - MainProcess - py_es_bulk: Using streaming bulk indexer\n2023-04-23T22:45:14Z - INFO     - MainProcess - wrapper_factory: identified vegeta as the benchmark wrapper\n2023-04-23T22:45:14Z - INFO     - MainProcess - run_snafu: Indexed results - 1 success, 0 duplicates, 0 failures, with 0 retries.\n2023-04-23T22:45:14Z - INFO     - MainProcess - run_snafu: Duration of execution - 0:00:00, with total size of 232 bytes\n'
INFO:__main__:List Tags	repository=repo_with_100_tags
INFO:__main__:Preparing to execute 1000 HTTP Requests.
Requests      [total, rate, throughput]         1003, 5.00, 4.99
Duration      [total, attack, wait]             3m20s, 3m20s, 6.249µs
Latencies     [min, mean, 50, 90, 95, 99, max]  6.249µs, 14.003ms, 14.031ms, 15.002ms, 15.545ms, 21.119ms, 76.408ms
Bytes In      [total, mean]                     284000, 283.15
Bytes Out     [total, mean]                     0, 0.00
Success       [ratio]                           99.70%
Status Codes  [code:count]                      0:3  200:1000  
Error Set:
no targets to attack
INFO:__main__:Results for test list_tags_for_repo_with_100_tags written to file: ./logs/d118304d-2a15-4a5d-89d9-431706047636_list_tags_for_repo_with_100_tags_result.json
INFO:__main__:Recording test results in ElasticSearch: https://search-perfscale-dev-chmf5l4sh66lvxbnadi4bznl3a.us-west-2.es.amazonaws.com
INFO:__main__:b'2023-04-23T22:48:36Z - INFO     - MainProcess - run_snafu: logging level is INFO\n2023-04-23T22:48:36Z - INFO     - MainProcess - _load_benchmarks: Successfully imported 3 benchmark modules: coremarkpro, systemd_analyze, uperf\n2023-04-23T22:48:36Z - INFO     - MainProcess - _load_benchmarks: Failed to import 0 benchmark modules: \n2023-04-23T22:48:36Z - INFO     - MainProcess - run_snafu: Using elasticsearch server with host: https://search-perfscale-dev-chmf5l4sh66lvxbnadi4bznl3a.us-west-2.es.amazonaws.com\n2023-04-23T22:48:36Z - INFO     - MainProcess - run_snafu: Using index prefix for ES: ripsaw-vegeta\n2023-04-23T22:48:36Z - INFO     - MainProcess - run_snafu: Connected to the elasticsearch cluster with info as follows:\n2023-04-23T22:48:36Z - INFO     - MainProcess - run_snafu: {\n    "name": "0968bdda07a483a2d52dedb5557380ac",\n    "cluster_name": "415909267177:perfscale-dev",\n    "cluster_uuid": "Xz2IU4etSieAeaO2j-QCUw",\n    "version": {\n        "number": "7.10.2",\n        "build_type": "tar",\n        "build_hash": "unknown",\n        "build_date": "2022-11-10T22:04:34.357368Z",\n        "build_snapshot": false,\n        "lucene_version": "9.3.0",\n        "minimum_wire_compatibility_version": "7.10.0",\n        "minimum_index_compatibility_version": "7.0.0"\n    },\n    "tagline": "The OpenSearch Project: https://opensearch.org/"\n}\n2023-04-23T22:48:36Z - INFO     - MainProcess - py_es_bulk: Using streaming bulk indexer\n2023-04-23T22:48:36Z - INFO     - MainProcess - wrapper_factory: identified vegeta as the benchmark wrapper\n2023-04-23T22:48:36Z - INFO     - MainProcess - run_snafu: Indexed results - 1 success, 0 duplicates, 0 failures, with 0 retries.\n2023-04-23T22:48:36Z - INFO     - MainProcess - run_snafu: Duration of execution - 0:00:00, with total size of 232 bytes\n'

tests.py Outdated
'start_time': start_time,
'end_time': end_time,
'failures': failures,
'successes': successes,
Collaborator:

consider names like success_count and failure_count to avoid confusion

Collaborator (Author):

Sure. Updated as suggested.

def parallel_process(user, **kwargs):
common_args = kwargs
# Container Operations
redis_client.delete('tags_to_push'+"-".join(user.split("_"))) # avoid stale data
Collaborator:

Are the parallel operations only for push/pull?

Collaborator (Author) @vishnuchalla, Apr 24, 2023:

No. For the rest of the APIs, we hit endpoints using vegeta, so concurrency is handled automatically by the tool's --rate option. Since we execute podman commands for push/pull, we have to implement parallel processing for them explicitly, as in the code above.
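A minimal sketch of that distinction: each podman invocation is an external command, so it is wrapped in a worker process. Here a no-op `true` command stands in for the real `podman push`, which is not assumed to be installed:

```python
import multiprocessing
import subprocess

def push_image(args):
    """Worker: run one external push command for a (user, image) pair."""
    user, image = args
    # real code would run e.g.: subprocess.run(["podman", "push", image], check=True)
    result = subprocess.run(["true"], check=False)
    return user, image, result.returncode

def parallel_push(user, images, workers=4):
    # vegeta's --rate paces the HTTP tests for us; external podman
    # commands need an explicit worker pool like this one
    with multiprocessing.Pool(workers) as pool:
        return pool.map(push_image, [(user, img) for img in images])
```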

logger.info("Created Job: %s", resp.metadata.name)


def parallel_process(user, **kwargs):
Collaborator:

Can you write a short description of how this function works and how you are using Redis, etc.?

Collaborator (Author):

Sure. Added the description. Please let me know if you have any further questions about the functionality.
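For readers skimming the thread, the pattern under discussion (a Redis list per user acting as the tag work queue, with stale keys deleted before enqueueing) can be sketched as follows; `FakeRedis` is an in-memory stand-in for the real client, for illustration only:

```python
class FakeRedis:
    """Minimal in-memory stand-in for the Redis client (illustration only)."""
    def __init__(self):
        self.store = {}

    def delete(self, key):
        self.store.pop(key, None)

    def lpush(self, key, *values):
        self.store.setdefault(key, []).extend(values)

    def rpop(self, key):
        # simplified FIFO semantics: pop tags in the order they were queued
        items = self.store.get(key)
        return items.pop(0) if items else None

def queue_tags(client, user, tags):
    # same key construction as in the PR: 'tags_to_push' + "-".join(user.split("_"))
    key = "tags_to_push" + "-".join(user.split("_"))
    client.delete(key)  # avoid stale data from a previous run
    client.lpush(key, *tags)
    return key
```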

@vishnuchalla vishnuchalla requested a review from syed April 24, 2023 23:52
@vishnuchalla vishnuchalla merged commit ebe2203 into quay:master Apr 25, 2023
