Commit 6a92d4b

[AT][QATM-1108] Update testsAT (apache#195)
1 parent 5adf9c7 · commit 6a92d4b

27 files changed (+70 −59 lines)

Jenkinsfile

Lines changed: 3 additions & 1 deletion

@@ -43,9 +43,11 @@ hose {
 INSTALLPARAMETERS = """
 | -DDCOS_CLI_HOST=%%DCOSCLI#0
 | -DDCOS_IP=10.200.0.156
-| -DPEM_PATH=/paascerts/PaasIntegration.pem
+| -DPEM_PATH=src/test/resources/credentials/PaasIntegration.pem
 | -DBOOTSTRAP_IP=10.200.0.155
 | -DSPARK_DOCKER_IMAGE=qa.stratio.com/stratio/stratio-spark
+| -DSPARK_HISTORY_SERVER_DOCKER_IMAGE=qa.stratio.com/stratio/spark-stratio-history-server
+| -DSPARK_DRIVER_DOCKER_IMAGE=qa.stratio.com/stratio/spark-stratio-driver
 | -DSTRATIO_SPARK_VERSION=%%VERSION
 | -DCLUSTER_ID=nightly
 | -DSPARK_COVERAGE_IMAGE=qa.stratio.com/stratio/stratio-spark-coverage
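
Besides moving PEM_PATH from the node-local /paascerts directory to a path inside the test resources, the hunk introduces two dedicated image properties, SPARK_HISTORY_SERVER_DOCKER_IMAGE and SPARK_DRIVER_DOCKER_IMAGE, so the history-server and driver tests no longer have to reuse SPARK_DOCKER_IMAGE. A minimal sketch of how such a property block reaches the suite, assuming INSTALLPARAMETERS is appended to the Maven command that runs the acceptance tests (the 'verify' goal and the version value are illustrative placeholders, not taken from this commit):

    # hypothetical invocation; goal and version are placeholders only
    mvn verify \
      -DSPARK_DOCKER_IMAGE=qa.stratio.com/stratio/stratio-spark \
      -DSPARK_HISTORY_SERVER_DOCKER_IMAGE=qa.stratio.com/stratio/spark-stratio-history-server \
      -DSPARK_DRIVER_DOCKER_IMAGE=qa.stratio.com/stratio/spark-stratio-driver \
      -DSTRATIO_SPARK_VERSION=2.2.0-SNAPSHOT \
      -DCLUSTER_ID=nightly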

testsAT/src/test/resources/features/pf/coverage/elastic-coverage.feature

Lines changed: 1 addition & 1 deletion

@@ -20,7 +20,7 @@ Feature: [Spark Elastic Coverage] Elastic Coverage tests
 When I send a 'POST' request to '/service/${SPARK_FW_NAME}/v1/submissions/create' based on 'schemas/pf/SparkCoverage/elastic_curl.json' as 'json' with:
 | $.appResource | UPDATE | http://spark-coverage.marathon.mesos:9000/jobs/elastic-${COVERAGE_VERSION}.jar | n/a |
 | $.sparkProperties['spark.jars'] | UPDATE | http://spark-coverage.marathon.mesos:9000/jobs/elastic-${COVERAGE_VERSION}.jar | n/a |
-| $.sparkProperties['spark.mesos.executor.docker.image'] | UPDATE | ${SPARK_DOCKER_IMAGE}:${STRATIO_SPARK_VERSION} | n/a |
+| $.sparkProperties['spark.mesos.executor.docker.image'] | UPDATE | ${SPARK_DRIVER_DOCKER_IMAGE:-qa.stratio.com/stratio/spark-stratio-driver}:${STRATIO_SPARK_VERSION} | n/a |

 Then the service response status must be '200' and its response must contain the text '"success" : true'
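
The only change in each coverage feature is the executor image row: ${SPARK_DOCKER_IMAGE} becomes ${SPARK_DRIVER_DOCKER_IMAGE:-qa.stratio.com/stratio/spark-stratio-driver}, a shell-style default so the test still resolves to the qa.stratio.com driver image when the property is not supplied. Assuming the test harness expands the placeholder with Bash semantics, the fallback behaves like this (the custom registry value below is a made-up example):

    # unset -> the hard-coded default is used
    unset SPARK_DRIVER_DOCKER_IMAGE
    echo "${SPARK_DRIVER_DOCKER_IMAGE:-qa.stratio.com/stratio/spark-stratio-driver}"
    # qa.stratio.com/stratio/spark-stratio-driver

    # set -> the supplied value wins
    SPARK_DRIVER_DOCKER_IMAGE=registry.example.com/team/spark-driver
    echo "${SPARK_DRIVER_DOCKER_IMAGE:-qa.stratio.com/stratio/spark-stratio-driver}"
    # registry.example.com/team/spark-driver

The same ${VAR:-default} pattern is applied to SPARK_COVERAGE_IMAGE, SPARK_DOCKER_IMAGE and SPARK_HISTORY_SERVER_DOCKER_IMAGE in the files below.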

testsAT/src/test/resources/features/pf/coverage/hdfs-coverage.feature

Lines changed: 1 addition & 1 deletion

@@ -20,7 +20,7 @@ Feature: [Spark HDFS Coverage] HDFS Coverage tests
 When I send a 'POST' request to '/service/${SPARK_FW_NAME}/v1/submissions/create' based on 'schemas/pf/SparkCoverage/hdfs_curl.json' as 'json' with:
 | $.appResource | UPDATE | http://spark-coverage.marathon.mesos:9000/jobs/hdfs-${COVERAGE_VERSION}.jar | n/a |
 | $.sparkProperties['spark.jars'] | UPDATE | http://spark-coverage.marathon.mesos:9000/jobs/hdfs-${COVERAGE_VERSION}.jar | n/a |
-| $.sparkProperties['spark.mesos.executor.docker.image'] | UPDATE | ${SPARK_DOCKER_IMAGE}:${STRATIO_SPARK_VERSION} | n/a |
+| $.sparkProperties['spark.mesos.executor.docker.image'] | UPDATE | ${SPARK_DRIVER_DOCKER_IMAGE:-qa.stratio.com/stratio/spark-stratio-driver}:${STRATIO_SPARK_VERSION} | n/a |
 | $.sparkProperties['spark.mesos.driverEnv.SPARK_SECURITY_HDFS_CONF_URI'] | UPDATE | http://spark-coverage.marathon.mesos:9000/configs/${CLUSTER_ID} | n/a |

 Then the service response status must be '200' and its response must contain the text '"success" : true'

testsAT/src/test/resources/features/pf/coverage/install-spark-coverage.feature

Lines changed: 1 addition & 1 deletion

@@ -6,7 +6,7 @@ Feature: [Install Spark Coverage] Installing Spark Coverage

 Scenario:[Spark dispatcher Installation][01] Installing Spark Coverage (for testing purposes)
 Given I create file 'SparkCoverage.json' based on 'schemas/pf/SparkCoverage/SparkCoverage.json' as 'json' with:
-| $.container.docker.image | UPDATE | ${SPARK_COVERAGE_IMAGE}:${COVERAGE_VERSION} | n/a |
+| $.container.docker.image | UPDATE | ${SPARK_COVERAGE_IMAGE:-qa.stratio.com/stratio/stratio-spark-coverage}:${COVERAGE_VERSION} | n/a |

 When I outbound copy 'target/test-classes/SparkCoverage.json' through a ssh connection to '/dcos'

testsAT/src/test/resources/features/pf/coverage/kafka-coverage.feature

Lines changed: 1 addition & 1 deletion

@@ -20,7 +20,7 @@ Feature: [Spark Kafka Coverage] Kafka Coverage tests
 When I send a 'POST' request to '/service/${SPARK_FW_NAME}/v1/submissions/create' based on 'schemas/pf/SparkCoverage/kafka_curl.json' as 'json' with:
 | $.appResource | UPDATE | http://spark-coverage.marathon.mesos:9000/jobs/kafka-${COVERAGE_VERSION}.jar | n/a |
 | $.sparkProperties['spark.jars'] | UPDATE | http://spark-coverage.marathon.mesos:9000/jobs/kafka-${COVERAGE_VERSION}.jar | n/a |
-| $.sparkProperties['spark.mesos.executor.docker.image'] | UPDATE | ${SPARK_DOCKER_IMAGE}:${STRATIO_SPARK_VERSION} | n/a |
+| $.sparkProperties['spark.mesos.executor.docker.image'] | UPDATE | ${SPARK_DRIVER_DOCKER_IMAGE:-qa.stratio.com/stratio/spark-stratio-driver}:${STRATIO_SPARK_VERSION} | n/a |
 | $.appArgs[0] | UPDATE | gosec1.node.paas.labs.stratio.com:9092 | n/a |

 Then the service response status must be '200' and its response must contain the text '"success" : true'

testsAT/src/test/resources/features/pf/coverage/postgres-coverage.feature

Lines changed: 1 addition & 1 deletion

@@ -30,7 +30,7 @@ Feature: [Spark Postgres Coverage] Postgres Coverage tests
 When I send a 'POST' request to '/service/${SPARK_FW_NAME}/v1/submissions/create' based on 'schemas/pf/SparkCoverage/postgres_curl.json' as 'json' with:
 | $.appResource | UPDATE | http://spark-coverage.marathon.mesos:9000/jobs/postgres-${COVERAGE_VERSION}.jar | n/a |
 | $.sparkProperties['spark.jars'] | UPDATE | http://spark-coverage.marathon.mesos:9000/jobs/postgres-${COVERAGE_VERSION}.jar | n/a |
-| $.sparkProperties['spark.mesos.executor.docker.image'] | UPDATE | ${SPARK_DOCKER_IMAGE}:${STRATIO_SPARK_VERSION} | n/a |
+| $.sparkProperties['spark.mesos.executor.docker.image'] | UPDATE | ${SPARK_DRIVER_DOCKER_IMAGE:-qa.stratio.com/stratio/spark-stratio-driver}:${STRATIO_SPARK_VERSION} | n/a |
 | $.appArgs[0] | UPDATE | ${POSTGRES_INSTANCE} | n/a |

 Then the service response status must be '200' and its response must contain the text '"success" : true'

testsAT/src/test/resources/features/pf/coverage/streaming-hdfs-dynamic.feature

Lines changed: 1 addition & 1 deletion

@@ -20,7 +20,7 @@ Feature: [HDFS Streaming HDFS Dynamic Coverage] Streaming HDFS Dynamic Coverage
 When I send a 'POST' request to '/service/${SPARK_FW_NAME}/v1/submissions/create' based on 'schemas/pf/SparkCoverage/streaming_hdfs_dynamic_curl.json' as 'json' with:
 | $.appResource | UPDATE | http://spark-coverage.marathon.mesos:9000/jobs/streaming-hdfs-dynamic-${COVERAGE_VERSION}.jar | n/a |
 | $.sparkProperties['spark.jars'] | UPDATE | http://spark-coverage.marathon.mesos:9000/jobs/streaming-hdfs-dynamic-${COVERAGE_VERSION}.jar | n/a |
-| $.sparkProperties['spark.mesos.executor.docker.image'] | UPDATE | ${SPARK_DOCKER_IMAGE}:${STRATIO_SPARK_VERSION} | n/a |
+| $.sparkProperties['spark.mesos.executor.docker.image'] | UPDATE | ${SPARK_DRIVER_DOCKER_IMAGE:-qa.stratio.com/stratio/spark-stratio-driver}:${STRATIO_SPARK_VERSION} | n/a |
 | $.sparkProperties['spark.mesos.driverEnv.SPARK_SECURITY_HDFS_CONF_URI'] | UPDATE | http://spark-coverage.marathon.mesos:9000/configs/${CLUSTER_ID} | n/a |

 Then the service response status must be '200' and its response must contain the text '"success" : true'

testsAT/src/test/resources/features/pf/coverage/structured-streaming-coverage.feature

Lines changed: 1 addition & 1 deletion

@@ -21,7 +21,7 @@ Feature: [HDFS Structured Streaming Coverage] Structured Streaming Coverage test
 When I send a 'POST' request to '/service/${SPARK_FW_NAME}/v1/submissions/create' based on 'schemas/pf/SparkCoverage/structured_streaming_curl.json' as 'json' with:
 | $.appResource | UPDATE | http://spark-coverage.marathon.mesos:9000/jobs/structured-streaming-${COVERAGE_VERSION}.jar | n/a |
 | $.sparkProperties['spark.jars'] | UPDATE | http://spark-coverage.marathon.mesos:9000/jobs/structured-streaming-${COVERAGE_VERSION}.jar | n/a |
-| $.sparkProperties['spark.mesos.executor.docker.image'] | UPDATE | ${SPARK_DOCKER_IMAGE}:${STRATIO_SPARK_VERSION} | n/a |
+| $.sparkProperties['spark.mesos.executor.docker.image'] | UPDATE | ${SPARK_DRIVER_DOCKER_IMAGE:-qa.stratio.com/stratio/spark-stratio-driver}:${STRATIO_SPARK_VERSION} | n/a |
 | $.appArgs[0] | UPDATE | gosec1.node.paas.labs.stratio.com:9092 | n/a |

 Then the service response status must be '200' and its response must contain the text '"success" : true'

testsAT/src/test/resources/features/pf/dispatcherAT/installation.feature

Lines changed: 1 addition & 1 deletion

@@ -13,7 +13,7 @@ Feature: [Install Spark Dispatcher] Installing Spark Dispatcher

 #Start image from JSON
 And I run 'dcos package describe --app --options=/dcos/SparkDispatcherInstallation.json spark-dispatcher > /dcos/SparkDispatcherInstallationMarathon.json' in the ssh connection
-And I run 'sed -i -e 's|"image":.*|"image": "${SPARK_DOCKER_IMAGE}:${STRATIO_SPARK_VERSION}",|g' /dcos/SparkDispatcherInstallationMarathon.json' in the ssh connection
+And I run 'sed -i -e 's|"image":.*|"image": "${SPARK_DOCKER_IMAGE:-qa.stratio.com/stratio/stratio-spark}:${STRATIO_SPARK_VERSION}",|g' /dcos/SparkDispatcherInstallationMarathon.json' in the ssh connection
 And I run 'dcos marathon app add /dcos/SparkDispatcherInstallationMarathon.json' in the ssh connection

 #Check Spark-fw is Running
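
Here the sed step that patches the Marathon app definition produced by dcos package describe gains the same fallback, ${SPARK_DOCKER_IMAGE:-qa.stratio.com/stratio/stratio-spark}. A sketch of the effect on the generated JSON, assuming the property is unset and using a placeholder version (both the original image value and the version are illustrative, not part of this commit):

    # before (whatever the package template emitted), e.g.:
    #   "image": "stratio/spark-dispatcher:latest",
    # after variable substitution the step runs, roughly:
    sed -i -e 's|"image":.*|"image": "qa.stratio.com/stratio/stratio-spark:2.2.0",|g' /dcos/SparkDispatcherInstallationMarathon.json
    # leaving:
    #   "image": "qa.stratio.com/stratio/stratio-spark:2.2.0",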

testsAT/src/test/resources/features/pf/historyServerAT/installation.feature

Lines changed: 1 addition & 1 deletion

@@ -14,7 +14,7 @@ Feature: [Install Spark History Server] Installing Spark History Server

 #Start image from JSON
 And I run 'dcos package describe --app --options=/dcos/SparkHistoryServerInstallation.json spark-history-server > /dcos/SparkHistoryServerInstallationMarathon.json' in the ssh connection
-And I run 'sed -i -e 's|"image":.*|"image": "qa.stratio.com/stratio/spark-stratio-history-server:${STRATIO_SPARK_VERSION}",|g' /dcos/SparkHistoryServerInstallationMarathon.json' in the ssh connection
+And I run 'sed -i -e 's|"image":.*|"image": "${SPARK_HISTORY_SERVER_DOCKER_IMAGE:-qa.stratio.com/stratio/spark-stratio-history-server}:${STRATIO_SPARK_VERSION}",|g' /dcos/SparkHistoryServerInstallationMarathon.json' in the ssh connection
 And I run 'dcos marathon app add /dcos/SparkHistoryServerInstallationMarathon.json' in the ssh connection

 #Check Spark-history-server is Running
