
Commit 2218af6

update samples from Release-3 as a part of 1.2.0 SDK stable release

1 parent 0401128

File tree

18 files changed: +450 −240 lines


configuration.ipynb

Lines changed: 1 addition & 1 deletion
@@ -103,7 +103,7 @@
     "source": [
     "import azureml.core\n",
     "\n",
-    "print(\"This notebook was created using version 1.1.5 of the Azure ML SDK\")\n",
+    "print(\"This notebook was created using version 1.2.0 of the Azure ML SDK\")\n",
     "print(\"You are currently using version\", azureml.core.VERSION, \"of the Azure ML SDK\")"
    ]
   },
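The bumped version string above is only a comparison baseline for the notebook's self-check. A minimal sketch of such a check, using the standard library only (the helper names here are my own, not from the Azure ML SDK):

```python
def parse_version(v: str) -> tuple:
    """Turn a dotted version string like '1.2.0' into a comparable tuple."""
    return tuple(int(part) for part in v.split("."))

def notebook_version_matches(installed: str, authored: str = "1.2.0") -> bool:
    """True when the installed SDK is at least the version the notebook was authored against."""
    return parse_version(installed) >= parse_version(authored)

print(notebook_version_matches("1.2.0"))  # True
print(notebook_version_matches("1.1.5"))  # False: older than the 1.2.0 baseline
```

Tuple comparison handles multi-digit components correctly (1.10.0 sorts after 1.2.0), which naive string comparison would get wrong.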

how-to-use-azureml/automated-machine-learning/automl_env.yml

Lines changed: 3 additions & 2 deletions
@@ -13,7 +13,7 @@ dependencies:
 - scipy>=1.0.0,<=1.1.0
 - scikit-learn>=0.19.0,<=0.20.3
 - pandas>=0.22.0,<=0.23.4
-- py-xgboost<=0.80
+- py-xgboost<=0.90
 - fbprophet==0.5
 - pytorch=1.1.0
 - cudatoolkit=9.0
@@ -33,5 +33,6 @@ dependencies:
 - https://aka.ms/automl-resources/packages/en_core_web_sm-2.1.0.tar.gz

 channels:
+- anaconda
 - conda-forge
-- pytorch
+- pytorch
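The pin change above relaxes the xgboost upper bound from 0.80 to 0.90. A tiny illustrative sketch of how such an upper-bound constraint is evaluated on simple numeric versions (standard library only; real conda/pip version comparison handles many more cases, such as pre-release tags):

```python
def satisfies_upper_bound(version: str, bound: str) -> bool:
    """Check 'version <= bound' by comparing dot-separated numeric parts."""
    parse = lambda v: tuple(int(p) for p in v.split("."))
    return parse(version) <= parse(bound)

print(satisfies_upper_bound("0.90", "0.90"))  # True: the new pin now admits 0.90
print(satisfies_upper_bound("1.0", "0.90"))   # False: newer releases stay excluded
```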

how-to-use-azureml/automated-machine-learning/automl_env_mac.yml

Lines changed: 1 addition & 0 deletions
@@ -34,5 +34,6 @@ dependencies:
 - https://aka.ms/automl-resources/packages/en_core_web_sm-2.1.0.tar.gz

 channels:
+- anaconda
 - conda-forge
 - pytorch

how-to-use-azureml/automated-machine-learning/forecasting-beer-remote/auto-ml-forecasting-beer-remote.yml

Lines changed: 2 additions & 2 deletions
@@ -1,10 +1,10 @@
 name: auto-ml-forecasting-beer-remote
 dependencies:
 - fbprophet==0.5
-- py-xgboost<=0.80
+- numpy==1.16.2
+- py-xgboost<=0.90
 - pip:
   - azureml-sdk
-  - numpy==1.16.2
   - azureml-train-automl
   - azureml-widgets
   - matplotlib
Lines changed: 2 additions & 2 deletions
@@ -1,10 +1,10 @@
 name: auto-ml-forecasting-bike-share
 dependencies:
 - fbprophet==0.5
-- py-xgboost<=0.80
+- numpy==1.16.2
+- py-xgboost<=0.90
 - pip:
   - azureml-sdk
-  - numpy==1.16.2
   - azureml-train-automl
   - azureml-widgets
   - matplotlib

how-to-use-azureml/automated-machine-learning/forecasting-energy-demand/auto-ml-forecasting-energy-demand.yml

Lines changed: 0 additions & 1 deletion
@@ -2,7 +2,6 @@ name: auto-ml-forecasting-energy-demand
 dependencies:
 - pip:
   - azureml-sdk
-  - numpy==1.16.2
   - azureml-train-automl
   - azureml-widgets
   - matplotlib
Lines changed: 2 additions & 2 deletions
@@ -1,10 +1,10 @@
 name: auto-ml-forecasting-function
 dependencies:
 - fbprophet==0.5
-- py-xgboost<=0.80
+- numpy==1.16.2
+- py-xgboost<=0.90
 - pip:
   - azureml-sdk
-  - numpy==1.16.2
   - azureml-train-automl
   - azureml-widgets
   - matplotlib

how-to-use-azureml/automated-machine-learning/forecasting-orange-juice-sales/auto-ml-forecasting-orange-juice-sales.yml

Lines changed: 2 additions & 2 deletions
@@ -1,10 +1,10 @@
 name: auto-ml-forecasting-orange-juice-sales
 dependencies:
 - fbprophet==0.5
-- py-xgboost<=0.80
+- numpy==1.16.2
+- py-xgboost<=0.90
 - pip:
   - azureml-sdk
-  - numpy==1.16.2
   - pandas==0.23.4
   - azureml-train-automl
   - azureml-widgets

how-to-use-azureml/automated-machine-learning/local-run-classification-credit-card-fraud/auto-ml-classification-credit-card-fraud-local.ipynb

Lines changed: 131 additions & 4 deletions
@@ -49,7 +49,9 @@
     "2. Configure AutoML using `AutoMLConfig`.\n",
     "3. Train the model.\n",
     "4. Explore the results.\n",
-    "5. Test the fitted model."
+    "5. Visualize the model's feature importance in the widget.\n",
+    "6. Explore any model's explanation.\n",
+    "7. Test the fitted model."
    ]
   },
   {
@@ -71,13 +73,13 @@
     "\n",
     "from matplotlib import pyplot as plt\n",
     "import pandas as pd\n",
-    "import os\n",
     "\n",
     "import azureml.core\n",
     "from azureml.core.experiment import Experiment\n",
     "from azureml.core.workspace import Workspace\n",
     "from azureml.core.dataset import Dataset\n",
-    "from azureml.train.automl import AutoMLConfig"
+    "from azureml.train.automl import AutoMLConfig\n",
+    "from azureml.explain.model._internal.explanation_client import ExplanationClient"
    ]
   },
   {
@@ -262,6 +264,131 @@
     "The fitted_model is a python object and you can read the different properties of the object.\n"
    ]
   },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Best model's explanation\n",
+    "Retrieve the explanation from the best_run, which includes explanations for engineered features and raw features.\n",
+    "\n",
+    "#### Download engineered feature importance from artifact store\n",
+    "You can use ExplanationClient to download the engineered feature explanations from the artifact store of the best_run."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "client = ExplanationClient.from_run(best_run)\n",
+    "engineered_explanations = client.download_model_explanation(raw=False)\n",
+    "print(engineered_explanations.get_feature_importance_dict())"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Explanations\n",
+    "In this section, we show how to compute model explanations and visualize them using the azureml-explain-model package. Besides retrieving an existing model explanation for an AutoML model, you can also explain your AutoML model with different test data. The following steps allow you to compute and visualize engineered feature importance based on your test data."
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "#### Retrieve any other AutoML model from training"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "automl_run, fitted_model = local_run.get_output(metric='accuracy')"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "#### Set up the model explanations for AutoML models\n",
+    "The fitted_model can generate the following, which are used for getting the engineered explanations via automl_setup_model_explanations:\n",
+    "\n",
+    "1. Featurized data from the train and test samples\n",
+    "2. The engineered feature name lists\n",
+    "3. The classes in your labeled column, in classification scenarios\n",
+    "\n",
+    "The automl_explainer_setup_obj contains all the structures from the above list."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "X_train = training_data.drop_columns(columns=[label_column_name])\n",
+    "y_train = training_data.keep_columns(columns=[label_column_name], validate=True)\n",
+    "X_test = validation_data.drop_columns(columns=[label_column_name])"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "from azureml.train.automl.runtime.automl_explain_utilities import automl_setup_model_explanations\n",
+    "\n",
+    "automl_explainer_setup_obj = automl_setup_model_explanations(fitted_model, X=X_train, \n",
+    "                                                             X_test=X_test, y=y_train, \n",
+    "                                                             task='classification')"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "#### Initialize the Mimic Explainer for feature importance\n",
+    "For explaining the AutoML models, use the MimicWrapper from the azureml.explain.model package. The MimicWrapper can be initialized with fields from automl_explainer_setup_obj, your workspace, and a LightGBM model that acts as a surrogate for explaining the AutoML model (fitted_model here). The MimicWrapper also takes the automl_run object, where the engineered explanations will be uploaded."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "from azureml.explain.model.mimic.models.lightgbm_model import LGBMExplainableModel\n",
+    "from azureml.explain.model.mimic_wrapper import MimicWrapper\n",
+    "explainer = MimicWrapper(ws, automl_explainer_setup_obj.automl_estimator, LGBMExplainableModel, \n",
+    "                         init_dataset=automl_explainer_setup_obj.X_transform, run=automl_run,\n",
+    "                         features=automl_explainer_setup_obj.engineered_feature_names, \n",
+    "                         feature_maps=[automl_explainer_setup_obj.feature_map],\n",
+    "                         classes=automl_explainer_setup_obj.classes)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "#### Use the Mimic Explainer for computing and visualizing engineered feature importance\n",
+    "The explain() method in MimicWrapper can be called with the transformed test samples to get the feature importance for the generated engineered features."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "engineered_explanations = explainer.explain(['local', 'global'], eval_dataset=automl_explainer_setup_obj.X_test_transform)\n",
+    "print(engineered_explanations.get_feature_importance_dict())\n"
+   ]
+  },
   {
    "cell_type": "markdown",
    "metadata": {},
@@ -358,7 +485,7 @@
   "metadata": {
    "authors": [
     {
-     "name": "tzvikei"
+     "name": "anumamah"
     }
    ],
   "category": "tutorial",

how-to-use-azureml/ml-frameworks/chainer/deployment/train-hyperparameter-tune-deploy-with-chainer/train-hyperparameter-tune-deploy-with-chainer.ipynb

Lines changed: 2 additions & 2 deletions
@@ -507,7 +507,7 @@
    "metadata": {},
    "source": [
     "### Create myenv.yml\n",
-    "We also need to create an environment file so that Azure Machine Learning can install the necessary packages in the Docker image which are required by your scoring script. In this case, we need to specify conda packages `numpy` and `chainer`. Please note that you must indicate azureml-defaults with verion >= 1.0.45 as a pip dependency, because it contains the functionality needed to host the model as a web service."
+    "We also need to create an environment file so that Azure Machine Learning can install the packages required by your scoring script in the Docker image. In this case, we need to specify the conda package `numpy` and pip install `chainer`. Please note that you must indicate azureml-defaults with version >= 1.0.45 as a pip dependency, because it contains the functionality needed to host the model as a web service."
    ]
   },
   {
@@ -520,7 +520,7 @@
    "\n",
    "cd = CondaDependencies.create()\n",
    "cd.add_conda_package('numpy')\n",
-   "cd.add_conda_package('chainer')\n",
+   "cd.add_pip_package('chainer==5.1.0')\n",
    "cd.add_pip_package(\"azureml-defaults\")\n",
    "cd.save_to_file(base_directory='./', conda_file_path='myenv.yml')\n",
    "\n",

0 commit comments