Merged
32 commits
5eca2bb
[storage] Add async APIs for Files SDK (#6405)
annatisch Jul 18, 2019
aba5f69
Async implementation for storage queues. (#6360)
rakshith91 Jul 22, 2019
323cc61
Merge remote-tracking branch 'origin/master' into storage-preview2
annatisch Jul 22, 2019
7a737d5
Merge latest azure-core changes
annatisch Jul 22, 2019
47d063a
Updated shared blob client
annatisch Jul 22, 2019
c755392
Merge remote-tracking branch 'origin/master' into storage-preview2
annatisch Jul 25, 2019
6296b96
add decorator and policy to storage_files and propagate context for i…
SuyogSoti Jul 25, 2019
e08bfc6
Trace storage queue (#6449)
SuyogSoti Jul 25, 2019
6ada100
Trace storage blob (#6478)
SuyogSoti Jul 25, 2019
118d10f
New paging to storage preview2 branch (with async) (#6493)
lmazuel Jul 26, 2019
3f070bb
Fix async tests
lmazuel Jul 26, 2019
1c37b69
Fix continuation token bug
lmazuel Jul 26, 2019
804112c
Merge remote-tracking branch 'upstream/master' into storage-preview2
lmazuel Jul 26, 2019
c6b2151
Support for aiohttp records from vcrpy (#6552)
lmazuel Jul 30, 2019
eb1051c
Async recording for Storage (#6560)
lmazuel Jul 31, 2019
b5a2491
Merge remote-tracking branch 'upstream/master' into storage-preview2
lmazuel Jul 31, 2019
4b64b40
Aiohttp is the only default for async clients (#6561)
lmazuel Jul 31, 2019
03aa8f6
seed tests.yml (#6645)
danieljurek Aug 2, 2019
3acf780
[storage] Blob async APIs (#6489)
annatisch Aug 3, 2019
050fa0c
Merge branch 'master' into storage-preview2
kristapratico Aug 3, 2019
48d2651
Storage Recordings For Queues and Files (#6629)
Aug 4, 2019
254a0d1
allowing specific project targeting for storage livetests
scbedd Aug 5, 2019
ce11dcc
allowing BuildTargetingString to flow through for templates following…
scbedd Aug 5, 2019
55190af
passing service directory to setup task
scbedd Aug 5, 2019
aeec5c9
Merge remote-tracking branch 'origin/master' into storage-preview2
annatisch Aug 5, 2019
17e3eb3
Support for Live storage tests (#6663)
Aug 5, 2019
a93bcdd
[storage] Preview2 updates (#6658)
annatisch Aug 5, 2019
6458fa0
fix test (#6674)
Aug 6, 2019
ca9a0ac
Fix for queue models (#6681)
annatisch Aug 6, 2019
1c44a9c
[storage] Readme tweaks (#6697)
annatisch Aug 6, 2019
5665c17
Some final tweaks (#6687)
Aug 6, 2019
2e614ab
[storage] Better async import error message (#6700)
annatisch Aug 6, 2019
New paging to storage preview2 branch (with async) (#6493)
* Update root README.md

Updating the Contributing section according to the guidelines

* Update CODEOWNERS

* more code owners

* add policy and decorators for secrets (#6453)

* Smoke test for Python (#6412)

* Smoke Test Sample for Track 2 libraries

Smoke Test for Identity, Key Vault Secrets, Storage Blobs, Event Hubs and Cosmos DB

* simpleQuery method added

* Method's names updated

* Create README.md

* Update README.md

* Commented lines deleted

* README.md moved to correct folder

* Create requirements.txt

* Update README.md

* Update README.md

* Imports changed

* Use of literals instead of append

* Database Name variable to class level.

* Use of Pythonic with statements

* Update requirements.txt

* Revert "Update requirements.txt"

This reverts commit 4b79c6a.

* Revert "Use of Pythonic with statements"

This reverts commit 81adc4c.

* Revert "Revert "Use of Pythonic with statements""

This reverts commit 27b2a2d.

* requiriments.txt encoded as a txt file

* requirements.txt as text file

* Misspelling in "Key concepts"

* Update .docsettings.yml to match the title of Smoke Test

* Went trought Suyog comments

* Revert "Went trought Suyog comments"

This reverts commit 35c6223.

* Gone trought Suyog comments

* use of snake case in file names

* Paging v2 (#6420)

* First version of paging

* Clean sync paging

* Async paging

* Name AsyncList without underscore

* MyPy happyness

* pylint

* black

* PageIterator as conf

* Simplify ItemPaged

* Use chain to link iterable

* Make Paging more open

* Docstring

* pylint / mypy / black

* Regenerate KV with latest Autorest + Paging branch (#6479)

* Regenerate KV with latest Autorest + Paging branch

* Remove unused import in tests

* Make the async adapter more generic

* Regeneration after operation mixin fix in Autorest

* Commit changes forgotten in #6418

* Fixing syntax for 2.7

* Move Storage Queue to new Paging (#6447)

* Move Storage Queue to new Paging

* Working Storage queue on new paging

* Fix types

* First shot of blob tests update

* Fixing blob tests

* File to new paging

* Doc fix

* Feedbacks from @annatisch

* Fix last @annatisch comment

* Simplify paging contract

* Fix storage queue after new paging contract

* Fixed incorrect auto-merge

* Fix docstring

* Port aio to new async paging

* Adapt async tests

* Missing by_page
lmazuel authored and annatisch committed Jul 26, 2019
commit 118d10f7bb53e91cdf21a7c47d76f06f297b3182
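
The commit messages above describe moving the storage data-plane clients to the new azure-core paging protocol, in which list operations return an `ItemPaged` object that can be consumed item by item, or page by page via `by_page()` with an explicit continuation token. As a rough illustration only (not part of this diff), here is how a caller might use that contract with the preview queue client; the environment variable and the `results_per_page` value are assumptions for the sketch:

```python
import os

from azure.storage.queue import QueueServiceClient

# Illustrative setup: assumes a storage connection string in the environment.
service = QueueServiceClient.from_connection_string(os.environ["STORAGE_CONNECTION_STRING"])

# Flat iteration: ItemPaged follows continuation tokens transparently.
for queue in service.list_queues():
    print(queue.name)

# Page-by-page iteration; the page iterator exposes the continuation token.
pages = service.list_queues(results_per_page=5).by_page()
first_page = next(pages)
token = pages.continuation_token  # None when there are no further pages

# Resume listing later from the saved token.
for page in service.list_queues(results_per_page=5).by_page(continuation_token=token):
    for queue in page:
        print(queue.name)
```

The blob and file clients ported in these commits expose the same `by_page()`/`continuation_token` shape.
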
2 changes: 1 addition & 1 deletion .docsettings.yml
@@ -9,7 +9,7 @@ omitted_paths:
language: python
root_check_enabled: True
required_readme_sections:
- "Azure .+ client library for Python"
- ^Azure (.+ client library for Python|Smoke Test for Python)
- ^Getting started$
- ^Key concepts$
- ^Examples$
15 changes: 9 additions & 6 deletions .github/CODEOWNERS
@@ -6,20 +6,20 @@
###########

# Catch all
# /sdk/ @mayurid
/sdk/ @mayurid

# Core
/sdk/core/ @lmazuel
/sdk/core/ @lmazuel @xiangyan99 @johanste
/sdk/core/azure-core/ @xiangyan99 @bryevdv

# Service team
# /sdk/identity/
# /sdk/eventhub/
/sdk/identity/ @chlowell @schaabs
/sdk/eventhub/ @annatisch @yunhaoling @YijunXieMS
/sdk/storage/ @zezha-msft @annatisch @rakshith91
/sdk/applicationinsights/ @alexeldeib
/sdk/batch/ @bgklein @matthchr @xingwu1
/sdk/cognitiveservices/azure-cognitiveservices-vision-customvision/ @areddish
/sdk/keyvault/ @schaabs @chlowell
/sdk/keyvault/ @schaabs @chlowell @iscai-msft
/sdk/loganalytics/ @alexeldeib
/sdk/consumption/ @sandeepnl
/sdk/containerinstance/ @samkreter @xizhamsft
@@ -35,7 +35,10 @@
/sdk/recoveryservices/ @DheerendraRathor
/sdk/servicefabric/ @QingChenmsft @samedder
/sql/sql/ @jaredmoo
/sdk/servicebus/azure-servicebus/ @annatisch
/sdk/servicebus/azure-servicebus/ @annatisch @yunhaoling @YijunXieMS

# Management Plane
/**/*mgmt*/ @zikalino

###########
# Eng Sys
13 changes: 13 additions & 0 deletions samples/smoketest/Program.py
@@ -0,0 +1,13 @@
from key_vault_secrets import KeyVault
from storage_blob import StorageBlob
from event_hubs import EventHub
from cosmos_db import CosmosDB

print("==========================================")
print(" AZURE TRACK 2 SDKs SMOKE TEST")
print("==========================================")

KeyVault().Run()
StorageBlob().Run()
EventHub().Run()
CosmosDB().Run()
116 changes: 116 additions & 0 deletions samples/smoketest/README.md
@@ -0,0 +1,116 @@
# Azure Smoke Test for Python
This sample code is a smoke test to ensure that the Azure SDK preview libraries for Python work while loaded into the same process, by performing two or more actions with each of them.

Libraries tested:
* keyvault-secrets
* identity
* storage-blob
* event-hubs
* cosmos

## Getting started
### Setup Azure resources
For this sample, you need to create (or already have) the following resources in the [Azure Portal](https://portal.azure.com/); a sketch of equivalent Azure CLI commands follows this list:
* **App registration**: Register a new app or use an existing one.
* Under _Certificates & secrets_ create a new **client secret** and store the value in a safe place.
* **Key Vaults**: Create a new Key Vault resource or use an existing one.
* Under _Access policies_, add the app registered in the previous step.
* **Storage accounts**: Create a container in a new or existing storage account. The container in this sample is named "mycontainer"; if you want to use another name, change the value in the `storage_blob.py` file.
* **Event Hubs**: Create an event hub inside a new or existing Event Hubs namespace. The event hub in this sample is named "myeventhub"; if you want to use another name, change the value in the `event_hubs.py` file: `eventHubName = "myeventhub"`.
* **Azure Cosmos DB**: Create a new account or use an existing one.
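
The resources above can also be provisioned from the command line. The following Azure CLI sketch is not part of the sample, and every name in it (`smoke-test-rg`, `smoketeststorage`, and so on) is a placeholder to replace with your own values:

```bash
# Placeholder names throughout; adjust locations, SKUs, and names as needed.
az group create --name smoke-test-rg --location westus2

# App registration / service principal (prints the client ID, secret, and tenant ID).
az ad sp create-for-rbac --name smoke-test-app

# Key Vault
az keyvault create --resource-group smoke-test-rg --name smoke-test-kv

# Storage account and container
az storage account create --resource-group smoke-test-rg --name smoketeststorage
az storage container create --account-name smoketeststorage --name mycontainer

# Event Hubs namespace and event hub
az eventhubs namespace create --resource-group smoke-test-rg --name smoke-test-ns
az eventhubs eventhub create --resource-group smoke-test-rg --namespace-name smoke-test-ns --name myeventhub

# Cosmos DB account
az cosmosdb create --resource-group smoke-test-rg --name smoke-test-cosmos
```

Remember to also grant the app registration an access policy on the Key Vault, as described above.
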

### Azure credentials
The following environment variables are needed:
* From **App Registration**, in the _Overview_ section:
* AZURE_TENANT_ID: The directory (tenant) ID.
* AZURE_CLIENT_ID: The application ID.
* AZURE_CLIENT_SECRET: The client secret value stored previously when creating the _client secret_.

* From **Key Vault**, in the _Overview_ section:
* AZURE_PROJECT_URL: The DNS Name

* From **Event Hubs**, in _Shared access policies_ section:
* EVENT_HUBS_CONNECTION_STRING: Connection string from a policy

* From **Storage Account**, in the _Access Keys_ section:
* STORAGE_CONNECTION_STRING: The storage account connection string.

* From **Azure Cosmos DB**, in the _Keys_ section, select the _Read-Write Keys_ tab:
* COSMOS_ENDPOINT: URI.
* COSMOS_KEY: Primary or secondary key.

```bash
# Bash commands to set the environment variables
export AZURE_CLIENT_ID=""
export AZURE_CLIENT_SECRET=""
export AZURE_TENANT_ID=""
export EVENT_HUBS_CONNECTION_STRING=""
export AZURE_PROJECT_URL=""
export STORAGE_CONNECTION_STRING=""
export COSMOS_ENDPOINT=""
export COSMOS_KEY=""
```

### Running the console app
[Python](https://www.python.org/downloads/) version 3.7.4 was used to run this sample.

In the `samples/smoketest` directory, run `Program.py`:
```
python Program.py
```

## Key concepts


## Examples
All the classes in this sample have a `Run()` method as their entry point and do not depend on each other.

It is possible to run them individually:
```python
from key_vault_secrets import KeyVault

KeyVault().Run()
```

They can be included in other projects by moving the class into them:
```python
from key_vault_secrets import KeyVault

...

def myTests():
    print("Smoke Test imported from other project")
    KeyVault().Run()

myTests()
otherFunction()
...
```

The classes can be used as base code and changed to satisfy specific needs. For example, the method `EventHub().SendAndReceiveEvents()` can be changed to only send events from a list passed as a parameter:
```python
def SendAndReceiveEvents(self, partitionID, events):
    producer = self.client.create_producer(partition_id=partitionID)
    producer.send(events)
    producer.close()
```

**Note:** The methods in the classes are not necessarily independent of each other, and the order matters. For example, in order to run `StorageBlob().DeleteBlob()`, the method `StorageBlob().UploadBlob()` must run first; otherwise it will fail because there is no blob to delete.

## Troubleshooting

### Authentication
Be sure to set the environment variables and credentials required before running the sample.

## Next steps
Check the [Azure SDK for Python Repository](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk) for more samples inside the sdk folder.

## Contributing
This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com.

When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

If you'd like to contribute to this library, please read the contributing guide to learn more about how to build and test the code.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact [email protected] with any additional questions or comments.
84 changes: 84 additions & 0 deletions samples/smoketest/cosmos_db.py
@@ -0,0 +1,84 @@
import os
from azure.cosmos import CosmosClient
from azure.cosmos.partition_key import PartitionKey


class CosmosDB:
def __init__(self):
URL = os.environ["COSMOS_ENDPOINT"]
KEY = os.environ["COSMOS_KEY"]
self.client = CosmosClient(URL, {"masterKey": KEY})
self.dbName = "pySolarSystem"

def CreateDatabase(self):
print("Creating '{0}' database...".format(self.dbName))
return self.client.create_database(self.dbName)

def CreateContainer(self, db):
collectionName = "Planets"
print("Creating '{0}' collection...".format(collectionName))
partition_key = PartitionKey(path="/id", kind="Hash")
return db.create_container(id="Planets", partition_key=partition_key)

def CreateDocuments(self, container):
# Cosmos DB looks for an 'id' field in the items; if 'id' is not specified, Cosmos DB will assign a random key.
planets = [
{
"id": "Earth",
"HasRings": False,
"Radius": 3959,
"Moons": [{"Name": "Moon"}],
},
{
"id": "Mars",
"HasRings": False,
"Radius": 2106,
"Moons": [{"Name": "Phobos"}, {"Name": "Deimos"}],
},
]

print("Inserting items in the collection...")
for planet in planets:
container.create_item(planet)
print("\t'{0}' created".format(planet["id"]))
print("\tdone")

def SimpleQuery(self, container):
print("Quering the container...")
items = list(
container.query_items(
query="SELECT c.id FROM c", enable_cross_partition_query=True
)
)
print("\tdone: {0}".format(items))

def DeleteDatabase(self):
print("Cleaning up the resource...")
self.client.delete_database(self.dbName)
print("\tdone")

def Run(self):
print()
print("------------------------")
print("Cosmos DB")
print("------------------------")
print("1) Create a Database")
print("2) Create a Container in the database")
print("3) Insert Documents (items) into the Container")
print("4) Delete Database (Clean up the resource)")
print()

# Ensure that the database does not exist
try:
self.DeleteDatabase()
except Exception:
pass

try:
db = self.CreateDatabase()
container = self.CreateContainer(db=db)
self.CreateDocuments(container=container)
self.SimpleQuery(container=container)
finally:
# If something goes wrong, the resource should be cleaned up anyway
self.DeleteDatabase()
66 changes: 66 additions & 0 deletions samples/smoketest/event_hubs.py
@@ -0,0 +1,66 @@
import os
from datetime import datetime
from azure.eventhub import EventHubClient, EventData, EventPosition


class EventHub:
def __init__(self):
# This test requires a previously created Event Hub.
# In this example the name is "myeventhub", but it can be changed below.
connectionString = os.environ["EVENT_HUBS_CONNECTION_STRING"]
eventHubName = "myeventhub"
self.client = EventHubClient.from_connection_string(
connectionString, eventHubName
)

def GetPartitionIds(self):
print("Getting partitions id...")
partition_ids = self.client.get_partition_ids()
print("\tdone")
return partition_ids

def SendAndReceiveEvents(self, partitionID):
with self.client.create_consumer(
consumer_group="$default",
partition_id=partitionID,
event_position=EventPosition(datetime.utcnow()),
) as consumer:

print("Sending events...")
with self.client.create_producer(partition_id=partitionID) as producer:
event_list = [
EventData(b"Test Event 1 in Python"),
EventData(b"Test Event 2 in Python"),
EventData(b"Test Event 3 in Python"),
]
producer.send(event_list)
print("\tdone")

print("Receiving events...")
received = consumer.receive(max_batch_size=len(event_list), timeout=2)
for event_data in received:
print("\tEvent Received: " + event_data.body_as_str())

print("\tdone")

if len(received) != len(event_list):
raise Exception(
"Error, expecting {0} events, but {1} were received.".format(
str(len(event_list)), str(len(received))
)
)

def Run(self):
print()
print("------------------------")
print("Event Hubs")
print("------------------------")
print("1) Get partition ID")
print("2) Send Events")
print("3) Consume Events")
print()

partitionID = self.GetPartitionIds()
# In this sample the same partition ID is used for both the producer and the consumer.
# It is the first one, but any partition works, as long as the producer and consumer use the same one.
self.SendAndReceiveEvents(partitionID[0])
46 changes: 46 additions & 0 deletions samples/smoketest/key_vault_secrets.py
@@ -0,0 +1,46 @@
import os
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient


class KeyVault:
def __init__(self):
# DefaultAzureCredential() expects the following environment variables:
# * AZURE_CLIENT_ID
# * AZURE_CLIENT_SECRET
# * AZURE_TENANT_ID
credential = DefaultAzureCredential()
self.secret_client = SecretClient(
vault_url=os.environ["AZURE_PROJECT_URL"], credential=credential
)

def SetSecret(self):
print("Setting a secret...")
self.secret_client.set_secret("secret-name", "secret-value")
print("\tdone")

def GetSecret(self):
print("Getting a secret...")
secret = self.secret_client.get_secret("secret-name")
print("\tdone: " + secret.name)

def DeleteSecret(self):
print("Deleting a secret...")
deleted_secret = self.secret_client.delete_secret("secret-name")
print("\tdone: " + deleted_secret.name)

def Run(self):
print()
print("------------------------")
print("Key Vault - Secrets\nIdentity - Credential")
print("------------------------")
print("1) Set a secret")
print("2) Get that secret")
print("3) Delete that secret (Clean up the resource)")
print()

try:
self.SetSecret()
self.GetSecret()
finally:
self.DeleteSecret()
8 changes: 8 additions & 0 deletions samples/smoketest/requirements.txt
@@ -0,0 +1,8 @@
azure-common==1.1.23
azure-core==1.0.0b1
azure-cosmos==4.0.0b1
azure-eventhub==5.0.0b1
azure-identity==1.0.0b1
azure-keyvault-secrets==4.0.0b1
azure-storage-blob==12.0.0b1
azure-storage-common==2.0.0