Commit 8700297
vatsal mevada
[SNAP-3165] Instantiating snappy session only when catalogImplementation is in-memory while running pyspark shell (#191)
## What changes were proposed in this pull request?
We initialize `SparkSession` as well as `SnappySession` while starting the pyspark shell.
Previously, `SparkSession` and `SparkContext` were always initialized with hive support enabled,
irrespective of the value of the `spark.sql.catalogImplementation` config.
With these changes, we check the value of `spark.sql.catalogImplementation` and do not enable
hive support when this property is explicitly set to `in-memory`.
`SnappySession` will only be initialized when the catalog implementation is set to `in-memory`,
to avoid the failure reported in SNAP-3165.
Later we can add support for the hive catalog implementation for Python with `SnappySession`.
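Below is a minimal sketch of the startup logic described above, assuming a `shell.py`-style initialization; the `pyspark.sql.snappy` import path and the `SnappySession(sc)` constructor are assumptions for illustration, not details taken from this commit's diff.

```python
from pyspark.conf import SparkConf
from pyspark.sql import SparkSession

conf = SparkConf()
# Hive remains the default catalog implementation unless the user overrides it.
catalog = conf.get("spark.sql.catalogImplementation", "hive").lower()

builder = SparkSession.builder
if catalog == "hive":
    # Preserve the previous behaviour: enable hive support only for the hive catalog.
    builder = builder.enableHiveSupport()

spark = builder.getOrCreate()
sc = spark.sparkContext

snappy = None
if catalog == "in-memory":
    # Create a SnappySession only for the in-memory catalog; creating it with
    # the hive catalog hits the failure reported in SNAP-3165.
    from pyspark.sql.snappy import SnappySession  # assumed import path
    snappy = SnappySession(sc)  # assumed constructor
```

With this arrangement `snappy` stays `None` when the hive catalog is selected, matching the note above that hive catalog support for `SnappySession` is deferred.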
1 parent 840a4b3 · commit 8700297
1 file changed: +23 −9 lines changed