Conversation

@robert3005

Would need to absorb upstream breaking changes.

@ash211 ash211 left a comment

Slightly worried about merge conflicts when pulling in future changes from Apache master. The plan when that happens is to just fix them, I suppose?

val SESSION_STATE_IMPLEMENTATION = buildStaticConf("spark.sql.sessionStateImplementation")
  .internal()
  .stringConf
  .createWithDefault("in-memory")

Should we make this default be the value of spark.sql.catalogImplementation? Otherwise I think we break any consumers that have modified that value with this change.
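
For illustration, a minimal sketch of what the reviewer seems to be suggesting, assuming Spark's internal ConfigBuilder.fallbackConf is available to buildStaticConf in this version; this is not the PR's actual change:

// Sketch only: let the new conf fall back to whatever
// spark.sql.catalogImplementation resolves to, so consumers that
// override the catalog implementation keep their current behaviour.
val SESSION_STATE_IMPLEMENTATION = buildStaticConf("spark.sql.sessionStateImplementation")
  .internal()
  .fallbackConf(CATALOG_IMPLEMENTATION)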

@robert3005 (Author)

This is rather simple to manage. The more annoying part is that this isn't stable yet and downstream we will have to handle that.

@robert3005 (Author)

@ash211 for reference #128

@ash211 ash211 left a comment

Good to merge when build passes

  .internal()
  .stringConf
- .createWithDefault("in-memory")
+ .createWithDefault(CATALOG_IMPLEMENTATION.defaultValueString)

I meant to make the actual value of CATALOG_IMPLEMENTATION (not the default value) become the default value for SESSION_STATE_IMPLEMENTATION.

But I think it's unlikely anyone was setting spark.sql.catalogImplementation directly rather than using the enableHiveSupport() that you already changed.
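
As a purely illustrative alternative (the helper name below is hypothetical, not code from this PR), the actual configured value could be honoured at session construction time instead of in the conf's default:

// Sketch: resolve the session-state implementation when the session is built,
// falling back to the value spark.sql.catalogImplementation is actually set to,
// not its static default.
def resolveSessionStateImplementation(conf: org.apache.spark.SparkConf): String =
  conf.get(
    "spark.sql.sessionStateImplementation",
    conf.get("spark.sql.catalogImplementation", "in-memory"))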

@robert3005 robert3005 merged commit 8e5c2b1 into master Mar 18, 2017
@robert3005 robert3005 deleted the rk/public-catalog branch March 18, 2017 00:11
rshkv added a commit that referenced this pull request Feb 26, 2021
This adds a config that allows us to inject a custom session builder.
Internally we use it to build SparkSessions that are highly configured
beyond what Spark's built-in configs allow. Most importantly, that
includes building and registering our own session catalog (v1)
implementation with the SparkSession.

You can find how we use this config here [1] and our own
SessionStateBuilder here [2].

[1]: https://pl.ntr/1UU
[2]: https://pl.ntr/1UT

Co-authored-by: Robert Kruszewski <[email protected]>
Co-authored-by: Josh Casale <[email protected]>
Co-authored-by: Will Raschkowski <[email protected]>
jdcasale added a commit that referenced this pull request Mar 3, 2021
rshkv added a commit that referenced this pull request Mar 4, 2021
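
The commit message above describes injecting a custom session builder through this config. A minimal sketch of such a builder, assuming the conf accepts a fully qualified class name that is instantiated reflectively with a (SparkSession, Option[SessionState]) constructor; the package and class names below are hypothetical and are not the code referenced in [1] and [2]:

package com.example.spark

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.internal.{BaseSessionStateBuilder, SessionState}

// Hypothetical builder; a real one would register a custom v1 session catalog
// and any extra analyzer/optimizer rules here.
class CustomSessionStateBuilder(
    session: SparkSession,
    parentState: Option[SessionState])
  extends BaseSessionStateBuilder(session, parentState) {

  // Hook used when a derived session (session.newSession()) is created.
  override protected def newBuilder: NewBuilder =
    new CustomSessionStateBuilder(_, _)
}

Registering it would then presumably look like:

SparkSession.builder()
  .config("spark.sql.sessionStateImplementation",
    "com.example.spark.CustomSessionStateBuilder")
  .getOrCreate()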