Conversation

@SakalyaDeshpande

Taking common functionality into a single method to avoid duplication of code.

Sakalya Deshpande added 2 commits July 29, 2016 17:27
Taking a common functionality in one method to avoid duplication of code.
Update JavaSparkContextVarargsWorkaround.java
return union(rdds[0], rest);
}

public final List<T> populateRDDList(T rdds) {
@srowen (Member) commented Jul 29, 2016

OK, pretty trivial but a win. You can use this on line 49 too, right? And this can be static.
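
For context, here is a minimal sketch of the kind of helper being discussed, following the review suggestions (static, taking an array of the generic type). The class name and body below are assumptions made for illustration, not the actual Spark source:

    import java.util.ArrayList;
    import java.util.List;

    // Sketch only: an assumed shape for the extracted helper.
    class PopulateRDDListSketch {

      // Collects every element after the first into a List, so the varargs
      // union(...) overloads can all delegate to one place instead of
      // repeating the same copy loop. Assumes at least one element, as the
      // union overloads do.
      static <T> List<T> populateRDDList(T[] rdds) {
        List<T> rest = new ArrayList<>(rdds.length - 1);
        for (int i = 1; i < rdds.length; i++) {
          rest.add(rdds[i]);
        }
        return rest;
      }
    }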

Sakalya Deshpande added 3 commits July 29, 2016 18:53
Update JavaSparkContextVarargsWorkaround.java
Update JavaSparkContextVarargsWorkaround.java
return union(rdds[0], rest);
}

public static final List<T> populateRDDList(T rdds) {
@srowen (Member)

One last thing -- private? And doesn't this have to be T... rdds to work?

@SakalyaDeshpande (Author)

I agree with making the method private. But rdds is not varargs, so why do we need the ...?

@srowen (Member)

This doesn't build. You're using T like an array type but it isn't. It has to be an array type as far as the method is concerned. T[] would be fine as the type.
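
To illustrate the point with a standalone sketch (hypothetical class and method names, not the PR's code): a bare type parameter cannot be indexed like an array, whereas a parameter declared as an array of the type parameter can; a varargs parameter (T... rdds) also works, since varargs arrive inside the method as an array.

    // Minimal sketch: a method declared as first(T rdds) that returns rdds[0]
    // would not compile ("array required, but T found"); declaring the
    // parameter as T[] makes the indexing legal.
    class VarargsSignatureExample {
      static <T> T first(T[] rdds) {
        return rdds[0];
      }

      public static void main(String[] args) {
        String[] names = {"a", "b", "c"};
        System.out.println(first(names)); // prints "a"
      }
    }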

@SakalyaDeshpande (Author)

I created a build with T...

@SakalyaDeshpande (Author)

Will this work?

Sakalya Deshpande added 3 commits July 30, 2016 17:42
Update JavaSparkContextVarargsWorkaround.java
Making method private and T as an array [].
Update JavaSparkContextVarargsWorkaround.java
@SakalyaDeshpande (Author)

Any update on the recent change?

@srowen (Member) commented Jul 30, 2016

Jenkins test this please

@SparkQA commented Jul 30, 2016

Test build #63042 has finished for PR 14402 at commit 3e831e5.

  • This patch fails to build.
  • This patch merges cleanly.
  • This patch adds no public classes.

@srowen (Member) commented Jul 30, 2016

Yeah, still doesn't compile. Are you compiling this locally?

Update JavaSparkContextVarargsWorkaround.java
@SakalyaDeshpande (Author)

Added the required fix. Can you please start the build?

@srowen (Member) commented Jul 30, 2016

@sakky11 you have not even addressed the compilation error. This is wasting everyone's time. Please compile this locally and fix it before pinging again.

@SakalyaDeshpande (Author)

Hey, sorry @srowen. I actually fixed the compilation error but missed adding it to the pull request. Sorry.

Sakalya Deshpande added 2 commits July 30, 2016 23:10
Update JavaSparkContextVarargsWorkaround.java
@SakalyaDeshpande (Author)

I think the issue should be fixed now. Sorry for the trouble.

@srowen (Member) commented Jul 30, 2016

Did you compile this locally before pushing?

@JoshRosen (Contributor)

This patch is only a one-line reduction of code and touches a file which hasn't needed to be modified in ages; is this change really necessary / worth spending review time on?

@SakalyaDeshpande (Author)

I know it's trivial, but it still avoids duplication of code and makes it more readable.

@SakalyaDeshpande (Author)

Any update on this commit?

@srowen (Member) commented Jul 31, 2016

@sakky11 I asked you because I can already see that your change doesn't even compile, for a third time. This is a trivial change, and not worth spending more time discussing. Please close this, and read https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark before opening another change. At minimum, you need to make sure your change even builds before considering opening a pull request.
