
refactor: Define RowJoinNode and defer rewrite #1183

Open · wants to merge 12 commits into base: main
Conversation

TrevorBergeron (Contributor)
Thank you for opening a Pull Request! Before submitting your PR, there are a few things you can do to make sure it goes smoothly:

  • Make sure to open an issue as a bug/issue before writing your code! That way we can discuss the change, evaluate designs, and agree on the general idea
  • Ensure the tests and linter pass
  • Code coverage does not decrease (if any source code was changed)
  • Appropriate docs were updated (if necessary)

Fixes #<issue_number_goes_here> 🦕

@product-auto-label product-auto-label bot added size: m Pull request size is medium. api: bigquery Issues related to the googleapis/python-bigquery-dataframes API. labels Nov 27, 2024
@product-auto-label product-auto-label bot added size: l Pull request size is large. and removed size: m Pull request size is medium. labels Dec 17, 2024
@TrevorBergeron TrevorBergeron marked this pull request as ready for review December 17, 2024 23:10
@TrevorBergeron TrevorBergeron requested review from a team as code owners December 17, 2024 23:10
@TrevorBergeron TrevorBergeron requested a review from sycai December 17, 2024 23:10
@@ -2335,8 +2335,8 @@ def join(
         # Handle null index, which only supports row join
         # This is the canonical way of aligning on null index, so always allow (ignore block_identity_join)
         if self.index.nlevels == other.index.nlevels == 0:
-            result = try_legacy_row_join(self, other, how=how) or try_new_row_join(
-                self, other
+            result = try_new_row_join(self, other) or try_legacy_row_join(
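The reordered fallback chain can be sketched as follows. This is a hedged illustration, not the PR's actual implementation: `join`, `try_row_join`, and `try_legacy_row_join` here are simplified stand-ins that return a result tuple on success or `None` when the path does not apply, so Python's short-circuiting `or` picks the first path that succeeds.

```python
from typing import Optional

# Hypothetical stand-ins for the PR's join helpers: each returns a joined
# result on success, or None when its fast path does not apply.
def try_row_join(left, right) -> Optional[tuple]:
    # Sketch: the row join only applies when both sides have equal row counts.
    if len(left) == len(right):
        return ("row_join", left, right)
    return None

def try_legacy_row_join(left, right, how="left") -> Optional[tuple]:
    # Sketch: the legacy path always produces some result.
    return ("legacy_row_join", how, left, right)

def join(left, right, how="left"):
    # The PR reorders the fallback chain: try the new row join first and only
    # fall back to the legacy implementation when it returns None. `or`
    # short-circuits, so the legacy path never runs when the new one succeeds.
    return try_row_join(left, right) or try_legacy_row_join(left, right, how=how)

print(join([1, 2], [3, 4])[0])  # new path applies
print(join([1, 2], [3])[0])     # falls back to legacy
```

Note that this pattern relies on every successful result being truthy; a helper that could legitimately return an empty (falsy) result would need an explicit `is None` check instead of `or`.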
Contributor:
We should probably rename "try_new_row_join", now or in the future. "try_legacy_row_join" will eventually be removed, and it would be very confusing to be left with only a "new" version.

We can just call it "try_row_join".

Contributor (Author):
renamed to try_row_join

new_exprs = (
    *root.child.assignments,
    *(
        (expr.bind_refs(mapping, allow_partial_bindings=True), id)
Contributor:
This tuple comprehension is very long. Shall we use another local variable, "root_assignments", to hold the value?

Then: new_exprs = tuple(root.child.assignments) + root_assignments

Contributor (Author):
extracted out to variable as suggested
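The suggested extraction can be sketched as below. This is a hedged toy model: `Expr`, `mapping`, `child_assignments`, and `root_assignments_raw` are hypothetical stand-ins for the PR's expression objects and assignment lists; only the shape of the refactor (naming the long generator before splicing it) mirrors the review suggestion.

```python
# Hypothetical expression object standing in for the PR's expressions with
# a bind_refs method that rewrites references through a mapping.
class Expr:
    def __init__(self, name):
        self.name = name

    def bind_refs(self, mapping, allow_partial_bindings=False):
        # Rewrite the reference through the mapping; with partial bindings
        # allowed, unmapped references pass through unchanged.
        if self.name in mapping:
            return Expr(mapping[self.name])
        if allow_partial_bindings:
            return self
        raise KeyError(self.name)

mapping = {"a": "a_mapped"}
child_assignments = ((Expr("x"), "id0"),)
root_assignments_raw = ((Expr("a"), "id1"), (Expr("b"), "id2"))

# As suggested in review: hold the rebound root assignments in a named local
# variable instead of inlining a long tuple comprehension.
root_assignments = tuple(
    (expr.bind_refs(mapping, allow_partial_bindings=True), id_)
    for expr, id_ in root_assignments_raw
)
new_exprs = (*child_assignments, *root_assignments)

print([e.name for e, _ in new_exprs])  # ['x', 'a_mapped', 'b']
```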

# should be impossible, as l_path[-1] == r_path[-1]
raise ValueError()
min_height = min(root.height for root in roots)
to_descend = set(root for root in roots if root.height > min_height)
Contributor:
It took me a while to realize that to_descend is a set of root nodes.

Let's name it "descend_roots"?

Contributor (Author):
renamed to roots_to_descend
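The height-alignment pattern behind `roots_to_descend` can be sketched as follows. This is an assumption-laden toy: `Node`, `height`, and `align_heights` are hypothetical, and the real tree nodes in the PR are more elaborate; the sketch only shows the idea of repeatedly stepping the taller roots down to their children until all roots sit at the same height.

```python
from dataclasses import dataclass
from typing import Optional, Set

# Hypothetical tree node mirroring the snippet's `root.height` usage.
@dataclass(frozen=True)
class Node:
    name: str
    child: Optional["Node"] = None

    @property
    def height(self) -> int:
        return 0 if self.child is None else 1 + self.child.height

def align_heights(roots: Set[Node]) -> Set[Node]:
    # Repeatedly replace every root taller than the shortest one with its
    # child, until all roots have equal height (the `roots_to_descend` idea).
    roots = set(roots)
    while True:
        min_height = min(root.height for root in roots)
        roots_to_descend = {root for root in roots if root.height > min_height}
        if not roots_to_descend:
            return roots
        roots = (roots - roots_to_descend) | {r.child for r in roots_to_descend}

leaf = Node("leaf")
tall = Node("a", Node("b", leaf))   # height 2
short = Node("c", leaf)             # height 1
aligned = align_heights({tall, short})
print(sorted(n.height for n in aligned))  # all equal heights
```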

Labels
api: bigquery Issues related to the googleapis/python-bigquery-dataframes API. size: l Pull request size is large.
Projects
None yet
Development

Successfully merging this pull request may close these issues.

3 participants