ADAP-1122: Move dbt-spark into a namespace subpackage #1168

Merged 7 commits on Jan 7, 2025
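The "namespace subpackage" in the title refers to PEP 420 implicit namespace packages: because no `__init__.py` exists at the `dbt` or `dbt.adapters` levels, several independent distributions can each contribute one subpackage (here, `dbt.adapters.spark`) to the shared namespace. A standalone sketch of the mechanism, using a hypothetical `ns` namespace rather than the real `dbt` tree:

```python
import importlib
import os
import sys
import tempfile

root = tempfile.mkdtemp()
# Two separate "distributions", each contributing a subpackage to the same
# top-level namespace `ns`. Crucially, no __init__.py exists under `ns`,
# so Python treats `ns` as an implicit namespace package (PEP 420).
for pkg in ("dist_a/ns/spark", "dist_b/ns/thrift"):
    path = os.path.join(root, *pkg.split("/"))
    os.makedirs(path)
    with open(os.path.join(path, "impl.py"), "w") as f:
        f.write(f"NAME = {pkg.split('/')[-1]!r}\n")

# Both roots go on sys.path, mimicking two installed distributions.
sys.path[:0] = [os.path.join(root, "dist_a"), os.path.join(root, "dist_b")]
spark = importlib.import_module("ns.spark.impl")
thrift = importlib.import_module("ns.thrift.impl")
print(spark.NAME, thrift.NAME)  # spark thrift
```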
5 changes: 2 additions & 3 deletions .github/workflows/changelog-existence.yml
@@ -34,8 +34,7 @@ permissions:

jobs:
changelog:
-uses: dbt-labs/actions/.github/workflows/changelog-existence.yml@main
+uses: dbt-labs/dbt-adapters/.github/workflows/_changelog-entry-check.yml@main
with:
-changelog_comment: 'Thank you for your pull request! We could not find a changelog entry for this change. For details on how to document a change, see the [dbt-spark contributing guide](https://github.com/dbt-labs/dbt-spark/blob/main/CONTRIBUTING.MD).'
-skip_label: 'Skip Changelog'
+pull-request: ${{ github.event.pull_request.number }}
secrets: inherit # this is only acceptable because we own the action we're calling
5 changes: 4 additions & 1 deletion .github/workflows/integration.yml
@@ -129,17 +129,20 @@ jobs:
- name: Update Adapters and Core branches (update dev_requirements.txt)
if: github.event_name == 'workflow_dispatch'
run: |
-./.github/scripts/update_dev_dependency_branches.sh \
+scripts/update_dev_dependency_branches.sh \
${{ inputs.dbt_adapters_branch }} \
${{ inputs.dbt_core_branch }} \
${{ inputs.dbt_common_branch }}
cat hatch.toml
+working-directory: ./dbt-spark

- name: Install hatch
uses: pypa/hatch@install

- name: Install python dependencies
run: hatch run pip install -r dagger/requirements.txt
+working-directory: ./dbt-spark

- name: Run tests for ${{ matrix.test }}
run: hatch run integration-tests --profile ${{ matrix.test }}
+working-directory: ./dbt-spark
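The additions above are all `working-directory: ./dbt-spark`, because after the move the package lives in a subdirectory of the monorepo checkout rather than at the repository root. As a loose analogy (an illustration, not workflow code), `working-directory:` behaves like the `cwd=` argument to `subprocess.run`: the step's command runs inside that directory without an explicit `cd`.

```python
import os
import subprocess
import sys
import tempfile

repo = tempfile.mkdtemp()                     # stand-in for the checked-out monorepo
os.makedirs(os.path.join(repo, "dbt-spark"))  # the package subdirectory

# Run a command "inside" dbt-spark, as the workflow steps now do.
out = subprocess.run(
    [sys.executable, "-c", "import os; print(os.path.basename(os.getcwd()))"],
    cwd=os.path.join(repo, "dbt-spark"),
    capture_output=True, text=True, check=True,
).stdout.strip()
print(out)  # dbt-spark
```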
7 changes: 6 additions & 1 deletion .github/workflows/main.yml
@@ -78,6 +78,7 @@ jobs:
uses: pypa/hatch@install

- run: hatch run unit-tests
+working-directory: ./dbt-spark

build:
name: build packages
@@ -100,24 +101,28 @@

- name: Build distributions
run: hatch build
+working-directory: ./dbt-spark

- name: Show distributions
run: ls -lh dist/
+working-directory: ./dbt-spark

- name: Check distribution descriptions
run: hatch run build:check-all
+working-directory: ./dbt-spark

- name: Check if this is an alpha version
id: check-is-alpha
run: |
export is_alpha=0
if [[ "$(ls -lh dist/)" == *"a1"* ]]; then export is_alpha=1; fi
echo "is_alpha=$is_alpha" >> $GITHUB_OUTPUT
+working-directory: ./dbt-spark

- uses: actions/upload-artifact@v4
with:
name: dist
-path: dist/
+path: dbt-spark/dist/
overwrite: true

test-build:
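The `check-is-alpha` step above keys on the literal substring `a1` appearing anywhere in the `ls -lh dist/` output. A Python rendering of the same test (an illustration only, not the workflow's actual code):

```python
def is_alpha_build(dist_listing: str) -> bool:
    # Same substring test as the shell step: any "a1" in the listing
    # marks the build as an alpha pre-release.
    return "a1" in dist_listing

print(is_alpha_build("dbt_spark-1.10.0a1-py3-none-any.whl"))  # True
print(is_alpha_build("dbt_spark-1.10.0-py3-none-any.whl"))    # False
```

Note the quirk this inherits from the shell version: it only detects the first alpha (`a1`, not `a2` and later), and it would also fire on an incidental `a1` elsewhere in the listing.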
10 files renamed without changes.
@@ -108,7 +108,7 @@ async def test_spark(test_args):
# copy project files into image
tst_container = (
tst_container.with_workdir("/")
-.with_directory("/src/dbt", client.host().directory("./dbt"))
+.with_directory("/src/src/dbt", client.host().directory("./src/dbt"))
.with_directory("/src/tests", client.host().directory("./tests"))
.with_directory(
"/src",
7 files renamed without changes.
8 changes: 5 additions & 3 deletions hatch.toml → dbt-spark/hatch.toml
@@ -1,11 +1,13 @@
[version]
-path = "dbt/adapters/spark/__version__.py"
+path = "src/dbt/adapters/spark/__version__.py"

[build.targets.sdist]
-packages = ["dbt"]
+packages = ["src/dbt"]
+sources = ["src"]

[build.targets.wheel]
-packages = ["dbt"]
+packages = ["src/dbt"]
+sources = ["src"]

[envs.default]
dependencies = [
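The new `sources = ["src"]` lines tell hatchling to strip the `src/` root when laying out the sdist and wheel, so `src/dbt/...` in the repository ships as `dbt/...` in the distribution. A minimal sketch of that mapping (an assumption about the configured behavior, not hatchling's actual implementation):

```python
def distribution_path(repo_path: str, sources: tuple = ("src",)) -> str:
    # Strip a configured source root so src/dbt/... ships as dbt/...;
    # paths outside every source root are left unchanged.
    for prefix in sources:
        if repo_path.startswith(prefix + "/"):
            return repo_path[len(prefix) + 1:]
    return repo_path

print(distribution_path("src/dbt/adapters/spark/__version__.py"))
# dbt/adapters/spark/__version__.py
```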
3 changes: 1 addition & 2 deletions pyproject.toml → dbt-spark/pyproject.toml
@@ -53,8 +53,7 @@ Changelog = "https://github.com/dbt-labs/dbt-spark/blob/main/CHANGELOG.md"
[tool.pytest.ini_options]
testpaths = ["tests/functional", "tests/unit"]
env_files = ["test.env"]
-addopts = "-v -n auto"
-color = true
+addopts = "-v --color=yes -n auto"
filterwarnings = [
"ignore:.*'soft_unicode' has been renamed to 'soft_str'*:DeprecationWarning",
"ignore:unclosed file .*:ResourceWarning",
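`color` is not a recognized pytest ini option, which is likely why it was folded into `addopts` as the `--color=yes` command-line flag. `addopts` is a single string of extra arguments that pytest splits shell-style and prepends to the CLI, roughly:

```python
import shlex

# The consolidated option string from the diff above.
addopts = "-v --color=yes -n auto"

# pytest splits addopts shell-style before prepending it to sys.argv.
print(shlex.split(addopts))  # ['-v', '--color=yes', '-n', 'auto']
```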
10 files renamed without changes.
@@ -7,7 +7,7 @@
class TestSparkMacros(unittest.TestCase):
def setUp(self):
self.jinja_env = Environment(
-loader=FileSystemLoader("dbt/include/spark/macros"),
+loader=FileSystemLoader("src/dbt/include/spark/macros"),
extensions=[
"jinja2.ext.do",
],
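Only the loader's search path changes here: `FileSystemLoader` resolves template names relative to that root, so the test bodies that load macros by name need no edits. A small sketch of the resolution (the `adapters.sql` name is just an example template):

```python
from pathlib import Path

def template_path(searchpath: str, name: str) -> Path:
    # FileSystemLoader joins template names onto its search path;
    # moving the package under src/ only changes the root.
    return Path(searchpath) / name

print(template_path("src/dbt/include/spark/macros", "adapters.sql"))
```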
File renamed without changes.