
Conversation


@sfc-gh-heshah sfc-gh-heshah commented Sep 15, 2025

  1. What Jira ticket or GitHub issue is this PR addressing? Make sure that there is a ticket or issue accompanying your PR.

    Fixes SNOW-2339275

  2. Fill out the following pre-review checklist:

    - [ ] I am adding a new automated test(s) to verify correctness of my new code
    - [ ] I am adding new logging messages
    - [ ] I am adding a new telemetry message
    - [ ] I am adding new credentials
    - [ ] I am adding a new dependency
  3. Please describe how your code solves the related issue.

    Bumped the version in SBT to 1.17.0 and updated the corresponding test.

    Added the deploy-fips.sh script back for backwards compatibility with older Jenkins release scripts and jobs.

This stack of pull requests is managed by Graphite.

@sfc-gh-heshah sfc-gh-heshah marked this pull request as ready for review September 15, 2025 20:39
… methods when Any value type (#241)

* Adding support for  and  methods when Any value type

* Adjusting docstring
…240)

* Add support for DataFrame.show with truncate parameter

* Applying format to the code added

* Adding some doc improvements

* Solve comment
…6 checksum files in deploy scripts (#245)

1. What Jira ticket or GitHub issue is this PR addressing? Make sure that there is a ticket or issue accompanying your PR.

   Fixes SNOW-2343819. 

2. Fill out the following pre-review checklist:

   - [ ] I am adding a new automated test(s) to verify correctness of my new code
   - [ ] I am adding new logging messages
   - [ ] I am adding a new telemetry message
   - [ ] I am adding new credentials
   - [ ] I am adding a new dependency

3. Please describe how your code solves the related issue.

   We need `.sha256` checksum files for released artifacts as well; this updates the deploy scripts to produce them. See
     - https://snowflake.slack.com/archives/C054J469AN4/p1758085517918489
     - snowflakedb/snowflake#337992
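As a minimal sketch of the intended behavior, a `.sha256` file for an artifact can be produced with `sha256sum` (the artifact name below is illustrative, not the script's actual output):

```shell
# Create a sample artifact (the name is illustrative).
printf 'example artifact contents\n' > snowpark-example.jar

# Write its SHA-256 digest to a sibling .sha256 file.
sha256sum snowpark-example.jar | awk '{print $1}' > snowpark-example.jar.sha256

# Verify the recorded checksum against a fresh computation.
computed=$(sha256sum snowpark-example.jar | awk '{print $1}')
recorded=$(cat snowpark-example.jar.sha256)
[ "$computed" = "$recorded" ] && echo "checksum OK"
```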
…t jar name (#246)

1. What Jira ticket or GitHub issue is this PR addressing? Make sure that there is a ticket or issue accompanying your PR.

    Fixes SNOW-2346738
2. Fill out the following pre-review checklist:
    - [ ] I am adding a new automated test(s) to verify correctness of my new code
    - [ ] I am adding new logging messages
    - [ ] I am adding a new telemetry message
    - [ ] I am adding new credentials
    - [ ] I am adding a new dependency
3. Please describe how your code solves the related issue.

    Validation of packages to be released failed due to the naming of the fat test JAR, which contains compiled test sources and dependencies. See
    - https://es-ci-legacy-mainvalidation-001.jenkinsdev1.us-west-2.aws-dev.app.snowflake.com/job/MavenPushThunderSnowClient/47/console
    - https://snowflake.slack.com/archives/C054J469AN4/p1758170003398679?thread_ts=1758085517.918489&cid=C054J469AN4

We resolve this issue by changing the SBT artifact publishing commands to do the following.

- When publishing locally via `sbt +publishLocal` or `sbt +publishLocalSigned`, and for uploading to S3 buckets (the `$PUBLISH` environment variable must be `false` or unset), the following artifacts are produced with their associated signature and checksum files:
    - `ivy.xml`
    - `snowpark_2.1[23].jar`
    - `snowpark_2.1[23].pom`
    - `snowpark_2.1[23]-javadoc.jar`
    - `snowpark_2.1[23]-scaladoc.jar`
    - `snowpark_2.1[23]-bundle.tar.gz`
    - `snowpark_2.1[23]-bundle.zip`
    - `snowpark_2.1[23]-with-dependencies.jar`
    - `fat-test-snowpark_2.1[23]-fat-test.jar`

- When publishing to the Maven Central repository via `sbt +publishSigned` (the `$PUBLISH` environment variable must be `true`), only the following artifacts are produced with their associated signature and checksum files:
    - `ivy.xml`
    - `snowpark_2.1[23].jar`
    - `snowpark_2.1[23].pom`
    - `snowpark_2.1[23]-javadoc.jar`
    - `snowpark_2.1[23]-scaladoc.jar`
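The dispatch between the two flows above can be sketched as a small shell function keyed on `$PUBLISH`. The `sbt` commands are only echoed here, nothing is published, and `+publishLocalSigned` stands in for either of the two local variants:

```shell
# Sketch of the publish dispatch described above; commands are echoed, not run.
publish_cmd() {
  if [ "${PUBLISH:-}" = "true" ]; then
    # Maven Central flow: core jar, pom, javadoc and scaladoc jars only.
    echo "sbt +publishSigned"
  else
    # Local / S3 flow: full artifact set, including bundles and the fat test jar.
    echo "sbt +publishLocalSigned"
  fi
}

publish_cmd                # PUBLISH unset behaves like "false"
PUBLISH=true publish_cmd   # Maven Central flow
```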
Copilot AI review requested due to automatic review settings September 19, 2025 17:18

Copilot AI left a comment


Pull Request Overview

This PR bumps the Snowpark version from 1.17.0-SNAPSHOT to 1.17.0, indicating a release preparation. The changes include updating the version test, adding new functionality for DataFrame.show() methods with truncation control, introducing new array and string functions, enhancing case expressions to accept literal values, and reorganizing deployment scripts.

Key changes:

  • Updated version assertion from "1.17.0-SNAPSHOT" to "1.17.0"
  • Added new DataFrame.show() overloads with truncation control
  • Introduced concat_ws_ignore_nulls() and array_flatten() functions
  • Enhanced when/otherwise methods to accept literal values instead of requiring Column objects

Reviewed Changes

Copilot reviewed 17 out of 18 changed files in this pull request and generated 3 comments.

Summary per file:

- `src/test/scala/com/snowflake/snowpark/UtilsSuite.scala`: Updated version test assertion to match release version
- `src/test/scala/com/snowflake/snowpark_test/FunctionSuite.scala`: Added tests for new concat_ws_ignore_nulls and array_flatten functions
- `src/test/scala/com/snowflake/snowpark_test/DataFrameSuite.scala`: Added tests for new DataFrame.show() overloads with truncation control
- `src/test/scala/com/snowflake/snowpark_test/ColumnSuite.scala`: Added tests for enhanced case expressions accepting literal values
- `src/test/java/com/snowflake/snowpark_test/JavaOpenTelemetrySuite.java`: Added telemetry tests for new DataFrame.show() method overloads
- `src/test/java/com/snowflake/snowpark_test/JavaFunctionSuite.java`: Added Java tests for new concat_ws_ignore_nulls and array_flatten functions
- `src/test/java/com/snowflake/snowpark_test/JavaColumnSuite.java`: Added Java tests for enhanced case expressions
- `src/main/scala/com/snowflake/snowpark/functions.scala`: Implemented new functions and enhanced when() method to accept literals
- `src/main/scala/com/snowflake/snowpark/DataFrame.scala`: Added new show() method overloads with truncation control
- `src/main/scala/com/snowflake/snowpark/Column.scala`: Enhanced CaseExpr methods to accept literal values
- `src/main/java/com/snowflake/snowpark_java/Functions.java`: Added Java bindings for new functions and enhanced when() method
- `src/main/java/com/snowflake/snowpark_java/DataFrame.java`: Added Java bindings for new show() method overloads
- `src/main/java/com/snowflake/snowpark_java/CaseExpr.java`: Enhanced Java CaseExpr to accept literal values
- `scripts/utils.sh`: Added FIPS test configuration support
- `scripts/deploy.sh`: Simplified to delegate to deploy-common.sh
- `scripts/deploy-fips.sh`: Added backwards compatibility script for FIPS deployments
- `scripts/deploy-common.sh`: Extracted common deployment logic with improvements


Comment on lines +1353 to +1354
    val stringArrays = exprs.map(_.cast(ArrayType(StringType)))
    val nonNullArray = array_compact(array_flatten(array_construct_compact(stringArrays: _*)))

Copilot AI Sep 19, 2025


The function documentation states it concatenates strings ignoring nulls, but the implementation casts all inputs to string arrays first. This could cause unexpected behavior for non-array inputs and should be documented or the implementation should handle scalar values differently.

Suggested change:

    - val stringArrays = exprs.map(_.cast(ArrayType(StringType)))
    - val nonNullArray = array_compact(array_flatten(array_construct_compact(stringArrays: _*)))
    + // For each expr, if it's already an array, use as-is; otherwise, wrap in array_construct_compact
    + val processedExprs = exprs.map { col =>
    +   // Try to detect if the column is already an array type
    +   col.expr.dataType match {
    +     case ArrayType(StringType, _) => col
    +     case _ => array_construct_compact(col)
    +   }
    + }
    + val nonNullArray = array_compact(array_flatten(array_construct_compact(processedExprs: _*)))

sbt version

echo "[INFO] Checking out snowpark-java-scala @ tag: $github_version_tag."
git checkout $github_version_tag

Copilot AI Sep 19, 2025


The git checkout command should use 'tags/' prefix for consistency with the original implementation and to be explicit about checking out a tag rather than a branch: git checkout tags/$github_version_tag

Suggested change:

    - git checkout $github_version_tag
    + git checkout tags/$github_version_tag

Comment on lines +119 to +120
    # TODO: BEFORE SNOWPARK v2.12.0, fix the regex in the sed command to not match the 2.12.x or 2.13.x named folder under ~/.ivy2/local/com.snowflake/snowpark_2.1[23]/
    find ~/.ivy2/local -type f -name '*snowpark*' | while read file; do newfile=$(echo "$file" | sed "s/\(2\.1[23]\)\([-\.]\)/\1-${github_version_tag#v}\2/"); mv "$file" "$newfile"; done

Copilot AI Sep 19, 2025


The TODO comment on line 119 indicates this regex will need fixing before Snowpark v2.12.0, but the regex pattern 2\.1[23] only matches 2.12 and 2.13. This creates a potential maintenance issue when newer Scala versions are supported.

Suggested change:

    - # TODO: BEFORE SNOWPARK v2.12.0, fix the regex in the sed command to not match the 2.12.x or 2.13.x named folder under ~/.ivy2/local/com.snowflake/snowpark_2.1[23]/
    - find ~/.ivy2/local -type f -name '*snowpark*' | while read file; do newfile=$(echo "$file" | sed "s/\(2\.1[23]\)\([-\.]\)/\1-${github_version_tag#v}\2/"); mv "$file" "$newfile"; done
    + # Updated: Regex now matches any Scala 2.x version (e.g., 2.12, 2.13, 2.14, ...).
    + find ~/.ivy2/local -type f -name '*snowpark*' | while read file; do newfile=$(echo "$file" | sed -E "s/(2\.[0-9]+)([-\.])/\1-${github_version_tag#v}\2/"); mv "$file" "$newfile"; done
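The effect of the original sed substitution can be spot-checked on a sample file name (the tag value below is illustrative):

```shell
# Apply the rename pattern from the deploy script to a sample name.
github_version_tag="v1.17.0"   # illustrative tag value
sample="snowpark_2.12-with-dependencies.jar"
renamed=$(echo "$sample" | sed "s/\(2\.1[23]\)\([-\.]\)/\1-${github_version_tag#v}\2/")
echo "$renamed"   # snowpark_2.12-1.17.0-with-dependencies.jar
```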

… sources jar (#249)

1. What Jira ticket or GitHub issue is this PR addressing? Make sure that there is a ticket or issue accompanying your PR.

   Fixes SNOW-2346738

2. Fill out the following pre-review checklist:

   - [ ] I am adding a new automated test(s) to verify correctness of my new code
   - [ ] I am adding new logging messages
   - [ ] I am adding a new telemetry message
   - [ ] I am adding new credentials
   - [ ] I am adding a new dependency

3. Please describe how your code solves the related issue.

   Include sources JAR when publishing to Maven to fulfill requirements: https://central.sonatype.org/publish/requirements/
@sfc-gh-heshah sfc-gh-heshah marked this pull request as draft November 5, 2025 04:53
