[SPARK-52470][TESTS][FOLLOWUP] Fix test failure caused by import conn… #13995
Workflow: build_main.yml (triggered on: push)
Jobs

| Job | Duration |
|---|---|
| Run / Check changes | 36s |
| Run / Protobuf breaking change detection and Python CodeGen check | 0s |
| Run / Java 25 build with Maven | 38m 13s |
| Run / Run TPC-DS queries with SF=1 | 0s |
| Run / Run Docker integration tests | 0s |
| Run / Run Spark on Kubernetes Integration test | 0s |
| Run / Run Spark UI tests | 0s |
| Matrix: Run / build | |
| Run / Build modules: sparkr | 0s |
| Run / Linters, licenses, and dependencies | 30m 36s |
| Run / Documentation generation | 0s |
| Matrix: Run / pyspark | |
Annotations (4 warnings)

- Run / Build modules: pyspark-sql, pyspark-resource, pyspark-testing: No files were found with the provided path: `**/target/test-reports/*.xml`. No artifacts will be uploaded.
- Run / Build modules: pyspark-connect: No files were found with the provided path: `**/target/test-reports/*.xml`. No artifacts will be uploaded.
- Run / Build modules: pyspark-core, pyspark-errors, pyspark-streaming, pyspark-logger: No files were found with the provided path: `**/target/test-reports/*.xml`. No artifacts will be uploaded.
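The repeated "No files were found" warnings come from the workflow's test-report upload step: `actions/upload-artifact` warns (rather than fails) when its `path` glob matches nothing. A minimal sketch of such a step, assuming `actions/upload-artifact@v4`; the step name is an illustration, not the workflow's actual step, while the `if-no-files-found` options shown are that action's documented values:

```yaml
# Hypothetical upload step. With the default if-no-files-found: warn,
# an empty glob match produces exactly the annotation seen above
# instead of failing the job.
- name: Upload test results
  if: always()
  uses: actions/upload-artifact@v4
  with:
    name: test-results
    path: "**/target/test-reports/*.xml"
    # Options: warn (default), error, ignore
    if-no-files-found: warn
```

Setting `if-no-files-found: ignore` would silence the warning for jobs that legitimately produce no report files; `error` would instead fail the job, which is useful when a missing report indicates a broken test run.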
- Run / Build modules: pyspark-core, pyspark-errors, pyspark-streaming, pyspark-logger: The 'defaults' channel might have been added implicitly. If this is intentional, add 'defaults' to the 'channels' list. Otherwise, consider setting 'conda-remove-defaults' to 'true'.
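The conda warning names its own fix: `conda-remove-defaults` is an input of the `conda-incubator/setup-miniconda` action. A hedged sketch, assuming the job sets up conda through that action (the step shown is an illustration, not the workflow's actual configuration):

```yaml
# Hypothetical setup step. conda-remove-defaults strips the implicitly
# added 'defaults' channel, which silences the warning above; listing
# 'defaults' explicitly under channels would instead confirm it is wanted.
- uses: conda-incubator/setup-miniconda@v3
  with:
    channels: conda-forge
    conda-remove-defaults: "true"
```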
Artifacts
Produced during runtime

| Name | Size | Digest |
|---|---|---|
| apache~spark~8KOMSO.dockerbuild | 28 KB | sha256:5e20093cd97cd14db260fb5d6247cbf29902e546f44088cfd48563a880a1b4cf |
| apache~spark~AXIL28.dockerbuild | 23.2 KB | sha256:c0133e7ee996358fcd5876aed28d38bb06a0c1ab52dec67efdf25c8eb9e5ce36 |
| test-results-pyspark-mllib, pyspark-ml, pyspark-ml-connect, pyspark-pipelines--17-hadoop3-hive2.3-python3.11 | 114 KB | sha256:1559b8a400090c92fe9e9081f80ac86304fc98a527ab164003db423df56ef6d5 |