[SPARK-52417][SQL] Simplify Table properties handling in View Schema … #13982
Workflow: build_main.yml (on: push)
Jobs
- Run / Check changes (34s)
- Run / Protobuf breaking change detection and Python CodeGen check (0s)
- Run / Java 25 build with Maven (40m 35s)
- Run / Run TPC-DS queries with SF=1
- Run / Run Docker integration tests
- Run / Run Spark on Kubernetes Integration test
- Run / Run Spark UI tests
- Matrix: Run / build
- Run / Build modules: sparkr (0s)
- Run / Linters, licenses, and dependencies (28m 23s)
- Run / Documentation generation (0s)
- Matrix: Run / pyspark
Annotations
3 errors and 2 warnings

Errors:
- Run / Build modules: streaming, sql-kafka-0-10, streaming-kafka-0-10, streaming-kinesis-asl, kubernetes, hadoop-cloud, spark-ganglia-lgpl, protobuf, connect: "Stream onError can't be called after stream completed"
- Run / Build modules: pyspark-mllib, pyspark-ml, pyspark-ml-connect, pyspark-pipelines: "The operation was canceled."
- Run / Build modules: pyspark-mllib, pyspark-ml, pyspark-ml-connect, pyspark-pipelines: "The job has exceeded the maximum execution time of 2h0m0s"

Warnings:
- Run / Build modules: pyspark-core, pyspark-errors, pyspark-streaming, pyspark-logger: "No files were found with the provided path: **/target/test-reports/*.xml. No artifacts will be uploaded."
- Run / Build modules: pyspark-core, pyspark-errors, pyspark-streaming, pyspark-logger: "The 'defaults' channel might have been added implicitly. If this is intentional, add 'defaults' to the 'channels' list. Otherwise, consider setting 'conda-remove-defaults' to 'true'."
Artifacts
Produced during runtime
| Name | Size | Digest |
|---|---|---|
| apache~spark~6MQ4GM.dockerbuild | 28.6 KB | sha256:328266d0909c0aaedec54088656c973c97f217b59b0864d14cdfe7a4085cac17 |
| apache~spark~PEDY8E.dockerbuild | 25.4 KB | sha256:583e45b943698534cd2d89a29d3424a594fb42da79750304069e396c997d2367 |
| test-results-api, catalyst, hive-thriftserver--17-hadoop3-hive2.3 | 661 KB | sha256:e082b2fc6fbb829f4058886806eae11cea5640b439a40e91557eeec1272dd271 |
| test-results-core, unsafe, kvstore, avro, utils, network-common, network-shuffle, repl, launcher, examples, sketch, variant--17-hadoop3-hive2.3 | 37.4 KB | sha256:8d91ca606dafeb8b5f5be7ee7312b7fcf190b752b30a13d5a51cbd2269a5af63 |
| test-results-hive-- other tests-17-hadoop3-hive2.3 | 235 KB | sha256:13c5b756d320230a90ee32db263ec76c8ce1c27a8bf5740d9e6589c2f024f794 |
| test-results-hive-- slow tests-17-hadoop3-hive2.3 | 222 KB | sha256:97802a0daeb6bb41d4e0ab32ba36d868abf545b51835707828e1fef6b21b34aa |
| test-results-mllib-local, mllib, graphx, profiler, pipelines--17-hadoop3-hive2.3 | 433 KB | sha256:c227676308ef42b988e10cf3f45dda7c184becc6a5e78d560bb7a027b2c04b10 |
| test-results-pyspark-connect--17-hadoop3-hive2.3-python3.11 | 204 KB | sha256:18b8f0faf1e6777a1d437416572e734da7296b0978f6edf4c58eb43b0d43fd9a |
| test-results-pyspark-mllib, pyspark-ml, pyspark-ml-connect, pyspark-pipelines--17-hadoop3-hive2.3-python3.11 | 50.8 KB | sha256:1358993d306b6004a5866c54c514bcf925b39930dafb19ba93b0455728cdbe6f |
| test-results-pyspark-sql, pyspark-resource, pyspark-testing--17-hadoop3-hive2.3-python3.11 | 217 KB | sha256:a0dc792cbb2f58fb1b8e00c0cde2573d37bd1610a6e42853ea3422ed13fc333a |
| test-results-sql-- extended tests-17-hadoop3-hive2.3 | 1.18 MB | sha256:a071f6978decd357f15dceee1320986639720d2126f247cf1ba234b09e1286e0 |
| test-results-sql-- other tests-17-hadoop3-hive2.3 | 1.4 MB | sha256:151127b068be9c6fa3c04ff152a108de54b2db359ad3d52f8761530248a6e954 |
| test-results-sql-- slow tests-17-hadoop3-hive2.3 | 1.19 MB | sha256:814d59eddad0a0f856d185b81316290fdad5c4d8175479a641814c9241823b67 |
| test-results-streaming, sql-kafka-0-10, streaming-kafka-0-10, streaming-kinesis-asl, kubernetes, hadoop-cloud, spark-ganglia-lgpl, protobuf, connect--17-hadoop3-hive2.3 | 201 KB | sha256:7e12b343d72b637c7b92fa11d74fed77bc10c0dc48e289dd555cffe27a775a16 |
| unit-tests-log-pyspark-mllib, pyspark-ml, pyspark-ml-connect, pyspark-pipelines--17-hadoop3-hive2.3-python3.11 | 1.2 MB | sha256:2d126361e4111d28875df16900f11c9c144da0f55114e570c5fcaf7bee64731b |